But why do all these technologies exist? We can look to the rate at which camera manufacturers have improved sensor capabilities for an answer. Newer cinema cameras capture a dynamic range that matches or exceeds the film cameras of old: 17-22 stops at 8K, with 60fps base frame rates. But most of that captured data - and the grading capability that comes along with it - is thrown away when displayed on "normal" screens. Things need to change.
Whereas an older HDTV might hit 100-200 nits, newer screens reach 400-2,000 nits, giving us not only brighter colours but also a wider range of colours from black to white. The wider gamut brings reproduction closer to how the human eye perceives colour, restoring the luscious range that the standard TVs of today have always lacked. TV luminance - once settled by standing in front of a few sets until you decided which brightness level you liked - is something we now need to measure accurately in the age of OLED and HDR.
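The relationship between those nit figures and the camera stops mentioned earlier is simple doubling arithmetic: each stop represents a doubling of luminance, so a display's usable range in stops is the base-2 logarithm of its peak-to-black ratio. A minimal sketch (the black-level figures below are illustrative assumptions, not measured values - real panels vary widely):

```python
import math

def stops_of_range(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of luminance."""
    return math.log2(peak_nits / black_nits)

# Illustrative, assumed black levels for comparison:
sdr_stops = stops_of_range(200, 0.1)     # older 200-nit HDTV
hdr_stops = stops_of_range(2000, 0.005)  # 2,000-nit HDR panel
```

Run with those assumed figures, the HDR panel covers roughly 7-8 more stops than the SDR set, which is why so much of a modern sensor's range survives on one and not the other.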
This brings a natural advancement to the creative grading we have seen since the dawn of film, extending the expressive uses of light in still and moving pictures alike. HDR gives us the option to show our imagined work as something closer to the real one, with the highs and lows of light that we see every day. HDR is not, however, a side in the resolution wars but an intrinsic part of whichever result wins out. The question now is whether 8K will dominate 4K. NHK - Japan's lead broadcaster - is opening an 8K TV channel at the end of this year, and the next Olympics will be shot in 8K HDR. With early 8K displays just coming to market, should we be leapfrogging 4K and going straight to 8K?