Re: HDR question


alister@...

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.

But they don’t “look the same”. The 709 standard was based on CRT TVs with very limited brightness ranges and white at 100 nits, so a bit of extra contrast was added to make up for the lack of brightness, so the image was perceived as better. But now most TVs hit 300 nits or more, so most viewers now get even more contrast than before, which they seem to like - but in this case, no, it’s not accurate, because a gamma mismatch is being added/created.
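
To put rough numbers on that (a quick Python sketch with illustrative figures, not reference colour science): running scene light through the Rec.709 camera OETF and then a BT.1886-style 2.4 gamma display gives an end-to-end exponent of roughly 1.2 through the mid-tones, which is where that classic system gamma figure comes from.

    import math

    # Illustrative end-to-end (system) gamma of a Rec.709 / BT.1886-style chain.
    def oetf_709(L):
        # Rec.709 camera OETF: scene-linear 0-1 -> code value 0-1
        return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

    def eotf_display(V, gamma=2.4):
        # Simplified 2.4 gamma display: code value -> relative screen light
        return V ** gamma

    for L in (0.1, 0.18, 0.5, 0.9):
        screen = eotf_display(oetf_709(L))
        # Effective end-to-end exponent at this scene level
        print(f"scene {L:.2f} -> screen {screen:.3f}, "
              f"system gamma ~{math.log(screen) / math.log(L):.2f}")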

But this is getting away from my original point, which is that the way a monitor generates contrast is exactly the same as the way contrast is produced by a scene. How we perceive an image on a screen that only fills a small part of our FOV is a different thing, and as we all know the same monitor will look different in different viewing environments, just as the view out of a window will be perceived differently depending on how bright it is inside the room.
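
Just to make that concrete (example figures only): the arithmetic of a contrast ratio takes no notice of where the luminances came from.

    # A contrast ratio is one luminance divided by another, whatever emitted them.
    def contrast_ratio(white_nits, black_nits):
        return white_nits / black_nits

    print(contrast_ratio(2000.0, 2.0))   # a scene: 1000:1
    print(contrast_ratio(1000.0, 1.0))   # a display: also 1000:1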

Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low contrast scenes, where there are specular reflections of the sun, or even just a bright sky, in the scene, no current monitor can reproduce that.

I think we are all aware that no TV can reproduce very high contrast scenes, and in particular speculars; I’ve already said that. But if the scene is within the monitor’s range and brightness (which can be 8 stops or more), then with ST 2084 you should be able to get the same light levels coming from the monitor as from the scene, and thus the same contrast and the same DR.
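
For anyone who wants to see why ST 2084 can do that: unlike a relative gamma curve, PQ maps code values to absolute nits. A small Python sketch of the published ST 2084 EOTF (constants are from the standard; outputs are approximate):

    # ST 2084 (PQ) EOTF: code value 0-1 -> absolute luminance in nits.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(code):
        p = code ** (1 / M2)
        return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    for v in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"code {v:.2f} -> {pq_eotf(v):8.2f} nits")  # 0.5 lands near 92 nits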

Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 Nit monitor in a typical environment, HLG uses a system gamma of 1.4.

I thought the BBC were aiming for 1.2? Again though, this is for perceptual reasons, and as you know it gets adjusted according to ambient light levels, but not because the contrast that comes from a screen is somehow different to the contrast we see in a scene. It’s because TVs almost never fill our FOV, so only a small part of what we are looking at is changing, and the ambient light in the room changes our perception of the contrast. It would be different if the screen totally filled our FOV or everyone had blacked-out living rooms. I have no problem with the notion of ambient light levels changing perceptions. This happens not just to monitors but to everything we see.
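
For reference, BT.2100 publishes a rule for scaling the HLG system gamma with the display’s nominal peak luminance, and it gives exactly 1.2 at 1000 nits; extrapolating the same formula, 1.4 isn’t reached until roughly 3000 nits. A quick check in Python:

    import math

    # BT.2100: HLG system gamma vs nominal peak luminance Lw (nits),
    # specified for roughly the 400-2000 nit range.
    def hlg_system_gamma(peak_nits):
        return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

    for lw in (400, 1000, 2000):
        print(f"{lw:5d} nits -> system gamma {hlg_system_gamma(lw):.2f}")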

This is why it’s normal to use viewfinders with monoculars to exclude ambient light, or video villages in blacked-out tents. We are eliminating the otherwise distracting ambient light that alters our perception, so that all we see is the true contrast of the monitor, which should then match the contrast of the scene (or at the very least be very, very close). And this comes back to my original point, which is that there is no difference between the contrast ratio of a screen and the contrast ratio of a scene; they are the same thing. A contrast ratio is a contrast ratio, whether that of the scene or that of the display.

A simple test would be to shoot a chart with matching camera and display gammas and place the monitor next to the chart, with the lighting set so that the chart reflects the same amount of light off its white chip as the monitor outputs from its white chip. I bet that if I take a photograph of this, both the chart and the monitor will look virtually identical in the resulting picture.
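
If anyone wants to rig that test, the required lighting level falls out of simple photometry. A sketch with illustrative numbers, assuming a matte, roughly Lambertian white chip of about 90% reflectance:

    import math

    # A matte (Lambertian) chip has luminance L = reflectance * illuminance / pi,
    # so to match a monitor white of target_nits, light the chart to pi * L / rho lux.
    def lux_to_match(target_nits, white_reflectance=0.9):
        return math.pi * target_nits / white_reflectance

    print(f"{lux_to_match(100.0):.0f} lux")  # ~349 lux to match a 100 nit white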


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.

On 17 Mar 2018, at 16:25, Nick Shaw <nick@...> wrote:

On 17 Mar 2018, at 16:06, alister@... wrote:

I really hope they aren’t different. What would be the point of my expensive DSC Charts?

Sure, it can be different, but it doesn’t have to be, it all depends on the grade. If the camera and monitor gammas are properly matched then capture and display range should also be matched. The engineers didn’t spend decades developing different gammas to make the pictures on monitors look different to the scenes we are shooting. They were developed to make them look the same.

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.

Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low contrast scenes, where there are specular reflections of the sun, or even just a bright sky, in the scene, no current monitor can reproduce that.

Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 Nit monitor in a typical environment, HLG uses a system gamma of 1.4.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555
