
Re: HDR question

Art Adams <art.cml.only@...>
 

Please, Adam. There's a Philz Coffee up the street from Chater. No Starbucks that day.

--
Art Adams
DP
San Francisco Bay Area



On Mar 18, 2018 at 12:42 AM, <Adam Wilt> wrote:

I assume there were no illicit substances involved in the test that day ;)

We were at Chater Camera that day, so it’s possible we went to Picante for lunch (http://www.picanteberkeley.com). If so, I would have had their exquisite sopa verde. Art almost certainly had a Starbucks coffee. Make of those what you will.

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)


Re: HDR question

alister@...
 

[S-Log3(Live HDR)] is the setting for which this unit is used as the reference monitor in the S-Log3 Live HDR workflow which Sony advocates. Displays the S-Log3 input signal adding the system gamma.

So Sony is recommending that when connecting the S-Log3 output of a camera to the X300, you set it in a mode where the EOTF is explicitly not an inverse of the camera's OETF. It includes a "system gamma". The other S-Log3 mode is intended for using S-Log3 as a display referred encoding for a grading system, where the picture rendering (OOTF) is applied by the grading system.

My understanding is that the Live HDR option adds a system gamma so that the X300 presents S-Log3 as an image representative of what would be seen on a Rec.709 TV from 0 to 100% of the scene brightness, with highlights then representative of highlight performance on a TV. Sony's latest studio and OB BPUs output 4K S-Log3 plus HD Rec.709. Differential gain of between 6dB and 10dB is normally applied between the outputs to compensate for the slightly different contrast requirements of each, so if you rack the SDR image the final HDR image won't suddenly become too contrasty. So in an existing SDR OB unit, by replacing the BPUs and just some of the monitors, a full combined HDR/SDR workflow is possible. The SDR outputs from the BPUs are used to rack the cameras, so there is no need to change the preview monitors. The S-Log3 then passes through the switcher/mixer, monitored using X300s with the special Live HDR EOTF, and on the output of the switcher it is converted to SDR and HDR versions. It is a unique and quite unusual EOTF.
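As a sketch of what those differential gain figures mean in linear terms (assuming the usual video voltage convention of dB = 20·log10(gain); the figures themselves are the ones quoted above):

```python
# Convert a differential gain in dB to a linear gain factor,
# assuming the video voltage convention dB = 20 * log10(gain).
def db_to_gain(db):
    return 10 ** (db / 20.0)

print(round(db_to_gain(6), 2))   # 6 dB is roughly a 2x gain
print(round(db_to_gain(10), 2))  # 10 dB is roughly 3.16x
```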

The other is display referenced, with no system gamma, as the system gamma is added during the grade and output signal conversion; so this option should show the S-Log3 at the same contrast as the scene, with no perceptual adjustments applied.

System gammas in log and raw workflows are interesting things, as once a colourist has had a go at the pictures, who knows what the system gamma is? Of course every TV and monitor needs to be set up the same so that the picture reproduction is as expected, but the end-to-end gamma is determined by the colourist's grading choices.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.
On 17 Mar 2018, at 21:40, Nick Shaw <nick@...> wrote:

First of all, I want to be clear that I am not talking about artistic adjustments made to the image. I am talking about a baseline before grading, where the image on the screen appears perceptually (and it's all about perception) to be a faithful representation of the scene in front of the camera.

On 17 Mar 2018, at 18:53, Adam Wilt <adam@...> wrote:

Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

I'm not saying it's not possible, by design or accident, to end up with a situation where the luminance on a screen is measurably identical to the luminance of the actual chart, and they look the same to the eye. But that is the exception to the rule, and it is not the system working "as designed". I suppose it is more likely to be the case where the monitor is on the set near the chart, so the viewing environment and surround of both the monitor and the chart are the same. That's why we put monitors in black tents, because the set environment is not representative of the intended viewing environment.

I obviously wasn't there on your set, so cannot comment with certainty on the situation. But be wary of things looking identical (or different, for that matter) "to the eye". I am sure we have all seen the famous checker shadow illusion – https://en.wikipedia.org/wiki/Checker_shadow_illusion

And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

The 1.2 system gamma doesn't apply when using the ARRI LUT. In that case the picture rendering is being done by the tone mapping incorporated into the LUT.

On 17 Mar 2018, at 19:22, alister@... wrote:

Perhaps I misunderstood what Nick meant to say, which I suspect is that sometimes camera gamma and screen gamma are not the same.

Not sometimes. Always, if things are set up correctly. Even for HDR. You will notice that Sony's BVM-X300 has two options for S-Log3 as an EOTF, "S-Log3 (HDR)" and "S-Log3 (Live HDR)". To quote from the X300 manual:

[S-Log3(Live HDR)] is the setting for which this unit is used as the reference monitor in the S-Log3 Live HDR workflow which Sony advocates. Displays the S-Log3 input signal adding the system gamma.

So Sony is recommending that when connecting the S-Log3 output of a camera to the X300, you set it in a mode where the EOTF is explicitly not an inverse of the camera's OETF. It includes a "system gamma". The other S-Log3 mode is intended for using S-Log3 as a display referred encoding for a grading system, where the picture rendering (OOTF) is applied by the grading system.

If somebody here from Sony disagrees with my interpretation of their intent, please feel free to correct me.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555


Re: HDR question

Adam Wilt
 

I assume there were no illicit substances involved in the test that day ;)

We were at Chater Camera that day, so it’s possible we went to Picante for lunch (http://www.picanteberkeley.com). If so, I would have had their exquisite sopa verde. Art almost certainly had a Starbucks coffee. Make of those what you will.

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)


Re: HDR question

John Tarver
 

I assume there were no illicit substances involved in the test that day ;)

John Tarver, csc
Director of Photography
In TO

On Mar 17, 2018, at 3:03 PM, Art Adams <art.cml.only@...> wrote:

Adam, I remember that. It was almost creepy. It would make a great, but very short, Black Mirror episode.

I remember looking from the monitor (Sony A170, I believe) to the chart and thinking, "I will never see this again."

--
Art Adams
DP
San Francisco Bay Area



On Mar 17, 2018 at 1:53 PM, <Adam Wilt> wrote:

> because of gamma and the responses of displays the total range may be 
> the same but the way it is represented can be totally different.

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

Granted, the dynamic range was limited to the levels reflected off a chart, so there weren’t any HDR speculars: the “scene” was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

But still, accidental as it may have been, it was a stunning illusion of reality that appeared on the display, a “scene referred” reproduction right next to the referring scene. 

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)


Re: HDR question

Nick Shaw
 

First of all, I want to be clear that I am not talking about artistic adjustments made to the image. I am talking about a baseline before grading, where the image on the screen appears perceptually (and it's all about perception) to be a faithful representation of the scene in front of the camera.

On 17 Mar 2018, at 18:53, Adam Wilt <adam@...> wrote:

Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

I'm not saying it's not possible, by design or accident, to end up with a situation where the luminance on a screen is measurably identical to the luminance of the actual chart, and they look the same to the eye. But that is the exception to the rule, and it is not the system working "as designed". I suppose it is more likely to be the case where the monitor is on the set near the chart, so the viewing environment and surround of both the monitor and the chart are the same. That's why we put monitors in black tents, because the set environment is not representative of the intended viewing environment.

I obviously wasn't there on your set, so cannot comment with certainty on the situation. But be wary of things looking identical (or different, for that matter) "to the eye". I am sure we have all seen the famous checker shadow illusion – https://en.wikipedia.org/wiki/Checker_shadow_illusion

And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

The 1.2 system gamma doesn't apply when using the ARRI LUT. In that case the picture rendering is being done by the tone mapping incorporated into the LUT.

On 17 Mar 2018, at 19:22, alister@... wrote:

Perhaps I misunderstood what Nick meant to say, which I suspect is that sometimes camera gamma and screen gamma are not the same.

Not sometimes. Always, if things are set up correctly. Even for HDR. You will notice that Sony's BVM-X300 has two options for S-Log3 as an EOTF, "S-Log3 (HDR)" and "S-Log3 (Live HDR)". To quote from the X300 manual:

[S-Log3(Live HDR)] is the setting for which this unit is used as the reference monitor in the S-Log3 Live HDR workflow which Sony advocates. Displays the S-Log3 input signal adding the system gamma.

So Sony is recommending that when connecting the S-Log3 output of a camera to the X300, you set it in a mode where the EOTF is explicitly not an inverse of the camera's OETF. It includes a "system gamma". The other S-Log3 mode is intended for using S-Log3 as a display referred encoding for a grading system, where the picture rendering (OOTF) is applied by the grading system.
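To make the distinction concrete, here is a sketch using Sony's published S-Log3 formula to decode code values to scene-linear, then applying an illustrative power-law system gamma. The 1.2 exponent is my assumption for illustration, not Sony's documented Live HDR value:

```python
# Sketch: decode normalized (0..1) S-Log3 code values to scene-linear
# reflectance per Sony's published S-Log3 formula, then apply an
# illustrative power-law system gamma (exponent assumed, not Sony's
# documented figure).
def slog3_to_linear(v):
    if v >= 171.2102946929 / 1023:
        return (10 ** ((v * 1023 - 420) / 261.5)) * (0.18 + 0.01) - 0.01
    return (v * 1023 - 95) * 0.01125 / (171.2102946929 - 95)

def with_system_gamma(scene_linear, gamma=1.2):
    # A system gamma makes the OOTF non-unity: display light is no
    # longer proportional to scene light.
    return max(scene_linear, 0.0) ** gamma

mid_grey = slog3_to_linear(420 / 1023)        # code value 420 is 18% grey
print(round(mid_grey, 4))                     # 0.18
print(round(with_system_gamma(mid_grey), 4))  # darker mid-tone after 1.2 gamma
```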

If somebody here from Sony disagrees with my interpretation of their intent, please feel free to correct me.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555


Re: HDR question

alister@...
 

On 17 Mar 2018, at 18:53, Adam Wilt <adam@...> wrote:

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

Granted, the dynamic range was limited to the levels reflected off a chart, so there weren’t any HDR speculars: the “scene” was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

Thank you Adam. I think there may have been some misunderstanding in what I have been saying, which was in response to Nick's comment that "scene contrast ratios and screen contrast ratios are not the same thing". Perhaps I misunderstood what Nick meant to say, which I suspect is that sometimes camera gamma and screen gamma are not the same.

All I have been trying to say is that, contrary to what Nick wrote, I believe that monitor contrast ratios and scene contrast ratios are the same thing. They are not different things. And I stand by what I have said, which is: IF the screen can match the brightness and contrast of the scene, then when viewed in the same environment both will look the same. If I get my light meter out and measure the contrast between the deepest blacks and brightest whites, and both the scene and the monitor span 8 stops, then both have the same contrast ratio, and if both have similar brightness levels they will (in the same viewing environment) look the same. There is no difference between the way a screen creates contrast and the way a scene creates contrast.

Now, perhaps I did misunderstand Nick's words or what he meant to say, but I am surprised that my notion that a contrast ratio is a contrast ratio has been greeted with so many questioning that fundamental premise. Lots of misunderstandings, perhaps.

Sure, we do all sorts of things between scene and screen to change the audience's perception of the image they are looking at, for all kinds of different reasons: gamma (or, to be more trendy, transfer functions), grading, etc. So now the monitor and scene have different contrasts, and that is also what my light meter would tell me, even if my eyes don't (which will depend on both screen and viewing environment brightness), but that is an entirely different and largely artistic/perceptual matter aimed at presenting the best looking image to the viewer. A contrast ratio is, however, a contrast ratio, whether that is the light coming from a screen or the light coming from a scene; there is no difference.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com
On 17 Mar 2018, at 18:53, Adam Wilt <adam@...> wrote:

> because of gamma and the responses of displays the total range may be 
> the same but the way it is represented can be totally different.

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

Granted, the dynamic range was limited to the levels reflected off a chart, so there weren’t any HDR speculars: the “scene” was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

But still, accidental as it may have been, it was a stunning illusion of reality that appeared on the display, a “scene referred” reproduction right next to the referring scene. 

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)



Re: HDR question

Art Adams <art.cml.only@...>
 

Adam, I remember that. It was almost creepy. It would make a great, but very short, Black Mirror episode.

I remember looking from the monitor (Sony A170, I believe) to the chart and thinking, "I will never see this again."

--
Art Adams
DP
San Francisco Bay Area



On Mar 17, 2018 at 1:53 PM, <Adam Wilt> wrote:

> because of gamma and the responses of displays the total range may be 
> the same but the way it is represented can be totally different.

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

Granted, the dynamic range was limited to the levels reflected off a chart, so there weren’t any HDR speculars: the “scene” was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

But still, accidental as it may have been, it was a stunning illusion of reality that appeared on the display, a “scene referred” reproduction right next to the referring scene. 

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)


Re: HDR question

Adam Wilt
 

> because of gamma and the responses of displays the total range may be 
> the same but the way it is represented can be totally different.

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

Granted, the dynamic range was limited to the levels reflected off a chart, so there weren’t any HDR speculars: the “scene” was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it’s possible the gentle highlight compression of ARRI’s “709” curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

But still, accidental as it may have been, it was a stunning illusion of reality that appeared on the display, a “scene referred” reproduction right next to the referring scene. 

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)


Re: HDR question

JD Houston
 

On Mar 17, 2018, at 3:59 AM, alister@... wrote:

So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. So to say that monitor contrast and scene contrast are different things is not correct. It is the viewing environments that are different and it is the difference in the viewing environment that changes the perception of the image, not differences in the contrast ratios of monitors. Unless I’m greatly mistaken and my whole knowledge of contrast and brightness is flawed.
 
The first line is correct. If you have a monitor that can show up to 10,000 nits (the brightness of a piece of paper in the sun), so that you can match the brightness of daylight, and you are also outside looking at the monitor, it will look the same up to the point that the monitor can match the scene (you would need a monitor of 1,000,000 nits to match the Sun and direct Sun glints; vision is a log thing).
 
But I would dare to say that ALL monitor viewing is currently done in environments that are not a direct match to the original scene. This is true even for dark scenes, where the eye is dark adapted and the surround is also dark.
 
So in practical terms,  the scene contrast and the display contrast are almost never the same. Yes, you are correct that
the difference is because of viewing environment.
 
In most video and film, the contrast build-up in a display is systematically designed into the system. This is what makes the ratios different.
 
In cameras, the "taking curve" has an assumption (say, the Rec.709 curve) and the output curve of a display has a build-up assumption (Rec.1886's gamma 2.4, for example); this gives the video system an effective contrast boost of about 9% (2.4/2.2).
 
So by design, comparing scene contrast ratios and output display contrast ratios will not give you the same number. To simplify the example: if you have a scene contrast of 100:1 and you want to show it in a dark environment, you need 150:1 to accurately show it. If you are showing it in a dim video environment, you would need an output contrast of 109:1. (As an aside, the overall gain of 1.2 that is sometimes used brings in a complicating factor of audience preference for a certain style of reproduction. Audiences prefer boosted-contrast looks over reality matches.)
 
There is one other consideration. Most scene contrasts are described in terms of simultaneous contrast: the dark-to-light ratio including everything in the scene at the same time (fill, flare, etc.). Most display contrasts are described as sequential contrasts: the ratio of a full-on white to a full-off black. This makes displays seem to have higher contrast, but it is not true. It is especially a problem with projectors, because the projection lens can have a drastic effect on the display contrast. But even OLED monitors can have systematic issues in displaying images with the original scene contrast. This is all a reason never to use display contrast as a way to evaluate scenes.
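A tiny sketch of why that distinction matters: adding even a little flare (stray light) equally to white and black collapses the simultaneous ratio far below the sequential spec. The nit values here are made-up illustrations:

```python
# Contrast ratio of a display, optionally with flare (stray light,
# in nits) added equally to white and black, as happens with a
# projection lens or room bounce.
def contrast(white_nits, black_nits, flare_nits=0.0):
    return (white_nits + flare_nits) / (black_nits + flare_nits)

print(round(contrast(100, 0.001)))        # sequential spec: 100000:1
print(round(contrast(100, 0.001, 0.1)))   # with 0.1 nit of flare: ~991:1
```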
 
From an operator standpoint, the ratios to use are based on the light in the scene, so targeting key-to-fill ratios of a certain number is the right way to do it. You are mixing light, and light is linear, but it can be expressed as ratios; photographically almost everything is a ratio (e.g. this light is twice as bright as that light, this exposure is half as much as the previous one).
 
Why does an operator need to know about contrast ratios and the effects of picture rendering? Because in the world of HDR, it now matters which target you are going for. If you are never going into a movie theater, then you don't have to maintain the same limits for where you would want Zone 1 or Zone 10 detail (to use that as a metaphor).
 
For a project going onto full-range PQ, you may want tonal separations of as much as 1.5 million:1 (about 20.5 stops). Of course current cameras aren't really there yet, so you might have a little time before you have to worry about that. But even today's cameras, with about 14+ stops, require decisions about where to place highlight and shadow details that are going to reproduce cleanly on certain types of monitors. For most productions, the least common denominator applies (which is LCDs).
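For reference, converting between contrast ratios and stops is just a base-2 logarithm (a sketch; the 1.5-million figure is the one quoted above):

```python
import math

def ratio_to_stops(contrast_ratio):
    # One stop is a doubling of light, so stops = log2(ratio).
    return math.log2(contrast_ratio)

def stops_to_ratio(stops):
    return 2.0 ** stops

print(round(ratio_to_stops(1_500_000), 2))  # ~20.52 stops for full-range PQ
print(round(stops_to_ratio(14)))            # 14 stops is a 16384:1 ratio
```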
 
Because of that, it is important to know that cinema output ratios for a dark surround are different from video output ratios for a dim or average surround. Dark needs a 1.5 boost; dim needs a 1.2 boost. So when you consider what you are trying to achieve, it helps if you know what your audience will be looking at.
 
Yes, it is a colorist problem to fix in the end; it is "what you see is what you get". But knowing the choices that will be faced improves the usefulness of the source material, so it is good for DPs to know.
 
 

Jim Houston
Consultant, Starwatcher Digital, Pasadena, CA


Re: HDR question

JD Houston
 

On Mar 17, 2018, at 9:25 AM, Nick Shaw <nick@...> wrote:

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.

I’m with Nick on this.

Jim Houston
Consultant, Starwatcher Digital, Pasadena, CA


Re: HDR question

alister@...
 

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.

But they don’t “look the same”. The 709 standard was based on CRT TVs with very limited brightness ranges and white at 100 nits, so a bit of contrast was added to make up for the lack of brightness, so the image was perceived to be better. But now most TVs hit 300 nits or more, so most viewers now have even more contrast than before, which they seem to like; but in this case, no, it’s not accurate, because a gamma mismatch is being added/created.

But this is getting away from my original point, which is that the way a monitor generates contrast is exactly the same as the way contrast in a scene is produced. How we perceive an image on a screen that only fills a small part of our FOV is a different thing, and as we all know, the same monitor will look different in different viewing environments, just as the view out of a window will be perceived differently depending on how bright it is inside the room.

Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low contrast scenes, where there are specular reflections of the sun, or even just a bright sky, in the scene, no current monitor can reproduce that.

I think we are all aware that no TV can reproduce very high contrast scenes, and in particular speculars; I’ve already said that. But if you are within the monitor’s range and brightness (which can be 8 stops or more), then with ST 2084 you should be able to get the same light levels coming from the monitor as from the scene, and thus the same contrast and same dynamic range.

Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 Nit monitor in a typical environment, HLG uses a system gamma of 1.4.

I thought the BBC were aiming for 1.2? Again though, this is for perceptual reasons, and as you know it gets adjusted according to ambient light levels; not because the contrast that comes from a screen is somehow different from the contrast we see in a scene. It’s because TVs almost never fill our FOV, so only a small part of what we are looking at is changing, and the ambient light in the room changes our perception of the contrast. It would be different if the screen totally filled our FOV or everyone had blacked-out living rooms. I have no problem with the notion of ambient light levels changing perceptions. This happens not just with monitors but with everything we see.

This is why it’s normal to use viewfinders with monoculars to exclude ambient light, or video villages in blacked-out tents. We are eliminating the otherwise distracting ambient light that alters our perception, so that all we see is the true contrast of the monitor, which should then match the contrast of the scene (or at the very least be very, very close). And this comes back to my original point: there is no difference between the contrast ratio of a screen and the contrast ratio of a scene. They are the same thing; a contrast ratio is a contrast ratio, whether it is that of the scene or that of the display.

A simple test would be to shoot a chart with matching camera and display gammas, and place the monitor next to the chart with the lighting set so that the chart reflects the same amount of light off its white chip as the monitor outputs for the white chip. I bet that if I then take a photograph, both the chart and the monitor will look virtually identical in the resulting picture.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com
On 17 Mar 2018, at 16:25, Nick Shaw <nick@...> wrote:

On 17 Mar 2018, at 16:06, alister@... wrote:

I really hope they aren’t different. What would be the point of my expensive DSC Charts?

Sure, it can be different, but it doesn’t have to be, it all depends on the grade. If the camera and monitor gammas are properly matched then capture and display range should also be matched. The engineers didn’t spend decades developing different gammas to make the pictures on monitors look different to the scenes we are shooting. They were developed to make them look the same

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.

Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low contrast scenes, where there are specular reflections of the sun, or even just a bright sky, in the scene, no current monitor can reproduce that.

Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 Nit monitor in a typical environment, HLG uses a system gamma of 1.4.
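For reference, the extended-range formula later standardised in ITU-R BT.2100 expresses the HLG reference system gamma as a function of nominal peak display luminance. This is a sketch of that formula for the reference viewing environment; it yields 1.2 at 1000 nits, and ambient-light adjustments push the value higher, which may account for figures like 1.4 in practice:

```python
import math

# HLG reference OOTF system gamma as a function of nominal peak
# display luminance Lw in nits, per the extended-range formula in
# ITU-R BT.2100 (reference viewing environment; brighter ambient
# conditions would raise these values).
def hlg_system_gamma(lw_nits):
    return 1.2 + 0.42 * math.log10(lw_nits / 1000.0)

print(hlg_system_gamma(1000))            # 1.2 at the nominal 1000-nit peak
print(round(hlg_system_gamma(2000), 3))  # ~1.326 for a 2000-nit display
```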

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555


Re: HDR question

JD Houston
 


On Mar 17, 2018, at 3:59 AM, alister@... wrote:

So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. So to say that monitor contrast and scene contrast are different things is not correct. It is the viewing environments that are different and it is the difference in the viewing environment that changes the perception of the image, not differences in the contrast ratios of monitors. Unless I’m greatly mistaken and my whole knowledge of contrast and brightness is flawed.

The first line is correct. If you have a monitor that can show up to 10,000 nits (the brightness of a piece of paper in the Sun), so that you can match the brightness of daylight, and you are also outside looking at the monitor, it will look the same up to the point the monitor can reach. (You would need a monitor of 1,000,000 nits to match the Sun and direct Sun glints. Vision is a log thing.)

But I would dare to say that ALL monitor viewing is currently done in environments that are not a direct match to the original scene.
This is just as true of dark scenes, where the eye is dark adapted and the surround is also dark.

So in practical terms,  the scene contrast and the display contrast are almost never the same. Yes, you are correct that
the difference is because of viewing environment.

In most video and film, the contrast build-up in a display is systematically designed into the system. This is what makes the ratios
different.

In cameras, the ‘taking curve’ has an assumption (say the Rec.709 curve) and the output curve of a display
has a build-up assumption (Rec.1886’s gamma 2.4, for example); this gives the video system an effective
contrast boost of about 9% (2.4/2.2).

So by design, comparing scene contrast ratios and output display contrast ratios will not give you the same
number.  To simplify the example,  if you have a scene contrast of 100:1, and you want to show it in a 
dark environment, you need a 150:1 to accurately show it.  If you are showing it in a video dim environment,
you would need an output contrast of 109:1 to show it.  (As an aside, the overall gain of 1.2 that is sometimes 
used brings in a complicating factor of audience preference for a certain style of reproduction.  Audiences prefer
boosted contrast looks over reality matches)
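Reading the boost as a simple multiplier on the contrast ratio (the simplification used in the example above), the arithmetic is trivial to check; the helper below is purely illustrative:

```python
# Surround compensation, read as a simple multiplier on the contrast ratio.
# Boost values: 1.5 for a dark (cinema) surround, 2.4/2.2 (~1.09) for dim.
SURROUND_BOOST = {"dark": 1.5, "dim": 2.4 / 2.2, "average": 1.0}

def output_contrast(scene_contrast: float, surround: str) -> float:
    """Output contrast ratio needed to show a scene in a given surround."""
    return scene_contrast * SURROUND_BOOST[surround]

print(output_contrast(100, "dark"))         # 150.0
print(round(output_contrast(100, "dim")))   # 109
```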

There is one other consideration.  Most scene contrasts are described in terms of simultaneous contrast — the dark to light ratio
including everything in the scene at the same time (fill, flare, etc).  Most display contrasts are described as 
sequential contrasts — the ratio of a full-on white to a full-on black. This makes displays seem to have higher contrast,
but it is not true. It is especially a problem with projectors because the projection lens can have a drastic effect
on the display contrast.   But even OLED monitors can have systematic issues in displaying images with the original scene contrast.
This is all a reason to never use display contrast as a way to evaluate scenes.
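To put numbers on the difference, here is a hypothetical sketch: add a small, equal amount of flare light (lens, veiling glare, room bounce) to both the full-on white and full-on black measurements, and the quoted sequential ratio collapses. The flare figure is invented purely for illustration:

```python
def simultaneous_contrast(white: float, black: float, flare: float) -> float:
    """Contrast ratio once flare light is added equally to highlights and shadows.

    white, black and flare are luminances in nits.
    """
    return (white + flare) / (black + flare)

white, black = 100.0, 0.001   # a display quoted at 100000:1 sequential contrast
print(round(white / black))                              # 100000
print(round(simultaneous_contrast(white, black, 0.05)))  # 1962
```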

From an operator standpoint,  the ratios to use are based on the light in the scene.  So targeting key to fill ratios
of a certain number is the right way to do it. You are mixing light, and light is linear, but it can be expressed as
ratios — photographically almost everything is a ratio (e.g. this light is twice as bright as that light, this exposure is
half as much as the previous one).
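Since everything here is a ratio, converting between ratios and stops is just a base-2 logarithm. A small helper (purely illustrative) makes the figures in this thread easy to check:

```python
import math

def ratio_to_stops(ratio: float) -> float:
    """Stops of range represented by a linear-light contrast ratio."""
    return math.log2(ratio)

def stops_to_ratio(stops: float) -> float:
    """Linear-light contrast ratio represented by a number of stops."""
    return 2.0 ** stops

print(ratio_to_stops(4))                    # 2.0 -- a 4:1 key-to-fill is 2 stops
print(round(ratio_to_stops(100), 2))        # 6.64 -- a 100:1 scene
print(round(ratio_to_stops(1_500_000), 1))  # 20.5 -- the full-range PQ figure below
```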

Why does an operator need to know about contrast ratios and the effects of picture rendering? Because in the world of HDR, it now 
matters which target you are going for.  If you are never going into a movie theater, then you don’t have to maintain the same limits
for where you would want Zone 1 or Zone 10 detail (to use that as a metaphor).

For a project going onto full-range PQ, you may want tonal separations of as much as 1.5 million:1 (about 20.5 stops). Of course current cameras
aren’t really there yet, so you might have a little time before you have to worry about that. But even today’s cameras, with about 14+ stops,
require decisions about where to place highlight and shadow details that are going to reproduce cleanly on certain types of monitors.
For most productions, the least common denominator applies (which is LCDs).  

Because of that, it is important to know that Cinema Output Ratios for a Dark Surround are different from Video Output Ratios for a Dim or Average Surround.
Dark needs a 1.5 boost. Dim needs a 1.2 boost. So when you consider what you are trying to achieve, it helps if you know
what your audience will be looking at.

Yes, it is a colorist problem to fix in the end; what you see is what you get. But knowing the choices that will be faced
improves the usefulness of the source material, so it is good for DPs to know.



Jim Houston
Consultant, Starwatcher Digital, Pasadena, CA


Re: HDR question

Art Adams <art.cml.only@...>
 

This is correct from an engineering standpoint. From an artistic standpoint... not so much.

There's a difference between capturing images and creating visual stories. For the former, a chart is an absolute reference. For the latter, it's a tool meant to create a consistent starting point when imposing an artistic vision, and to ensure that vision is the one that reaches the audience.

A chart is not necessarily an image reference, but can be a look calibration reference.

--
Art Adams
DP
San Francisco Bay Area



On Mar 17, 2018 at 11:06 AM, <Alister> wrote:

Geoff wrote:

I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.
 
I really hope they aren’t different. What would be the point of my expensive DSC Charts?

Sure, it can be different, but it doesn’t have to be; it all depends on the grade. If the camera and monitor gammas are properly matched then capture and display range should also be matched. The engineers didn’t spend decades developing different gammas to make the pictures on monitors look different to the scenes we are shooting. They were developed to make them look the same.

If you use a real 709 gamma curve in a camera and have a correctly matched 709 gamma in the monitor then the contrast range of what the camera captures should be reproduced on the monitor with the same contrast, and it should look the same. Have you never shot a DSC chart while looking at a 709 monitor and noticed how, when it’s all working correctly, the chart on the monitor looks just like the chart you are shooting? Both the total range and the contrast range are the same, so they look the same, because the monitor output mirrors the light being reflected from the chart.

Of course you can screw this up by doing this on a bright sunny day, where a 709 monitor has no chance of reaching the same output levels as a chart illuminated by direct sunlight. But do it under controlled lighting, where the lighting does not exceed the monitor output, and both should look near identical. If they didn’t, charts like the CamBelles would be pointless.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.


On 17 Mar 2018, at 13:42, Geoff Boyle <geoff.cml@...> wrote:

I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.
 
Cheers
 
Geoff Boyle NSC FBKS
Cinematographer
Zoetermeer
+31 (0) 637 155 076
 
So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. 
 


Re: HDR question

Nick Shaw
 

On 17 Mar 2018, at 16:06, alister@... wrote:

I really hope they aren’t different. What would be the point of my expensive DSC Charts?

Sure, it can be different, but it doesn’t have to be, it all depends on the grade. If the camera and monitor gammas are properly matched then capture and display range should also be matched. The engineers didn’t spend decades developing different gammas to make the pictures on monitors look different to the scenes we are shooting. They were developed to make them look the same

That's the key point, "look the same" not "be the same." It has long been accepted in traditional TV that a system gamma of about 1.2 was required to make an image on a TV appear perceptually the same as the scene.
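As a rough numerical sketch (treating the Rec.709 camera curve as a pure 0.5 power and the BT.1886 display as a pure 2.4 power, ignoring the linear toe of the real curves), that system gamma falls straight out of the deliberate mismatch between the two:

```python
def oetf(scene: float) -> float:
    """Camera encoding, with Rec.709 approximated as a pure 0.5 power."""
    return scene ** (1 / 2.0)

def eotf(signal: float) -> float:
    """Display decoding, with BT.1886 approximated as a pure 2.4 power."""
    return signal ** 2.4

# End to end the display shows scene ** (2.4 / 2.0), i.e. scene ** 1.2:
# the system gamma, applied by design rather than by accident.
scene = 0.18  # mid grey, as normalised scene luminance
print(round(eotf(oetf(scene)), 4))  # 0.1277
print(round(scene ** 1.2, 4))       # 0.1277
```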

Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low-contrast scenes, where there are specular reflections of the sun in the scene, or even just a bright sky, no current monitor can reproduce that.

Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 Nit monitor in a typical environment, HLG uses a system gamma of 1.4.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555


Re: HDR question

alister@...
 

Geoff wrote:

I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.
 
I really hope they aren’t different. What would be the point of my expensive DSC Charts?

Sure, it can be different, but it doesn’t have to be; it all depends on the grade. If the camera and monitor gammas are properly matched then capture and display range should also be matched. The engineers didn’t spend decades developing different gammas to make the pictures on monitors look different to the scenes we are shooting. They were developed to make them look the same.

If you use a real 709 gamma curve in a camera and have a correctly matched 709 gamma in the monitor then the contrast range of what the camera captures should be reproduced on the monitor with the same contrast, and it should look the same. Have you never shot a DSC chart while looking at a 709 monitor and noticed how, when it’s all working correctly, the chart on the monitor looks just like the chart you are shooting? Both the total range and the contrast range are the same, so they look the same, because the monitor output mirrors the light being reflected from the chart.

Of course you can screw this up by doing this on a bright sunny day, where a 709 monitor has no chance of reaching the same output levels as a chart illuminated by direct sunlight. But do it under controlled lighting, where the lighting does not exceed the monitor output, and both should look near identical. If they didn’t, charts like the CamBelles would be pointless.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.


On 17 Mar 2018, at 13:42, Geoff Boyle <geoff.cml@...> wrote:

I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.
 
Cheers
 
Geoff Boyle NSC FBKS
Cinematographer
Zoetermeer
+31 (0) 637 155 076
 
So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. 
 


Re: HDR question

MARK FOERSTER
 

“here burn in their LUTs in this way”
____________________________________

Until log showed up, say five years ago, the only way was to “burn” your LUT into your work. You picked a gamma curve, set a white point and used traditional methods like protecting whites, NDs in windows and HMIs for keys and fills. Don’t be afraid to use a properly calibrated monitor on set and deliver a fully acceptable picture. I still deliver this way 50% of the time. I’ll also mention here that I still see plenty of “post corrected log” looking worse than had I delivered it the way (shocker!) I thought it should look. Cranky blacks especially. Finally, a 10-bit file, even baked with a decent LUT, has enormous headroom and shadow pull; try it in any NLE.


Mark Foerster csc
Toronto
(905) 922 5555


Re: HDR question

Geoff Boyle
 

I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.

 

Cheers

 

Geoff Boyle NSC FBKS

Cinematographer

Zoetermeer

www.gboyle.co.uk

+31 (0) 637 155 076

 

From: cml-raw-log-hdr@... <cml-raw-log-hdr@...> On Behalf Of alister@...

So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same.

 


Re: HDR question

Jan Klier
 

I just upgraded to the Varicam LT. It has the ability to record VLog to the primary memory card and a proxy file to the secondary memory card, while applying a built-in or user supplied LUT to the proxy file. So you get the best of both straight from camera.

Jan Klier
DP NYC

On Mar 17, 2018, at 8:24 AM, Jonathon Sendall <jpsendall@...> wrote:

I occasionally burn in a LUT to a monitor/recorder (if I have time and have a post plan) so that there’s at least a reference image in proxies, but only if I’m happy with the camera log/raw footage being the only copy in that format.



Re: HDR question

alister@...
 

To make an image appear perceptually like the scene to a viewer, you do not want the luminance of the screen to be proportional to the luminance of the scene. You require what is referred to as "picture rendering" applied. This is the function of the RRT in the ACES block diagram, or the 1.2 "system gamma" of traditional video, to give two examples.

I’m fully familiar with viewing environments etc. and the examples above assume that the viewing environment is different to the scene and that the monitor’s contrast range is significantly less than the scene’s contrast range. But if we are talking about a monitor that can match the brightness and contrast of a scene, and if that monitor was placed into the scene, the contrast range and perceived range of both could be matched. Display contrast is no different to scene contrast. Yes, viewing environment changes how we perceive an image (or a scene) relative to the ambient environment, but this is no different to looking through a small window at an exterior from inside a dark room. The brightness of the room will alter our perception of how bright it is outside.

So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. So to say that monitor contrast and scene contrast are different things is not correct. It is the viewing environments that are different and it is the difference in the viewing environment that changes the perception of the image, not differences in the contrast ratios of monitors. Unless I’m greatly mistaken and my whole knowledge of contrast and brightness is flawed.

Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.


On 16 Mar 2018, at 15:02, Nick Shaw <nick@...> wrote:

On 16 Mar 2018, at 13:19, alister@... wrote:

Nick, could you enlighten me as to how the contrast ratio of a display is different to the contrast ratio of a scene, surely a ratio is a ratio, if a screen can show 10:1 and a scene is 10:1 are these ratios not the same?

A scene and display contrast ratio of e.g. 10:1 may be numerically the same, but they are not perceptually the same, because there is a difference in both the absolute luminance and the viewing environment.

To make an image appear perceptually like the scene to a viewer, you do not want the luminance of the screen to be proportional to the luminance of the scene. You require what is referred to as "picture rendering" applied. This is the function of the RRT in the ACES block diagram, or the 1.2 "system gamma" of traditional video, to give two examples.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)7778 217 555


Re: HDR question

Jonathon Sendall
 

I occasionally burn in a LUT to a monitor/recorder (if I have time and have a post plan) so that there’s at least a reference image in proxies, but only if I’m happy with the camera log/raw footage being the only copy in that format.

Jonathon Sendall
DP, London

On Sat, 17 Mar 2018 at 11:58, Nick Shaw <nick@...> wrote:
On 17 Mar 2018, at 01:34, Seth Marshall via Cml.News <sethmarshall=yahoo.com@...> wrote:

I would like to know how many users here burn in their LUTs in this way. I am so used to delivering Log in scenes with a high dynamic range to protect myself. Do some here always burn it in?

That video is not suggesting you burn in the LUT. Indeed, the Odyssey does not even have the option to do that. The LUT is used for monitoring only. I did another video for CD showing how to apply the print-down LUTs in Resolve.

Regarding my comments about picture rendering, that is not something either a DP or a colourist really needs to concern themselves with in any practical way. I was just pointing out that screen contrast and scene contrast don't relate directly in the way Alister was suggesting. It is a small insight into the colour science of what is happening in a camera LUT. But in practical terms you don't really need to know why. You just use ACES, or a manufacturer's LUT, or just grade until it "looks right"!

Nick Shaw
Workflow Consultant
Antler Post
U.K.
