Re: HDR question
alister@...
The other is display referenced, with no system gamma, as the system gamma is added during the grade and output signal conversion, so this option should show the S-Log3 at the same contrast as the scene, with no perceptual adjustments applied. System gammas in log and raw workflows are interesting things: once a colourist has had a go at the pictures, who knows what the system gamma is? Of course every TV and monitor needs to be set up the same so that the picture reproduction is as expected, but the end-to-end gamma is determined by the colourist's grading choices.
|
|
Re: HDR question
Adam Wilt
> I assume there were no illicit substances involved in the test that day ;)
We were at Chater Camera that day, so it’s possible we went to Picante for lunch (http://www.picanteberkeley.com). If so, I would have had their exquisite sopa verde. Art almost certainly had a Starbucks coffee. Make of those what you will. Adam Wilt technical services: consulting / coding / camerawork Vancouver WA USA (no, not that Vancouver, the other one)
|
|
Re: HDR question
John Tarver
I assume there were no illicit substances involved in the test that day ;)
On Mar 17, 2018, at 3:03 PM, Art Adams <art.cml.only@...> wrote:
|
|
Re: HDR question
Nick Shaw
First of all, I want to be clear that I am not talking about artistic adjustments made to the image. I am talking about a baseline before grading, where the image on the screen appears perceptually (and it's all about perception) to be a faithful representation of the scene in front of the camera.

On 17 Mar 2018, at 18:53, Adam Wilt <adam@...> wrote:
> Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart.

I'm not saying it's not possible, by design or accident, to end up with a situation where the luminance on a screen is measurably identical to the luminance of the actual chart, and they look the same to the eye. But that is the exception to the rule, and it is not the system working "as designed". I suppose it is more likely to be the case where the monitor is on the set near the chart, so the viewing environment and surround of both the monitor and the chart are the same. That's why we put monitors in black tents: because the set environment is not representative of the intended viewing environment. I obviously wasn't there on your set, so cannot comment with certainty on the situation. But be wary of things looking identical (or different, for that matter) "to the eye". I am sure we have all seen the famous checker shadow illusion: https://en.wikipedia.org/wiki/Checker_shadow_illusion

> And it's possible the gentle highlight compression of ARRI's "709" curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale.

The 1.2 system gamma doesn't apply when using the ARRI LUT. In that case the picture rendering is being done by the tone mapping incorporated into the LUT.

On 17 Mar 2018, at 19:22, alister@... wrote:
> Perhaps I misunderstood what Nick meant to say, which, I suspect, is that sometimes camera gamma and screen gamma are not the same.

Not sometimes. Always, if things are set up correctly. Even for HDR. You will notice that Sony's BVM-X300 has two options for S-Log3 as an EOTF, "S-Log3 (HDR)" and "S-Log3 (Live HDR)". To quote from the X300 manual:

> [S-Log3(Live HDR)] is the setting for which this unit is used as the reference monitor in the S-Log3 Live HDR workflow which Sony advocates. Displays the S-Log3 input signal adding the system gamma.

So Sony is recommending that when connecting the S-Log3 output of a camera to the X300, you set it in a mode where the EOTF is explicitly not an inverse of the camera's OETF. It includes a "system gamma". The other S-Log3 mode is intended for using S-Log3 as a display referred encoding for a grading system, where the picture rendering (OOTF) is applied by the grading system. If somebody here from Sony disagrees with my interpretation of their intent, please feel free to correct me.

Nick Shaw Workflow Consultant Antler Post Suite 87 30 Red Lion Street Richmond Surrey TW9 1RB UK +44 (0)7778 217 555
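To make the OETF/EOTF distinction concrete, here is a sketch of Sony's published S-Log3 OETF in Python (constants taken from Sony's S-Log3 technical summary). Displaying this signal through an exact mathematical inverse would reproduce scene contrast one-to-one, with no system gamma; the X300's "S-Log3 (Live HDR)" mode deliberately does not do that.

```python
import math

def slog3_oetf(x):
    """Sony S-Log3 OETF: scene-linear reflectance (0.18 = 18% grey)
    to a normalised 10-bit code value, per Sony's published spec."""
    if x >= 0.01125000:
        return (420.0 + math.log10((x + 0.01) / (0.18 + 0.01)) * 261.5) / 1023.0
    return (x * (171.2102946929 - 95.0) / 0.01125000 + 95.0) / 1023.0

# 18% grey lands exactly at code value 420/1023, as the spec states.
print(round(slog3_oetf(0.18) * 1023))  # 420
```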
|
|
Re: HDR question
alister@...
All I have been trying to say is that, contrary to what Nick wrote, I believe that monitor contrast ratios and scene contrast ratios are the same thing. They are not different things. And I stand by what I have said, which is that IF the screen can match the brightness and contrast of the scene, when viewed in the same environment both will look the same. If I get my light meter out and measure the contrast between the deepest blacks and brightest whites, and both the scene and the monitor are 8 stops, then both have the same contrast ratio, and if both have similar brightness levels they will (in the same viewing environment) look the same. There is no difference between the way a screen creates contrast and the way a scene creates contrast.

Now, perhaps I did misunderstand Nick's words or what he meant to say, but I am surprised that my notion that a contrast ratio is a contrast ratio has been greeted with so many questioning that fundamental premise. Lots of misunderstandings, perhaps. Sure, we do all sorts of things in between scene and screen to change the audience's perception of the image they are looking at, for all kinds of different reasons: gamma (or, to be more trendy, transfer functions), grade, etc. So now the monitor and scene have different contrasts, and that is also what my light meter would tell me, even if my eyes don't (which will depend on both screen and viewing environment brightness), but that is an entirely different and largely artistic/perceptual matter aimed at presenting the best looking image to the viewer. A contrast ratio is, however, a contrast ratio, whether that is from the light coming from a screen or the light coming from a scene; there is no difference.
|
|
Re: HDR question
Art Adams <art.cml.only@...>
Adam, I remember that. It was almost creepy. It would make a great, but very short, Black Mirror episode. I remember looking from the monitor (Sony A170, I believe) to the chart and thinking, "I will never see this again."
|
|
Re: HDR question
Adam Wilt
> because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.

Can be, yes. And it often (usually) is different. But sometimes the monitor and the scene line up exactly, just as Alister says.

A few years back Art Adams and I were shooting chart tests (I think it was when we were testing a display LUT for a vendor). We had a ChromaDuMonde set up in front of an Alexa, and a Sony OLED monitor set up next to the camera. Through no fault of our own, we managed to get the image on the monitor looking identical to the physical chart, at least to my eye: brightness, gamma, contrast, and color matched so closely that they could not be told apart. Granted, the dynamic range was limited to the levels reflected off a chart, so there weren't any HDR speculars: the "scene" was a Rec.709-compatible scene. And we just happened to get the perceptual brightness of chart and monitor precisely lined up, completely by accident. And it's possible the gentle highlight compression of ARRI's "709" curve just negated the 1.2 gamma boost of the display chain closely enough to generate that profound pellucidity throughout the tonal scale. But still, accidental as it may have been, it was a stunning illusion of reality that appeared on the display, a "scene referred" reproduction right next to the referring scene.

Adam Wilt technical services: consulting / coding / camerawork Vancouver WA USA (no, not that Vancouver, the other one)
|
|
Re: HDR question
JD Houston
The first line is correct. If you have a monitor that can show up to 10,000 nits (the brightness of a piece of paper in the sun), so that you can match the brightness of daylight, and you are also outside looking at the monitor, it will look the same up to the point that the monitor can show (you would need a monitor of 1,000,000 nits to match the sun and direct sun glints; vision is a log thing).
But I would dare to say that ALL monitor viewing is currently done in environments that are not a direct match to the original scene.
This is just as true for dark scenes, where the eye is dark adapted and the surround is also dark.
So in practical terms, the scene contrast and the display contrast are almost never the same. Yes, you are correct that
the difference is because of viewing environment.
In most video and film, the contrast build-up in a display is systematically designed into the system. This is what makes the ratios different.
In cameras, the 'taking curve' has an assumption (say, the Rec.709 curve) and the output curve of a display has a build-up assumption (Rec.1886's gamma 2.4, for example); this gives the video system an effective contrast boost of about 9% (2.4/2.2).
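Jim's 9% figure comes from composing the encoding assumption (roughly gamma 1/2.2) with the display decode (gamma 2.4). A minimal sketch, using pure power-law approximations of the two curves (the real Rec.709 and Rec.1886 functions are piecewise, so this is illustrative only):

```python
# End-to-end (OOTF) exponent from composing a power-law approximation
# of the Rec.709 camera curve (~1/2.2) with the Rec.1886 display
# curve (2.4). Illustrative simplification, not the exact standards.
ENCODE_GAMMA = 1.0 / 2.2   # approximate camera 'taking curve'
DECODE_GAMMA = 2.4         # Rec.1886 display assumption

def ootf(scene_linear):
    """Scene light in, displayed light out, both normalised 0..1."""
    signal = scene_linear ** ENCODE_GAMMA  # camera encode
    return signal ** DECODE_GAMMA          # display decode

system_gamma = DECODE_GAMMA * ENCODE_GAMMA  # exponents multiply
print(f"net exponent: {system_gamma:.3f}")  # ~1.091, i.e. a ~9% gamma boost
```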
So by design, comparing scene contrast ratios and output display contrast ratios will not give you the same number. To simplify the example, if you have a scene contrast of 100:1 and you want to show it in a dark environment, you need 150:1 to accurately show it. If you are showing it in a video dim environment, you would need an output contrast of 109:1 to show it. (As an aside, the overall gain of 1.2 that is sometimes used brings in a complicating factor of audience preference for a certain style of reproduction: audiences prefer boosted-contrast looks over reality matches.)
There is one other consideration. Most scene contrasts are described in terms of simultaneous contrast: the dark-to-light ratio including everything in the scene at the same time (fill, flare, etc.). Most display contrasts are described as sequential contrasts: the ratio of a full-on white to a full-on black. This makes the displays seem to have a higher contrast.
But it is not true. It is especially a problem with projectors because the projection lens can have a drastic effect
on the display contrast. But even OLED monitors can have systematic issues in displaying images with the original scene contrast.
This is all a reason to never use display contrast as a way to evaluate scenes.
From an operator's standpoint, the ratios to use are based on the light in the scene, so targeting key-to-fill ratios of a certain number is the right way to do it. You are mixing light, and light is linear, but it can be expressed as ratios; photographically almost everything is a ratio (e.g. this light is twice as bright as that light, this exposure is half as much as the previous one).
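Since light mixes linearly but photographic thinking runs in stops, the conversion between a linear contrast ratio and stops is just a base-2 logarithm. A minimal sketch:

```python
import math

def ratio_to_stops(ratio):
    """Linear contrast ratio (e.g. 256 for 256:1) to stops."""
    return math.log2(ratio)

def stops_to_ratio(stops):
    """Stops of range back to a linear contrast ratio."""
    return 2.0 ** stops

print(ratio_to_stops(2))       # 1.0   (twice the light = one stop)
print(stops_to_ratio(8))       # 256.0 (an 8-stop range is 256:1)
print(round(stops_to_ratio(20.55)))  # ~1.53 million
```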
Why does an operator need to know about contrast ratios and the effects of picture rendering? Because in the world of HDR, it now matters which target you are going for. If you are never going into a movie theater, then you don't have to maintain the same limits for where you would want Zone 1 or Zone 10 detail (to use that as a metaphor).
For a project going onto full-range PQ, you may want tonal separations of as much as 1.5 million:1 (20.55 stops). Of course current cameras aren't really there yet, so you might have a little time before you have to worry about that. But even today's cameras, with about 14+ stops, require decisions about where to place highlight and shadow details that are going to reproduce cleanly on certain types of monitors.
For most productions, the least common denominator applies (which is LCDs).
Because of that, it is important to know that Cinema Output Ratios for a Dark Surround are different from Video Output Ratios for a Dim or Average Surround. Dark needs a 1.5 boost; Dim needs a 1.2 boost. So when you consider what you are trying to achieve, it helps if you know what your audience will be looking at.
Yes, it is a colorist's problem to fix in the end; it is what-you-see-is-what-you-get. But knowing the choices that will be faced improves the usefulness of the source material, so it is good for DPs to know.
Jim Houston
Consultant, Starwatcher Digital, Pasadena, CA
|
|
Re: HDR question
JD Houston
On Mar 17, 2018, at 9:25 AM, Nick Shaw <nick@...> wrote: I'm with Nick on this. Jim Houston Consultant, Starwatcher Digital, Pasadena, CA
|
|
Re: HDR question
alister@...
But they don't "look the same". The 709 standard was based on CRT TVs with very limited brightness ranges and white at 100 nits, so a bit of contrast was added to make up for the lack of brightness, so the image was perceived to be better. But now most TVs hit 300 nits or more, so most viewers now have even more contrast than before, which they seem to like; but in this case, no, it's not accurate, because a gamma mismatch is being added/created. But this is getting away from my original point, which is that the way a monitor generates contrast is exactly the same as the way contrast for a scene is produced. How we perceive an image on a screen that only fills a small part of our FOV is a different thing, and as we all know, the same monitor will look different in different viewing environments, just as the view out of a window will be perceived differently depending on how bright it is inside the room.
I think we are all aware that no TV can reproduce very high contrast scenes, and in particular speculars; I've already said that. But if you are within the monitor's range and brightness (which can be 8 stops or more), then with ST 2084 you should be able to get the same light levels coming from the monitor as from the scene, thus the same contrast and same DR.
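The point that ST 2084 carries absolute scene luminances can be checked against the published PQ curve. Here is a sketch of the inverse EOTF (nits in, normalised signal out), using the constants from SMPTE ST 2084:

```python
def pq_inverse_eotf(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m^2
    (nits) to a normalised 0..1 signal value."""
    m1 = 2610.0 / 16384.0
    m2 = 2523.0 / 4096.0 * 128.0
    c1 = 3424.0 / 4096.0
    c2 = 2413.0 / 4096.0 * 32.0
    c3 = 2392.0 / 4096.0 * 32.0
    y = nits / 10000.0                # PQ is defined up to 10,000 nits
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1.0 + c3 * y_m1)) ** m2

# 100-nit diffuse white sits at roughly 51% of signal range,
# leaving the upper half of the curve for highlights.
print(round(pq_inverse_eotf(100.0), 3))  # ~0.508
print(pq_inverse_eotf(10000.0))          # 1.0
```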
I thought the BBC were aiming for 1.2? Again though, this is for perceptual reasons, and as you know it gets adjusted according to ambient light levels, not because the contrast that comes from a screen is somehow different to the contrast we see in a scene. It's because TVs almost never fill our FOV, so only a small part of what we are looking at is changing, and the ambient light in the room changes our perception of the contrast. It would be different if the screen totally filled our FOV, or if everyone had blacked-out living rooms. I have no problem with the notion of ambient light levels changing perceptions. This happens not just with monitors but with everything we see. This is why it's normal to use viewfinders with monoculars to exclude ambient light, or video villages in blacked-out tents: we are eliminating the otherwise distracting ambient light that alters our perception, so that all we see is the true contrast of the monitor, which should then match the contrast of the scene (or at the very least be very, very close). And this comes back to my original point, which is that there is no difference between the contrast ratio of a screen and the contrast ratio of a scene; they are the same thing. A contrast ratio is a contrast ratio, whether that of the scene or that of the display. A simple test would be to shoot a chart with matching camera and display gammas, and have the monitor next to the chart with the lighting set so that the chart is reflecting the same amount of light off its white chip as the monitor is outputting for its white chip. I bet that if I take a photograph of this, both the chart and monitor will look virtually identical in the resulting picture.
|
|
Re: HDR question
Art Adams <art.cml.only@...>
This is correct from an engineering standpoint. From an artistic standpoint... not so much. There's a difference between capturing images and creating visual stories. For the former, a chart is an absolute reference. For the latter, it's a tool meant to create a consistent starting point when imposing an artistic vision, and ensuring that vision is the one that reaches the audience. A chart is not necessarily an image reference, but can be a look calibration reference.
|
|
Re: HDR question
Nick Shaw
On 17 Mar 2018, at 16:06, alister@... wrote:
Intuitively you could think that with HDR it might be possible to have the absolute luminance of the screen be identical to that of the scene, and therefore require no system gamma or other picture rendering. However, while this might be possible for some low contrast scenes, where there are specular reflections of the sun, or even just a bright sky, in the scene, no current monitor can reproduce that. Look up the BBC's experiments when developing HLG (Google Tim Borer and Andrew Cotton). They found (surprisingly, according to classical colour science) that as the peak brightness of the monitor increased, the system gamma had to be increased to get a perceptual match. For a 1000 nit monitor in a typical environment, HLG uses a system gamma of 1.4. Nick Shaw Workflow Consultant Antler Post Suite 87 30 Red Lion Street Richmond Surrey TW9 1RB UK +44 (0)7778 217 555
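The BBC's finding that system gamma must rise with peak brightness was later formalised: ITU-R BT.2390 gives an extended HLG reference gamma of 1.2 + 0.42·log10(Lw/1000) for a nominal dim surround. (The exact value for a given monitor depends on the assumed viewing environment, which presumably accounts for figures like the 1.4 quoted above.) A sketch of that published formula:

```python
import math

def hlg_system_gamma(peak_nits):
    """Extended HLG reference system gamma per ITU-R BT.2390 for a
    nominal (dim) surround: gamma = 1.2 + 0.42 * log10(Lw / 1000)."""
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

for lw in (500, 1000, 2000, 4000):
    print(lw, round(hlg_system_gamma(lw), 3))
# gamma rises with peak brightness: ~1.074, 1.2, ~1.326, ~1.453
```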
|
|
Re: HDR question
alister@...
Geoff wrote:
> I really hope they aren’t different. What would be the point of my expensive DSC Charts?
|
|
Re: HDR question
MARK FOERSTER
“here burn in their LUTs in this way”
____________________________________ Until log showed up, say five years ago, the only way was to "burn" your LUT into your work. You picked a gamma curve, set a white point, and used traditional methods like protecting whites, NDs in windows, and HMIs for keys and fills. Don't be afraid to use a properly calibrated monitor on set and deliver a fully acceptable picture. I still deliver this way 50% of the time. I'll also mention here that I still see plenty of "post corrected log" looking worse than had I delivered it the way (shocker!) I thought it should look. Cranky blacks especially. Finally, a 10-bit file, even baked with a decent LUT, has enormous headroom and shadow pull; try it in any NLE. Mark Foerster csc Toronto (905) 922 5555
|
|
Re: HDR question
Geoff Boyle
I have to disagree with you on this Alister, because of gamma and the responses of displays the total range may be the same but the way it is represented can be totally different.
From: cml-raw-log-hdr@... <cml-raw-log-hdr@...> On Behalf Of alister@...
> So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same.
|
|
Re: HDR question
Jan Klier
I just upgraded to the Varicam LT. It has the ability to record VLog to the primary memory card and a proxy file to the secondary memory card, while applying a built-in or user supplied LUT to the proxy file. So you get the best of both straight from camera. Jan Klier DP NYC
|
|
Re: HDR question
alister@...
So if a monitor can manage 8 stops and the scene is 8 stops and if the brightness of the monitor can match the brightness of the scene, in the same environment both will look the same. So to say that monitor contrast and scene contrast are different things is not correct. It is the viewing environments that are different and it is the difference in the viewing environment that changes the perception of the image, not differences in the contrast ratios of monitors. Unless I’m greatly mistaken and my whole knowledge of contrast and brightness is flawed.
|
|
Re: HDR question
Jonathon Sendall
I occasionally burn in a LUT to a monitor/recorder (if I have time and have a post plan) so that there’s at least a reference image in proxies, but only if I’m happy with the camera log/raw footage being the only copy in that format.
Jonathon Sendall DP, London
|
|
Re: HDR question
Nick Shaw
On 17 Mar 2018, at 01:34, Seth Marshall via Cml.News <sethmarshall=yahoo.com@...> wrote: That video is not suggesting you burn in the LUT. Indeed, the Odyssey does not even have the option to do that. The LUT is used for monitoring only. I did another video for CD showing how to apply the print-down LUTs in Resolve. Regarding my comments about picture rendering, that is not something either a DP or a colourist really needs to concern themselves with in any practical way. I was just pointing out that screen contrast and scene contrast don't relate directly in the way Alister was suggesting. It is a small insight into the colour science of what is happening in a camera LUT. But in practical terms you don't really need to know why. You just use ACES, or a manufacturer's LUT, or just grade until it "looks right"! Nick Shaw Workflow Consultant Antler Post U.K.
|
|