Topics

Color gamuts

Art Adams
 

I've tried to grade footage shot a fair bit underwater, and there's simply no red there. It's impossible. The possibility of underwater UV is interesting, but I don't think that's the issue. Every time I've aimed a modern single-sensor camera at a test chart, cyan has fallen short. Even most of the prism cameras fell short.

For some reason Panasonic cameras seem to do cyan really well and it's startling when I see it, because I realize that I'm just not used to seeing that color. They've had a number of demos where they play up greenish-blue skylight and it's pretty dazzling. It's not accurate, but it's pretty.

--
Art Adams
Director of Photography
San Francisco Bay Area

daveblackham@...
 

A side issue, but it may be a pointer to the cyan question. This is more a comment from a practical perspective than a scientific test.

We get issues replicating cyan a lot of the time when filming underwater. It's an unforgiving environment for cyan and blue, so we see the problem a lot: when grading we usually end up producing magenta blooming errors, which can look like a magenta cast over the entire image. Using a Formatt UVLAR filter, which is a UV cut filter, helps remove any unwanted UV. This crops up on DSLRs as well as current cinematography cameras, so it appears to be either a sensor design issue or an issue with the UV cut filtration provided in the OLPF filter stack.

In this environment it's rather like the IR issue with heavy ND filters, which cut visible light while transmitting IR to the sensor, resulting in a colour cast in the blacks; UV causes much the same problem at the opposite end of the spectrum. Underwater, UV transmits well whilst the red end of the spectrum is the first to be attenuated in the water column. I just wonder whether the same scenario is occurring in normal filming environments, with UV causing cyan and other issues, but to a lesser extent than we see underwater.

Some cinematographers we work with have commented that Sony blues have always been preferred; I don't know how their sensors or filtration differ from others'. From a practical perspective the field seems to have narrowed recently, with new sensor and OLPF releases from other manufacturers.

Art Adams
 

Thanks, Adam. That makes sense. I do find it interesting how such mathematical gamuts are used for marketing purposes. For example, RED Wide Gamut and Canon's color gamut show a green that's fantastically far out. Clearly the camera isn't capturing that but simply encoding data within that space.

I remember when Sony came out with SGamut3 and the green primary was within the visible spectrum but left of the vector along which the P3 and Rec 709 primaries fell. As a result, lift/gamma/gain alone weren't an option or you'd end up with really weird results. My guess is that everyone is pushing that primary out there to preserve that rotation while expanding their usable gamuts, having learned a lesson from Sony's experience, but that this gamut doesn't represent what's actually possible.

Of course, when companies show this diagram, they don't say that.

The things I've learned this week in my research:

A color model is a way of numerically designating a color. For example: RGB, CMYK, HSL, etc. These are simply numbering systems.

A color space is a specific implementation of a color model. It specifies a white point and primaries, so a specific set of numbers (such as an RGB triplet) translates into one, and only one, specific hue that relates directly to human vision.

A color gamut is a range of values within that color space.

Cameras do not have an inherent color gamut. What information they capture is translated into a color space, where it fills a gamut. That gamut may or may not represent what the camera actually sees: there may be hues outside of that gamut that it can capture, but the design of the gamut in relation to the other primaries may require those hues to be scaled down to fit.
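
A sketch of the point above, in Python: a color space pins RGB numbers to real chromaticities via its primaries and white point, so the same triple names a different real-world color in each space. (Primaries below are the published Rec 709 and P3 values; the derivation is the standard textbook construction, shown only as illustration.)

```python
import numpy as np

def rgb_to_xyz_matrix(prims, white):
    """Standard construction of the RGB->XYZ matrix from the xy
    chromaticities of the three primaries and the white point."""
    def col(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    M = np.column_stack([col(x, y) for x, y in prims])
    scale = np.linalg.solve(M, col(*white))  # so RGB=(1,1,1) hits the white point
    return M * scale

M709 = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)], (0.3127, 0.3290))
MP3  = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], (0.3127, 0.3290))

def chromaticity(xyz):
    return xyz[:2] / xyz.sum()

# The *same* numbers, two different colors:
print(chromaticity(M709 @ np.array([0.0, 1.0, 0.0])))  # x,y = (0.30, 0.60), Rec 709 green
print(chromaticity(MP3  @ np.array([0.0, 1.0, 0.0])))  # x,y = (0.265, 0.690), P3 green
```

Which is why handing wide-gamut files downstream without saying which space they're in goes wrong: the numbers alone don't identify a color.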

--
Art Adams
Director of Photography
San Francisco Bay Area

shanedaly@...
 

“...editors are not colorists, no matter how much my clients with "in-house coloring departments" want to believe this.”

Could. Not. Agree. More.

My current indie edit team have requested flat dailies so they can choose the LUT!


Shane Daly
LA DP
shanedaly.info

Adam Wilt
 

When I see the green primary outside of the 1931 CIE color space, does that mean: It’s a color that doesn’t exist, but is a convenient mathematical construct?

Yes. Early color spaces (SMPTE C, EBU, 601, 709) were based on physically obtainable/realizable phosphor colors. More modern color spaces, hoping to embrace, encode, and compute colors outside the triangle of any three physical primaries, push their mathematical corner points out into “supersaturated” areas so that more (or all) physically realizable colors can be found within the bounds of the color-primary triangle. This allows most/all real colors to be represented as positive-value R,G,B triads.

and/or Does it exist outside the human visual range, and the camera’s color science will map it into a viewable color?

Yes, outside human visual range, and no, the camera will never see that pure value in the first place: that far-out “supergreen” simply makes the math of wide-gamut colors palatable. 

So (he said, making up numbers from a handwaving color space as opposed to any real color space), the greenest green you might ever see would be something like R,G,B = 0,0.7,0. But that nice saturated cyan that’s well outside 709 or P3 might be coded as 0,.99,.99.
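
The handwaving numbers can be checked against real matrices. Below, the published XYZ-to-Rec 709 and XYZ-to-ACES AP0 matrices are applied to a strongly saturated but real cyan (chromaticity x=0.10, y=0.40, chosen purely for illustration): Rec 709 needs a negative red value, while AP0's far-out primaries encode it with all-positive numbers.

```python
import numpy as np

# Published matrices: XYZ -> linear Rec 709 (D65), and XYZ -> ACES AP0.
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])
XYZ_TO_AP0 = np.array([[ 1.0498110175, 0.0,          -0.0000974845],
                       [-0.4959030231, 1.3733130458,  0.0982400361],
                       [ 0.0,          0.0,           0.9912520182]])

# A real, strongly saturated cyan: chromaticity x=0.10, y=0.40, unit luminance.
x, y = 0.10, 0.40
cyan_xyz = np.array([x / y, 1.0, (1.0 - x - y) / y])

print(XYZ_TO_709 @ cyan_xyz)  # red channel goes negative: outside the 709 triangle
print(XYZ_TO_AP0 @ cyan_xyz)  # all positive: the far-out primaries make it encodable
```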

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)

deanan@gmail.com
 

It's the colors visible to a person with average eyesight.

Side note: 1931 was heavily Eurocentric. Color accuracy does differ depending on cultural/linguistic variations, as well as some genetic oddities.

Does anyone know whether the 1976 work was based on a more global sample?

Deanan
Consultant
Playa del Rey,CA 

On Mon, Jan 29, 2018 at 1:21 PM, Art Adams <art.cml.only@...> wrote:
Okay, further question:

When I see the green primary outside of the 1931 CIE color space, does that mean:

It's a color that doesn't exist, but is a convenient mathematical construct?

and/or

Does it exist outside the human visual range, and the camera's color science will map it into a viewable color?

Thanks- Art

Art Adams
 


Do the edit suites have software with waveform monitor/vectorscope in the software?
 
As best I know, they all do—but editors are not colorists, no matter how much my clients with "in-house coloring departments" want to believe this.

I've gotten rough cuts of projects that were presented in log because, while I emailed a LUT to the producer or to post, it didn't make it to the editor and/or the editor didn't realize it was important and/or the editor didn't know how to apply it. Great edits, though.

Editors do wonderfully creative things, but they often don't seem to have a clue as to what happens technically in the real world. For example, I had a project where we shot five high-end talking-head interviews of 60 to 90 minutes each, and the editor wanted the footage in open gate ARRIRAW. He got 3.2K ProRes 4444 instead, because there was no need for that kind of material and the producer was not going to pay for the necessary infrastructure to deal with it on set.

--
Art Adams
Director of Photography
San Francisco Bay Area

Art Adams
 

Okay, further question:

When I see the green primary outside of the 1931 CIE color space, does that mean:

It's a color that doesn't exist, but is a convenient mathematical construct?

and/or

Does it exist outside the human visual range, and the camera's color science will map it into a viewable color?

Thanks- Art

--
Art Adams
Director of Photography
San Francisco Bay Area

Brian Higgins
 

On Sun, Jan 28, 2018 at 8:40 PM Ted Langdell <tedlangdell@...> wrote:

Do the edit suites have software with waveform monitor/vectorscope in the software? And are the users aware of their presence? Not asking if they know how to use them…

Ted

Ted Langdell
(530)301-2931



Mine does, but I have a color/Flame room. Most of our Avid editors and all of the assistants would have no idea what to do with them. 

-b
--

brian higgins | creative director

Flavor

ny • chicago • la • detroit • tokyo

312.706.5500

brian.higgins@...

flavor.tv


Ted Langdell
 


On Jan 27, 2018, at 2:39 PM, Bob Kertesz <bob@...> wrote:

And when I asked if there was a calibrated professional monitor and a 
waveform anywhere in the building, I was met with blank stares.

I guess I should be shocked but then maybe not.

Do the edit suites have software with waveform monitor/vectorscope in the software? And are the users aware of their presence? Not asking if they know how to use them…

Ted

Ted Langdell
(530)301-2931

Dictated into and Sent from my iPhone, which is solely responsible for any weird stuff I didn't catch.

Jeff Kreines
 


On Jan 27, 2018, at 2:11 PM, Bob Kertesz <bob@...> wrote:

the generation of wildly illegal colors which make the camera's matrix writhe in pain 
and turn to garbage downstream when the inevitable compression is applied.

Bob has always had a way with words.

Jeff Kreines
Kinetta
jeff@...
kinetta.com


Art Adams
 

The clipped lights that I was seeing are most likely Sky Panels. It's easy to go out of Rec 709 gamut on those, which is why the desaturation control exists. Apparently someone didn't get that memo.

In one show, I remember skin looking much like it was illuminated by narrow band blue Kinos: reddish skin turns almost black and blotchy, and the only time one sees reddish skin is when something is wrong with it. The actor looked leprous, but in a very pretty and saturated blue kind of way.

--
Art Adams
Director of Photography
San Francisco Bay Area

Jan Klier
 

The lights I was referring to are not high-CRI lights like the Arri SkyPanel, but the much cheaper LED bars stage designers like to place at the bottom of stage walls to provide a nice magenta wash. These have separate RGB LEDs, and the lighting boards typically have separate RGB controls, so when they dial in magenta it's the equivalent of 255/0/255 RGB, which is 100% saturation in whatever color space it gets mapped to. If you bring this type of footage into ScopeBox you see the vectorscope go well beyond limits; in fact, in one clip I looked at recently with the chroma zebra set at 95%, half the clip next to the lights was covered by zebras. If that were to go through a legalizer it might cause clipping.

If they just added a bit of green, to get to 255/50/255 RGB, saturation drops to 80% and things start looking much better, while still looking nice and magenta to the eye.
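
Those numbers are easy to verify: HSV-style saturation is (max minus min) over max, so 255/0/255 is 100% saturated while 255/50/255 drops to about 80%. A quick sketch:

```python
def hsv_saturation(r, g, b):
    """HSV saturation of an 8-bit RGB triple: (max - min) / max."""
    mx, mn = max(r, g, b), min(r, g, b)
    return (mx - mn) / mx if mx else 0.0

print(hsv_saturation(255, 0, 255))   # 1.0: pure magenta, fully saturated
print(hsv_saturation(255, 50, 255))  # ~0.80: a little green tames it
```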

For a better comparison we would have to see a spectrogram of a traditional Leko with a magenta gel next to an LED bar driven to 255/0/255. We would most likely see that the LED's spectrogram is much more extreme. I've been meaning to do that...

But this is more of a real-life issue responding to what Ted observed, separate from the various color gamuts this discussion was originally about.

Jan Klier
DP NYC

On Jan 27, 2018, at 4:05 PM, Art Adams <art.cml.only@...> wrote:

I'm curious, though, why an LED light would saturate colors differently at different angles, as opposed to broad spectrum sources. Is this a reflection issue...?

Bob Kertesz
 

Art mentioned clipping and maybe that's what I'm seeing.

This is happening on at least one of the Sacramento television stations' newscasts I see. The cameras are apparently three-chip HD cameras, from what I can tell when they shoot wide bumper shots of the cameras going to a commercial break.

Another station has a field camera with a horizontal red-green registration error. I'd say something to the engineering department, but I don't know how to put it diplomatically. This has been seen on the air for a year to a year and a half, and this camera gets used regularly. I'm hoping it's just one camera that has this problem.

You would think it would be very visible to anybody either using the
camera or editing the material.

Maybe, maybe not. When the bean counters say it's time to cut costs, the
engineering and/or IT departments always get savaged, and no one seems
uncomfortable with that.

I very recently did some training at a station here in L.A. that does at
least six hours of original broadcasting a day (the rest is syndicated),
has three working stages going all the time, along with at least two
ENG-style crews shooting reality shows, and they have ONE engineer on
staff, whose shift is 3 am to 11 am. Other than that, no one who knows
anything about that stuff is on site.

And when I asked if there was a calibrated professional monitor and a
waveform anywhere in the building, I was met with blank stares. The
'editing suite' where I did the training had three large monitors
mounted on three walls, all different brands, different colorimetries,
all bought at Costco or some other retail store, none having been ever
calibrated or even tweaked, and no one, NO ONE, noticed or cared. When I
asked the editor which monitor he preferred, he said whichever one had
the nicest image with whatever he was working on.

-Bob

Bob Kertesz
BlueScreen LLC
Hollywood, California

DIT, Video Controller, and live compositor extraordinaire.

High quality images for more than four decades - whether you've wanted
them or not.©

* * * * * * * * * *

Ted Langdell
 

Art indeed! An extremely helpful thread on a number of levels.

Thank you all, gentlemen. I've learned a lot.

Art mentioned clipping and maybe that's what I'm seeing.

This is happening on at least one of the Sacramento television stations' newscasts I see. The cameras are apparently three-chip HD cameras, from what I can tell when they shoot wide bumper shots of the cameras going to a commercial break.

Another station has a field camera with a horizontal red-green registration error. I'd say something to the engineering department, but I don't know how to put it diplomatically. This has been seen on the air for a year to a year and a half, and this camera gets used regularly. I'm hoping it's just one camera that has this problem.

You would think it would be very visible to anybody either using the camera or editing the material, and that they would bring it to someone's attention. Maybe they have, and no one has figured out what to do about it.

< shakes head. Shrugs.>

Ted

Ted Langdell
tedlangdell@...
(530)301-2931

Dictated into and Sent from my iPhone, which is solely responsible for any weird stuff I didn't catch.

On Jan 27, 2018, at 1:05 PM, Art Adams <art.cml.only@...> wrote:

as I see a lot of hard color clipping, as well as nasty skin blemishes that are enhanced by saturated blue and purple lighting

Art Adams
 

Wow. Thanks, everyone. Jim especially, as your posts went straight into Evernote.

I'm curious, though, why an LED light would saturate colors differently at different angles, as opposed to broad spectrum sources. Is this a reflection issue...?

I've avoided LED use for a long time, except for a certain few lights, but I'm being forced into using them more and more. I feel like it's a little less of an issue than in the past, but whereas the big questions that defined past looks were "film stock?" and "lenses?", it's now "camera?", "lenses?", "codec?", "log/raw/bit depth?", "gamut?" and "lights used?". Cameras are designed around a couple of standard spectral light sources, and now every light on set has a different gamut.

I recently shot two days of product photography on a monochromatic product and we used all LEDs due to size of crew and heat issues. At one point I had three different kinds of LED lights working, and to ensure the product and background were perfectly monochromatic I had to white balance the camera to the key light, with all other lights turned off, and then go through each light in turn and match it to the first light's white balance—all done manually using a parade RGB waveform and Kelvin/CC controls on the lights and camera. The lights were cool, convenient and quickly adjustable, but required extra time just to match them so the camera saw them all as white.

The plus was that we had a phone screen in the shot at one point, so I was able to manually white balance to the screen and then match all the lights to that white balance, as opposed to the old way of building gel packs for every light.

The RGB lights tend to be a bit unpredictable. I remember when I helped build the Kelvin Tile and we had five dye LEDs and one phosphor. It looked great on an F900 and other prism cameras, but years later when I looked at it on an Alexa it was awful. Flesh tone was cadaverous, while certain colors popped unnaturally. From that I learned to favor phosphor LEDs when possible, although Sky Panels have a range of really handy effects in them (lightning, flash bulbs, fire, police lights, etc.) that save a lot of time. They also have a desaturation control that a number of people working in episodic television should learn more about, as I see a lot of hard color clipping, as well as nasty skin blemishes that are enhanced by saturated blue and purple lighting schemes.

--
Art Adams
Director of Photography
San Francisco Bay Area

Bob Kertesz
 

On 1/27/2018 10:16 AM, Jan Klier wrote:
In most cases it appears to be a side effect of the ever present LED
lighting on sets.

I've been waiting for someone to mention this. With few exceptions, my
experience with LEDs has varied from occasionally terrific, to somewhat
adequate, to generally horrendous. Many people, even those who have been
around a LONG time, seem to believe that 'light is light' and spend the
least amount possible on the instruments (or give in to production on
the money aspect), the result of which is a spectral curve that
resembles a roller coaster. Attempts to fix the issue with gels almost
always make things (much) worse.

And then there are the sets that use what I lovingly refer to as 'party
lights', where the main design criteria was to get the maximum amount of
saturation with no regard to CRI or spectral response or the generation
of wildly illegal colors which make the camera's matrix writhe in pain
and turn to garbage downstream when the inevitable compression is applied.

In general, it seems to me that the CIE triangles become less meaningful
with poor quality LED instruments because no matter how wondrous the
camera's response is, if there's a big dip or big peak in the light's
red spectrum (say), it's going to look weird. As David Stump said in his
terrific book on the subject, CRI isn't everything; if there's a heavy
dip in the spectral response, you can't put that color back in post
because it doesn't exist.

We are far enough along in the process of making LED instruments that
this shouldn't be a problem any longer, but the drive to "Make it
faster, make it cheaper!" still sometimes results in the mumbled
question "Why does this look like crap?"

And just to say something positive: far and away the best LED
instruments I've worked with so far, in terms of high CRI and nice
spectral response, where the images are as I expected them to be, come
from ARRI, who seem to understand the relevance of making LED light as
close to the natural properties of tungsten light as possible, while
also allowing the units to be tortured into looking somewhat awful to
match the other LED lights on set.

-Bob

Bob Kertesz
BlueScreen LLC
Hollywood, California

DIT, Video Controller, and live compositor extraordinaire.

High quality images for more than four decades - whether you've wanted
them or not.©

* * * * * * * * * *

Paul Curtis
 

There's also a practical side: in order to record those cyans, the primaries need to be pushed way outside, and with more limited bit depths and recording formats that would have reduced tonality. That's not really an issue these days, but it may explain the history. Also, I think some of the values can become negative, which can mess up some math.
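
The tonality point can be made concrete: when Rec 709 material lives inside a much wider container, only a fraction of the container's code values correspond to legal 709 colors. A rough estimate below uses ACES AP0 as the wide container; the matrices are the published ones, the grid resolution is arbitrary, and no chromatic adaptation between the white points is attempted, so treat the number as indicative only.

```python
import numpy as np

# Published matrices: XYZ -> linear Rec 709 (D65), and ACES AP0 RGB -> XYZ.
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])
AP0_TO_XYZ = np.array([[0.9525523959, 0.0,           0.0000936786],
                       [0.3439664498, 0.7281660966, -0.0721325464],
                       [0.0,          0.0,           1.0088251844]])
AP0_TO_709 = XYZ_TO_709 @ AP0_TO_XYZ   # AP0 code values -> Rec 709

# Sample the AP0 unit cube on a coarse grid; count codes that are legal 709.
steps = np.linspace(0.0, 1.0, 21)
grid = np.array([[r, g, b] for r in steps for g in steps for b in steps])
rgb709 = grid @ AP0_TO_709.T
inside = ((rgb709 >= 0.0) & (rgb709 <= 1.0)).all(axis=1).mean()
print(f"fraction of AP0 codes that are legal Rec 709: {inside:.2f}")  # well under half
```

The rest of the code range is spent on colors a 709 display can never show, which is exactly the tonality cost at limited bit depths.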

Cheers
Paul

Paul Curtis, VFX & Post | Canterbury, UK (in LA at the mo)

Sent from my iPad

On 27 Jan 2018, at 10:17, Ted King <theodore.rex.king@...> wrote:

>>> I can’t answer Art’s question about why, but it always has seemed to me that my Phase One back (CCD sensor) has better color discrimination in the greens than any of my other cameras: Red, Nikon or Sony.

I believe part of this has to do with the fact that your Phase One produces a true raw file with zero compression of any kind, and thus "sees" more color than the other cameras you mention.

Ted King | DP NYC 
--
TRK
302.690.1742

Ted King
 

>>> I can’t answer Art’s question about why, but it always has seemed to me that my Phase One back (CCD sensor) has better color discrimination in the greens than any of my other cameras: Red, Nikon or Sony.

I believe part of this has to do with the fact that your Phase One produces a true raw file with zero compression of any kind, and thus "sees" more color than the other cameras you mention.

Ted King | DP NYC 
--
TRK
302.690.1742

JD Houston
 


On Jan 27, 2018, at 8:40 AM, Ted Langdell <tedlangdell@...> wrote:

Anyone finding that magenta can be so saturated as to lose detail?  

Yes. If you look at the CIE diagram, it implies that there is just as much magenta as, say, yellows and greens. But that is misleading: magenta is very hard to discriminate, and there are fewer shades between red and blue than you might think. It also saturates very easily. [Technically, magenta doesn't exist as a color, but that is a whole philosophy thing; google "why purple does not exist".]


I see this on a number of television programs produced with HD broadcast cameras. A cyan dress in the same shot looks fine. 

This also seems to affect the color of lips. They can look unnatural, men seeming to be wearing a bit of purplish lipstick. 

Yes, this shows up in a lot of shows. It's because the colorist adds contrast to shots (which increases color saturation), magenta gets exaggerated, and, likely because of a lack of time, the colorist doesn't tweak the lips to a better shade.


Watching a CBS 48 Hours recently, the woman being interviewed appeared to be wearing a magenta-ish shade of lipstick in shots from the primary camera. 

Shots from the B camera off to the side showed a more "natural" looking woman, with what looked like no lipstick. 

This can happen when light direction really matters for color reproduction, as with LED lights: head-on might be fine, but off-axis can show color shifts. It can also happen if one camera is 'painted' to match another camera with a more aggressive correction.



Distracted from the storytelling. 

Makes me wonder--during the program-- about what's being used to shoot the content and whether anyone's paying attention to stuff like this.

Not often.


And why this is occurring. Seems visible across several brands of HD and UHD sets I own.

It is built into 709 color spaces and camera sensitivities, and it has been a problem for a long time. It is getting more obvious now that sets are showing wider color gamuts and are often not really calibrated for Rec 709. The belief is that more color is always better, but it's not so.

…and don’t get me started on VIVID mode. ;-)

Jim



🤔

Ted

Ted Langdell
(530)301-2931

Dictated into and Sent from my iPhone, which is solely responsible for any weird stuff I didn't catch.

On Jan 27, 2018, at 8:10 AM, Art Adams <art.cml.only@...> wrote:

Thanks for all the information, I really appreciate it. The one thing I will point out is that I don't know many cameras that can saturate the cyan chip on a DSC Labs Chroma Du Monde chart to full Rec 709 levels. Maybe this was possible in prism camera days (although I remember having problems then as well) but it seems especially problematic now.

Not that anyone wants an image that saturated... but it's still interesting.

--
Art Adams
Director of Photography
San Francisco Bay Area

JD Houston
 

daring to go deep…..


The CIE diagrams for the different camera color spaces show JUST the encoding available to the camera engineer; they are not plots of the camera's ability to cover the color space. It is even possible for a camera sensor with processing to produce a color on the CIE diagram that is outside these triangles, but the only codes available to represent that color will be within the camera's code gamut triangle. [So the engineers will force it to the nearest available code.] As Daniele says, spectral sensitivity diagrams can show a camera's real ability to discriminate color, but these are often kept secret. (If you match the spectral sensitivity curves, then you have a significant match to the camera's unique character.)

It is because of the position of the Green and Blue Primaries of these camera color encodings that some cyans and greens are lost.
Remember that the CIE diagram is a tilted slice through a 3D color volume that comes to a point at top and bottom.

It can be hard to read the sensitivity diagrams; there is a good discussion in R.W.G. Hunt's book The Reproduction of Colour. When the spectral curves are narrow, the colors that can be captured are far away from white. When the curves are broader than the CIE eye curves, the colors that can be captured are closer to white [e.g. Rec 709].

The essence of it, though, is that the more the curves overlap, the less sensitive those regions of color are. Wherever there is overlap, there is also color crosstalk between RGB. If there is no overlap, then there are gaps where a real-world spectral color can fall that the camera cannot see. Balancing these two demands on the spectral sensitivity of the color filter arrays is an essence of the art of camera design. Some cameras can be OVERLY sensitive to certain colors, or the color being encoded in the camera maker's space actually represents an unreal color. [The green in RED Wide Gamut]
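
The bracketed note about forcing out-of-gamut colors to the nearest available code can be sketched two ways: a blunt per-channel clip, or a gentler desaturate-toward-gray. Both are illustrative strategies only, not any particular camera's actual processing.

```python
import numpy as np

def clip_to_gamut(rgb):
    """Bluntest mapping: clamp each channel. Cheap, but can shift hue."""
    return np.clip(rgb, 0.0, 1.0)

def desaturate_to_gamut(rgb, steps=40):
    """Gentler mapping: blend toward gray (at the same mean level) just
    enough that every channel lands in [0, 1]. Preserves hue better."""
    rgb = np.asarray(rgb, dtype=float)
    gray = np.full(3, np.clip(rgb.mean(), 0.0, 1.0))
    lo, hi = 0.0, 1.0
    for _ in range(steps):          # bisect for the smallest workable blend
        t = (lo + hi) / 2
        cand = rgb + t * (gray - rgb)
        if ((cand >= 0.0) & (cand <= 1.0)).all():
            hi = t
        else:
            lo = t
    return rgb + hi * (gray - rgb)

out_of_gamut = np.array([-0.3, 1.1, 0.9])   # e.g. a too-cyan color in 709 terms
print(clip_to_gamut(out_of_gamut))          # [0.  1.  0.9]
print(desaturate_to_gamut(out_of_gamut))
```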

On the issue of saturated cyans: we lose a lot going from film to digital, because a subtractive color system can capture much more in certain areas thanks to the system of negative and print. An additive color system, like all digital devices, has problems reproducing dark colors, because three colors (color vectors) must add to a location in the color space and fit within the triangle. All additive color spaces come to a point at white AND at black. So as you get to darker colors, the available colors in digital are reduced; they all become desaturated. The same thing happens as you get brighter: there are brighter versions of a color that can be captured by the camera, but they desaturate on an additive display with a peak white.

It is one of the benefits of HDR that having a much greater peak white now allows some of those colors to be shown on the display without too much limitation.
So if you think that everything in the scene has the same brightness but for some reason the flesh tones are better,  you are seeing a real effect.
You are ‘opening up’ the range of displayable colors.
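
The "fewer colors near the bottom" effect can be illustrated with a crude count in an integer encoding: among RGB code triples whose brightest channel is pinned at a level v, how many distinct chromaticities exist? (This conflates quantization with the gamut geometry described above, but the direction of the effect is the same.)

```python
def chromaticities_at_level(v):
    """Count distinct (r,g,b)/sum chromaticities among integer triples
    whose brightest channel equals v (v >= 1): a proxy for how many
    distinguishable hues/saturations exist at that brightness."""
    seen = set()
    for r in range(v + 1):
        for g in range(v + 1):
            for b in range(v + 1):
                if max(r, g, b) == v:
                    s = r + g + b
                    seen.add((round(r / s, 4), round(g / s, 4)))
    return len(seen)

print(chromaticities_at_level(2))    # 19 distinct colors just above black
print(chromaticities_at_level(32))   # thousands a few stops up
```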

Jim Houston
Pasadena, CA




On Jan 27, 2018, at 4:25 AM, Daniele Siragusano via Groups.Io <daniele@...> wrote:

Reproducing strong saturated vivid cyans like we had on film is rather a display problem and not a camera problem, I believe. But then, I am not a camera manufacturer.

The camera triangles plotted in any 2D diagram should be taken with a large grain of salt:

A camera does not have an additive triangle ‘gamut’ like a display; it has spectral sensitivities. It can see all of the visible spectrum, but due to differences in spectral response between human and camera, their responses to the same spectra are simply different. The gamut plots you see from camera manufacturers are just an effort to minimise those metameric differences between their camera and a standard observer. Also, they typically try to accomplish this with just an affine transformation (typically a matrix), which cannot do a perfect job for all spectra, so they need to choose where to minimise the error, and that is typically not for strongly saturated colours but for pastel colours.

I hope some of this makes sense.
Daniele

Jan Klier
 

A bit of a tangent to the conversation here, but I did some research on this not long ago. In most cases it appears to be a side effect of the ever-present LED lighting on sets. Because of their design, which uses separate RGB channels rather than filtering white light, it's easy to drive them in ways that create 100% saturation on a color primary or secondary, which the cameras have a hard time handling. All it would take is for the lighting designers to mix in just 10-20% of the other channels, which would bring saturation down to about 85% and within range of the cameras. But the operators haven't had to worry about that in the past, there is a lack of awareness, and the typical controls make it more likely for them to dial in pure colors.

The other day I was filming a stage and happened to be positioned next to the lighting board, so I asked the operator to mix in some green and showed her the result on the camera's vectorscope. A light bulb went off :-) That little extra green makes almost no difference to the eye, but a big difference to the camera: mostly to the processing and codec, not so much the actual sensor, which is also RGB photosites.

Easy fix, long road to get there…

Jan Klier
DP & Colorist

On Jan 27, 2018, at 11:40 AM, Ted Langdell <tedlangdell@...> wrote:

Anyone finding that magenta can be so saturated as to lose detail?  I see this on a number of television programs produced with HD broadcast cameras. A cyan dress in the same shot looks fine. 

Ted Langdell
 

Anyone finding that magenta can be so saturated as to lose detail?  I see this on a number of television programs produced with HD broadcast cameras. A cyan dress in the same shot looks fine. 

This also seems to affect the color of lips. They can look unnatural, men seeming to be wearing a bit of purplish lipstick. 

Watching a CBS 48 Hours recently, the woman being interviewed appeared to be wearing a magenta-ish shade of lipstick in shots from the primary camera. 

Shots from the B camera off to the side showed a more "natural" looking woman, with what looked like no lipstick. 

Distracted from the storytelling. 

Makes me wonder--during the program-- about what's being used to shoot the content and whether anyone's paying attention to stuff like this.

And why this is occurring. Seems visible across several brands of HD and UHD sets I own.

🤔

Ted

Ted Langdell
(530)301-2931

Dictated into and Sent from my iPhone, which is solely responsible for any weird stuff I didn't catch.

On Jan 27, 2018, at 8:10 AM, Art Adams <art.cml.only@...> wrote:

Thanks for all the information, I really appreciate it. The one thing I will point out is that I don't know many cameras that can saturate the cyan chip on a DSC Labs Chroma Du Monde chart to full Rec 709 levels. Maybe this was possible in prism camera days (although I remember having problems then as well) but it seems especially problematic now.

Not that anyone wants an image that saturated... but it's still interesting.

--
Art Adams
Director of Photography
San Francisco Bay Area

Art Adams
 

Thanks for all the information, I really appreciate it. The one thing I will point out is that I don't know many cameras that can saturate the cyan chip on a DSC Labs Chroma Du Monde chart to full Rec 709 levels. Maybe this was possible in prism camera days (although I remember having problems then as well) but it seems especially problematic now.

Not that anyone wants an image that saturated... but it's still interesting.

--
Art Adams
Director of Photography
San Francisco Bay Area

David Fuller
 

Great discussion!

I can’t answer Art’s question about why, but it always has seemed to me that my Phase One back (CCD sensor) has better color discrimination in the greens than any of my other cameras: Red, Nikon or Sony.


David Fuller
david@...
207-415-1986

On Jan 26, 2018, at 11:27 AM, Art Adams <art.cml.only@...> wrote:

So, does anyone know why saturated cyans tend to be hard for cameras to reproduce? Is that by design, or due to something else?

Nick Shaw
 

On 27 Jan 2018, at 12:25, Daniele Siragusano via Groups.Io <daniele@...> wrote:

Reproducing strong, saturated, vivid cyans like we had on film is rather a display problem and not a camera problem, I believe. But then, I am not a camera manufacturer.

If you look at Rec.709 and even P3 on a CIE xy plot, you can see that considerably more of the green/cyan area is cut off than in any of the plots of wide-gamut spaces with real primaries (the right-hand three) in the article. In fact, I believe the xy coordinates of the cyan chip of a Macbeth chart are well outside the Rec.709 gamut. So I agree with Daniele that it is a display problem, not a camera one.
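As a quick sanity check on that kind of claim, here is a minimal sketch that tests whether an xy chromaticity falls inside the Rec.709 primary triangle. The "cyan" test point is illustrative, not a measured ColorChecker value:

```python
# Test whether a CIE 1931 xy chromaticity lies inside the Rec.709 primary
# triangle. Primaries per ITU-R BT.709; the saturated cyan-green test point
# below is illustrative, NOT a measured Macbeth/ColorChecker value.

R = (0.640, 0.330)
G = (0.300, 0.600)
B = (0.150, 0.060)

def cross(o, a, b):
    """Z-component of the 2D cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_709(p):
    """True if chromaticity p is inside (or on an edge of) the R-G-B triangle."""
    s1 = cross(R, G, p)
    s2 = cross(G, B, p)
    s3 = cross(B, R, p)
    # Inside iff all three signed areas share a sign (handles either winding).
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

print(inside_709((0.3127, 0.3290)))  # D65 white: True
print(inside_709((0.05, 0.40)))      # saturated cyan-green: False
```

The same test works against any display triangle by swapping in its primaries.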


The camera triangles plotted in any 2D diagram should be taken with a large grain of salt

The plots you see are of the colour spaces defined by each camera manufacturer. They are actual colour spaces, with defined primaries. But, as Daniele says, the matrices used to map camera spectral sensitivities to those defined colour spaces give larger and larger errors the further you get from "normal" colours like skin tones.

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)20 8892 3377
+44 (0)7778 217 555

Daniele Siragusano
 

Reproducing strong, saturated, vivid cyans like we had on film is rather a display problem and not a camera problem, I believe. But then, I am not a camera manufacturer.

The camera triangles plotted in any 2D diagram should be taken with a large grain of salt:

A camera does not have an additive triangle ‘gamut’ like a display; it has spectral sensitivities. So it can see all of the visible spectrum, but because the spectral response of a camera differs from that of a human observer, its response to the same spectra is simply different. The gamut plots you see from camera manufacturers are just an effort to minimise those metameric differences between their camera and a standard observer. Typically they also try to accomplish this with just an affine transformation (usually a matrix), which cannot do a perfect job for all spectra, so they need to choose where to minimise the error; that choice typically favours pastel colours rather than strongly saturated ones.

I hope some of this makes sense.
Daniele
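A minimal numerical sketch of the matrix fit Daniele describes: solve a weighted least-squares problem for a 3×3 matrix mapping camera responses to target tristimulus values. All numbers below are synthetic, invented purely to show the mechanics; real calibration uses measured spectral data for many patches.

```python
import numpy as np

# Synthetic illustration of fitting a 3x3 camera matrix by least squares.
# The camera responses and target XYZ values are MADE UP for this sketch.

camera_rgb = np.array([  # one row per training patch
    [0.45, 0.40, 0.10],  # skin-tone-like patch
    [0.20, 0.50, 0.15],  # foliage-like patch
    [0.10, 0.30, 0.55],  # sky-like patch
    [0.60, 0.55, 0.50],  # neutral grey
    [0.05, 0.45, 0.50],  # saturated cyan
])
target_xyz = np.array([
    [0.40, 0.36, 0.12],
    [0.16, 0.24, 0.10],
    [0.18, 0.20, 0.55],
    [0.52, 0.55, 0.58],
    [0.18, 0.30, 0.56],
])

# Weights let the fit favour "normal" colours over saturated ones, which is
# why errors grow for spectra far from the training emphasis.
weights = np.array([3.0, 2.0, 2.0, 3.0, 0.5])
W = np.sqrt(weights)[:, None]

# Solve for M minimising || W * (camera_rgb @ M.T - target_xyz) ||^2
M_T, *_ = np.linalg.lstsq(W * camera_rgb, W * target_xyz, rcond=None)
M = M_T.T

errors = np.abs(camera_rgb @ M.T - target_xyz).max(axis=1)
print(np.round(errors, 4))  # per-patch residual; a 3x3 fit is generally inexact
```

Down-weighting the saturated patch mirrors the manufacturers' choice: the fit is tightest around the heavily weighted "normal" colours, at the expense of the extremes.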

Art Adams
 

Daniele, thanks! That makes a lot of sense as well.

So, does anyone know why saturated cyans tend to be hard for cameras to reproduce? Is that by design, or due to something else?

Thanks.

--
Art Adams
Director of Photography
San Francisco Bay Area

Daniele Siragusano
 

To add to Nick's great post:
The missing bits are very extreme, saturated colours, which we are not really good at discriminating quantitatively.
Or in other words: these are colours that occur rather seldom in reality, and therefore we have no idea whether the colour reproduction is correct or not.

Daniele

Art Adams
 

Thanks, Nick. That makes perfect sense. I thought it might have something to do with the reason that cameras have such a hard time saturating cyan hues, but apparently that's a different thing.

Geoff... sorry. :) I shall strive to do better in the future. I suspect the workshop went fine anyway.

--
Art Adams
Director of Photography
San Francisco Bay Area

Nick Shaw
 

On 26 Jan 2018, at 01:07, Art Adams <art.cml.only@...> wrote:

I'm curious as to why most of them eliminate bluish greens and greenish blues.

You need to bear in mind that a CIE 1931 xy plot is not perceptually uniform. This means that although it may look as if a huge proportion of the 'visible colours' is cut off in the greens, it is not as significant as you might think.

And, as is pointed out in the article you link to, Rec.2020 covers almost all of Pointer's gamut – the colours of reflective objects in the real world.

Compare a plot of Rec.2020 in CIE xy with one in CIE u'v', a colour space designed to be more perceptually uniform, and you will get a different idea of the "missing colours":


Plots created using Colour Science for Python
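The xy to u'v' mapping behind those plots is a simple projective transform (CIE 1976 UCS). A minimal sketch, converting the Rec.2020 primaries and D65 white by hand:

```python
# Convert CIE 1931 xy chromaticities to CIE 1976 u'v' (a diagram designed to
# be more perceptually uniform). Rec.2020 primaries per ITU-R BT.2020.

def xy_to_uv(x, y):
    """CIE 1976 UCS: u' = 4x / (-2x + 12y + 3), v' = 9y / (same denominator)."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

rec2020 = {
    "R": (0.708, 0.292),
    "G": (0.170, 0.797),
    "B": (0.131, 0.046),
    "W (D65)": (0.3127, 0.3290),
}

for name, (x, y) in rec2020.items():
    u, v = xy_to_uv(x, y)
    print(f"{name}: xy=({x}, {y}) -> u'v'=({u:.4f}, {v:.4f})")
```

Plotting the same triangle in both diagrams shows why the apparent "missing greens" in xy shrink considerably in u'v'.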

Nick Shaw
Workflow Consultant
Antler Post
Suite 87
30 Red Lion Street
Richmond
Surrey TW9 1RB
UK

+44 (0)20 8892 3377
+44 (0)7778 217 555

Geoff Boyle
 

A day late, Art!

I could have really used this in my workshop on ACES for the NSC.

 

Cheers

Geoff Boyle FBKS
Cinematographer
EU Based
geoff@...
Skype  geoff.boyle
mobile: +31 (0) 637 155 076
www.gboyle.co.uk

-- 

 

From: <cml-post-vfx-aces@groups.io> on behalf of Art Adams <art.cml.only@...>
Reply-To: <cml-post-vfx-aces@groups.io>
Date: Friday, 26 January 2018 at 02:10
To: <cml-post-vfx-aces@groups.io>
Subject: [cml-post-vfx-aces] Color gamuts

 

I'm doing some research on color gamuts and stumbled across this handy chart:

Argyris_Theos_cml
 

Because that's the way it works: any gamut will reproduce what's inside its triangle. It's math. To get more visible colors you need to extend the gamut so that it includes non-existent (never-to-be-used) values, e.g. ACES.
I have the impression that these "non-existent colors" may reproduce in the form of weird or unexpected colors, or artifacts. Hence the need to deliver in a color space whose primaries lie within the CIE diagram, like P3, 2020 or 709.
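That mechanism is easy to demonstrate: push a chromaticity that sits outside the Rec.709 triangle through the standard XYZ-to-Rec.709 matrix and one channel comes out negative, i.e. a "non-existing" value no display can emit. The cyan chromaticity below is illustrative:

```python
# Convert an xy chromaticity (at Y = 1) to linear Rec.709 RGB using the
# standard XYZ -> Rec.709 (D65) matrix from IEC 61966-2-1 / ITU-R BT.709.
# Colours outside the 709 triangle come out with a negative component.

M = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def xy_to_rec709(x, y):
    X, Y, Z = x / y, 1.0, (1.0 - x - y) / y  # xyY -> XYZ at Y = 1
    return tuple(row[0] * X + row[1] * Y + row[2] * Z for row in M)

r, g, b = xy_to_rec709(0.16, 0.32)  # an illustrative saturated cyan
print(round(r, 3), round(g, 3), round(b, 3))  # R is negative: out of gamut
```

A wide-gamut encoding like ACES AP0 keeps such values representable as numbers, but a 709 display still has to clip or map them to something it can actually show.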
Best

Argyris Theos, gsc
DoP, Athens Greece,
theos@...
+306944725315
Skype Argyris.Theos
www.vimeo.com/argyristheos
via iPhone

On Jan 26, 2018, at 3:07 AM, "Art Adams" <art.cml.only@...> wrote:

I'm curious as to why most of them eliminate bluish greens and greenish blues.

Art Adams
 

Hi all-

I'm doing some research on color gamuts and stumbled across this handy chart:


I'm curious as to why most of them eliminate bluish greens and greenish blues.

Thanks.

--
Art Adams
Director of Photography
San Francisco Bay Area
