
Skintones - Are They Affected at the Sensor Level? And Motion Cadence Question

Ed David <ed.david@...>
 

Okay two-parter and this is from me, a technical luddite:

Is the concept of a digital camera's ability to make nice or not-so-photorealistic skintones a factor based at the sensor level or is it further down the pipeline?

For example, we can all agree the Alexa has beautiful skintones - is this the result of how the sensor is developed (its CMOS array of pixels and how it processes them), or is it more in its codec and its encoding of Log C?

Another example: most of us agree that we don't like Sony's skintones as much (SORRY SONY, I LOVE YOU, I OWNED LIKE 8 of your cameras) - in that they have more of an orange hue. Or that Red cameras have a more greenish cast, at least in the Dragon skintone-optimized OLPF days.

And then they release a new OLPF and the skintones improve.  And then again with REDWideGamut, better skintones.  So what's going on - is it all just sensor-level stuff, or is it everything?  

And now second part:  digital cameras and their motion cadence.  To my eyes, the motion of digital cameras all feel different at 24 FPS.  I really love the motion feeling of red cameras at 180 degree shutter.  To me, alexa cameras feel a little too blurry in their motion.  And the king for me of recent was the Sony F65 with a mechanical shutter.  What is happening at the shutter technology - is there some kind of interpolation of how frames are put together along with digital shutter that makes a camera have motion that feels one way and another the others?  Is this affected by the codec of how motion blur is captured?

Another case in point, I feel the motion is more pleasing to my eyes of the blackmagic pocket camera vs the newer ursa mini pro 4.6k cameras - again is this something that is affected by the programmers?

I hope I don't start a flame war with camera loyalists.  I am really agnostic about cameras and lenses.  And these are just my opinions on what I like and don't like. 

Thank you!

ed david | cinematographer

 

WWW.EDDAVIDDP.COM   /   917.449.0739

Philip Holland
 

Hi Ed,


I'll lightly tackle the sensor/color question.


There are a lot of factors in the imaging chain that result in the final color of what's presented at the end of the line.


Some quickies:

 - The base sensor technology itself and particularly how it's made/what it's directed at

 - Pixel design even comes into play

 - The in camera sensor color calibration

 - Any sort of color filter or IR cutting/blocking/absorption filter, and how its target is calibrated toward the general sensor 

 - (if we are talking RED) their various color calibrations for each OLPF (which nowadays are really just color filters; the low pass itself is actually in the sensor coverglass)

 - Color Science

 - Underlying color processing engine (for instance, not all cameras have ideal rec.2020 image processing pipelines)


Outside of that there's a great deal of potential image processing within the cameras.  Some produce pleasing results, some are not exactly ideal towards a discerning eye or post QC.  Some cameras have settings you can get into; most have settings and image processors we'll never have access to.


Linearity and color temperature are interesting concepts when digging deep into what each sensor can do.  For instance, Monstro is notably more linear than Dragon and has a great deal more color information.  This is tied to the sensor tech as a whole and a few other things, but you can shoot a side by side and clearly see some differences.  RED's new IPP2 workflow improved a great deal of things, such as potentially ugly out-of-gamut clipping, improved tonal dimension, a newer highlight extension algorithm, and even a new debayer algorithm.  Much of this equates to better overall image quality, which is what you are seeing.
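To make the highlight-handling point concrete, here's a generic sketch of hard clipping versus a soft highlight roll-off. This is not RED's actual IPP2 math; the knee position and curve shape are invented purely for illustration.

```python
import numpy as np

# Linear scene values; anything above 1.0 is past the encodable limit
x = np.linspace(0.0, 2.0, 9)

# Naive approach: hard clip - every highlight above 1.0 becomes the same value
hard_clip = np.minimum(x, 1.0)

# Generic soft roll-off: linear below a knee, then an exponential
# approach toward 1.0, so highlight tonality is compressed, not destroyed
knee = 0.8
soft = np.where(x < knee, x,
                1.0 - (1.0 - knee) * np.exp(-(x - knee) / (1.0 - knee)))

# The soft curve keeps some separation between 1.25, 1.5, 1.75 and 2.0,
# where the hard clip maps them all to exactly 1.0
```

The point is only that a smarter transfer curve preserves tonal dimension in the highlights, which is one of the visible differences a pipeline revision can make.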


There's a lot that goes into the color and tonality we experience and capture.  There's a lot of fun stuff to dig into from there in terms of gamma curves and color space related things.  


Motion cadence is another interesting topic.  Some of it can be tied to readout, global versus rolling shutter, integration time, even dual-gain related things.  There are certain types of image processing that can affect the "drawn image", but generally most higher end digital cinema cameras are doing a pretty darn good job.  As for general progressive playback, that is a known quantity.  Interestingly, and something to note particularly in relation to HDR mastering, all of these cameras have different levels of total captured dynamic range, and that has subtle potential impacts: contrast levels, peak highlight values, and how you finish your material may affect things like judder.


Phil


-----------------
Phil Holland - Cinematographer
http://www.phfx.com
http://www.imdb.com/name/nm0390802/
818 470 0623


From: cml-raw-log-hdr@... <cml-raw-log-hdr@...> on behalf of Ed David <ed.david@...>
Sent: Thursday, March 29, 2018 5:37:49 AM
To: cml-raw-log-hdr@...
Subject: [raw-log-hdr] Skintones - Are They Affected at the Sensor Level? And Motion Cadence Question
 


alister@...
 

I would add that motion and cadence perception can be affected by factors such as noise, contrast/sharpness and depth of field. A camera with a lot of fixed pattern noise can look a little odd when you pan, as the noise remains largely static while the images move; a lot of random noise can help smooth an otherwise juddery pan. An image with a lot of sharp edges will tend to appear to skip more, as your eyes tend to notice sharp, well-defined edges moving rather than blocks of color. So a shallow depth of field shot will tend to exhibit less judder than a shot with a very deep DoF. It can be tough trying to get a pleasing cadence from a lot of the small-sensor handycams, as often the image processing adds a lot of edge sharpening or contrast enhancement, and that combined with the typically deep DoF can lead to juddery-looking images at 24p etc. Bigger sensor, better image processing with no added sharpening, shallow DoF etc. and the motion tends to be more pleasing. 

Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977


www.xdcam-user.com    1.5 million hits, 100,000 visits from over 45,000 unique visitors every month!  Film and Video production techniques, reviews and news.

alister@...
 

And if you are looking at the cadence on a TV, who knows what the TV is doing to your motion. It’s getting harder and harder to deliver a controlled cadence to home viewers, as most modern TVs incorporate all kinds of sophisticated (or less sophisticated) motion smoothing and motion estimation processes that turn your 24p into 60p or 120p or beyond, with highly variable quality. 
 
Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977



Geoff Boyle
 

Interesting that Ed mentions the mechanical shutter.
I did blind tests a few years ago at the Hannover workshops with an F65 and Alexa Studio.
We shot scenes that had a lot of motion in them on both cameras with and without the mechanical shutter.
We projected at 2k in a good theatre and the results were 100% in favour of the mechanical shutter.
I'll remind you, these were totally blind tests.
I had intended to reconstitute the tests to include the Tessive filter, but unfortunately they were taken over before I could do it.

Cheers
Geoff Boyle NSC
Netherlands



alister@...
 


On 29 Mar 2018, at 17:47, Geoff Boyle <geoff@...> wrote:

Interesting that Ed mentions the mechanical shutter.
I did blind tests a few years ago at the Hannover workshops with an F65 and Alexa Studio.
We shot scenes that had a lot of motion in them on both cameras with and without the mechanical shutter.
We projected at 2k in a good theatre and the results were 100% in favour of the mechanical shutter.


An interesting test, and I’m not surprised by the result; I think the F65 always looks nicer with the mechanical shutter. It would be interesting to do a similar test to compare a global shutter camera and a mechanical shutter camera. My theoretical brain says there shouldn’t be a difference, but I also suspect that quite possibly they will look different. 

Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977




David Brillhart
 

Ed,

I agree with your assessment of the various sensors you mentioned.  I think it’s extremely helpful to know about these characteristics prior to choosing your camera, if you are so lucky.  Over my decades of filming, I’ve focused on crafting the light to meet my artistic and client needs.  (My professional work has been limited to digital acquisition.)  No matter what the sensor’s characteristics, if there was time and a decent monitor on set, I’d often filter my key or fill to offset what was unwanted in the colorspace.  This sounds too simple as I write it, but my roll of gels is one of my most important tools.  The ability today to shoot most everything in ‘existing light’ settings seems to have created this desire for the perfect camera sensor.  For me, however, most of the fun is making whatever I have, in whatever situation, work.

 

My opinion about motion cadence - don’t pan unless you have something foreground to follow.  I do pan, but I bristle at the cadence issue even on the best funded features.  I’m certain in terms of ‘electronic shutters’ this is a technology that is in need of disruption.

 

David Brillhart

Cinematographer, Sacramento

www.brillhart.com

410-707-3552 

Ed David <ed.david@...>
 

Thank you guys so much for your answers - all very interesting points.

And after watching 35mm film vs alexa vs f65 vs red epic from CML - https://vimeo.com/105714783 - I am now thinking alexa matches the motion of 35mm the most.

I'm going to try to add some film grain to the clip to see if that helps as well
ed


ed david | cinematographer
WWW.EDDAVIDDP.COM   /   917.449.0739

dhisur@...
 

IMHO, all current electronic cameras use a two-axis grid (the photosites in lines and columns), and as such anything acquired will have this underlying structure; when panning the camera it is, in my opinion, this underlying structure that generates artefacts like judder.
 
At Fotoquímica Films SpA we prefer to shoot on film. With the Vision3 color negative platform and proper ECN-2 development (thus using the colour developing agent CD-3) we get a very wide exposure latitude, natural skin tones, and a kind of image that is three-dimensional (because the emulsion loaded in the camera is formed of microscopic silver halide crystals).
BTW the DFT Scanity HDR is a great scanner available in the industry, delivering DPX RGB 16-bit log/flat image sequences. Other scanners available: the Arri Scan, FilmLight Northlight, Lasergraphics scanners, etc.

But then there is also the on-going R&D from Eastman Kodak regarding the announced new Ektachrome 100D that may eventually set new standards in motion picture and photo color reversal.

Best regards,
Daniel Henriquez Ilic
Filmmaker 
Santiago de Chile

Adam Wilt
 

We projected at 2k in a good theatre and the results were 100% in favour of the mechanical shutter.
I had intended to reconstitute the tests to include the Tessive filter…

The mechanical shutter with its penumbral sweep effectively fades-in / fades-out its exposures; an electronic shutter has an instant-on / instant-off, square-wave exposure profile (I’m talking about the effect at any point on the sensor surface, quite apart from global vs. rolling effects over the entire sensor). That slight “rounding off” of the mechanical shutter’s exposure window reduces high-frequency aliasing that increases perceptible judder / strobing. 

The Tessive Time Filter went even further, offering exposure profiles all the way from square-wave to nearly sinusoidal, and it yielded very smooth, highly watchable 24 fps images at pretty much any panning rate. At the time I thought it was one of the biggest advances in 24fps motion rendering ever, and I regret that the Time Filter didn’t get any wider exposure (pun intended) before Tessive was taken over.

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)

alister@...
 


On 29 Mar 2018, at 21:30, Adam Wilt <adam@...> wrote:

The mechanical shutter with its penumbral sweep effectively fades-in / fades-out its exposures; an electronic shutter has an instant-on / instant-off, square-wave exposure profile (I’m talking about the effect at any point on the sensor surface, quite apart from global vs. rolling effects over the entire sensor). That slight “rounding off” of the mechanical shutter’s exposure window reduces high-frequency aliasing that increases perceptible judder / strobing. 

But isn’t the fade in/fade out lost when the frame is recorded or presented on a video display? After all, each individual frame isn’t faded in and out; it’s on or off. I can see how a mechanical shutter would add a fade effect on projection, but when each frame is recorded as a video frame I would have thought the effect was lost, unless the frame rate was several times the shutter speed.


Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977




Keith Putnam
 

On Thu, Mar 29, 2018 at 12:57 pm, <dhisur@...> wrote:
this underlying structure that generates the artefacts like judder.
24fps judder will exist in any 24fps capture system, no matter what medium is at the "film" plane.

On Thu, Mar 29, 2018 at 01:35 pm, Adam Wilt wrote:
The mechanical shutter with its penumbral sweep effectively fades-in / fades-out its exposures
Yes, a spinning mirror shutter is not a global shutter; not all of the frame is exposed simultaneously.

Keith Putnam
Local 600 DIT
New York City

JD Houston
 


On Mar 29, 2018, at 1:30 PM, Adam Wilt <adam@...> wrote:

biggest advances in 24fps motion rendering ever, and I regret that the Time Filter didn’t get any wider exposure (pun intended) before Tessive was taken over.

It is still available, I think, as RealD TrueMotion (even in the cloud for processing):
  • Runs on MacOS, and coming soon to Linux and a cloud-based service from Sundog Media Toolkit.

But it seems to be one of those ideas everyone likes but few people pay for.  Not sure how they are doing these days.  I think Tony Davis is still at RealD.

Jim


Jim Houston
Consultant, Starwatcher Digital, Pasadena, CA

Adam Wilt
 

But isn’t the fade in/fade out lost when the frame is recorded or presented on a video display. 

No, the shaped exposure window attenuates high-frequency aliases during capture, before they are inextricably embedded in the captured image. Tessive images were clearly smoother than “normal” electronic-shuttered images even on the coarse and primitive LCD displays of that long-distant era (2011, grin).

I’m embedding an image in my email and with any luck it’ll appear; if not, it’s towards the bottom of the page at “Cine Gear Expo LA 2011” by Adam Wilt on ProVideo Coalition.


Using a Gaussian sampling window as the Time Filter did works in the temporal domain just like Gaussian sampling in the spatial domain. Put “gaussian sampling to reduce aliasing” into your favorite search engine and you’ll get enough results to keep you busy for hours.
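A quick way to see the effect numerically (a toy sketch; the Gaussian width here is arbitrary, and I’m just comparing magnitude spectra of two equal-exposure windows):

```python
import numpy as np

fps = 24.0
T = 1.0 / fps                       # frame period, seconds
t = np.linspace(0, T, 4096, endpoint=False)

# Idealized electronic shutter: instant-on/instant-off, 180 deg (open T/2)
square = ((t >= T / 4) & (t < 3 * T / 4)).astype(float)

# Gaussian window scaled to gather the same total light
sigma = T / 8
gauss = np.exp(-0.5 * ((t - T / 2) / sigma) ** 2)
gauss *= square.sum() / gauss.sum()

# Magnitude of each window's temporal frequency response
f = np.fft.rfftfreq(t.size, d=T / t.size)
H_sq = np.abs(np.fft.rfft(square))
H_g = np.abs(np.fft.rfft(gauss))

# Compare the high-frequency tails (well above the 12 Hz Nyquist limit
# of 24 fps sampling); response out here aliases into judder/strobing
tail = f > 2 * fps
leak_sq = H_sq[tail].sum() / H_sq.sum()
leak_g = H_g[tail].sum() / H_g.sum()
# leak_g comes out far below leak_sq: the square window's sinc-shaped
# response decays only as 1/f, while the Gaussian's falls off much faster
```

Same idea as Gaussian anti-aliasing in the spatial domain, just applied along the time axis.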

There’s a whole bunch of artifacts possible depending on the presentation technology (don’t get me started on DLP artifacts during eye saccades) but these are in addition to temporal capture artifacts. Once you’ve captured the spinning wagon wheel spinning backwards, you’re stuck with it.

It is still available I think as RealD TrueMotion

That’s a different way of skinning the same cat: TrueMotion (formerly Tessive Time Shaper) uses a 360º shutter and high-frame-rate (120fps+) capture to record a scene, and then lets you selectively pull groups of frames from that stream to synthesize lower frame rates (e.g., 24fps) with “shutter angles” determined in post by combining more or fewer source frames into each output frame, dimming the leading & trailing source frame(s) as needed to “shape” the synthetic shutter window. I saw a demo (I think by Jim DeFilippis) at the Tech Retreat a few years ago: it worked better than it had any right to, disgustingly well given how “coarse” the source sampling was. 

Mind you, you do need an HFR-capable camera and plenty of light, and the storage requirements are scandalous!
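The frame-combining idea is easy to sketch (a toy 1-D version; the weights and frame counts are invented, not RealD’s actual profiles):

```python
import numpy as np

# Toy 1-D "scene": an 8-pixel bright bar drifting one pixel per hi-speed frame
hfr_fps, out_fps = 120, 24
step = hfr_fps // out_fps          # 5 source frames per synthesized frame
width, n_src = 64, 40

src = np.zeros((n_src, width))
for i in range(n_src):
    src[i, i:i + 8] = 1.0

def synth_frame(frames, weights):
    """Weighted average of consecutive HFR frames = one synthetic exposure."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), frames, axes=1)

# "Shutter shapes" expressed as per-source-frame weights (invented profiles):
square_180 = [1, 1, 1, 0, 0]       # ~180 deg: only the first half of the period
shaped     = [0.5, 1, 1, 1, 0.5]   # ~360 deg, leading/trailing frames dimmed

out_square = [synth_frame(src[i:i + step], square_180)
              for i in range(0, n_src - step, step)]
out_shaped = [synth_frame(src[i:i + step], shaped)
              for i in range(0, n_src - step, step)]

# The shaped frames carry a longer, softly tapered blur trail, so the
# synthesized 24 fps stream has less of an abrupt gap between frames
```

Dimming the leading and trailing source frames is what “shapes” the synthetic shutter; with real 120 fps footage the same weighted sum runs per pixel across full frames.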

Adam Wilt
technical services: consulting / coding / camerawork
Vancouver WA USA (no, not that Vancouver, the other one)

deanan@gmail.com
 


The dyes on the sensor, the IR/UV cut, and the color pipeline have the biggest effects on skintone color.

The spectral response of the sensor/dyes is further shaped by the IR/UV cut, which in turn is massaged into something pleasing by the color processing.
If the dyes and IR/UV cut filters aren't balanced favorably for skintones, the color pipeline has to work against that.

Latitude, precision, olpf (separate from IR/UV cut), lossy compression, and lens qualities also affect skintones but in different ways (grading, texture, etc).
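A toy numpy sketch of that chain (all the curves below are invented Gaussians, not any real camera’s dyes or filter):

```python
import numpy as np

wl = np.arange(400.0, 701.0, 10.0)   # wavelength axis, nm

def bump(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Invented dye responses for a Bayer sensor
red, green, blue = bump(600, 40), bump(540, 40), bump(460, 40)

# Invented IR-cut filter: passes below ~650 nm, rolls off above
ir_cut = 1.0 / (1.0 + np.exp((wl - 660.0) / 10.0))

# Effective channel response = dye response x filter transmission
red_eff, green_eff = red * ir_cut, green * ir_cut

# The IR cut mostly trims the red channel's long-wavelength tail;
# whatever spectral shape is left is what the color pipeline must
# massage into pleasing skintones (or fight against)
red_loss = red.sum() - red_eff.sum()
green_loss = green.sum() - green_eff.sum()
```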

Deanan DaSilva
Playa del Rey, Ca

On Thu, Mar 29, 2018 at 5:37 AM, Ed David <ed.david@...> wrote:

Is the concept of a digital camera's ability to make nice or not-so-photorealistic skintones a factor based at the sensor level or is it further down the pipeline?



Daniel Drasin
 

Keith Putnam writes: Yes, a spinning mirror shutter is not a global shutter; not all of the frame is exposed simultaneously.

-------------------

Well, the net effects (gross and subtle) of a spinning shutter depend on shutter-opening angle, shutter orientation (focal-plane vs 45 degrees) and lens telecentricity. At one extreme it's global (with fade-in/out), and at the other extreme a rolling shutter.

Dan Drasin
Producer/DP
Marin County, CA


Mark Weingartner, ASC
 


On 29 Mar 2018, at 17:41:30, Adam Wilt <adam@...> wrote:

it worked better than it had any right to, disgustingly well given how “coarse” the source sampling was. 

Jon Erland at the Pickfair Institute took a slightly different tack in this sailboat race - we photographed a variety of types of motion (stick fighting, Tai-chi, archery, etc) at a number of frame rates and also at 250 fps using a Phantom…    and then went back and generated various playback speeds and shutter angles by picking and processing different numbers of frames.

As a creative exercise it was really interesting -   You could easily create 24 fps playback frames with the motion blur characteristics of 12fps or 6fps photography…   or 60fps or 120fps with the motion blur you associate with 24fps.

Not just a good analysis tool but a really interesting production tool if one were to go down that path

One thing interesting about this approach is that integrating frames from high-speed photography can be done without resorting to Twixtor or any optical flow frame interpolation…  I won’t say it is an “organic” approach exactly, but it is additive without synthesizing…  thus avoiding some of the odd artifacts that re-speeding can cause.


As Adam pointed out, this is pretty storage-intensive at the on-set photographic stage of the process…  but hey - storage is getting cheaper every day…  I just bought a couple of 4TB hard drives for about twenty-five bucks per TB…   


Cheers,

Mark Weingartner, ASC
LA-based DP/VFX Supervisor

alister@...
 

I would have thought that any fade-in/fade-out effect with a mechanical shutter was mostly a function of the edge of the shutter blade creating a shadow/cut-off that isn’t entirely instant due to diffraction, rather than the sweeping motion. The shutter sweep will create its own flavour of scan artefact, which may exhibit similarities to an electronic rolling shutter. But it is all these tiny things that give film or certain cameras their individual looks. 

Thanks for the images posted though - a picture tells a thousand words!

Alister Chapman

DoP - Stereographer
UK Mobile +44 7711 152226
US Mobile +1(216)298-1977




dhisur@...
 

"24fps judder will exist in any 24fps capture system, no matter what medium is at the "film" plane."
Fair enough... but using a fixed grid of photosites (a pattern of lines and columns) instead of the silver salts of a photosensitive emulsion is likely to contribute to more judder.

So, in my opinion, all textures, for example skin and fabrics, are smoother, more lifelike, when shot on film than when their reflected light is acquired by a grid of photosites, which is non-continuous (not all the surface of the sensor integrates the incoming light; there are 'holes' in between each photosite).

If the 'electronic digital camera' has only one sensor, then demosaicing is also required (meaning another pattern, the color filter array or color filter mosaic, must be decoded through an algorithm) to finally create an RGB colour image.


Best regards,
Daniel Henríquez Ilic
Filmmaker / Photographer
Tech. Consultant
Santiago de Chile

deanan@gmail.com
 


On Fri, Mar 30, 2018 at 11:19 AM, <dhisur@...> wrote:
​>​
Fair enough... but using a fixed grid of photosites (a pattern of lines and columns) instead of silver salts of a photosensitive emultion, is likely to contribute to more judder.


Judder is affected by shutter cadence (open time vs. closed time, e.g. at 180°), the gap between frames caused by the cadence, motion blur (or the lack of enough blur to cover that gap), and the contrast ratio between the elements in the scene that are moving with respect to that gap.

We often view digital on higher contrast devices than in a theater, which amplifies the perception of judder. 
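Those relationships are easy to put rough numbers on (a back-of-envelope sketch; the pan speed is arbitrary):

```python
# Back-of-envelope numbers for judder: per-frame displacement vs. the
# motion-blur trail for a panning subject

def blur_and_gap(speed_px_per_s, fps=24.0, shutter_deg=180.0):
    """Return (blur length, unblurred gap) in pixels per frame."""
    frame_period = 1.0 / fps
    exposure = frame_period * (shutter_deg / 360.0)
    displacement = speed_px_per_s / fps        # jump between frame centers
    blur = speed_px_per_s * exposure           # smear within one exposure
    return blur, displacement - blur           # gap the blur doesn't cover

# 180 deg: the blur covers half of each frame-to-frame jump
print(blur_and_gap(480.0, shutter_deg=180.0))  # roughly (10.0, 10.0)
# 90 deg: crisper frames but a larger uncovered gap -> more apparent judder
print(blur_and_gap(480.0, shutter_deg=90.0))   # roughly (5.0, 15.0)
```

The bigger that uncovered gap relative to the blur, and the higher the edge contrast across it, the more the pan reads as judder.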

Deanan DaSilva
Playa del Rey, CA



 
