Is there truth to CCD cameras having more filmic or pleasing colours than CMOS cameras? If true, what makes it so?



Reading through various forums, I always see people talk about the filmic colours of their old CCD stills cameras, and about older cinema cameras like the Sony F35 having a filmic look. I wonder if this is just nostalgia nonsense or whether there's some truth to it?



  • Premium Member

It's a bit of both. There was a difference in look due to the nature of the color filter arrays and the recording formats of the time. The Sony F35 didn't use a Bayer mosaic pattern but an RGB stripe approach, so each of the three colors was sampled equally rather than 50% green, 25% blue, 25% red. And with CCDs, you don't have rolling-shutter issues. But whether any of that made the image more "filmic" is up for debate.

You could look at the first "Captain America" movie, which was shot mostly on the Panavision Genesis (similar to the Sony F35), with the final sequence set in modern times shot on the ARRI Alexa.
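As a toy illustration of that sampling difference (the function and dimensions are my own invention, not from any camera SDK), an RGGB Bayer mosaic devotes half its photosites to green and a quarter each to red and blue, whereas an RGB stripe sensor samples all three equally:

```python
import numpy as np

def bayer_channel_fractions(height, width):
    """Fraction of photosites assigned to each colour in an RGGB Bayer mosaic."""
    # The mosaic is a repeating 2x2 tile: R G / G B.
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    mosaic = np.tile(tile, (height // 2, width // 2))
    return {c: np.count_nonzero(mosaic == c) / mosaic.size for c in "RGB"}

# Half the sites are green, a quarter each red and blue.
print(bayer_channel_fractions(8, 8))   # {'R': 0.25, 'G': 0.5, 'B': 0.25}
```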


  • Site Sponsor

I think a lot, if not most, of the early CCDs were made by Kodak, which makes excellent color dyes, and the CFA filters Kodak made for those CCDs were grounded in its legacy of color science from film.

All color filter array sensors have some, or a lot of, color-channel crosstalk, since the color dyes on the photosites are by no means perfect. So I think the better the CFA, the better the outcome, especially back when there was a lot less CPU power to apply to the demosaic math.
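For what it's worth, here is a minimal sketch of the sort of demosaic math being referred to: plain bilinear interpolation over an RGGB mosaic, the simplest and cheapest method (real cameras use far more sophisticated, edge-aware algorithms):

```python
import numpy as np

def conv3x3(img, kernel):
    """Naive 3x3 convolution with zero padding (written for clarity, not speed)."""
    h, w = img.shape
    padded = np.pad(img, 1)
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def bilinear_demosaic(mosaic):
    """Bilinear demosaic of an RGGB mosaic: each channel is interpolated
    from its own sparse samples via a normalised 3x3 box filter."""
    h, w = mosaic.shape
    yy, xx = np.mgrid[:h, :w]
    masks = {
        "R": (yy % 2 == 0) & (xx % 2 == 0),   # red photosites
        "G": (yy % 2) != (xx % 2),            # the two green photosites per tile
        "B": (yy % 2 == 1) & (xx % 2 == 1),   # blue photosites
    }
    kernel = np.ones((3, 3))
    planes = []
    for c in "RGB":
        samples = conv3x3(mosaic * masks[c], kernel)       # sum of nearby samples
        weights = conv3x3(masks[c].astype(float), kernel)  # how many contributed
        planes.append(samples / weights)
    return np.stack(planes, axis=-1)

# A flat grey scene mosaics to a flat frame, which should demosaic back to grey.
rgb = bilinear_demosaic(np.full((4, 4), 0.5))
```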

 


3 hours ago, Robert Houllahan said:

I think a lot, if not most, of the early CCDs were made by Kodak, which makes excellent color dyes, and the CFA filters Kodak made for those CCDs were grounded in its legacy of color science from film.

All color filter array sensors have some, or a lot of, color-channel crosstalk, since the color dyes on the photosites are by no means perfect. So I think the better the CFA, the better the outcome, especially back when there was a lot less CPU power to apply to the demosaic math.

 

In the consumer market there's a discussion about how older sensors (the original Blackmagic Pocket, the 5D3, etc.) had nicer color rendering than today's cameras. I've anecdotally noticed that the F3 had nicer color than the F5/F55, and the C300 than the C300 Mk II, but I figured it was just growing pains going from 2K to 4K, increasing dynamic range, etc., as I think they advertised wider color gamuts from the sensors.

Was there a change in design philosophy re: CFAs? I still prefer the look of some of those older cameras. But I also REALLY like the image from the Venice, and even the Varicam35, for instance. 

Are different cameras quite different today? I have noticed the A7S III has redder skin tones than the Venice, so I imagine different dyes? Blackmagic seems to use Sony sensors with some built-to-order tweaks in its new Pocket cameras, and even those have very nice-looking color to me. Do they just profile the sensors better, or is there a difference in CFA between the P6K and the X-T3?

Edited by M Joel W

Take a look at the Digital Bolex, an S16 camera created through a Kickstarter campaign. It had a CCD sensor, and you could straight away feel there was something different, in a really nice way, about that camera. The textures and colours were somehow different, real straight out of the camera without any tweaking. So I do think there is something to CCD sensors, though of course there are pluses and minuses with both.

If I am not mistaken, the Aaton Penelope Delta had a 3.5K CCD sensor. Unfortunately it never took off, but those guys were really ahead of their time back then.


  • Premium Member

One thing about CMOS sensors is that they're much more likely to have multi-tap outputs, which means that subsections of the array will potentially go through different amplifiers and analogue-to-digital converters. The flexibility to do that is a large part of the attractiveness of the CMOS process in the first place; there were a few CCDs which read out from both sides, and thus had two sets of output amplifiers, but modern designs with very high resolution and dynamic range need more.

The electronics associated with each tile inevitably have small tolerances so they won't all perform in exactly the same way, particularly in terms of dynamic range and noise floor. As a result, each of them ends up requiring a certain amount of correction to avoid visible tiling in the image, which means that the overall performance of the whole sensor is intrinsically limited to the performance of the worst-performing tiles.

That's possibly one of several contributory reasons why CMOS sensors can look a little less good than otherwise equivalent CCDs, particularly in terms of highlight handling. The difficulty with all this is that you'll very rarely see CMOS and CCD sensors with anything like identical performance goals, so, as the article Karim linked suggests, other factors are likely to make it difficult for anyone to get a particularly good eye for spotting the difference from an otherwise unmarked image. It's probably possible to train one's eye to spot a particular camera, and knowing what that camera uses might make it possible to associate CMOS or CCD with a particular image, but it's a bit of a stretch.

I don't think it would ever have been plausible to achieve things like 8K, full-frame sensors which exhibit acceptable noise at hundreds of frames per second and thousands of ISO without using CMOS. Since those are things we apparently want, it's a moot point anyway.

P


  • Site Sponsor

A few thoughts

I think at least some of the consumer-line sensors are built with multiple markets in mind, and the same basic sensor design may be in your phone, a lower-end camera, and a machine-vision camera sorting beer bottles at a packaging plant. So the CFA dyes may be a compromise between color performance, sensitivity, etc., and not tuned for ultimate picture quality.

The Digital Bolex used an S16-sized Kodak CCD with four taps, and I remember the thing that long held back the camera's release was the tap balancing, which, if done wrong, shows four quadrants with obvious tap lines in the picture. For some reason, CMOS sensors, which have many, many more taps, don't seem to have the very fussy tap-balance issues that CCDs do. CCDs can be run single-tap for best performance, but there is a big speed hit.
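A two-point (dark frame plus flat field) calibration is one way to picture what that tap balancing involves. This is my own simplified sketch, not the Digital Bolex's actual pipeline: each quadrant gets its own simulated amplifier gain and black-level offset, and the correction is solved per tap from calibration frames.

```python
import numpy as np

def quadrants(frame):
    """Views of the four tap regions (one per quadrant)."""
    h, w = frame.shape
    return [frame[:h//2, :w//2], frame[:h//2, w//2:],
            frame[h//2:, :w//2], frame[h//2:, w//2:]]

def readout(scene, gains, offsets):
    """Simulate a 4-tap readout: each quadrant passes through its own
    amplifier with a slightly different gain and black-level offset."""
    raw = scene.astype(float).copy()
    for q, g, o in zip(quadrants(raw), gains, offsets):
        q *= g          # in-place: q is a view into raw
        q += o
    return raw

def balance(raw, dark, flat, flat_level):
    """Two-point tap balance: a dark frame gives each tap's offset,
    a flat field of known level gives each tap's gain."""
    out = raw.copy()
    for q, qd, qf in zip(quadrants(out), quadrants(dark), quadrants(flat)):
        offset = qd.mean()                        # the tap's black level
        gain = (qf.mean() - offset) / flat_level  # the tap's response slope
        q -= offset
        q /= gain
    return out

gains, offsets = [1.00, 1.03, 0.98, 1.01], [0.0, 2.0, -1.5, 0.5]
dark  = readout(np.zeros((8, 8)), gains, offsets)
flat  = readout(np.full((8, 8), 100.0), gains, offsets)
scene = readout(np.full((8, 8), 60.0), gains, offsets)
# Before correction, the quadrants show visible tap lines; after, they match.
fixed = balance(scene, dark, flat, flat_level=100.0)
```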

Other than the Digital Bolex and the Dalsa Origin, I don't know of many (or any) digital-cinema camera systems based around CCDs, due to tap-balance and readout-speed issues. CMOS is just faster to read out as a global-shutter system with many more taps, and it largely ushered in the era of digital cine cameras.

It seems that CCDs have mostly been retired by onsemi (formerly Kodak's sensor business), and with Sony making 14K 16-bit CMOS sensors, others like Gpixel with their 9.4K sensor, and the potential for high frame rates from those super-high-resolution CMOS sensors, the saleability of CCDs in the "big" imager markets is gone.


As a side note: while a camera's CFA and initial spectral primaries impact colour rendition, any tristimulus observer can still resolve colour.
It is also important to note that almost no camera satisfies the Luther-Ives condition; by that I mean being a single linear transform away from the XYZ CMFs (which are themselves a single linear transform away from our photopic vision's cone fundamentals).
So you have your initial spectral primaries (the RGB value of each primary), and you apply gain to each primary to manipulate your spectral primaries and, therefore, your gamut. That is how a 3x3 linear matrix works, applied to the linear signal before the OETF (opto-electronic transfer function).

So most, if not all, cameras (apart from scientific use cases) resolve primaries that can only be transformed to our XYZ/photopic vision with error. CFA densities and primaries that perform well on the Luther-Ives front tend to perform poorly on the SNR and dynamic-range front.

So all cameras at the sensor level resolve something wacky, and all manufacturers apply a 3x3 matrix to transform to their acquisition gamut, or as close to 709 as possible, etc.
Note that the limitation here is the 3x3 linear matrix. Think about an RGB cube: you have three axes, and your 3x3 matrix applies a transform along those three axes only, so every colour in between moves rigidly with them. That is Steve Yedlin's whole display-prep schtick: he gives control over the secondaries as well as the primaries, dividing the cube into tetrahedra and interpolating across them, so each primary and secondary RGB value can be adjusted independently.
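A quick sketch of that 3x3 step (the matrix values are invented for illustration, not taken from any real camera). Rows summing to 1 keep neutrals neutral, and every colour in the cube is moved by the same linear rule, which is exactly the limitation being described:

```python
import numpy as np

# Invented camera-native -> working-gamut matrix. Each row sums to 1.0,
# so equal-RGB (neutral) input stays neutral after the transform.
M = np.array([[ 1.40, -0.30, -0.10],
              [-0.20,  1.30, -0.10],
              [ 0.00, -0.25,  1.25]])

def apply_matrix(rgb, M):
    """Apply a 3x3 linear matrix to linear camera RGB."""
    return np.asarray(rgb) @ M.T

print(apply_matrix([0.18, 0.18, 0.18], M))  # neutrals are preserved
print(apply_matrix([1.0, 0.0, 0.0], M))     # camera red lands outside [0, 1]
```

A 3D LUT (or Yedlin-style tetrahedral interpolation) can move primaries and secondaries independently, which no single matrix can do.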


  • Premium Member
On 3/21/2023 at 6:44 PM, John Shell said:

a filmic look and I wonder if this is just some nostalgia nonsense or there's some truth to it

Of course it is nonsense. Video camera sensors are sensitive to very narrow colour bands, almost dead on one wavelength per primary colour. The advantage of that is a much more precise, more clearly defined colour image, whose data can be tweaked far more easily than film.

Film is sensitive to wide parts of the spectrum, with overlapping sections between blue and green, and green and red, respectively. That complicates matters. Most people in the trade are ignorant and, worse still, they don't even want to know. "Filmic look" is an oxymoron, an expression of blunt foolishness, if it can be put that way. The tip of the top of this vanity is perforation-hole and scratch-'n'-dust overlays on videos.

You can’t polish a turd.


9 hours ago, Simon Wyss said:

Of course it is nonsense. Video camera sensors are sensitive to very narrow colour bands, almost dead on one wavelength per primary colour. The advantage of that is a much more precise, more clearly defined colour image, whose data can be tweaked far more easily than film.

Film is sensitive to wide parts of the spectrum, with overlapping sections between blue and green, and green and red, respectively. That complicates matters. Most people in the trade are ignorant and, worse still, they don't even want to know. "Filmic look" is an oxymoron, an expression of blunt foolishness, if it can be put that way. The tip of the top of this vanity is perforation-hole and scratch-'n'-dust overlays on videos.

You can’t polish a turd.

Is this true? I'm being unfair here by choosing a film that is known in particular for its colour rendition and the first Sony sensor I found on Google. But if I'm reading this correctly (I'm probably not), there's (much) more separation on film:

https://www.dpreview.com/forums/post/61982384?image=0

https://www.ishootfujifilm.com/uploads/VELVIA 50 Data Guide.pdf

Edit: 

Much more overlap here:

https://www.kodak.com/content/products-brochures/Film/VISION3_5219_7219_Technical-data.pdf

But it looks similar to digital?

 

Edited by M Joel W

The problem is that the cost of film easily negates those advantages for most people, except studios and such. It costs $1,200-1,400 per page when you factor in film purchase and lab expenses, so a 10-page short costs $12-14k just in film-stock-related expenses. Then add location(s), crew, food, rentals... you see how the numbers get crazy very quickly. I would not want to shoot a narrative project at anything less than a 10:1 ratio, and sometimes 20:1 for the dialogue coverage. You could shoot for less, but then you will pay for that mistake in the editing room for sure, and now you've got an unfinished project you spent thousands on. S16 is cheaper, but S16 is not an alternative to an Alexa; 35 is, especially 500T, which is far too grainy on S16 for movie screens. Sure, movies like Carol look great and pretty, but it also looked soft in the wide shots. 200T should be as high-speed as one gets if the project is meant to be blown up to 35 or DCP.
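The arithmetic above, spelled out (the per-page figures are the rough forum estimates quoted in the post, nothing official):

```python
# Rough short-film stock budget: the quoted all-in figure per script page
# (assumed to cover purchase, processing, and scanning at the stated ratio).
cost_per_page = (1200, 1400)   # USD per page, low/high estimate
pages = 10
low, high = (pages * c for c in cost_per_page)
print(f"${low:,} - ${high:,}")   # $12,000 - $14,000
```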

Edited by Giray Izcan

When I look at articles by digital stills photographers discussing the specific look of different cameras, the comparisons always look exactly the same to me. If there are differences in the look of broad-daylight shots, they seem to be somewhere in the realm of minuscule to the point of absurdity; nothing like the difference in the film world between a fast stock and a slower one, or between brands, e.g. Kodak vs. Agfa. Among digital cinema cameras I can see a slight difference in look between a Sony FX9 and a Canon C300 Mark III, but I think even here the differences are basically meaningless to anyone but the cinematographer.


Jon, let's ask those superhumans to do a blind test: pick out the camera used for each of those images and see how many they can actually identify just by looking.


  • Premium Member

All color imaging systems relying on RGB filtering, whether film or digital, have to allow some crosstalk between the three primaries to record information about the shades in between. Too much crosstalk, though, and you get a desaturated image, so picking the separation filters is as much an art as a science. Make the bandwidth of each filter too narrow and not only do you get more saturation in the primaries (something like the old Technicolor look) but fewer of those in-between shades, and you also lose sensitivity due to the density of the filters.
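One way to picture that trade-off (a toy model with a made-up leak parameter, not real filter spectra): treat crosstalk as each channel leaking a fraction of its signal into the other two. More leak desaturates the primaries, and undoing it requires a correction matrix whose large coefficients amplify noise:

```python
import numpy as np

def crosstalk_matrix(overlap):
    """Toy crosstalk model: each channel leaks `overlap` of its signal
    into the other two. overlap=0 is perfect separation."""
    leak = overlap / 2
    return np.array([[1 - overlap, leak, leak],
                     [leak, 1 - overlap, leak],
                     [leak, leak, 1 - overlap]])

pure_red = np.array([1.0, 0.0, 0.0])
print(crosstalk_matrix(0.0) @ pure_red)   # fully saturated: [1. 0. 0.]
print(crosstalk_matrix(0.3) @ pure_red)   # desaturated: [0.7 0.15 0.15]

# Restoring saturation means inverting the mix; the inverse's coefficients
# exceed 1, so the correction amplifies sensor noise along with the colour.
restore = np.linalg.inv(crosstalk_matrix(0.3))
```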


Oh, it's film vs digital again?

It should not matter at all anymore in 2023 when we have so many great camera systems and formats and workflows available. 

Except, of course, that it is really boring to shoot with digital cameras. It takes all the "magic" away. Then it's not fun to shoot anything.

Maybe that's why digital people try to find some meaning in it all by endlessly comparing their gear and nitpicking that their current gear is so obsolete they can't shoot anything before getting more pixels and ten-trillion-ISO sensitivity 😄

 


6 hours ago, David Mullen ASC said:

Make the bandwidth of each filter too narrow and not only do you get more saturation in the primaries (something like the old Technicolor look) but fewer of those in-between shades, and you also lose sensitivity due to the density of the filters.

The Phase One Trichromatic sensors were designed to prioritise colour. And although they are less sensitive, they make up for it with a cleaner noise pattern in the shadows.

2 hours ago, aapo lettinen said:

Maybe that's why digital people try to find some meaning in it all by endlessly comparing their gear and nitpicking that their current gear is so obsolete they can't shoot anything before getting more pixels and ten-trillion-ISO sensitivity 😄

Some people are still using the Blackmagic Pocket HD. And the Red ONE, etc. If your budget doesn't stretch very far, make it work harder!

