Debayering resolution loss


You can do what people have done for years, shoot tests.

 

The thing is that the actual number is rather meaningless -- I never shot a resolution chart for all the years I was shooting in 35mm. You shot a test and looked at it and decided if it looked sharp enough for you, and even if it didn't, if it couldn't get any sharper, you lived with it.

 

But speaking of film, the whole 4K holy grail number comes from years of being told that 35mm film was 4K, when a 4K RGB scan of 35mm movie film would not resolve 4K in each color channel. On the other hand, you had detractors claim that film wasn't even HD in resolution. The truth was somewhere in between.

 

Given that after debayering you roughly resolve 75% of the sensor's pixel count, the published specs of these cameras are not that far off -- you can pretty much guess just from the pixel count of the sensor that the Sony F65 would give you more resolution than the Epic, which in turn would have more resolution than the Alexa. So I'm not sure why anyone thinks there is some sort of conspiracy to mislead people going on here when the truth is easy enough to find out. And every day, someone somewhere is publishing some sort of resolution test, particularly for digital still cameras.
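Just to put rough numbers on that rule of thumb, here's a quick Python sketch. The 0.75 factor and the sensor widths are illustrative approximations, not certified figures:

```python
# Rule of thumb from this thread: a Bayer sensor yields roughly 75% of
# its pixel count as measurable resolution after debayering.
DEBAYER_FACTOR = 0.75  # illustrative figure, not a measured constant

def effective_width(sensor_width_px, factor=DEBAYER_FACTOR):
    """Crude estimate of resolved horizontal detail after debayering."""
    return int(sensor_width_px * factor)

# Approximate published horizontal photosite counts, for illustration only
for name, width in [("Epic (5K)", 5120), ("Alexa", 2880)]:
    print(f"{name}: ~{effective_width(width)} px of resolvable detail")
```

Which lines up with the thread's point: a 5K Bayer sensor lands near "true" 4K, and the Alexa lands near 2K-and-a-bit.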

 

But you really have to be some sort of pixel peeper to really care that much about actual figures rather than what your eye tells you.

 

As for labeling the files by their measured resolution rather than their pixel dimensions, that doesn't make much sense either. For one thing, few images achieve the maximum possible resolution, so even if the camera is capable of achieving, let's say, 3.8K or 2.4K, whatever, most shots will be slightly below that and some may be way below that, so how could you label the files by the actual resolution? It wouldn't make much sense -- you'd debayer a 4K raw file to 4K RGB but have to label each shot "actual 2.4K RGB", "actual 3.6K RGB", etc. even though the file size was still 4K per channel? And there would be no way to know the actual resolution in each shot unless each shot contained a properly shot line resolution chart, which it won't.

 

So you basically have no choice but to label the debayered RGB file by its size and not by the amount of detail it contained, if you label it at all (the size of the file is its size no matter what you want to call it.)

 

As for the cameras being listed by the specs of their sensor, that is real data, it's just up to users to educate themselves on the issues of detail, resolution, whatever. Besides, it's a bit like dynamic range information, people seem to get slightly different results every time they do a test.

 

Ultimately though it's a bit like comparing film stocks and lenses, people pick them more for the look (and sometimes for the price) and less for the specs. I certainly never picked a stock for a feature based on charts from Kodak about RMS granularity, etc. But now I'm supposed to pick a digital camera by which one has the highest resolving power? And the truth is that I can make a pretty good guess just based on the sensor specs. But let's say every camera had the same sensor specs, they all were 4096 pixels across, then perhaps you'd do a little more investigating if you were curious if one resolved detail a bit better than another, if that mattered a lot to you.

 

But if getting a "true" 4K image mattered the most to you, people would only be shooting on the Sony F65 (though I guess the 5K Epic image comes close to measuring 4K)... but obviously a lot of people like shooting on the Alexa.

 

This whole "is it really 4K?" argument meant more ten years ago, when people were seriously contemplating dumping 35mm in favor of 1080P HD cameras, but now it's becoming a bit moot as we have digital cameras that are in the same ballpark as, or seem similar to, what we get with 35mm film. We live and work in a world where all sorts of resolutions are being shot.

 

As for technically ignorant producers, that has been a problem since the dawn of cinema. Products advertised to emphasize their strengths over their competition, or to de-emphasize their weaknesses, that's also not new with digital. It's always been necessary to educate yourself and see past advertising, and to draw your own conclusions... there's a reason why the ancient world invented the phrase "caveat emptor" and it's still relevant today because people are always going to be sold at.


Wise words.

 

But for those of you out there having a laugh at the other side of the discussion - the pixel peepers ;) - remember that some of them are doing meaningful things for the film-making community, things that could be really interesting. It's just that at the moment the signal-to-noise ratio is painfully low.


Neither "4k" nor "4:4:4" is a measurement; they are just technical specifications. "4k" is simply ~4000 pixels horizontally (and as long as we are being picky: "kilo" is always 1000 and never 2^10 ;-) and "4:4:4 RGB" means full luminance AND chrominance information for every pixel in every frame.
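To make the "4:4:4" part concrete, here's a small sketch of how the common J:a:b subsampling schemes change the number of stored chroma samples per frame (the frame size is just an example):

```python
def chroma_samples(width, height, scheme):
    """Chroma samples per channel per frame for common J:a:b schemes."""
    factors = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}
    return int(width * height * factors[scheme])

w, h = 4096, 2160  # a DCI 4K frame, purely as an example
for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, chroma_samples(w, h, scheme))
```

So "4:4:4" carries four times the chroma information of "4:2:0" for the same nominal frame size, which is exactly why the spec label matters.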

"Debayering" is just a specific form of interpolation, exploiting the different sensitivity of human vision to luminance and chrominance.
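A toy illustration of that interpolation: on a Bayer mosaic each photosite records only one color, and the missing values are estimated from neighbors. A deliberately naive 1-D sketch (real demosaicing algorithms are far more sophisticated):

```python
def demosaic_row(samples):
    """Toy 1-D 'debayer': a color channel is sampled only at every other
    photosite (None = not sampled); missing values are filled by averaging
    the neighbors. Assumes the row starts with a real sample."""
    out = list(samples)
    for i, v in enumerate(out):
        if v is None:
            left = out[i - 1] if i > 0 else out[i + 1]
            right = out[i + 1] if i + 1 < len(out) else out[i - 1]
            out[i] = (left + right) / 2
    return out

# red is only sampled at even photosites on this row
red_row = [10, None, 20, None, 30, None]
print(demosaic_row(red_row))  # -> [10, 15, 20, 25, 30, 30]
```

The interpolated values are plausible but invented, which is where the resolution loss comes from: half the red samples in this row never existed.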

 

Why do we have to be this specific and picky? Because it makes it easier to differentiate technologies by spec and not only by measurement. Otherwise it becomes even more difficult for customers to understand the value of technical concepts like oversampled Bayer sensors, beam-splitters or other technologies.

 

Now a company is trying to push "4k 60p" in a 20 Mbit/s stream (or is that just the bitrate for 24p?), i.e. large images without much actual detail due to the low bitrate. The same issue again: "More pixels are just better, ALWAYS!"...
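The arithmetic behind that complaint is simple enough to sketch (using the stream parameters quoted above and a UHD frame as an assumed example):

```python
def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average coded bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

# a "4k 60p" stream squeezed into 20 Mbit/s
print(round(bits_per_pixel(20, 3840, 2160, 60), 3))  # ~0.04 bits per pixel
```

For comparison, the same 20 Mbit/s at 1080p24 leaves roughly ten times as many bits per pixel, so the extra pixels in that stream are mostly carrying compression artifacts, not detail.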


I don't want to endlessly continue this as I think most of the people contributing to the thread understand the issues. But...

 

Don't you see that specifying cameras based on their output resolution, as opposed to their actual capacity to resolve detail, is a licence to print money for camera manufacturers? With the right software you could claim anything, and producers will believe it. That's not OK.

 

P

 

It's okay because no camera manufacturer is going to test and certify the system MTF based on an infinite number of variables from lens to filters.

 

Chip resolution + pixel pitch + pixel fill factor give us a pretty good idea of whether there will be a lot or a little resolution loss after debayering.

