New ways to determine effective resolution


Keith Walters


  • Premium Member

I was going to add this to the other thread I started, but now it's been closed.

(Does every discussion here have to end in a verbal punch-up? And do you people think nobody reads these forums except signed-up members?)

 

Anyway, like many people, I'm beginning to question the validity of simple resolution test charts for testing the effective performance of modern electronic cameras. I'm not saying that there shouldn't be ANY sort of quantitative testing, just that we need some other way of accomplishing it.

 

I think the only real way the effectiveness of the Bayer decoding software can be tested is to do what TV researchers did in the 1930s. They ran exhaustive tests of prototype TV systems before large audiences of unbiased lay people (or simulations using film projectors if the equipment had not yet been developed), asked them to rate their visual experiences, and rigorously analyzed the data. The current NTSC TV standard of 525 lines at 30 frames per second was laboriously worked out that way, to strike the best possible balance between usable picture quality and the number of channel frequencies possible. (They obviously did their homework well, because 66 years on, the same basic standard is still being used for broadcasting.)

 

In the days of analog TV cameras, resolution test charts were considered a reliable means of quantifying a camera's resolution performance, but that wasn't always a clear indicator of perceived image quality.

 

Up until the late 1980s, most broadcast quality cameras used Saticon or Plumbicon tubes, which required fairly regular maintenance and adjustment. The focus and general "health" of the camera tubes could most easily be checked by observing how well they could image a pattern of fine vertical lines. This could be quantitatively tested with an oscilloscope or Waveform Monitor, or simply adjusted by eye with a high-resolution monitor. (With the advent of CCD sensors permanently glued to dichroic prism blocks, all this fell into disuse of course.)

 

However, optical resolution charts weren't the whole story. Most 2/3" tubes in compact cameras (Betacam and the like) had pretty ordinary high-frequency response, and it was a real struggle to get anything much past 300 lines out of them, even when they were new. All manufacturers sought to overcome this in one way or another by adding what was euphemistically referred to as "aperture correction", "detail correction" or "edge correction". Basically, the camera's electronics would look for places in the image where an abrupt change in brightness appeared to have taken place, and add a bit of "makeup" in the form of a thin black or white line to give the illusion of a sharper transition than the tube had actually produced.
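
To make the idea concrete, here's a minimal sketch of how that kind of edge "makeup" can be faked digitally. This is my own toy illustration, not any manufacturer's actual circuit: take a high-pass version of the luma and add a scaled amount back, so every soft transition picks up a thin dark/light overshoot.

```python
import numpy as np

def detail_correct(luma, gain=0.5):
    """Toy 1-D 'aperture correction': add a scaled high-pass of the luma so
    each brightness transition gets a thin undershoot/overshoot, which reads
    as extra sharpness without adding any real resolution."""
    highpass = np.convolve(luma, [-1.0, 2.0, -1.0], mode="same")
    return np.clip(luma + gain * highpass, 0.0, 1.0)

# A soft black-to-white edge, before and after "correction"
soft_edge = np.clip(np.linspace(-1.0, 2.0, 12), 0.0, 1.0)
print(np.round(soft_edge, 2))
print(np.round(detail_correct(soft_edge), 2))
```

Turn the gain up far enough and you get the familiar ringing halos around edges, which is exactly the over-corrected look.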

 

Used judiciously, detail correction can certainly make a soft-looking image look a lot sharper and more pleasing to the eye than it really is, and it was a major driver behind the success of compact portable cameras. But it doesn't really add any resolution; it just gives the illusion that it does.

 

What it cannot do is make a 2/3" tube perform like a 1" tube. A good 1" Plumbicon tube can resolve up to around 450 lines (although not at 100% amplitude, so some detail correction was still employed). With both cameras pointed at a resolution chart, at some point both images would start to show the finer lines as blocks of grey, and of course it would happen with the 2/3" tube before the 1" tube.

 

But when you pointed them at an actual newsreader or similar, it was often quite hard to tell which picture was which. The only thing that gave it away was when the subject was wearing some article of clothing with a repetitive pattern, and more often than not, it produced strobing or "cross colour", which was actually considered by many to be less desirable than a soft image with detail correction! I don't know how many (if any) blind comparisons were made of the performance of cameras with "true" resolution vs cameras that just "faked it" with detail correction, but it would be an interesting exercise with the new digital cinematography cameras.

 

Frankly, the people RED purchasers are obviously hoping will become their paying customers are not terribly likely to be swayed purely by the self-congratulatory endorsements of said owners (or intending owners). The only things that might carry any weight are seeing other people's commercial projects (movies, TV commercials or TV shows) with really impressive imagery, or some completely unbiased "market research" using the abovementioned lay people.

 

You may have noticed that none of the Digital Cinematography systems developed over the past ten years have had more than a minor impact on the motion picture industry generally. You can talk up the success of any format as much as you like, but the "Big Boys" are not going to take any notice of that; they want to see hard evidence.


  • Premium Member
(although not at 100% amplitude, so some detail correction was still employed).

What we need far more than any new way to test resolution is a realization that resolution cannot be reduced to a single numerical measurement. Downconversions can be brick-wall filtered and cut off sharply in the target system, but native camera resolution is always subject to the rolloff of the OLPF, which has to start an octave lower than the required cutoff.
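
As a back-of-envelope illustration of that rolloff, here's a sketch assuming an idealized two-spot birefringent OLPF, which behaves like convolving the image with two displaced copies of itself and so has an MTF of |cos(pi * f * s)|. The numbers below are just that model, not any particular camera's filter.

```python
import numpy as np

# Idealized two-spot birefringent OLPF: MTF = |cos(pi * f * s)|,
# which falls to zero at f = 1/(2s).  Put that null at the sensor's
# Nyquist frequency and see what's left lower down.
nyquist = 1.0                      # spatial frequency, normalized to Nyquist
s = 1.0 / (2.0 * nyquist)          # spot separation that nulls at Nyquist
for f in (0.25, 0.5, 0.75, 1.0):   # fractions of Nyquist
    mtf = abs(np.cos(np.pi * f * s))
    print(f"{f:4.2f} x Nyquist: MTF = {mtf:.2f}")
```

So even a "perfect" OLPF that only just reaches zero at Nyquist has already shaved the response to about 70% an octave below, before the lens, pixel aperture and demosaic take their own bites.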

 

But when you pointed them at an actual newsreader or similar, it was often quite hard to tell which picture was which.

Quite a bit of that can be attributed to the monitors. The shadow mask and phosphor triads impose their own re-sampling on the image, with its own Nyquist limit, just like looking at the real world through a screen door. Last I checked, even the best HD CRTs were about 900 triads across.
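
A quick sanity check on those numbers, taking the 900-triad figure quoted above as given and 1920 active luma samples per line for 1080-line HD:

```python
triads_across = 900         # figure quoted above for a good HD CRT
hd_samples_per_line = 1920  # active luma samples per line in 1080-line HD

# The triad pitch re-samples the picture, so by Nyquist the tube can't
# show more cycles per picture width than half its triad count.
print("CRT can display  :", triads_across // 2, "cycles per picture width")
print("HD signal carries:", hd_samples_per_line // 2, "cycles per picture width")
```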

 

The only thing that gave it away was when the subject was wearing some article of clothing with a repetitive pattern, and more often than not, it produced strobing or "cross colour", which was actually considered by many to be less desirable than a soft image with detail correction!

Cross color happens (here in the NTSC world) when something in the image has fine detail that hits right at the chroma subcarrier frequency, 3.58 MHz. You can actually have luminance detail in the region between the chroma and audio subcarriers, 3.58 to 4.5 MHz. In component, you can just notch 3.58 out of the luma before you encode as baseband NTSC.

 

 

 

-- J.S.


  • Premium Member
Quite a bit of that can be attributed to the monitors. The shadow mask and phosphor triads impose their own re-sampling on the image, with its own Nyquist limit, just like looking at the real world through a screen door. Last I checked, even the best HD CRTs were about 900 triads across.

There is some of that, but we had some pretty good grade 1 monitors, and we also had pattern generators that could produce multibursts up to 10MHz, so we had a fair idea of what the monitors were capable of. Needless to say, none of the cameras were capable of producing more resolution than the monitors could handle.

 

I have to admit, I've seen my share of "HD ready" CRT TV sets that seemed to have exactly the same sort of tubes that were fitted to equivalent SD models! A lot of people attribute the improvement in quality that comes simply from using S-Video or component inputs to the sets being "HD". I've also seen at least one 640 x 480 LCD TV proudly boasting about its "built-in HDTV tuner"!

 

Cross color happens (here in the NTSC world) when something in the image has fine detail that hits right at the chroma subcarrier frequency, 3.58 MHz. You can actually have luminance detail in the region between the chroma and audio subcarriers, 3.58 to 4.5 MHz. In component, you can just notch 3.58 out of the luma before you encode as baseband NTSC.

-- J.S.

Actually, cross-colour happens whenever the luminance frequencies fall within the bandpass of the chroma decoder, not just at the subcarrier frequency. With mass-market NTSC TV sets, that's usually +/- 500kHz of the subcarrier frequency. Most PAL receivers have a considerably wider bandpass, up to +/- 1MHz.

It's not simply a matter of notching it out with a simple tuned trap. That works well in the luminance channel of a TV receiver, because the subcarrier modulated by low-frequency chrominance signals is what's most noticeable as a dot pattern; the higher-frequency sidebands are nowhere near as visible.

 

With a chroma decoder, it's the other way round: the luminance components furthest away in frequency from the subcarrier are the ones that cause the most cross-colour. You can certainly filter them out with a band-stop filter, but that usually takes most of the higher-frequency luminance components with it!
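
Here's a rough digital sketch of that trade-off, using scipy's IIR notch as a stand-in for an analog trap. The sample rate and stop-band width are my own assumptions for illustration (13.5 MHz is the ITU-R BT.601 luma sampling rate, and the 1 MHz width roughly matches the decoder bandpass mentioned above); the point is simply that a stop-band wide enough to suppress cross-colour also pulls down a big chunk of the high-frequency luma.

```python
import numpy as np
from scipy.signal import iirnotch, freqz

fs = 13.5e6         # assumed sample rate (ITU-R BT.601 luma rate), Hz
f0 = 3.579545e6     # NTSC colour subcarrier, Hz
bw = 1.0e6          # assumed width of the region to suppress, Hz

b, a = iirnotch(f0, Q=f0 / bw, fs=fs)   # band-stop centred on the subcarrier
w, h = freqz(b, a, worN=4096, fs=fs)

for f_mhz in (1.0, 3.0, 3.58, 4.2):     # spot-check some luma frequencies
    i = int(np.argmin(np.abs(w - f_mhz * 1e6)))
    print(f"{f_mhz:5.2f} MHz: {20 * np.log10(abs(h[i]) + 1e-12):6.1f} dB")
```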

 

Somewhere I've got some photos of the results I obtained with a monochrome video sweep generator I once made and fed into a PAL colour vision mixer, so I could see what happens at the various frequencies. From about 3.5MHz to 5MHz, it's just a flickering mess! It would be a great way to demonstrate the difference between S-Video and composite video!

Edited by Keith Walters

You're right about the passbands -- I wasn't being sufficiently precise the way I wrote that.

-- J.S.

 

From a DP's perspective, and as someone who had many conversations with Dave at DSC Labs about new types of charts when HD began to take off, I'd like to see a chart (or set of charts) for daily use that reveals:

1/ Dynamic range of camera

2/ Resolution of camera/lens

3/ Tracking of colours in highlights.

4/ Noise in shadows.

5/ Aliasing issues.

6/ Colour patches.

 

 

A rear-illuminated chart is the way to go to create a wide dynamic range. A portable LED-illuminated chart/screen that can be used in daylight would be neat.

 

 

 

Mike Brennan


Actually, the NTSC did a poor job with its homework 66 years ago by adopting the 480-line interlaced scanning system, which resulted in very poor picture quality. What it should have done was adopt a 360-line progressive scanning system, which would have resulted in much better picture quality.


  • Premium Member
Actually, the NTSC did a poor job with its homework 66 years ago by adopting the 480-line interlaced scanning system, which resulted in very poor picture quality. What it should have done was adopt a 360-line progressive scanning system, which would have resulted in much better picture quality.

If you can get hold of one of the early television engineering textbooks by Donald Fink or similar, you'll see that an enormous amount of research went into developing the NTSC TV standard. The proposal you mention was considered, but when the researchers went to the trouble of getting hard data on what lay people liked and disliked, they found that most people preferred the 525-line interlaced scan to 60 Hz 400-line progressive, particularly on large-screen displays.

 

People seem to forget that, yes, a progressive scan has less horizontal edge flicker, but unless a much higher horizontal scan frequency is used, the line structure is much coarser.

 

"Very poor Picture Quality" is a rather serious exaggeration in any case. A major part of Dr Raymond Kell's research in the 1930s (which became known as the "Kell Factor") was to see how people reacted to the artifacts of interlaced scanning. When encouraged to find the viewing distance they found most comfortable, they almost invariably chose a viewing distance where, (after checking their visual acucity with eye charts) it was estimated they would be able to take in somewhere between 60 - 70 percent of the total resolution of the display. Generally, people found the interlace artifacts less distracting than the coarse line structure, which introduced its own artifacts.

 

Apart from that, with 1940s technology they could barely get screen sizes of 17 inches to work with a horizontal scan frequency of 15.75 kHz. Increasing this to the higher frequencies needed for non-interlaced scanning would have been a major engineering challenge at the time. And the VHF spectrum would not have been able to hold anywhere near as many transmission channels.

 

If you want to criticise the FCC for not moving to update the US TV standards for fifty years, well that's your privilege, but it's rather poor to criticise the 1930s engineers for doing the best they could with the technology available to them. They certainly did not go with any unsupported assumptions. People forget just how technologically advanced the NTSC system was for its time. In the space of just 15 years, domestic electronics technology went from AM radios based on battery-operated triode tubes to a television system virtually identical to the ones we still use today.

 

But how bad is Interlace anyway? The vast majority of the world's population still watch interlaced video, and I don't hear too many complaints. The problems only really started to appear when systems like the "Video Toaster" and other cheap desktop editing systems came on the market. People suddenly discovered that the lovely non-interlaced scan graphics on their computer turned into a horrible flickering mess when they converted them to NTSC. There were similar problems with the earlier 3-Chip CCD cameras, of course.

 

I notice that most modern professional video cameras still give you a choice of Interlaced or Progressive capture.

Edited by Keith Walters

  • Premium Member
From a DP's perspective, and as someone who had many conversations with Dave at DSC Labs about new types of charts when HD began to take off, I'd like to see a chart (or set of charts) for daily use that reveals:

1/ Dynamic range of camera

2/ Resolution of camera/lens

3/ Tracking of colours in highlights.

4/ Noise in shadows.

5/ Aliasing issues.

6/ Colour patches.

A rear-illuminated chart is the way to go to create a wide dynamic range. A portable LED-illuminated chart/screen that can be used in daylight would be neat.

Mike Brennan

As far as a dynamic range chart goes, you probably don't appreciate what an engineering challenge that would be.

 

I once made up a demonstration system with two white LEDs (of the type used in flashlights). One had a steady 20 milliamps flowing through it; the other had the same 20 milliamps, but switched on and off with a 1:2000 duty cycle. That is, 100 times per second the LED was pulsed on for just 5 microseconds, so the pulsed LED glows only 1/2,000th as bright as the steady one. (Although 10 microamps is 1/2,000th of 20 mA, you can't just run the second LED on a steady 10 microamps, because LED efficiency falls off at very low currents.)

 

The result is that one LED glows blindingly bright, while the other is just visibly glowing!

 

But 2,000:1 is approximately equivalent to the 11-stop (66 dB) dynamic range of a video camera. In other words, the ratio of brightness between the two LEDs is roughly the difference in brightness between the darkest and brightest pixels of a camera sensor working over its full 66 dB dynamic range!
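
The arithmetic behind that comparison, for anyone who wants to check it:

```python
import math

period = 1.0 / 100       # pulsed 100 times per second -> 10 ms period
on_time = 5e-6           # on for 5 microseconds each time
duty = on_time / period  # fraction of the time the LED is lit

ratio = 1.0 / duty       # brightness ratio between the two LEDs
print(f"duty cycle : 1:{ratio:.0f}")
print(f"equivalent : {math.log2(ratio):.1f} stops, {20 * math.log10(ratio):.0f} dB")
```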

 

A grayscale for an 11-stop dynamic range would be a real challenge. The whitest bar would have to be glaringly bright, and I'm not sure how you could do that, apart from installing a couple of halogen downlight bulbs! Actually, I'd like to know just how the various manufacturers measure this.

 

But if you then take that blinding LED out in the sunlight, it suddenly stops being blinding....

 

Resolution test charts are another problem. I can't believe how inkjet printer manufacturers can get away with some of the extravagant claims they make for "dots per inch" on their products. I've never found any printer that can produce anywhere near the 1200DPI some manufacturers routinely claim for photo-quality work!

 

The only practical way I can envisage of making your own charts would be to print them out large on a lot of A4 or "Letter" sized sheets, and then glue them onto a large sheet of cardboard.


  • Premium Member
A major part of Dr Raymond Kell's research in the 1930s (which became known as the "Kell Factor") was to see how people reacted to the artifacts of interlaced scanning.

What Kell found out is how far they could push the Nyquist limit before the interlace artifacts became too obvious. The result was that interlace lets you send about 65% of the resolution of a progressive picture using 50% of the bandwidth. That was a good trade-off.
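
Taking those figures at face value, a back-of-envelope estimate for NTSC (480 active lines is the standard figure; the 65% interlace factor is the one quoted above):

```python
active_lines = 480       # NTSC active picture lines
interlace_factor = 0.65  # ~65% of equivalent progressive resolution (quoted above)

effective = active_lines * interlace_factor
print(f"~{effective:.0f} lines of perceived vertical resolution")
```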

 

People today may find it hard to wrap their heads around the constraints the NTSC guys were under. They had to push the best picture they could through a 6 MHz channel using analog vacuum tube technology. Interlace is a sort of ham-fisted lossy compression scheme whose unique advantage is that it could be implemented in the analog world. The redundancy that it discards can be squeezed out far more elegantly using digital methods; interlace only interferes with digital compression. So, bottom line, interlace was the right answer in 1936, 1940, 1953, and 1967. Progressive is the right answer today.

 

 

 

-- J.S.



  • Premium Member
What Kell found out is how far they could push the Nyquist limit before the interlace artifacts became too obvious. The result was that interlace lets you send about 65% of the resolution of a progressive picture using 50% of the bandwidth. That was a good trade-off.

 

People today may find it hard to wrap their heads around the constraints the NTSC guys were under. They had to push the best picture they could through a 6 MHz channel using analog vacuum tube technology.

 

Loftily sitting in judgement of decisions made in the past, but applying the standards of today, has always been a popular pastime. Understanding history is like watching classic Science Fiction movies: you have to imagine yourself living in the time they were made. (You also need to look for books and magazines actually published at the time, not depend on someone else's "research".)

 

Interlace is a sort of ham-fisted lossy compression scheme whose unique advantage is that it could be implemented in the analog world. The redundancy that it discards can be squeezed out far more elegantly using digital methods; interlace only interferes with digital compression.

Well, all I can say is, it's kept a lot of people glued to their screens for 70 years, so they must have done something right.

 

The first practical AM radio transmissions were made just over a century ago. When FM was invented in 1935, the writing was supposedly on the wall for AM. I don't know how it is in other parts of the world, but in Sydney at least, the number one rated radio station is not on FM!

 

So, bottom line, interlace was the right answer in 1936, 1940, 1953, and 1967. Progressive is the right answer today.

-- J.S.

 

 

Which is just one of the reasons why for many people film is also the right answer today, and has been since before the start of broadcast TV. Even back in the 1930s, engineers admitted that 35mm film scanned by a mechanical Telecine scanner gave far better image quality than the best electronic cameras of the time.

 

It's ironic that movies like "Gone With the Wind" and TV sitcoms shot on film in the 60s and 70s are much more compatible with modern digital transmission than interlaced videotape shot just a few years ago!


The only reason the NTSC adopted a scrambled interlaced broadcasting standard, when a perfectly good progressive scanning technology was in place, had nothing to do with picture quality; it was simply an attempt to artificially boost resolution numbers. By 1944 Britain had already accepted a 1000-line triple-interlace scanning system as its official broadcasting standard, so it seemed that the NTSC would have to push a minimum of 500 lines of resolution in order to compete with the British. Although the British dropped their proposed 1000-line standard, France implemented an 890-line triple-interlaced scanning system.

 

Although these scanning systems would seem to meet today's high-definition standards, the fact of the matter is that the triple-interlaced 1000-line scanning system only resolves 333p, and France's 890-line triple-interlaced system only resolves 296p. These systems actually had low-definition performance with high-definition numbers. The NTSC system, with its 525 lines and only 480 lines visible, actually resolves 240 lines, has a terrible screen-door effect, and is in fact a low-definition scanning system.

 

Had the NTSC adopted a 60-hertz, 400-line progressive scanning system with 360 lines visible, the result would have been a rock-solid picture free of interlace artifacts, with picture quality that blows away the interlaced alternatives.

 

I find it very interesting that even today people promote the obsolete 1080i interlaced standard and call it "full high definition 1080", even though the 720p standard outperforms 1080i.


  • Premium Member

France had an 819-line system in the 1950s, but it didn't last long.

 

http://www.tv-technology.com/pages/s.0079/t.2012.html

 

The analog interlace systems all had odd numbers of total lines: 405, 441, 525, 625, 819, 1035, etc. The reason for this is that the horizontal oscillator runs continuously, and dividing the odd line count by two gives you half a line's worth of time to land the second field in between rather than on top of the first. (BTW, the interlace systems that went on the air were always two-field. Tests were done with higher-order interlace, but it was abandoned.)
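
A quick illustration of why the odd totals matter; this is just a sketch that splits a frame's line count between its two fields:

```python
def lines_per_field(total_lines, fields=2):
    """Split a frame's line count across its fields; the fractional part is
    the half-line offset that lands one field's lines between the other's."""
    per_field = total_lines / fields
    return per_field, per_field % 1.0

for n in (405, 441, 525, 625, 819, 1035):
    per_field, offset = lines_per_field(n)
    print(f"{n:4d} total -> {per_field:6.1f} per field (offset {offset:.1f} line)")
```

An even total would give a zero offset, and the second field would land right on top of the first.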

 

 

 

-- J.S.


From what I gather, the triple interlace didn't get as far as being "accepted" in Britain; it was merely being considered by a committee.

 

With the British economy in an extremely poor state after WW2, 405-line black-and-white TV was regarded as a luxury for quite a few years. So chances are the Baird system wouldn't have survived long in the market at that time, because very few people could have afforded it compared to a 405-line set.


France's 819-line so-called high-definition scanning system was indeed a triple-interlace scanning system, but the effective resolution was only about 364 x 273 lines and it produced a very scrambled picture.

 

Baird's phony high-definition 1000-line triple-interlace scanning system only resolved about 443 x 333 lines, so it was a feasible proposal.

 

The analog NTSC system resolved about 333 x 240 lines.

 

If the interlacers had their way, we would be using a triple-interlace scanning system today, because it affords the most resolution for the least amount of bandwidth. However, even a pharmacist knows you can only dilute the medicine so much before people start to notice.

Edited by Thomas James

  • Premium Member
France's 819-line so-called high-definition scanning system was indeed a triple-interlace scanning system .....

 

Here's more detail on how 2:1 interlace works, and why it always uses odd numbers:

 

http://freespace.virgin.net/ljmayes.mal/var/tvsync.htm

 

819 = 273 x 3, an exact multiple of three. So how did they get the necessary delay to put the fields in between each other?

 

 

 

-- J.S.


I'm beginning to question the validity of simple resolution test charts for testing the effective performance of modern electronic cameras. I'm not saying that there shouldn't be ANY sort of quantitative testing, just that we need some other way of accomplishing it.

 

I thought the established Cinematography.com resolution/latitude and color rendition test was to ask the wife. Studies on this forum show that wives can pick out subtle image characteristics more than 10 times more accurately than the leading charts and patterns.

 

:)


  • Premium Member
Although these scanning systems would seem to meet today's high-definition standards, the fact of the matter is that the triple-interlaced 1000-line scanning system only resolves 333p, and France's 890-line triple-interlaced system only resolves 296p. These systems actually had low-definition performance with high-definition numbers. The NTSC system, with its 525 lines and only 480 lines visible, actually resolves 240 lines, has a terrible screen-door effect, and is in fact a low-definition scanning system.

 

Had the NTSC adopted a 60-hertz, 400-line progressive scanning system with 360 lines visible, the result would have been a rock-solid picture free of interlace artifacts, with picture quality that blows away the interlaced alternatives.

The problem is that billions of people quite happily watch interlaced TV without giving it a second thought. Other countries have installed their own television systems since 1940, and not a single one has decided to go with progressive scanning. Progressive-scan TVs have been on the market for nearly 20 years, and give visibly superior performance with film-derived video, particularly at 25 frames per second, but they mostly just sat on the retailers' shelves. Nobody wanted to pay the extra money. Modern LCD TVs are neither progressive nor interlaced; they're more like an illuminated transparency. They're selling very well, but not because of that.

 

I think we're going to see the same sort of argument with Bayer-Masked HD cameras but in reverse. People are going to insist till their dying day that they can't be any good because they don't resolve such-and-such a test chart, but the majority of users will be more than happy with the results they get, whether it can be "proven" to be 4K or not.

 

Whatever else you want to say about the RED, on a scale of ten, you have to give it a score of 200 for value for money, if nothing else. (How long they have the market to themselves remains to be seen of course).


  • Premium Member
I thought the established Cinematography.com resolution/latitude and color rendition test was to ask the wife. Studies on this forum show that wives can pick out subtle image characteristics more than 10 times more accurately than the leading charts and patterns.

 

:)

 

Hi Gavin,

 

When friends of Geoff Boyle's wife were asked which pictures they preferred, they always chose shots where the lenses were made by Cooke.

 

Stephen


I thought the established Cinematography.com resolution/latitude and color rendition test was to ask the wife. Studies on this forum show that wives can pick out subtle image characteristics more than 10 times more accurately than the leading charts and patterns.

 

:)

Now THAT'S funny! I have to admit I'm frequently guilty of employing this measurement system.


From the "getting 4k" thread to this one, I just wanted to thank you guys for taking the time to discuss and share these concepts in a language many can understand. It is fascinating to get a glimpse of where we're at with this technology from a historical perspective...


As far as the legacy of interlaced television is concerned, there is no doubt that there is a proud legacy of satisfied consumers and professionals who to this day advocate the continued use of this obsolete technology. The same can be said of legacy analog television broadcasting, which still has zealous advocates who insist that analog broadcasting should continue for the useful life of the remaining analog televisions. It took an act of Congress to mandate the end of analog broadcasting by 2009, and this was only accomplished with the promise of free digital-to-analog converter boxes; yet people are still writing to their congressmen threatening to vote them out of office if their analog televisions go dark.

 

 

As for the comment that no country has ever accepted a progressive-scan broadcasting standard: even if it is true that 90 percent of all worldwide HD broadcasting is in the 1080i format, much of the material is acquired progressively, either on film or with progressive-scan HD cameras. Using progressive segmented frame (PsF) technology, this material is output as an interlaced stream and reconverted to progressive by the television. So if we count this technology as progressive scanning, we can say that in all likelihood the majority of worldwide HD broadcasting is in some type of progressive-scan format rather than interlaced.

 

The comment that modern-day flat-panel televisions are neither progressive nor interlaced is quite misleading. It may be true that LCD televisions do not scan, but rather light up every pixel simultaneously. But the fact is that flat-panel televisions are much more natively compatible with progressive-scanned signals than they are with interlaced signals. Progressive scanning mimics the creation of photographs much better than interlaced scanning can, and is much better for frame grabs. Some modern televisions use wobulation technology, which mimics interlaced scanning; however, these televisions run at 120 hertz, which minimizes interlace artifacts, and there are no temporal displacements between two adjacent fields.

 

The only problem with progressive scanning is that it can become a nightmare for marketing departments. This is not true for television marketing, because the first consideration for the consumer is screen size, and most consumers won't pay more for a 1080p television if they can get a cheaper big-screen 720p television. For cameras, however, resolution is the main consideration and can be a deal-breaker, because consumers simply won't buy a 720p camera if they can get a "full high definition" 1080i camera at a reasonable price. Even so, a lot of camera makers are offering multi-format interlaced cameras that have progressive-scanning capability as a bonus, and some offer big discounts on progressive-scan high-definition cameras to encourage consumers to make the switch.

 

As far as RED is concerned, its philosophy is to offer so much resolution that the consumer does not have to take a resolution hit when making the switch to progressive. However, there are questions as to whether or not RED actually delivers 4K color resolution. Even if interlaced scanning dies out, it may survive in ultra high definition as a way to boost color resolution. I believe the first 8K NHK video camera uses progressive scanning for luma information and interlaced scanning for chroma information, so it is indeed a hybrid camera. This scheme may not improve the camera's overall performance, but it should help sales as long as people believe they are getting 4:4:4 color.

