Is Super 16 really good enough for HDTV?


Keith Walters

  • Premium Member

Until recently I have never had access to full 1920 x 1080 HD at home.

 

I have had unlimited access to large-screen Full-HD sets at work for some time, but the trouble has been that there is virtually nothing transmitted during the day that is true HD or even vaguely approaching it.

 

I now have a large-screen 1920 x 1080 set at home, so I can watch some of the Prime Time offerings in full HD (although virtually all that is available is re-runs right now).

 

One thing that immediately struck me is the variability in quality between shows that are all originated on 35mm film.

 

The other disturbing thing is how grainy shows are that are shot on super-16 (such as the various Law and Order dialects).

 

At the moment the bulk of HDTV panels in use are "Half-HD", with only about one million pixels instead of the two million for full HD. On one of those the grain is acceptable, and it's not really noticeable at all on an SD set. However, as more and more consumers move up to full HD (prices being in free-fall at the moment), I wonder how long super-16 will remain acceptable.

 

I think this may well be the reason the various studios are quietly checking out Super-35 type HD origination (however, not on any of their flagship productions).

 

Mind you, the 1970s, 1980s and 1990s all had brief spasms of interest in video origination for prime-time shows, which very quickly died out. It's not entirely out of the question that as 50" 1920 x 1080 TVs start to proliferate in Account Executives' living rooms, 35mm film will rise yet again Phoenix-like from the mud :lol:


I'm no expert, but I wonder: in the transfer from 35mm, for example, is the picture evenly reconstructed and proportional in its interpretation of the film?

 

In fact I'm wondering if any HD transfer of film gives an exact reflection or just its own interpretation? With pixels it's a determined pattern, but electronically manipulated for storage purposes. Many claims are made for accuracy etc. But...

 

Perhaps the faults with grain etc. are being exaggerated by the electronic capture? Because let's face it, when you go to see a film 40' wide and sit 20 feet away, YOU don't notice it, and yet on a 40" TV suddenly 35mm film has all its grain clearly seen? Some would say that's because the definition in the cinema is not as good as a 40" screen at home in HD.

 

Something seems not quite right, and I don't think it's the low quality of film.


There is also the HDTV compression factor. Most NFL games, for one, are so heavily compressed they look like they were shot on security cams, even on "half HD" monitors (like the term, BTW!)... So it can't be helping 16mm.

 

I think a better way to tell how 16mm behaves on full HD would be to look at a properly telecined, adequately compressed S16-originated Blu-ray Disc on a full HD monitor.

 

Also, someone could design some really kick-ass anamorphic 16mm lenses, to get all the resolution one can squeeze out of the format...


  • Premium Member

After being denied it for years on standard def, I actually LIKE being able to see film grain on HDTV.

 

I think the only problem with it is how it interacts with compression in HD broadcast. But aesthetically, I don't mind seeing the grain structure of the different formats, at least it stands out next to digital photography. Of course, I don't want the grain to be exaggerated in some way beyond what would be perceived in print projection.


  • Premium Member
Of course, I don't want the grain to be exaggerated in some way beyond what would be perceived in print projection.

This was way beyond that.

With a Full HD screen (assuming it has a competent HD decoder - some TVs do not) I can easily see 35mm film grain, but only to the extent that I can see it's there, certainly not intrusive, and about the same visibility you see in a cinema.

 

What I saw on Law & Order looked like an old Betacam with the gain set to +6dB!

 

Incidentally, switching to the SD digital broadcast of the same show, it didn't look too bad at all. Hence my comments about the small number of Full HD screens currently in use.

Edited by Keith Walters

  • Premium Member
Where does this silly term "half HD" come from?

 

Not silly at all.

A Full-HD screen has 1920 x 1080 RGB pixels, which comes to 2,073,600 pixels.

 

The vast majority of flat panel TV displays sold now and in the past have had only about half that number or less, the highest resolution available typically being 1366 x 768, which comes to 1,049,088: about half the pixels actually in a 1920 x 1080 transmission, or about 70% of the linear resolution. I suspect that number was chosen both because 768 is a binary-friendly number (3 x 256) widely used in computer displays, and because it then comes to about one million pixels.
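For what it's worth, the arithmetic above checks out; here's a quick plain-Python sanity check of the pixel counts and ratios (nothing here beyond the numbers already quoted in the post):

```python
# Pixel-count arithmetic for "Full HD" vs. the common 1366x768 panel.
full_hd = 1920 * 1080   # 2,073,600 pixels
half_hd = 1366 * 768    # 1,049,088 pixels

print(f"full HD:      {full_hd:,}")
print(f"1366 x 768:   {half_hd:,}")
print(f"pixel ratio:  {half_hd / full_hd:.2f}")   # ~0.51: about half the pixels
print(f"linear ratio: {1366 / 1920:.2f}")         # ~0.71: about 70% of the resolution
```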

 

What else do you want to call it? "One Over the Square Root of 2 HD"?

 

No mention was ever made of any of this until now of course, when full HD sets have become available at very attractive prices. On a still night you can still hear the howls of outrage from the blowhard early adopters who paid a now-outrageous sum for a half-HD (or less!) panel with just an analog tuner, when a less impulsive neighbour shows them what he has just bought himself for Christmas for about one-quarter the price they paid in 2005 dollars, with a couple of hundred dollars' worth of extras thrown in. Heh heh heh... :lol:

 

If your eyes are good enough you can definitely tell the difference with the right source material, although a lot of people's presumably are not. It's not just a matter of the right glasses either; I've had three decades of arguing with people who I can only assume have Handycam sensors at the back of their eyes :lol:


  • Premium Member
Sometimes over-sharpening can make the grain pop out, as well as adding too much contrast. And sometimes you're also seeing noise, not grain, depending on the telecine used, how it was graded, etc.

You could well be right there. The Super-16 footage didn't look particularly soft, which suggests that they cranked up the detail correction to make it look more like 35mm. But if that was the best the telecine facility could do, I would strongly suggest they take their business elsewhere!

 

Other shows on the same channel, with the same preset "Picture" setting on my TV, were not grainy at all; it was definitely inherent in the show I watched.


  • Premium Member
Not silly at all.

A Full-HD screen has 1920 x 1080 RGB pixels, which comes to 2,073,600 pixels.

...

 

 

No offense, but you don't know what you are talking about. Based on your response, you know little about the mathematics of how the human eye sees, and hence why manufacturers make some sets 1366 and others 1920.


  • Premium Member
No offense, but you don't know what you are talking about. Based on your response, you know little about the mathematics of how the human eye sees, and hence why manufacturers make some sets 1366 and others 1920.

 

Making some pretty radical assumptions there.

 

Until about 3 years ago, for all practical purposes you couldn't buy a domestic 1920 x 1080 flat panel TV at all.

 

Now, many manufacturers are offering a range of screen sizes, sometimes even in a choice of Plasma or LCD, most in a choice of 1366 or 1920, with 1366, surprise surprise, being somewhat cheaper.

 

Inside, in most cases the tuner/AV processing boards are identical, in some cases they even use exactly the same board for both Plasma and LCD versions, with the same remote.

 

All modern flat panels take in a DVI-type signal, and all signal boards put out a matching DVI-type signal. The only difference is that a video board for 1366 has different firmware from one meant for 1920.

 

The only reason some sets are 1366 and others are 1920 is price.

 

There's an enormous amount of utter codswallop talked about viewing distances and screen heights and so on; these are all related to 1930s interlaced scanning technology and have no relevance whatever to flat panel displays.

 

If there is some esoteric mathematical relationship between the mathematics of human vision and the ideal resolution of a TV screen, I doubt too many manufacturers have heard of it.

 

I have seen identical-screen-sized 1366 and 1920 versions of the set I bought side by side; the picture is noticeably sharper on the 1920 set. What more do you want?

 

Are you going to tell me your math says it's NOT sharper?


You are assuming HD is only 1080... there are approximately 20 resolution standards considered HD, from 720p (1280 x 720 pixels) on up. Some networks such as FOX only supply a 720p signal, so your TV set (and mine) is uprezzing to 1080. Better to talk of actual HD resolutions than of "analog"-style fractions like half (versus 1/4... 3/4... HD).


  • Premium Member
You are assuming HD is only 1080... there are approximately 20 resolution standards considered HD, from 720p (1280 x 720 pixels) on up. Some networks such as FOX only supply a 720p signal, so your TV set (and mine) is uprezzing to 1080. Better to talk of actual HD resolutions than of "analog"-style fractions like half (versus 1/4... 3/4... HD).

 

This may help explain some of it. Bottom line: you can't simply say you saw two TVs and one looked better because of the resolution.

 

http://www.cnet.com/hdtv-resolution/?tag=c...ain;contentBody


I'm no expert, but I think that film resolution is roughly as good as its gauge in mm times 100 in lines. For instance, 8mm is only as good as 800 lines of resolution, 16mm is as good as 1600 lines (perfectly fine for a 1080 HD scan), and 35mm is as good as 3500 lines (which is why it's usually scanned at 3K or 4K).
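The rule of thumb above is easy to express as a throwaway function. To be clear, the heuristic itself is the poster's rough estimate (he says so himself), not an industry figure:

```python
# The poster's rule of thumb: usable lines of resolution ~ gauge in mm x 100.
# This is a forum heuristic, not a measured or standardized figure.
def approx_lines(gauge_mm: float) -> float:
    return gauge_mm * 100

for gauge in (8, 16, 35):
    print(f"{gauge}mm ~ {approx_lines(gauge):.0f} lines")

# By this heuristic, 16mm (~1600 lines) clears a 1080-line HD scan,
# and 35mm (~3500 lines) motivates scanning at 3K or 4K.
```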

 

I've seen well-scanned 16mm that looked sharper than my HVX at 1080. There will come a day when 1080 HDTVs will be old and cheap, and we'll be looking at 5K TVs. There will be more cameras with 5K sensors, and people will start asking the question "is 35mm good enough for 5K TVs?". But I will always love the grainy look of film over the sharp, through-the-window look of HD digital.


  • Premium Member

"There will come a day when 1080 HDTVs will be old and cheap, and we'll be looking at 5K TVs"

 

Where does this notion come from? Don't confuse the marketing race of camera manufacturers looking to get you to buy cameras with broadcast TV. It's not a money tree. How many years did we live with 525-line TV in the US? I would not expect a change in home TV broadcasting beyond what we have now for many years to come. They got what they wanted. To change it again requires major infrastructure changes again. Yes, they will market all sorts of gimmicks like 3D, but the core of HD is what it is.


... the mathematics of how the human eye sees and hence why manufacturers make some sets 1366 and others 1920 ...

 

Though conformance to certain parameters of human vision has been utilized in some cases (e.g., chroma subsampling), strict adherence to the principles of vision has not been the criterion when it has interfered with picture quality. For example, the NTSC luma coefficients (0.299, 0.587, 0.114) are almost always applied to nonlinear gamma-encoded signals, which is strictly incorrect (these coefficients are valid only for linear signals), but that is done since it results in better color fidelity near the primaries.
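That luma/luminance distinction can be shown in a few lines. The gamma value and RGB triple below are arbitrary illustrative choices of mine; only the coefficients come from the post:

```python
# Luma vs. luminance with the NTSC/Rec.601 coefficients.
# Strictly, the weighted sum models luminance only for LINEAR signals;
# in practice it is applied to gamma-encoded R'G'B' to form "luma" (Y').
COEFFS = (0.299, 0.587, 0.114)  # coefficients sum to exactly 1.0

def weighted_sum(rgb):
    return sum(c * v for c, v in zip(COEFFS, rgb))

gamma = 2.2                                              # illustrative display gamma
rgb_linear = (0.9, 0.2, 0.1)                             # arbitrary linear-light values
rgb_prime = tuple(v ** (1 / gamma) for v in rgb_linear)  # gamma-encoded R'G'B'

luminance = weighted_sum(rgb_linear)  # the "correct" use, on linear signals
luma = weighted_sum(rgb_prime)        # the common use, on encoded signals

# Decoding luma does NOT recover the true luminance in general:
print(luminance, luma ** gamma)
```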

 

The parameters that HDTV committees have considered important are: aspect ratio, digital sampling structure, total number of lines, colorimetry, and transfer characteristics, among others. Hardware timings have been important considerations in HDTV, and an important consideration when these standards were being worked out was that harmonics of the clock frequency not interfere with the international distress frequencies: 121.5 MHz for civil and 243 MHz for military aircraft emergencies.

 

HDTV has taken several decades to be formalized properly, and there are tons of standards and resolutions besides 1920x1080. In fact, even for 1920x1080, many MPEG-2 encoders used 1920x1088. Even when the horizontal resolution is 1920, there are standards that don't have 1080 vertically; e.g., SMPTE 260 specifies 1920x1035 active pixels out of 2200x1125.

 

If there is some esoteric mathematical relationship between the mathematics of human vision and the ideal resolution of a TV screen, I doubt too many manufacturers have heard of it.

 

Human vision is a complicated subject. We don't know much about the cognitive stages of vision; we have more knowledge about the lower-level stages (retinal sampling structures, cones, rods, etc.). Though TV manufacturers and other people in this area are generally aware of vision principles at this lower level, such arguments should not be carried too far, as the higher-level processing stages of vision are still not fully understood.

 

As I said above, full conformance with human vision has traditionally been dropped in areas where that has resulted in more pleasing pictures.


  • Premium Member
As I said above, full conformance with human vision has traditionally been dropped in areas where that has resulted in more pleasing pictures.

 

Bottom line: anyone who talks resolution numbers is not fully appreciating the aspects that make up a pleasing picture. In fact, studies show resolution to be low on the list of importance for what humans perceive as sharpness and detail. Resolution numbers say very little about how good your TV is. To me it's the equivalent of the useless thread after thread about which camera is the best.


  • Premium Member

There's a Tektronix White Paper "A Guide to Standard and High-Definition Digital Video Measurements" that has a lot of information about different digital and analog TV standards at:

 

http://www2.tek.com/cmswpt/tidownload.lotr...=2210&lc=EN

 

Tektronix sent me a hardcopy when I requested one.


  • Premium Member
This may help explain some of it. Bottom line: you can't simply say you saw two TVs and one looked better because of the resolution.

I never said that.

I said I examined two otherwise identical 42" LCD TV sets (and I mean "identical": I took the covers off and examined the circuit boards) from the same manufacturer. On a suitable test transmission (admittedly very rare) the picture on the 1920 set was noticeably sharper than the picture on the 1366 version. Stepping back a few metres, the pictures looked identical.

 

A 1920 x 1080 pixel panel is capable of displaying more resolution than a 1366 x 768 pixel panel.

WHAT ELSE would you expect?!!


  • Premium Member
You are assuming HD is only 1080... there are approximately 20 resolution standards considered HD, from 720p (1280 x 720 pixels) on up. Some networks such as FOX only supply a 720p signal, so your TV set (and mine) is uprezzing to 1080. Better to talk of actual HD resolutions than of "analog"-style fractions like half (versus 1/4... 3/4... HD).

This is quite correct; basically anything better than Analog PAL or NTSC counts as "High Definition" for many regulatory authorities.

But these are all pretty much interim measures.

 

The whole point of this thread is speculation on what will happen if/when a significant percentage of the viewing population become equipped with large-screen TV sets that can actually display the full 1920 x 1080 resolution that current HD standards are capable of. At present, most HD panels have only 768 or thereabouts vertical pixels, so most viewers would not be able to tell the difference between 720 and 1080. However, as I was pointing out, this situation is changing rapidly.

 

If Super-16 is really good enough for 1920 x 1080 HDTV, then it should also be suitable for 2K cinema releases. I haven't seen too many of those to date.

 

So personally I think maybe people have been able to get away with Super-16 for HD up until now because its deficiencies have been masked by the (relatively) low-resolution TV sets currently used by the bulk of the population. The productions may not stand up so well in the future. I can certainly see a difference right now which is not apparent on lower-resolution sets. Fine-grain S-16 stocks might improve matters, but people seem to insist on using the same speed stocks as they use in 35mm productions, with predictable results.

 

I'm surprised that Jannard and Co. haven't been pushing this future-proofing aspect more, since it's a situation where the RED and similar cameras may have something significant to offer high-end TV production companies other than cheapness.


  • Premium Member
[standing directly in front of the sets I noticed that] the picture on the 1920 set was noticeably sharper than the picture on the 1366 version. Stepping back a few metres the pictures looked identical.

 

WHAT ELSE would you expect?!!

 

You answered your own question. Watching the two from normal viewing distances, your eye cannot see the difference. That is a fact. The angular subtense of a full pixel at 1920 is about 0.7 arc minutes, which is finer than the eye can resolve. So scream all the resolution you want; it makes no difference. We'll never need 4K TVs because you can't see 4K. If someone makes a 70-inch TV then resolution above 1024 becomes a factor, but for right now it isn't, and that is not going to change. If only we sat one foot from a TV we'd appreciate the higher resolution, if it was there (as long as other factors were correct), but we don't sit that close to a TV, so quoting resolution numbers as a gateway to quality is not only silly but misleading. Note: my three-year-old used to, but then I put the TV on the wall.
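The angular-subtense argument can be sketched numerically. The screen size and viewing distance below (a 42" 16:9 panel viewed from 10 feet) are my own illustrative assumptions, since the post doesn't state the inputs behind its 0.7 arc-minute figure:

```python
import math

def pixel_arcmin(diag_in: float, h_pixels: int, distance_in: float,
                 aspect: float = 16 / 9) -> float:
    """Angle (in arc minutes) subtended by one pixel's width."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width from diagonal
    pitch_in = width_in / h_pixels                       # width of a single pixel
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

# A 42" 16:9 panel viewed from 10 feet (120 inches):
for px in (1366, 1920):
    print(px, round(pixel_arcmin(42, px, 120), 2))  # ~0.77 and ~0.55 arc minutes

# If a pixel subtends well under ~1 arc minute (a commonly quoted acuity
# figure), the extra horizontal resolution is invisible at that distance.
```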

 

The reason most TVs are scaled to 1366 has to do with the processing limits of the VRAM most manufacturers use, which works out to 1024; in a 16x9 configuration they have to scale to 1366x768 to fit the screen, which works out to about 1.05 million pixels. To get the cost down and get most people to buy, they went the cheaper route with 1024 VRAMs. More expensive sets and more marketing are bringing more of the more expensive VRAMs, but unless you have a service like FiOS or an over-the-air antenna, you are not getting anywhere near 1920 anyway, if it's even broadcast at that resolution or if the show is actually made at that resolution. Most are not giving you that.

 

No two manufacturers' sets look quite alike side by side, because of the quality of the scalers used. Usually the same manufacturer's sets are identical guts with added bonuses (similar to the extra crap you never used in VCRs), so that's not a factor in a side-by-side of sets from the same manufacturer.

 

All I ask is that the constant barrage of resolution numbers, as if by themselves they really mean anything, be stopped. Resolution by itself means nothing without the other three (far more important) factors that make the picture you look at seem "sharper". You can't say 16mm is equal to so many pixels, and all the other stuff that flies around the web as fact, when it's far more complicated than that. Sort of like saying my car has 16-inch tires, which doesn't tell you what kind of engine or transmission it has.


  • Site Sponsor
If Super-16 is really good enough for 1920 x 1080 HDTV, then it should also be suitable for 2K cinema releases. I haven't seen too many of those to date.

 

 

?? Look harder. I think there are five in theaters right now, including Aronofsky's The Wrestler.

 

S-16 has been widely used for 2K for some time now, including on Apocalypto, and mixed with 35mm and digital in multiple productions.

 

-Rob-

