
Is Super 16 really good enough for HDTV?


Keith Walters


Will the studios really bother to rescan the negatives of their entire back catalogs? It all reminds me of the early days of CDs.

 

When I was at WRS, MGM usually did a new Xfer whenever there was an improvement in scanners, and they were usually done from a new I/P. Sony/Columbia would do new scans of TV series, also from new I/Ps.

All those Screen Gems sitcoms had A&B roll OCNs.


  • Premium Member

I helped install a bunch of Baselight grading gear at Warner's video operations centre in Burbank where they were doing exactly that, although I got the impression (perhaps false) that the facility was more geared toward somewhat remedial restoration of very old stuff.

 

P


  • Premium Member
Actually, it's ashes, not mud. The Phoenix is supposed to burn itself in a pyre and rise from the ashes. ;-)

 

-- J.S.

Harsh.

I thought they'd fixed the RED One's overheating problems ;)

Well as they say, one bad egg can rot the whole barrel....


  • Premium Member

Back to the original subject...

 

I've been looking at the reruns of L&O Special Victims Unit, L&O Criminal Intent, and CSI Miami on the big screen.

 

Both L&O shows are broadcast on the same HD channel, so I doubt what I'm seeing can be attributed to equipment differences.

 

The L&O SVU episodes were shot on Super 16, and they are consistently diabolical noise-wise, like the screen has German Measles or something.

 

The L&O CI episodes are from 2007 and were shot with a Panaflex, and the difference is staggering. You could project that footage in a cinema and I doubt there would be too many questions asked.

 

CSI Miami is on a different HD channel, and while I've always liked its pictures in SD, in HD noise was definitely present in the darker areas, although nothing like what I saw on SVU. So I would imagine they are using a faster film stock than they did for L&O CI.

 

I don't know what the archival value of shows like that is, but as full-HD screens start to become mainstream, I think people are going to come to regret implementing such false economies.

 

If eye candy is going to remain a significant component of shows like that, they'd better not skimp on the sugar content, or they risk malnutrition in the re-runs.


Hi Keith,

I've seen the same shows on the same networks, and what bothered me more than the noise was the awful motion artifacting on some broadcasts of some shows. It's pretty awful; something goes wrong in the NTSC conversion. I always thought the noise was more related to poor broadcasting, as it never looks much like grain to me.

I think 7HD looks better than 9 or 10, with some of their locally produced stuff at primetime looking great (quality of the production and show put to one side). I watch most of my HDTV on a 720p projector or computer monitor, and sometimes on a 1080p Samsung.

I wonder what "Freeview" will be like? All SD, no?


  • Premium Member

Guess some of you don't realize you aren't watching 'uncompressed' HD. You are not, nor are you ever, watching 1920 HD if you watch American TV. The best signal is at the network head end for a network show. If you are watching over the air, you are watching MPEG compression, which in a word can be brutal. If you are not watching over the air, the signal normally goes through at least one satellite transmission. That means at least one transcoder, one decoder, and at least a few digital proc amps along the way, then usually four or five different processors and scalers on the receive end, before it is broadcast or sent down a cable, not counting the four to ten amplifiers it took to get the signal from the cable head end to you. The digital cable and satellite compression transport stream is always MPEG, so as I said, it can be brutal.

There is no such thing as 1920 TV broadcast in the US (the signal is, but not the source), as all HD origination is 1440 HDCAM or 1280 DVCPro. So anyone who thinks they are watching 1920 doesn't know how broadcast TV works. Please don't do web searches for HDCAM specs and say HD-SDI is 4:2:2 1920: all HDCAM is recorded at 1440 3:1:1, and the 1920 is a manufactured output of the 1440. There is no such thing as "half HD"; the only HD you see is mastered in either 1440 or 1280, nothing else. And once again, all this travels in an MPEG stream at 4:2:0, at bitrates so low as to sometimes be embarrassing. Remember, only Blu-ray gives you 1920; no American broadcast does. NONE!

Then of course some of the "noise" you are seeing in darker regions of the screen, as mentioned here, is actually created in the TV set itself, especially if it's an LCD. Many of you think you are seeing noise from film but are in reality seeing noise created by the set, or noise in the transmission pipeline. In a way, though, it's good that it's really not 1920: at the same bitrate as 1440, the 'more' picture you'd get with 1920 makes for far more compression errors. So, a few disturbing facts:

 

Two standards for broadcast in the US:

HDCAM 135 Mb/s 3:1:1

DVCProHD 100 Mb/s 4:2:2

 

NONE of these formats are native 1920. Everything is stretched in the chain to make it so, but it was always recorded at one or the other of the two sizes above.
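To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The rasters and chroma widths are the commonly published figures for these formats (1440/480 for HDCAM 3:1:1, 1280/640 for DVCProHD 1080i), so treat this as an illustration rather than a spec sheet:

    # Samples actually recorded per frame, per format.
    # name: (luma_w, luma_h, chroma_w)  -- chroma width per channel (Cb and Cr)
    FORMATS = {
        "Full-raster 1920x1080 4:2:2": (1920, 1080, 960),
        "HDCAM 3:1:1 (1440 luma)": (1440, 1080, 480),
        "DVCProHD 1080i (1280 luma)": (1280, 1080, 640),
    }

    FULL = 1920 * 1080 * 2  # total samples in a full-raster 4:2:2 frame

    for name, (lw, lh, cw) in FORMATS.items():
        total = lw * lh + 2 * cw * lh  # luma plane plus Cb and Cr planes
        print(f"{name:29s} {total:>9,} samples/frame "
              f"({100 * total / FULL:.0f}% of full raster)")

Run it and HDCAM comes out around 62%, and DVCProHD around 67%, of the sample count of a full 1920x1080 4:2:2 frame; that gap is what the up-conversion has to paper over.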

 

The bad news? The transmission bitrate to your TV is about 14 Mb/s for MPEG2-HD and MPEG4. As a result, nothing is lost in the fact that no network broadcast in the US is really 1920; if it were, at those bitrates the quality would suffer. All upscaling is done by your cable box or your TV. Sorry for the spoiler.
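For a sense of how brutal 14 Mb/s is, the arithmetic is easy to check (a minimal sketch assuming 1080-line 4:2:0 at 29.97 frames/s and 8 bits per sample, which is typical for MPEG-2 broadcast; the figures are mine, not any particular station's):

    # Uncompressed vs. broadcast bitrate for 1080-line 4:2:0 8-bit video.
    width, height, fps = 1920, 1080, 30000 / 1001  # 29.97 frames/s
    samples_per_frame = width * height * 3 // 2    # 4:2:0 = 1.5 samples/pixel
    raw_bps = samples_per_frame * 8 * fps          # 8 bits per sample

    broadcast_bps = 14e6                           # ~14 Mb/s transport stream

    print(f"uncompressed: {raw_bps / 1e6:5.0f} Mb/s")
    print(f"broadcast:    {broadcast_bps / 1e6:5.0f} Mb/s")
    print(f"ratio:        {raw_bps / broadcast_bps:5.0f} : 1")

That works out to roughly 746 Mb/s squeezed into 14 Mb/s, a compression ratio on the order of 53:1 before the picture ever reaches your set.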

 

I was in HBO's master control room the other day visiting a friend. I was shown what the picture looked like playing straight off the machine from the master, then shown their Time Warner cable feed back into the building. Keep in mind that the TW head end is on the same street. The difference in picture was noticeable. Surprisingly noticeable.

 

Want to see the noise that many think is film grain on your TV? Put in a DVD and watch the black between program elements. You are looking at what most people think is film grain in the darks.


  • Premium Member
Hi Keith,

I've seen the same shows on the same networks, and what bothered me more than the noise was the awful motion artifacting on some broadcasts of some shows. It's pretty awful; something goes wrong in the NTSC conversion. I always thought the noise was more related to poor broadcasting, as it never looks much like grain to me.

I think 7HD looks better than 9 or 10, with some of their locally produced stuff at primetime looking great (quality of the production and show put to one side). I watch most of my HDTV on a 720p projector or computer monitor, and sometimes on a 1080p Samsung.

I wonder what "Freeview" will be like? All SD, no?

To be honest I haven't looked at an enormous amount of prime-time stuff since I got the TV, as there isn't too much to choose from at the moment.

I can't say I've seen any motion artifacting to speak of, and I've certainly been looking for it. (No more than I would see on analog TV at any rate).

 

The only thing I've really noticed is the variation in image grain.

 

As for Freeview, as far as I understand it, for the foreseeable future all the networks will have 2 x SD and 1 x HD digital channels, with three different programs.

 

I've heard their real agenda is to persuade the government to change the HD spec to MPEG4, allowing them to have 2 x HD channels and maybe 4 x SD for the same bandwidth.

 

However, the ACMA is very keen not to allow a repeat of the digital terrestrial fiascoes that have occurred in other countries, where broadcasters were allowed to cram as many digital channels as they liked into the allotted bandwidth, so I doubt they have much chance of sliding that past.
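For what it's worth, the MPEG4 bandwidth arithmetic does stack up, at least on assumed numbers. A rough Python sketch: the ~23 Mb/s payload is typical of a 7 MHz DVB-T multiplex, and the per-service bitrates are my own guesses, not broadcaster figures:

    # Does "2 x HD + 4 x SD" fit in the same multiplex under MPEG-4?
    MUX_CAPACITY = 23.0                           # Mb/s, rough DVB-T payload
    mpeg2 = {"HD": 13.0, "SD": 4.5}               # assumed MPEG-2 service rates
    mpeg4 = {k: v / 2 for k, v in mpeg2.items()}  # MPEG-4 AVC at roughly half

    for label, rates, hd, sd in [("MPEG-2, 1 x HD + 2 x SD", mpeg2, 1, 2),
                                 ("MPEG-4, 2 x HD + 4 x SD", mpeg4, 2, 4)]:
        used = hd * rates["HD"] + sd * rates["SD"]
        verdict = "fits" if used <= MUX_CAPACITY else "too big"
        print(f"{label}: {used:.1f} of {MUX_CAPACITY} Mb/s -> {verdict}")

On those assumptions both lineups come to about 22 Mb/s, which is exactly why halving the codec rate doubles the channel count in the same spectrum.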


  • Premium Member
Want to see the noise that many think is film grain on your TV? Put in a DVD and watch the black between program elements. You are looking at what most people think is film grain in the darks.

OK, I just did.

There was just the usual slight MPEG flicker in the black, not noticeable when there is any picture content. I know what film grain looks like; this was nothing like film grain.

 

I think you might have a problem with your equipment.

 

And MPEG does not automatically mean "brutal". The level of compression distortion is entirely dependent on how much compression the broadcaster chooses to use.
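To make that concrete: the bits-per-pixel budget is set entirely by the bitrate the broadcaster allocates. A quick sketch (the 40 Mb/s figure is the Blu-ray maximum video rate; the two broadcast rates are illustrative):

    # Bits available per pixel at various chosen bitrates (1080 at 29.97 fps).
    def bits_per_pixel(bitrate_bps, w=1920, h=1080, fps=30000 / 1001):
        return bitrate_bps / (w * h * fps)

    for label, rate in [("starved broadcast mux", 8e6),
                        ("typical HD broadcast", 14e6),
                        ("Blu-ray maximum", 40e6)]:
        print(f"{label:22s} {rate / 1e6:4.0f} Mb/s -> "
              f"{bits_per_pixel(rate):.3f} bits/pixel")

Same codec family, a five-fold difference in budget: that choice, not MPEG itself, is what decides whether you see blocks.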


As for Freeview, as far as I understand it, for the foreseeable future all the networks will have 2 x SD and 1 x HD digital channels, with three different programs.

 

I've heard their real agenda is to persuade the government to change the HD spec to MPEG4, allowing them to have 2 x HD channels and maybe 4 x SD for the same bandwidth.

 

However, the ACMA is very keen not to allow a repeat of the digital terrestrial fiascoes that have occurred in other countries, where broadcasters were allowed to cram as many digital channels as they liked into the allotted bandwidth, so I doubt they have much chance of sliding that past.

 

I had thought it was an effort to expand broadcasting so that they could lobby the government to decrease the local content quota to be in line with pay TV, now that they have "three individual networks to feed".

 

Maybe my set-top box is particularly crap, but I see major motion artifacting on some shows (though not necessarily on every show on the network), always imported programming (there isn't much else), and it's always bothered me more than the compression and noise. I've noticed a difference in black levels between the HD and SD counterparts, and you can hear the audio decoding change as well.

It's quite possible I have a very cheap and sensitive tuner. It was certainly cheap...

 

But, for what it's worth, if the show is being broadcast on its HD sibling, I'm tuned into that over the SD counterpart...


Two standards for broadcast in the US:

HDCAM 135 Mb/s 3:1:1

DVCProHD 100 Mb/s 4:2:2

 

Neither HDCam nor DVCPro is a "broadcast standard." They are videotape recording formats that have nothing to do with broadcast, other than the fact that some - and I emphasize, some - programs are either recorded or delivered on them. However, the great majority of prime time programming is delivered on HDCam SR, a completely different format from HDCam and one that does not have any of the limitations you're talking about. Many of these programs originate on either film or HDCam SR tape (anything shot with a Genesis or an F23 is likely in this category). They never incur HDCam compression anywhere in their post chain, nor do they incur it when the delivery masters are created.

 

NONE of these formats are native 1920. Everything is stretched in the chain to make it so, but it was always recorded at one or the other of the two sizes above.

 

This is just not the case, as explained above.

 

The bad news? The transmission bitrate to your TV is about 14 Mb/s for MPEG2-HD and MPEG4. As a result, nothing is lost in the fact that no network broadcast in the US is really 1920; if it were, at those bitrates the quality would suffer. All upscaling is done by your cable box or your TV. Sorry for the spoiler.

 

Sorry to question your assertion, but it's simply incorrect. Compression does not affect the pixel size of the original material, although it does of course eliminate a certain amount of information by definition. If you take a 1920x1080 image into Photoshop, and export it as a JPEG image, it's still 1920x1080. Anything else is determined by specific settings in the encoder. There is certainly loss, and there is certainly what appears as noise added - as well as motion artifacts and various other very nasty things - but the image size is not changed. And, once again, HDCam isn't a part of this equation.
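The Photoshop point is easy to demonstrate in code as well. A minimal sketch using the Pillow library and a synthetic noise frame (nothing here is specific to any broadcast gear, and the quality setting is just an arbitrarily harsh choice):

    import io
    from PIL import Image

    # A grain-like 1920x1080 test frame: Gaussian noise converted to RGB.
    frame = Image.effect_noise((1920, 1080), 64).convert("RGB")

    # Compress it heavily to JPEG, then decode it again.
    buf = io.BytesIO()
    frame.save(buf, format="JPEG", quality=20)
    buf.seek(0)
    decoded = Image.open(buf)

    print("original dimensions:", frame.size)   # (1920, 1080)
    print("decoded dimensions: ", decoded.size) # still (1920, 1080)
    print("bytes after JPEG:   ", buf.getbuffer().nbytes)  # far below raw 1920*1080*3

Detail is thrown away and noise-like artifacts appear, but the raster stays 1920x1080; only the encoder settings decide how bad it looks.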


  • Premium Member

"Neither HDCam nor DVCPro is a "broadcast standard." They are videotape recording formats that have nothing to do with

broadcast, other than the fact that some - and I emphasize, some - programs are either recorded or delivered on them. "

 

I deliver dubs to over 50 markets in the US, and all HD dubs require the two formats mentioned in my post exclusively. By standard I was referring to delivery standards (wait before you say anything). And sometimes I am told to deliver even against the standards of the company or network. Take PBS for instance: many PBS stations I deal with can't afford HDCAM even though they are a 1080 network, so they want dubs only in DVCPro. Forgive me, as I am hung up in the world of commercials and sometimes don't think outside of that. In my world, SR is not the option nor the choice for distribution 90% of the time, but I agree with your assertions for long-format program delivery. I take back what I said.

 

"Compression does not affect the pixel size of the original material, although it does of course eliminate a certain amount of information by definition. "

 

I agree with your point, and am talking less about what you start with than about what you end up with, aka "it does of course eliminate a certain amount of information by definition".

 

 

"If you take a 1920x1080 image into Photoshop, and export it as a JPEG image, it's still 1920x1080. Anything else is determined by specific settings in the encoder. "

 

Too bad no one at the station is using Photoshop; they have encoders instead :) And since I deliver mostly HDCAM, I'm not delivering real 1920. Have you been to any cable head ends recently? Maybe Cablevision in NY, or Comcast, or even the new Verizon FiOS center down in NC? Or have you been to any local stations to see some of the set-ups used to compensate for cost? SR is a great format, but man, even in the top 15 markets I've seen some pretty amazing workarounds in the transmission chain. When you see them you realize why things don't always look like they should, and also why you want to start with the best signal you can on the acquisition end.

 

"There is certainly loss, and there is certainly what appears as noise added - as well as motion artifacts and various other very nasty things - but the image size is not changed."

 

Not when you deliver an SR dub. But what happens to that signal from head end to local transmission is what I am really talking about, and it would surprise you to know how much digital degrades. I was just at the hub of a large station group in the Midwest and was told by the head guy, "We have it set up this way now, but hope to get it set up better sometime in the future if we can get the company to give us the money we need. But no one at home has complained..." (he smiles and laughs).

 

 

"And, once again, HDCam isn't a part of this equation."

 

Thanks again. I work too much in the commercial world and did say some things that do not apply to long-format program distribution for a few networks (not the majority of network delivery). I take them back, but only for the few major networks that require SR delivery, like CBS. Most cable networks, like Discovery, require only HDCAM and only get that, and many times the footage was never acquired on anything but HDCAM, if that. And if it went direct to a 900 or 950 it might be 1920 by spec, but it is lacking in terms of the method of recording (unless, as you say, it used an outside recording medium). And many of the older programs now seen in syndication never made it to SR originally. It's not the 1920 that is the problem but the compression and the less-than-stellar way of recording in a camcorder.

But then again, my whole discussion involves the useless use of numbers as if they mean anything in the end. Is 1440 HD? Is a camera that has SD chips pixel-shifted really HD? Is it half HD if it isn't 1920? Is 1920 true HD, and more importantly, does it matter that most TVs folks own don't and can't show full-raster HD, or that 10 out of 10 people can't see the difference between a 720p set and a 1080p set? Nope, 'cause HD is made in the mind and perceived by the brain, and the numbers alone can't explain why the brain does that. In fact, higher numbers often mean little to viewing. But folks sure want to find some kind of numerical nirvana with numbers alone.


  • Premium Member
I had thought it was an effort to expand broadcasting so that they could lobby the government to decrease the local content quota to be in line with pay TV, now that they have "three individual networks to feed".

 

Maybe my set-top box is particularly crap, but I see major motion artifacting on some shows (though not necessarily on every show on the network), always imported programming (there isn't much else), and it's always bothered me more than the compression and noise. I've noticed a difference in black levels between the HD and SD counterparts, and you can hear the audio decoding change as well.

It's quite possible I have a very cheap and sensitive tuner. It was certainly cheap...

 

But, for what it's worth, if the show is being broadcast on its HD sibling, I'm tuned into that over the SD counterpart...

Ah. I wonder if the reason might be that I'm watching it on an integrated HDTV, where the decoder puts out digital RGB that is specifically engineered to exactly match the display panel. I had sort of noticed that an integrated setup does seem to work better than an STB/monitor combination, but I never investigated it that closely. I'll borrow an HD STB and do some comparisons.

 

I'm still enjoying the fact that there's only one remote to worry about :lol:


I just found out that Starz's show "Crash" is shot on S16; I was researching to see what they were using for their tilt-shift lenses. That is for 1080i, and Starz is obviously a premium channel that formerly showed only movies.

 

Not only is it good enough for HDTV, it was good enough to fool me into thinking it was shot entirely on 5218 or 5299, maybe pushed a stop. Grain-wise it looks to be on par with some of the CSI episodes I have seen...


  • Premium Member
However, the great majority of prime time programming is delivered on HDCam SR, a completely different format from HDCam and one that does not have any of the limitations you're talking about.

 

That's been our experience over the last couple of years. SR is a second-generation HD recording format. Prior to that, delivery was on either of the first-generation formats, original HDCam or D5-HD. Toward the end of that first generation, the majority was D5-HD.

 

With the era of file-based post dawning, it's unlikely that we'll see a new helical-scan tape transport. There may be upgrades in the electronics, but the mechanics aren't worth touching for the remaining life of helical-scan technology. Much simpler head-shifting systems such as LTO work fine for files, and cost a lot less to make and maintain.

 


-- J.S.


  • Premium Member

Not Super 16, of course, but I saw one of the re-runs of the most recent Dr Who series on the big screen last night.

It always looked pretty good to me on SD, but up-converted to HD the special effects were downright diabolical.

(The normal action sequences were OK for SD capture)

The CGI bit at the start where the Tardis is bouncing down the wormhole looked like something done on an old PC with a CGA adaptor!

What were they thinking?

(And no, the same transmission on an ordinary 20" CRT with a digital set-top box looked like it always did.)


  • Premium Member
That's been our experience over the last couple of years. SR is a second-generation HD recording format. Prior to that, delivery was on either of the first-generation formats, original HDCam or D5-HD. Toward the end of that first generation, the majority was D5-HD.

-- J.S.

All this aside, the noise I see on L&O still looks exactly like film grain to me.

I have never seen that sort of noise produced by video compression.

Actually, it's interesting that when I transfer old VHS and Video-8 videotapes onto DVD in "SLP" (6-hour) mode, the image quality improves, subjectively at any rate! And even that bottom-of-the-barrel format does not introduce snow- or grain-type noise.

