
How will 2K material be refinished for UHD?


cole t parzenn


For whatever my opinion's worth, I think UHD is silly. (Granted, I think Technirama should be the standard format for 2.39.) But if UHD does become the next home video format, how will the countless hours of 2K films be refinished? Surely the HD versions of some will simply be upscaled and become the new UHD versions, but for the material that does get a proper UHD refinish, what will that process be? Go back to the negative or the raw files?

Edited by cole t parzenn

You ask a very good question that I feel a lot of people in the industry are avoiding, as it pertains somewhat to archiving.

 

If it was shot on film, you simply rescan the negative. The sharpness itself won't improve (it's been said the pixel equivalent, resolution-wise, is around 6K for 35mm and just under 4K for Super 16mm). However, if you need 4K, 8K, or however many pixels, the negative is a pre-existing image, so all you have to do is scan it with a sensor that has that many pixels. That's why every time a new medium comes out (VHS, DVD, Blu-ray, etc.) they pull the negative out of the vault and do a new "restoration" job. More pixels/quality = new scan.

 

Digital, on the other hand, I'm not too sure about. Sure, you can up-res files, but at the rate we're going with resolution, there will come a point where you can't show your 1080 material on a 4K, 8K, or 10K screen without it seriously starting to degrade. Then again, I would assume that if you were showing it on, say, a home TV, it would just fall back to the lower-res setup. For example, my TV does 1080, but if I run a 720 connection, it adjusts to it. Many movies finished at 2K are actually shot at much higher resolutions, like their film counterparts (RED, for example, can shoot at 5K or 6K), so the raw files will be usable for upgrades for quite a while. Like a negative, they'll just go back to the raw files.
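
(A minimal sketch of the arithmetic behind that last point, assuming purely illustrative capture and delivery widths; none of these names reflect anyone's actual pipeline.)

```python
# Illustrative capture formats and delivery standards (image widths in pixels).
capture_widths = {
    "HD 1080 master": 1920,
    "2K DI": 2048,
    "5K raw": 5120,
    "6K raw": 6144,
}
delivery_widths = {
    "HD": 1920,
    "UHD": 3840,
    "8K UHD": 7680,
}

for cap_name, cap_w in capture_widths.items():
    for del_name, del_w in delivery_widths.items():
        verdict = "needs upscaling" if cap_w < del_w else "covers it natively"
        print(f"{cap_name:>14} -> {del_name:<6}: {verdict}")
```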

 

It's an important question because, at the rate we're going ("K"-crazy), even 5K footage is going to have trouble being re-released without being "up-rezzed" if the new standard goes beyond it.


I gather that HD material looks good on 4K screens. Standard PAL doesn't look too bad on HD either, and people tend to watch older programmes and films for the content. The larger pixel count tends to help productions with large, detailed landscapes, while most TV productions tend to feature people. As the resolution of distribution increases, there will probably be more post-production involved in "improving" the skin on faces, etc.

 

For domestic use it does start becoming a bit pointless beyond 4K, unless people change their seating arrangements and convert their living room into an IMAX cinema.


Except that's not true. I thought the same thing, but with 4K on a decently sized TV screen, like 50 inches, there's a huge difference, HUGE, unless you sit like five meters away from the screen or something. What's staggering is seeing how close you can get to the screen while it stays incredibly detailed, whereas the classic 1080p screens next to it looked like mush compared to the 4K Sonys and Panasonics.

Edited by Manu Delpech

If it was shot on film, you simply rescan the negative. The sharpness itself won't improve (it's been said the pixel equivalent, resolution-wise, is around 6K for 35mm and just under 4K for Super 16mm)...

 

Digital, on the other hand, I'm not too sure about. Sure, you can up-res files, but at the rate we're going with resolution, there will come a point where you can't show your 1080 material on a 4K, 8K, or 10K screen without it seriously starting to degrade.

 

What makes you say that 35mm is equivalent to 6K and Super 16, just under 4K? At 60 lp/mm, S35 should resolve about 2880 lines and S16, about 1502. Even Velvia would only get (coincidentally) 3840 lines from S35, right?
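
(For anyone checking the arithmetic in the question, a rough sketch; the gate widths are approximate, and the 80 lp/mm figure is only an assumption that happens to reproduce the 3840 number.)

```python
# Lines resolved across the frame = lp/mm * gate width (mm) * 2 (two lines per line pair).
def resolved_lines(lp_per_mm, gate_width_mm):
    return lp_per_mm * gate_width_mm * 2

print(resolved_lines(60, 24.0))   # Super 35, ~24 mm gate    -> 2880 lines
print(resolved_lines(60, 12.52))  # Super 16, ~12.52 mm gate -> ~1502 lines
print(resolved_lines(80, 24.0))   # a higher-resolving stock at 80 lp/mm -> 3840 lines
```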

 

As for HD, every pixel becomes a square of four pixels, right? If you recorded 1080 8 bit 4:2:0, could it still look sharper, just by virtue of the increased number of pixels (more tightly packed color elements, finer noise, etc.)? And the shows recorded in 1080 8 bit 4:4:4 should definitely look sharper, if only because the chroma subsampling is done away with, right?
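
(And a quick sketch of the chroma-subsampling side of the question, counting samples per plane only; codec behaviour is a separate matter.)

```python
# Samples per frame for luma and for each chroma plane.
def plane_sizes(width, height, subsampling):
    luma = width * height
    if subsampling == "4:4:4":
        chroma = width * height                # Cb and Cr at full resolution
    elif subsampling == "4:2:0":
        chroma = (width // 2) * (height // 2)  # Cb and Cr at half width, half height
    else:
        raise ValueError(subsampling)
    return luma, chroma

print(plane_sizes(1920, 1080, "4:2:0"))  # (2073600, 518400): chroma is only 960x540
print(plane_sizes(1920, 1080, "4:4:4"))  # (2073600, 2073600): no chroma subsampling
print(plane_sizes(3840, 2160, "4:2:0"))  # UHD 4:2:0 chroma is 1920x1080, as dense as 1080p luma
```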



35mm should be scanned at 6K to avoid aliasing, following the Nyquist rule that you need to sample at double the highest frequency you want to capture. If 35mm actually resolved 6K worth of detail, you'd have to scan it at 12K to avoid any aliasing. Of course, real-world images rarely push the limits in terms of creating aliasing artifacts.

 

My own experience with scanning 35mm for D.I.'s is that the image seems to generally fall between 2K and 4K in terms of actual picture detail, i.e. 2K isn't adequate but you don't quite see 4K worth of detail either, hence my belief that 4-perf 35mm is practically speaking like a 3K image, similar to what the Alexa creates.
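
(A back-of-the-envelope sketch of the Nyquist point, using the roughly-3K-of-real-detail estimate above; the numbers are illustrative.)

```python
# To record detail up to N lines without aliasing, sample at (at least) 2N.
def minimum_scan(detail_lines):
    return 2 * detail_lines

print(minimum_scan(3072))  # ~3K of real picture detail -> scan at ~6K
print(minimum_scan(6144))  # if 35mm really resolved 6K -> you'd need a ~12K scan
```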


 

As for HD, every pixel becomes a square of four pixels, right? If you recorded 1080 8 bit 4:2:0, could it still look sharper, just by virtue of the increased number of pixels (more tightly packed color elements, finer noise, etc.)? And the shows recorded in 1080 8 bit 4:4:4 should definitely look sharper, if only because the chroma subsampling is done away with, right?

 

Pixel 'replication' to scale up leads to 'ugly' images... most 'better' scaling algorithms attempt to interpolate the 'missing' values from the values of the neighboring pixels.

 

There are also methods which 'construct' detail by computed means that give the human viewer the 'impression' that this computed detail was in the original.

 

For example, there are algorithmic methods to compute 'trees' viewed at various distances. An 'upscale' method could compute those trees at higher detail and then blend the computed trees with the original captured trees...
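
(A rough sketch of the difference between replication and interpolation, using NumPy purely for illustration; real scalers use fancier kernels such as bicubic or Lanczos, and the detail-synthesis methods described above are far more involved.)

```python
import numpy as np

def upscale_replicate(img, factor=2):
    """Nearest-neighbour: every pixel becomes a factor x factor block of copies."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def upscale_linear_1d(row):
    """Double a 1-D row of values by inserting the average of each neighbouring pair."""
    mids = (row[:-1] + row[1:]) / 2.0
    out = np.empty(row.size * 2 - 1)
    out[0::2] = row
    out[1::2] = mids
    return out

row = np.array([10.0, 20.0, 80.0])
print(upscale_replicate(row[None, :], 2))  # 2x2 blocks of repeated values: the "steppy" look
print(upscale_linear_1d(row))              # [10. 15. 20. 50. 80.]: interpolated in-betweens
```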


Of course it depends where you sit, but that's why I mention the distance at which most people watch TV. You can of course choose to sit closer!

 

The 'rule' for NTSC, and presumably PAL TV, was 6-7x the height of the TV screen, which for old SD TV, with a 'family'-sized screen, put the viewing distance at about 10-12 feet.

 

I think for HD TV it's about 3x the vertical height, and so, depending, about 5 feet for a 32 inch screen. The Clarks tend to sit 6-7 feet from the 32 inch screen we have.
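
(The screen-height arithmetic, sketched; the multipliers are the rule-of-thumb figures above, and the screen sizes are just examples.)

```python
import math

def viewing_distance_feet(diagonal_inches, multiplier, aspect=(16, 9)):
    """Rule-of-thumb distance = multiplier * picture height."""
    w, h = aspect
    height_in = diagonal_inches * h / math.hypot(w, h)
    return multiplier * height_in / 12.0

print(round(viewing_distance_feet(32, 6.5, aspect=(4, 3)), 1))  # old 32" 4:3 SD set, 6-7x rule: ~10.4 ft
print(round(viewing_distance_feet(32, 3), 1))                   # 32" 16:9 HD set, 3x rule: ~3.9 ft (call it 4-5 ft in practice)
```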

 

Also, the Clarks, when viewing DVDs on the 32 inch screen, will often set the 'zoom' mode to 1/2...

 

The Clarks also sit in the back row of the theater, right under the projection booth window...


Lol, but what do the Clarks think about 5.1 surround and Dolby Atmos then? Inquiring minds want to know.

 

Well, one of the Clarks has 90 dB of loss in one ear and total loss in the other. So 5.1... is... well... useless... About the only thing I'd like is to be able to push dialog 'more out', since dialog is usually centered, and 'background/left/right/whatever' back... There is a minor amount of control on the player that suggests being able to 'enhance' dialog over effects... but it seems pretty limited.


Ran out of 'edit time'...

 

And needless to say, we have the captioning on always... it used to be a real pain prior to the digital era. Until about 15 years ago, unless the film was non-English... forget it. Then various devices started being implemented, along with a caption 'track' (I think it was a player device that synced up with the movie run, but I never looked into the details much). During this period we would sometimes drive up to 100 miles to go to a theater that either had 'open caption' presentations at 'low audience' times like Sunday morning, or had installed the equipment. These days most first-run movies have captions in the theaters near us. The only one in recent times that was not captioned was the 30th anniversary release of "Ghostbusters" (1984).


Of course it depends where you sit, but that's why I mention the distance at which most people watch TV. You can of course choose to sit closer!

 

Depends on the size of the screen too of course.

Although this fantasy that UHD won't make a difference is usually based on the idea that people are getting 1080p broadcasts, which generally they aren't. It's usually 720p or 1080i at best.

So yes, if you are watching Blu-ray, the difference will be more marginal than if you are watching TV.

 

Having said all that, my experience of seeing real-world 4K runs completely counter to all this.

UHD screens look incredible, as you can no longer see the pixels; 4K projection doesn't look special at all. Yeah, maybe you can't see the pixels, but I could never see the pixels in the average film print either, and basically it doesn't jump out as being all that more special than DVCAM projected in cinemas. Sometimes it seems worse!

 

I actually like UHD. It's sensible in that it is not actually 4K (although 3K would have been fine, I reckon) but close enough.

I don't believe 8K will ever really happen beyond large outdoor screens. It's just silly.

 

Oh yes, and there's one last thing. All this talk of viewing distances assumes more of a cinema-style use of TV. As the 3D people found out, that isn't the whole truth about how people use TV. In the cinema, people sit in their chairs and only leave for the loo. With TV, people are more likely to move around.

 

Freya


  • 2 weeks later...

 

The 'rule' for NTSC, and presumably PAL TV, was 6-7x the height of the TV screen, which for old SD TV, with a 'family'-sized screen, put the viewing distance at about 10-12 feet.

 

Those figures are based on the 1930s assumption that all displays will use interlaced scanning, and that people would tend to gravitate toward a viewing distance just past the point where interlace artifacts become visible.

For all practical purposes, interlaced scanning is dead and buried for both HDTV receivers and cameras, yet our standards are still based upon it. What we generally have today is non-interlaced capture and display, routinely turned into pseudo-interlace for transmission/distribution.

Broadcasters now tend to use the "Recommended Viewing Distance" as a cop-out for compressing the crap out of what they transmit!


What we generally have today is non-interlaced capture and display, routinely turned into pseudo-interlace for transmission/distribution....

 

....and then have half of the resolution thrown out of the window by a crappy deinterlacer, because our posh modern TVs can't deal with interlaced material. It's a mess.
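
(A crude sketch of why a naive deinterlacer costs vertical detail: a simple 'bob' that keeps one field and line-doubles it. Real deinterlacers, motion-adaptive ones and the like, do much better, but the discarded field still has to be reconstructed somehow.)

```python
import numpy as np

def bob_deinterlace(frame, keep_even=True):
    """Keep one field (every other line) and duplicate it to fill the frame height."""
    field = frame[0::2] if keep_even else frame[1::2]
    return np.repeat(field, 2, axis=0)[: frame.shape[0]]

frame = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])  # four lines, each with distinct detail
print(bob_deinterlace(frame))
# [[1 1]
#  [1 1]
#  [3 3]
#  [3 3]]  <- lines 2 and 4 (the other field) are gone: half the vertical detail
```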


Those figures are based on the 1930s assumption that all displays will use interlaced scanning, and that people would tend to gravitate toward a viewing distance just past the point where interlace artifacts become visible.

For all practical purposes, interlaced scanning is dead and buried for both HDTV receivers and cameras, yet our standards are still based upon it. What we generally have today is non-interlaced capture and display, routinely turned into pseudo-interlace for transmission/distribution.

Broadcasters now tend to use the "Recommended Viewing Distance" as a cop-out for compressing the crap out of what they transmit!

 

Well, all 'recommended viewing distances' are based on the Mr/Ms Average Human Viewer's visual acuity, and the point at which the 'display' artifacts are 'averaged' out.

 

Someone with 10/20 vision may need to be 20 feet away from a 4K 32" screen for the micro-pixels to average into non-pixelated images... on the other hand, for my pre-lens-replacement, cataract-plus-lifelong-myopia eyes... I was OK at about 3 feet, even with corrective lenses.

 

This is not to say that TV broadcasters wouldn't cheat their mothers if the opportunity arose...

 

Since I don't watch broadcast TV, cable TV, or, for the most part, TV shows, especially TV shows targeting the US broadcast market... OK, I watch some BBC via Netflix (the Daughter pays for Netflix, I pay for her cell phone... I think she is a better wheeler-dealer than I...), and HBO via 'cable' distribution, usually on disks... but for the most part, I've not cared about broadcast TV in a very long time...

 

The problem with these ancient specs hanging around is that the 'engineering' reasoning goes like this... "I have a $10 crystal operating at 15 MHz... and a 14.3181 MHz one for less than 20 cents"... In the case of the latter, that is a popular crystal frequency to divide by 4 to get 3.5795 MHz, which should be well known to all... uh... probably not... but it is the color burst frequency for NTSC color TVs...
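
(The divide-by-four arithmetic, for anyone who wants to check it.)

```python
crystal_mhz = 14.31818   # a common, cheap "4x subcarrier" crystal
print(crystal_mhz / 4)   # 3.579545 MHz, the NTSC color burst / subcarrier frequency
```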


The big thing you are all not bringing up is compression. As it stands now, most of the HD people watch is via cable or satellite, and in both cases the compression is TERRIBLE! Even watching Monday Night Football on Comcast, Charter, DirecTV, Dish Network, or FiOS, the wide-angle shots (with all that detail) block up something horrible... never mind during panning. This resolution is all well and good, but if your compression is too heavy and/or your bit rates are too low, it all goes to hell quickly.

 

I've watched off-air CBS, ABC and NBC broadcasts and the results are much better. But, even then, compression artifacts still show up during panning and other fast motion.

 

Have they established a 4K medium yet, like Blu-ray for HD? That's where you are really likely to see the difference. I try to encourage my Super 8 wedding film customers to get a Blu-ray player if they don't already have one, just because the higher bit rate allows for better rendering of the film grain, etc. DVDs tend to block up when displaying dark, grainy Vision3 500T Super 8 footage. I never have that problem with Blu-ray.

 

Lastly, does anyone have any links to film resolution tests? My general experience with Super 8 is that there is more "image" in a 2K 4:3 scan than in a pillar-boxed HD (1080p) 16:9 scan. When I say more image, I mean text in the shot (like street signs, etc.) is more legible in the 2K scans than in the HD scans. The grain may become so heavy that it's distracting or detracts from the overall image, but the "detail" is still there to be had. And that's with Super 8 (Canon 814 and 1014) lenses.

 

It looks like I'm about to have the opportunity to do some real high-end 4K scans of Super 8 in the near future. I'll try to run some SD, HD, 2K, 4K tests and see what happens. I plan to shoot some HD resolution test targets. I'll shoot a clip of 4 of them in a square (simulating almost 4K) and see what kind of detail we get.


Why is the compression so bad, by the way?

 

There are two basic types of compression... lossless and lossy... In the case of lossless, no data is lost; that is, the exact bit pattern of the original can be reconstructed from the compressed version.
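
(A tiny sketch of what "exact bit pattern reconstructed" means, using zlib purely as an example of a lossless codec; video codecs are far more elaborate.)

```python
import zlib

original = bytes(range(256)) * 100        # 25,600 bytes of sample data
compressed = zlib.compress(original, 9)   # smaller representation on disk / on the wire
restored = zlib.decompress(compressed)

print(len(original), len(compressed))     # fewer bytes stored...
print(restored == original)               # ...but True: every original bit comes back exactly
```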

 

In lossy compression some amount of the data is 'thrown away', never realistically to be recovered... Most video compression techniques are 'lossy', and it is a matter of judgement how much data can be thrown away and still have the 'average' viewer not really notice the difference between 'is it Memorex or is it live'... to use an old ad campaign slogan.

 

For example, 'edges' require more data to accurately represent the 'edge'. So if data points at the edges of elements are 'thrown away', the edge will be less sharp, and the human viewer may notice that, saying the image looks 'soft'.

 

Colors are 'tossed' as well, since the human visual system can't distinguish all the colors that, for example, a 48-bit representation (16 bits per RGB channel) can encode, so the data can be 'reduced' by tossing color information.

 

Same for grey values, depending... I think humans can usually distinguish, say, 64 shades when focused on a small region, but only 16 over a 'large' region... so why represent things with an 8-bit number when 64 grey values fit into 6 bits? One has a 25% 'compression' saving right there alone...
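
(A small sketch of that grey-value idea, quantizing 8-bit values down to 6 bits; illustrative only, since real codecs allocate bits far more cleverly than a flat cut.)

```python
import numpy as np

def quantize(samples, bits):
    """Quantize 8-bit grey values to the given bit depth (and re-centre each step)."""
    step = 256 // (2 ** bits)           # e.g. 4 when bits = 6
    return (samples // step) * step + step // 2

grey_8bit = np.arange(256, dtype=np.uint8)
grey_6bit = quantize(grey_8bit, 6)
print(len(np.unique(grey_6bit)))        # 64 distinct shades survive
print(1 - 6 / 8)                        # 0.25: the "25% saving" from storing 6 bits instead of 8
```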

 

But the take-away message is that most video codecs are lossy; this limits the 'reconstruction' of the actual values captured by the sensor, and one can readily see that when one tries to 'grade' DSLR footage that has been compressed with, say, the H.264 codec. Banding and 'wyrd' colors often result because the original scene data has been lost.

 

With that in mind, there is a 'race to the bottom' on squeezing images, to the point where any 'sophisticated' viewer will immediately notice the compression artifacts.

 

Same is true for highly compressed MP3 audio... high-frequency squealing, the 'wind tunnel effect', etc. are noticeable... but you can get 1,000,000 songs on your player device...

Edited by John E Clark

  • 3 months later...

Of course it depends where you sit, but that's why I mention the distance at which most people watch TV. You can of course choose to sit closer!

 

They found out that whole way of thinking about TV was a myth with 3D. That's just not the way people use TV. In the cinema, yes, that's how things are, but with TV people are more likely to move about, so they might sometimes be closer to or further from the TV.

 

Also, with all the 4K I have seen on televisions, it's instantly noticeable that you can't see the pixels anymore, and that is a big thing. On the big screen I don't notice the difference as much, as I never saw pixels in film prints in the first place.

 

Freya

