
4k Ultra HD rant


George Ebersole


  • Premium Member

I never bought a DVD of "Close Encounters of the Third Kind", but I did buy the bluray when it came out, and then took a chance on the 4k Ultra HD. I was impressed, and so also bought a 4k Ultra HD bluray of "E.T.". The clarity of image and motion, and the richness of color and depth of black, are outstanding. To my eyes the movies look better now than they did when I saw them down at the Stanford Theatre just north of Stanford University off El Camino. Truly they look better now than ever before. I'll even venture to say that it probably looks better now than the raw footage straight from the negative (if that's possible).

 

But I'm tired of paying for repeated revamps of films I already own. During the 80s and up through the 90s and early 2000s, computer technology evolved on a monthly basis. The modem, soundcard, video card, or network card you bought today at CompUSA or Fry's was already outdated, or would be in a few weeks. And that's the feeling I'm getting with digital media.

 

I don't want to go back to VHS, or to magnetic tape for audio (apparently it's also being used for computer storage once again), but it would be nice if, once and for all, there could be a definitive version of movie-X that didn't require a household hardware upgrade of some kind. I don't like contemporary movies all that much in the first place, so I'm spared the expense of films I'm interested in and might otherwise have been tempted to purchase. But I refuse to buy a 4k Ultra HD of, say, George Pal's "The Time Machine" or Irwin Allen's "Voyage to the Bottom of the Sea", or even something more mainstream like "The Blues Brothers" or whatever else.

 

I'm burnt out on media technology progression. Ergo I didn't go see Lucas's 3D versions of his Star Wars films.

 

I know my post will probably fall on deaf ears, that 4k Ultra HD took a while to get here, and that I'm no longer striving to create my own media. But as a former industry member who used to sit and watch rushes, who talked with DPs and cameramen back when I was working a lot and here on this forum, who took footage to the labs and picked it up, set up lights, did slate and all kinds of other things, I think this: if a guy like me can get burnt out and upset about shelling out cash every few years for a new upgrade of films I already own, then I can't imagine what's going through the mind of the average Joe, who may have a few films in his library, owns an average bluray player and TV, and probably has to support a family.

 

I'm an older dude. I'm kind of burnt out on movies anyway. I've rarely seen a truly good, profoundly deep, intellectually moving film, but I still appreciate good drama or good adventure. Unlike the continual computer upgrades of the 80s, 90s and early 2000s, though, which actually gave you significant processing power and data exchange capability, I don't think or feel that I'm getting more bang for my buck with 4k Ultra. I think it looks great, but you know what? For both of Spielberg's films I think I would have been just as happy with a standard DVD transfer.

 

I think of all of the Disney films in my library, from Snow White to Dumbo and beyond, and all of the other films I have on bluray, like "The Right Stuff" or "Indochine", and to me, if the film looks presentable and professional enough, then all of the extra data to sharpen the picture is just frosting on the visual cake. It's not that I don't appreciate it on some level, but I refuse to spend any more money on it.

 

Call me whatever you want, but I'm really tired of consumer tech progress that doesn't give me much.


  • Premium Member

I plan on being very selective with what I buy in 4K UHD HDR blu-ray -- I think most older movies won't benefit from it unless it's simply a newer, better transfer than what was available before. I was a little upset to see that the 4K blu-ray of "Superman: The Movie" uses the old transfer for the included 1080P blu-ray copy -- I don't think this movie really needs to be released in 4K, but I was looking forward to a new HD transfer just to compare. I may get some of the old 65mm movies like "2001" and "Lawrence of Arabia", etc. on 4K blu-ray someday, but there isn't much reason to get a 4K version of "McCabe and Mrs. Miller", for example.

 

As for HDR, again, I think older movies weren't really shot with this in mind but maybe an HDR version would give you some of the feeling of contrast that a projected print gives you. I don't know. Certainly prints didn't display every stop of luminance information of the original negative.

 

Of course, there might be new movies that take advantage of 4K and HDR, but there are fewer new movies that I feel like owning these days.


  • Premium Member

The thing is, a lot of what George is saying is completely correct.

 

For instance:

"I'll even venture to say that it probably looks better now than the raw footage straight from the negative (if that's possible)."

 

It is quite possible. Assuming the 4K was scanned from the original camera negative, it'll be sharper than the print-from-interpos-from-interneg-from-negative you saw on the 35mm prints at the time of release. Conventionally made release prints like that tend to have a resolution well under 2K. The 4K probably won't have the same wide colour gamut as film, but that's not hugely significant on most material.
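
A rough back-of-envelope illustrates the point. One common heuristic combines the resolving power of each photochemical generation as a root-sum-square; the per-stage figures below are made-up assumptions for illustration, not measurements of any real stock, but they show why the conventional print chain lands well under 2K while a scan of the negative itself doesn't:

```python
def combined_resolution(stages_lp_mm):
    """Combine per-stage resolving powers (lp/mm) with the
    root-sum-square heuristic: 1/r_total^2 = sum(1/r_i^2)."""
    return sum(1.0 / r**2 for r in stages_lp_mm) ** -0.5

# Hypothetical resolving powers for OCN, interpositive, internegative
# and release print stock (illustrative numbers only):
stages = [80.0, 60.0, 60.0, 50.0]
lp_mm = combined_resolution(stages)

# An Academy 35mm frame is ~21mm wide; one line pair needs ~2 pixels.
width_mm = 21.0
print(f"~{lp_mm:.0f} lp/mm end to end -> ~{lp_mm * 2 * width_mm:.0f} px across")
# ~30 lp/mm -> ~1260 px: comfortably under 2K.  The negative alone
# (80 lp/mm -> ~3360 px) is a different story.
```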

 

Of course, lots of OCNs probably don't resolve 4K either, given the lenses of the time and so on, but that's another issue.

 

But the general thrust of this discussion seems to be that improving technical quality beyond a certain point isn't that helpful, which in my view is absolutely true.

 

I'm not sure about the technological progress of the 90s being uninteresting, though. I think tech progress, particularly in computing, has become less interesting right now, specifically because computers are now more than good enough to do most of the jobs they've become established in doing. In the 90s, big performance gains were common, and we watched things like nonlinear editing become possible, then affordable, then affordable and slick, over a matter of a few years. When HD came along, one of the downsides was that it made our workstations feel slow again. Same with 4K, and that's happened at a time when the tech progress is much slower.

 

But in general, I'm starting to feel the same way. I turned 40 a few days ago, so I'm not quite ready to describe myself as older, but I'm starting to tire of pixel peeping and tech porn in much the same way. Maybe it's a concomitant of middle age.


  • Premium Member

"I plan on being very selective with what I buy in 4K UHD HDR blu-ray -- I think most older movies won't benefit from it unless it's simply a newer, better transfer than what was available before. [...] there are fewer new movies that I feel like owning these days."

 

Yeah, that's kind of my feeling. If it's a big movie that I really like, I may splurge on it; like you say, a 2001: A Space Odyssey or a Lawrence of Arabia. But for older films, or films that are essentially plain comedies or dramas, I'll beg off.

 

Truly, I don't need to see A Charlie Brown Christmas in Ultra HD 4k. To be a little sarcastic here, I don't need to see the rich colors and deep blacks as the Peanuts gang dances on stage while Schroeder plays the piano. I mean ... come on.

 

But it's like, if an Ultra 4K HDR disk of Star Trek comes out, do I really want to throw more money at the studio and its publisher? I'm thinking not. A big, notable dramatic film, sure, maybe, if I like it enough. But not every film. And I simply don't like a lot of today's offerings.


  • Premium Member

"The thing is, a lot of what George is saying is completely correct. [...] I'm starting to tire of pixel peeping and tech porn in much the same way. Maybe it's a concomitant of middle age."

Phil, in the 90s you actually did experience improvements in computer tech. My old 386 was a dinosaur when I built my first Pentium in 91 or 92. But back in the 80s, when the family wanted me to get a computer to help with my school work, computers couldn't do much, so I was against getting one. And I was coding on the first Apples and Apple clones back in 79 and 80. There was no net access, you had to mail away or drive to the store for new software, and there were no patches for bad software. Besides, a typewriter could give you a better-looking document than any printer alive at the time. So I actually do appreciate the advances in computer tech.

 

And yeah, I remember my film instructors telling us how rushes versus work prints versus distribution copies compared to one another. So I can appreciate the advancement on some level, but otherwise I'm just burnt out on new consumer media tech. I'm glad I bought Close Encounters and ET, but I think I draw the line there.

 

Anyway, I guess I've had my soapbox moment :)


  • Premium Member

More ranting; a thought just occurred to me, and it's that when films were released on VHS (and Beta too, I suppose), a lot of light was pumped through the prints during the transfer. Stuff in shadows that you weren't supposed to see in the release print was suddenly visible on the VHS copy. Which, to me at least, says that even though those films weren't shot with HDR in mind, you could, in theory I suppose, create an HDR version of nearly every commercial film. Which leads one to ask: why not do that in the first place?



Tons of examples. I won't go off the deep end too much, but I've got a Beta of The Empire Strikes Back. I got a regular VHS of The Empire Strikes Back when four-head VHS VCRs were muscling out Betamax. I then got the "special edition letterbox" VHS of the same movie. I then got the DVD. And yes, I bought the bluray.



I saw The Empire Strikes Back on opening day with the family. I saw it a couple more times after that. I enjoyed it. I still do. But, with all due deference and respect to both George Lucas and Disney, I simply refuse to give them another red cent for this movie.



I wish I had bought the LaserDisc way back in 83 or 84 or something, because then I wouldn't be griping here on this forum.



Another soapbox moment. Thank you :)



p.s. David, yeah, I don't know what it is about today's movies. I think they're more niche and less broad-audience oriented. That is, they're aimed at specific demographics, or so it feels to me. I mean, they're technically competent, but I think the writing is lacking in a lot of them. In the end it just means less money out of my wallet.


  • Premium Member
"Which, to me at least, says that even though those films weren't shot with HDR in mind, you could, in theory I suppose, create an HDR version of nearly every commercial film."

 

You could. I'm sure they will, if it gets any traction, and they'll want you to buy it again. The usefulness of this is currently dubious in a situation where even TVs advertised as "HDR ready" might struggle to exceed about 600-700 nits, which is practically as bright as many computer monitors anyway.
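
For scale: HDR10 encodes luminance with the SMPTE ST 2084 "PQ" curve, which is designed around a 0-10,000 nit range. A quick sketch of the standard transfer function (the constants come from the published spec; the script itself is just an illustration) shows that a 600-nit panel still covers a surprisingly large share of the signal range, because the curve is perceptual rather than linear:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def nits_to_pq(nits):
    """Inverse EOTF: absolute luminance (cd/m^2) -> normalised PQ code value."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(f"100 nits (SDR-ish peak) = PQ code {nits_to_pq(100):.2f}")            # ~0.51
print(f"600 nits (typical 'HDR ready' TV) = PQ code {nits_to_pq(600):.2f}")  # ~0.70
print(f"4000 nits (mastering monitor) = PQ code {nits_to_pq(4000):.2f}")     # ~0.90
```

So a 600-nit set reproduces roughly the bottom 70% of the code range, and everything mastered above that has to be clipped or tone-mapped down.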

 

 

"Which leads one to ask: why not do that in the first place?"

 

If you want the technical answer: because the displays of the time wouldn't have been able to deal with it; it would have been like watching a log picture on a normal TV. You could ask, why not put some simple hardware in the player to crunch the highlights down with something like a 709 curve, so that the HDR signal looks reasonable given what your TV can actually do -- and that's basically what Dolby Vision does.
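
Dolby Vision's actual mapping is proprietary, so what follows is only a minimal sketch of the general idea: pass the mid-tones through untouched and roll the highlights off softly toward the display's known peak instead of clipping them hard. The display_peak and knee values are arbitrary assumptions for illustration:

```python
def tone_map(nits_in, display_peak=600.0, knee=0.75):
    """Soft-knee highlight compression (illustrative, not any shipping algorithm).

    Below knee * display_peak the signal passes straight through; above
    it, highlights roll off asymptotically toward the display's peak."""
    knee_point = knee * display_peak
    if nits_in <= knee_point:
        return nits_in
    headroom = display_peak - knee_point
    excess = nits_in - knee_point
    # Rational roll-off: approaches display_peak but never exceeds it.
    return knee_point + headroom * excess / (excess + headroom)

for scene_nits in (100, 450, 1000, 4000):
    print(f"{scene_nits:5d} nits in -> {tone_map(scene_nits):6.1f} nits out")
# 100 and 450 pass through; 1000 -> ~568; 4000 -> ~594, just under the 600 peak.
```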

 

HDR is nice but current consumer TV attempts at it, in my view, are not good enough to make it worthwhile.


  • Premium Member

Correct as usual, Phil. I've gotten so used to watching films on my computer and new-fangled TV that I forgot about CRTs.

 

I want to say more, but I need to sleep on it some. All in all, like I say, I was really blown away by the HDR versions of both Close Encounters and E.T. I mean, those really looked sterling. They looked just as good as, or better than, when I saw them in the theatre as a kid. But like you say, current display tech hasn't entirely caught up with it yet. Ergo it feels real gimmicky to me in spite of the superior picture quality.

 

Call me a philistine, but I really don't like a lot of superhero movies, or how the scifi genre has been turned into this way-over-the-top action-adventure thing. I mean, the range of colors and light far surpasses the films I grew up on, but the technical gloss of being able to crush blacks or enhance highlights and display both next to one another for an HDR experience doesn't make up for some iffy content.

 

And I don't know; when I was down at Tyler's place looking at his editing suite, it felt to me like a good solid monitor is all you need, not some $10,000 monster of a monitor to perfect every single color in the spectrum for each shot.

 

Just me. I'm kind of on my high horse when it comes to this topic. Sorry about that :)


  • Premium Member

You need to color-correct on a decent professional monitor if you hope to distribute that master to anyone; there are technical standards that have to be met. Sure, you can also monitor the correction on a consumer monitor to see what happens on a typical display.


  • Premium Member

There are standards to meet, but in terms of creative intent, including critical colour, there isn't really much to say. Getting things past a distributor's quality control process basically means getting them past someone the distributor trusts who sits and watches it intently, and yes, that can absolutely be done on a desktop computer monitor (I've done it and could refer you to other people who have done it.)

 

You have to keep in mind some understanding of what they're looking for, which is obvious technical problems: dead pixels on cameras, compression artefacts, missed focus or obviously soft shots due to GoPro or whatever, skies being green, or excessive noise, clipping or crushing. Otherwise, the reality is that as long as it's in the ballpark, the microscopic shades of a-few-delta-E that everyone obsesses over simply will not be noticed and will not cause anything to fail any sane QC, because it's not wrong, it's just perhaps not precisely what you intended. That's not great, but it doesn't make your project unsaleable.

These days, more or less anything from the midrange of desktop monitors upward, plus a basic calibration with an affordable probe, will get you more than close enough.
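
To put "a-few-delta-E" in concrete terms: the classic CIE76 delta-E is just the Euclidean distance between two colours in CIELAB space, and a difference of roughly 2.3 is often quoted as about the just-noticeable threshold under controlled viewing. The Lab values below are hypothetical, purely to show the scale of error being discussed:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 delta-E: Euclidean distance between two (L*, a*, b*) colours."""
    return math.dist(lab1, lab2)

graded    = (52.0, 18.5, -24.0)  # hypothetical Lab value of a graded pixel
displayed = (52.8, 19.6, -25.1)  # the same pixel on a merely decent monitor

print(f"delta-E: {delta_e_cie76(graded, displayed):.2f}")
# ~1.75 -- under the usual ~2.3 JND, and invisible in moving pictures.
```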

 

P


  • Premium Member

It's funny, because I feel that in the 90s, when technology was booming, things were a lot slower than they are today. I mean, VHS had a life span from the late 70s through 2002? Laserdisc, late 70s through 2000? I used to be in the computer industry, and software updates would be yearly, not monthly. Hardware updates were very slow, sometimes using the same tech year after year until some major update happened.

 

Today, tech is moving at such a fever pitch it's hard to keep up. While it's true the media codecs we use are pretty stagnant (which is good), the technology behind it all has become so much better. The fact that I can play back 6k REDCODE in real time on my Mac Pro tower from 2010 is pretty amazing. Just look at the cell phone industry and you'll see them pushing for huge advancements in technology.

 

I really never invested in BluRay the way I did DVD or Laserdisc. Yes, in recent years, with substantially lower pricing on disks, I've replaced some Laserdisc and DVD titles with BluRay and UHD disks, but I'm honestly waiting for home theater 4k laser projectors before I invest in any UHD playback system. I fret that UHD BluRay is the end of the road for physical media, and it's arrived at a time when studios are scrambling to find more money in the afterlife of their product. So they've gone to great lengths to restore films and release them on disk in an attempt to get more money from those assets. I do think it's great to see it being done, but at the same time, it's not being done at the level it was during the Laserdisc period, when you could literally get hundreds of thousands of classic titles on disc. Today, most of the obscure movies on disk come from LD transfers and are rarely on DVD at all, let alone BluRay. So while it's nice to see the big titles remastered, it's sad that the smaller titles are overlooked, because it's all about the money in the long run.

 

In terms of Close Encounters, it never looked good on film in my opinion. I haven't seen the 70mm blow-up that floats around every few years, but I'm sure it's crisper than any 35mm print. It's amazing to see original prints and realize how bad they really are. I'd love to see a simple, cost-effective digital-to-film technology that would allow these restorations to be accurately put back onto film without the huge costs and time commitments. I don't wanna see a 4k re-release in cinemas; I want to see a beautiful 35mm or 70mm print made from the restoration elements. To me, a theatrical re-release is more relevant financially than a video re-release.


  • Premium Member

"You need to color-correct on a decent professional monitor if you hope to distribute that master to anyone; there are technical standards that have to be met. [...]"

 

Yeah, I guess my response to that is that, to me, technical standards seem more stringent than they ever used to be. I don't go to too many movies these days, but what I do see on the big screen, and in the trailers, is more control and exercise of color manipulation than ever before. A lot more. And I guess it's just the technical robustness of content that's driving the pro-monitor market.

 

I guess it's a matter of how much artistry the director and the rest of the team want to translate to the final venue, whether theatre or home video.


  • Premium Member

For bluray investment, I've purchased films that I really like and that I thought would benefit from the additional visual data. I bought the Indiana Jones set, and it looks far crisper than it did on DVD. I have the DVDs of the Young Indiana Jones TV series, and since those were shot on Super-16 I'm thinking they're not going to benefit from a bluray release. Ergo I won't buy a bluray set if one is offered.

 

A lot of films I saw as a kid and can appreciate more as an adult are offered on bluray, but to be honest, the DVDs look more like they did on screen than the ultra-crisp image you get with bluray. Example: 1980 was a pretty outstanding year for feature films, but I only saw a handful in the theatre, one of them being "How to Beat the High Cost of Living". I have it on bluray. But I also have the DVD. To me the DVD looks more like it did when I paid for a ticket to see it than the bluray does. Meaning I was happy with the DVD, took a chance on the bluray, was impressed with it, liked it to that extent, but was just as happy with the regular DVD.

 

The big SFX films benefit from a lot of tech advancement in terms of executing the shot, but that doesn't make the film better. The original 1978 Superman film, to me, looks far better with its anachronistic soft-filtered 35mm look than something very sharp like "Guardians of the Galaxy". I think a lot of that is just my personal taste, but I am struck by the artistry of that film versus more recent offerings, "Green Lantern" as an example. The more recent films look better technically, but I'm not getting a sense that there's more artistry in films. It actually feels like less.

 

So, getting back on topic, I don't see Ultra 4K HDR disks, overall, as a benefit. I don't know. Maybe with some solid anti-piracy technology and lower ticket prices, more people would go to films or buy physical media, and there'd be no need for HDR 4K gimmicks. Just me.


  • Premium Member

Ohh BluRay and Ultra HD BluRay are entirely different formats. Throw away the resolution...

 

Standard BluRay is H.264, 8-bit 4:2:0, Long GOP, with a maximum video bit rate of around 40Mbps.

Ultra HD BluRay is H.265 (HEVC), 10-bit 4:2:0, Long GOP, with a maximum video bit rate of around 100Mbps.

 

H.265 is much more efficient than H.264 -- in practice HEVC needs roughly half the bandwidth of H.264 for similar quality, so a 1080p encode at similar quality would be around half the size. So Ultra HD BluRay effectively has, for lack of a better way to put it, several times the usable bandwidth no matter what. Then you add the 10-bit encoding and the wider BT.2020 color space on top of that, and you start to see real differences in quality.

 

Remember, 4:2:0 means you get full resolution in the luma (Y) channel but only half resolution, horizontally and vertically, in each of the two chroma channels (Cb and Cr) -- a quarter of the samples each. What you get is aliasing and fringing along colored edges, which we're so used to seeing that it doesn't bother our eyes anymore. Those artifacts shrink at 4:2:2 and disappear entirely at 4:4:4; both disc formats stay at 4:2:0, but the extra bit depth and bandwidth of Ultra HD BluRay make what's left far less visible.
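
Here's a toy sketch of what that 4:2:0 round trip does to a hard coloured edge: average each 2x2 block of a chroma plane, then blow it back up (nearest-neighbour here, standing in for whatever filtering a real decoder uses):

```python
import numpy as np

def subsample_420(chroma):
    """Average 2x2 blocks of a chroma plane (height and width assumed even)."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(chroma_small):
    """Crude nearest-neighbour reconstruction on playback."""
    return chroma_small.repeat(2, axis=0).repeat(2, axis=1)

# A hard edge in one chroma plane: columns 0-2 are one colour, 3-7 another.
cb = np.zeros((4, 8))
cb[:, 3:] = 1.0

round_trip = upsample(subsample_420(cb))
print(round_trip[0])  # [0. 0. 0.5 0.5 1. 1. 1. 1.]
# The crisp 0 -> 1 edge comes back smeared to 0.5 across two columns:
# that's the chroma fringing, while the luma channel stays pixel-sharp.
```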

 

8-bit vs 10-bit is really all about the separation of grayscale. You see it a lot with DVDs and BluRays: a sort of ring or "step" look (banding) in anything that has a smooth gradient. It's very disturbing to me because it's such an obvious issue that could be fixed, and it was never intended in the original production.
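
The banding is easy to demonstrate: quantise the same subtle gradient at both bit depths and count how many distinct levels survive. The gradient span here (5% of full range across a 1920-pixel row, a stand-in for a darkening sky) is an arbitrary assumption:

```python
import numpy as np

gradient = np.linspace(0.50, 0.55, 1920)  # subtle sky-like ramp across one row

steps_8bit  = np.unique(np.round(gradient * 255)).size
steps_10bit = np.unique(np.round(gradient * 1023)).size

print(f"8-bit:  {steps_8bit} distinct levels")   # ~13 -> wide, visible bands
print(f"10-bit: {steps_10bit} distinct levels")  # ~52 -> four times finer steps
```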

 

These issues, together with higher-bandwidth audio, make Ultra HD BluRay a much better format in a lot of cases -- in my eyes the last physical home theater format that will exist. I also don't see streaming catching up anytime soon, as most streamed media is variable bandwidth. Even if they say you're getting 4k, adaptive streaming quietly steps the resolution and bit rate up and down on the fly based on your connection, so most of the time you aren't actually getting a full 4k signal in real time unless you buffer the entire program. Right now the average home internet bandwidth in the US is still around 25Mbps -- nowhere near enough to stream disc-quality UHD content.
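
The arithmetic backs that up. Turning an average bit rate into gigabytes for a two-hour feature (the 100Mbps figure is near the disc maximum; 16Mbps is a rough assumption for a typical 4K streaming service):

```python
def gb_per_movie(mbps, hours=2.0):
    """Average bit rate in Mbps -> gigabytes for a film of the given length."""
    return mbps * 1e6 * hours * 3600 / 8 / 1e9

for label, mbps in [("UHD BluRay, near max", 100),
                    ("typical 4K stream (assumed)", 16),
                    ("average US connection", 25)]:
    print(f"{label:28s} {mbps:3d} Mbps -> {gb_per_movie(mbps):5.1f} GB per 2h film")
# Disc-quality UHD is ~90 GB per film; a 25Mbps pipe running flat-out
# for two hours moves only ~22 GB, so streaming must compress much harder.
```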

