Stupid HDR question


George Ebersole


  • Premium Member

So, is HDR really all that "good"? Is anybody here authoring their projects for 4K HDR?

 

I bought one recently, and even though I've built a new rig with the latest video card and a Dell super-widescreen 4K monitor, I can't get it to run. Every site I go to just tells me "It's great!" or "It sucks!"

 

Anybody here have any insight?


The last few movies I've shot have all had an HDR color-timing pass done on them, mostly for future-proofing rather than any immediate demand. The images looked incredible on a $30k Sony monitor in a dark room, but how well any of it would translate to a consumer HDR TV in someone's home, I have no idea.


  • Premium Member

Thanks. It seems to me that proper color adjustment can be done on the end-user/consumer side to get the desired image. Maybe I'm being naive about that, but for all the visual tweaks I've seen done to source material, I've yet to come across any image enhancement that actually makes a poor-to-so-so film a good film.


  • Premium Member

i7 at 3.3 GHz with 5800 onboard cache, 32 GB of RAM

 

I built it not more than 8 months ago. It was going to be my editing rig as well as my gaming rig, but if HDR is the new thing, and if I can't edit or author HDR ... well.


  • Premium Member

There is no particular set of hardware specifications associated with ability or inability to create HDR material, beyond the minimum requirements for running your NLE of choice. Perhaps rendering might take fractionally longer for some techniques.

 

The problem with HDR at the moment is that the (very impressive) trade show demos given by outfits like Dolby and Sony are vastly better than what's actually likely to be sold to the public. The Sony BVM-X300 is an OLED, so it enjoys a true zero black level and offers 1,500 nits of peak brightness, while Dolby's Pulsar monitors go all the way up to 4,000 nits using an LCD panel with local backlight dimming. There's no denying that the resulting pictures are clearly and obviously a cut above standard dynamic range. It looks good. It's really nice.

 

Unfortunately, even the targets for consumer-level devices, let alone the products, are nowhere near as capable. They call for either 1,000 nits of peak brightness with a 0.05-nit black level, a target clearly aimed at LCDs, or 540 nits of peak brightness and a 0.0005-nit black level, clearly aimed at the emerging consumer-level OLEDs. This looks better than SDR, but you really have to see them side by side to tell. Bear in mind that consumer TVs have been massively exceeding the hundred-or-so-nit standard associated with Rec. 709 for years, because it looks better and sells more readily. Computer monitors regularly go up to 500 or 600 nits, and consumer HDR sets may fail to even hit the 1,000-nit target, so most consumer HDR is less than a stop brighter, if at all, than conventional displays.
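
For the curious, the arithmetic behind those figures is just log2 of the contrast ratio (peak luminance divided by black level). Here's a quick Python sketch using the target numbers quoted above, plus an assumed 0.05-nit black level for a typical SDR display as a point of comparison:

import math

def stops(peak_nits, black_nits):
    # Dynamic range in stops = log2(peak / black)
    return math.log2(peak_nits / black_nits)

print(round(stops(1000, 0.05), 1))    # LCD-oriented HDR target: ~14.3 stops
print(round(stops(540, 0.0005), 1))   # OLED-oriented HDR target: ~20.0 stops
print(round(stops(100, 0.05), 1))     # assumed SDR display: ~11.0 stops

On paper the OLED target comes out well ahead because of the very low black level, even though its peak brightness is roughly half that of the LCD target.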

 

This is a shame, because really good HDR is, well, really good. It can be clearly and obviously better than pre-existing pictures, and in a way that doesn't really cause too many problems. It's hard to dislike, even for a dedicated curmudgeon such as yours truly. Even the much less capable consumer-level implementations are not a bad thing - a thousand-nit TV with really decent blacks is great. It's just not as great as it could be. Partly this is the economics of domestic TVs; partly it's the keenness of the TV industry to have a new badge to put on things without having to do more than the absolute minimum amount of work. In this respect I blame the TV manufacturers for making up a new acronym every week, which serves to dilute the impact of genuinely interesting things. Still, the bottom line is that it's generally good, and it will get better as TVs improve - particularly the peak brightness of OLEDs - but the actual deployment of it could be seen as lacking a little in ambition.

 

P


  • Premium Member

Depends on what software you're using, but no, it's just a matter of processing power.

 

Now, individual pieces of software may have specific requirements, but in the general case there's no particular reason a given frame size should be associated with a specific CPU requirement.

 

P


  • Premium Member

Thanks, Phil.

 

Honestly speaking, as one curmudgeon to another - and again, this is just me - it seems like if any DP worth his salt is getting the image the director wanted, then there really isn't a need for new technology like HDR. I've seen demos, and my thinking is that a good image comes down to resolution technology, because you can always tweak blacks and colors in post whether you're shooting digital or film.

 

That may sound naive, or like heresy, but I think at some point you have to decide what and how you want to present your final product to your audience. If you can get that, then you're done.

 

I won't call HDR a gimmick, but in spite of all the impressive image comparisons I've seen, it still doesn't strike me as something that can't be achieved through other means.

 

Besides, 14 grand for a 30" monitor? No thank you.


I've seen demos, and my thinking is that a good image comes down to resolution technology, because you can always tweak blacks and colors in post whether you're shooting digital or film.

 

I won't call HDR a gimmick, but in spite of all the impressive image comparisons I've seen, it still doesn't strike me as something that can't be achieved through other means.

 

 

It can't be achieved through other means, because SDR displays don't have the dynamic range to show the full range of luminance values that the original images have. You can color-time for greater detail in the highlights, but you lose the shadows, and vice versa. You're always trying to squeeze 14+ stops of detail into an 8-10 stop range.
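
Just to put some numbers on that squeeze, here's a toy Python sketch of a single global curve mapping roughly 14 stops of scene range onto a display that can show about 10. It's only an illustration of the trade-off, not any grading tool's actual algorithm, and the stop counts are the ones mentioned above rather than measurements.

SCENE_STOPS = 14.0     # dynamic range captured by the camera (from the post above)
DISPLAY_STOPS = 10.0   # dynamic range the SDR display can show (assumed)

def to_display(scene_stop):
    # Simple linear rescale: every scene stop becomes ~0.71 display stops,
    # so tonal separation is reduced across the whole image.
    return scene_stop * (DISPLAY_STOPS / SCENE_STOPS)

for s in (2.0, 7.0, 13.0):
    print(f"scene stop {s:>4} -> display stop {to_display(s):.2f}")

Protecting the highlights with a knee instead of a straight rescale just pushes the loss into the shadows, and vice versa, which is exactly the trade-off described above.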

 

It may be true that perceptually there is little difference, but that can be down to something as simple as viewing in too bright an environment.


  • Premium Member

It can't be achieved through other means, because SDR displays don't have the dynamic range to show the full range of luminance values that the original images have. You can color-time for greater detail in the highlights, but you lose the shadows, and vice versa. You're always trying to squeeze 14+ stops of detail into an 8-10 stop range.

 

It may be true that perceptually there is little difference, but that can be down to something as simple as viewing in too bright an environment.

 

Well, that's the whole thing: if you can't tell, then what good is it?


  • Premium Member

I don't know, my gut feeling is that watching "The Empire Strikes Back" in HDR isn't going to make me relive my boyhood again.

 

When I used to watch movies on TV, I noticed they mostly had better image quality than home video. But now I'm just not seeing the real benefits for more recent films. Just my opinion, though. Maybe I'll change my mind by the end of the year or something.

