Zero Black is OVERRATED.


Alessandro Machi

Recommended Posts

  • Premium Member

I think zero black is overrated by digital video editors.

 

The very nature of a television screen is such that the chroma level needs some brightness level to attach itself to, and 7.5 IRE gives chroma more signal to attach itself to. I am not saying that 7.5 black is better than zero black; I am merely saying that it is not obvious that one method or the other is better all the time.


  • Premium Member

Hi,

 

> The very nature of a television screen is such that chroma level needs some

> brightness level to attach itself to

 

I'm not quite sure what you're trying to say here, but if it's what I think you're trying to say: you may have had a point when considering comb filters and summing amplifiers in the RF-to-RGB converter of an analogue TV, but that consideration is obviated in digital formats. After all, PAL never had a 7.5% setup level.

 

Phil


  • Premium Member
> I'm not quite sure what you're trying to say here... you may have had a point when considering comb filters and summing amplifiers in the RF-to-RGB converter of an analogue TV, but that consideration is obviated in digital formats. After all, PAL never had a 7.5% setup level.

I think 7.5 gives an overall better result if you are making both DVD and VHS copies.

 

Another issue revolves around viewing environment. I'm wondering whether smaller television sets show off contrast to the average viewer as well as a larger set does.

 

Was 7.5 IRE created strictly for analog broadcast spec purposes?

 

Even lighting can enter the equation. Reality TV, shot with low-end digital camcorders, actually benefits when the dynamic range is reduced, the darks are brought up, and the brights are slightly muted. That type of "adjustment" bodes well for what 7.5 IRE brings to the table.
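For illustration only, here is a minimal sketch of that sort of adjustment; the function name and the endpoint values are my own illustrative choices, not any broadcast standard:

```python
# Hypothetical sketch of the "reality TV" adjustment described above:
# lift the darks and gently mute the brights by linearly compressing
# the 8-bit luma range. The endpoints here are illustrative only.
def tame_levels(luma, black_out=32, white_out=235):
    """Linearly compress full-range luma (0-255) into a reduced range."""
    scale = (white_out - black_out) / 255.0
    return [round(black_out + v * scale) for v in luma]

print(tame_levels([0, 64, 128, 255]))  # -> [32, 83, 134, 235]
```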


> Was 7.5 IRE created strictly for analog broadcast spec purposes?

> Even lighting can enter the equation. Reality TV, shot with low-end digital camcorders, actually benefits when the dynamic range is reduced...

I don't remember what the reasoning was behind 7.5 black, but it may have had to do with the response characteristics of tube cameras at the time...

 

Once you go out of the digital domain, most NTSC equipment (monitors, etc.) will just lift it back up.

 

http://pro.jvc.com/pro/attributes/prodv/cl...up/JVC_DEMO.swf

 

I hope the future of digital isn't tied too strongly to "what looks best for Reality TV"

:unsure:

 

-Sam


  • Premium Member

I thought you were supposed to work at 0 for digital until you made a conversion to analog NTSC, at which point you set the black at 7.5 IRE? Isn't there a problem with doing all the digital work with black at 7.5 IRE because it can accidentally get doubled when converting to analog NTSC? Not to mention conversions to PAL?

 

Anyway, what difference does it make as long as black is at the "correct" IRE level for the particular FINAL distribution medium? I.e., what difference does it make if you shoot and work with black at 0 IRE and convert it later to 7.5 IRE for NTSC tape distribution and broadcast, versus working at 7.5 IRE from the moment of capture and all the way through editing?
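To put numbers on the doubling hazard David describes, here is a minimal sketch, assuming a simple linear setup conversion (the function name and values are illustrative, not any product's actual behavior):

```python
# Sketch of adding NTSC 7.5 IRE setup to a 0-100 IRE signal, and of
# what happens if setup is accidentally applied to material that
# already has it: black ends up lifted well above 7.5.
def add_setup(ire):
    """Map 0-100 IRE linearly onto 7.5-100 IRE (NTSC setup)."""
    return 7.5 + ire * (100.0 - 7.5) / 100.0

once = add_setup(0.0)       # 7.5   -- correct black
twice = add_setup(once)     # ~14.4 -- setup applied twice: washed-out blacks
print(once, round(twice, 1))
```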


  • Premium Member

Hi,

 

Yes, the main problem with it is that it gets overlooked. However, this is part of a much wider problem with similar symptoms, in that the black-to-white range of DV formats is supposed to be between 16 and 235. A lot of software overlooks this and relies on a separately installed codec to remap the levels, and poorly written codecs get this wrong all the time. You can end up either with imported video sitting up at 16/255 or with exported video getting horribly clipped and crushed; internal re-rendering of DV streams gets horrible, and matted items get grey boxes around them. Add 7.5 IRE setup and you're in a whole new world of pain.

 

Phil
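A rough sketch of the kind of remapping Phil describes, assuming 8-bit luma and the Rec. 601 studio range; the function is my own illustration, but running the expansion twice reproduces the clipping and crushing he mentions:

```python
# DV/Rec.601 keeps black at code 16 and white at 235. A codec that
# expands this to 0-255 on import must NOT be expanded again, or
# the ends of the range crush and clip.
def expand_601(v):
    """Studio range (16-235) -> full range (0-255), clamped to 8 bits."""
    out = (v - 16) * 255.0 / 219.0
    return max(0, min(255, round(out)))

print(expand_601(16), expand_601(235))   # 0 255 -- correct when done once
print(expand_601(expand_601(20)))        # 0     -- near-blacks crush if done twice
print(expand_601(expand_601(230)))       # 255   -- near-whites clip if done twice
```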


  • Premium Member

I just viewed someone's home videos that they want transferred to DVD. My scope showed me levels from zero IRE all the way up to 120 and even slightly higher. If I reshape the levels and bring the lower end back up to 7.5 IRE and the top end down to a more reasonable 100-105 IRE, I can actually improve the amount of detail that I see, especially on underexposed faces.

 

It seems to me that I would want to make the set-up correction before doing anything else, then make additional adjustments to color and hue. If digital codecs are not all they should be, then it seems to me you would want to make the set-up adjustment BEFORE the codecs do their damage and take near-zero black and make it that much blacker.
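A toy illustration of this ordering argument, entirely my own construction: assume a badly behaved codec that clips luma below code 16, and a hypothetical setup correction that lifts black first.

```python
# Once a clipping step has collapsed near-black code values together,
# no later lift can separate them again; lifting first preserves them.
def clampy_codec(v):
    """Stand-in for codec damage: clip any luma below code 16."""
    return max(16, v)

def lift_black(v):
    """Hypothetical setup correction: map 0-235 linearly onto 16-235."""
    return round(16 + v * (235 - 16) / 235)

shadows = [2, 6, 10, 14]                               # distinct near-black detail
print([clampy_codec(lift_black(v)) for v in shadows])  # lift first  -> [18, 22, 25, 29]
print([lift_black(clampy_codec(v)) for v in shadows])  # codec first -> [31, 31, 31, 31]
```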


  • Premium Member

When you edit, you have to set up your system so that the black-to-white dynamic range fits into the legal range.

 

I would say that if you have "non-standard" material (black level at 0 instead of 7.5, for instance), it's better to bring it to the proper levels before converting through a codec.

 

But you may lose some information.

 

If the black level setup is not at 0, it's because monitoring tubes don't respond well to low-level signals. It's as if the tube needs to be "pre-levelled" in order to react to small luminance differences, which are characteristic of detail in the shadow areas.

 

The point is that once the signal is, at one step or another, set to a true 0, all that shadow detail might be lost.

 

It's the same for highlights set higher than 100/105 IRE.

 

This is why it's better to reduce contrast rather than setting the whites and blacks to specific values, since clipping takes every point close to the limit to the same output value (i.e., you lose detail).
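To make the distinction concrete, here is a small sketch with made-up values (the function names and ranges are mine): hard-setting blacks to a target collapses nearby shadow points into one output value, while a linear contrast reduction keeps them distinct.

```python
def hard_clip(v, lo=16, hi=235):
    """Force out-of-range values onto the limits: nearby points merge."""
    return min(hi, max(lo, v))

def reduce_contrast(v, lo=16, hi=235):
    """Linearly squeeze the full 0-255 range into lo..hi: points stay distinct."""
    return round(lo + v * (hi - lo) / 255)

deep_shadows = [0, 5, 10, 15]
print([hard_clip(v) for v in deep_shadows])        # [16, 16, 16, 16] -- detail gone
print([reduce_contrast(v) for v in deep_shadows])  # [16, 20, 25, 29] -- detail kept
```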

 

As far as I can tell, there is no way for a true 0 black setup to look right on a CRT monitor. Some products may now be designed for linear displays (plasma...), but I don't see any other reason for it, and I would say that such a product is non-standard.

 

> I thought you were supposed to work at 0 for digital until you made a conversion to analog NTSC

You can have some specific things set at 0. Say you work with alpha mattes: black can be at zero there, it doesn't matter, since you don't want any detail, just pure black in those areas. But an image that comes from a camera, or anything else that produces detail in the low levels, should be put at the proper black level setting.

 

Then you have the correct settings both for transferring to analog and for exploiting the digital signal itself.

 

> what difference is it if you shoot and work with black at 0 IRE and convert it later to 7.5 IRE

 

You'd lose detail if you did so.

 

> ...the black-to-white range of DV formats is supposed to be between 16 and 235

 

The "normal" range for an image that comes from a camera is 16 to 235. where 16 ie to 7.5 ire and 235 to 100 ire. If you work on an "alpha coat" it becomes 0 to 255. But these are settings for any specific image or shot when you do the color timing on the editor before outputting the master tape.


  • 3 months later...
  • Premium Member

If anyone wants to see zero black in action, check out the Fox television broadcast of the Major League playoff games. I have never seen worse-looking broadcast video. What is Fox's "secret"?

 

Zero CRUSHED blacks and blooming whites.

 

Two reasons come to mind as to why Fox would choose to make such an ugly-looking picture. One is that somebody has "computerized" the whole scheme and "Hal" won't give Fox back control of the video levels.

 

Or, and this is a crazy theory, I admit, it's a Homeland Security issue: someone is trying to make the stands disappear into a sea of crushed-black video so that all we see is the game and not the fans.

 

What that has to do with Homeland Security, I'm not sure :unsure: , but it sure makes for crappy-looking video.


Homeland Security or not, does anyone notice a significant difference in quality between watching a broadcast of a game (with replays and such) and a halftime update of another game (highlights and whatnot)? I would guess most of this could be attributed to generation loss, but I can't believe they're not running at least Beta tapes... why does the halftime highlight video seem to be about 15 years old?

I might be imagining this, but it sure seemed apparent with all the (American) football I watched yesterday.


Actually, I'd have to disagree about the worst-looking broadcast video... I think the talk show "Jane" gets that... :) ...heaven forbid a guest wears red lipstick... they end up looking like a deranged clown...

 

Wouldn't the quality be lost by broadcasting 4:1:1 through satellite? I'm not too familiar with how that works, though... I think I remember an article somewhere about how C-band can mangle 4:1:1.

