
What causes video noise?


DS Williams


Hello all,

 

First thread for me on this board, so far very impressed with the professionalism here, and very happy with my decision to register.

 

Anyway, I am fairly new to digital cinematography, being only 19 and still in film school, but I try to soak up everything I can from these boards.

 

I have a question about video noise. I always hear about it, and unfortunately I'm always seeing it (I own an HPX170 and an HVX200).

 

What exactly causes video noise, and what is meant when people say a camera has a low "noise floor"?

 

This has all been a part of my quest to figure out why the hell my HVX200 is so noisy.

 

Any comments/help would be much appreciated.


  • Premium Member

Most of the noise in the CCD image sensor itself is thermal noise - that is, electrons being knocked loose by vibration (which is all heat is) rather than by photons striking the sensor. Thermal noise affects both the CCD sensor matrix itself and the delicate, sensitive electronics used to read information from it and amplify that information to a level that's usable by the rest of the camera.

Every pixel on a CCD converts photons to electrons and stores those electrons until the pixel is read out, and the bigger the sensor (and its wells), the more each pixel can hold - which is why physically bigger chips have lower noise and higher dynamic range. This well capacity is often in the tens of thousands of electrons. However, due to thermal noise, every CCD will always have a certain number of electrons in each well after a given amount of time, regardless of how many photons struck it - often somewhere in the several to many tens.

The ratio between the two sets the noise floor of the sensor. If you have a CCD with a well capacity of 50,000 electrons and a random scatter of about 50 thermal electrons per pixel, you have a ratio of 1000:1 - 30 dB as a power ratio, or 60 dB by the 20 log10 amplitude convention usually quoted for video SNR. Since most real CCDs generate enough noise to perturb the least significant bit of an 8-bit image (that is, the noise is more than 1/256 of the maximum amplitude), most CCD cameras have much poorer noise performance than this theoretical device.
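The well-capacity arithmetic above can be sketched in a few lines of Python; the 50,000 and 50 electron figures are the illustrative values from this post, not specs for any real sensor:

```python
import math

full_well = 50_000   # electrons a pixel can hold before clipping (illustrative)
noise_floor = 50     # random thermal electrons per pixel (illustrative)

ratio = full_well / noise_floor          # 1000:1
power_db = 10 * math.log10(ratio)        # 30.0 dB if treated as a power ratio
amplitude_db = 20 * math.log10(ratio)    # 60.0 dB by the amplitude convention
                                         # usually quoted for video SNR

print(f"ratio {ratio:.0f}:1 = {power_db:.1f} dB (power) "
      f"or {amplitude_db:.1f} dB (amplitude)")
```

Halving the noise floor doubles the ratio, which is why noise reduction and dynamic range are two sides of the same coin.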

 

Then there's read noise, which, briefly, comes from the way a CCD shuffles each line of pixels down towards the bottom of the chip as it's being read, plus any intrinsic nonuniformity in the CCD itself (that is: each pixel is not necessarily exactly as sensitive as its neighbours), and amplification noise as the tiny CCD current is boosted to a level where it's useful to more conventional electronics.

 

There are additional sources of noise from things like solar wind and other radiation. Memorably, a camera being used to observe decaying Russian submarine reactor cores showed a pronounced speckling of white dots as subatomic particles from the radioactive material ran into the CCD and liberated charge. There is always some degree of background radiation, although much of it won't make it through the casing of the camera.

 

Taking all this into account, it's not surprising that something like the HVX, which is built to a price and has interesting capabilities in terms of high speed shooting, doesn't have terribly good noise performance.

 

P

 

PS - Edited for a stupid generalisation


  • Premium Member

Phil, is most noise "white" even when it happens in an individual color channel? I assume that these random electrons cause the appearance of a random pixel going to full exposure or level, and thus noise is most visible in black or dark areas of the frame -- as opposed to grain in film, which is most visible in flat expanses of midtones.


  • Premium Member

Not necessarily, no - the thermal noise (and shot noise, that is, statistically there should be n photons here but there didn't happen to be while we had the shutter open) is added to the photoelectric value. You're just reading the number of electrons in the well; you don't know where they came from. Some of them will be optically motivated, some of them will be thermally motivated.

 

For instance: if you have a sensor with an average noise floor of, say, 50 electrons, the actual value will vary around that. About 68% of the time, the actual number of thermally-generated electrons will be somewhere between (roughly, in my head) 38 and 62, since thermal noise is the textbook example of a normal (that is, Gaussian) distribution, and 68% is the proportion of values that fall within one standard deviation of the mean. So if your well holds 5,000 optically-generated electrons, on 68% of occasions you would read anywhere from 5,038 to 5,062.
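Those percentages are easy to check with a quick simulation, assuming (as the post does) a Gaussian noise floor with a mean of 50 electrons and a standard deviation of roughly 12:

```python
import random

random.seed(0)
MEAN, SIGMA, TRIALS = 50, 12, 100_000

# Count how many simulated noise readouts land within one standard
# deviation of the mean (i.e. between 38 and 62 electrons)
within = sum(1 for _ in range(TRIALS)
             if abs(random.gauss(MEAN, SIGMA) - MEAN) <= SIGMA)

print(f"{within / TRIALS:.1%} of readouts fell between 38 and 62 electrons")
# Gaussian theory predicts about 68.3%
```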

 

I suspect it's more visible in the dark for several reasons. Although CCDs themselves are precisely photon-linear, almost nothing else is, and you have to bend their output a lot to get a visually acceptable image out of them; that affects the dark areas most of all. Look at a straight-off-the-chip image from that DVX-100 modification - it's a sea of near-black with highlights popping up out of it, so you're amping up the shadows most of all. Also, noise is a larger proportion of the whole when the region of interest is dark. Psycho-optically, your brain will be winding up its own gain when the region of interest is poorly illuminated. And I'm not sure it isn't visible in midtones - once you've looked at it enough, you can see the fizz in a sky or an area of grass.

 

It's also worth noting that thermal noise in the CCD is not actually that big a deal until you are talking about very long exposures. There are examples on the internet of people pulling the CCD assembly out of a DSLR and doing astrophotography with it mounted on a Peltier-effect cold plate; the improvement is gigantic and well worth the trouble. The thing is, they're doing exposures in the tens of minutes on tracking telescope heads, which we generally aren't in cinematography. Thermal noise accumulates over time, and a forty-eighth of a second isn't much time in this context. Thermal noise is (depending on the authority you ask) either synonymous with dark current or a principal component of it, dark current being that which will eventually fill every well on the CCD completely whether any light hits it or not; 1/48 of a second isn't long enough for much of that to occur.
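To put numbers on that, here's a quick sketch; the dark-current rate is a made-up but plausible room-temperature figure, not a spec for any particular sensor:

```python
DARK_RATE = 25.0     # electrons/pixel/second -- illustrative, not a real spec
FULL_WELL = 50_000   # electrons (the illustrative well capacity from above)

# Dark current accumulates roughly linearly with exposure time
for label, seconds in [("1/48 s cine frame", 1 / 48),
                       ("10 min astro exposure", 600)]:
    dark = DARK_RATE * seconds
    print(f"{label}: {dark:.2f} dark electrons "
          f"({dark / FULL_WELL:.4%} of the well)")
```

At cine shutter speeds the dark charge is a fraction of an electron per pixel; over a ten-minute exposure it eats a substantial slice of the well, which is why the astrophotographers bother with cold plates and we mostly don't.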

 

The bigger deal is read noise, which is much larger and nastier because it comes from a variety of hard-to-control sources. There are clever techniques to minimise some of them, but at the end of the day it's a big amplifier, and big amplifiers introduce noise. In theory CMOS, which can place amplifiers physically closer to groups of pixels, offers advantages in this area.

 

This is a keenly-researched area because if you can reduce the noise of a CCD, you increase its dynamic range. Halve the number of noisy electrons and you have doubled the dynamic range of the sensor, at the cost of reducing its effective sensitivity.

 

And actually yes in one way it is "white" because it's usually very nearly perfect white noise!

 

P

 

PS - Good grief, I'm answering questions for an ASC...


  • Premium Member
Phil, is most noise "white" even when it happens in an individual color channel? I assume that these random electrons cause the appearance of a random pixel going to full exposure or level, ....

 

The term "white noise" doesn't actually refer to colors in the picture. This is an example of an historic analogy coming back to bite us in the a---.

 

"White noise" and "pink noise" are terms that were coined in the analog audio world. All waves, including those of sound and light, have frequency and wave length (you can compute either given the other, but that's a digression). The sound guys needed a term to describe noise that's uniform over their whole frequency range. They called it "white noise" by analogy with white light, which sort of has equal amounts of all the visible frequencies/wave lengths (it's actually a Planckian distribution, but that's another digression). "Pink noise" is sound with a tilt to the curve to give it more energy in the low frequencies/long wavelengths, sort of like adding more low frequency/long wavelength light, which is red, making "pink".

 

So, the sound guys had noise, which may or may not be "white" running through cables and amplifiers as an analog electronic signal. Time marches on, and we get black and white TV. Again we have analog electronic signals, only at much higher frequencies, and noise in those signals can be described as "white" if it's uniform over the whole frequency range. Time marches on some more, along comes color TV, and now we have three electronic channels, one each for red, green, and blue, each of which can have noise which may or may not be "white" in the frequency distribution sense, the word "white" having nothing to do with their actual colors in the picture. Thus the teeth marks in our collective tush. ;-)

 

Only rarely, and in very bright areas of the picture, would noise-originated electrons boost a pixel to saturation. It's in the dark areas, where we don't have many electrons to begin with, that a change of one or a dozen is enough to stick out at us as visible noise.

 

 

 

-- J.S.


  • Premium Member

I guess my question is whether a point of noise in a video image is always red, green, blue, and white -- i.e. does it have different values in terms of IRE, etc., or is it always full exposure/level at that point? Obviously I've seen blue noise, like in the RED, but I'm not sure whether that colour is partly the debayering algorithm assigning a colour to the noise. My guess is that noise comes in all colors.

 

When you get noise from boosting the gain, are you basically seeing more noise because the low-level noise that was already there is being amplified into visibility, or are you generating new noise?


  • Premium Member
... My guess is that noise comes in all colors.

 

When you get noise from boosting the gain, are you basically seeing more noise because the low-level noise that was already there is being amplified into visibility, or are you generating new noise?

 

Yes, noise comes in all colors. Every photosite on every chip will have a little added or subtracted from it at random by noise. Only very rarely and randomly will noise take a given pixel to full exposure or maximum level. The JND (Just Noticeable Difference) level for human vision is a change of about 1%, so any noise that changes a pixel by more than a percent, we'll notice.

 

Boosting the gain may result in a tiny bit of new noise from the amplifier, but the vast majority of it is taking the noise you have and making it bigger. There's no way for an amplifier to know what's signal and what's noise; it just amplifies all of it. Sort of like how a magnifying glass enlarges the fine print on a piece of paper, along with the coffee stain.
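The magnifying-glass point can be sketched numerically; the electron counts below are illustrative values, not measurements from any real camera:

```python
signal, noise = 5000.0, 50.0   # electrons -- illustrative values only
gain = 4.0                     # roughly a +12 dB gain boost

snr_before = signal / noise
snr_after = (signal * gain) / (noise * gain)

# Gain scales signal and noise by the same factor, so the ratio is unchanged
print(snr_before, snr_after)   # both 100.0
```

The noise is more *visible* after gain because it now occupies more code values on the way to the display, not because the ratio got worse.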

 

 

 

 

-- J.S.


  • Premium Member

Noise is per-pixel, so it will have colour regardless of the type of camera. If it hits a red-filtered pixel on a Bayer array, or a pixel on the red-filtered chip in a 3-chip block, you get a red-tinted noise artifact.

 

50 more electrons that were thermally generated will have the same impact on the image as 50 more electrons that were optically generated - so if your electron well has a capacity of 50,000 electrons, 50 more electrons will make the image 1/1000 brighter. If there are 100 more electrons, it'll make it 2/1000 brighter. It won't clip the pixel unless that pixel was already so near clip that the noise value pushed it over the edge.

 

A debayering algorithm will tend to smear the noise of one pixel over several neighbouring pixels, with complex behaviour depending on how clever the algorithm is and on how much the noise artifact looks like, and is treated like, detail to be maintained, used, or filtered out. In general the artifact will become bigger, but it may be somewhat averaged out by the process.

 

When you apply gain, you're just multiplying what was already there (gain settings are marked in dB, so each +6 dB step roughly doubles the signal, noise and all). Now, some camera designs do part of their signal processing in the analog domain and part in the digital, and those may add analog noise - but then quantisation (banding) is also, strictly speaking, noise, so neither route is trouble-free.

 

Also, while you can decide to use a higher bit depth of processing to avoid banding, at some point you are just processing noise. From a technical perspective this probably applies quite regularly to 10-bit imaging, since few practical CCDs or film stocks reliably provide the 20 log10(2^10) ≈ 60 dB of signal-to-noise that would make the tenth bit meaningful. The implication of this is that you might as well store 8 bit and diffusion-dither the rounding errors with Gaussian noise during postproduction - but people don't like this, as you then end up with a colour corrector that doesn't produce exactly the same result twice.

 

P


This question is somewhat prosaic compared to the discussion above but....

 

I have recently been experimenting with a Letus Extreme adapter on an HVX200A, using Olympus Zuiko prime lenses.

 

The images shot with the adapter appear noisier (especially in the shadows) than the images shot with the on-camera zoom alone.

 

I would have thought that a correct exposure is a correct exposure.

 

The noise does not look like artifacts of the ground glass; it looks more like a compression issue.

 

The Zuiko is an f/2.

 

One image is at www.robfeatherstone.com/Blue.jpg

 

the other is www.robfeatherstone.com/green.jpg

 

 

The blue-ish image is the straight lens. The green background is the letus rig.

 

How could the letus increase video noise?


  • Premium Member

My first reaction would be that, despite your inclination to the contrary, you're seeing groundglass noise. It's also possible that the additional noise produced by the groundglass is doing unpleasant things in conjunction with the compression mathematics in the camera, but that's a speculation.

 

Also, your "greenish" image is much lower in contrast than your "bluish" image, which will make noise more objectionable. Can I suggest a controlled, side-by-side comparison on a test chart or still life of some kind?

 

P


My first reaction would be that, despite your inclination to the contrary, you're seeing groundglass noise. It's also possible that the additional noise produced by the groundglass is doing unpleasant things in conjunction with the compression mathematics in the camera, but that's a speculation.

 

Also, your "greenish" image is much lower in contrast than your "bluish" image, which will make noise more objectionable. Can I suggest a controlled, side-by-side comparison on a test chart or still life of some kind?

 

P

 

Hi - I think you are right.

 

Nothing like a good controlled test to sort these things out.

 

I should have some time next week to set something up in a studio.

 

Thanks


I'm not sure whether that colour is partly the debayering algorithm assigning a colour to the noise. My guess is that noise comes in all colors.

 

"Color to noise" is being assigned by the optical filter array and not the debayering algorithm; though, as John Sprung has eloquently pointed out that the term "noise color" is being used incorrectly here. White noise is uncorrelated noise as John has explained and not to be confused with the "color" imparted to the noise by the filter array.

 

The implication of this is that you might as well store 8 bit and diffusion dither rounding errors with Gaussian noise during postproduction

 

Not the right approach. Adding noise after quantization is not as effective as adding noise before quantization.
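That ordering effect can be demonstrated with a toy quantizer; the step size and signal level below are made up purely for illustration. A constant level sitting between two coarse steps is unrecoverable without dither, survives in the average when noise is added before quantization, and stays lost when noise is added afterwards:

```python
import random

random.seed(1)
STEP, TRUE, N = 32, 40.0, 50_000   # quantizer step, true level, sample count

def quantize(x):
    # Round to the nearest multiple of STEP -- a deliberately coarse quantizer
    return round(x / STEP) * STEP

no_dither   = sum(quantize(TRUE) for _ in range(N)) / N
dither_pre  = sum(quantize(TRUE + random.gauss(0, STEP / 2))
                  for _ in range(N)) / N
dither_post = sum(quantize(TRUE) + random.gauss(0, STEP / 2)
                  for _ in range(N)) / N

print(no_dither)    # stuck at 32: the true level of 40 is unrecoverable
print(dither_pre)   # averages back out near 40
print(dither_post)  # still centred on 32: post-quantization noise can't help
```

Noise added before the quantizer randomises which step each sample lands on in proportion to the true level, so averaging recovers it; noise added after merely decorates a value that has already been rounded away.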


Lower the sensitivity to -3 dB, use a soft gamma curve and no colour gain, switch off the detail circuit, and set a manual knee to gain extra latitude; then you can afford to overexpose a little bit.

In the real world we deal with noise especially because of the push for more latitude by exposing for the highlights, so noise reduction is once again a pain in the ass.

 

Gabriel Bordas


But by overexposing you lose latitude; there are really very few situations where you can afford it, and a noise-free image is not always the priority.

Noise is not the enemy of a good image. You must accept noise as a characteristic of the image, like a texture. There's no reason to make an obsession of it - make an obsession of the creativity.

Edited by Gabriel Bordas

  • Premium Member
How could the letus increase video noise?

 

If I understand it correctly, it's a re-shoot off a ground glass device for using film camera lenses. If that's the case, you're adding the GG pattern, which is a sort of random noise, to the video noise.

 

So, what happens when you add two noise sources? Sometimes, both noise sources take a pixel in the plus direction or the minus direction, and the noise gets bigger. Sometimes they're going opposite ways, and it gets smaller. There's a bunch of math to it, but the end result is that the combined noise is the square root of the sum of the squares of the two noises you're adding. Yes, it does sound a lot like Pythagoras and the hypotenuse of a right triangle. That's because in a mathematical sense, random noises are sort of "perpendicular" to each other. Mathematicians call them "orthogonal".
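This root-sum-of-squares rule is easy to check in a quick simulation; the standard deviations of 3 and 4 are arbitrary illustrative values, chosen so the quadrature sum comes out at 5, the classic 3-4-5 right triangle:

```python
import math
import random

random.seed(2)
S1, S2, N = 3.0, 4.0, 200_000   # std deviations of two independent noises

# Add the two independent noise sources sample by sample, then measure
# the RMS (root-mean-square) amplitude of the combined noise
samples = [random.gauss(0, S1) + random.gauss(0, S2) for _ in range(N)]
measured = math.sqrt(sum(x * x for x in samples) / N)

print(measured)                  # very close to...
print(math.sqrt(S1**2 + S2**2))  # ...the quadrature sum: 5.0
```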

 

 

 

 

-- J.S.


  • Premium Member
so what is the best way to minimize noise?

 

To overlight, and provide enough signal to overcome the noise?

 

Hi,

 

TURN OFF THE DETAIL! & always use the lowest gain settings. Then much of the noise & the huge resolution figures you quote will vanish! (Filtering to obtain a white balance will also help.)

 

Best,

 

Stephen


Hi,

 

TURN OFF THE DETAIL! & always use the lowest gain settings. Then much of the noise & the huge resolution figures you quote will vanish! (Filtering to obtain a white balance will also help.)

 

Best,

 

Stephen

 

 

What do you mean the huge resolution figures I quote? And how will filtering help? The HVX isn't any more noisy in tungsten than it is in daylight. It's not a CMOS camera.


  • Premium Member
so what is the best way to minimize noise?

 

To overlight, and provide enough signal to overcome the noise?

 

I wouldn't say "overlight". What you need to do is light for the dynamic range of whatever system you're using. The less dynamic range you have available, the more you need to master the venerable art of fill light. You've got it right when nobody notices that you did it at all.

 

 

 

 

 

-- J.S.


  • Premium Member
What do you mean the huge resolution figures I quote? And how will filtering help? The HVX isn't any more noisy in tungsten than it is in daylight. It's not a CMOS camera.

 

You are claiming the camera resolves more than 720p. Why don't you just try turning OFF the detail? With all video cameras, detail adds noise. If you look at the blue channel you will see a difference in noise levels when you filter; I tried it today. Boosting the gain to colour balance is always a compromise.

 

Stephen


It does resolve a bit more than 720p with detail off. And because the camera has equal photosites sensitive to blue, and CCDs are more sensitive to shorter wavelengths than CMOS, there is no difference in noise performance between tungsten and daylight; whereas on the EX1 I always shoot balanced at 5000K and above, because the sensors are daylight balanced and you can see a lot more noise at 3.2K.


I'll have to find the graph. A study was done, and I'm going to try to dig it up. I do remember it showed CMOS sensors being more inherently daylight balanced. But I suppose tungsten balance would cause noise in any chip, seeing that tungsten light contains 400% more red light than blue - that's a lot of gain added. Daylight, by contrast, has no deficiency in any part of the spectrum, although it is predominantly blue.


  • Premium Member

It's possible that any given camera may have IR or other filtering on its chips which will cause its apparent native colour balance to alter.

 

Avoiding blue noise by filtering blue optically and opening up is an obvious step I'd overlooked, and well worth investigating.

 

P

