
  • Premium Member
Posted

OK, can someone explain why analogue film even needs to be 'color balanced'? Why can't it see light the way my eyes do? For example, when I walk from daylight outdoors into a room with tungsten lights, I don't switch my eyes to 'tungsten balanced'; they just see what's in front of them. If analogue film reacts to red, green and blue wavelengths, why do I need a daylight- or tungsten-'balanced' stock? Why doesn't it just react to whatever mixture of red, green and blue is in front of it and render it as the eye does? Thanks, all.

Posted

Your brain automatically colour grades. If there's a change to your eyes, for example if you have the lenses replaced after a cataract operation, colours will appear very bright. That's because the brain had been correcting for the faded colours caused by the cataract. After a few days, though, the brain settles its correction and the colours no longer look so saturated.

The film stocks are designed to have correct colours at a particular colour temperature by using layers: https://en.wikipedia.org/wiki/Color_motion_picture_film

Video cameras have similar difficulties: even automatic white balancing is limited when a scene mixes colour temperatures. Parts may be bluish while others look warm.

 

Posted

If you look closely at mixed lighting with your eyes, you will notice the colour differences in the world. Painters use this in their work, when using colour.

  • Premium Member
Posted

Film doesn’t need to be colour balanced. If exposed without care, developed so-and-so, printed just like that, and projected somehow, you’ll still have images and action. The sound may be distorted or out of sync, the frame rate erroneous, the aspect ratio wrong.

The scope is vast, from haphazard filmmaking to concerted production. Where do you think you stand with your question?

  • Premium Member
Posted

Your brain is continuously color-balancing and exposure-adjusting the image. If you wear yellow sunglasses for a while, things look less yellow, and for a moment after you take them off, everything looks bluish until your brain adjusts.

You can make a film or digital camera with auto-exposure, but that adjustment usually happens before the light hits the sensor or film, and you can't really do that with color. With a digital camera, however, color can be balanced after the light hits the sensor, when the raw signal is processed to RGB for viewing and/or non-raw recording.
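That post-sensor balancing amounts to one multiplier per channel. A minimal NumPy sketch, with made-up channel values standing in for a tungsten-lit scene (not any particular camera's processing):

```python
import numpy as np

# Hypothetical linear-light image (H x W x 3), values 0..1, shot
# under warm tungsten light: red reads high, blue reads low.
img = np.stack([
    np.full((2, 2), 0.80),   # R channel, boosted by the warm light
    np.full((2, 2), 0.50),   # G channel
    np.full((2, 2), 0.25),   # B channel, deficient under tungsten
], axis=-1)

# White balance = one gain per channel, chosen so that a patch we
# declare "neutral" comes out gray. Gains are commonly normalized
# so green stays at 1.0.
neutral = img[0, 0]              # a patch that is white in the scene
gains = neutral[1] / neutral     # [G/R, 1.0, G/B]
balanced = img * gains

print(balanced[0, 0])            # the patch is now neutral gray
```

Because the gains are applied after capture, the same raw data could be balanced to any other reference later, which is the flexibility raw workflows rely on.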

  • Premium Member
Posted

If I may piggyback on this thread: picture a person standing under a yellow street light, being filmed with a digital camera. If the shot is for a newscast, I imagine you would want their white shirt to actually look white, so you white-balance your camera. But if the shot is for a movie and you're going for a painterly look, why wouldn't you let the camera capture what now looks like a yellow shirt? The viewer might still identify it as a white shirt under yellow light, but we probably wouldn't want the camera to correct it back to white. Am I getting this wrong?

 

Immediate follow-up question: when I white-balance a video camera, I am in fact applying gain to individual color channels, which will introduce noise. If I shoot in raw (actually X-OCN) and, instead of balancing at the camera, apply color correction in post, am I not doing the same thing but with much more control and potentially less noise? As always, any pointers are much appreciated.

Posted

Fuji Reala did a fairly good job of handling fluorescent lighting. But not all light is equal, and not all spectra are equal. Every light-sensitive material is sensitive to something fixed; it's never dynamic. The software behind it can be dynamic, but the capture medium just records what it is sensitive to.

Remember that electronic sensors can't see colour on their own, so their sensitivity depends on several things, including the type of Bayer filter in front of them.
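To make that concrete, here is a toy sketch of an RGGB Bayer layout: each photosite records a single brightness number, and "colour" only exists because of the mosaic of filters over the sites. (The layout below is one common pattern; real colour filter arrays vary by sensor.)

```python
import numpy as np

# A sensor records one number per photosite. The colour filter array
# decides which primary each site is sensitive to. Build a 4x4 RGGB
# Bayer pattern as an illustration:
H, W = 4, 4
pattern = np.empty((H, W), dtype="<U1")
pattern[0::2, 0::2] = "R"   # red on even rows, even columns
pattern[0::2, 1::2] = "G"   # green fills the other even-row sites
pattern[1::2, 0::2] = "G"   # ...and half of each odd row
pattern[1::2, 1::2] = "B"   # blue on odd rows, odd columns

print(pattern)
# Note there are twice as many green sites as red or blue, roughly
# matching the eye's peak sensitivity. The missing two channels at
# each site are interpolated (demosaiced) from neighbours afterwards.
```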

  • Premium Member
Posted
10 hours ago, Hannes Famira said:

 

Immediate follow-up question: when I white-balance a video camera, I am in fact applying gain to individual color channels, which will introduce noise. If I shoot in raw (actually X-OCN) and, instead of balancing at the camera, apply color correction in post, am I not doing the same thing but with much more control and potentially less noise?

If by “video camera” you mean one with three sensors behind a prism block splitting the light three ways (like a 3-CCD camcorder), then you are already capturing separate RGB signals that can be re-balanced to adjust for color temperature (though this is usually recorded in a compressed video codec with color subsampling, which isn’t a simple RGB format). But if you are talking about a single sensor with an RGB color mosaic filter in front of it, then the raw data has to be demosaiced to RGB for viewing, and for recording if you don’t want a raw-only recording.

Yes, working from raw in post gives you more control to make changes because you haven’t baked in the color balance. But if you’re just talking about white balancing, I don’t think it’s necessarily less noisy to do it in post versus in-camera; at least working from raw means you won’t be dealing with a compressed, color-subsampled recording. There are also high-quality RGB codecs that are nearly as good as raw.
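One way to see why the gain itself isn't the noise culprit: in a simple linear model (ignoring quantization and clipping in a real signal chain), multiplying a channel by a white-balance gain scales signal and noise together, so the signal-to-noise ratio is unchanged whether the gain is applied in-camera or in post. A toy NumPy sketch with made-up numbers:

```python
import numpy as np

# Toy model of a blue channel under tungsten light: a small signal
# plus zero-mean read noise.
rng = np.random.default_rng(42)
signal = 0.25
noise = rng.normal(0.0, 0.01, size=100_000)
blue_raw = signal + noise

gain = 2.0                        # white-balance gain for the blue channel
blue_balanced = blue_raw * gain   # applied after capture (in-camera or post)

# SNR before and after the gain: both signal and noise doubled,
# so their ratio is identical.
snr_raw = signal / blue_raw.std()
snr_post = (signal * gain) / blue_balanced.std()
print(snr_raw, snr_post)
```

The practical differences come from what happens around the gain (sensor clipping, bit depth, codec compression), not from the multiplication itself, which is consistent with the point above that post isn't automatically less noisy.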

Posted

Stare at a green image closely enough that all four quadrants of your visual field (including peripheral vision) appear green, and stay there for about a minute. Afterwards, everything white will be tinted purple. This is the brain's way of attempting to establish green as neutral.

I'll spare you too much ocular anatomy, but the combination of the human eye and the brain is much more analogous to a lens and camera than you'd think. Light enters your eye inverted both vertically and horizontally, the same way it enters a camera's lens, but the brain corrects it.

In fact there is a rare condition called Reversal of Vision Metamorphopsia where individuals' brains fail to fully correct their vision, resulting in a 180-degree 'upside down' vision.

Anyway, I said I'd spare you, so I'll stop there. So yes, you are correct that your eyes don't white balance; your brain does.

 

Posted (edited)

Perhaps other DPs have experienced the phenomenon of opening a CTO-gelled window onto a 3200K practical set. The exterior daylight looks blue for a few moments until your eyes and brain adjust to the new white balance.

Edited by Doyle Smith
