
Less saturated colours in older cinema and TV images


James Malamatinas


The last few nights I've been watching a number of older films and, in particular, TV shows, a number of which make me feel old because they're never as vivid as I remembered them being when I watched them years ago. There's a ton of stuff that affects how well or badly an image endures over time, but one thing I often find stands out is that the colours are washed out or less saturated, which got me thinking - what is the technical reason for the image on these shows looking like this compared to the images we see today?

It's obviously not one single breakthrough - but is it more to do with advances in the actual TVs and displays we watch the images on, or is it improvements in digital cameras and film stocks that continually make newer images more vibrant?

Obviously a lot of older material gets re-mastered for new formats such as DVD and Blu-ray, and these versions do look much more like today's images in terms of colour - but is this due solely to digital improvement, or is it the fact that all of the colour information was in the original medium, and displays at the time simply couldn't show it all?

Hope this is the right place for this!

EDIT: Also, what is the best way to future-proof that colour information when shooting now?


  • Premium Member

Some of it may be fashion. Modern movies - the Transformers of the cinematic world - tend toward highly saturated colour, and people who are paying for new transfers may give instructions to imitate that.

 

Very old film may have faded, but modern equipment gives us a better-than-ever chance of recovering things unless they're really far gone.

 

In the strictest sense of the word, film has always been capable of a wider colour gamut than common video displays, so yes, displays could never (and usually still can't) display all of the colour. This doesn't explain what you're seeing on TV, though, as those standards haven't really changed much.
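Just to put a rough number on that, here's a quick numpy sketch of my own (not anything official) showing what happens when a colour that's perfectly legal in a wide RGB gamut gets mapped into Rec.709. I'm using ACEScg as a stand-in for "a gamut wider than the display", and the matrix values are rounded approximations of the published transform:

```python
import numpy as np

# Approximate ACEScg (AP1, linear) -> linear Rec.709 matrix, rounded values.
# Treat this as an illustration only, not a reference-grade transform.
AP1_TO_REC709 = np.array([
    [ 1.7051, -0.6218, -0.0833],
    [-0.1303,  1.1408, -0.0105],
    [-0.0240, -0.1290,  1.1530],
])

# A strongly saturated red that sits comfortably inside the wide gamut.
wide_gamut_rgb = np.array([0.9, 0.05, 0.02])

rec709_rgb = AP1_TO_REC709 @ wide_gamut_rgb
print("Rec.709 value before clipping:", rec709_rgb)  # falls outside the 0-1 range

# A display can only reproduce [0, 1], so out-of-gamut components get clipped -
# one of the ways saturation is lost between the original medium and the screen.
clipped = np.clip(rec709_rgb, 0.0, 1.0)
print("What the display actually shows:", clipped)
```

Run that and the converted value comes out with channels above 1 and below 0, so the display clips them and the colour you actually see is less saturated than what was captured.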

 

P


Some older films used desaturated colours through the use of filters, and even by using a B&W element in combination with the colour neg.

 

Older film material shot for TV can be dramatically improved by retransferring it with a modern telecine. Having had to carry out a few of these transfers to cut into modern productions, I've had an editor friend say it's amazing how good that older material is.


  • Premium Member

A lot of it is probably just style, but perhaps they kept the chroma down a bit out of worry about smearing in analog NTSC tape-to-tape copies. Also, a lot of the '70s shows were transferred at the time using a film chain device from prints at the TV station, sometimes low-con prints, and that may have impacted saturation too.
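For what it's worth, "keeping the chroma down" is easy to picture numerically: in a Y'UV-style encoding you can pull the colour-difference signals towards zero while leaving the luma alone. A little sketch of my own using the BT.601 luma weights from SD-era video (purely an illustration, nothing to do with how any particular transfer was done):

```python
import numpy as np

def scale_chroma(rgb, amount):
    """Desaturate an RGB pixel by scaling its colour-difference (chroma)
    components while leaving luma untouched. amount=1.0 is unchanged, 0.0 is B&W."""
    r, g, b = rgb
    # BT.601 luma, the weighting used for SD/NTSC-era video.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Colour-difference signals relative to luma.
    u, v = b - y, r - y
    u *= amount
    v *= amount
    # Rebuild RGB from the untouched luma and the reduced chroma.
    r2 = y + v
    b2 = y + u
    g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.array([r2, g2, b2])

saturated_green = np.array([0.2, 0.8, 0.2])
print(scale_chroma(saturated_green, 0.6))  # noticeably less vivid, same brightness
```

The brightness of the pixel stays the same; only the "colourfulness" comes down, which is roughly what a cautious chroma level would have looked like on screen.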


I overlooked the simple fact that taste in aesthetics changes, and that's a good point - just like with editing styles and other aspects of filmmaking.

 

That said, say we took a film like "The Conversation", or even something more recent like the first Indiana Jones, as examples - both show the kind of washed-out, low-contrast look I was trying to describe (at least in un-remastered prints). Would these films, if reshot today on the same film stock, still display the same tone and colour saturation on our TVs and at the theatre?

If the film stock is still capable of recording a wider colour gamut than we can display, and if broadcast standards haven't changed substantially, is it due more to the original film transfer that the colours appear as they do in comparison to modern cinematography?

I don't know if I'm being too specific; I'm just fascinated by the way some films hold up better than others 10, 20 or 50 years after they were made, and was trying to put a finger on some of the characteristics that affect this.



If it was done for aesthetic reasons, the cinematographers would use techniques like flashing to achieve a similar pastel look, although you might now use digital tools to achieve the same effect. Some digital commercials do have a low-saturation look, but it's rather different to the classic 1970s flashed look.
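If anyone wants to play with a crude digital stand-in, the gist is: add a small uniform "fog" exposure so the blacks lift and contrast drops, then ease off the saturation. This is just my own approximation of the look in a few lines, not a recipe for how flashing actually behaves photochemically:

```python
import numpy as np

def pseudo_flash(img, flash=0.08, saturation=0.8):
    """Rough digital approximation of a flashed, pastel look: a uniform fog
    exposure lifts the blacks and lowers contrast, then saturation is reduced.
    img is a float array in [0, 1] with shape (..., 3)."""
    # Uniform fogging exposure - the effect is strongest in the shadows.
    fogged = img + flash * (1.0 - img)
    # Blend towards the luma (Rec.709 weights here) to take some colour out.
    luma = fogged @ np.array([0.2126, 0.7152, 0.0722])
    pastel = saturation * fogged + (1.0 - saturation) * luma[..., np.newaxis]
    return np.clip(pastel, 0.0, 1.0)

# A dark, saturated pixel: the shadows come up and the colour softens.
print(pseudo_flash(np.array([[0.05, 0.02, 0.30]])))
```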


Thanks Brian, I wasn't completely familiar with flashing, although I found a previous discussion on this forum here - http://www.cinematography.com/index.php?showtopic=36685 - which explained it pretty well. It doesn't sound quite as fun to do it in the DI, although that's understandable, as it sounds like there were some risks in doing it on the negative...

Completely unrelated, and I know I should start a new thread, but just in case you know... the other thread on flashing mentioned that double exposure was one way of flashing the negative, but that the film had to be lined up to the exact perf when doing the second exposure. How do you actually go about "rethreading" the film to the exact spot without accidentally exposing any frames, and was it the 1st AC's job to do this when required?






