How much impact do film stocks have...



Guest mknorth

I am researching the impact film stocks have on the look of cinema. How do cinematographers choose stocks? Do post-production techniques (e.g. bleach bypass) have more of an effect on the look of a film than the stock used? Have new techniques such as digital grading changed the way film stocks are made/chosen/used?

 

cheers.


  • Premium Member

Digital intermediate allows creating so many looks that there is less need in some ways for so many stocks -- the differences can be muted between them or they can be exaggerated artificially.

 

However, as long as digital intermediates, skip-bleach processing, etc. are the exception for theatrical features, rather than the rule, then choosing the right film stock does have an effect on the look of the movie, mainly in terms of the old speed-grain ratio and in terms of contrast and saturation. I can't say it is a dramatic effect, though, unless you work at emphasizing the differences (e.g. intercutting low-con high-speed film with slow-speed contrasty film). But that doesn't mean that concerning oneself with the subtleties is a waste of time -- just like two cooks might not agree on the same brand of extra-virgin olive oil. It's why we are artists: we are hyper-sensitive to those differences.


  • Premium Member

Kodak offers a very wide variety of camera films:

 

http://www.kodak.com/US/en/motion/products...1.4.4&lc=en

 

Here are the Kodak color negative films, ranging in speed and color balance from EI50 Daylight to EI800 Tungsten, and offering a variety of "looks". Choice is often based on speed/grain/sharpness, "look" (tone scale and color saturation), and price. The new Kodak VISION2 stocks offer the most advanced emulsion technology, featuring significant improvements in tone scale and image structure:

 

http://www.kodak.com/US/en/motion/products...4.4.4&lc=en

 

The color reversal films offer a very different "look", especially when cross-processed:

 

http://www.kodak.com/US/en/motion/products...4.4.6&lc=en

 

And Kodak continues to offer (and improve) both B&W negative and reversal films:

 

http://www.kodak.com/US/en/motion/products...4.4.8&lc=en

 

As noted elsewhere, the use of "Digital Intermediate" offers remarkable flexibility in adjusting "look", but it can also "filter" the ultimate quality available from a particular stock. For example, the new Kodak VISION2 100T Color Negative Film 5212/7212 is the sharpest color negative film available today, yet going through DI masks some of this improvement.


No matter the resolution of the digital media, be it 2K or whatever, I am still inclined to say that there is a "filtering" effect on the original negative.

 

For some irritating reason, the quality of film has in discussion been reduced to the level of mere resolution and color depth.

 

Have we forgotten the fact that film is not a tool to represent reality to the utmost degree, but rather (at least for me) a beautifying and poetic enhancer of the world we look to depict?

So far I have no visual proof of video being able to do this. And heightened resolution for video in the future can only make it show the world in an even more accurate way; hence we would be looking at home-video-feeling material for the rest of our working careers.

Film is like the paint with which you form your piece of art, brush stroke upon brush stroke, ever changing. Video will always remain a cold digital representation of reality, with facial pores and a lack of the organic feel of grain.

Somewhat like listening to the difference between LP and CD on any given system.

Audiophile that I am, I have yet to hear a digital sound recording that matches an analogue one.

 

To quote a music critic I saw on television last week: "The world seems to veer toward losing touch with true quality; we have begun to stop at 'good enough' too often in our lives."

 

Just had to get it out of my system. Agitated but happy I am. And I have full respect for those who like the digital branch.

:D :D


Very good, fredrik! I totally agree. More and more I hear the phrase "the average moviegoer can't tell the difference" when, as that statement itself implies, there IS a difference and you're missing out.

 

An ice cream maker in my city used to advertise their ice cream as being better because "it's a subtle little difference that makes all the difference".


What are you two talking about??

Nobody ever mentioned video in this thread.

 

A 4K or even better 6K DI would capture all the grain and detail that you want, and film would not lose its poetic look.

 

I think this discussion has turned, without any logic, into another film vs. video discussion, and nobody even mentioned video.

We are talking about DI.


  • Premium Member

That's too simplistic a statement to be accurate.

 

For example, a D.I. can manipulate the information on the negative so that it becomes visible in the print made off of the negative that is recorded out from the digital master, while a straight print from a negative will throw away a certain amount of information because of contrast. So why is it that when the contrast of the print stock throws away a certain amount of shadow or highlight information, that's OK, but using all the information contained on the original negative with a D.I. process to allow more extensive color and contrast manipulations is somehow getting less than "all the looks of film"??? If anything, you can get MORE looks from that same negative than the traditional printing process allows.
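
(To make that contrast point concrete, here's a toy sketch in Python -- my own illustration, using a made-up logistic S-curve as a stand-in for print-stock contrast, not any real emulsion's characteristic curve:)

```python
import numpy as np

# Toy model: negative "tones" run 0..1; a contrasty print stock applies an
# S-curve that crushes shadows and blows out highlights, while a digital
# grade can remap the same range before it ever hits the print stock.

neg = np.linspace(0.0, 1.0, 11)  # scene tones captured on the negative

def print_scurve(x, contrast=8.0):
    """Generic logistic S-curve standing in for print-stock contrast."""
    return 1.0 / (1.0 + np.exp(-contrast * (x - 0.5)))

straight_print = print_scurve(neg)      # shadow/highlight separation compressed
graded = print_scurve(0.2 + 0.6 * neg)  # D.I.-style remap keeps the tones on
                                        # the straight-line part of the curve

for n, s, g in zip(neg, straight_print, graded):
    print(f"neg {n:.1f} -> straight print {s:.2f}, graded-then-printed {g:.2f}")
```

The remapped version retains shadow and highlight separation that the straight print throws away -- the same negative information, just steered onto the usable part of the print curve.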

 

Remember with film there is no single look, not with the variations of negative, print, and development processes. D.I. is just another tool in creating more looks from that original negative, not fewer looks.

 

Certainly if you do the D.I. work at a high enough resolution, you would be hard pressed to tell the difference between a print made off of the original negative and one made off of the digital negative. Most of the problems with D.I. that we have been discussing have been due to working at too low a resolution with too much compression.

 

It sounds like some people are ready to dismiss D.I. out of hand for simply not being a film process, as if no matter how good it gets, it simply isn't acceptable because it's "not film." To me, it's just another image manipulation technology.

 

Is all film transferred to video for TV broadcast "simply video"? That implies that you might as well just shoot video to begin with for TV.


The problem with video is not its data recording system so much, but its way of getting the image (CCD sensors).

 

And besides, if you used a high-quality DI like the quality used for restoration projects (like Cinesite does, or that Japanese facility (can't remember the name) that scans black-and-white films at 6K), scanning at 4K (or 6K for finer-grained films), storing everything in separate uncompressed image files with 10-bit or 12-bit log color data, and using state-of-the-art pin-registered scanners like Northlight or Kodak's Cineon Lightning, you would get great picture quality, and I dare say the negative printed out (of course with a state-of-the-art laser printer) would be almost identical to the original one. The grain structure would change, because every frame of film is unique, but to a human eye the level of detail and color depth would probably look just as good as if the new negative had been exposed in camera. Maybe objectively there would be slight differences in color depth, but I doubt it would be visible to humans in any of the film viewing systems (electronic or film print).
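
(As an aside, for anyone wondering what "10-bit log" means in practice: a minimal sketch of the classic Cineon-style lin-to-log encoding, assuming the commonly quoted constants -- roughly 0.002 printing density per code value, reference white near code 685, reference black near code 95. Real scanners document their own transfer curves, so treat the numbers as illustrative:)

```python
import numpy as np

DENSITY_PER_CV = 0.002  # assumed: printing-density step per 10-bit code value
WHITE_CV = 685          # assumed: reference white code value
GAMMA = 0.6             # negative gamma folded into the classic 300x slope

def lin_to_cineon(lin):
    """Map linear exposure (1.0 = reference white) to 10-bit log code values."""
    lin = np.clip(lin, 1e-6, None)                            # avoid log of zero
    cv = WHITE_CV + (GAMMA / DENSITY_PER_CV) * np.log10(lin)  # slope = 300
    return np.clip(np.round(cv), 0, 1023).astype(int)

# Reference white, 18% grey, and a deep shadow land near codes 685, 462, and 97:
print(lin_to_cineon(np.array([1.0, 0.18, 0.011])))
```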

 

This is of course top of the line, and very expensive. And not possible for most people out there.

 

And scanning and printing technology is improving. One day people will be able to make duplicates of negatives that are objectively indistinguishable from a negative exposed in a camera. This can be tested by multiple scanning and printing: if there is no quality loss, a 10th-generation rescanned and reprinted negative would have to look just as good as the original.
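
(That multi-generation test is easy to mock up in software. A toy simulation, under the crude assumption that each scan/print cycle amounts to 10-bit quantization plus a little noise -- the real chain has far more variables:)

```python
import numpy as np

rng = np.random.default_rng(0)

def generation(img, bits=10, noise=0.001):
    """One hypothetical scan/print cycle: add a little noise, then quantize."""
    levels = 2 ** bits - 1
    noisy = img + rng.normal(0.0, noise, img.shape)
    return np.round(np.clip(noisy, 0.0, 1.0) * levels) / levels

original = rng.random((512, 512))  # stand-in for one frame's tonal values
copy = original
for gen in range(1, 11):
    copy = generation(copy)
    rmse = np.sqrt(np.mean((copy - original) ** 2))
    print(f"generation {gen:2d}: RMSE vs original = {rmse:.5f}")

# If the per-cycle loss is small, the 10th generation stays visually identical;
# if not, the error visibly accumulates with every pass.
```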

 

And besides, a large database of film frame images is not video; it is what it is, a large database of image files.

 

 

Of course, if you scan everything at 2K and convert it to video, and add compression, there would be a significant quality loss.

 

And as for film warmth and film look, it is not lost at all when converted to video; an SD video transfer of a film still looks gorgeous. It lacks detail, but the image is still film-like.

 

Of course film looks most organic when shown on a chemical medium like a film print or photo paper, but it does not lose the film look or its organic nature on any of the mediums.


No, first of all, DI is a concept, not a standard process; there is more than one way to do DI.

 

And secondly, even if it were just video, video can store much more image quality (in terms of color depth) than it can capture with cameras.

 

The main trouble with an HD camera is not that its data recording standard is low; it is that its recording potential is not fulfilled by its own image sensors. Even if it were reached by its CCDs, it would still be limited compared to what film can capture, but it would be far better than what happens in reality.

 

The proof of that is everywhere. Film still looks better on video than video does on video.


  • Premium Member

Yes and no, Rob. It all depends on the specifics of the equipment and processes used for the D.I. Some are more "lossy" than others, some are more invisible.

 

You have to remember that D.I. is specifically designed for the theatrical projection world, which is where you need a film print. So it's not enough to consider what the original film looks like, but how you want the PRINT to look.

 

An example is if you liked the look of an E6 film like Ektachrome 100D or Fuji Velvia. Well, there is no film-only path to getting that image to a print without altering the look significantly. Duping it to an I.N. increases contrast and flashing the I.N. to reduce that makes the color saturation lower. So a D.I. is the only practical way to create a negative for printing that retains the look of the original reversal image.

 

Also, you have to separate digital image capture from digital duplication. A D.I. does not have to capture all the subtleties of the real world -- it only has to duplicate that image captured on the film negative into a digital form. If you look at a negative under a microscope, it's a bunch of cyan, yellow, and magenta grains with a brick orange mask over everything. It's not impossible for a digital scanner to copy that image more or less unaltered. It's not impossible either to copy it back onto film more or less unaltered -- although one of the points of doing a D.I. is to alter the image in ways beyond that possible with simple RGB printer light changes.
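
(To make the "simple RGB printer light changes" comparison concrete: a printer-light trim is essentially one global exposure offset per channel in log space, while a digital grade can apply an arbitrary curve. A rough sketch, assuming the commonly cited figure of about 0.025 log exposure per printer point:)

```python
import numpy as np

LOG_E_PER_POINT = 0.025  # assumed: ~12 printer points per stop of exposure

def printer_lights(log_neg, points_rgb):
    """Global per-channel offset in log exposure -- all a contact printer does."""
    return log_neg + np.array(points_rgb) * LOG_E_PER_POINT

def digital_grade(log_neg, curve):
    """Arbitrary per-tone remap -- the extra freedom a D.I. adds."""
    return curve(log_neg)

frame = np.random.default_rng(1).random((4, 4, 3))  # stand-in log-exposure data
warmer = printer_lights(frame, (2, 0, -1))          # slight warm trim
crushed = digital_grade(frame, lambda x: x ** 1.3)  # tone-dependent change --
                                                    # impossible with lights alone
print(frame.mean(axis=(0, 1)), warmer.mean(axis=(0, 1)), crushed.mean(axis=(0, 1)))
```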

 

So this notion that a film-only path to the release print stage is somehow magical and unmatchable with digital techniques is rather romantic and unscientific. All steps after the original negative, whether optical or contact printed, whether through an IP and IN, and whether digital, involve changing the information on the original negative, some changes deliberate and some part of the process chosen. The only thing that really matters is whether you understood those changes and took them into account when shooting, so that you can end up with a print that looks the way you want it to. If it does, then whether one of those intermediate steps was digital is irrelevant.


Sorry for making it into a discussion of the D.I.'s to-be-or-not-to-be. The intention was really aimed at the video capture process mostly; a non-film-based original, that is.

As a kid I always wondered about the fact that film on television can look good, as opposed to video on television. Something I understand now.

 

Sorry to say, though, that at the present moment I find there is a lack of something in there; something the future will hopefully change.

But the lack is of an origin unspecifiable by, at least, me.

 

I am a believer in the idea that the difference is that a digital step is always just a representation of true light and colour, whereas film generates pictures in an "organic way": photons of the original light exciting electrons and making a "natural" imprint with all that is therein...

Photograph a painting and you will never achieve its natural beauty, strokes, etc.

 

I love this discussion, so go on, my friends :)


Wasnt "Kill Bill" put through a DI? It looked "crisp" and "sharp". Robert Richardson knows what he is doing. Him and Tarrantino contemplated this together. Read the article at Kodak.


I'm beginning to get a better understanding. Thank you, David. I need to do some reading up to get an actual feel for what is happening technically (I used to be an engineer, but at least it was for SGI and Pixar).


It's very important to understand that Digital Intermediate is NOT video.

 

Video is a stream of recorded data designed just to fill a TV screen. When you transfer film to video, you have no choice but to strip away everything that doesn't fit: resolution, contrast and colour depth are all reduced - and with them, the essential nature of the various negative stocks. That said, of course, film does a great job of squeezing the massive amount of data in the real world down in the first place, so (as David says), film transferred to video is a whole heap better than scenes captured with a video camera to start with.

 

A DI is a series of files designed to represent ALL the detail that is on the negative (resolution, contrast, colour depth etc) frame by frame, with the intention that ultimately the data will be recorded back out onto film, ideally leading to a print that is indistinguishable from the original.

 

A top quality DI isn't far from that ideal. However, many compromises are often made (2K, reduced colour depth, compression :angry: ). These compromises limit the amount of work that can be done on the data before the image is not just altered, but degraded.
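
(To put rough numbers on those compromises -- my own back-of-the-envelope arithmetic, assuming uncompressed full-aperture 2K frames and ignoring container overhead and chroma subsampling:)

```python
def frame_bytes(width, height, bits_per_channel, channels=3):
    """Raw bytes for one uncompressed frame."""
    return width * height * channels * bits_per_channel / 8

dpx_2k = frame_bytes(2048, 1556, 10)   # full-aperture 2K scan, 10-bit log
sd_ntsc = frame_bytes(720, 486, 8)     # 8-bit SD frame, before subsampling
                                       # and compression shrink it further

print(f"2K 10-bit frame : {dpx_2k / 1e6:5.1f} MB")   # ~12.0 MB
print(f"SD 8-bit frame  : {sd_ntsc / 1e6:5.1f} MB")  # ~1.0 MB
print(f"ratio           : {dpx_2k / sd_ntsc:5.1f}x")
```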


  • Premium Member
Wasnt "Kill Bill" put through a DI? It looked "crisp" and "sharp". Robert Richardson knows what he is doing. Him and Tarrantino contemplated this together. Read the article at Kodak.

 

No it didn't.

 

It looked soft and smeary. Awful skintones. I was very disappointed to see such inferior work from Black Bob.

 

There is this new trend toward DIs that I find very disturbing. Whenever you read AC or ICG magazine, DoPs always rave about being able to manipulate contrast and color, etc... Like Philippe Rousselot on 'Big Fish' recently, making DI sound like the best thing since sliced bread. I mean, have you even looked at a projected print of your film???

 

I have yet to see a DI that gives as nice skin tones as an optical print. At best they look halfway decent, at worst they look like crap. See 'Kill Bill', 'Big Fish', 'X-Men 2', 'Confessions of a Dangerous Mind', etc...

 

Just because it looks good on your monitor doesn't mean it looks good on a print.


  • Premium Member

Hi,

 

> It's very important to understand that Digital Intermediate is NOT video.

 

Personally I'd say that it is, but there's certainly some debate to be had about it. This opinion is based on long experience with video on computers and the need for some way to describe moving images which can be electronically stored on a variety of formats. I find that a lot of film purists resist the idea because they hate to think of film being helped out by video, but I view it more as a rather wonderful synergy of technologies. Sorry, if it's an electronically stored moving image, it's video. The Dalsa Origin is a video camera, no matter how much they might dislike the idea.

 

> A DI is a series of files designed to represent ALL the detail that is on the

> negative

 

No, that's just wrong. Most DI is done at most at 2K resolution, which is not sufficient to record the entire negative information, not nearly - even I can tell, and my contact with the processes behind it is at best peripheral.

 

Phil


  • Premium Member

Obviously these cinematographers see the final print off of the negative created by the D.I. process. I can't imagine any one of them NOT seeing it.

 

Some of the fleshtone problems are artistic -- i.e. they've played with the image in ways outside what the emulsion was designed to deliver, which is why they are using D.I. in the first place -- and some are technical, like over-use of grain reduction, high compression, etc.

 

I certainly had no problems with Audrey Tautou's skintones in "Amelie"! Catherine Zeta-Jones's skintones in "Intolerable Cruelty" looked pretty nice too. Keira Knightley's skintones in "Pirates of the Caribbean" also looked good.

 

And I think that D.I. has improved the look of Super-35 blow-ups to anamorphic, in particular with the graininess problem. I noticed that even on one of the earliest ones, "O Brother Where Art Thou?".

 

"Kill Bill" has problems partly because Tartantino rejected Richardson's color-correction and film-out and re-did it all himself without Richardson's input.

 

"Seabiscuit" has the artifacts I mentioned due to overuse of noise reduction combined with edge sharpening (seems like a contradiction) and some odd compression artifacts.


David! I admire and respect your work and your opinions fully!

 

However... ( :) ) I must second Audiris on this point.

I wouldn't even call it the skintones being bad, but more so a look of two-dimensionality in any surface represented, and especially in skintones.

Yes, one must accept that any given look may or may not be a choice, but why then are all the choices made the same way? "Mr. Grader... give me bad skin tones and two-dimensionality, please!!"

Pushing it too far and adding too much grain reduction is another thing altogether. That is just a discussion of digital artifacts that anyone uninvolved can see.


  • Premium Member

You have to remember that if you shoot in Super-35 and do an optical printer blow-up using an IP and IN, you also generate artifacts, just different artifacts than doing it with a D.I.

 

Anyway, really well-done D.I. work is fairly invisible when projected side-by-side with material directly printed -- like I said, you have two factors at work when it's not invisible: it was done at lower quality (too low a resolution, too much compression, too much noise reduction, too much edge sharpening) or the image was manipulated in some way not possible normally with film printing. And like I said, it's not like the normal IP/IN process is invisible either, especially if it is combined with optical printing and image resizing and reformatting.

 

But if you're convinced that skintones will NEVER look good if a D.I. is involved, then I'd say you'd be happier if you didn't use that process.

 

Like I said, I was happy with the skintones in "Amelie". I've even seen films where I wasn't sure a D.I. was used at all. I mean, you look at some films with only partial D.I. work (like "Last Samurai" or "Gone in Sixty Seconds") and I don't think the D.I. shots stick out as being different. If it were impossible for a digitally manipulated shot to match surrounding non-D.I. footage, then digital effects would never cut into a sequence. I mean, look at one of the only digitally cleaned-up shots in the 65mm restoration of "My Fair Lady" -- a shot of Audrey Hepburn that had a big white speck on her neck that was digitally erased. It's not like when you see that movie in a 70mm print on the big screen, you say when that shot comes up "what in the hell happened to Audrey Hepburn's skintones??? They went all two-dimensional..." The skintone problem you are describing is almost entirely due to working in 2K resolution and applying processing to reduce grain and add sharpness.


I was looking at some articles at Millimeter magazine. I sort of got the impression a lot of guys use this as a "we'll fix it in post" concept rather than trying to get it right the first time. One guy said they were undecided whether the light should be morning, midday, or late. To me that sounds almost like writing the script on the set.

 

All of the articles were about guys using 2K scans.


  • Premium Member
Hi,

 

> It's very important to understand that Digital Intermediate is NOT video.

 

Personally I'd say that it is, but there's certainly some debate to be had about it. This opinion is based on long experience with video on computers and the need for some way to describe moving images which can be electronically stored on a variety of formats.

No offense, but maybe your long experience with video doesn't go back far enough to include 3/4" drag-along decks and tape-to-tape editing :P

 

To me video is NTSC, PAL, SECAM or HD, with or without the sub-carrier. It's a signal that is encoded a certain way and can only be transmitted and displayed in that arrangement. You can capture, store, edit, and massage that information digitally as some other file type; but only the captured and displayed image on a monitor is video. A QuickTime file (for example) isn't video -- it's a QuickTime file.

 

My point here is that video is a SIGNAL. Moving electronic images are not automatically video until they're formatted as one of those mentioned signals. An image can begin life as video, get captured onto your computer as a type of moving image file that's NOT video (it's no longer NTSC/PAL/whatever) and be spit back out onto DVD, whereby the player CONVERTS it back to video that can be seen by your TV. So much of your experience with video on computers, well, ISN'T. :D It used to be and will again be video, but on the computer it's not.

 

So when Dominic talks about 2K DI not being video, he's right. Those images never get encoded or dealt with as a video signal, but instead as another digital file type that has none of the characteristics of a video signal.

 

I think you're partially right about film purists not wanting their film to be helped out by video, because video signals in the strictest sense are pretty limiting and frankly, pretty crappy. Digital technology however (even the tiniest 640x480 QT files), is very flexible, scalable (which video isn't) and powerful. Moving digits around is pretty easy, compared to moving around a true video signal. That's one reason online video hardware is so f*&^*g expensive while a desktop video system isn't.

