
Digital Intermediate


Brett B


  • Premium Member
It irks me to no end when people assume they speak for an entire industry, especially one that I anticipate being involved in (shooting film all the way to the end) for a long time without the need for digital manipulation or technology.

 

If your goal is to be a director, you could perhaps get away with that, but not if you hope to be a working cinematographer -- digital technology pervades all aspects of modern cinematography, and it will only become more pervasive over time. So if you're just starting a career as a cinematographer and want nothing to do with digital technology, in production or in post, I'm afraid you've picked the wrong career. However, if your goal is to direct your own films, you can do whatever you want.

 

The majority of film shot doesn't even end up being projected; it gets converted to digital for distribution. So at minimum, you're always dealing with telecine, digital color correction, and digital recording issues as a cinematographer.


But again, I can assure you that bit depth is not the problem here on its own.

8-bit depth is 'OK' (though not transparent) for delivery, but only if the chain before it is lossless, just as 16-bit CD audio is excellent if properly dithered down from >16-bit masters. The problem with 8-bit sample depth is long chains with 8-bit steps in between, plus color space conversions and other digital manipulations. It took digital audio decades to develop into a medium that is as good as or better than the best analogue master tapes. With digital film we are still in the early days. Today's aliased, DNR-smeared DIs are the equivalent of early shrill CDs, with too much noise and distortion, a lack of bass, and exaggerated highs.
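A minimal sketch (in Python with NumPy, which the posts themselves don't use; purely illustrative) of the dithering point above: truncating a smooth 16-bit ramp straight to 8 bits leaves visible plateaus, while adding about half an LSB of random noise before quantizing trades that banding for fine noise, much as dither does on CD masters or grain does on film.

```python
# Illustrative only: straight 8-bit truncation vs. crude random dither.
import numpy as np

rng = np.random.default_rng(0)
ramp16 = np.linspace(0.0, 65535.0, 4096)              # smooth 16-bit gradient

truncated = np.round(ramp16 / 257.0)                  # straight 8-bit reduction
dithered = np.round(np.clip(ramp16 + rng.uniform(-128, 128, ramp16.size),
                            0, 65535) / 257.0)        # ~0.5 LSB noise first

print(len(np.unique(truncated)))   # 256 flat plateaus: visible banding
# Locally averaged, the dithered ramp tracks the original far better:
print(np.abs(dithered[:64].mean() - ramp16[:64].mean() / 257.0))
```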


I'm not sure how you do it, but my photographs, be they stills or shot at 24 fps, are on film.  There are no "bits", just grains, and I am by no means alone in shooting exclusively on film.  There are people in still photography who shoot on 8x10-inch film and pay roughly $20 PER PICTURE.  I know that many fine-art photographers would rather have their pictures burned than reduced to 8 bits.  Magazine photographers regularly shoot on 6x6 or 6x7 cm roll film.  There is a hell of a lot more information in a still negative than in the best DSLR (unless you have a camera that needs to be tethered to a wireless network and takes ten seconds to take a "picture"), unless you were into shooting that 800- or 1600-speed crap before you joined the "digital revolution" that's sweeping the world and making Asia a lot richer.  Please don't assert that the "whole photo industry" is based on 8-bit graphics.  A lot is still based on Eastman Kodak photochemical innovation.  It irks me to no end when people assume they speak for an entire industry, especially one that I anticipate being involved in (shooting film all the way to the end) for a long time without the need for digital manipulation or technology.

 

I'm not talking about digital photography, I'm talking about film photography. All of the printing is done at 8-bit. The source is usually up to 16-bit, but it is reduced to 8-bit for printing.

You mention magazine photographers; well, they have to scan at some point and print at 8-bit. It always comes down to 8-bit one way or another, from magazines to expensive professional prints.

 

Miha, I never said 8-bit is good for manipulations. I was just saying that a pure 8-bit printing source is good enough for photography. But the best 8-bit files come from 14-bit or 16-bit sources if you require image manipulation.

 

I got the impression that a lot of people think that 10-bit as a standard, on its own (without manipulations), is not good enough for film. So this is what I responded to.

 

I'm talking about a fully "filled" 10-bit space, not a distorted 10-bit space.


  • Premium Member

Hi,

 

10-bit log is generally considered to be OK, and roughly equivalent to 16-bit linear -- although I'm not sure how much of this is to do with people being obsessive about highlight clipping and comparisons with video.
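A rough sketch of why 10-bit log can stand in for 16-bit linear (the log curve below is a generic illustration, not any exact Cineon spec): log spacing spends its 1024 code values on ratios of exposure, so relative quantization error stays roughly constant across the range instead of blowing up in the shadows the way it does for linear integer coding.

```python
# Generic 4-decade log encode/decode vs. 16-bit linear, illustrative only.
import numpy as np

def to_log10bit(lin, lo=0.001, hi=10.0):
    """Map linear exposure onto 0..1023 logarithmically (not a real spec)."""
    t = (np.log10(lin) - np.log10(lo)) / (np.log10(hi) - np.log10(lo))
    return np.round(np.clip(t, 0, 1) * 1023)

def from_log10bit(code, lo=0.001, hi=10.0):
    t = code / 1023.0
    return 10 ** (np.log10(lo) + t * (np.log10(hi) - np.log10(lo)))

exposure = np.geomspace(0.001, 10.0, 100000)   # four decades of scene light
rel_err_log = np.abs(from_log10bit(to_log10bit(exposure)) - exposure) / exposure
rel_err_lin16 = np.abs(np.round(exposure / 10.0 * 65535) * 10.0 / 65535
                       - exposure) / exposure

print(rel_err_log.max())          # roughly constant everywhere (~0.5% here)
print(rel_err_lin16[:100].max())  # 16-bit linear: large relative error in deep shadows
```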

 

Sourced from or printed to film, the grain covers almost any visible banding in the same way a diffusion dither does. It'd probably be much more of a problem with CGI or other super-clean sources digitally projected.

 

Phil


Guest Daniel J. Ashley-Smith

I find it kind of strange that not every movie is put through a DI. I mean, surely Universal Studios aren't doing it the old-fashioned way, cutting the neg and cementing bits together.

 

Is there something obvious I am missing here?

 

My understanding of a DI is that the film is scanned into a computer (or onto tape) and then edited digitally. How do films that don't use a DI cope? It's as good as impossible to make any colour adjustments, add titles/fades, etc. (I mean, I know it can be done, but it's not exactly a good method.)

 

I'm sure I'm missing something here, because I REALLY can't imagine people still doing things the old-fashioned way.

 

Thanks for any suggestions.


  • Premium Member

Post-producing a movie in the traditional, photochemical way still makes a lot of sense. First of all, it is much cheaper than doing a DI (a good DI can cost between $200,000 and $400,000). It also still gives you the best picture quality, because despite what a lot of people say, putting a movie through a DI has a lot of drawbacks: the blacks are weaker, the sharpness is reduced, and the color depth isn't as good as when timed photochemically.

 

Even if you time a film photochemically, you can still adjust the colors (via the three primary colors) and make the picture darker or brighter. For fades, the best way is A/B printing; otherwise one can do an optical.

 

For a film that creates its look during the shoot (via lighting, filtration, and development) and even in post-production (complete or partial bleach bypass, flashing, etc.), a DI is not really necessary. For over a century, great-looking films have been made without a DI.


Guest Daniel J. Ashley-Smith

Wow...

 

Thing is, they have to do a DI sooner or later for the DVD release. So if you're worried about losing blacks and decide to use a more traditional method, you're still going to lose those blacks one way or another.


Wow...

 

Thing is, they have to do a DI sooner or later for the DVD release. So if you're worried about losing blacks and decide to use a more traditional method, you're still going to lose those blacks one way or another.

 

 

the word "intermedate" suggests that it is the middle of the process and

that it is surrounded by film in the begining and film in the end.

So mastering for DVD is not DI.

 

And besides, it's not the same thing. The blacks don't have a problem on their own in the files; black is pure black on a computer. The problem is how to get that black back onto film. A digital image has real black, the theoretical pure black; it's the display or film output technology that compromises this. But a (0,0,0) pixel is, in theory, pure black.
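A tiny illustration of that point, with assumed, ballpark contrast figures rather than measured specs: the file's (0,0,0) is mathematically exact, but the black the audience sees is the device's peak white divided by whatever sequential contrast it manages.

```python
# Assumed, round-number contrast ratios -- illustrative, not measured specs.
peak = 48.0  # cd/m^2, the usual cinema reference white
for device, contrast in [("release print", 2000),
                         ("early digital projector", 1500),
                         ("CRT projector", 15000)]:
    print(f"{device}: displayed black ~ {peak / contrast:.3f} cd/m^2")
# The zero-valued pixel is identical in every case; only the rendering differs.
```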


Fillip, I"m afraid you and I have differing perceptions of the word "printing". I am talking more about printing through an enlarger of a negative (or slide in special cases) onto RA-4 or Ilfochrome paper, akin to printing ECN-2 onto ECP. You seem to be referring more to newsprint. I thought that you were saying that no one was using the enlarger-to-photosensitive paper technique anymore, which is what got me so angry. It irks me to no end when people ask me if I'm using a digital camera or have a good "inkjet printer", whereas I am in fact still using an optical enlarger. I concede that 8-bit offset printing is of about the same quality as silkscreen, although it is possible to get screening done the old-fashioned way though. I don't mind scanning at this stage, as either analog or digital presswork will reduce quality. However, since the printing of movie prints is quite similar to striking paper prints from still negatives, your analysis would be to me like saying that 8 bit offset prints in National Geographic are identical to a fine-art print onto a piece of Endura paper using optical printing, which doesn't hold water.

 

I find it kind of strange that not every movie is put through a DI. I mean, surely Universal Studios aren't doing it the old-fashioned way, cutting the neg and cementing bits together.

 

Is there something obvious I am missing here?

 

I'm sure I'm missing something here, because I REALLY can't imagine people still doing things the old-fashioned way.

 

Yes, it is done the same old-fashioned ANALOG way, using splicing and negative cutting. Why do it some other way when the movies are in fact still shot on 35mm film, the same format as was used in the beginning, with only a few minor changes and improvements in film base, light sensitivity, and perforation pitch over more than ONE HUNDRED years of motion pictures? It is a system that works, and the only time that negative cutting and splicing has ever been an issue was with the movie Chicken Run, where they were worried that the negative would be too fragile with all of the splices from the many short shots in the movie. In fact, in some films you can see the fine splices between each shot at the bottom of the frame during a scene change, although it depends on the theatrical projection format and the projector mattes used. I recall noticing this for the first time while seeing The Alamo. While it may seem "cool" to do everything on a computer, and editing is almost always done on computers now instead of doing everything with workprints, they still go back to the film in most cases to make a final cut and spliced master negative. This reminds me of what my projectionist friend told me about a lady who came to a movie late once and asked an usher in the theatre if they could "rewind the DVD".

 

~Karl Borowski


  • Premium Member

Actually, most theatrical movies are finished photochemically, not digitally -- the DI is more the exception than the rule.

 

But my point is that even if you finish a movie photochemically for the theatrical release, you still have to deal with digital technology for the video dailies and later for the home video mastering -- it's not as if you just transfer the print as-is with no color corrections. And since more people will probably see the home video version, spending some time doing those horrible "digital corrections and manipulations" you are so loath to do is unavoidable and in fact absolutely necessary if people are to see it on home video looking the way you want it to. So there is no way to keep your hands clear of digital technology unless you plan on only cutting workprint, only projecting prints, and never allowing it to be transferred to video.


No, Dave, that's not what I am trying to say. Television is and always will be a wonderful medium, and right now I love present-day television probably more than I love current movies, especially given some of the stupid flicks in theatres in 2004. I have a problem with people who think it's old hat to do anything photochemically and who take the same attitude as the stupid portrait studios: "if I HAVE to use that film junk, it's just a capture format and I can fix all of my mistakes later in Photoshop... can't wait until the new line of digital cameras and new G10s come out." This is seriously the attitude I have encountered talking to burned-out old photographers who don't have the patience to do anything analog anymore. Frankly, it disgusts me. It's like taking the art out of taking pictures and making movies.

I have no problem with telecines, digital editing, and digital effects if they're tasteful. I think digital has its own aesthetics, its own beauty, but I don't like using it as a cheat for blowing film up bigger than it should be, or to hide the grain that is going to be there no matter how advanced Vision 500 gets. I have a problem with ignoring the technical and artistic merits of film and instead going all out on digital. Now, with television and computers, I understand that transfers need to be tweaked in order to optimize for the narrower dynamic range; this is an unavoidable aspect of showing things on television. However, I am not going to limit my work by making it with television or a computer in mind. Digital is great for archiving and distributing, but if I can do an unlimited number of things with analog gear and effects, I simply don't see the need for anything else.

I think that the digital intermediate is synonymous with the mid-life crises of '60s-era cinematographers who have run out of creativity, frankly. Digital has been well established in music for nearly twenty years now, and I certainly don't think music is any better than it was in the '70s or earlier. In fact, it's worse. I'm so scared that moviemaking is going to follow the same path and become commercial to the detriment of Ars Gratia Artis...

 

Regards.

~Karl Borowski


Post-producing a movie in the traditional, photochemical way still makes a lot of sense. First of all, it is much cheaper than doing a DI (a good DI can cost between $200,000 and $400,000). It also still gives you the best picture quality, because despite what a lot of people say, putting a movie through a DI has a lot of drawbacks: the blacks are weaker, the sharpness is reduced, and the color depth isn't as good as when timed photochemically.

The problem is that the processing chains are too long and the conversions not transparent. You combine the worst of film (image degradation from copying in the analogue domain) with the worst of digital -- improperly done digital, that is -- (aliasing, loss of shadow/highlight detail, artifact-ridden DNR...) and the end result on film looks like it: grainy, fuzzy, and with digital artifacts on top. Once the process matures, you will see original negative (ON) quality in the cinema on digital projectors. Till then we are seeing technological bastards that at best look as good as a print from the ON (I have yet to see one), but more often like a standard print with digital nasties on top.
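A hedged sketch of the aliasing complaint (SciPy assumed; a one-dimensional toy, not a real resampling pipeline): decimating fine detail by simply dropping samples folds frequencies above the new Nyquist limit back down as spurious coarse patterns, while low-pass filtering before the size reduction removes them properly.

```python
# Toy 1-D example: naive decimation aliases, prefiltered decimation does not.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.arange(4096)
fine_detail = np.sin(2 * np.pi * x / 2.5)   # detail finer than the target Nyquist

naive = fine_detail[::4]                     # "skip pixels": a spurious coarse pattern appears
filtered = gaussian_filter1d(fine_detail, sigma=2.0)[::4]  # prefiltered: detail correctly gone

print(np.abs(naive).max())     # ~1.0: full-strength alias
print(np.abs(filtered).max())  # near 0: removed before decimation
```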


Fillip, I"m afraid you and I have differing perceptions of the word "printing".  I am talking more about printing through an enlarger of a negative (or slide in special cases) onto RA-4 or Ilfochrome paper, akin to printing ECN-2 onto ECP.  You seem to be referring more to newsprint.  I thought that you were saying that no one was using the enlarger-to-photosensitive paper technique anymore, which is what got me so angry.  It irks me to no end when people ask me if I'm using a digital camera or have a good "inkjet printer", whereas I am in fact still using an optical enlarger.  I concede that 8-bit offset printing is of about the same quality as silkscreen, although it is possible to get screening done the old-fashioned way though.  I don't mind scanning at this stage, as either analog or digital presswork will reduce quality.  However, since the printing of movie prints is quite similar to striking paper prints from still negatives, your analysis would be to me like saying that 8 bit offset prints in National Geographic are identical to a fine-art print onto a piece of Endura paper using optical printing, which doesn't hold water.

Yes, it is done the same old-fashioned ANALOG way, using splicing and negative cutting. Why do it some other way when the movies are in fact still shot on 35mm film, the same format as was used in the beginning, with only a few minor changes and improvements in film base, light sensitivity, and perforation pitch over more than ONE HUNDRED years of motion pictures? It is a system that works, and the only time that negative cutting and splicing has ever been an issue was with the movie Chicken Run, where they were worried that the negative would be too fragile with all of the splices from the many short shots in the movie. In fact, in some films you can see the fine splices between each shot at the bottom of the frame during a scene change, although it depends on the theatrical projection format and the projector mattes used. I recall noticing this for the first time while seeing The Alamo. While it may seem "cool" to do everything on a computer, and editing is almost always done on computers now instead of doing everything with workprints, they still go back to the film in most cases to make a final cut and spliced master negative. This reminds me of what my projectionist friend told me about a lady who came to a movie late once and asked an usher in the theatre if they could "rewind the DVD".

 

~Karl Borowski

 

 

Actually, I'm thinking of both: print as in YCM printing, and photo printing on photo paper.

 

Of course there are still a lot of people doing optical printing, but digital scanning and printing have become something of a new standard among both professionals and amateurs, not to mention the consumer market.

Most advanced amateur photographers have no choice but to print color film digitally; not everyone has access to professional labs that print optically. And even professionals use digital input/output. In fashion and commercial work, no matter what the output is, the photo print or paper print has to pass through the digital domain for correction, retouching, etc. Either scanned prints are used, or negatives, but it ends up digital at some point.

 

Not to mention that a LOT of amateur and professional photography is done with reversal film, which has to be scanned because it's quite rare to find someone who will do internegatives today.

 

If you just look at today's minilabs being manufactured, there are no optical minilabs anymore; all of them have a laser exposure system. Even though they are consumer-oriented, their output is quite good, and a lot of amateurs use them to make beautiful prints from professional-quality scans they made at home.

 

I'm not saying that optical printing is gone. As long as there is film, there will be optical printing, because that's the most direct and simple way of doing it. But it's not the most common way of doing it today.

Edited by Filip Plesha

Post-producing a movie in the traditional, photochemical way still makes a lot of sense. First of all, it is much cheaper than doing a DI (a good DI can cost between $200,000 and $400,000). It also still gives you the best picture quality, because despite what a lot of people say, putting a movie through a DI has a lot of drawbacks: the blacks are weaker, the sharpness is reduced, and the color depth isn't as good as when timed photochemically.

 

I can't agree with any of your points. Maybe you could elaborate on why the blacks should look weaker (???), why the sharpness should be reduced (if scanned at a decent resolution and without using DNR), and why, when scanned at a decent bit depth, you should lose color depth.

I think you are creating a myth around film as opposed to digital media. Film holds a certain amount of information, and that's what the whole concept of oversampling is for when transferring into the digital realm.

Let's break it down to the facts. Any point of the film negative can be represented by three numbers, and if you give those numbers enough digits, you don't lose anything of importance. Period. And if you give them enough digits that doing some maths on those numbers won't round off too much, you still won't lose anything of importance. Anything else is a problem in the chain, which, if you know about it, you can work around or solve (this technology is still pretty new).
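An illustrative sketch of the "enough digits" argument (the adjustment values below are hypothetical): a gamma correction and its exact inverse applied ten times over is lossless if kept in floating point and quantized once at the end, but loses levels if rounded to 8 bits at every intermediate step -- exactly the long-chain problem raised earlier in the thread.

```python
# Hypothetical chain of 10 gamma round-trips, rounded per-step vs. once at the end.
import numpy as np

values = np.linspace(0.0, 1.0, 1001)

chain8, chainf = values.copy(), values.copy()
for _ in range(10):
    for op in (lambda v: v ** 0.9, lambda v: v ** (1 / 0.9)):
        chain8 = np.round(op(chain8) * 255) / 255   # 8-bit bottleneck at every step
        chainf = op(chainf)                          # full precision throughout

print(len(np.unique(np.round(chain8 * 255))))  # < 256: levels lost to repeated rounding
print(len(np.unique(np.round(chainf * 255))))  # 256: one final rounding loses nothing
```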

 

For over a century, great-looking films have been made without a DI.

Maybe we should just leave everything as it is and deny that there is something out there which does(!) give you new possibilities for expressing ideas. Maybe we should just do what we have done for over a century, because we're sure nobody will miss anything or get bored, and because we're on the safe side when not trying anything new.

Spoken like the real avant-garde...

 

-k


  • Premium Member

I take it you can't tell a DI from a photochemically finished movie, then, when you see one at the theatre.

 

The blacks are weaker because the current output devices, be they laser or CRT, simply have limitations that do not allow them to produce a black identical to what one would get with contact printing. Of the two, CRTs produce better blacks than lasers.

 

Current DIs are done at 2K, and it is widely accepted in the industry that that resolution is not enough. It doesn't matter as much for closer shots, but very wide shots in DI movies always look slightly soft. Scanning at 2K certainly isn't 'oversampling', as you put it, since the original negative has more information than 2K to begin with.
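The back-of-the-envelope arithmetic behind that claim, using assumed round figures (a Super 35 full aperture of about 24.9 mm, and the commonly quoted ~80 lp/mm resolving power of modern camera negative):

```python
# Nyquist limit of a scan across an assumed ~24.9 mm Super 35 aperture.
aperture_mm = 24.9
neg_lp_per_mm = 80.0   # often-quoted ballpark for modern camera negative

for px in (2048, 4096):
    nyquist = (px / aperture_mm) / 2   # line pairs/mm the scan can represent
    print(f"{px} px: {nyquist:.0f} lp/mm vs ~{neg_lp_per_mm:.0f} lp/mm on the negative")
# 2048 px: ~41 lp/mm -> well under the negative
# 4096 px: ~82 lp/mm -> roughly matches it
```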

 

As for color reproduction, a DI has problems with accurately representing certain colors, particularly warm ones like orange and gold. In this respect, Bill Pope's comments on the grading of Spider-Man 2 are very interesting.

 

Thinking that with current technology one can take a movie into the digital realm and back without losing anything is simply wrong. It changes the feel of the picture. Maybe you don't notice it, but there are people who do.

 

And if I say timing photochemically still makes a lot of sense, that doesn't mean one shouldn't do DIs to develop and improve the technology, as you are implying.


  • Premium Member
Thinking that with current technology one can take a movie into the digital realm and back without losing anything is simply wrong. It changes the feel of the picture. Maybe you don't notice it, but there are people who do.

 

In theory, if one scanned and output at 4K or higher, it could come pretty close to duplicating the original negative (ignoring the flatter blacks of laser recorder output for the moment) IF there was absolutely nothing done to the scan... which never happens, because one of the points of a DI is to use digital color-correction tools. And as soon as you start changing the scan, you create colors or lose information, etc., making it look different from what a photochemical process would produce. However, one of the points is to do something more than what the photochemical process does... so it is sort of a Catch-22.

 

After all, we've all seen movies finished photochemically where some digitally fixed shot was cut into the movie quite seamlessly, where we end up reading later in some magazine article "we had to erase that sign on the building..." and think "gee, I don't remember seeing anything odd about that shot." So clearly it's possible to match the look of the original piece of film. Of course, we've also seen digital efx shots that stick out like a sore thumb.


After all, we've all seen movies finished photochemically where some digitally fixed shot was cut into the movie quite seamlessly, where we end up reading later in some magazine article "we had to erase that sign on the building..." and think "gee, I don't remember seeing anything odd about that shot." So clearly it's possible to match the look of the original piece of film. Of course, we've also seen digital efx shots that stick out like a sore thumb.

Back in 2001, I was co-supervising post-production on a Super 35 feature that I had also edited. We started answer-printing the flat negative (prior to blowing it up anamorphically via optical printing) and noticed a major daylight sequence riddled with very noticeable scratches; approximately 20-30 shots were affected. Liquid gating didn't help, and by then the neg was cut, so with a nine-day deadline looming we made the decision to fix the entire section of the film digitally (about 1.5 minutes of screen time).

 

The plan was to do this particularly complex scratch-removal assignment without altering the color balance or density of the digital files in any way. We scanned the cut-neg section at 2K (10-bit log), and I had about eight Shake artists tackle it. Once the fixes were complete, we recorded the entire section back to film at 2K and cut this new section into the surrounding original neg. I swear it looked great. I was shocked, as I had expected a reduction in sharpness and contrast -- basically everything you associated with film recorders of the time. I attribute this to the fact that, aside from the cloning and paint work done to effect the scratch removal, the neg went in and out clean. (And incidentally, after we anamorphized our neg, you still couldn't tell the difference. Honest.)

 

Saul.


I take it you can't tell a DI from a photochemically finished movie, then, when you see one at the theatre.

 

The blacks are weaker because the current output devices, be they laser or CRT, simply have limitations that do not allow them to produce a black identical to what one would get with contact printing. Of the two, CRTs produce better blacks than lasers.

 

Well, I could tell the difference with at least a few, but only by the precision with which certain colors had been worked on, which was just unusual in a positive way.

As for the blacks, I've only worked on two projects myself which were later printed. If I remember right, these were both done with CRTs (one an Oxberry, the other I don't know), and admittedly I didn't really notice weaker blacks -- but maybe you could.

 

Current DIs are done at 2K, and it is widely accepted in the industry that that resolution is not enough. It doesn't matter as much for closer shots, but very wide shots in DI movies always look slightly soft. Scanning at 2K certainly isn't 'oversampling', as you put it, since the original negative has more information than 2K to begin with.

 

As for color reproduction, a DI has problems with accurately representing certain colors, particularly warm ones like orange and gold. In this respect, Bill Pope's comments on the grading of Spider-Man 2 are very interesting.

I didn't say that 2K is enough. My feeling, however, is that 4K would be enough oversampling that you would see no difference in sharpness. If you don't think so, go higher if you want.

The same is true for color changes. Why should a DI in general(!) have problems representing colors? Maybe 10-bit log is not enough if you do heavy maths, but you could go higher if you wanted to: 16-bit or 32-bit float HDR formats (I mean, there is no myth to orange, is there?). The technology is already there; it's just not yet practical or cost-effective.

All I am saying is that going digital doesn't necessarily have to mean a loss of quality.
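A small sketch of the headroom point (a generic normalized 0-1 integer container is assumed here, not any specific file format): an integer encoding throws away anything above its coded white, while a 32-bit float "HDR" container keeps superwhites intact for later maths.

```python
# Generic 10-bit integer container vs. 32-bit float, illustrative only.
import numpy as np

scene = np.array([0.0, 0.18, 1.0, 4.0, 16.0])            # includes values above "white"

ten_bit = np.round(np.clip(scene, 0, 1) * 1023) / 1023   # integer container: >1.0 is gone
hdr = scene.astype(np.float32)                            # float container: kept exactly

print(ten_bit)      # [0.  0.18  1.  1.  1.]  highlights clipped
print(hdr * 0.25)   # pull exposure down two stops: highlight detail comes back
```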

 

Even today, I'd personally rather live with the problems mentioned, and try to solve them as much as possible, than live without the amazing possibilities a DI has to offer.

Saying that, I'm of course not the one paying for it... ;-)

 

-k


" I'm so scared that moviemaking is going ot follow the same path and become commercial to the detriment of Ars Artis Gratia. . ."

 

Moviemaking will become commercial? I'm shocked. :o

 

 

Print on Premiere, nice blacks.

 

(yeah I know, the same producer who'll spare no expense in the DI stage won't spend a few cents per foot more for Premiere......)

 

-Sam


So clearly it's possible to match the look of the original piece of film. Of course, we've also seen digital efx shots that stick out like a sore thumb.

David is onto the important point here.

 

Statements elsewhere in this thread that anything through a DI is inherently worse (or inherently better) are foolish and ill-informed. As with almost anything complex (and both photochemical and digital imaging technology are certainly that), you can do the job well or you can do it badly.

 

Yes, we've all seen DIs that show up the technology badly -- whether it's the vogue for degraining, or just bad colorimetry, or even the use of inferior file formats. And we've seen some excellent ones. People who claim they can always spot a DI in the cinema (or from across the street!) have probably also seen some DI shots (or complete films) that were so good they didn't recognise them as such.

 

Similarly, we've seen some poor examples of photochemical duplication and optical effects, and some that are a triumph of the medium.

 

It's down to who does the job, and how they use the equipment, not a plain, simple "this is better than that, regardless of anything else".


DI will come fully alive once we have 4K projectors with contrast like a top print. Till then, the digital data is just an abstract promise of an image that could be but is not, because we can't fully render what it represents. The best overall rendering of digital data, for me, is a good CRT projector. It works only in home cinemas and has issues of its own, but as far as creating a film-like picture of incredible beauty goes, it's unsurpassed.

Concerning weak blacks from DIs: well, I see weak blacks all the time on traditional prints. They have no blacks, and neither does the cinema, with the friendly help of bad optics, exit signs next to the screen, etc. Blacks that deserve the name I find only at home. Unfortunately, industry reference screening rooms are not open to the general public.

Watched "Alexander" yesterday. Despite 2 analogue steps (minimum) the aliasing around the letters in the opening and closing credits (the ones that grow) was easy to see from 1 screen height distance. Digital systems must be aliasing free. As long as such a fundamental requirement is not met we are still fighting with the basics. One advantage in going to 4K is making these letters aliasing free and not losing any sharpness, making them even look sharper, but not with jaggies.

Edited by miha

Then it wouldn't be a DI -- "digital intermediate", i.e. film-digital-film.

 

Right, it needs a new name. :) Digital final. Digital master original...

Till then we will have DIs that are used for film prints and for digital cinema masters as well. The intermediate part will not be dropped from distribution quickly, and from archival use even less so.


Guest Sean McVeigh

It's tough even to fathom the amount of hardware required to store and play back 4K material uncompressed. I'm speccing out storage for 2K onlining, and transfer rates approach 200 MB/second. At 4K, quadruple that!
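The rough arithmetic behind those figures, under assumed parameters (2048x1080 frames, 10-bit RGB packed three channels into four bytes per pixel as DPX commonly does, 24 fps playback):

```python
# Uncompressed data-rate estimate under the assumptions stated above.
width, height, fps = 2048, 1080, 24
bytes_per_pixel = 4                  # 3 x 10-bit channels padded to 32 bits

frame_mb = width * height * bytes_per_pixel / 1e6
print(f"2K: {frame_mb * fps:.0f} MB/s")       # ~212 MB/s, near the 200 MB/s quoted
print(f"4K: {frame_mb * fps * 4:.0f} MB/s")   # 4K doubles both dimensions: 4x the rate
```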

 

Seems that pulling celluloid past a flashlight is still by far the cheapest way to show movies :)

 

(Obviously, with compression, the transfer specs drop, but wow! I mean... wow!)

