
The end of film for TV production?


Keith Walters


... The 2K DI process is however better than the 35mm optical blowup process from S35, which it replaced.

O. K. You're referring here to a movie shot on Super35, and then converted to Regular35. Yes, that is an extremely difficult task to do optically. I had written to Kinoton and Cinemeccanica three years ago to suggest a new design for a Projector that could run both Regular35 and Super35 films so that movies could be released in S35. They indicated that they would have liked investment from Hollywood. The studios are of course pursuing Digital in order to make even more money.

 

AMC is going to convert all of its 4500 screens to 4K digital projection by the year 2012. The biggest difference to be seen in 4K projection will come with 65mm film origination.

Who is manufacturing this 4 MegaPixel Digital Projector?

Just to clarify: the measure of resolution for a Digital Projector is not the same as for a Film Recorder. For a Recorder, 2K & 4K refer to the number of Pixels across the width of the 35mm Frame. For a Projector, 2 MP refers to the total number of Pixels in the image.

 

Here is one example of problems with Digital Projectors (from Wikipedia):

DLP projectors utilizing a mechanical spinning color wheel may exhibit an anomaly known as the “rainbow effect”. This is best described as brief flashes of perceived red, blue, and green "shadows" observed most often when the projected content features high contrast areas of moving bright/white objects on a mostly dark/black background. The scrolling end credits of many movies are a common example, and also in animations where moving objects are surrounded by a thick black outline. Some people perceive these rainbow artifacts frequently, while others may never see them at all.


"The Goods," "Extract," and probablby "Sorority Row" are photo-chemical through-and-through.

 

"The Goods", if you're referring to the recent Jeremy Piven movie, had a DI done at Laser Pacific, by Mike Sowa. Sorority Row was a DI done at Technicolor by Jill Bogdanowicz. I'm not sure about "Extract," so I won't offer information on that.

 

A hard matte has nothing to do with whether it's a DI finish or not. It has more to do with whether it is shot in 4 perf format or, these days, the more common 3 perf format. If a picture is shot in 4 perf, it's usually a studio delivery requirement to supply a "full frame" digital negative. If it's on 3 perf, that's not possible because the "full frame" is 1.77:1.


O. K. You're referring here to a movie shot on Super35, and then converted to Regular35. Yes, that is an extremely difficult task to do optically. I had written to Kinoton and Cinemeccanica three years ago to suggest a new design for a Projector that could run both Regular35 and Super35 films so that movies could be released in S35. They indicated that they would have liked investment from Hollywood. The studios are of course pursuing Digital in order to make even more money.

 

Do you know what Super 35 is, or why and how it's used? Karl was likely referring to shooting Super 35 for anamorphic release, which requires either an optical step for both the anamorphosing and blowup, or a DI to create a 4 perf anamorphic frame digitally. Not to mention that much of what is shot in Super 35 format today is shot on 3 perf format, which cannot be contact printed to 4 perf release formats. Nobody shoots Super 35 with the intention of releasing that way, and it is basically impossible to build a projector for it because it covers the area normally used for a soundtrack. So unless you're projecting silent films, that's just not a possibility. And your constant criticism of a company trying to make money is getting very tiresome.

 

 

Who is manufacturing this 4 MegaPixel Digital Projector?

Just to clarify: the measure of resolution for a Digital Projector is not the same as for a Film Recorder. For a Recorder, 2K & 4K refer to the number of Pixels across the width of the 35mm Frame. For a Projector, 2 MP refers to the total number of Pixels in the image.

 

You are really in way over your head here. Everything you just said is completely incorrect. The term "megapixels" is normally used for describing a Bayer pattern digital still camera. Digital cinema projectors are referenced by the horizontal resolution, in most cases, 2K or 4K. Current 2K projectors have 3 chips, not one. So if you actually wanted to talk about megapixels - even though that is completely incorrect terminology - a 2K DLP Cinema projector would actually be 6.6 megapixels (2048x1080x3), but as I said, nobody uses that terminology. The current 4K projector made by Sony has three 4K chips, yielding a total of over 26 megapixels, if one actually wanted to refer to it that way.
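For anyone who wants to check the arithmetic, here is a minimal Python sketch of the pixel counts being quoted (the per-chip rasters are the standard DCI 2K and 4K sizes; the megapixel totals simply multiply by the number of chips):

```python
# Pixel-count arithmetic behind the figures quoted above.
def pixels(width, height, chips=1):
    """Total pixel count for a projector with the given per-chip raster."""
    return width * height * chips

print(pixels(2048, 1080))     # 2,211,840  -> ~2.2 MP per 2K chip
print(pixels(2048, 1080, 3))  # 6,635,520  -> ~6.6 MP across three 2K chips
print(pixels(4096, 2160))     # 8,847,360  -> ~8.8 MP per 4K chip
print(pixels(4096, 2160, 3))  # 26,542,080 -> ~26.5 MP across three 4K chips
```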

 

Here is one example of problems with Digital Projectors (from Wikipedia):

DLP projectors utilizing a mechanical spinning color wheel may exhibit an anomaly known as the “rainbow effect”. This is best described as brief flashes of perceived red, blue, and green "shadows" observed most often when the projected content features high contrast areas of moving bright/white objects on a mostly dark/black background. The scrolling end credits of many movies are a common example, and also in animations where moving objects are surrounded by a thick black outline. Some people perceive these rainbow artifacts frequently, while others may never see them at all.

 

The only DLP projectors that use color wheels are single chip projectors sold for industrial and home theater use. DLP Cinema projectors all have three DLP chips and use dichroic filters to separate a constant light source (Xenon) into three color components, one for each chip. None of the issues you describe exist in DLP Cinema projectors.

 

I would say that one shouldn't post things that are out of one's area of expertise and state them as fact here, but clearly very few people would actually listen.


  • Premium Member

The reality is, Current Generation Digital cinema projectors simply don't have enough boxes ticked to be commercially successful.

 

They work, and good ones work extremely well, but not all "Digital Cinemas" have good projectors.

 

The current biggest issues are installation/maintenance cost and uncertainty over useful operational life.

Everything really comes down to money. If the projector costs $20,000 and only lasts for 5 years, that would probably be acceptable.

$200,000 - no.

 

The real answer is better, cheaper projectors and the technology that will finally make it happen is high-powered LED light sources.

 

Current-design projectors use fragile gas discharge tubes which are dangerous to operate, have only a limited useful operational life, and (usually) require considerable technical skill to maintain. The white light has to be split into its red, green and blue components by a precision dichroic mirror assembly, modulated by three separate light-gate devices (LCD panels or DLP), and then recombined by another dichroic mirror assembly, all of this inside a box containing a powerful heat-generating gas discharge lamp.

 

Separate Red, Green and Blue LED sources would eliminate one dichroic mirror assembly, most of the heat, all the high voltages and most of the risks of operation. Because LEDs run so much cooler and don't produce UV or IR, the life expectancy of LCD or LCOS panels would also be greatly improved.

 

White LED headlights are already appearing on concept cars and the same technology could be used to produce mega-bright Red, Green and Blue LEDs for large-screen projectors. Texas Instruments are actively pursuing 4K DLP production, and although they haven't specifically mentioned LED light sources, I would be very surprised if that was not what they have in mind.

 

I really don't get some people's comments about 2K digital vs film projection. If a 2K DI is used, both start with the same images and with a digital projector, those images are then more or less projected directly onto the screen.

 

With a 2K DI and film release, generally, the same images will first be burned onto a master negative using an Arrilaser or the like; numerous contact copies will then be made for distribution to bulk duplicating facilities around the world; after any editing to suit local sensibilities, the facilities will then make their own duping copies, and finally the release prints will be contact printed from those. So that's four generations of contact printing from the original DI.

 

Naturally, if you start the chain without the DI and use the master negative instead, the end result is going to be theoretically better because you are starting with much higher resolution; however, you lose the very real post-production "tweaking" possible with DI, which more than compensates for this.
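As a purely illustrative sketch of why those extra printing generations matter, one can model each contact-printing step as multiplying the system MTF by a retention factor; the 0.9 figure below is an assumed value chosen only to show how the losses compound, not a measured number:

```python
# Illustrative only: each contact-printing generation multiplies the
# system MTF (sharpness) by an assumed retention factor. Real losses
# depend on stocks, printers and spatial frequency.
def remaining_mtf(generations, per_gen_retention=0.9):
    return per_gen_retention ** generations

for gens in (0, 1, 4):
    print(f"{gens} printing generation(s): {remaining_mtf(gens):.0%} of the original MTF")
# With the assumed 0.9 factor: 0 -> 100%, 1 -> 90%, 4 -> ~66%
```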

 

Watch this space.


  • Premium Member
Next time a questionnaire asks me what I do for a hobby, can I put "correcting the unmitigated drivel of Thomas James"?

But if you corrected it, you've just mitigated it, haven't you?

Whatever happened to Jan von Krogh? HE'LL bring some order to this techno-rabble quicksmart :P


  • Premium Member

I've had notions of going with a higher res, limited release strategy with my recorder rig: Shoot 2000' neg rolls with the sound track contact printed on. Then pull release prints from those. 2nd gen release prints from 4K negs would probably look pretty good.


  • Premium Member

Come to think of it, I've still got a Lasergraphics Mk IV (the good one that doesn't have the dot halo problem). I could pull 8K negs off it and contact something like a 6K equivalent release print from it. It would take about 6 months to get that neg. But, boy howdy, what would those release prints look like?


Do you know what Super 35 is, or why and how it's used? Karl was likely referring to shooting Super 35 for anamorphic release, which requires either an optical step for both the anamorphosing and blowup, or a DI to create a 4 perf anamorphic frame digitally. Not to mention that much of what is shot in Super 35 format today is shot on 3 perf format, which cannot be contact printed to 4 perf release formats. Nobody shoots Super 35 with the intention of releasing that way, and it is basically impossible to build a projector for it because it covers the area normally used for a soundtrack. So unless you're projecting silent films, that's just not a possibility. And your constant criticism of a company trying to make money is getting very tiresome.

Yes I do know -- thank you very much! I've read about James Cameron's excellent work with wet gate! You're describing the exact problem I'm thinking of. You have a habit of condescendingly telling people things that they already know with a tone as if they don't know. Super35mm only displaces the Analogue Sound Track and DTS Time Code -- not the Digital Sound Tracks. Why would producers shoot movies for release in Super35 when there are no Projectors around to run them? That's why I proposed a Regular/Super35 Projector!!! Without endlessly digressing, "a company trying to make money" is an extremely amusing and ridiculous way of referring to a Hollywood Studio.

 

... The term "megapixels" is normally used for describing a Bayer pattern digital still camera. Digital cinema projectors are referenced by the horizontal resolution, in most cases, 2K or 4K. Current 2K projectors have 3 chips, not one. So if you actually wanted to talk about megapixels - even though that is completely incorrect terminology - a 2K DLP Cinema projector would actually be 6.6 megapixels (2048x1080x3), but as I said, nobody uses that terminology. The current 4K projector made by Sony has three 4K chips, yielding a total of over 26 megapixels, if one actually wanted to refer to it that way. ...

I will concede to you 2/4K terminology for DLP Projectors. However, MegaPixels has been used elsewhere in describing the Projector's resolution -- so you're wrong about that. The 6.6 and 26 Megapixels you refer to are false numbers since they add the Red, Green and Blue chips together. The actual image resolution is 2,211,840 pixels (2048 x 1080), corresponding to the pixels of the file. I looked for the specs on the new Sony, and its resolution is 8,847,360 -- NOT 26 million as you state.

Good luck to the theaters in sustaining the high expenses of maintaining these Projectors. They cost up to $150,000 and only last 5 to 10 years -- whereas a Film Projector only costs $50,000 and lasts up to 40 years! This new 4K Projector confirms how stupid it was to waste money buying 2K Projectors. Digital Projectors also have much higher maintenance costs. Another problem with the three-chip Projectors is that they can't produce colours as well as the one-chip Projectors.

 

My apologies to Matthew Phillips for digressing from the main topic of his Thread.


  • Premium Member

The LED idea is interesting, but I think we're probably a bit premature. LEDs with per-junction power handling above a very few watts don't really exist outside the laboratory, meaning that we need to see them increase in capability by two or three orders of magnitude before they'll be usable at the kilowatts-per-emitter level required for projected light imaging. Also, they tend to become less and less efficient as they run hotter, which in the real world is highly proportional to the amount of power they're being asked to handle. This can lead to the vicious circle of thermal runaway and, eventually, destruction of the device, and it's necessary to aggressively heatsink and even use forced air cooling with current high power LEDs to mitigate this. For this reason I don't think that an LED projector would necessarily run stone cold, though it might run cooler than a discharge type.

 

That said, we'll likely see monochromatic devices hit that point before the white ones that everyone else seems to be most interested in, and the idea about not having to split up white light is an interesting one. Of course if there are fundamental advances in any of these areas, great, we're all happy.

 

P


  • Premium Member
The LED idea is interesting, but I think we're probably a bit premature. LEDs with per-junction power handling above a very few watts don't really exist outside the laboratory, meaning that we need to see them increase in capability by two or three orders of magnitude before they'll be usable at the kilowatts-per-emitter level required for projected light imaging. Also, they tend to become less and less efficient as they run hotter, which in the real world is highly proportional to the amount of power they're being asked to handle. This can lead to the vicious circle of thermal runaway and, eventually, destruction of the device, and it's necessary to aggressively heatsink and even use forced air cooling with current high power LEDs to mitigate this. For this reason I don't think that an LED projector would necessarily run stone cold, though it might run cooler than a discharge type.

 

That said, we'll likely see monochromatic devices hit that point before the white ones that everyone else seems to be most interested in, and the idea about not having to split up white light is an interesting one. Of course if there are fundamental advances in any of these areas, great, we're all happy.

 

P

I think the most positive aspects are the interest shown by car manufacturers, and people looking for more efficient lighting in order to gain carbon credits.

 

There are already quite a few manufacturers marketing LED-based video projectors. I've never actually seen one (although this will change shortly), but going by the reviews I've read, they do seem to work. Most of them seem to simply project a small LCD TV screen using a white LED source - inefficient, but cheap.

 

I suspect that the same amount of LED power divided between Red, Green and Blue emitters and using 3 DLP or LCOS light valves would give a surprisingly bright picture.

 

Also, the emitter doesn't have to be a single LED junction. There could either be an array of LEDs on a heavy heatsink, or a light piping system based on fibre optics that would spread the LEDs out a bit. Remember also that the LEDs would only need 4-5 Volts maximum to operate, so they would work perfectly happily under water, and so cooling might not be such a problem as you think.

 

Apart from this, the LEDs don't necessarily have to run at full brightness all the time. In fact this would allow the electronic projectors to produce real blacks for a change :lol:

 

But, basically, if this (or something very like it) doesn't happen, universal digital projection in cinemas is still going to be a long way off.


  • Premium Member

I recall from my first time in college when I saw a laser show at a local rock show. It could form images with a tight beam and apparently was not a significant health hazard. The light was green. Could a three gun, RGB laser scan line out a satisfactory movie image onto a screen? It could do a continuous beam for analog or a pulsed beam for digital.


Good luck to the theaters in sustaining the high expenses of maintaining these Projectors. They cost up to $150,000 and only last 5 to 10 years -- whereas a Film Projector only costs $50,000 and lasts up to 40 years! This new 4K Projector confirms how stupid it was to waste money buying 2K Projectors.

 

Try 100 years for the life of a film projector. I've heard about multiplexes running 80-year-old machines without any problems.

 

As for AMC and 4K, whoever said that, maybe they can buy some from, I think, Regal, who just decided to send all of theirs back. . .

 

And meanwhile, all of these theatres have bought 2K machines that are now going to look worse than movies made from 4K DIs.

 

 

I am pretty sure that most studios make DI masters onto IP stock or onto neg stock, making 4-6 INs on '01. So it's usually "only" 1-2 generations removed from the master instead of 3, I think. Could be wrong. All I know is that EFILM DIs consistently look pretty crummy.

 

I also know that films looked a lot better before any of this DI sh*# came around. Now, why is that? Is it the fault of chemical film and the high-speed contact printing process that movies have gotten so crummy-looking in the past 5-6 years that they appear out of focus even when they aren't?

 

Does a digital intermediate with at least two generations of copying loss really not play into this process at all? Is streamlining the cost of television mastering really worth the hit in quality that cinema screens are taking? Are cinemas going to survive the model of reduction in quality at a higher price, really?


  • Premium Member
I said that the scans are done on a scanner that operates at 4K and scales to 2K for a 2K output. The DI work itself is done at 2K, but the general feeling - borne out by testing - is that this method provides you with a print that is much closer to a full 4K path than a "straight" 2K scan does.

 

Yes. This is true because of what I was talking about before about the Nyquist limit and digital vs. optical low pass filtering.

 

Scanning is optical, so it's constrained to use filters that only have positive coefficients. To make the N/2 Nyquist limit, those filters have to start rolling off at N/4. So, the rolloff has to start at 1K to be out by 2K so as not to alias in a 4K scan. If you were actually scanning at 2K, you'd have to start rolling off at 0.5K to be out by 1K.

 

The resampling step from 4K to 2K happens digitally, so you can get damn close to a brick wall filter. To make the N/2 = 1K limit for the downconversion to 2K, you can get everything up to just a hair under 1K. And you have it, because the 4K OLPF doesn't start rolling off until 1K.

 

That's why 2:1 oversampling gets you the extra top octave detail that you're seeing in your DI's. BTW, there's no real advantage to going much beyond 2:1.
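A rough 1-D illustration of that oversampling argument, under some assumed filter widths (this is just a sketch with Gaussian "optical" filters standing in for real scanner optics, not a model of any actual machine):

```python
# Compare (a) a "direct 2K" capture, where a gentle optical-style low-pass
# has to start rolling off well below Nyquist, with (b) a 4K capture using a
# milder optical filter followed by a near brick-wall digital filter and 2:1
# decimation. Sigma values are assumed, purely to make the effect visible.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import resample_poly

n4k = 4096
x = np.linspace(0, 1, n4k, endpoint=False)
detail = np.sin(2 * np.pi * 900 * x)   # fine detail just below the 2K Nyquist limit (1024 cycles)

# (a) heavy "optical" rolloff needed for a direct 2K scan, then a naive 2:1 pixel drop
direct_2k = gaussian_filter1d(detail, sigma=3.0)[::2]

# (b) mild "optical" rolloff at 4K, then a sharp digital anti-alias filter + 2:1 decimation
oversampled_2k = resample_poly(gaussian_filter1d(detail, sigma=1.0), up=1, down=2)

print("retained fine detail, direct 2K scan:      %.3f" % (np.std(direct_2k) / np.std(detail)))
print("retained fine detail, 4K scan downsampled: %.3f" % (np.std(oversampled_2k) / np.std(detail)))
```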

 

 

 

 

 

-- J.S.


  • Premium Member
Mainly, though, interlaced video has about 70 to 75% the vertical resolution of an otherwise equivalent progressive frame due solely to the need for anti-flicker filtering, and exactly the same horizontal resolution.

 

I'd say more like 65 - 70%. It's a matter of how much small area flicker aliasing you're willing to tolerate in trade for the extra resolution. The one thing that made interlace worth doing is that it worked in the analog world. It's actually a sort of analog lossy compression.
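In concrete terms, here is what those percentages mean when applied to a 1080-line raster (just the arithmetic, using the figures being debated above):

```python
# Effective vertical resolution of an interlaced 1080-line raster for the
# interlace factors quoted above, versus 1080 progressive lines.
lines = 1080
for factor in (0.65, 0.70, 0.75):
    print(f"interlace factor {factor:.2f}: ~{lines * factor:.0f} effective lines (vs {lines} progressive)")
```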

 

 

 

 

 

-- J.S.


As for AMC and 4K, whoever said that, maybe they can buy some from, I think, Regal, who just decided to send all of theirs back. . .

 

Cite your source for that. Because, just as you for some reason doubted some of the things I posted, I sincerely doubt this one.

Unless, of course, they're sending back units they already received to be replaced with 3D capable newer units.

 

And meanwhile, all of these theatres have bought 2K machines that are now going to look worse than movies made from 4K DIs.

 

Your opinion, not fact. And not shared universally.

 

I am pretty sure that most studios make DI masters onto IP stock or onto neg stock, making 4-6 INs on '01. So it's usually "only" 1-2 generations removed from the master instead of 3, I think. Could be wrong.

 

Studio pictures with reasonable budgets for the DI usually manufacture multiple printing negatives digitally, so the prints for these pictures are all "first generation," at least in the sense of being from the "original" negative. Other DI's usually manufacture one "original" negative, and then make an IP (or multiple IP's) from that, and multiple printing negatives from the IP - essentially the same process as a photochemical finish, but without the need for the scene to scene timing step.

 

All I know is that EFILM DIs consistently look pretty crummy.

 

Yeah, we get it. You don't like EFilm. I don't know why, nor do I care. I'm quite sure that not having the Karl Borowski seal of approval is not going to substantially hurt their business. Not to mention that I think it will come as a surprise to the producers and directors of photography of pictures like "Iron Man," "Angels and Demons," "Revolutionary Road," and countless others that their movies looked "crummy."

 

I also know that films looked a lot better before any of this DI sh*# came around. Now, why is that? Is it the fault of chemical film and the high-speed contact printing process that movies have gotten so crummy-looking in the past 5-6 years that they appear out of focus even when they aren't?

 

Does a digital intermediate with at least two generations of copying loss really not play into this process at all? Is streamlining the cost of television mastering really worth the hit in quality that cinema screens are taking? Are cinemas going to survive the model of reduction in quality at a higher price, really?

 

The rise in popularity of DI finishing has rarely, if ever, had anything to do with reducing costs. It has always been far more expensive to do even a "budget" DI than a photochemical finish. The fact that it also yields a video master has been one of the factors that made it financially practical, but certainly not cheaper. It has happened because it offers flexibility, both creative and technical, that allows DP's and directors to utilize certain abilities to (hopefully) tell their stories better visually. By allowing things like contrast changes and secondary isolations - both unavailable in a photochemical process - it has allowed them to enhance practical photography, fix production problems, achieve better shot matching in difficult shooting circumstances, better matching of visual effects shots, and other things. It has also allowed the use of things like 3 perf, which can achieve stock and processing cost savings while not sacrificing quality (the active image area is essentially the same as 4 perf if the aspect ratio is 1.85 or wider, and it directly translates to 1.77:1 video). Not to mention the ability to easily integrate multiple formats, both film and electronic, within the same show.
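A rough sanity check of that 3-perf point, using commonly quoted approximate Super 35 aperture dimensions (ballpark figures, not published specs):

```python
# Approximate Super 35 apertures (millimetres) -- treat as ballpark values.
width   = 24.9   # aperture width
h_4perf = 18.7   # 4-perf frame height
h_3perf = 14.0   # 3-perf frame height

print(round(width / h_3perf, 2))   # ~1.78 -> the 3-perf "full frame" is close to 1.78:1 (16:9)
print(round(width / 1.85, 2))      # ~13.46 mm -> height of a 1.85:1 extraction
# A 1.85 extraction (~13.5 mm tall) fits inside both the 3-perf and 4-perf
# frame heights, so the active 1.85 image area is essentially the same;
# 3-perf just trims negative that would go unused anyway.
```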

 

But all of this comes at a price. And yes, part of that price is that electronic color correction is, at its heart, a destructive process. Excessive color manipulation adds noise and other artifacts. But the creative and practical format freedom that DI brought to the table has long since been accepted as a step forward for storytelling, not a step back. It's not all about pristine photographic quality. It's about telling a story and affecting an audience visually. Your view that quality trumps all other considerations is, to me, simplistic and somewhat myopic. It's about entertainment, at a level of quality that an audience expects. You seem to think that audiences crave film projection, and that anything less is seen as a "reduction of quality." I and many others think just the opposite. I'm never going to convince you of that, and you're never going to convince me or a lot of other people. So I'm going to stop trying, and, in all likelihood, so should you.


  • Premium Member

>> Advanced High Definition one thousand line television systems were used by Germany during World War 2.

 

Bollocks

 

No such system was commercially available in Germany before or during the war, but there was research being done on it. It was at about the same level as 8K TV is today. One thing that the German scientists perfected very early in the time of the Third Reich was the art of grantsmanship. The government was funding all kinds of goofy stuff from the mid 1930's on into the war.

 

An 819 line interlaced monochrome system was tried in France after the war, but it was never a commercial success.

 

 

 

 

-- J.S.


  • Premium Member
Could a three gun, RGB laser scan line out a satisfactory movie image onto a screen?

 

It seems likely -- here are some folks working on it:

 

http://www.televisionbroadcast.com/article/87438

 

Perhaps the writer is confused on numbers, specifically the "1080 Hz" refresh rate.... ?

 

 

 

 

-- J.S.


  • Premium Member

There have been laser projection video displays before - I believe one was done for the Tokyo Olympics, almost as a tech demo, and I very nearly tried to build a (monochrome) one as a school project before I realised it would be difficult to achieve a 15.625 kHz horizontal scan rate using magnetic coils and mirrors.
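For reference, that 15.625 kHz figure is simply the line rate of a 625-line, 25 frame/s raster (standard European analogue TV timing):

```python
# Horizontal scan rate of a 625-line, 25 frame/s raster.
print(625 * 25)   # 15625 lines per second -> 15.625 kHz
```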

 

P


...

I also know that films looked a lot better before any of this DI sh*# came around. Now, why is that? Is it the fault of chemical film and the high-speed contact printing process that movies have gotten so crummy-looking in the past 5-6 years that they appear out of focus even when they aren't?

Does a digital intermediate with at least two generations of copying loss really not play into this process at all? Is streamlining the cost of television mastering really worth the hit in quality that cinema screens are taking? Are cinemas going to survive the model of reduction in quality at a higher price, really?

These problems can be directly linked to the 'corporate takeover' of Hollywood Studios twenty years ago. Before that time, the Studios were run by people who truly valued movie-making as an art. Robert Evans, the head of Paramount forty years ago, was very actively involved in the production of 'The Godfather'. The Warner Brothers, especially Jack, were actively involved in their movies. Samuel Goldwyn also produced his movies. The great Louis B. Mayer was the mastermind behind those fantastic MGM musicals. The great Cecil B. DeMille (who, by the way, founded Hollywood as the movie studios' location) was a joint owner of Paramount Pictures Corporation, and he very wisely shunned an executive position with the company in order to pursue the much more rewarding field of making movies. Thank goodness he shot 'The Ten Commandments' in much more expensive VistaVision -- DeMille didn't care about costs.

I recently read a 1959 memo from a Paramount exec who was tabulating how much extra it cost to film 'Vertigo' in Technicolor VistaVision instead of regular color -- all of $78,000 (less than $1 million today)! The corporations who now own the Studios are of the same cost-counting mentality -- they couldn't care less about movies as an art. They only care about making gobs and gobs of money -- nothing else.

 

Keeping on the topic of this Thread, long-term storage of digital recordings is much, much more costly than storing Film -- as well as much less certain. A couple years ago the Academy of Motion Picture Arts and Sciences concluded that the cost of storing digital recordings is "enormously higher - 1100% higher - than the cost of storing film masters." Those TV shows that are using Digital will pay the price in the long-term.

