
Star Wars Episode 2 - A millstone in cinematic history :-)



The point is that having a choice is good for us, not a threat.

So having a choice is good for us as long as we realize that "Digital will improve until it has surpassed film in every single way, except for its nostalgia value, and you'd be fooling yourself if you thought otherwise."

 

Thank you for your open-mindedness.


I don't see how stating this is anything but open-minded. Technologies always get better; that's the way it works. Just as your phone probably now holds five times as much computing power as a full desktop workstation from 15 years ago, and a million times as much as the first computers from 50 years ago, one day digital cameras will be ten, a hundred, a thousand times better than they are today.

 

You can find it sad or be excited about it; it doesn't change the fact that it will happen. In fact, it is a bit narrow-minded to think otherwise.

 

And let me stress again that film will never disappear, and I am fully happy with it. I am not one of those "digital is better" guys. I am just saying that digital will keep improving, while celluloid will plateau because it is, by its intrinsic nature, a medium that takes up physical space. There's only so much space around us that we can grant the medium we shoot on, and it would be impractical to shoot on something five times as large as an IMAX roll. On the other hand, you can definitely count on digital getting smaller and smaller while it improves its resolution, storage capacity, dynamic range, and everything else.

 

In fact, I wouldn't be surprised if, in ten years, a new algorithm allowed digital cameras to mimic the qualities of celluloid to the extent that professional DoPs couldn't tell which is which.

 

But really, again, the question is "Who cares?" Embrace the possibilities and leave out the rest. You don't like digital? Shoot film. You don't like film? Shoot digital. You like both? Shoot both. Film will always be around. Things just don't disappear; that's against the cyclical nature of the world. Things come and go, and move in and out of fashion. Again, look at vinyl records. People have a craving for nostalgia; they'll keep film alive.

 

In two hundred years, when digital cameras capture images with a hundred times the resolution of 70mm film, people will still shoot celluloid, "just because". So let's celebrate.


I don't see how stating this is anything but open-minded. Technologies always get better; that's the way it works. Just as your phone probably now holds five times as much computing power as a full desktop workstation from 15 years ago, and a million times as much as the first computers from 50 years ago, one day digital cameras will be ten, a hundred, a thousand times better than they are today.

 

You can find it sad or be excited about it; it doesn't change the fact that it will happen. In fact, it is a bit narrow-minded to think otherwise.

The irony is that, and I'm not trying to brag, you are talking to someone on a PhD track in Computer Science who probably understands digital technology better than you do. I am well aware of the advancements made in the field. But you are making a few errors in your reasoning that you may not realize.

 

1) Film and digital are entirely different media, and they are not comparable in enough ways to support the kind of reasoning that applies to advancements in hard drive technology, etc.

 

2) Technology has advanced vastly, but it is slowing down. Moore's law isn't really a law, and it isn't sustainable. In fact, you'll notice that clock speed hasn't increased in some time, hence the need for multi-core technology, which is, essentially, parallel computing. That can be great for tasks that are conducive to it, but for other kinds of applications multi-core is of very little use. Compare it to two vehicles moving furniture: a U-Haul is great if you need to haul a lot of stuff and throughput matters more than time, but when you are ten minutes late to a job interview, you just need to get there fast, and cargo space doesn't matter beyond moving yourself.
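
For what it's worth, the standard way to quantify that throughput-versus-latency trade-off is Amdahl's law. The sketch below is my own illustration (the parallel fractions and core count are made-up numbers, not anything from the post): if only a fraction p of a task can run in parallel, piling on cores quickly stops helping.

```python
# Amdahl's law: best-case speedup of a task on n cores when only a fraction p
# of the work parallelizes. Illustrative sketch, not from the original post.
def amdahl_speedup(p: float, n: int) -> float:
    """Upper bound on speedup when fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 64 cores, a task that is only half parallelizable gets less than
# a 2x speedup -- the "late to the job interview" case.
print(amdahl_speedup(0.5, 64))   # ~1.97
# A throughput-bound task (95% parallel) benefits far more -- the "U-Haul" case.
print(amdahl_speedup(0.95, 64))  # ~15.4
```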

 

I think you are one of many people who expect constant innovation but don't really understand what it entails. The world is going to get a shock in a few years when people realize that the current path of computing progress will stagnate. It has already begun. All the optimal algorithms have already been discovered and implemented, and although there is some research still going on to soak up government grants, it is unlikely that the P = NP question will ever be resolved, which means true AI is going the way of string theory. In other words, most of the "advancements" in computing will be gimmicky and artistic, not hard improvements that work in every situation.


I would add that, even though it's digital, sensor technology will eventually run into a wall of physics. You can't keep decreasing the size of photosites forever in order to gain resolution, nor can you keep increasing their sensitivity and/or capacity for ISO and dynamic range.

 

Where that wall is, I have no idea.


Fair enough. It is plainly obvious that the numbers I quoted were just pulled out of my crack (a hundred times more, a thousand times more, etc.), and I cannot hope to teach you anything about computer science; my last science lesson was fifteen years ago, so I'll take your word for it.

 

I do expect constant innovation; the fact that it is slowing down is irrelevant. I think things will always improve, even if it takes longer than the pace of improvement up to today would suggest.

 

Looking at the way digital has improved exponentially until now, is it that unreasonable to think that in fifty years the maximum resolution it affords will have increased at the very least five times? It could be; it's just a genuine question I'm asking myself.

 

The world has always been full of people saying "This is impossible" and other people showing them how they did it. Even if digital eventually plateaus at a resolution of 30K, and even if it takes two hundred years to get there, it will still have gone beyond what celluloid offers.

 

The only thing I'm sure of is that all the bullshit that has been cluttering this thread and turning it into another one of those "film is better, no, digital is better" arguments is completely irrelevant.

 

Yes, Episode II looks like a bad video movie. But Episode II is no more than a step along a path of discovery that has consistently freed filmmakers around the world by giving them more options to shoot with, and that has made people happy. I don't see why anyone here should feel entitled to mock George Lucas or the countless others who have followed him since. It's everybody's right to say his films are bad; the only thing I have a problem with is people feeling insulted by the advent of digital cinema when they should just ignore it and move on, or embrace it. I just don't see what's so terrible about digital technology. Does the fact that filmmakers embrace digital cameras somehow negate their ability to enjoy celluloid? To use it?


Film vs. digital should be a debate left purely to each filmmaker's artistic sense and personal taste. All this talk about which one is better, revolving around resolution, dynamic range, etc., is hopelessly irrelevant.

 

No one should choose one over the other just because of what resolution the recording format offers; that's not what it's about. You either love film or you don't; it either fits your idea or it doesn't. Same thing with digital.

 

I can't believe there are still people who feel threatened by digital technology, and others so insecure about their love of it that they feel the need to write pamphlets against celluloid. The film vs. digital debate has to be the single most boring topic a cinematographer can spend their time arguing over.

 

And yes, that means I'm out.


Nicolas, digital is phenomenal compared to what it was. I look at old TV shows shot in the '70s, the old Norman Lear shows that were shot on videotape, and they haven't aged well. They look plain awful. Compared to modern digital cameras, there is no contest. Digital is a reputable medium to work in now; there is no shame in it. Likewise, because of that fact, digital filmmakers such as yourself need to get off the defensive about "film is dead" and all of that. You have achieved a kind of parity with film, not equivalence, because both have different pros and cons, but parity in the sense of respect and appropriateness to given projects.

 

Adrian is quite right that there is a limit to what can be done with the digital sensor in its current state. Unless solid-state technology is replaced with something new (which I haven't heard anything about), we will reach the end of the "true" capabilities of the medium. Broken down to a simple level, the photosites on a sensor are light-sensitive elements that read light as voltages. To get a usable range, that stage must be analog, so there is truly no such thing as a "digital acquisition medium" in the purest sense. Digital is binary, which has only two values, 1 and 0, on or off; if you captured that way, you would only get thresholded-looking images. So analog is still the only way to capture. But early in the chain there is an ADC (analog-to-digital converter) that puts the data into digital terms: the range of voltages read from the photosites is processed, according to the designer, into values usable for forming a picture. It isn't hard to see that this will have scientific limits, although not artistic ones.
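
A minimal sketch of that chain, assuming an idealized linear ADC (my own illustration, not the pipeline of any particular camera; the 0.6 V full scale borrows the 600 mV photosite figure quoted later in the thread): the analog voltage is clipped at the ceiling and mapped to an integer code.

```python
# Idealized linear ADC: map an analog photosite voltage to an integer code.
# Illustrative only; real camera pipelines add gain stages, black-level
# offsets and non-linear transfer curves.
def adc(voltage: float, full_scale: float = 0.6, bits: int = 14) -> int:
    """Quantize a voltage in [0, full_scale] volts to a digital code."""
    clipped = max(0.0, min(voltage, full_scale))   # photosite "overflow" clips here
    top_code = (1 << bits) - 1                     # top code is 16383 for 14 bits
    return round(clipped / full_scale * top_code)

print(adc(0.0003))   # a dim pixel -> a small code value (8)
print(adc(0.75))     # over full scale -> clipped to the maximum code (16383)
```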


Nicolas, digital is phenomenal compared to what it was. I look at old TV shows shot in the '70s, the old Norman Lear shows that were shot on videotape, and they haven't aged well. They look plain awful.

 

That, I absolutely agree with.

 

Likewise, because of that fact, digital filmmakers such as yourself need to get off the defensive about "film is dead"

 

Excuse me, but I've said several times now that film will never die... I'm not on the film-haters' bandwagon. In fact, I can't stand their silly arguments. All I'm saying is: if your only reason to shoot film is that it has better resolution, be prepared to face the inevitable when digital gets higher resolution than film, because it will.

 

Let's embrace both film and digital as two different options yielding two different results on screen, regardless of all that resolution, DR, etc. bullshit. I'd choose film in a split second if it made sense for a personal project.


I believe about 95% of the people making statements about film theater print resolution have got the number from the 2001 ITU research or from later works quoting that research. Here is one (pages 6 to 8): http://www.etconsult.com/papers/Technical%20Issues%20in%20Cinema%20Resolution.pdf

That document pretty much backs up everything I've said here.

It agrees that 35mm negative can resolve up to 4000 lines. (I know it does; I've seen it under a microscope).

And it states that the maximum visible resolution that typically survives the usual 4-generation release printing process is about 1,500 analog lines horizontally.

But that is NOT 1,500 pixels or "1.5K".

 

There is no way a 2K projector can display a pattern of 1,500 vertical lines; 1,000 is all you get.

 

That is the single biggest LIE that has been repeatedly promulgated by the self-appointed champions of the so-called "Digital Revolution": that 1440 pixels on a crap HD Betacam is the same thing as 1440 analog lines on film. It is a filthy lie!
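
A quick worked version of the sampling arithmetic behind this, on my reading that "lines" here means line pairs (cycles), which is an assumption on my part: a digital display needs at least two pixels per cycle (Nyquist), so 2,048 horizontal pixels top out at about 1,000 displayable cycles, and a 1,500-cycle pattern would need roughly 3,000 pixels.

```python
# Nyquist sampling arithmetic for a projected line-pair pattern.
# Assumes "lines" means line pairs (cycles); that reading is mine, not the poster's.
def max_cycles(pixels_across: int) -> int:
    """Most alternating light/dark cycles a row of pixels can display (Nyquist)."""
    return pixels_across // 2

def pixels_needed(cycles: int) -> int:
    """Minimum pixels required to display a pattern of `cycles` line pairs."""
    return cycles * 2

print(max_cycles(2048))      # 1024 -> roughly the "1,000 is all you get" figure
print(pixels_needed(1500))   # 3000 pixels needed for a 1,500-cycle pattern
```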


I would add that, even though it's digital, sensor technology will eventually run into a wall of physics. You can't keep decreasing the size of photosites forever in order to gain resolution, nor can you keep increasing their sensitivity and/or capacity for ISO and dynamic range.

 

Where that wall is, I have no idea.

The problem, simply, is that extending the dynamic range would require noise-free digitization of microvolt-level signals, with ADCs that have to operate at levels of at least one volt.

 

The maximum voltage a silicon photosite can charge up to is 600 millivolts; after that it "overflows".

So if your maximally illuminated photosites are just at the point of clipping, then for a 16-stop dynamic range the darkest pixels would have to produce about 9 microvolts. Sampling and digitizing a voltage that small is like trying to pick up grains of sugar with barbecue tongs. There is a minimum noise level in all conductors, called "thermal noise", which at low levels tends to mask the desired signal. The same phenomenon also contaminates the switching signals doing the conversion, with the overall result that 14 stops is the very maximum range possible with a silicon sensor. Claims of higher dynamic range are usually the result of noise masking, not low inherent noise.
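
For what it's worth, here is that arithmetic worked through, along with the standard Johnson (thermal) noise formula. The resistance and bandwidth in the noise line are purely illustrative values of my choosing, not sensor specifications; with those numbers the noise floor lands between the 16-stop and 14-stop signal levels, which is the shape of the argument above.

```python
import math

# Worked version of the arithmetic in the post above.
FULL_WELL_V = 0.600                      # 600 mV photosite ceiling (figure from the post)
for stops in (14, 16):
    darkest = FULL_WELL_V / 2 ** stops   # signal of the darkest resolvable stop
    print(f"{stops} stops -> darkest pixel ~ {darkest * 1e6:.1f} microvolts")
# 16 stops -> ~9.2 uV, matching the "about 9 microvolts" above; 14 stops -> ~36.6 uV.

# Johnson (thermal) noise of a resistance R over bandwidth B: v_rms = sqrt(4*k*T*R*B).
# R and B below are illustrative values I picked, not measurements of any sensor.
k_B, T, R, B = 1.38e-23, 300.0, 2e3, 10e6
v_noise = math.sqrt(4 * k_B * T * R * B)
print(f"thermal noise ~ {v_noise * 1e6:.0f} microvolts rms")   # ~18 uV for these values
```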

 

It is simply not possible for a silicon sensor to achieve much more dynamic range than is currently achieved.

There is no Moore's Law for optical sensors; the limit was reached some time ago.


One of these days, Keith, I would love to sit down with you over a few good cases of beer and go into deeper detail on all these things.

The sad thing is, I started working with silicon sensor cameras well over 30 years ago, and I knew exactly what all the limitations were even back then.

What's even sadder is that all the math covering Nyquist limits and maximum possible signal-to-noise ratios can be found in 1930s textbooks...

Then there was the fact that I worked for a major bulk film duplicator with an in-house chemical plant that looked like a mini oil refinery, and not a single person in the place (except me) had more than the vaguest idea how colour film actually works.

Then there are the guys I know at one of our major TV networks, who quite cheerfully admit that none of them have any real idea how MPEG works, or how the DVB-T digital transmission system works either. In days gone by, just about everybody in the Engineering Department would have been conversant with how just about everything in the TV station worked...


The problem, simply, is that extending the dynamic range would require noise-free digitization of microvolt-level signals, with ADCs that have to operate at levels of at least one volt.

 

The maximum voltage a silicon photosite can charge up to is 600 millivolts; after that it "overflows".

So if your maximally illuminated photosites are just at the point of clipping, then for a 16-stop dynamic range the darkest pixels would have to produce about 9 microvolts. Sampling and digitizing a voltage that small is like trying to pick up grains of sugar with barbecue tongs. There is a minimum noise level in all conductors, called "thermal noise", which at low levels tends to mask the desired signal. The same phenomenon also contaminates the switching signals doing the conversion, with the overall result that 14 stops is the very maximum range possible with a silicon sensor. Claims of higher dynamic range are usually the result of noise masking, not low inherent noise.

 

It is simply not possible for a silicon sensor to achieve much more dynamic range than is currently achieved.

There is no Moore's Law for optical sensors; the limit was reached some time ago.

Keith is dead on here. One thing that many people who aren't savvy about electrical engineering don't realize is that there is a certain amount of variance in the way current actually flows. You can't just keep "dividing down the precision of the light voltages" to achieve more DR; the tolerances of voltage "error" have to be taken into account. This is why it is important to remember that these sensors are not truly digital acquisition devices. If they were, they could achieve precision limited only by the bit length. But since they are not (they are analog), they have to work within that range of analog values and take precision errors into account.
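
As a rough illustration of that point (my own sketch, using an assumed analog noise floor rather than any measured figure), adding ADC bits past the noise floor stops buying usable stops:

```python
import math

# Once the analog noise floor dominates, extra ADC bits stop adding usable
# dynamic range. All numbers here are illustrative assumptions.
FULL_SCALE_V = 0.6        # photosite ceiling figure from the posts above
NOISE_RMS_V = 20e-6       # assumed analog noise floor (~20 uV rms, illustrative)

def usable_stops(bits: int) -> float:
    """Stops between full scale and the larger of (one ADC step, the noise floor)."""
    step = FULL_SCALE_V / 2 ** bits          # size of one quantization step
    floor = max(step, NOISE_RMS_V)           # can't resolve detail below the analog noise
    return math.log2(FULL_SCALE_V / floor)

for bits in (12, 14, 16, 20):
    print(bits, "bit ADC ->", round(usable_stops(bits), 1), "usable stops")
# 12 -> 12.0, 14 -> 14.0, 16 and 20 -> ~14.9: the extra bits just digitize noise.
```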


The sad thing is, I started working with silicon sensor cameras well over 30 years ago, and I knew exactly what all the limitations were even back then.

What's even sadder is that all the math covering Nyquist limits and maximum possible signal-to-noise ratios can be found in 1930s textbooks...

 

Go on...

 

How do the Nyquist limit and aliasing apply to image capture? Light likes to pretend it's a wave, but aren't we really just counting the number of photons that strike a certain area? And how is film immune to this? What is the maximum possible signal-to-noise ratio?


Just as frequency describes how amplitude varies over time, frequency can be defined over space, i.e. spatial frequency: white line, black line, white line, black line = a square 'wave'...

 

With some basic tweaks in your head, all the literature written in terms of signals over time can be applied to spatial frequencies :)
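
A small sketch of that mapping (my own example, with made-up numbers): sample a black/white bar pattern with too few pixels per cycle and the count of visible bars comes out wrong, which is spatial aliasing.

```python
import math

# Sample a black/white bar pattern (a spatial "square wave") at two pixel pitches.
# Below two samples per cycle, the reproduced pattern aliases to a coarser one.
def sample_bars(cycles_per_mm: float, samples_per_mm: int, width_mm: float = 1.0) -> str:
    """Render one row of pixels looking at an alternating bar pattern."""
    pixels = []
    for i in range(int(samples_per_mm * width_mm)):
        x = (i + 0.5) / samples_per_mm                      # centre of this pixel, in mm
        bright = math.sin(2 * math.pi * cycles_per_mm * x) >= 0
        pixels.append("#" if bright else ".")
    return "".join(pixels)

print(sample_bars(10, 40))  # 4 samples per cycle: all 10 cycles reproduce
print(sample_bars(10, 12))  # 1.2 samples per cycle: aliases to a 2-cycle pattern
```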


Film, at least at the level of abstraction relevant to the discussion, simply isn't *intrinsically* defined in a way that lets the same methods be applied. You can compare the relative outcomes of film with itself, or perhaps even with digital given a suitable testing regime, but that is *applied* to it and not inherent in its nature.

 

Heh, perhaps I went a bit too abstract after all...


The numbers discussion is interesting but ultimately what matters is how things look on the big screen. 35mm film looks gorgeous, has wonderful dynamic range and color, and a contact print that is projected has better blacks than any digital projector on the market... But I've shot 35mm, including anamorphic, and I've shot a number of digital cameras, and I've worked on these images in post in various ways and I've seen them on big screens, and it's my opinion that resolution is not the primary reason to shoot 35mm film over something like the Alexa, Sony F65, or Red Epic. There are plenty of other reasons, including the most general one, which is that film looks nice and it's flattering to actors, and I'm sure there are plenty of personal reasons too to prefer film... but resolution? And what's the point of beating up on the Sony F900?


I think the best thing about film is that the look of the final image is NOT defined by the camera system you use, just the film stock and lenses.

You can use dozens of different cameras and get almost identical-looking results, so you don't have to live with a look you don't like just because the producer or the edit/post people demand that the material be shot with a certain camera model and format for budget and workflow reasons.


I think people confuse optimal scanning resolution with measurable detail, hence some people saying that 35mm film is 6K or 8K just because there may be some anti-aliasing benefits to scanning at that high a level of resolution. But I can't find any published line resolution chart that shows a piece of 35mm movie color negative film, 24mm wide, resolving 4K or higher... you would think by now that someone would show that.

 

A look at the Kodak data sheets for 'popular' movie films, such as the 500T 5219, gives 50% MTF ratings for the R, G, and B records of 35, 50, and 70 cycles/mm respectively.
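
As a rough conversion of those figures (my own arithmetic, assuming a Super 35 capture area about 24.6 mm wide; note that the 50% MTF point is not the same thing as extinction resolution):

```python
# Convert a film MTF figure in cycles/mm into an equivalent horizontal pixel count.
# Gate width is an assumption (Super 35, ~24.6 mm); 50% MTF is not extinction resolution.
GATE_WIDTH_MM = 24.6

def pixels_at(cycles_per_mm: float, gate_mm: float = GATE_WIDTH_MM) -> int:
    cycles_across = cycles_per_mm * gate_mm    # line pairs across the frame width
    return round(cycles_across * 2)            # two pixels per cycle at Nyquist

for layer, c in (("R", 35), ("G", 50), ("B", 70)):
    print(layer, pixels_at(c), "pixels across at the 50% MTF point")
# R ~1722, G ~2460, B ~3444 -- i.e. roughly 1.7K to 3.4K depending on the record.
```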

 

Unfortunately, many digital sensors don't have published MTF ratings... but if one just took the simplistic 'pixel resolution', one would potentially have MTFs higher than that of film...

 

However, the lens figures into the resulting image MTF, as well as sensor quality... and, as experience shows, it can swamp any high-resolution capture medium...

 

One item that is not mentioned much is that the grain of film acts as a 'randomizer', and as such 'fuzzes' edges, which reduces jaggies, leading to a different appreciation of the resulting image. On the other hand, it is also known that if the 'grain' is sharp, then in some situations the resulting image is perceived by humans as 'sharper' than one where the grain is 'soft' and perhaps even has a higher MTF.


Why do the comparisons always come back to resolution? I have yet to see any professional colorist say they prefer grading digital footage over film. When 95% of biologists accept evolution, people accept it. If 100% of colorists seem to favor film as a medium, what does that tell you?


 

A look at the Kodak data sheets for 'popular' movie films, such as the 500T 5219, gives 50% MTF ratings for the R, G, and B records of 35, 50, and 70 cycles/mm respectively.

 

Unfortunately, many digital sensors don't have published MTF ratings... but if one just took the simplistic 'pixel resolution', one would potentially have MTFs higher than that of film...

 

I find interpreting MTF difficult. Technicians often find the contrast value at lower frequencies useful. I always wonder what the highest visible frequency would be at the lowest possible contrast; I think that contrast is around maybe 15%. The graphs don't normally go there, but at 15% contrast the frequency should go well beyond what you are quoting.

 

I think a lot has been written on the relation between MTF and pixel density. I don't think pixel columns are literally comparable to the sinusoidal grey wave that gives the MTF, nor even to the black and white lines of the lp/mm concept.
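
For what it's worth, the usual first-order link between the two (a textbook result, shown here with an illustrative pixel pitch of my own choosing) is that a square photosite of pitch p contributes a sinc-shaped MTF and sets a Nyquist limit of 1/(2p); the real system response is lower once the lens and optical low-pass filter are included.

```python
import math

# Rough link between pixel pitch and MTF: a square pixel aperture of pitch p
# has MTF(f) = |sinc(f * p)|, and the sensor's Nyquist limit is 1 / (2p).
# The pitch below is illustrative, not any specific sensor.
def pixel_aperture_mtf(f_cycles_per_mm: float, pitch_mm: float) -> float:
    x = math.pi * f_cycles_per_mm * pitch_mm
    return 1.0 if x == 0 else abs(math.sin(x) / x)

pitch = 0.006                      # 6-micron photosites (illustrative)
nyquist = 1 / (2 * pitch)          # ~83 cycles/mm for this pitch
for f in (20, 40, nyquist):
    print(f"{f:.0f} cy/mm -> MTF {pixel_aperture_mtf(f, pitch):.2f}")
# The geometric pixel response alone is ~0.64 at Nyquist; the optical low-pass
# filter and lens pull the real system figure well below that.
```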


Why do the comparisons always come back to resolution? I have yet to see any professional colorist say they prefer grading digital footage over film. When 95% of biologists accept evolution, people accept it. If 100% of colorists seem to favor film as a medium, what does that tell you?

 

The comparison is specious. I would sort of wonder about such an assertion at this point in time, when most if not all major Hollywood productions are scanned for a DI these days. Hence the 'colorist' is working with a digital image.

 

It may be that back in the olden days, 100% of the then Hollywood 'colorists' (were there 'colorists' as they are known today, back when one had no real method to 'change' things, except for 'timing', or more exotic processes like 'preflashing' or, more recently, 'bleach bypass', etc.?)... anyway, it may have been that 100% of those working at such things preferred some ASA 50 film rather than push-processing Double-X to eke out a 400 ASA rating... with the attendant increase in grain...

 

I personally wouldn't mind a 4x5 sized movie film negative... but the economics of the situation do not allow such...

 

The question then is what level of image parameters produces acceptable image quality.

 

I think digital, for the 'high-end cameras', is approaching the 'resolution' parameter of film, if not exceeding it.

 

I think the next 'big thing' is the issue of bit depth and the resulting dynamic range, especially at the 'low end'... the brackish water of DSLR filmmaking that I swim in...


@jeclark2006

 

Oh, I meant to add: back on that lighting thread about the inverse square law, someone told us that the necessary "flux" concept was missing. I had a laugh and wondered momentarily whether I should brush up on my surface integral calculations...


Just as frequency describes how amplitude varies over time, frequency can be defined over space, i.e. spatial frequency: white line, black line, white line, black line = a square 'wave'...

 

With some basic tweaks in your head, all the literature written in terms of signals over time can be applied to spatial frequencies :)

 

Thanks for the reply. This helps me a bit, conceptually. How does this apply when not shooting unevenly lit brick walls? Film?


 

Thanks for the reply. This helps me a bit, conceptually. How does this apply when not shooting unevenly lit brick walls? Film?

 

Eyelashes with mascara usually form some sort of 'contrast', which gives the impression of 'sharp' focus... or the lack thereof...

 

In some situations the sharpness of the system is degraded: haze/soft-focus filters, etc., or, in the case of digital, 'anti-aliasing' filters to avoid jagged lines on image detail.
