
Tired of hearing "Film is Dead?" Well So Are We!



Film makes a very clever trade-off between frequency response on the one hand and dynamic range on the other. From a purely mathematical point of view we can separate out frequency response and dynamic range, i.e. define them independently of each other - compartmentalise them. And this component-based approach works really well in the domain of digital frameworks. What doesn't always work so well is recombining the components. They don't always go back together as easily as they were separated out in the first place. The problem facing all the King's Men with Humpty Dumpty.

 

In film, the amount of light reaching the screen is a function of the film's density. But what exactly is density? It isn't something that can be measured at a single point. Nor is it something that can be measured through an area measurement (such as that done by a sensor cell or 'pixel'). The density of film is a statistical property. It has as much to do with the distribution of measurements as it does with a measurement at any single point.

 

Now if all the silver bromide particles in film were the same size as the smallest used in conventional movie film, the resulting image would be 10X the resolution of 4K digital. But the result would be extremely high in contrast. This is where the trade-off occurs. Film trades resolution for dynamic range - for the ability to encode a good distribution of grey tones. A side effect of this is the effect we notice and call grain. What we don't readily acknowledge is the benefit of the trade-off. The trade-off doesn't completely sacrifice resolution. It trades some of it, but it trades across the entire frequency band in a completely random and even-handed way. It doesn't trade all the high frequencies everywhere. At any given point the trade could be a high frequency or a low frequency, or anything in between. It is completely random. And right next door, at scales far below anything a sensor cell ever sees, is another trade, also completely random.
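To put a rough number on the trade-off, here is a small Python sketch. Nothing in it comes from a real emulsion - the grain sizes and the 1 - exp(-exposure * area) development model are just assumptions for illustration - but it shows how a population of identical grains snaps from clear to dense over a narrow exposure range, while a mix of grain sizes ramps up over many more stops:

import numpy as np

rng = np.random.default_rng(0)

def density_response(exposures, grain_areas, n_grains=20000):
    # Mean developed fraction ("density") versus exposure for a grain population.
    # Assumed model: each grain develops with probability 1 - exp(-exposure * area),
    # so bigger grains respond at lower exposures than smaller ones.
    areas = rng.choice(grain_areas, size=n_grains)
    response = []
    for exposure in exposures:
        p = 1.0 - np.exp(-exposure * areas)
        developed = rng.random(n_grains) < p        # random outcome per grain
        response.append(developed.mean())           # density is a statistic, not a point value
    return np.array(response)

exposures = np.geomspace(0.01, 100.0, 25)
uniform_fine = density_response(exposures, grain_areas=np.array([1.0]))
mixed_sizes = density_response(exposures, grain_areas=np.array([0.2, 1.0, 5.0, 25.0]))

for e, u, m in zip(exposures, uniform_fine, mixed_sizes):
    print(f"exposure {e:8.2f}   uniform-grain density {u:.2f}   mixed-grain density {m:.2f}")

The uniform population behaves like a very sharp, very high-contrast material; the mixed population gives up some of that sharpness but spreads its response over a much wider range of exposures.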

 

What this means is that film retains a good deal of the high-frequency components, but in a way that is difficult to localise technically. Our brains, however, can see it.

 

The following type of image is really quite useful as a guide to understanding this. If you were to take a density measurement at any single point, all you would get is one of two values: it is a 1-bit image. But in a way analogous to how film and perception work, a grey-scale image is reconstructed, because it is not just the density at any given point that defines the image. It is also the distribution of those densities, and it's not a distribution that can be localised to any given area or point. While this can be demonstrated here using a digital image, it is far more powerfully demonstrated in film.

 

[Attached image: cat0-variant.png]
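For anyone who wants to reproduce the demonstration for themselves, here is a minimal Python sketch (using a synthetic grey ramp as a stand-in for the attached picture, which it is not). Every pixel stores only a 0 or a 1, yet averaging over a small neighbourhood brings the grey tones back, because the information lives in the distribution of values rather than in any single one:

import numpy as np

rng = np.random.default_rng(1)

# A synthetic grey ramp (0..1) standing in for any grey-scale source image.
height, width = 64, 256
grey = np.tile(np.linspace(0.0, 1.0, width), (height, 1))

# "Encode": each pixel becomes 0 or 1, with probability equal to its grey level.
# No single pixel carries a grey tone; only the local distribution of 0s and 1s does.
one_bit = (rng.random(grey.shape) < grey).astype(float)

# "Decode": a crude 8x8 box average recovers the ramp from the 1-bit data.
block = 8
recovered = one_bit.reshape(height // block, block, width // block, block).mean(axis=(1, 3))

print("grey levels in the source :", np.round(grey[0, ::32], 2))
print("recovered from 1-bit data :", np.round(recovered[0, ::4], 2))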


Kodak is coming out of bankruptcy and will be trading in two weeks' time.

 

http://www.bbc.co.uk/news/business-23774639

Yeah, I read that in the Hollywood Reporter. Apparently the judge fast-tracked it, forced it through, whatever. I would guess NOBODY wants film to "die". I would also bet the powers that be wanted to make sure the supply of film is maintained for all those movies being shot on this "dead" medium!!


What I'd like to see manufactured in the digital sensor universe is an alternative approach to sensor design, one which takes some ideas from film and implements them. The first thing I'd recommend is making sensor cells of different sizes and in a random configuration rather than a raster layout. This would immediately eliminate moiré issues without having to do any filtering. The second, rather counter-intuitive, idea is to decrease the number of bits per sensor cell - the purpose of which is not to degrade the image but to relax the constraints otherwise limiting the manufacture of smaller-area sensor cells.
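As a sketch of what such a layout might look like, here is some Python that scatters cell centres at random and lets each cell own whatever region of the frame is nearest to it (an irregular, Voronoi-like tiling). Everything here - the number of cells, the test pattern, the grid resolution - is made up purely for illustration; readout, noise and colour are all ignored:

import numpy as np

rng = np.random.default_rng(4)

n_cells = 200
centres = rng.random((n_cells, 2))                  # random cell positions in a unit square
resolution = 64
ys, xs = np.mgrid[0:resolution, 0:resolution] / resolution

# Assign every point of a fine sampling grid to its nearest cell centre,
# giving irregularly shaped cells with no raster structure.
grid = np.stack([xs.ravel(), ys.ravel()], axis=1)
distances = ((grid[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
owner = distances.argmin(axis=1).reshape(resolution, resolution)

# A fine periodic test pattern of the kind that produces moiré on a regular raster.
scene = 0.5 + 0.5 * np.sin(2 * np.pi * 20 * xs)

# Each cell reports a single value: the average of the scene over its own footprint.
cell_values = np.zeros(n_cells)
for i in np.unique(owner):
    cell_values[i] = scene[owner == i].mean()
sensor_image = cell_values[owner]

print("scene mean:", round(float(scene.mean()), 3), " sensor mean:", round(float(sensor_image.mean()), 3))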

 

Consider for a moment a 2-bit sensor cell that is 1 mm x 1 mm square (for the sake of argument). Such a cell would be capable of capturing 4 light levels (0, 1, 2, 3). Now let's replace it with four 1-bit sensor cells, each of which is 0.5 mm x 0.5 mm.

 

For the same area, the number of light levels captured is slightly increased. Each of the smaller cells can only register two light levels (0, 1), but their sum over the 1 mm x 1 mm area gives a value between 0 (0,0,0,0) and 4 (1,1,1,1) - five levels rather than four. Not only that, but one also increases the frequency response (sharpness).
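Here's the same thought experiment as a few lines of Python, just to show the counting (the 16x16 grid and the ideal, noiseless thresholds are assumptions of the sketch, not a sensor model):

import numpy as np

rng = np.random.default_rng(2)

# Light falling on one 1 mm x 1 mm patch, sampled on a fine grid (values 0..1).
patch = rng.random((16, 16))

# Option A: a single 2-bit cell covering the whole patch -> one of 4 output levels.
two_bit_reading = min(int(patch.mean() * 4), 3)

# Option B: four 1-bit cells, each covering one 0.5 mm x 0.5 mm quadrant -> each says 0 or 1.
quadrants = [patch[:8, :8], patch[:8, 8:], patch[8:, :8], patch[8:, 8:]]
bits = [int(q.mean() > 0.5) for q in quadrants]

print("2-bit cell reads :", two_bit_reading, "(one of 4 levels, no idea where in the patch the light fell)")
print("1-bit cells read :", bits, "-> sum", sum(bits), "(one of 5 levels, plus a coarse idea of where the light fell)")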

 

By using cells of varying sizes one can improve the dynamic range. Indeed cameras such as the Alexa, as I understand it, have already caught on to this idea, using two or more different-sized cells.

 

Now one of the problems with a random configuration is that it can introduce grain, and it would be a fixed pattern. However, insofar as the pattern can be calibrated (unlike film), it can also be cancelled out. If the position, size and shape of each sensor cell is known (as it would be, or can be), a custom filter can be used to cancel out the grain it would otherwise produce as a side effect. The trade-off is that such a filter could also reduce definition.
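Cancelling a known fixed pattern amounts to what photographers already do with flat-field correction: expose the sensor once to a uniform field, store each cell's response, and divide it out of every subsequent frame. A minimal Python sketch, assuming nothing more than a per-cell multiplicative gain:

import numpy as np

rng = np.random.default_rng(3)
shape = (32, 32)

# The fixed pattern: each cell has its own gain, set once by its (random) size and position.
cell_gain = 1.0 + 0.2 * rng.standard_normal(shape)

def read_sensor(scene):
    # Simulated raw frame: the true scene modulated by the fixed per-cell gain.
    return scene * cell_gain

# Calibration: photograph a uniform (flat) field once and keep the result.
flat_field = read_sensor(np.ones(shape))

# Correction: divide every raw frame by the stored flat field.
scene = rng.random(shape)
raw = read_sensor(scene)
corrected = raw / flat_field

print("worst-case pattern error before correction:", float(np.abs(raw - scene).max()))
print("worst-case pattern error after correction :", float(np.abs(corrected - scene).max()))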

 

A way to eliminate a fixed grain structure (rather than grain itself) would be to change the pattern on each frame. One could have a number of sensors, each of which is used in alternation. However this wouldn't completely solve the problem as the pattern of each sensor would emerge as a statistical observable. A better idea is to have a unique sensor pattern on each frame.

 

This could be done by manufacturing a ribbon of sensors (much like film).

 

One way to create a ribbon of low bit sensors would be to use nano-technology. For example one could use light sensitive silver bromide particles suspended in cellulose acetate.

 

C


 

....One way to create a ribbon of low bit sensors would be to use nano-technology. For example one could use light sensitive silver bromide particles suspended in cellulose acetate.

 

C

 

 

Is this a joke? We are led to imagine a very long ribbon of acetate with an emulsion of silver bromide stuck to the front of the cellulose acetate (with sprocket holes).

 

I think the idea of pixels with randomized size and position, or "pixels" that can somehow manifest structure of that sort, different for each frame......is actually quite a reasonable and obvious aspiration. But modern sensors are headed towards just smaller and smaller pixels. A very different direction. A sensor with randomized pixels that vibrated (the whole sensor) like the Aaton might be an achievable approximation with current technology.

 

Photons, more densely laden with information than one can imagine, drape themselves onto a unit of silver compound. The image is attempting to make a direct impact upon the emulsion. At least at that point we have what seems to me an astonishing photographic "fact". Compared to that, a sensor has each pixel averaging a photon count. The photons, given the information they must convey from the subject, seem enormously sophisticated, but are reduced to a trickle of zeros and ones. Then that information requires effectively a simulation to make it approximate to the look of film emulsion.

 

Photography, with a film emulsion, seems an enormously sophisticated thing. The digital sensor in comparison seems quite brutal and crude, and this fact won't change with more pixels. If you could, from the sensor output, simulate an image that one could not distinguish from film (and maybe it already happens), why would you do it? Because it's new and exciting, because history demands it, because this is now normal.

 

I think cinema contains two things that are oddly opposite to each other but which somehow synergize very well. On one hand there is what I called the photographic fact. Images of the physical world impacted upon a medium in literal fashion. On the other hand, everything including the technological means of realization can be a fiction, smoke and mirrors to create an experience. Digital images have no problem enabling the second part, but they aren't capable of the first part. They are not even attempting to do it.

 

Historically, we are discarding the real for a simulation. It really sux.

 

Oh well, just venting or riffing I guess.

 


 

 

....Is this a joke? We are led to imagine a very long ribbon of acetate with an emulsion of silver bromide stuck to the front of the cellulose acetate (with sprocket holes)... Oh well, just venting or riffing I guess....

 

 

What we need is to get a bit more back to basics on how an image works. Indeed, what is it? Why is it theorised as an effect? Or as an illusion? That's part of the problem. Two thousand years of ancient superstition in relation to images - that images are illusions. And the strange impulse to maintain this proposition.

 

Re. the sensor ribbon: no, this is not a joke. I'm serious, although not without a certain ironic point being made. The sprocket holes aren't necessary. A simple, cheap stepper motor can replace the entire complicated mechanics otherwise associated with sprocket holes.

 

C

Edited by Carl Looper

One of the unique features of film - and I'm thinking of film projection here - is that as one takes in the image, one's eyes are not in any fixed position with respect to the image (and not just film images). They move around. A rich resonance takes place between the random arrangement of silver/dye in the film and the random arrangement of the retinal cells in one's eyes. The fact that as one's eyes move around we don't see the image as moving around (i.e. in the opposite direction) is technically very strange, but understandable from an evolutionary point of view. Our brain corrects (somehow) for the motion of our eyes, as it would need to. The result is that the image is not in any particular retinal pattern at any given moment, but is reconstructed as independent of such. It hovers in its own strange frame of reference. A parallel universe hovering in relation to the more technical ones of retinal cells, sensor pixels, or silver/dye clouds.

 

The image is the result of decoding what is encoded in film, digital screen, or retinal screen.

 

But what is encoded? It is an image. It is an image encoded. It is an image decoded. Between these two images is just a technical transit lounge, be it by means of the density of silver in film, or numbers stored on a card, or any other means of maintaining data. However the data itself is not an image, or shouldn't be confused with such. An image is that which is visible. It is that which we see, rather than that which we can't. Even in film the image is not there in the film. What is in the film is the data necessary to reconstruct an image. The image is that which occurs as a function of decoding the data in the film. One shines light through the film, or holds the film up to a light. Only then is an image created (or decoded). Until then the image is asleep, or hidden, in its own parallel universe.

 

But what of the image which precedes the act of encoding - that which falls on the sensor, or on the film? Is that not invisible? Certainly we can see something similar to that image when looking through a viewfinder, or even without one (a parallax viewfinder, or not even that). But what if we're shooting blind? If an image is that which is visible, to whom or what is the image visible if we are filming blind? How can it be an image that is encoded? Surely it is something else. Either that, or we can't define an image as that which is encoded, or we can't define an image as that which is visible.

 

However this depends on whether an image, to be visible, requires an observer.

 

We can, with some difficulty, define an image/visibility as that which does not necessarily require an observer prior to encoding/decoding. The image which is decoded (and constitutes a sensory experience in ourselves) can be regarded as a reconstruction of this much stranger image. We can say that what precedes encoding is the not-encoded (or not-yet-encoded). It is then encoded, by means of technology. And this is subsequently decoded (by a reciprocal technology). The encoding technologically relates the not-encoded to the decoded. Are the not-encoded and the decoded the same thing? In the limit they could be. A fully decoded image could become the not-encoded. But we can also say that the encoded, which sits between them, is capable of interfering with this equation - to such an extent that what is decoded never existed as a not-encoded.

 

There is only the decoded image. The not-encoded remains a conjecture. A proposition.


http://www.washingtonpost.com/lifestyle/travel/a-new-trend-in-photography--using-film/2013/08/22/76486a16-08ec-11e3-b87c-476db8ac34cd_story.html

 

The Washington Post thinks using FILM is a new trend in photography these days. LMAO! New trend? It was always there. Maybe the younger generation in their 20s, who may never have seen a film camera before in their lives, are now getting attracted to it.

 

ALSO,

 

FUJIFILM has launched a new Polaroid-type camera called the instax mini 90.

 

And we are discussing whether film is dead?


 

.....The image is the result of decoding what is encoded in film, digital screen, or retinal screen....

 

 

Carl,

Lots of interesting things. I started writing a response to your previous post and was called away. Now speed-reading the above. Perhaps I can write something in my sleep. I had one thought....

 

The impression that the photons make upon the emulsion in the camera is not an encodement. It is a literal impression. I call it factual, trying to give a sense of its realness. It is palpable evidence, resulting from direct connection with the thing being photographed. This literal impression is factual as soon as the frame is exposed. All that chemical processing does is render it visible again to the eye.


 

....The impression that the photons make upon the emulsion in the camera is not an encodement. It is a literal impression....

 

Rather than a factualism I'd suggest "realism". The factual acts as a kind of support for it. Factualism as that which participates in the transit of an image, but which is not, in itself, quite an image. But factualism as that which is nevertheless an important operator - more so than non-factualism.

 

To speak of the emulsion encoding an image is just a way of speaking. We could just as well say factualising the image. A sensor or film emulsion encodes/factualises an image, by which is meant it transforms the image from something visible into something invisible or hidden. It buries the image. Encryption. Entombment. A projector/screen/eye/brain performs the reciprocal operation - it decrypts, or decodes, the image, by which is meant it reverses the transformation, brings the image back from the invisible, via the same factual, into the visible. It makes visible again that which was hidden or in transit.

 

 

 

Realism, but one that treats the image as real, rather than as an illusion. An image as that which impresses itself on film, or on a sensor, in a factual way. The image, not as a representation of something other than an image. The image itself as a reality.

Edited by Carl Looper

In the film/digital debate I don't want to pit one against the other. Each works with the image in its own way. Each makes its own trade-offs. Film has its way and digital its way. Film has a future, but one that is not as big as it once was. Star Wars might be big, but film (as a market) has been in decline for decades (or dead for decades, is another way of putting it); that doesn't mean it doesn't have a future, or that it hasn't had one. Film is still a very easy system to manage. It does not depend as much on a big tech infrastructure to support it. It has its origins at the beginning of the industrial revolution rather than in the middle of it. It is therefore, when compared to contemporary technology at least, relatively simple to recreate. Perhaps not as simple as pre-industrial technology (such as paint/painting).

 

What has always been important in any art is not so much the technology (as much as we love it - I can't wait for quantum computers and teleportation) but the creativity which one then applies to it. The simpler the technology, the more space is arguably created for creativity. However, the technology is not neutral. An artist working with film learns its secrets and how to make it work. The same goes for digital. Or it should do. The best work in film, or digital, is by those who come to terms with their nominated materials and make them work within, or despite, or because of, their limits. Digital is, however, very opaque to most. It becomes easier not to understand it - to become no more than an appendage. A pawn in the process. To become almost irrelevant. The astronaut, on life support, dependent on a thousand technicians, ensuring the machine all works according to plan. The astronaut as an appendage. As an actor required to step down the ladder and plant a flag. And yet this isn't the case at all. There is Neil Armstrong, stretching his thirty seconds of fuel to the last moment. Or the sheer creativity that the Apollo 13 astronauts (and ground crew) relied on to bypass the machine and make it do what it wasn't engineered to do. That required an intimate knowledge of the technology they were working with.


....The Washington Post thinks using FILM is a new trend in photography these days... and we are discussing whether film is dead?....

 

Film has a past. A huge past, which means, by sheer inertia if nothing else, it has a future. For the young, the past is just as new as the present. Indeed more so, because the past (and the future) is always much bigger than the present. There is more to discover. The young are just able to see this more clearly.

 

C

Edited by Carl Looper

In the film/digital debate I don't want to pit one against the other.

I don't really want to force a debate about it either. But these two things are not equivalent, or things which we could choose between as alternatives based on such apparently objective criteria as cost, practicality or (less objectively) "aesthetics"... I think it is of a similar order to the issue of replacing food, or eventually our bodies, with new versions that are genetically engineered. As or when this happens it will be assumed to be inevitable change - by some, progress. With this one innocent word (progress) we mask a genuinely ugly tendency as a simple, practical inevitability.

 

If we can't tell the difference between the value of an image on film and an image from a digital sensor, does the cinema still have meaning? Have we just leaped forward to some new thing, where what was once the cinema is now something else? And all is OK. Money is made. People are excited about the technological possibilities. The new becomes.....

 

Then one day we are walking in a museum and we see a glass plate photograph. It will be like a kick in the guts. An instantaneous recognition of something, someone, another time and place. But given with such certainty. And we may wonder why the digital images in the new cinema lack this.

 

Alternatively, walking in the museum, noticing the glass plate photographs, one might feel some affinity with simulations one had recently seen. But there is a dull ache. The instantaneous recognition of the "real", the "fact" of the photograph, the moment of experience that almost occurred, but did not occur. Because our nervous system, our style of seeing, was de-natured, conditioned to something else, something less. So what was once considered an indication of the real, something priceless, became as if something remembered from a dream, an abstract connection that one could not quite make. But this thing that one hungered for in the museum was something we already owned. The idea that anyone or anything would take it away or obscure it is obscene.

 

But this obscene trend is an historic inevitability. We are losing our natural ability to make a direct connection with the universe around us, and the ability to better understand it in the abstract through photography - through film in the cinema.

 

 


I hate these conversations. This kind of argument wouldn't hold up as well with any other artist. Would you tell an illustrator that illustrating digitally is what he or she should be doing? "Hey, buy a Mac and make your art digitally. Those paints and paintbrushes are expensive, and then you have scanning fees." Isn't that a moot point? If an illustrator has chosen paint, watercolors, colored pencil, charcoal, or whatever inspires that artwork, who are you to question that?

Give this man an award... I'm being serious. Wally Pfister himself said the same exact thing in an interview two years back.

As much as I've sought to distinguish something like film from digital imaging in terms of the technology (by reference to other technologies such as those used in painting, or in ceramics, etc.), there is also that which is common to both film and digital imaging and complicates the debate. Indeed it is the very reason for the debate. And it has its origins in photography.

 

When photography arrived it was immediately grasped as a new art. Many thought it would be the death of painting. As ludicrous as this might sound, it was also quite true - but what died was not painting as such, rather a particular type of painting. Painting itself continues and has branched out into all sorts of styles and approaches. And of course the technology of paint is not in any way altered by film. Paints are still made the way they've always been made, and new types of paint are invented.

 

Photography introduces a very different kind of art from almost all the others. Indeed, calling it an art has sometimes proved quite difficult. Many traditional artists think there isn't much art to photography at all. They'd argue it's more science than art. However, even traditional art has always had a bit of a quasi-scientific approach to it. The rules of perspective, rediscovered during the Renaissance, were something that many artists got into and used in their work. Mozart wrote many pieces of music that he generated entirely by mathematical algorithms he had devised, i.e. entirely rule-based work. There wasn't such a separation between art and science as there might be today. The separation is, however, quite artificial and is a direct result of industrialisation and specialisation. It's not as if teamwork was invented at the beginning of the industrial revolution, but it gets a really big boost there. People become not just specialised but highly specialised, so much so that they have next to no idea of anything other than their particular role in the machine. This was not the case in pre-industrial times. One was educated (if one was) across all disciplines and expected to have a big-picture take on everything. The term "Renaissance Man" has come to mean this big-picture outlook.

 

But what does photography do? It radicalises the scene. It throws into question traditional concepts of what visual art means. What role is there for the artist (the traditional one) if a machine can paint a picture with more realism than an artist? It should be noted that an art movement called "Realism" was going on prior to (and following) the invention of photography. In other words, photography and realist paintings were coming into direct conflict with each other. And the conflict was over which rendered reality better.

 

Now what happens at this juncture is that certain photographers start making their photographs in ways that resemble studio paintings, using as many techniques as they possibly can to make a photograph look as if it were painted by a painter. Some of this photography is quite bizarre, and if it weren't for a certain novelty value it could be regarded, I think, as quite artless. The art of photography finds its feet in what photography itself was, rather than in what it wasn't.

 

An arguable difference emerges between that "photography" which is little more than raw material for a graphic artist (for better or worse) and photographic art proper.

 

The concept of realism also meets a fork in the road. There is the realism that traditional artists achieve using paint, pen, paper (or, today, a computer), and there is that which photography achieves. Realist theories of photography emerge and mature over the decades. It is realism that becomes an important attribute of photography. Not the "realism" that a graphic artist might occasionally aim at or otherwise achieve, but a different form of realism - a very new form - that which a photograph achieves, which a machine achieves.

 

It is this realism that impresses early commentators.

 

But here's the important thing for today. This realism of which early commentators spoke is not something that is intrinsically facilitated by photo-chemical materials. It can also be facilitated by video, or digital, or any other photo-mechanical machine.

 

Realism.

 

Something like digital photography emerges from a graphic artist's approach to photography - photography as raw material to facilitate a work of graphic art. It soon becomes obvious that digital photography can also operate in terms of photographic art (rather than just graphic art).

 

From scanning film to scanning life.

 

C

