
A Blast From The Past


Keith Walters


Just out of interest, who else quite enjoys lurking around these arguments?

 

(It's like watching professional wrestling)

 

It's beyond me that you'd take time out of your life to prove something to someone on the other side of the world, when this 'proof' will have no tangible benefit to anyone except, what? You're 'right'? What for? For this, what, thread?

 

Why don't we all just have a barbecue, beer, etc., or dress up in those foam sumo suits and bash it out?

 

Keith found an article that is worth sharing; at least it's interesting enough for me. And Karl, you've an intellect and an apparent depth of experience that are in turn worth sharing, but this reactionary stuff you're pushing is just plain, well... plain. I don't get your motivation here, but so as not to appear inconsistent, I'll admit to deriving a small pleasure from rubbernecking past the train smash of "nyah! nyah! blah blah blah - nyah! - et cetera"

 

I dunno, I guess I'm just suggesting that people take a step back and look at the larger context of these 'discussions'. What purpose do they serve for you personally?

 

I'm idle at work for moments during each day, 'tis all...

Edited by Chris Millar

It's beyond me that you'd take time out of your life to prove something to someone on the other side of the world, when this 'proof' will have no tangible benefit to anyone except, what? You're 'right'? What for? For this, what, thread?

There is a certain person who is probably, as we speak, twitching uncontrollably with the urge to post something, but who knows that if he does, he will probably live to regret it.

Actually, I just want him to post here so I can say: "Ah well; there goes the neighbourhood..." :lol:

Don't know why you can't understand that....


Just for you and Richard, though, I am going to put in some legwork and have, by next weekend, some solid facts on this trend that no one else here acknowledges.

 

Thanks. I was going to do it, but since you're doing it, I will await your findings. I acknowledge that digitally shot movies are in the theaters; I just want to know what the film vs. video percentages were for 2009.

 

Besides, I have a 35mm feature to finish. And another to plan.

 

R.


They used to make better mummies in Ancient Egypt before the Roman Empire conquered it. Then quality went downhill before the practice was discontinued.

 

I used to do some volunteer work with the California Museum of Ancient Art. They have a gilded mummy head from the Greco-Roman period (first four centuries CE) that is absolutely first class, and the cartonnage of Wen Nefer from the Second Intermediate Period, which is nowhere near as well done. Take pyramid building, for instance -- a couple of centuries of it, early in the Old Kingdom, and no projects anywhere near that scale ever again. The Egyptians had their share of ups and downs over a couple dozen centuries -- the two Intermediate Periods being very major downs, along with the Amarna period.

 

After all, this thread *is* called "A blast from the past" ;-)

 

-- J.S.


In any case, my point has far more to do with 2009 than your magazine article from 1994 does. So what? They jumped the gun in 1994, which is understandable considering that an *electronics magazine writer* probably had only slightly more knowledge of what film was and how it worked in 1994 than he would today: practically none.

 

The replacement of film cameras with digital ones is far more imminent now than it was in 1994, which in turn was far closer to seeing film production replaced than 1956 was.

 

I know getting the facts straight is not high on Karl's list of priorities, but in an effort to prevent this thread from veering off in the direction of other people violently disagreeing with statements I did not actually make, this is what I actually said in the first post:

 

(From the article)

"Sony then developed an HDTV recording and editing system, which was loaned to Hollywood as a replacement for film. But Hollywood didn't buy, finding film cheaper and easier to use with better picture quality"

 

Nothing unusual there, I hear you say.

 

Yes, but the magazine in question was the April 1994 issue of the now-defunct publication Electronics and Wireless World, which I found during a cleanout at work. He wrote that 15 years ago!

 

So, just to be clear on this, the *electronics magazine writer* in 1994 was not commenting one way or the other on whether video would replace film in 1994 or ever, only that the announcement of the imminent death of film had been made ten years before that, and it never actually happened.

 

The main thrust of the article wasn't about that anyway; it was more a comment on the "Indecent Haste" with which proponents of analog HDTV were trying to get their systems cemented into place before somebody produced workable digital systems, which in fact happened only a few years later.

 

Then, as now, many of the attempts at producing "Digital Cinematography" systems seem more directed at justifying certain people's existences than at making life easier for cinematographers.



Last night I watched "Knowing" on Blu-ray, which was shot with Red digital cinema cameras. I noticed that a lot of highlights were overexposed, which means that film has a lot better dynamic range.

 

Hi,

 

I was very disappointed with Knowing; focus was in the wrong place in many scenes & I found the motion rendering very hard to watch. Not sure what was wrong; I saw Ben Button on the same screen two weeks before & was most impressed with the images.

 

Stephen


Hi,

 

I was very disappointed with Knowing; focus was in the wrong place in many scenes & I found the motion rendering very hard to watch. Not sure what was wrong; I saw Ben Button on the same screen two weeks before & was most impressed with the images.

 

Stephen

Not to mention the shriekingly fake "Autumn" effect that looked like they'd sprayed the forest with Roundup a couple of weeks before, or the fact that all of Nicolas Cage's exciting adventures had absolutely no bearing whatever on the final outcome, and there was never any possibility they could.

 

The script had actually been kicking around for decades, going from one studio to another. After seeing the film I could see why.


Last night I watched "Knowing" on Blu-ray, which was shot with Red digital cinema cameras. I noticed that a lot of highlights were overexposed, which means that film has a lot better dynamic range.

You have no idea how stunned I am over this revelation....

 

It is entirely possible that there never will be a video camera with a dynamic range that can equal film.

That's not to say that people won't come to accept all the deficiencies of electronic acquisition, or that film usage won't eventually die out, but unless there is some totally unforeseen technological breakthrough, it will never be because digital cameras are better.

Edited by Keith Walters

You have no idea how stunned I am over this revelation....

 

It is entirely possible that there never will be a video camera with a dynamic range that can equal film.

That's not to say that people won't come to accept all the deficiencies of electronic acquisition, or that film usage won't eventually die out, but unless there is some totally unforeseen technological breakthrough, it will never be because digital cameras are better.

 

"Never"?

 

So... you are saying that there is no possibility that anyone, ever, between now and the end of time, could conceive of a digital device that could ever, ever, ever be better than film. Ever. Film is like a hundred years old, give or take, but regardless of what unforeseen technological developments may occur in the next trillion years, it will never be bettered by digital technology. Never. EVER.

 

Just so we're clear.

 

R.


"Never"?

 

So... you are saying that there is no possibility that anyone, ever, between now and the end of time, could conceive of a digital device that could ever, ever, ever be better than film. Ever. Film is like a hundred years old, give or take, but regardless of what unforeseen technological developments may occur in the next trillion years, it will never be bettered by digital technology. Never. EVER.

 

Just so we're clear.

 

R.

 

So, what part of:

"but unless there is some totally unforseen technological breakthrough"

did you not understand?

 

All the currently used digital image acquisition techniques are based on 1960s technologies, which are rapidly reaching the point of zero return on R&D dollars. They all have technological flaws for which there seems to be no workable solution.

 

The problem in all cases is much the same: despite the 'Digital' tag being plastered everywhere it will fit (and a lot of places it doesn't), at the light-gathering 'coalface' these are still very much analog devices, trying to output sometimes extremely small analog signals in an extremely noisy digital environment.

 

Digital devices (and signals) have very high noise immunity, but if an analog signal gets contaminated by noise (such as random digital hash) prior to the analog-to-digital conversion process, no amount of digital post-processing is ever going to get rid of it, at least not without taking some of the low-level desired signal with it. The dynamic range is forever trapped between the 0.6 volt "ceiling" of a silicon photocell (the point where it saturates) and a noise floor about 11 stops below that. (This noise floor is somewhat 'elastic', depending on how much fiddling the manufacturers engage in with so-called noise reduction, whether deliberate or as a by-product of a recording compression process, but the fact remains that there is very little room left for further improvement.)
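
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The 0.6 volt ceiling and the 11-stop floor are the figures assumed above, not measured values, and the variable names are mine:

import math

# Assumed figures from the paragraph above (not measurements):
v_ceiling = 0.6                  # silicon photocell saturation voltage, volts
stops_of_range = 11              # claimed usable range below the ceiling, stops
v_noise_floor = v_ceiling / 2 ** stops_of_range

print(f"Noise floor: {v_noise_floor * 1e6:.0f} microvolts")               # ~293 uV
print(f"Dynamic range: {math.log2(v_ceiling / v_noise_floor):.0f} stops") # 11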

 

This is probably the least understood aspect of the technology: digital circuits are extremely tolerant of manufacturing parameter spreads and system noise because (using an old-fashioned 5 volt digital system as an example) anything slightly above 2.5 volts counts as a "1" and anything slightly below 2.5 volts counts as a "0". So it is possible to produce gigantic numbers of transistors on a single chip with a good manufacturing yield, because it takes a truly massive parameter "deformity" in a particular transistor for it not to work. Digital circuits are very fault-tolerant.
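
To see that fault tolerance in action, here's a small illustrative simulation in Python; the 0.8 volt noise spread is an arbitrary figure picked for the example:

import random

random.seed(1)
bits = [random.randint(0, 1) for _ in range(1000)]
sent = [5.0 * b for b in bits]                        # 0 V or 5 V logic levels

# Contaminate every level with up to +/-0.8 V of random "hash":
received = [v + random.uniform(-0.8, 0.8) for v in sent]

# Digital recovery: threshold at the 2.5 V midpoint; the noise vanishes.
recovered = [1 if v > 2.5 else 0 for v in received]
print("bit errors:", sum(r != b for r, b in zip(recovered, bits)))    # 0

# Analog "recovery": there is no threshold to hide behind, so whatever
# noise was added before digitization stays in the signal for good.
mean_err = sum(abs(r - s) for r, s in zip(received, sent)) / len(sent)
print(f"mean analog error: {mean_err:.2f} V")                         # ~0.4 V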

 

(The same applies to digital videotape, optical discs and so on. As long as the readout circuit can clearly distinguish between "zeroes" and "ones", 100% data recovery is the norm, although in practice it is more cost-effective to allow a higher error rate and use error correction.)

 

With an analog signal no such parameter relaxation is possible; analog circuits are simply not fault-tolerant. If you are making an image pickup device, all of the photocells have to be, as far as possible, exactly the same dimensions, with identical electrical characteristics. An analog circuit cannot tolerate anything like the parameter spreads acceptable in a digital device, and the consequence is that the massive advances made in digital processing technology are simply not applicable to the analog part of the system.

 

To give a more film-like response would require each photocell to have its own individual logarithmic "modulator" that reduced its sensitivity as the light level increased, which is essentially what happens with film emulsion. Basically it would be as if each photocell had its own individual iris. (This would also overcome a major flaw in digital image processing systems: the massive imbalance in digitization resolution over the tonal range. For example, with a 10 stop tonal range and a 10-bit analog-to-digital converter, the range between 9 and 10 stops gets divided into 512 discrete steps, while the first stop gets only one level allocated to it and so can only ever be "on" or "off".)
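
A minimal sketch of that imbalance, assuming the 10 stop range and 10-bit linear converter from the example above:

codes = 2 ** 10                        # 1024 levels from a 10-bit ADC
for stop in range(10, 0, -1):          # brightest stop first
    hi = codes / 2 ** (10 - stop)      # linear level at the top of this stop
    lo = hi / 2                        # one stop down means half the signal
    print(f"stop {stop:2d}: {int(hi - lo):4d} codes")
# Prints 512 codes for the top stop, halving each time, down to 1 for the bottom.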

 

But here's where we run into a technological brick wall. A logarithmic modulator would require quite a few transistors attached to each individual photocell, and they could not have the sort of parameter spreads that are readily acceptable in digital devices. If it took six extra transistors per photocell (which is rather optimistic), a 12 megapixel sensor would need 72 million extra high-precision transistors (or a lot more lower-precision devices). There is simply no current manufacturing technology up to the task, and given the relatively minute size of the HD sensor market in the scheme of things, it is unlikely anybody would be prepared to spend the R&D dollars necessary.

 

There have been reports over the years of manufacturers supposedly overcoming this problem, but nothing ever seems to come of it. A proposal is one thing; actually getting the thing to work is quite another.

 

I think what is more likely to happen is that HD sensor technology will advance slowly and painfully to the point where it is considered "good enough" by most producers, and film usage will then die off for purely economic reasons. After that, with the yardstick of film quality gone, there will be no incentive for manufacturers to improve silicon sensors further, and no further advances will be made.

 

There are any number of technologies that more or less "peaked" long ago. The naive assumption is that, because a particular technology has undergone massive and spectacular advancement in a very short time (computer technology being the standout example), the same level of advancement can automatically be expected from entirely unrelated technologies.

 

You might imagine that a 2009 model car is technologically light-years ahead of an equivalent 1959 model, but in reality its basic operating principles have hardly changed at all. What has happened is that advanced features that were only available in luxury models in 1959 have become mainstream. Your 2009 model might be more fuel-efficient (but not massively so) and it might give a more comfortable ride, but when it comes to what the device was designed to do, i.e. get you from A to B on the minimum amount of fuel, the improvement is not that earth-shaking, again, compared to something like computer technology.

 

Vaccination was discovered well over two centuries ago, and as far as public health measures go, nothing comes even close to it in lives saved for money spent. Dozens of lethal childhood diseases that used to be endemic even in Western countries have, for all practical purposes, ceased to exist there.

 

Yet after 200 years of research there is still no effective vaccine for the common cold, and flu vaccines remain a hit-and-miss affair. Many other serious diseases, such as AIDS and malaria, similarly show no sign of being defeated anytime soon.

 

Rocket engine technology went through some spectacular advances between 1950 and 1970, but in the nearly four decades since then there haven't been any major advances at all. In real terms it costs about the same to put a kilogram of payload into orbit now as it did in 1970.

 

There were massive advances in digital still camera quality between, say, 2000 and 2004, but the 3.2 megapixel 3:1 zoom model I bought in 2004 still holds up very well today, at least for everyday use. For normal size prints, the 8 megapixel camera I bought earlier this year doesn't give me any real advantage, and even for larger prints the difference is far from obvious. High-end cameras may be a different story, but I don't have any real use for those, and neither do enough other people to make more intensive R&D worthwhile.

Edited by Keith Walters

So, what part of:

"but unless there is some totally unforeseen technological breakthrough"

did you not understand?

[...]

 

The part I don't understand is the part where you say "never" right after saying "barring unforeseen technological developments".

 

So why bother with the "never" part?

 

"Republicans will NEVER take over the white house again. Unless they get enough votes!"

 

R.


The part I don't understand is the part where you say "never" right after saying "barring unforeseen technological developments".

 

So why bother with the "never" part?

 

R.

Huh? :blink:

Makes perfect sense to me.

No amount of refinement of any current technology is ever likely to cut it.

So, unless some totally new technology springs out of nowhere, it will never happen.

There's nothing to say this can't happen, but I have an intense dislike of arguing with people who proceed on the assumption that it MUST.
