
Technicolor Dichroic Prism


Leo Anthony Vale


Years ago I systematically went through back issues of the SMPTE Journal.

The yearly progress report would mention changes in 3-strip Technicolor when they occurred, usually speed increases.

 

At the beginning of the '50s, the speed doubled. When it switched to a tungsten balance a few years later, the speed doubled again.

Of course, Technicolor never mentioned the actual speed. Counting the number of increases suggested that the tungsten version was 40 or 50 ASA.

Since Eastman 5248 was only 25T, that figure seemed wrong.

 

The Google group rec.arts.movies.tech had a thread about the Technicolor prism.

On Sept 4, Peter Haas wrote:

 

"The final Three-Strip patent (1955) had all three colors with nearly

the same transmission factor.

 

After sputtering about eighteen layers of metal in microscopic

thicknesses, Technicolor techs got just about a square wave transfer

function centered on the three primary colors of interest, and about

the same amount of attenuation on all three colors.

 

However, by that time, Three-Strip already had an effective ASA

advantage over Eastmancolor, but so much product had been diverted to

'Scope (for which Three-Strip was impractical) and wide-gage, that it

was overdue to put Three-Strip out of its misery."

 

Peter Haas used to work at Fox and seems to be the go-to guy concerning B&L CinemaScope camera lenses.

 

On Oct. 6, Peter Mason posted a link to an article about Don Kelly, who built the 1950s dichroic prism.

 

http://www.ai.sri.com/~vision/donkelly.html

The efficiency of the new dichroic prism is what accounted for those final jumps in speed.

 

From the Don Kelly article:

 

"The so-called three-strip process, launched in 1932, was in its heyday. It included a unique camera, some very special emulsions and a whole factory full of custom-built printing and processing equipment, all covered by as many patents as possible. Technicolor had tried other processes over the years, but this was the first one that really produced good color rendition. For many years there was no competition worth mentioning, so it became very profitable.

 

The heart of the Technicolor camera consisted of three films, two apertures and one beamsplitting prism. The film movements and aperture assemblies were supplied by Mitchell, of course. Which film went through which aperture was dictated by the need for good color separation. Inside the beamsplitting prism was a thin layer of sputtered silver, whose reflectance was adjusted to balance the film speeds.

 

The Technicolor process required very bright lights on the set, partly because that metallic beamsplitter wasted a lot of light. It sent all wavelengths impartially to both apertures, taking no account of the spectral sensitivities of the red, green and blue negatives. The beamsplitter was about the only way the three-strip camera could be improved, and I set about to do it, by substituting a multilayer, dielectric, color-selective beamsplitter that would send each film just the wavelengths it needed, wasting nothing.

 

There were no suppliers of thin-film interference components in those days-- we had to make our own. Ironically, some of the pioneer work in making interference filters and beamsplitters had been done by Mary Banning, Harry Polster and others at the Institute of Optics while I was there. But I had never worked in the thin-film lab-- my friend Bob Hills had that job. Fortunately, I obtained some good advice and hired some very talented help. We had to design our own vacuum chambers, jigs, optical control systems, everything. There were no computers, but we did the necessary theoretical work on Hollerith cards, using a calculating card punch in the accounting department.

 

In due course we got the job done. There were 34 Technicolor cameras in the world and each needed a spare prism, so getting the job done meant turning out 68 more-or-less identical dichroic beamsplitters. We had increased the speed of the Technicolor process by two whole F-stops. The new, faster Technicolor could shoot pictures without the old, blinding Kleig lights. A circus picture called "The Big Top" was even photographed in available light."

 

[Figure: fig5.gif, from the Don Kelly article]

 

So at its end 3-strip Technicolor was indeed faster than Eastmancolor.
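
As an aside, the kind of design calculation Kelly's team punched onto Hollerith cards is what we'd now do with the thin-film transfer-matrix method. Here is a minimal modern sketch, assuming normal incidence, non-dispersive indices and made-up layer values (none of these numbers come from the actual Technicolor prism):

```python
import numpy as np

def stack_reflectance(wavelength_nm, layers, n_in=1.0, n_sub=1.52):
    """Normal-incidence reflectance of a dielectric multilayer.

    layers: list of (refractive_index, thickness_nm), incident side first.
    Uses the standard characteristic-matrix (transfer-matrix) method.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm      # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])                  # field terms at the front surface
    r = (n_in * B - C) / (n_in * B + C)                # amplitude reflection coefficient
    return abs(r) ** 2

# Hypothetical green-reflecting quarter-wave stack: 9 high/low index pairs,
# each layer a quarter wave thick (optically) at 550 nm.
n_hi, n_lo, lam0 = 2.3, 1.38, 550.0
stack = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * 9

for lam in (450, 550, 650):
    print(f"{lam} nm: R = {stack_reflectance(lam, stack):.3f}")
```

A quarter-wave stack like this reflects strongly around its design wavelength and passes the rest, which is exactly the "send each film just the wavelengths it needs" behaviour Kelly describes.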

Link to comment
Share on other sites

  • Premium Member
However, by that time, Three-Strip already had an effective ASA advantage over Eastmancolor, but so much product had been diverted to 'Scope (for which Three-Strip was impractical) and wide-gage, that it was overdue to put Three-Strip out of its misery."

 

Actually, it was a lawsuit that put an end to three strip. Technicolor was accused of restricting color to certain favored customers. With only 34 cameras, they found it more cost effective to drop it than to make more.

 

 

 

 

-- J.S.

Link to comment
Share on other sites

Man, oh man, that was an awesome post! Such a shame three-strip had to go out when it did. Imagine what a picture TODAY would look like in three-strip, with better film stocks, as well as the ultra-res process to ensure PERFECT alignment. Now if we could only get Kodak to make the special B&W film without the antihalation backing for the blue layer, we could dig out one of those old Tech cameras and let 'er rip! Someday....

 

BR

Link to comment
Share on other sites

  • Premium Member
There were no computers, but we did the necessary theoretical work on Hollerith cards, using a calculating card punch in the accounting department.

It always amazes me the amount of sophisticated engineering that was done with only the aid of unbelievably primitive computers.

Yet, without those vacuum tube monsters, the same work could probably not have been done at all.

 

There was no real programming; engineers used to solve fiendishly complex design problems by simply telling a computer to keep incrementing the input variables until it found a set of numbers that gave the answer they were looking for.

Of course it might have taken days, but computers don't get tired!
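
For flavor, here is what that incremental brute-force style looks like in a few lines of Python; the "design equation" is a made-up stand-in, not any formula from the period:

```python
# Brute-force search as described above: step the input variables in fixed
# increments and keep whichever combination comes closest to the target.
# The constraint x**2 + 3*y = 50 is purely hypothetical.
def design_error(x, y):
    return abs(x**2 + 3*y - 50.0)

best = (float("inf"), None, None)
for xi in range(1001):            # x from 0.00 to 10.00 in steps of 0.01
    for yi in range(1001):        # y from 0.00 to 10.00 in steps of 0.01
        x, y = xi / 100.0, yi / 100.0
        err = design_error(x, y)
        if err < best[0]:
            best = (err, x, y)

print("closest fit (error, x, y):", best)
```

Over a million evaluations with no insight required: days of machine time then, a fraction of a second now.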

Link to comment
Share on other sites

It always amazes me the amount of sophisticated engineering that was done with only the aid of unbelievably primitive computers.

Keith, it almost sounds as if you are signed up to the view that NASA never put people on the moon in 1969, because the computers they had then weren't up to it.

 

Having gone through the era of long multiplication & division at junior school, log tables at high school, slide rules at university, and then onto jealously allocated runs at midnight on the accounts department computer, I can believe anything was possible at any time. Even the pyramids, Harrison's chronometer and steam engines. Without computers, even primitive ones.

Link to comment
Share on other sites

  • Premium Member
Keith, it almost sounds as if you are signed up to the view that NASA never put people on the moon in 1969, because the computers they had then weren't up to it.

(Amazingly, somebody said almost exactly the same thing to me on a completely different forum. The following is more or less a repeat of my reply :rolleyes: )

 

The above sort of view comes from people ignorant of the technologies used at the time. Computers didn't play as big a part in 1960s space technologies as they do today, but without computers, there still would have been no Space Age.

 

The simple fact is that, in terms of doing something USEFUL, the 1.3 MHz IBM discrete-transistor mainframe that ran the entire Apollo program ran rings around the average cyber-turkey's home PC, regardless of its multi-gigabyte, multi-gigahertz specifications. The average idiot's home PC is a classic example of something going nowhere (extremely) fast.

 

Having gone through the era of long multiplication & division at junior school, log tables at high school, slide rules at university, and then onto jealously allocated runs at midnight on the accounts department computer, I can believe anything was possible at any time. Even the pyramids, Harrison's chronometer and steam engines. Without computers, even primitive ones.

You've completely missed my point.

I also went through "the era of long multiplication & division at junior school, log tables at high school, slide rules" etc myself, and that allowed me to do design work that would have been impractical without them.

 

Certainly, if you had enough people available to do the calculations, amazing engineering could be carried out, but in many cases it was so expensive that only huge military projects could afford it.

 

A good example is the explosive-lens implosion system needed for plutonium bombs such as the "Fat Man" weapon dropped on Nagasaki.

 

Most science textbooks tend to gloss over how incredibly complex the design of an atomic bomb actually is. In the early 1940s "computer" was a job description, not a machine, and thousands of such people were employed by the Manhattan Project. Yet they could only work out a rough approximation of the shape of the 32-segment explosive lens assembly required. Without electronic computers, there was no option but to build hundreds of dummy bomb assemblies more or less by trial and error and "X-Ray" each assembly the instant after the 4 tons of TNT was detonated, to find a configuration that produced a precisely shaped imploding shock wave. They eventually succeeded, but the cost was phenomenal.

 

ENIAC, the forerunner of modern electronic computers, was under development for most of the war, but wasn't completed until 1946. Its first official task was to solve a mathematical problem codenamed "Problem #1", which basically simulated the energy buildup in the proposed hydrogen bomb, to determine if it was even theoretically possible. With that gigantic (but by today's standards unbelievably primitive) machine, they were able to solve Problem #1 in about two months. Without ENIAC or any other computer, they would probably still be working on it.

 

My point was simply that even the primitive computers of the 1940s and 1950s were an enormous technological breakthrough, in that they allowed complex research to be carried out by much smaller design teams. My very first personal computer, a Sinclair ZX81, was a 4 MHz machine with only 1K of RAM - considered laughable today. Yet I was able to use it to help me design electronic equipment, by automating some of the complex equations that needed to be solved.

 

One interesting research by-product of this was my discovery that a lot of fiendishly complex textbook design equations are in many cases actually a few vastly simpler equations multiplied together, presumably because somebody thought the result looked really cool. (I can't think of any other explanation.) So you go from a handful of equations that you could just about do in your head to some multi-line, multi-bracketed monster that only a Rain Man could love. Without that little machine, I would never have known that.
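
A tiny symbolic-algebra illustration of the point, using two cascaded first-order RC low-pass stages as a stand-in example (not anything from the poster's actual work):

```python
import sympy as sp

s, R1, C1, R2, C2 = sp.symbols('s R1 C1 R2 C2', positive=True)

# Two trivially simple first-order transfer functions...
stage1 = 1 / (1 + s*R1*C1)
stage2 = 1 / (1 + s*R2*C2)

# ...which you could just about do in your head, factored:
print(stage1 * stage2)

# ...and the expanded "textbook monster" denominator they produce:
print(sp.expand((1 + s*R1*C1) * (1 + s*R2*C2)))
# roughly: C1*C2*R1*R2*s**2 + C1*R1*s + C2*R2*s + 1
```

The factored form is the insight; the expanded form is what ends up in the textbook.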

 

Even the pyramids, Harrison's chronometer and steam engines. Without computers, even primitive ones.

But the results are pretty much in line with what you would expect from the tools available to the people who made them.

If you define a "computer" as anything that allows you to automate some mathematical process, they did have "primitive ones". Algebra, calculus, and log tables are all primitive "computers". The people who built the great Gothic cathedrals of Europe didn't really have many construction tools and materials that weren't available to the ancient Egyptians; what they did have was advanced mathematics and design tools.

Link to comment
Share on other sites

The above sort of view comes from people ignorant of the technologies used at the time. Computers didn't play as big a part in 1960s space technologies as they do today, but without computers, there still would have been no Space Age.

 

Funny, I thought that Wernher von Braun and the German V-2 rocket program, coupled with the Cold War, were responsible for the space age, not primitive computers.

 

I also disagree that the power or speed of the computers has anything to do with their usefulness. It has to do with the skill of the programmer, not the machine, just like anything else.

 

I'd have to do a great deal of work to even FIND a programming interface on my current computer, compared to my trusty Apple IIc and a stack of floppy disks.

 

 

It sounds to me as if you are spouting a lot of elaborately-worded, vaguely factual garbage.

 

 

 

Getting back to the space program, and ignorance... It is interesting, Dominic, how a great deal of "conspiracy theories" about the Apollo lunar landings can be explained away as simple photographic artifacts, like the photos where the cross-hairs (from the camera's réseau plate) seemingly are behind objects on the lunar surface in the still photographs.

 

A lot of space travel is simple physics, just as a lot of other conspiracy theories are an ignorance of that same physics.

 

 

Computers of today aren't hindering our ability to return to the moon, or obsoleting motion picture film; human beings, and their continual drive towards making things easier rather than making them better, are.

Link to comment
Share on other sites

  • Premium Member
It sounds to me as if you are spouting a lot of elaborately-worded, vaguely factual garbage.

Yeah, it probably does sound like that to you.

 

Funny, I thought that Wernher von Braun and the German V-2 rocket program, coupled with the Cold War, were responsible for the space age, not primitive computers.

The V-2 could only get to the edge of space and then fall back to Earth. Getting a rocket into Earth orbit requires a vastly superior mass/thrust ratio, and it was only possible to develop this using computer simulations. Once again, Popular Science books tend to gloss over all the tedious but important details, to maintain the interest (and subscriptions) of idiots who want to pretend they are Savants.

 

Same reason I tend to restrict my modern "Science Fiction" reading to the Reduser forum...

Link to comment
Share on other sites

Yeah, it probably does sound like that to you.

 

 

The V-2 could only get to the edge of space and then fall back to Earth. Getting a rocket into Earth orbit requires a vastly superior mass/thrust ratio, and it was only possible to develop this using computer simulations. Once again, Popular Science books tend to gloss over all the tedious but important details, to maintain the interest (and subscriptions) of idiots who want to pretend they are Savants.

 

Same reason I tend to restrict my modern "Science Fiction" reading to the Reduser forum...

 

Funny, I haven't heard any factual information, just a lot of supposition as to computing power, so what is it supposed to sound like?

 

 

Again, I am pretty sure von Braun had a lot more to do with getting into space than '50s and '60s computer technology.

 

Feel free to prove me wrong, but I won't just take your word for it, sorry.

 

And no, I don't get my knowledge about rocketry from "Popular Science."

Link to comment
Share on other sites

  • Premium Member

I use a CD ripping and burning program that handles both audio and data that I've had for years. It does everything the current rippers and burners do, but without the fancy screens, menus, playlists, etc. One difference: the modern (?) programs usually take about 100 MB on your hard drive; my program, at 1 MB, fits on a floppy disk. It will also run on anything from a DOS box to a multi-GHz Vista machine.

 

Best of all, quite a few of those bloated 100 MB Windows rippers/burners use that teeny tiny little 1 MB program as their actual CD engine. The other 99 MB is the pretty screens and feature bloat those companies use to sell their elephants.

 

One good programmer and a simple machine can do jobs that others think require a building full of computer "scientists" and bleeding edge hardware.

Link to comment
Share on other sites

I think the more "idiot proof" you make software, the more bloated it will get; simple as that.

 

If you design software for a rocket scientist, or a mechanical engineer, or a physicist designing optical coatings, it is going to be far more streamlined than if you have to somehow try to design the same program for a layman.

 

If you are typing in numbers and the computer is spitting out numbers, and you have to have an understanding of the equations you are plugging in variables for, that is a far more streamlined situation.

Link to comment
Share on other sites

The V-2 could only get to the edge of space and then fall back to Earth. Getting a rocket into Earth orbit requires a vastly superior mass/thrust ratio, and it was only possible to develop this using computer simulations.

 

Von Braun's V-2 team had designed, though didn't build, multistage rockets for intercontinental ranges and orbit.

 

[Image: the A-4 (V-2), A-4b, A-9 & A-10]

[Image: A-9 + A-10 + A-11]

 

http://www.project1947.com/gfb/a-9.htm

 

http://www.pp.htv.fi/jwestman/space/nazispace.html

 

The V-2 was an early step in the space program, not the end-all.

 

And the Norden bombsight and its German equivalent contained mechanical analog computers.

Not the ENIAC, but functional. Those weren't the only ones being used.

It wasn't all the constipated mathematician technique.

Link to comment
Share on other sites

Getting back to the space program, and ignorance... It is interesting, Dominic, how a great deal of "conspiracy theories" about the Apollo lunar landings can be explained away as simple photographic artifacts, like the photos where the cross-hairs (from the camera's réseau plate) seemingly are behind objects on the lunar surface in the still photographs.

 

A favorite one is that NASA hired Stanley Kubrick to fake the moon landing photos and footage using front projection.

This was not done because NASA hadn't gone to the moon; rather, it was because NASA went to the moon using secret Nazi flying saucer technology!!! And they wanted to keep it secret.

So why couldn't they have taken a prop LEM to the moon in some "Bells" and actually photographed it on the moon?

 

http://jayweidner.com/AlchemicalKubrickIIa.html

Link to comment
Share on other sites

  • Premium Member
Von Braun's V-2 team had designed, though didn't build, multistage rockets for intercontinental ranges and orbit.

Just so.

And when he started working for the US military, he found out how damnably difficult it actually was to put those ideas into practice. Or maybe he already knew.

If I remember correctly, in the 1950s they built 14 prototype 2-stage launchers before they got one that didn't explode on the launch pad. After that they started using computer simulation techniques which sped up progress enormously.

The simple fact is that every little extra bit of launch weight requires an enormously greater mass of fuel to get it into orbit, because the rocket doesn't just have to carry fuel to launch the payload, it also has to carry more fuel to lift that fuel off the ground. So every gram of mass that can be scraped away from the rocket motor means kilograms less fuel to be carried.
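
The relation underneath this is the classical Tsiolkovsky rocket equation; solving it for the launch mass makes the penalty explicit:

```latex
% Tsiolkovsky rocket equation: m_0 = initial (fuelled) mass,
% m_f = final (dry) mass, v_e = effective exhaust velocity.
\[
  \Delta v = v_e \ln\frac{m_0}{m_f}
  \qquad\Longrightarrow\qquad
  m_0 = m_f \, e^{\Delta v / v_e}
\]
% Every extra gram of dry mass m_f is multiplied by e^{\Delta v / v_e},
% a large factor for orbital delta-v and chemical exhaust velocities,
% hence every gram scraped away means kilograms less fuel carried.
```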

Generally, rocket engines have to be pared down to the point where they just work reliably with a specific load of fuel, and after that, the design must be rigidly frozen. They usually have a specific payload mass, which must be topped up with inert ballast if necessary. (Although more often spare payload capacity is taken up with "freebie" small satellite launches). Computer simulation was able to greatly expedite the design process for all this.

 

It's not generally known that the much-maligned Space Shuttle is virtually the only launch vehicle not subject to these constraints.

It's also not generally known that the Russians still use rocket engine designs from the 1950s as booster stages for their larger rockets.

Link to comment
Share on other sites

  • Premium Member
Just so.

It's not generally known that the much-maligned Space Shuttle is virtually the only launch vehicle not subject to these constraints.

It's also not generally known that the Russians still use rocket engine designs from the 1950s as booster stages for their larger rockets.

 

Nor is it generally understood that the main engines on the Shuttle are evolutions of the Saturn V F-1 engines that put us on the moon. So we use 1960's technology to this day and will continue to do so since the engines for Orion are also based on Apollo technology.

 

I read somewhere that NASA had to assign some Orion engineers to reverse engineer some of the Apollo hardware...no one thought to save the drawings and specifications.

Link to comment
Share on other sites

  • Premium Member
And the Norden bombsight and its German equivalent contained mechanical analog computers.

Not the ENIAC, but functional. Those weren't the only ones being used.

It wasn't all the constipated mathematician technique.

 

In fact, almost every halfway decent anti-aircraft gun used a mechanical analog computer. The cracking of the daily Enigma code was only possible because of the bombes that Alan Turing and others built at Bletchley Park. Fact is that the human mind is capable of doing very great deeds, with or without mechanical and/or electronic help.

 

Cheers, Dave

Link to comment
Share on other sites

  • Premium Member
Nor is it generally understood that the main engines on the Shuttle are evolutions of the Saturn V F-1 engines that put us on the moon. So we use 1960's technology to this day and will continue to do so since the engines for Orion are also based on Apollo technology.

Actually, the bulk of the heavy lifting was done by the first stage, which ran on liquid oxygen and a very specialized form of what is basically kerosene (RP-1).

However, the cryogenic oxygen/hydrogen engines used in the upper two stages were much more advanced for the time. Again, it was the use of computer control technology that made them workable. Ironically, the Shuttle actually uses 1978-vintage control computers, with no practical way of upgrading them to something more modern. The engine management computers even in cars as much as 20 years old have vastly more computing horsepower.

 

I read somewhere that NASA had to assign some Orion engineers to reverse engineer some of the Apollo hardware...no one thought to save the drawings and specifications.

A lot of the time the problem is a lack of documentation of the changes made between the original drafting and the final assembly.

Link to comment
Share on other sites

  • Premium Member
In fact, almost every halfway decent anti-aircraft gun used a mechanical analog computer. The cracking of the daily Enigma code was only possible because of the bombes that Alan Turing and others built at Bletchley Park. Fact is that the human mind is capable of doing very great deeds, with or without mechanical and/or electronic help.

 

Cheers, Dave

A major problem with this sort of discussion is the way the meaning of technological terms changes over the years.

Originally, a "computer" was a job description, not a machine.

Before the advent of electronic computers, all large research institutions had their own "arithmetical" section, which was populated by people whose job was to carry out the thousands of calculations required by the researchers.

The concept was very much like the office typing pools businesses used to depend on before word processors and cheap printers. Trouble is, while most people will probably have heard of a typing pool and can understand what it is for, very few people, even at the time, had any idea what a "computer" did.

 

Instead of typewriters, the "computers" would sit all day in front of mechanical adding machines, mindlessly carrying out the calculations printed on sheets of paper in their "in" tray and stacking the answers in their "out" tray.

Specialist mathematicians were given the task of distributing and collecting all the required calculations.

 

An "automatic computer" (which makes up the last two letters of the names of many early computers eg ENIAC) originally meant an automated machine that could duplicate the tasks of human "computers". This was what Charles Babbage envisaged in the 19th century with his calculating "engines".

The first truly programmable and workable electromechanical computer was the Harvard Mark I, which basically consisted of a large number of electromechanical calculators connected together by a sort of automated telephone exchange. However, while it did work, the adding-machine mechanisms were never intended to work 24/7, and they very quickly wore out.

 

Interestingly, John Mauchly, one of the two men behind the ENIAC project, was originally recruited by the US Army to help find more "computers" (i.e. people), as the wartime R&D effort required a massive upsurge in arithmetic capability.

 

It's hard to imagine a much more boring job, and you really needed to be a "born" calculator to last very long. Most recruits (mainly young women) weren't, and the staff turnover was astronomical. Eventually Mauchly was introduced to a young graduate student called J. Presper Eckert, who had an idea for an electronic version of the Mark I but couldn't raise the funding for it.

With Mauchly's Army connections, funding was approved in less than two weeks. Unfortunately, the war was over before the mighty machine was finished.

Link to comment
Share on other sites

  • Premium Member
I read somewhere that NASA had to assign some Orion engineers to reverse engineer some of the Apollo hardware...no one thought to save the drawings and specifications.

 

It's more that they were given a goal to get to the moon and back before the end of 1969, and making the deadline was the priority, not preserving and documenting what they did to get there. The lesson is, if it's worth doing, it's worth documenting. That saves re-inventing wheels.....

 

 

 

 

-- J.S.

Link to comment
Share on other sites

  • 4 months later...

The original Technicolor 3-strip process (1932) had a speed of 3-4 Weston, which is equivalent to 4-5 ASA in present-day terms. In late 1938, beginning with The Wizard of Oz (1939) (not Gone With the Wind (1939), as most people erroneously state), Technicolor increased the speed of the process to 8 Weston (10 ASA), and the process retained this speed until 1951, when the dichroic prism was introduced and the Technicolor film speed was increased to 16 Weston (20 ASA). There were no further film speed increases after this time, and reports that Technicolor attained a speed of 50 ASA in the mid-fifties are incorrect.

 

Regards,

Peter Mason

Link to comment
Share on other sites

It's more that they were given a goal to get to the moon and back before the end of 1969, and making the deadline was the priority, not preserving and documenting what they did to get there. The lesson is, if it's worth doing, it's worth documenting. That saves re-inventing wheels.....

 

 

 

 

-- J.S.

 

The answer (I was always told, having at one time been in the computer industry) to what happened to all the Saturn program blueprints and documentation was that it was saved on 7-track EBCDIC magnetic tape. The hardware to read these tapes became obsolete and no longer exists. It's doubtful that a magnetic tape of such vintage would be readable without tons of dropout errors anyway.

Link to comment
Share on other sites
