
Steve Switaj

Basic Member
Everything posted by Steve Switaj

  1. The color isn't really "color" in the sense that you're thinking of it. It's not a transmission effect, as it would be with a filter; instead, it's a dichroic effect caused by optical interference. The coatings on these lenses are, at most, only a few wavelengths thick. Their function is to manage the reflections caused by the sudden change of index of refraction as light enters and exits the glass, thus maximizing transmission at nominally perpendicular angles. The idea is that the refractive jump is split into two steps, air-to-coating and coating-to-glass, keeping each step individually small, which reflects less than one large step. When light gets into the coating at low, raking angles, it can bounce between the air-coating interface and the coating-glass interface in a phenomenon similar to fiber optics, and interfere constructively or destructively with itself depending on the wavelengths involved and the coating thickness. This produces color depending on which wavelengths get cancelled and which escape. This should only happen to light rays entering the glass from outside the lens frustum, so it really doesn't affect imaging. The coating varies from lens to lens because it is tuned to the index of refraction of the underlying glass elements, which themselves vary.
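     A rough way to see why splitting the index jump helps, as a numeric sketch (single-layer quarter-wave math with an assumed glass index; real lens coatings are multi-layer stacks, so treat the numbers as illustrative only):

        # Illustrative only: single-layer quarter-wave AR coating arithmetic.
        # n_glass is an assumed "typical" value; real elements vary.
        n_air, n_glass = 1.0, 1.52

        def fresnel_reflectance(n1, n2):
            """Reflectance of a single interface at normal incidence."""
            return ((n1 - n2) / (n1 + n2)) ** 2

        # Bare glass: one big index step.
        print(fresnel_reflectance(n_air, n_glass))   # ~0.043, i.e. about 4.3% reflected

        # Ideal single-layer coating index and its quarter-wave thickness at 550 nm.
        n_coat = (n_air * n_glass) ** 0.5            # ~1.23
        thickness_nm = 550 / (4 * n_coat)            # ~112 nm, a fraction of a wavelength per layer
        print(n_coat, thickness_nm)

        # Reflectance at the design wavelength with the coating in place.
        print(((n_air * n_glass - n_coat**2) / (n_air * n_glass + n_coat**2)) ** 2)  # ~0 at 550 nm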
  2. It would be a terrible shame if someone didn't mention the famous "burp loop" used in some IMAX projectors. Not a camera movement, but a strategy to move the large IMAX frames through the projector day after day with good registration while not destroying the perfs with a camera-like movement. The film moves on continuous sprockets through a gate that has fixed pins. The movement is timed so that a 15-perf loop of film builds up on one side of the gate, and at the right moment a puff of air lifts the film off the register pins and the energy stored in the springy loop snaps the film forward, where it drops back onto the pins one frame (15 perfs) further advanced. Kinda like the shuttle movement in an old Acme optical printer, but with air instead of the lifting shuttle.
  3. >> However, here you say a prerequisite of filming a CRT television is that you need a 144 degree shutter.
     >> I think I'm sort of answering my own question here but does this mean that even with the sync box, a
     >> 180 degree shutter is just simply too long of an exposure to fulfill that precise window of exposure needed
     >> to film the CRT tv right when the electron beam begins and ends without a second partial electron beam
     >> exposure?

     Exactly. An NTSC signal takes 1/60 sec to write one field.* A 180 degree shutter at 24fps exposes the film for 1/48 sec.** Since the exposure time is longer than the display refresh time, you end up with time to write about 1-1/4 fields on the CRT. That extra time means that roughly a quarter of the video screen will be flashed twice, resulting in a bright band. A 144 degree shutter gives a 1/60 sec exposure at 24fps, which is just enough to expose the whole screen once and only once.

     The periodic relationship between the camera frame rate (1/24 sec) and the NTSC field rate (1/60 sec) repeats every 2 camera frames and 5 video fields (2/24 sec = 5/60 sec), so once the phase is set it doesn't drift - every other film frame starts back in the same relative place. If it weren't for the dead time in the video signal that's used for vertical sync, with a 144 degree shutter you could just start anyplace and let them run wild relative to each other (and in fact, that's what the old kinescope camera chains used for network west-coast delay used to do, but they had special long-decay CRT tubes so the dark band wasn't an issue).

     The disclaimers...

     * technically 480/15750 sec, but doesn't start rewriting for another 45/15750 sec due to time lost to sync signals

     ** film shutters aren't perfectly global, so, say, the upper-left corner exposure start time skews from the lower-right corner start as the shutter moves across the frame. Since you're messing with timing, this can make light or dark bars roll as the image of a small CRT moves around the film frame.

     *** Again, this only applies to CRTs. Modern LED and LCD monitors are a whole different, and much less predictable, situation. Many of them are totally fine at almost any reasonable exposure or framerate. Some of them are Lovecraftian hell. TEST. TEST. TEST.
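     Spelling the exposure arithmetic out, as a quick sketch (nominal 24 fps and 60 fields/sec; the field time ignores blanking):

        # Quick check of the shutter-angle arithmetic above (nominal 24 fps, 60 fields/sec NTSC).
        def exposure_time(shutter_deg, fps):
            return (shutter_deg / 360.0) / fps

        field_time = 1.0 / 60   # one NTSC field, nominal

        for angle in (180, 144):
            t = exposure_time(angle, 24)
            print(angle, round(t * 1000, 2), "ms ->", round(t / field_time, 2), "fields per exposure")
        # 180 deg -> 20.83 ms -> 1.25 fields (about a quarter of the screen gets flashed twice)
        # 144 deg -> 16.67 ms -> 1.0  field (each line exposed once and only once)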
  4. The biggest issue with the Elaine was probably the lack of a well-defined market. Panavision probably felt that they had to have a 16mm camera in their inventory, especially in the late 90's when S16 was ascendant, because all the other rental houses offered one, but 16mm and Panavision were an odd fit. If you were shooting on S16 for TV or theatrical blowup you were probably trying to keep an eye on expenses, but the Elaine, because it came with the whole Panavision ecosystem, was always going to be a premium product. I remember pricing one out at one point and didn't see that much difference between the Elaine and a GII. We went with an XTR and it was fine. And frankly, shows where camera budget wasn't a driving issue would probably try to shoot 35 for 'distribution insurance' in a market where S16 is a less... prestigious... format. So... where does a 'premium downmarket' camera really fit? Unless you had a DP who was really loyal to the Panavision system and the way it handles, there wasn't much reason to rent an Elaine when every serious rental house in town had well-dressed Arri or Aaton packages at a much better price. Along similar lines, does anybody know how many 416's Arri ever sold? That seems like a camera that would have the same market issues, especially since it was introduced well into the 2000's, when HD was starting to bite really deep into the 16mm market. It seemed really cool, but I don't think I've ever seen one outside of a trade show.
  5. I had a Nikon lens mount adapter I used on my LTR years ago; IIRC, it was quite similar to the one in the picture, and it worked great. I don't know how precise the backfocus was - I used it for long telephotos, and those tend to be pretty forgiving of backfocus error; I never tried it with, say, a wide zoom.* You will, however, have to use the older lenses with the manual aperture ring. Many of the new Nikkor series and almost all of the new aftermarket vendors have eliminated the manual ring, so there's no way to adjust aperture in a manual mount. On the plus side, if you're looking for good, cheap long lenses, this is a great way to do it. There's a bunch of lightly used AI-s glass out there at fire-sale prices because it's been orphaned as Nikon has moved away from mechanical linkages on all but their high-end cameras. * Long lenses have shallow depth of *field* but wide depth of *focus*, so they are less sensitive to exact backfocus. Wide lenses have deep depth of *field* but shallow depth of *focus*, so they are more sensitive to exact backfocus.
  6. (Long reply, sorry) There are two variables at play, speed and phase. Speed measures how fast you do something, while phase measures the timing difference between when multiple people do the same thing. Imagine several cars driving down a multi-lane highway at *exactly* 55 miles per hour. They will have some physical relationship to each other, and if their speed is right, will hold that relationship forever, but their relative positions will be random. They have a *speed* lock. Now imagine that they all decide to line up with a master car in the left lane. The master sets the pace and all the other drivers take their eyes off the speedometer and concentrate on lining up with the master car, speeding up a bit or slowing down a bit as necessary. They form a line across the highway; they now have *phase* lock.

     When filming a traditional video monitor it's important to understand that only a small part of the screen is lit up at any given moment. A moving electron beam paints one horizontal line at a time, which starts to fade exponentially almost immediately. It drops several stops in a few milliseconds. So it's important that the camera shutter (which is usually full frame) opens just before the electron beam in the monitor starts the top line and closes before the beam can start the next top line. That way every spot on the CRT gets the same exposure. You need a 144 degree shutter because this is 16.7mS at 23.976fps, and that's how long it takes for one field to be written on a CRT* at 29.97fps. If you used a 180 degree shutter, you'd give the beam time to get back to the top and start writing a second field before the shutter closed again. The beam would get about a quarter of a new field written, and those parts of the screen, being flashed twice while the camera shutter was open, would appear as a brighter band.

     If the beam finished at the bottom and immediately restarted at the top, phase wouldn't make a difference; no matter when you started you'd get all the CRT lines only once and you'd catch the same amount of fade-out. But that is not the case. There is significant visual dead time in the video signal - about 10% - used for things like sync. If you could see things in slow motion, this would look like 'write a frame from top to bottom which starts to fade immediately, wait a beat, write a frame from top to bottom which starts to fade immediately, wait a beat, etc'. If you had a button to push for exposure, you'd time yourself to start right before a CRT frame starts, and end just before the next frame begins. You can see where, if you got out of sync and started in the middle, you might catch the dying-out bottom half of the outgoing frame, wait that beat, then catch the bright new top half of the incoming frame. The bottom half would lose that beat's worth of exposure, and you'd see it as darker. This, happening 24 times a second, is what's going on if you point a camera at a CRT and allow them to have random phase. In practice, dropping in at a random time gives you an 80% chance of photographing a dark band. If you don't have the speeds matched, that band will slowly roll up and down through the image.

     The CE box uses timing information from the video signal itself, either directly from the signal feeding the monitor or from a magnetic pickup that senses the electron beam jumping to the top of the picture, as the master timing source.
Then it modulates the camera speed and phase such that the camera runs at the same basic frame *rate* as the video, subtly nudging the camera speed up and down a bit to ensure that the camera exposure begins just before the CRT frame starts. This is called phase lock, and there is a knob on the side of the CE box so you can adjust the timing, looking through the viewfinder and adjusting phase to roll the bars that you see until they are out of the image. Now... all this assumes we're talking film cameras (global-ish shutter) and CRT monitors (flying spot). In the world of electronic cameras filming electronic displays there are all kinds of wild and woolly combinations of timing on both the display and camera end. Both could be global with effective 360 degree shutters, and you're golden at all speeds. Or you could have a rolling shutter and a high-speed rolling display - in opposite directions - and your world would suck profoundly. But at least with electronic media you get instant feedback, which is nice. ( * It actually takes about 14mS to write the NTSC visible field, then you get to watch the last few lines fade for about 2mS before it starts over again )
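     For reference, the shutter angle that makes the exposure exactly one display refresh period is a one-line calculation; a small sketch (the PAL row is an extra illustration, not something claimed in the post above):

        # Shutter angle that makes the exposure exactly one display refresh period:
        #   angle = 360 * fps / refresh_hz
        def sync_shutter_angle(fps, refresh_hz):
            return 360.0 * fps / refresh_hz

        print(sync_shutter_angle(23.976, 59.94))   # 144.0 -> NTSC CRT at 23.976 fps
        print(sync_shutter_angle(24.0,   60.0))    # 144.0 -> same relationship with nominal rates
        print(sync_shutter_angle(25.0,   50.0))    # 180.0 -> PAL CRT at 25 fps, a standard 180 degree shutter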
  7. If, like me, you started working back in the olde days of standard-def, when dinosaurs walked the Earth, you may have done work on DPS Perception systems - PVR's - which were the simple(ish), cheap(ish) way to do broadcast quality back then. I was cleaning up around the house and discovered a bunch of old work archived on disks in the PVR format. It's really tough to get files off those disks because the format relies on hardware compression - you need the actual DPS Perception board to read them at high resolution. But I actually *had* one of these boards, stashed deep in my Pile 'O Computer Junk, and I was able to set up a system to download my files in the full 720x480 glory that was the late 90's. Then I thought to myself “Self; there are probably other people who might have a similar orphan disk with shots from the way back, you should tell them about this”. And so here we are. I have a DPS Perception system that can download your old PVR files. Maybe the last one of its kind, who knows? If you have some old Perception projects stuck on a disk, I'm happy to move them to new media for you. I have an NTSC board, but the conversion process only uses the compression hardware, so it may also work on PAL, who knows? Also - and this is a bit of a reach - I have a Reality board (DPV format) that I *might* be able to fire up if absolutely necessary - but no promises. Feel free to pass the offer along if you know someone in a similar spot. Also, I'm not familiar with most of the VFX and animation forums out there, so if you know a place where people are interested in this kind of stuff please feel free to forward this offer there, because it's still 2020 and we all gotta help each other any little way we can. But act soon. At some point in the near future there will be the Great Computer Crap Cleanout of 2021, and this system too will be gone. Act now or forever hold your files. You can contact me at steve-att-flywom-dott-com
  8. What kinds of sealing material are you looking for? I used to get soft round rubber material (for rebuilding the edge seals on Arri mags) from McMaster-Carr. They have a bottomless assortment of rubber products.
  9. Well, it's totally possible. We used this interleaving technique on Super8 to fill a projection screen with temporary material to be replaced in VFX, one frame on, one frame off. We also did some lightning effects in the Matrix sequels the same way at 72 FPS, one clean frame, a flash from the left, then a flash from the right, so the effect could be selectively flashed in post. In Super8 we used a Millennium at 48 FPS, and the flashes were generated with very bright LED's. That was 10 years ago; some of the LED's we have now are a dozen times more efficient. The trick is you have to have some sort of controller that can look at camera sync and gate the LED's. Film is ideal, but video will work if you have a full-frame shutter and some way to get/use a sync signal off the shutter - as opposed to trying to derive it from a video signal out, which has unknown delay. Since the LED's have to turn on and off very fast, you're probably talking about custom circuitry (that's what we did on Super8), but if you're lighting hi-gain retroreflective suits, your lights might not need to be that huge, mostly just a big ringlight at the camera. The timing is tight (slop measurable in a dozen microseconds), but it's not rocket science. The single biggest issue is having a solid camera sync. Not all cameras are friendly in this regard. As far as physical effect, this is fairly distracting - think working under badly flickering fluorescents - but people will get used to it, especially if it's just a small, lens mounted source that's not throwing much practical effect. You may, however, trigger people with photosensitive epilepsy. It should be noted that a lot of people who have photosensitive epilepsy *do not know* they have photosensitive epilepsy, and this is a bad way to find out. On one shot in the Matrix sequels we had a crew member suddenly lock up like he was made of wood and fall right over. Scary. That said, the ideal frequency for triggering people is 10-15 flashes/sec, at 24FPS you're getting out of the zone, and by 30FPS you're probably safe-ish, but put it on the call sheet anyway.
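     A minimal sketch of the frame-interleave bookkeeping (the 3-frame cycle matches the 72 fps lightning example above; the function is illustrative only - a real rig would run this logic on a controller slaved to the camera's shutter-sync pulse):

        # Illustrative frame-interleave schedule: a repeating 3-frame cycle of
        # clean / left-flash / right-flash, so each lighting state lands on every
        # third frame and can be isolated in post.
        from collections import Counter

        CYCLE = ("clean", "flash_left", "flash_right")

        def lighting_state(frame_index):
            """Which lighting state a given camera frame should get."""
            return CYCLE[frame_index % len(CYCLE)]

        # One second of 72 fps footage yields 24 effective fps of each state.
        print(Counter(lighting_state(i) for i in range(72)))   # 24 of each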
  10. I had to do something like this once, with some food photography where the camera had to be in place over steaming mashed potatoes for a long time, and a hair dryer is a good tip. Don't forget that steam condenses on relatively cool surfaces. We used a piece of glass held in front of the mattebox, and heated it with a hair dryer between takes. We kept it pretty hot, and consequently, the steam didn't seem to have all that much tendency to condense on it.
  11. I don't think it was common knowledge, but Kodak actually used to be pretty good about custom runs. I was on a project years ago, before the era of common high-speed stocks, and we had them cook up a batch of what was, essentially, their new TMAX800 still stock with a rem-jet backing and perfed for cine cameras. I don't think we had to do a ridiculously large run, either. IIRC, it was maybe 80 or 100 thousand feet. And, of course, they were always good about the various specialty VFX stocks, many of which were made in small runs and really weren't part of the "official" catalog. But I think the key back then was that they maintained several coating lines, and could make individual "master rolls" (I think that was the term) that were about 5 feet wide and a couple of thousand feet long. If I recall correctly, all but one coating line is gone now, and the one line that's left in Rochester is tooled to make and use enormous rolls tens of thousands of feet long, so they really don't have all that much capability to do specialty runs anymore.
  12. Yes. The engravings were filled in with black paint to eliminate reflections. Reflections on... well, everything. The interior surface of filters, the piece of glass you have to shoot through once in a while, mirrored sunglasses, anything dark and glossy right in front of the lens where seeing the letters "YNAMREG NI EDAM - 3.1*T SSEIZ" slowly rotating through the frame might be less than ideal.
  13. If I recall correctly (and this is going way back) those cameras were wired for both 28VDC and 115VAC. There are actually different power cords that are used for each voltage. These pick up different pins on the body connector and feed through the camera to a connector where the motor mounts. The motors were interchangeable; there were options for AC and DC motors, configured for various default speeds. The 28V motors use different pins than the 115V motors, so it all sorted itself out. That being said, the fact that you've got a 400Hz motor is probably going to be the biggest stumbling block. 400Hz is the frequency standard for aircraft, chosen because magnetic devices can be much smaller, and therefore lighter, at higher frequencies. You'd think the motor would just run slower, but it's more complicated than that, because some motors use line frequency in a strategy to control current flow in the motor windings. Depending on the type, AC motors can be quite sensitive to line frequency, and not in a good way.
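     The frequency sensitivity is easiest to see for synchronous (and, approximately, induction) motors, where speed tracks line frequency; a rough sketch with an assumed pole count:

        # Synchronous speed of an AC motor: rpm = 120 * line_frequency / poles.
        # The 4-pole figure is just an assumed example; the real motor's pole count may differ.
        def sync_rpm(freq_hz, poles):
            return 120.0 * freq_hz / poles

        print(sync_rpm(400, 4))   # 12000 rpm at the 400 Hz aircraft standard
        print(sync_rpm(60, 4))    # 1800 rpm on 60 Hz mains - far slower, and that's before
                                  # the current-control problems mentioned above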
  14. I once owned one of these cameras. As I recall, it was a great little machine. I liked it a lot better than the ubiquitous SR's. That being said, I have to get in the way-back machine to remember much about how to use the thing.

     If I recall correctly, they were really easy to switch between regular and super-16, though realistically, nobody did that. If you have a S16 gate you're OK for both formats; there's no reason to try to switch back to a regular gate. The difference between Reg16 and S16 is that the S16 frame is somewhat wider - stretching into the space where the second set of perfs would be in Reg16 - and slightly taller - there's very little frameline between S16 frames. Additionally, S16 frames are centered differently on the film, so you have to shift the lenses over a millimeter or two so they're back over the frame center, or zooms will drift and wide lenses will keystone. But you don't have to change the gate; if you shoot Reg16 with a S16 gate the gate will be wider than necessary, but you'll just expose some “overspill” image onto areas of the frame that will be ignored later anyhow.

     As for recentering the lens (again, it's been a while), I recall that you removed the lens mount (there are about 4 screws on it), flipped it upside down, and reinstalled it. The lens mount and body were machined with a few millimeters of eccentricity built in, so when you rotated the mount it shifted over into the right position. (A similar mechanism is used on contemporary Arris to convert from Academy to Super-35.) You could tell at a glance which way the mount was set: if the locking ear was on the operator side of the camera it was set for regular 16 centering, if the locking ear was on the off side of the camera it was set for super16 centering.

     The viewfinder also had to slide over to match. If you look carefully, you'll see that there are elongated slots where the finder mounts, and the recess in the body that takes the finder is 2mm wider, allowing the finder to slide back and forth. This gap is always on the same side as the lens mount ear. My camera had a small, L-shaped key that filled this slot. You moved it from side to side when you moved the finder. I'm fuzzier on the ground glass; I don't remember if you replaced the glass or just slid it back and forth.

     I think I might have a pdf of the LTR manual somewhere; I can dig it up. If you need one, PM me with an e-mail address and I'll send you a copy.
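     The "millimeter or two" figure follows from the aperture widths; a back-of-envelope sketch (the SMPTE camera-aperture dimensions below are quoted from memory, so treat them as approximate):

        # Rough recentering arithmetic; aperture widths quoted from memory, approximate.
        REG16_WIDTH_MM = 10.26
        S16_WIDTH_MM   = 12.52   # extends into the area of the second perf row

        # If the two apertures share the same edge on the perforated side, the optical
        # centre shifts by half the width difference.
        shift_mm = (S16_WIDTH_MM - REG16_WIDTH_MM) / 2
        print(round(shift_mm, 2))   # ~1.13 mm - the lens and finder recentering described above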
  15. After a day of trying to pull focus at f0.7, your AC is going to murder you in your sleep and nobody will have the heart to convict him.
  16. Oops - my bad. When I wrote the last post this morning I was multitasking, glanced quickly at the pictures, didn't think too hard about what I was typing, and got it totally wrong. The mount shown is, of course, a Standard mount, not a B-mount. The Standard mount was Arri's first interchangeable mount. The B-mount was a later introduction (60's?) to solve some of the problems with the Standard mount (soft materials, poor rotational indexing, poor seating repeatability). The B mount was in turn superseded by the PL mount. Many Angenieux lenses, like this one, had a tail end configured to take multiple, interchangeable (with collimation) mounts. The rest of the post should be accurate. Both the B and Std mounts had a 52mm backfocus, so this lens could, in theory, mount onto an EOS with a backfocus of 44mm, provided it could clear the mirror box inset, but I've never seen such an adapter. The lens would probably have marginal coverage at the wide end.
  17. You have an old Angenieux zoom which probably dates to the early to mid 70's. This lens has a "universal mount" scheme: the tail end was threaded, and there were several camera mounts which could be installed, depending on which camera you were using. The actual lens mount here is an "Arri B" or "Arri bayonet". This was Arri's first bayonet mount; IIRC, it was used from the early models up to the BL2 or so. There were probably some mounting shims between the mount and the lens body at one time. Since they are gone, it's a crapshoot whether the lens is properly collimated and would hold focus during a zoom. Many of the B mounts were made of aluminum, as is yours; since this is a soft metal, they tend to pick up nicks and dents with use, which makes mounting imprecise. The B mount was superseded by the PL mount in the 80's or so. The PL mount is much more rigid and, importantly, much more repeatable. As to your question about using this on your 60D, the answer is yes, it is probably technically possible. The B mount has a flange focal distance of 52mm, while the EOS mount has one of 44mm, so assuming the tail of the lens clears the mirror box, there is probably enough space for an adapter. However, good luck finding a B-to-EOS adapter; I've never seen one. If you were good in machine shop, you could probably lose the B-mount part and make a pretty rigid adapter to pick up the threads on the tail of the lens. The lens probably won't cover the Canon frame at the wider focal lengths.
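     The depth budget for such an adapter is just the difference between the two flange focal distances; a sketch (it ignores the separate mirror-box clearance question):

        # Flange-focal-distance budget for a hypothetical Arri-B -> Canon EF adapter.
        ARRI_B_FFD_MM   = 52.0   # Arri standard/bayonet flange depth
        CANON_EF_FFD_MM = 44.0   # Canon EOS (EF) flange depth

        adapter_depth_mm = ARRI_B_FFD_MM - CANON_EF_FFD_MM
        print(adapter_depth_mm)  # 8.0 mm of room for the adapter body - tight, but positive,
                                 # which is why such an adapter is "probably technically possible"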
  18. >> In the 84-year history of Oscar®, no Academy Award®-winning best picture has ever been made without motion picture film. True. But for better or worse, the writing is on the wall. Hugo won best cinematography in 2012 with digital origination, and before that, Avatar won in 2010, and "best cinematography" is arguably more intimately tied to capture media and overall cinematic look than "best picture", which is an amalgam of story, acting, and (typically) perceived importance.
  19. >> But then I saw the Samyang lenses and now I'm wondering if these are
     >> any good. These are faster, with de-clicked aperture primes which are
     >> pretty cheap as I can buy the 14mm, 24mm, 35mm and 85mm for about 1800$.

     I've had some experience with the 14mm. We had several on the last project I worked on (a stop-motion animated feature). For a $300 lens, the 14 was surprisingly well built, surprisingly sharp, and surprisingly contrasty. It did, however, have truly wicked distortion. Over an APS-C sized area it was simple barrel distortion, and reasonably manageable if you didn't have a lot of straight lines giving the game away, but then it distorted back the other way into significant pincushion over the larger 8-perf frame, so if you had lines you'd get a weird triple bend as they ran through the frame.
  20. If you look at the film on a light table, is the color cast this bad, or has it gotten worse in transfer?
  21. It's (mostly) a B mount. You can tell because of the two milled slots for the mount "ears". As Dom notes, this is a low-end mount. The better ones have a nicer locking mechanism and can stay on the camera as a new mount. This one seemingly uses setscrews to retain the lens, and is meant to be attached to the lens and left there. This is OK so long as the screws either have soft tips or some sort of pressure pad underneath them; otherwise, a normal set screw (made of hard alloy) will leave a ding in the (relatively) soft stainless steel of the lens mount, and you obviously don't want that. Although this adapter clearly allows for B mounts, many set-screw type adapters could also take an Arri Standard mount lens, since the throat diameter and backfocus were the same. It's a matter of whether the setscrews are placed to get a good grip on the flange groove.
  22. Don't know if there's a standard. Every one I've worked on used the left eye as the master. We'd compose on the left, then converge on the right.
  23. >> I've just been playing around with a Canon DSLR, an old micro-Nikkor, a lightbox and some 35mm (stills) film, image reproduction is pretty good. You know what would work really well for this wouldn't be a micro Nikkor, it would be an old c-mount 25mm macro-Switar mounted backwards. I played with this a few years ago for some micro work with a custom bellows system, using reverse-mounted 16mm lenses. They don't have a big image circle, but if you're on a small subject they have phenomenal image quality over the area they do cover, and they're physically small enough to light around.
  24. Actually, yes, Dragon would be an ideal building block for this sort of thing. I do a lot of motion control and stop-motion work, and I've been around Dragon a lot. I actually looked at an application like this about a year ago. It's a good starting point for this kind of stuff, since it handles all the camera overhead plus has a reasonable motion control capability that lends itself well to step-and-repeat functions. You'll still have to make an "optical bench" of sorts, with an appropriate, reasonably well-registered movement and a light source with long-term consistency, but if you can hook a step motor up to it, then you won't have to re-invent the wheel as far as coordinating motor and camera and storing off the frames goes. Dragon is inexpensive - somewhere around $300 IIRC. You can roll your own motor electronics, but if you don't want to do that, Dragon sells a unit called an Iota controller for $750 that can do all the motor driving (the Iota controller was originally designed to drive an I/O slider and focus motor, but it can drive any two small step motors). Two caveats - first, you'll probably want to arrange the gearing so that a whole number of motor turns advances one frame (Dragon step motor axes don't like fractional steps in the current software rev), and second, you'll probably have to work in blocks of a few hundred frames at a time - because it's juggling all the frames in memory, the program can bog down on really long animations.
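     A sketch of the step-and-repeat loop itself (every call here is a hypothetical placeholder, not Dragon's actual interface; it just illustrates the whole-turns-per-frame and block-capture points above):

        # Hypothetical step-and-repeat loop for a film-digitizing rig; the motor and
        # capture calls are placeholders, NOT Dragon's real API.
        STEPS_PER_REV   = 200    # a typical 1.8-degree step motor
        TURNS_PER_FRAME = 2      # gear the transport so a whole number of turns = one frame
        STEPS_PER_FRAME = STEPS_PER_REV * TURNS_PER_FRAME   # integer, so no fractional steps

        BLOCK_SIZE = 300         # work in blocks of a few hundred frames at a time

        def digitize_block(motor, camera, frames=BLOCK_SIZE):
            for _ in range(frames):
                camera.capture_frame()              # placeholder: grab one frame
                motor.move_steps(STEPS_PER_FRAME)   # placeholder: advance film exactly one frame
                motor.wait_until_settled()          # let the movement register before the next exposure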
  25. I'd call Tim Drnec at Spydercam (fxwest.com). I've done a few shots like this over the years with them. They have a couple of large, servo-driven winches set up, essentially, as large motion control systems. They can easily drive a 100m "clothesline" style cable that can pull both ways on the track. You program a start point, end point, and move profile. On cue, someone hits the button and the 10hp servo winch does its thing, pulling the dolly along the move profile. Light dollies at very high speed have to use captured track, but a normal dolly at reasonable speeds is heavy enough to stay on the track by itself.