Phil Rhodes

Everything posted by Phil Rhodes

  1. Not to break up the flow, but I'm just pleased there's actually a piece of real equipment called the Film-O-Clean. Sounds like the sort of thing Wile E. Coyote would purchase in pursuit of the Road Runner.
  2. Implement automatic exposure compensation for the varispeed! Also, hand crank input! Congrats on getting to this point. I'm somewhat aware of the sheer work that goes into it.
  3. Matthew, I think what you're overlooking here is that yes, it's an automated process done by machines, but someone has to design, build, maintain and configure those machines. The thickness of the sensitised coating on a typical photographic film is in the single-digit micrometres (that's at least an order of magnitude larger than the smallest feature on something like a modern semiconductor, not that it's anything even remotely like the same process or in any sense directly comparable). Variability in the thickness of the coating will immediately affect the optical performance of the film. Laying down a coating of an essentially gelatine-based liquid to the required tolerances is a very, very long way from trivial. One micrometre is a thousandth of a millimetre, significantly less than even quite high-precision engineering tolerances, so even things like bearing wear in the mechanisms, or tiny variations in pump pressures, fluid viscosities, ambient temperature and humidity, can easily have ruinous effects. Modern colour negative has a lot of layers (several per colour, plus filters and separators), each of which has to be laid down with that sort of precision. Even to get to that point, you have to have made the plastic base to similar tolerances and mixed the chemistry correctly. Tiny variations in the composition will affect sensitivity and thus colour balance; they'll have a staff of PhD-level organic chemists doing tests on the raw materials they're buying in and trimming the process at every stage to suit, plus hugely qualified and experienced engineering staff keeping the required precision going. All of this has to be done in absolute darkness, without introducing dirt or contamination. Then you've got to cut it to the right width and punch the sprocket holes, both of which are dimensionally critical to image stability. No matter what anyone does, this is never going to be cheap. Notice China hasn't chased it.
  4. Northlight is a metal halide discharge lamp (or, now, an LED retrofit). I remember them looking very cool compared to room lighting of the time when I did some work for Filmlight in 2006-2007, though more with Baselight. Yes, they would absolutely fade over time, but I'm not sure incandescent has much liability there. Personally I like the idea of notch filtering a broadband source more than trying to mix it with LEDs. The amount of screwing around people have to do in order to get LEDs to be consistent enough is enormous. But maybe I'm thinking too much like a homebrewer, I don't know. P
  5. Sure, but you can mount the light somewhere away from the film. It'll necessarily be somewhat distant anyway, given the need for a variable RGB colour filtration box. That's how it was always done.
  6. Is there some reason people go for LEDs rather than incandescent, which I'd view as more stable and as having better colour quality? I guess you can change colours quicker with the LEDs than with a mechanical filter assembly.
  7. I may have overlooked a mention of this but does it have any sort of closing-contact output or open-collector output for automatically starting an external audio recorder? Probably a bit late to mention now!
  8. I think people are probably just using verbal shortcuts here; I'll be the first to accept that there are uses for things at all levels and I suspect, in extremis, many people would accept that. Certainly the market for less expensive gear is vastly larger than the one for the high end, and many manufacturers make a lot of money out of their simpler stuff.
  9. If you buy lenses with E mounts, the bulk of the adaptor will effectively be part of the lens. The optical design of the lens dictates that it is a certain distance from the sensor no matter what, and the adaptor is just a spacer. You might as well put the adaptor on the camera, and buy much more widely compatible lenses. I would never, ever spend significant money on lenses that wouldn't fit PL mounts (some, including the Xenons, have interchangeable mounts). Duclos has a mount kit for the Xenon FF which runs about $450 each as I recall, though you can get simple E to PL adaptors much cheaper than that. Yes, the CP.2 will cost more. Possibly they would hold their value better, but possibly not by much. And if you really don't care about ever renting them out, you can get whatever you want. I'd at least look at the DZOFilm stuff, maybe side by side with the Xenon. For personal projects, you don't need to impress someone with a name.
  10. I had a set on test for a bit, and I used them very briefly once at Canon's demo suite in Los Angeles on their new full-frame camera, so I can't pretend any huge expertise, but I've had them in my hands. It's a lot of lens for an A7s but I suspect any PL lens purchase will hold its value reasonably well. They are well made and will satisfy camera assistants. As I recall - and I could be wrong - the original design of the 35mm was significantly less good than the rest of the range, and there is an updated design. Double check this is true before asking awkward questions, but if it is true, then make sure you're getting the recent version of the 35mm. Or whichever focal length it is. The only consideration I'd have with this is whether you're throwing away a lot of autofocus capability, the usefulness of which will vary with what you're shooting. I honestly don't know how good it is on that camera, although presumably you're not going to drop five figures on glass to use it on one body, and compatible Sony lenses are presumably a lot less expensive so you could have both. I have seen DZOFilm used and they also seemed fine. Personally I think the choice might come down to going really inexpensive with those, middling with the Schneider, or stretching for the CP.2. I think the Zeiss might be the best long-term bet. Are you looking to use them, rent them, what? P
  11. I always think if people are going to use names like that, they ought to have a more middle-of-the-road offering, so you might have Master, Ultra, Supreme, Middling, Everyday and Mundane, for those moments where you're shooting a BBC drama and the entire lighting package is a 2K into the ceiling. (I'm being a bit unfair. In this modern Netflix-enabled world, they don't do that nearly so much anymore. I just have painful late-90s memories of various mindbendingly-dull whodunnits which always seemed to take place in a special version of contemporary England which had been twinned with 1955.)
  12. To be fair, they were always mirror less when we were using them to shoot video.
  13. Broadcast is easier. Take HJ17e×7.6B IRSE:
      HJ - 2/3" portable ENG/EFP
      17 - zoom range multiplier
      e - enhanced features such as digital zoom control
      7.6 - shortest focal length
      B - means "optical adjustment." I have no idea what this actually means. Possibly adjustable back focus.
      I - with built-in extender
      R - servo zoom, no servo focus
      S - servo iris
      E - with digital drive (ability to store and recall focal lengths, etc.)
      Compare KT20×5B KRS, which is a 1/3" 5-100mm zoom without an extender, with servo zoom, manual focus and servo iris.
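Purely as an illustration of the breakdown above, here's a rough sketch of how you might decode designations like these mechanically. The regex, field names and option table are my own reading of the scheme as described, not anything official from Canon, so treat it as a toy:

```python
# Toy decoder for Canon-style broadcast zoom designations, following the
# field order described above. The pattern and labels are my own guesses.
import re

PATTERN = re.compile(
    r"(?P<series>[A-Z]{2})"     # HJ = 2/3" portable ENG/EFP, KT = 1/3", etc.
    r"(?P<ratio>\d+)"           # zoom range multiplier
    r"(?P<features>[a-z]*)"     # e.g. e = enhanced features
    r"[x×]"
    r"(?P<short_fl>[\d.]+)"     # shortest focal length, mm
    r"(?P<adjust>[A-Z]?)"       # B = "optical adjustment", whatever that is
    r"\s*(?P<options>[A-Z]*)"   # trailing option letters
)

OPTIONS = {
    "I": "built-in extender",
    "R": "servo zoom, manual focus",
    "S": "servo iris",
    "E": "digital drive (stored/recalled focal lengths, etc.)",
}

def decode(name):
    m = PATTERN.match(name)
    if not m:
        raise ValueError(f"not a designation I recognise: {name}")
    short_fl = float(m["short_fl"])
    ratio = int(m["ratio"])
    return {
        "series": m["series"],
        "focal_range_mm": (short_fl, short_fl * ratio),
        "options": [OPTIONS[c] for c in m["options"] if c in OPTIONS],
    }

print(decode("HJ17ex7.6B IRSE"))  # 7.6-129.2mm, extender, servo zoom/iris, digital drive
print(decode("KT20x5B KRS"))      # 5-100mm, servo zoom, servo iris
```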
  14. Lens names have always slightly confused me, and I understand there is really no universal, formal meaning beyond a bit of branding. Distagon? Planar? I'm thinking also of the Meyer-Optik stills stuff made in the 60s, which had names like Oreston and Orestor, for maximum confusion, and Orestor was used for both the 100mm f/2.8 and 135mm f/2.8 lenses. I assume they're referring to a specific optical layout which they bent to do both focal lengths, or something. At least Olympus had some sort of system for their stills lenses; the letter indicates the number of elements, or something like that, so an F.Zuiko has six elements.
  15. As a sort of social experiment, what would anyone think of these colours, in comparison to the ones which received such a glowing reception above?
  16. That green probably isn't even a particularly good representation, on a website that's being presented to you in sRGB.
  17. It seems to be doing at least some variation on what a lot of similar things do - contrasty cool shadows, low contrast warm highlights. There's a point to be made about what modern film actually looks like, too. Modern camera negative is very low contrast because they were trying to compete on dynamic range, and they easily could. What that actually means is very, very flat, which of course is exactly what most people don't want when they go for film (or film emulation). It's also very often digitally stabilised, so it's very, very stable, and if you shoot 200 speed stock it's pretty low grain, and then you can degrain it - and that's before we even get to the point of being Christopher Nolan and shooting 5/65. The better we do film, the less it actually looks like film. Which, of course, was always the point. Before we replaced film, its faults weren't cool.
  18. I don't know and Kodak won't tell you, but it's reasonable. It's not just slop the gloop on the plastic ribbon. The chemistry has to be mixed and coated on there with incredible precision, or the density of the pictures would be all over the place. Even black and white film now has a handful of layers to control its contrast, while colour films have a handful of layers per colour. I'm speculating, but I'd be fully prepared to believe there's an unavoidable need for some pretty qualified process control people and some very exacting equipment maintenance. I think the real risk for Kodak is someone in China deciding to do it. As Ferrania found, setting up to make colour film is extremely non-trivial, even when you're an ex-manufacturer with most of that manufacturer's facilities and people available to you, but it's not impossible.
  19. OK, let's do the numbers. A 400-foot roll of 35mm film has an area of (400 × 12 × 25.4 × 35) ÷ 1,000,000 = 4.267 square metres. This overlooks the value of the silver in any material punched out to create sprocket holes; I bet that's recovered, although the cost of doing so will affect the value of recovered silver in ways we can't predict. Assume the price of silver is about US$0.742 per gram, and assume 5g of silver per square metre. 4.267 × 5 × 0.742 = 15.83, so call it $16 of silver per roll. B&H charges $327.50 for a 400-foot roll of 5219, so even if we found an ideal, no-drawbacks replacement that was free, it would make film less than 5% cheaper.
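For anyone who wants to argue with the assumptions, here's the same back-of-envelope sum as a few lines of Python; the silver price and coating weight are the guesses from above, not Kodak figures:

```python
# Back-of-envelope value of the silver in a 400ft roll of 35mm negative.
# Silver price and grams-per-square-metre are assumptions, not Kodak data.
FEET = 400
WIDTH_MM = 35
SILVER_USD_PER_G = 0.742   # assumed price per gram
SILVER_G_PER_SQM = 5.0     # assumed coating weight
ROLL_PRICE_USD = 327.50    # B&H price for a 400ft roll of 5219, as above

length_mm = FEET * 12 * 25.4
area_sqm = length_mm * WIDTH_MM / 1_000_000
silver_usd = area_sqm * SILVER_G_PER_SQM * SILVER_USD_PER_G

print(f"area:           {area_sqm:.3f} m^2")                  # ~4.267
print(f"silver value:   ${silver_usd:.2f}")                   # ~$15.83
print(f"share of price: {silver_usd / ROLL_PRICE_USD:.1%}")   # ~4.8%
```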
  20. I'm also not very sure if replacing the silver is likely to reduce the cost significantly. Do we know how much silver there is in a foot of film? Even if we found an ideal low-cost replacement, the manufacturing process is still hugely complex and demanding of exquisite precision, not to mention based in a high-income country. Could it ever realistically be cheap?
  21. Not so many of those anymore. And no more than trace silver in colour film, anyway. It's washed out at the bleach stage. If it's bleach-bypassed or proper old-style black and white, then there's silver in it, but not in normal colour film. And yes, they recover the silver from the bleach, including just running old black and white scrap film through the bleach bath to strip out the silver.
  22. Procedural grain emulation is hard from the ground up. Any reasonable example is going to involve generating some random noise as a seed for it, and generating high quality random noise is surprisingly hard work for computers. The noise filter in After Effects is not fast. Visualising the results as an image is a common way of analysing the quality of a random number generator, since problems tend to show up as patterns in the image, which is exactly what we can't have for this sort of application. Good random numbers: [image]. Bad random numbers: [image]. Personally I don't think that doing a physically accurate simulation is particularly important. After all, shooting a grey card and scanning it is a physically accurate simulation, and depending on how it's composited with the original it can look absolutely horrible. I have noticed that the sort of grain people seem to like actually isn't that realistic anyway; it's often a lot less colourful than real grain. Fashionable grain simulations actually end up looking like the sort of grain created by ENR or bleach-bypass print processing, which introduces a sort of black cellular noise to the image based on the chunks of silver in the image. It's my observation that real grain on colour film looks a lot more like video noise than people like to admit. I base this on the fact that I once spent the best part of a year sitting in a suite at a manufacturer of film scanners and grading equipment using, as test material, some of the original scans from a film which was Oscar-nominated for its cinematography, but which had been shot super-35 on 500-speed film. Having spent many hundreds of hours pixel-peeping that material on quality displays, I think I can claim some quite close familiarity with what film grain looks like. P
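To make the "patterns show up in the image" point concrete, here's a quick sketch (assuming numpy and Pillow are to hand) that writes out a frame of noise from a decent generator next to one from a deliberately crippled LCG; the low bits of the bad one repeat so quickly that its image comes out visibly striped:

```python
# Visualising random number quality: a decent generator vs. a deliberately
# bad LCG whose low-order bits repeat every 256 samples, so the "noise"
# image comes out visibly striped. Assumes numpy and Pillow are installed.
import numpy as np
from PIL import Image

W = H = 512

# Decent noise: numpy's default PCG64 generator.
good = np.random.default_rng(1).integers(0, 256, size=(H, W), dtype=np.uint8)

# Bad noise: a power-of-two-modulus LCG, keeping only the low byte.
def bad_lcg_bytes(n, seed=1):
    out = np.empty(n, dtype=np.uint8)
    state = seed
    for i in range(n):
        state = (1103515245 * state + 12345) % 2**31
        out[i] = state & 0xFF          # low bits have a tiny period
    return out

bad = bad_lcg_bytes(W * H).reshape(H, W)

Image.fromarray(good, mode="L").save("good_noise.png")
Image.fromarray(bad, mode="L").save("bad_noise.png")
```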
  23. In a desperate attempt to bring this to some sort of worthwhile conclusion: Chance, I don't think anyone here wants anybody to be treated unfairly, or for anyone to be unpleasant to anyone else. The reason this stuff starts to lose its audience, though, and I suspect the reason you're finding a slightly tough crowd here, is that you're using an example of someone losing a court case as evidence of some grand conspiracy within society to support that person's position, whereas in reality it's evidence of exactly the opposite. You risk giving the impression that you're expecting the rate of prejudice and bigotry in society to be zero. It's obvious that no such society has ever existed and none ever will, and the fact that we don't live in that society is no evidence of a shadowy movement to disadvantage certain groups of people. We push for utopia in all kinds of ways; by inventing things, by building things, by making art, by exchanging ideas. We do that, though, in the knowledge that it will forever remain an unscratchable itch. We may approach it, like a spider climbing a well which advances only half the distance to the top every day, but if the court case described in this thread is evidence of conspiracy to you, you're going to spend the rest of your life tilting at windmills. - Phil
  24. In an ideal world one would very slightly distort the image underlying the grain layer to approximate the way the regions of colour are made out of individual crystals. This can be roughly approximated with a compound blur which makes the underlying image blur in areas where the grain layer is brightest. Also, luma key the grain so it appears mainly in the darker areas. For instance, based entirely on basic tools available to everyone (a scripted sketch of the same steps follows below):
      Original image (all shown at 200%): [image]
      Build the grain layer. Start with noise: [image]
      Blur slightly: [image]
      Sharpen to create grain boundaries, and desaturate. Optionally, at this stage, scale; smaller generally helps, and blue should be larger than red: [image]
      Luma-keyed blur layer: [image]
      Inverse luma-keyed grain layer: [image]
      Composite with appropriate transfer modes and trim intensity to taste. Before: [image]
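And, as promised above, a rough scripted version of that recipe using numpy and Pillow. It's my own loose translation of the steps, not a calibrated tool; every radius, scale and strength value here is a guess to be trimmed to taste:

```python
# A loose scripted version of the grain recipe above (numpy + Pillow).
# All radii, scales and strengths are taste parameters, not measured values.
import numpy as np
from PIL import Image, ImageFilter

def make_grain(w, h, seed=1, scale=0.6, blur_px=1.0):
    """Noise -> slight blur -> unsharp mask to form grain boundaries,
    generated oversize and scaled down so the grain ends up smaller."""
    gw, gh = int(w / scale), int(h / scale)
    rng = np.random.default_rng(seed)
    noise = (rng.random((gh, gw)) * 255).astype(np.uint8)
    img = Image.fromarray(noise, mode="L")          # already desaturated
    img = img.filter(ImageFilter.GaussianBlur(blur_px))
    img = img.filter(ImageFilter.UnsharpMask(radius=1, percent=300))
    img = img.resize((w, h), Image.LANCZOS)
    return np.asarray(img, dtype=np.float32) / 255.0

def apply_grain(in_path, out_path, strength=0.15, blur_radius=1.5):
    base = Image.open(in_path).convert("RGB")
    w, h = base.size
    base_f = np.asarray(base, dtype=np.float32) / 255.0
    luma = base_f.mean(axis=2, keepdims=True)

    # One desaturated grain layer; per-channel scales (blue coarser than
    # red, as suggested above) are left out to keep this short.
    grain = make_grain(w, h)[:, :, None]

    # Crude "compound blur": mix in a blurred copy of the image where the
    # grain layer is brightest, so bright grain slightly softens the image.
    blurred = np.asarray(
        base.filter(ImageFilter.GaussianBlur(blur_radius)), np.float32) / 255.0
    mix = grain * 0.3
    base_f = base_f * (1.0 - mix) + blurred * mix

    # Inverse luma key: grain biased towards the darker areas, then trimmed.
    out = np.clip(base_f + (grain - 0.5) * strength * (1.0 - luma), 0.0, 1.0)
    Image.fromarray((out * 255).astype(np.uint8)).save(out_path)

# apply_grain("frame_before.png", "frame_after.png")
```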