
Stephen Baldassarre

Everything posted by Stephen Baldassarre

  1. In rereading my last post, I made some errors. It's been a really long weekend! For instance, if I remember correctly, the "shadows" of a BMPCC are handled by 30dB high-gain channels in the sensor, in parallel with the unity-gain channels, which equates to +5 stops. Using the global shutter mode disables the high-gain channels because of bandwidth/processing limitations, so you are left with 7 stops, not 6. Anyway, I made a few other goofs like that, but the principles are correct.
  2. You are correct, we have gotten rather off-topic. I really have considered buying a used BMPCC and after-market OLPF several times, but there are still other things about it that bother me. CCDs are natively global shutter. It has nothing to do with the camera's processor. All photosites on the sensor simultaneously become photo-sensitive and simultaneously dump their charges into neighboring capacitors, where they are then sequentially sent to the output. With CMOS sensors, each row sequentially becomes photo-sensitive and sequentially dumps directly from the amplifiers. I am aware of several software solutions to rolling shutter, but the best way to fix a problem is to not have the problem in the first place. You can still see jello-cam shots even on high-budget productions, especially where there's high action or flashing lights. I've used one such plugin, and if you don't have it set up perfectly, it can even make the problem worse. The fact that there are no on-board amplifiers on CCDs also allows the holes to be larger, allowing slightly higher native sensitivity and lower aliasing. A 3-CCD camera can almost get away with not having an OLPF because there's virtually no space between the holes, and I have in fact gotten decent images out of a 3-CCD camera that was missing its OLPF (mix-up at the repair facility). With a 3-CMOS camera of otherwise equal specs, aliasing is more noticeable because more detail falls between the holes, almost like having a UHD camera but only reading out half the pixels. Very true, I was just hoping there would be ONE HD camera out there for $1,000 that had global (or close to it) shutter and a proper OLPF. My requirements are pretty minimal, but it seems everybody has become blind to the issues that bother me the most. See, I thought Digital Bolex was the one manufacturer that DIDN'T make the same compromises as everybody else. Global shutter, proper OLPF, raw capture, great audio, no frills.
It does have a viewfinder, that screen on top. It isn't very good but it's good enough to frame the shot, which is all you need. I shot with it for two days, both indoors and outdoors and never used an EVF, which would have been easy enough to add. Incidentally, the viewfinder was originally going to be a black & white LCD that swiveled to be 90 degrees to the ground. It was Kickstarter backers that insisted it be color, which required a more powerful processor to demosaic and white balance the preview. That, along with other added features, meant there wasn't enough room in the compartment to allow it to swivel. That's your fault, sorry. The CCD's color filters are so pure, the natural output is well beyond the gamut of most video formats. You have to pull back the saturation to fit within the color space of your environment. This is exactly what I meant when I said most people don't know how to handle it. You're used to BM's sensors, which have very weak color dyes and thus, fairly low saturation. If you simply apply a demosaic algorithm and gamma correction to BM's raw files, you get a very flat, gray image. The D16's output also has no contrast "curve", just a gamma transform, so you have to adjust the head/toe curves yourself to get soft clipping like what's baked into the BMs' sensors. See, 200 is as high as it gets for me. The studio cameras I used to use were natively 400, which was OK because we lit the room specifically for those cameras, but they really sucked when we took them outdoors. My favorite thing to do when I was shooting film regularly was to load in Vision2 100T for entire shoots, exposed as 50T. Outdoors, it would be about 25 ISO with an 85A filter, which put me at F11 if I wasn't using an ND filter. The D16 has true analogue gain. Before the last firmware update, they let you crank it up to 800 ISO. 
For some reason, they switched to a "BMPCC" mentality with the last update, so when you set it to "800", you were merely changing metadata rather than actual gain. Luckily, the owner of the D16 I borrowed ignored the last update. Anyway, it looked good at 400 ISO, a little noisy at 800, but a little DNR applied to the shadows cleaned it up nicely. Remember, all modern CMOS cameras have DNR built into the sensor itself. Some cameras have more DNR built into the processor, which you may or may not be able to bypass, but the sensor always has it, unless shooting in global shutter mode. I think I said elsewhere that the BMPCC's sensor has a global shutter mode but using it would mean about a 6-stop dynamic range! OK, so smaller holes have less volume for collecting light, but the noise floor stays the same. Thus, there's a lower signal to noise ratio on the output. The eye can pick out details below the noise floor but most cameras automatically clip the output right above the noise floor, hence the loss in latitude. In more technical terms, let's say you have a sensor found in the typical broadcast camera. Its 5uM pixels would have a 65dB S/N ratio, or roughly an 11-stop range, and a native ISO of about 400. A UHD sensor of the same size has 2.2uM pixels, which is roughly 1/5 the light-gathering area per pixel, and a native 100 ISO. The noise output is the same as that of the HD sensor, so you now only have about a 51dB S/N ratio and have lost more than two stops of latitude in the process. On that note, my phone has 1.5uM pixels and is natively 50 ISO. HDR mode is a MUST in order to keep any shadow information. That comes with costs of its own. Price increases with the square of the linear size, i.e. with sensor area. So, a 2/3" HD CCD may cost $300. If one were to use UHD, it would need to be M 4/3 sized to avoid loss in dynamic range, which would be $900 each. You also have much shallower DoF, more limitations on lenses etc.
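A quick back-of-envelope check of the dB/stops arithmetic above (these helper functions are mine, purely illustrative; the assumptions are that one stop is a doubling, about 6.02dB on a voltage-style S/N figure, and that light gathered scales with photosite area):

```python
import math

DB_PER_STOP = 20 * math.log10(2)  # ~6.02 dB per stop (voltage ratio)

def db_to_stops(db):
    """Convert a voltage-style S/N figure in dB to photographic stops."""
    return db / DB_PER_STOP

def area_ratio(pitch_a_um, pitch_b_um):
    """Relative light-gathering area of two square photosites."""
    return (pitch_a_um / pitch_b_um) ** 2

# A 65dB sensor works out to just under 11 stops:
print(round(db_to_stops(65), 1))  # 10.8

# A 2.2uM photosite gathers roughly 1/5 the light of a 5uM one,
# which (noise floor unchanged) costs a bit over two stops:
r = area_ratio(2.2, 5.0)
print(round(r, 2))              # 0.19
print(round(-math.log2(r), 1))  # 2.4 stops lost
```

So by this arithmetic the broadcast-vs-UHD comparison comes down to the photosite shrink alone costing a bit over two stops, before any processing tricks.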
I think M4/3 is a good format, very close to flat 1.85:1 35mm, but you would HAVE to use CMOS to keep the price under control, and we're back to our rolling shutter problem. You can add a mechanical shutter like some of the Alexas have, but that's also very expensive. Note that in order to meet demands for a higher resolution camera, ARRI introduced their 6K 65mm Alexa to avoid loss of latitude. That's a difficult format to shoot. Good to know. IMO, it shouldn't be optional for such expensive cameras.
  3. I was originally going to reply directly to comments in quotes but it was getting too long and complicated, so I'm just going to touch on some of the ideas addressed in the replies. The BM cameras do alias, badly, and using "soft" glass is not an acceptable solution. I don't see rolling shutter or alias distortion on film or professional video. Film's color rendition is much closer to the human eye as well. Most digital cameras don't pick up deep blue or red well, and the green band tends to be broader, due to green carrying most of the resolution. The dynamic range of a sensor does matter even in Rec709. A 10-stop sensor, when properly encoded, will look much better than an 8-stop sensor. That said, the BMs are not suitable substitutes for film, nor professional video, despite claims to the contrary. Yes, the story is more important, but if that's all that mattered, we'd all be shooting on cell phones. It's all about compromise in the lower price range. It's just a matter of WHAT compromises you want to make. The sensor the BMPCC uses costs a lot of money, and I do believe they used the best CMOS sensor possible at the time. They easily could have eliminated various features to improve other things though. While I find the compromises in low-end cameras unacceptable, I understand why they made those compromises. People value features over quality. Not me; I'd gladly accept a simple passive lens mount with a 2/3" 1920x1080 global shutter sensor fixed at 200 ISO, 24fps, recorded in MJPEG at 200Mbps with a black & white LCD viewfinder and no audio. There's no reason that can't exist in the $1,000 price range and have a proper OLPF. As for the variable latitude of the BMPCC, it doesn't actually have gain or adjustable ISO. It's natively about 400 ISO and everything else is metadata telling the software what to do with it.
Treating it as if it's 800 may add a stop in the highlights but you lose a stop in the toe, while shooting at 200 gives you a stop in the shadows but you lose highlights. When using actual gain, this isn't the case. You get more noise using higher ISOs, but no latitude is lost in the shadows. It's cheaper to use fixed-gain architecture though. As for color space etc., I've shot green screen with 8-bit 4:2:0, 4:1:1 and 4:2:2, as well as 10-bit 4:2:2 uncompressed. 4:1:1 is extremely difficult but it can be done. There isn't that much difference between 4:2:0 and 4:2:2, and I didn't notice any difference with 10-bit vs 8-bit. The optical path makes a HUGE difference though. Soft lenses are almost impossible to composite without matte lines, and if they have chromatic aberrations, forget it. The D16 was a very expensive camera to make. The CCD and OLPF alone cost THEM about $700. Then you have to add in the amplifiers, the ADCs, enterprise-class SSD, custom FPGA, HDMI board etc. Don't forget it had audio capabilities akin to a $500 stand-alone field recorder, with high-quality preamps, XLR inputs and true analogue gain control like you would find on a professional system (not a $200 Zoom). Unfortunately, they listened to their Kickstarter backers too much and had to redesign the camera almost from scratch to cram all the extra features into it. I estimate the redesign added about a year and $500 per unit to the cost of the D16. The D16 really doesn't have a look of its own. The CCD's output is digitized at 16-bit linear, converted to 12-bit Gamma-1 and stored. The rest is up to the user. The BMPCC, on the other hand, has image processing built right into the sensor itself, including noise reduction and FPN cancellation. If you want that in a D16 image, you have to do it yourself. Anyway, the D16 sold better than they expected, but issues with several parts manufacturers in China and a price hike on other parts made continuation impractical.
Their margin was already low and they couldn't raise the price because people already claimed it was too expensive compared to the BMCC. So, they decided to get out of the business. I tried to convince them to go a different direction in the year leading up to that decision; use one of Sony's new global shutter sensors with a simplified FPGA, but it fell on deaf ears. People want "4K" because marketing told them to want it. These are the same people that insist on phone sensors being 18MP when the lens can't resolve more than 6MP or so. Marketing people have the masses convinced that UHD and 4K are the same thing (4K is a theatrical format) and that an LCD screen can really have a 10,000:1 contrast ratio. Professional HD cameras will produce sharper, cleaner results than a semi-pro UHD camera. You can easily bump well-done 1080 up to 2160 and nobody will know it didn't originate as 2160. Video streams are so heavily compressed that the difference in bandwidth has more effect on the image than the actual number of pixels. That said, you can very well bump 1080 up to 2160 and get a noticeable improvement in image quality simply because more data reaches the screen. The lens is the limiting factor in most cases, and I've seen plenty of HD and UHD video that resolved 600 lines, especially with folks that like to open up their lenses. 35mm can easily resolve 4K, but when you're capturing on video, 4K costs dynamic range. That's why the Alexa is 3K S35. Its .0083mm pixel pitch gives *almost* the same latitude as Vision3 stock, about 14 stops. 3K resized to 2K allows better luma resolution from a single-chip sensor than capturing 2K. More pixels mean smaller pixels, which means lower dynamic range. There are all sorts of claims from marketing people about this camera or that camera having 12-13 stops but the fact of the matter is, dynamic range is directly tied to hole size. You can cheat higher numbers with DNR, but lose detail.
Engineers across the board will tell you pixels smaller than .005mm will cause noticeable degradation in the image, which is why pro video cameras have 1920 pixels on a 10mm wide sensor, or .0052mm, which is still 2 stops less than the Alexa. Now, some low-end DSLRs do have OLPFs, but they're optimized for stills. In order to avoid ALL aliasing for HD video, the resolution needs to be less than 1000 lines or so for a 3-CCD camera, 500 for a single-chip camera. If you use a superior off-line demosaic algorithm, you can get away with 750 lines. Since most DSLRs are in the 5K range... I don't think Reds have OLPFs, as they have this edginess to them that annoys me. I suspect Alexas do. I have to stay with Windows. I can't afford to pay double for "shiny". I met with a guy that was working on a video game that includes some composited live action. He commented up front when showing me his workflow that "Mac is so much better for this than Windows". It turns out he doesn't own a Windows machine and hadn't even tried video composite work till a few weeks ago. I can say without a doubt, though, that the biggest issue he has is poor lighting in his studio. I got better composites than he did in about five minutes using Vegas, but estimated he needed at least two stops more light to get clean composites. I think maybe 10 years ago, Mac may have had an edge. 20 years ago, Commodore did. Things change.
  4. Oh, I didn't think the price would fall that quickly. I'll take a look. I'd consider it a DSLR. It's internally no different from a consumer-grade mirrorless camera except it is optimized for video rather than stills. Interesting point. In viewing tests, most people can't even tell the difference between 720 and 1080, and Hollywood doesn't seem to be particularly fast at jumping on the UHD/4K bandwagon. I guess the power of consumer marketing shouldn't be underestimated. I'm not inherently against UHD or 4K but there are far more important factors IMO. Sure it aliases; it has no anti-aliasing filter at all and fairly weak IR filtering unless you use an after-market optical block (for $300). So in my book, the base price for a *new* BMPCC is $1300, then one must add a lens and some way of handling it. The word-length has nothing to do with aliasing and the only way 4:2:0 subsampling can cause it is if you used a bad encoder that didn't low-pass the chroma before resampling. On-the-fly window resizing used in YouTube etc. can cause aliasing, but I base my opinion on hands-on usage viewed at its native 1920x1080p24. Nice shoot by the way, very classy approach. I watched it on my computer and my 50" plasma screen. I did notice some chromatic aberration, which, in conjunction with your low F-stops, may be softening potential alias distortion. Rolling shutter is ever present, though not as nauseating as with many cameras. Well, no. Rec709 is designed for CRT displays and their contemporaries, which have about a 6-7 stop output range, but input range does not necessarily equal output range. Even my Canon G20 can give about nine stops under the right conditions. The studio where I used to work used $25,000 CCD cameras that have a good 10-11 stops or so despite being Rec709 compliant. True, but there's really no reason to expand the color space, screw with it and re-condense it back to Rec709.
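To illustrate the "bad encoder" point about chroma subsampling: here's a toy sketch (function names and values are mine, not from any real encoder) of halving a chroma row with and without a low-pass first:

```python
def decimate_naive(row):
    """Drop every other sample -- what a careless encoder would do."""
    return row[::2]

def decimate_filtered(row):
    """Average each pair before dropping -- a minimal 2-tap low-pass."""
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row) - 1, 2)]

# A one-pixel-period alternating chroma pattern (the worst case):
chroma = [100, 0, 100, 0, 100, 0, 100, 0]

print(decimate_naive(chroma))     # [100, 100, 100, 100] -- aliased to a solid color
print(decimate_filtered(chroma))  # [50.0, 50.0, 50.0, 50.0] -- local mean preserved
```

The naive version turns the finest chroma detail into a flat, wrong color; the filtered one at least lands on the correct average, which is roughly what a competent encoder's pre-filter accomplishes.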
You can do color correction in the Rec709 space as long as it's not too heavy-handed. I should say that I do probably 95% of my work without any manipulation in post, aside from maybe suppressing some green spill. I do my best to light/expose for the look I want. I do love the idea of raw recording, which is why I was initially looking at a Digital Bolex (and was willing to pay the extra price for global shutter, no aliasing etc.). Those guys really did their best to make a suitable digital replacement for Super-16. It's a shame most D16 shooters use crap lenses and don't know how to grade the image properly. I got a very natural image very quickly with it while others were complaining about strange tints, over-saturation and limited latitude. When they announced they were discontinuing the D16, I revisited the BMPCC again (and again just recently). I forget which sensor the BMPCC uses (I sourced them at one point; they cost almost $1,000 by themselves in quantities of less than 10, BTW), but the color purity is nowhere near that of the KAI-04050 used in the D16, so I suspect people were failing to pull back the saturation and high/low knees to fit into a Rec709 output format. Kodak's (now On-Semi) CCDs have a spectral response very similar to film and the human eye while most CMOS sensors have weaker filters to "cheat" up the ISO and thus require a saturation boost in processing. I've never been able to get DaVinci to work right on any of my computers. One of my clients had the same problem, despite meeting system requirements. Anyway, I use Vegas 14 to do my compositing, which allows multiple points of control over luma, hue and saturation. Somebody can even wear a pale green shirt on a green screen and it would look correct upon output. I can set up the composites so the shadow cast on a green floor will carry over to the composited image, though it requires a little more tweaking. I am not a fan of green screen either, but it's a necessity these days.
It may sound like I'm an indiscriminate hater, but I do keep revisiting it as an option. However, every time I work with BMs, I am quickly reminded of why I don't like them. I know some people can get good results with them, but I haven't the patience for their shortcomings. I've worked on many BM shoots and their issues become especially obvious when they're used in multi-cam shoots against professional systems. I did a 5-cam concert a couple of years ago with one and I couldn't for the life of me get the same color as the 3-chip cameras. I know, apples vs. oranges, but I wound up having to dumb down the 3-chip cameras' video to match the BM camera. While it has less rolling shutter than many other DSLRs, it's still bad enough that one must stick to fairly graceful moves (must be on a fluid-head tripod or stabilizer). It doesn't have nearly the latitude they claim except under the absolute ideal conditions AND you use DNR as a matter of course. Anyway, I think this is a great discussion. I am actually thinking about building my own modular camera system based around the Sony IMX249. It's a CMOS sensor but it uses analogue memory like CCDs for native global shutter operation without loss of dynamic range (many CMOS sensors like what's used in the BMPCC have global shutter modes but lose their internal noise reduction and half the dynamic range with it). The color purity is close to that of the Kodak CCDs as well, so that's a bonus. Anyway, if I can get this system working the way I want, I can potentially have a 1080p camera with raw output, recorded on swappable SSDs, for about $1,000. If it works out, maybe I can make a "prettier" version to sell. There would be no automatic anything and probably no audio though. Maybe one day, Sony will have a UHD sensor like that but it would have to be S35 sized to avoid loss of dynamic range.
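For what it's worth, some rough data-rate math for a DIY 1080p raw recorder like that (my assumptions, not the poster's spec: 12-bit packed Bayer, uncompressed, 24fps, decimal gigabytes):

```python
WIDTH, HEIGHT, FPS = 1920, 1080, 24
BITS_PER_PHOTOSITE = 12  # packed Bayer raw, one sample per photosite

bytes_per_frame = WIDTH * HEIGHT * BITS_PER_PHOTOSITE // 8
bytes_per_second = bytes_per_frame * FPS
gb_per_minute = bytes_per_second * 60 / 1e9

print(bytes_per_frame)          # 3110400 (~3 MB per frame)
print(bytes_per_second)         # 74649600 (~75 MB/s, within SATA SSD write speeds)
print(round(gb_per_minute, 2))  # 4.48 GB of footage per minute
```

At that rate a 256GB SSD holds a little under an hour of footage, which is why swappable media makes sense for such a design.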
  5. I know this sort of question gets asked all the time ("what's the best DSLR under $xxx"), but I know what compromises I'm willing to make vs. not and simply lack knowledge of all the different makes/models out there. I keep looking at this market because everybody seems to swear by it, but I have yet to be impressed by the normal "go-to" models compared to conventional video cameras in the same price range. This will not be my only camera. I already have a Super-16 camera with a collection of prime lenses as well as a 1/3" HD video camera with built-in zoom lens. I would like another camera somewhere in between that can be used for semi-pro video productions. I care most about:
  • Minimum rolling shutter: this drives me absolutely nuts and is why most DSLRs wind up in the "useless" list for me.
  • Minimum aliasing: this also drives me nuts and is why I can't consider cams like the GH3 or BMPCC.
  • Good dynamic range: not so much low noise but the ability to preserve highlight and shadow detail. This (as well as rolling shutter) is why I can't choose the GH2.
  • Natural color: I don't want to make the look in post, another reason I can't consider Blackmagic anything. If I can get cleaner green screen composites than with my Canon G20, so much the better. I actually do pretty well with this but it's harder than with, say, a $25K studio camera.
I DON'T care about:
  • UHD or 4K
  • High frame rates: 24p and 30p are all I need.
  • High bit rates (I have an Atomos Ninja 2)
  • High ISOs: I light my scenes.
  • Audio: I'll be recording on an external device.
I don't understand how people can talk about this or that camera and how "cinematic" they are when they're full of horrible artifacts that no true cinema camera has. Perceived sharpness and shallow DoF matter less to me than avoiding typical DSLR artifacts. This will mostly be for weddings, YouTube videos and the occasional commercial. I am hoping to keep the price under $1,000, as I hire cameras/crew or just shoot S16mm on bigger shoots.
I imagine I'd most likely be looking at the M 4/3 range but am open to ideas. I'm close to considering an industrial USB-3 camera and making some kind of portable recording computer. I've found a few with native HD raw DNG capture with global shutter, like the BFLY-U3-23S6C-C. Thanks for any insight.
  6. Yeah, most of the Spaghetti Westerns were Techniscope as far as I know. It saved money not only in production but also in licensing fees. The down side is not only are they starting with half the real estate, but using an optical printer to enlarge the image further reduces resolution. They often used unsharp masks in post, so people complaining the Blu-Rays have edge enhancement can shove it. That brings me to "Super-35", which combines the expense of 4-perf with the low quality of 2-perf.
  7. This is a common myth. The shutter is always running at 24Hz (or 23.976), so reducing the shutter angle simply means it spends more time being closed, reducing the open/closed ratio and making flicker worse. Luckily, most lights don't flicker noticeably, save maybe bad fluorescents. If you want to get into the math of it, let's assume you're using a lamp that puts out no light at all at minimum voltage (virtually none do this, but...) and 100% output at maximum voltage. That means it will be at full brightness 120x per second (peak AND trough give 100% output). At 180 degrees and 24fps, the shutter will be open for an average of 2.5 peaks (flashes), closed for 2.5. I say on average because if you're shooting for television, it's actually 23.976fps while the power runs at 60Hz. The slight difference means sometimes the shutter will be open for 3 peaks. Sometimes it only catches two peaks. That equates to a 1/2-stop max variation over several minutes. By going to 172.8 degrees, it's open 2.4 peaks on average, closed for 2.6. There will still be times where the camera sees three peaks, but it has a greater likelihood of seeing two. However, the wider the shutter angle, the longer the exposure (motion blur applies to light too). A 360 degree shutter will always get 5 peaks while a 288 degree shutter will get 4-5 peaks, which is a 1/3rd-stop max variation over several minutes. In reality, the difference in flicker between 180 and 172.8 is almost nonexistent because even fluorescent lights only dim by a small percentage between peak and trough, assuming the ballasts are decent. The only film productions I know of that ran at 29.97fps are ones with a lot of practical CRT TVs, like Max Headroom or The Wonder Years (3-perf 35mm and 16mm respectively). In countries using 50Hz power, flicker is worse because the lamps have more time to dim between peaks, so it's a good idea to only use 25fps if you aren't using flicker-free fixtures.
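The peak-counting above can be sketched in a few lines (my own helpers, assuming an idealized lamp with 120 brightness peaks per second on 60Hz power, and ignoring the 23.976-vs-24 drift described above):

```python
def exposure_s(angle_deg, fps):
    """Open time per frame for a rotary shutter."""
    return (angle_deg / 360.0) / fps

def peaks_seen(angle_deg, fps, peaks_per_second=120):
    """Average number of lamp brightness peaks caught per exposure."""
    return exposure_s(angle_deg, fps) * peaks_per_second

for angle in (180, 172.8, 288, 360):
    print(angle, round(peaks_seen(angle, 24), 2))
# 180   -> 2.5 peaks open (2.5 closed)
# 172.8 -> 2.4 peaks on average
# 288   -> 4.0 peaks
# 360   -> 5.0 peaks, i.e. every peak in the frame period
```

The 23.976fps drift just makes the true count wobble around these averages, which is where the roughly 1/2-stop and 1/3rd-stop variations come from.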
  8. I don't mind hand-held when it's done tastefully, though it totally bit me in the butt once. I shot a project (S-8, 16mm and some 35mm with standard-def digital post) for TV a few years ago, but then it got played in a big theater and I felt sick. To add insult to injury, I think I was using a 90 degree shutter angle. It was also rather soft because I was shooting around F2. On a side note, I was using my trusty 7212 exposed as 50 ISO. It looked really cool on TV! Had I known it would be on a 10m-tall screen, I would have produced it with an entirely different style. For starters, I would have done it all on S16 and 35mm, used a 180 degree shutter angle and shot at F4. I would have stuck with my 28mm lens instead of using a zoom and also would have used a matte box to cut down lens flares. Most important, I would have put forth some kind of effort to keep the camera steady! Thank goodness; though the only IMAX theater here is running video now. Yeah, DCP's track record is nowhere near as good even over a 1-year span. When everything is a computer in disguise... NICE! I think the only way to see film in my area is at my house, but I sold most of my collection recently so it's slim pickings. I'm not a Tarantino fan in general, but I AM curious about that one.
  9. I'm glad you crossed out "the story is good" because, hey, Michael Bay movies. :) There's a lot of truth to this. I dabble in painting and I definitely have my favorite brushes. I prefer oil on canvas when everybody wants to save $0.35 by using acrylic. Yes it's a tiny bit more expensive and takes a little longer, but it's also much more flexible and behaves the way I expect. ... and camera moves need to stay slow. A quick pan is no big deal when the screen is 30 degrees of your field of vision, but when it's 60 degrees, the entire image can jump several feet from one frame to the next. The last authentic, made-for-IMAX film I saw was "Dinosaurs Alive"... what a disappointment that was! It started like a documentary and was interesting, used the 3D process well etc., but the rest of it was that stupid acid-trip excuse to show off their CG. On the other hand, the first one I saw was "The Dream is Alive", which was perfect for me because I've always been big on space exploration. My college roomie had it on laserdisc, but I thought "what's the point?" What are they doing? I know more stuff is being shot on film now and they are at least THINKING about bringing back Kodachrome (though I'm under the impression that all their manufacturing is outsourced). Well, I think it was an important step in the right direction and I still follow its standards for setting up my studio and home theater, for mixing etc. Anyway, the audience seems to have thought it was just another sound format, like Dolby Stereo. How were they supposed to know it was a quality control thing? Agreed. Though modern mixing/mastering techniques are by far the most harmful aspect. Hyper-editing, pitch correction, sample replacement, hard limiters and clipping for the sake of "loud" have all been summed together to turn a decent medium into a wash of eye-watering noise. Sadly, Hollywood (both visually and sonically) has gone the same direction. Yeah, specs aren't everything.
Even the first generation of digital recording systems was "better" on paper but really hard on the ears. I did a series of blind tests where several dozen people listened to material I mastered. Everybody loved my 1/4" material, hated the digital U-Matic, and the stuff I did at 24-bit 48K was somewhere in between. They were all the same masters, just captured via different media. One guy, who said 1/4" tape was "good enough in the 60s but doesn't cut it now", unwittingly chose my 1/4" master as his favorite. Sadly, most vinyl these days is made from the same uber-crushed masters as the CDs, so I gave up on trying to find good vinyl. Totally. That movie is still on my "to do" list though. Did you see it in 70mm? I agree that a lot is lost in conversion but there's still an advantage in color and motion with shooting and scanning film. What a lot of people fail to realize is that it's not just about resolution and dynamic range. There's also another set of optics that alter the image. On a related note; I publicly said that any mix engineer who throws away what the band sent him in order to use canned drum samples is a lazy jackass and has no respect for his clients. Somebody responded with something like "The greatest mix engineer, Chris Lord Alge, is a jackass?" I told him I don't care if it's common practice or if popular engineers do it, he isn't doing his job. He's supposed to MIX the band, not replace them just because he feels like it. It would be like an editor replacing a character of a movie with CG without any input from the director. Anyway, I actually appreciate the Alexa. It's the only video camera that can *almost* fool me. Still, I prefer 5213 or better yet, 5212 (discontinued :( ).
  10. Heh, I was talking about the analogue (Dolby Stereo) tracks, sorry. But yeah, the SDDS tracks are the same color (though it doesn't quite show that way on the scan).
  11. That's a cool resource, thanks. In my budget estimate: I utilized a lower shooting ratio, included an all photo-chemical post with print and I own my camera equipment. Actually, I could probably borrow a Mitchell BNCR and all its support (dolly, crane etc.) if I asked REALLY nicely but I would never be able to raise the money for the film/lab costs.
  12. You know, you do have a point to an extent. It used to be that when there was an anamorphic print, the curtains would draw back revealing more of the screen and you would actually get a bigger image than with flat 1.85:1. Shooting Cinemascope meant not only a wider screen but also tighter grain, while Techniscope would mean a wider screen but lower res and higher grain. That doesn't happen with digital projection, so shooting 2.4:1 means a SMALLER image and massive wasted real estate on the screen. There would be no real difference in terms of resolution because there's no additional enlargement. Even using anamorphic lenses on the camera just means a smaller image and wasted screen space now. I think it's really odd that since anamorphic film prints and proper screens are a thing of the past, 2.4:1 has become the norm. I wish people would adopt the native 1.89:1 ratio of the projectors/screens. 3-perf is ideal for this purpose, particularly for action movies because there's no nauseating rolling shutter. Yeah, I never understood that mentality even when shooting video. You're wasting time on-set and making the editor's job harder. If I shoot 6:1 on average, that's a lot. What really irritates me is when directors (it happens with film too) start the camera, do a take, and then start coaching the actors on what they actually want without stopping the camera or slating new takes. I am under the impression that this is more to get enough light and eliminate dot pitch on the giant screens. I don't like seeing Hollywood movies on that large of a screen. The material really needs to be specifically produced for IMAX to work well. Sadly, the documentary films are getting rather out of date but the profit margins are too low compared to repackaging an existing product. I actually spoke to a Kodak engineer a few years ago, during the time when they were actively hiding the fact that they were a film company at heart. They made a lot of mistakes over the years.
1, not utilizing/selling digital imaging back in the 70s/80s when they had a corner on the market and 2, intentionally concealing film and pushing digital when they already lost the war with other digital imaging competitors. I'm afraid it would be the same with IMAX at this point. I thought THX was great but consumers had no idea what it actually was. They also charged too much. The theaters cost more to build/equip and they added insult to injury by adding huge licensing fees. Being primarily an audio guy, I totally get it. I remember having a discussion with somebody who thought scanning The Wizard of Oz in 8K was a bad idea because the optical resolution was 2K at best. I told him it's not about getting higher resolution, it's about minimizing digital artifacts, especially since they had to resize the separate strips differently. In audio, we regularly work at 96KHz even though most people only hear up to 15K or so. It's not about getting more bandwidth, it's about keeping the digital artifacts out of the hearing range. Some do, most don't. The studios really push for shooting video because they think it saves money. If you ask me, they should save money by refusing to pay $10,000,000 for big name stars. I think the digital intermediary did more to destroy the concept of "natural-looking" long before video capture/projection became the norm. Tell me what's "natural" about this film I have in my physical possession, the frame scanned with completely neutral settings. Now compare that to some film I have from before digital color grading was a thing, scanned under the exact same conditions. All of those examples were shot and printed on film and yet there's nothing "natural" about the top example, which heavily utilized the DI process. The sky is yellow, faces are pasty and trees are the same color as the Dolby tracks and the Blu-Ray review touted "natural color"! 
"Khartoum" has a little bit of a yellow cast, but this was just a timing test that was donated to me by somebody in the industry. Now I agree that no digital camera can capture color as accurately as film, but far more damage is done in the name of "artistic" flexibility.
  13. That's what I mean. The end result of "super" 35 is only about 10% more real estate than Techniscope, so no big difference. I use quotations because there's really nothing super about it; it's literally just full frame with 50% of the film getting wasted. I'm certain S35 is the most wasteful film format ever created. At least Techniscope reduces film consumption. Since almost nothing I do makes it to theaters, I was actually considering getting a Techniscope camera a while back, but it's getting too hard to justify using film to clients. Yeah, the film may be a big expense, but it's still a relatively small part of the expense even on a "low budget" production.

Shooting a 65mm feature on 14M would also send a strong message to Hollywood about the stupidity of insisting on shooting video, not that they would change anything. I thought about doing a feature on S16 and figured the film and lab costs would be around $50,000. That's a big chunk of change to me, but compared to even a typical TV-movie budget, that's NOTHING!

Yeah, IMAX is a pathetic, hollow shell of what it once was, as are the regular theaters. It's all in the name of saving money for the studios, but now there's no reason for me to go to a theater, considering most DCPs are 2K and I have almost that on my plasma screen at home. I know DCPs have better gamut, less compression, etc., but it isn't worth $20 for one-time tickets for two when I can BUY a Blu-ray for the same price. Plus, none of my speakers are blown and the floor isn't sticky.

It's nice to see a movie with some intellect behind it (even though it isn't really scientifically accurate) rather than the typical Hollywood schlock like the Star Trek reboot.

One thing I appreciate about Chris Nolan is his reluctance to use digital intermediates. You can't avoid it with composites these days, but I always thought a contact print looked better than a film-out. Plus, DIs give producers and corporate mooks an excuse to spend weeks going "highlight his face, they lit him too dark" and "can you tint everything orange?", creating what I call the "alien world" effect, where the laws of physics no longer apply. It's really disturbing to look at. I got irritated with one guy just for increasing the contrast on the final product of a movie I shot. I told him I intentionally shot with an Ultra Contrast filter BECAUSE I wanted a very flat, dreary look.
  14. I never thought Techniscope looked good. You can "get away" with 2-perf now because of modern low-grain stocks and digital sharpening, but it doesn't hold up against 4-perf IMO, even with only a 2K scan. Don't get me wrong, I love S16 for TV, and given the choice of shooting 2-perf or, say, a Red for a movie, I'd choose the 2-perf just for the better color and motion rendering, but I would not inter-cut 2-perf with 4-perf for the sake of convenience. I know James Cameron did it, but he was a "Super" 35 shooter, so he only lost about 10% of image height as opposed to almost 50% going from anamorphic to Techniscope.

Maybe things have changed in recent years. To my knowledge, the cameras were much like Mitchells (pin-registered but nothing super high-tech) and printing was done via a modified 70mm contact printer unless they were doing a blow-up or reduction. I'm glad to hear it's gotten better.

I know somebody who worked for IMAX and told his boss that they weren't getting nearly the resolution (using proper test charts) they claimed to the public. The response was basically "don't say anything". I knew another guy a few years ago who did lab work for them and said they had the highest defect rate of any format he'd seen. That said, even if they ARE getting true 12K resolution, I'd still prefer 65mm: quieter, easier-to-handle cameras, and still an amazing image at a fraction of the expense. If somebody wanted me to produce a movie with an unlimited budget, that's the format I'd choose. Sadly, I'm well below that level, so I must do most of my work on video, or S16 if I'm lucky.

On a side note, the eye is not very sensitive to extreme resolutions, so it's possible for the center of the image to be 4K with the edges being somewhat lower (particularly considering lens distortion) without anybody noticing. Most people don't even notice the difference between 720 and 1080, and some don't notice the difference between 480 and 1080 in tests. I remember several people thanked my studio for "going HD" a number of years ago when in fact, all we did was switch to 16:9.
  15. It looks like it was hand-colored by pencil. It was a common practice at the time and I learned to do it as a high school student. BTW, if you ever want an accurate idea of what the Old West truly looked like, check out "Tombstone" 1993. While the Blu-Ray has a yellow cast to it and mid-tones are often pulled down unnecessarily, you can tell that clothes were colorful, vibrant and well-made. We as a society seem to have been trained to associate brown and yellow with the past due to sepia-tinted photos but that's not what their world looked like. Honestly, I really wish modern movies taking place in modern times had natural color too.
  16. OK, that is kinda' freaky, because I was just having a discussion with somebody about some films I have that have completely lost their yellow & cyan dyes. I saw a VHS copy of one from the 80s that had natural color but was obviously not sharp. I started to think there has to be a way to map the chroma (C) channel from an inverse 3:2 pull-down copy of the VHS onto a scan of the film to get the best of both worlds.

People in the industry are told by consultants to avoid technical talk as much as possible. Three-syllable words, any kind of jargon, proper grammar, etc. are all explicitly shunned to boot.

Most back-catalogue titles come from inter-positives, which are color-timed positive copies of the negatives. They are usually in the best shape of all the elements because they had the least handling, and at the same time, they have the color/contrast that the director/DP approved. For a lot of older films, prints were struck directly from the negatives, so the negative is often in really bad shape due to excessive handling and there often aren't any pristine prints. On top of that, each print is timed directly from the neg, so there can be variations in color etc. if a film had another set of prints made later.

YCM separations are positive copies of the camera negative, done to three strips of black & white film. Each strip was exposed separately by yellow, cyan and magenta light when they were made. They were originally intended to be used for making Technicolor IB prints. You can think of IB prints as the celluloid version of lithographs, where relief separation "masters" are actually run through dyes and pressed into the print. Now, separations are integral to preserving and restoring movies because they don't fade. YCM separations are even being made for current movies that are shot on video, because they are such a reliable and fool-proof method of preservation.

On a side note, there are times when the colorist (usually under direction of some corporate presence) does whatever they want without approval from the director. One example is Snow White, where they flat-out changed things for the restoration just because they could, turning green trees in the moonlight blue, etc. I remember a matte painter complaining about a movie he did in the 60s. He did this great moonlit sky background that looked natural and beautiful in the original prints and the first run of home releases. When it got "digitally remastered", they completely blew out the moon and tinted the sky an odd yellow color. Or "The Sound of Music", where they actually had a director-approved restoration inter-positive to use as reference, but decided to scan the original negative and saturate the snot out of things, so it looks almost cartoonish in some shots.
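The luma-from-film / chroma-from-VHS idea mentioned above can be sketched per pixel. This is my own illustration of the general approach, not an established workflow: it assumes the two sources are already frame-accurately aligned and scaled to the same size, and uses the standard full-range BT.601 matrix.

```python
# Full-range BT.601 RGB <-> YCbCr conversion, per pixel (values 0.0-1.0).
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772
    cr = (r - y) / 1.402
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def best_of_both(scan_px, vhs_px):
    """Keep the sharp luma of the faded film scan and borrow the
    surviving color (chroma) from the soft VHS transfer."""
    y_scan, _, _ = rgb_to_ycbcr(*scan_px)
    _, cb_vhs, cr_vhs = rgb_to_ycbcr(*vhs_px)
    return ycbcr_to_rgb(y_scan, cb_vhs, cr_vhs)
```

In practice the hard parts are everything before this step: inverse 3:2 pulldown, geometric alignment of the two sources, and heavy chroma denoising of the VHS before the merge.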
  17. I know this is a moot point due to the time frame, but I myself prefer 200T for interiors and exteriors. It's sharper than daylight films and finer grain than 500T, which is important on 16mm. So, I would bring along some auxiliary lighting and light modifiers to make sure I get not only a good exposure, but that the quality of lighting works for the project. Too many people paint themselves into a corner by merely going for "enough" light and not ensuring it's GOOD light.
  18. It's tempting to tell a meddler to go away, but when they are the one that has the power to pull the paycheck... Sorry for the rant, but this is my second time seeing the same person turn a mediocre performance into an awful one on the same project. If there's anybody here who does commercial work that ISN'T a professional director, please just let the actors and crew do their jobs. You are harming the product and costing everybody money by inflicting yourself upon them.
  19. I largely agree, though I often spot quick pickup shots done on inferior B-cams. Not that it matters, but if it was my production, I would have shot the underwater scenes on 35mm. I have only shot 35mm under water and always had good results. Plus, if there's a leak, you ruin the video and the camera. With film cameras, you can hose out the innards, regrease it and you're good to go. The film just needs a warm rinse, not that I've ever had a problem. I don't understand how shooting aerial footage on film is a liability either, with modern video taps and remote heads. If they were REALLY worried about 11 minutes of one location not being enough, they had the budget to mount a second camera.

2-perf is not good enough for theatrical release IMO. You don't have much more vertical resolution than 16mm, which is fine for TV but not the big screen.

Pushing will bring out the highlights but not the shadows. You could flash the film a little too, but really, 250 ISO with a T2 lens is fine for most cases. There's aquatic lighting too if you need it.

I certainly would NOT shoot IMAX under water, or at all for that matter, even if somebody gave me a $200,000,000+ budget. You theoretically have all this real estate, but maintaining perfect contact with the pressure plate in the camera and in printing is almost impossible, so you don't really get better resolution than regular 65mm. It's just so large, you can't help but have some flexing at the center of the frame. I know a once-insider at IMAX who said you'd be lucky to get 4K out of it by the time you get to the end product, regardless of whether you use photo-chemical or digital post. Then there's the whole weight/size/cost factor... The biggest advantage of IMAX is the ability to push tons of light through the print to fill a giant screen, and that's a moot point now that almost nobody is using film for projection. 65mm, on the other hand, is not only cheaper but actually higher resolution in most cases, and I would LOVE to have the budget to shoot a film in that format.
  20. Many movies were shot without zooms. I also know of some classics that were shot using a single prime lens, like "20,000 Leagues..." What about "Barry Lyndon", which had scenes lit entirely by candles on 100 ISO film? I'm not a fan of that movie, but there's no doubt of its cinematic beauty. I actually think being stuck with lower ISOs can have advantages. It forces people to compose and light better, rather than what a lot of people do now, which is show up on-scene with little/no lighting and rely on electronics to make the image work.

I said elsewhere that my go-to stock for indoor AND outdoor shoots was 7217, because it was so sharp and low-grain. I really miss it. The last shoot I did on location was with a camera that was NATIVELY 800 ISO, and not "marketing ISO" like BM but the actual ISO. It was an f/16 day, so I was really struggling to get in the sweet spot of the lens, even with an ND filter. On a side note, I can't for the life of me figure out why 400+ ISO still film would be daylight balanced. If you want to shoot indoors, you have to use an 80A filter, which costs about two stops, effectively making it 100 ISO. More evidence that marketing is more important than quality.

I gladly paid a lot more for a mere 1920x1080 TV because there simply wasn't any denying it had the best image. Most people simply think "more pixels is better", which is why cell phone image sensors have 12-20MP when more than 6MP or so is actually detrimental to quality due to dynamic range and limitations of the lens.

You sort of touched on two big pet peeves of mine. One is producers etc. changing the DP's content in post, often without his knowledge or input. I've witnessed it first-hand at premieres, when the DP goes "that's not how I shot it!" or a grip says "it looked so good on the studio monitors, now it looks like an Instagram filter." The other is people who aren't even competent at video production calling what they do "films". I cringe every time I hear one call themselves a "filmmaker". I want to go, "You have no idea what being a filmmaker means. You don't have the basic knowledge or equipment required to do a proper job. Have you ever even touched film? You don't understand the workflow, and you cheapen the entire industry by claiming to be a part of it." Of course, I don't actually say that; I just keep referring to their work as "video" and watch them get annoyed.

I don't really care, but they harm the industry by producing inferior product while cutting into clientele that could have gone to reputable businesses. I spend countless hours cleaning up their messes too. I recently had to prep a video for projection in a 400-seat venue. Not a single shot was within legal values, and the dialogue AVERAGED -6 dBFS RMS. Nothing would have made it look/sound good, but at least I could get levels within standard. Most of them don't understand lighting; none of them understand audio. They'll spend five hours on location and 30 hours color correcting/reframing/denoising etc., when an extra hour on location would mean only needing a few hours in post, yielding a better final product in the process. Call me an elitist if you must (and some do), which is odd because I make a lot of quick-and-dirty videos. The difference is I don't call that film, nor do I call myself a "filmmaker".

I think that brings up another point: film costs money, so you'd better be sure you get it right. Video costs virtually nothing, so who cares about a bad shot?
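For anyone wondering how hot dialogue averaging -6 dBFS RMS actually is, it's easy to measure. A minimal Python sketch (my own, not a broadcast-grade meter; real delivery specs measure loudness in LUFS per ITU-R BS.1770):

```python
import math

def rms_dbfs(samples):
    """RMS level of float samples (full scale = 1.0), in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

# A continuous full-scale sine wave measures about -3 dBFS RMS, so
# dialogue AVERAGING -6 dBFS RMS sits only ~3 dB below a full-scale
# test tone -- absurdly hot for spoken word.
tone = [math.sin(2 * math.pi * i / 100) for i in range(100)]
print(round(rms_dbfs(tone), 2))  # -3.01
```

The gain needed to hit a target is just `target_db - rms_dbfs(samples)`, which is all "getting levels within standard" amounts to when nothing else can be salvaged.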
  21. Oh, I outright said it was intended for compositing, because that's what one of Panavision's product developers said. Maybe he was lying, but...
  22. I'm not sure how that's relevant. It can output true 4:4:4 from a single S35-sized sensor, and with a proper OLPF, it can composite more cleanly than most Bayer sensors, which are natively *close to* 4:2:0 regardless of how you encode/capture.

I always try to light as if I'm working with reversal film. That is, set up all the exposures to very tight tolerances so it holds up well on video. Except, of course, when I'm shooting on actual negative film, which is a lot more forgiving. I still aim for exact exposure, but I can have a lot more contrast in the highlights than video. I love when I light a set before the camera arrives and it just WORKS! I get the occasional "How'd you do that?" :)

Yes, exactly. A lot of current CMOS cameras that claim 13 or whatever stops don't really achieve that unless you're shooting outdoors in 70-degree weather and PLAN to use noise reduction in post. Under realistic conditions, you're lucky to get 11-12. The F35 towards the end could reliably do it, and with much truer color.

At my studio, we always bought all the tapes (and lamps) budgeted whether we needed them or not, so we had (still do) plenty of unopened boxes of tapes despite not buying any for most of 2011. We also didn't abandon tape entirely till 2014 or so, when we switched to HyperDecks. I hate those things with a fiery vengeance because they're so unreliable, but they're so cheap we can record everything with multiple redundancies. Still, you have no idea how many times two decks failed at the same time (with or without any indication something was wrong), making the third all the more valuable.
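As a sanity check on the "Bayer is close to 4:2:0" point, simple site-counting bears it out. This is my own arithmetic sketch (it ignores how clever demosaicing can be): a Bayer CFA has one red and one blue photosite per 2x2 tile, which is exactly the chroma sample density of 4:2:0.

```python
def bayer_sites(w, h):
    """Photosite counts on a w x h Bayer sensor (RGGB tiling)."""
    return {"R": w * h // 4, "G": w * h // 2, "B": w * h // 4}

def chroma_samples_420(w, h):
    """Samples per chroma plane for 4:2:0 subsampling at w x h."""
    return (w // 2) * (h // 2)

w, h = 3840, 2160
# One R (or B) sample per 2x2 block -- same density as a 4:2:0 chroma plane.
print(bayer_sites(w, h)["R"] == chroma_samples_420(w, h))  # True
```

A striped-RGB sensor like the F35's samples all three colors on every column, which is why it can deliver true 4:4:4 where a Bayer sensor has to interpolate.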
  23. That depends on your criteria for "better". You can get a camera with smaller sensors that doesn't have as much rolling shutter or alias distortion, has more natural color, etc. They used to be commonplace before the "DSLR revolution". I agree with you on crippled CODECs, but I refuse to modify a camera just to have alias/IR-free images.

That argument can be made for many traits. Both those cameras have much better rolling shutter, and the top-tier Alexas have mechanical shutters. They also have better color, dynamic range, etc. I'm not a fan of Reds, or anything with rolling shutter for that matter, but we were talking D16 vs BMPCC, and the BMPCC loses on all my criteria for what makes an acceptable professional tool, while the D16 is really only lacking in form factor.

I wonder if that has to do with it being designed for composite work. According to Panavision, they didn't intend it to be a general-purpose camera, just something meant to speed up blue/green-screen work while still allowing conventional cinema glass. CCDs shouldn't have pattern noise, unless you're referring to the striped nature. Still, I noticed that noisy muddiness. As I said earlier, a lot of people used it as an excuse not to light professionally, and it didn't work because it has little/no internal noise reduction. At its native ISO, it does quite well.

That would really eat into the latitude of the sensor. It's not just that, but also the data through-put. Marketing people will make up all sorts of stupid reasons for something being a certain way, especially if it's misleading. The noise in the F35 is just a natural artifact of "dark current" and column/row shifting. CMOS cameras get around the issue with built-in processing, while CCDs are completely passive and would require external processing to do the same. If you look at the native engineering specs of equivalent CMOS sensors, they don't natively perform as well as marketing people claim. You'd also need multiple analogue amplifiers, ADCs, clocking and processors to handle the data. With a CMOS sensor, all of that is printed into the substrate at little additional cost. You could expect about a 10-stop native DR by shrinking the pixels that much. That's why Arri stuck with 2.8K until they came out with a "65mm" version.

...A very common tale. Of course, sometimes stuff like that is intentional, to get people interested in buying the latest hardware. A lot of the CCD cameras of old were capable of MUCH better performance than most people know, but they were programmed to clip the highlights and shadows to maintain only a 7-8 stop range, even though the sensors natively handled more like 10-11. Then there was the nasty compression. Any former Panasonic/Andromeda users here? Panasonic's answer to people tapping into the DVX100's full potential was to create a phantom company to buy their shop and bury it.
  24. OK, so the part I hate most about making a video is the tedious process of copying all the takes to HDD, finding the right takes and assembling them on the timeline. I'm surprised there isn't (or maybe there is) some kind of device whereby you hit a button at the end of shooting a take you like that marks it for transfer, like an automatic version of the "print" command for film dailies. Only those takes get copied to the HDD along with the appropriate audio takes.
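Something like that "circle take" button could be approximated today with a small script. A sketch under conventions I'm inventing for illustration (clips and audio embed a take name like S01T04 in their filenames, and the marked list is a plain text file with one take name per line):

```python
import shutil
from pathlib import Path

def copy_circled_takes(circled_list, src_dirs, dest):
    """Copy only the takes whose names appear in circled_list.

    circled_list: text file, one take name per line (e.g. 'S01T04').
    src_dirs: directories holding camera clips and audio files.
    dest: destination directory (created if missing).
    Returns the sorted list of copied filenames.
    """
    circled = {line.strip() for line in
               Path(circled_list).read_text().splitlines() if line.strip()}
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in src_dirs:
        for f in Path(src).iterdir():
            # A file matches if any circled take name appears in its stem,
            # so the video clip and its audio take both come along.
            if f.is_file() and any(take in f.stem for take in circled):
                shutil.copy2(f, dest / f.name)
                copied.append(f.name)
    return sorted(copied)
```

The on-set "button" would then just append the current take name to the list file; offload only copies what was circled, like a print list for film dailies.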
  25. Maybe I'm more sensitive to its issues than most, but I just don't get why people think it's a good camera. They just cut SO many corners.

No argument here, though many serious ENG cameras are a bit heavy for long shoots. There are a good number of 16mm cams with a similar form factor too, like some of the Aatons. You can shoulder one of those for hours... and putting the take-up right by the supply is genius, so the weight doesn't shift as you shoot.

That's a great point. I have been in the habit of keeping both eyes open while shooting, so that doesn't particularly bother me. I see the action through my left eye and the framing with my right eye (the opposite with my K3), if that makes sense. I can imagine a 4-frame delay would be useless for catching fast action, but I wouldn't want to use a CMOS camera for fast action anyway.

A lot of CMOS sensors have global shutter modes (including the BMPCC) that are disabled because you typically lose something like 4-5 stops in the shadows. I'm REALLY wishing somebody will do something with those new Sony CMOS sensors that store/dump charges like CCDs. There's a $500 USB 3.0 industrial camera that uses them, so how hard could it be to make a raw camcorder with it? It has a 12mm frame and very accurate color too. So, stick in a proper OLPF, rig up some kind of portable SSD capture system and you're in business. I would TOTALLY buy that!

Don't forget the replacement optical block and something to add some weight (there is such a thing as TOO light). All in all, you're talking at least $2,800 to get a usable camera out of it. Maybe parallax viewfinders could come back!? JK :p

Did they replace the lenses? I see GoPro video inter-cut with professional cameras on TV often, and it's like going from "Blu-ray" to "S-VHS delivered via YouTube". They used cameras on the Transformers movies? From the previews, I thought it was all obvious CG. I've only seen the 1986 movie (when it was new), but that was 35mm.