Everything posted by Keith Walters

  1. You often say that, and for the life of me I can't work out why :rolleyes: Am I missing something here? The only way you're going to know for sure what the focus was like on a film shoot is when you look at the rushes the next day, or whenever. I suspect it's what others have said: the editor or somebody decided that the cinematographically inferior take was the one with the best performance, and they went with that. There are numerous popular songs where the singer forgot the words of one verse and just whistled or burbled their way through that part, and that was the version that was released. Bill Withers's "Ain't No Sunshine" (I know, I know, I know, etc.) is just one example. Anyway, George Lucas shot about 4 hours of Star Wars with tarted-up 2/3" ENG cameras, and you complain about 4 seconds! :P The thing about this is that I see people being far harder on this sort of error from beginners than from the top professionals on the best-funded productions. Well yes, but let's face it: if you can get the focus wrong on a Sony Handycam, what hope do you have with a large-format camera?
  2. The Flash" is shot on an Arri Alexa Phew! They clearly have no "disdain for Arri" then. :D All I've ever seen of Supergirl is where she made a guest appearance in one episode of The Flash. ​She just looked like a so-so blonde actress in a rather poorly-made costume, obviously hanging from a crane. Gal Gadot she ain't ....
  3. OK, call me an old heretic (hang on; I've got the number here somewhere), but I think the TV Flash looks a lot more convincing than the one in Justice League, despite the latter being shot on film and for a much bigger budget. The TV series has never been shown on free-to-air TV here, which used to be a pretty reliable indicator of direness, but I took a punt on Season 1 on DVD at half price, and came back a week later for Seasons 2 & 3! OK, the stories are a bit silly in places, and they've gone a tad overboard with the PC shtick, but it's far closer to the original DC comics platform than most of the WB mutations of the DC universe. You know, sort of like how the first Iron Man movie knocked it out of the park after somebody finally listened to Stan Lee.... :rolleyes: I just have this terrible feeling that I'm going to find out it was actually shot on ... well, you know....
  4. It has actually occurred to me that you could probably simulate the palette of 2-strip Technicolor by making up some lighting panels out of orange and cyan high-brightness LEDs. Not sure exactly WHY you'd want to do that, but if the need was ever there.... I suppose if you were making a period piece about people shooting a movie in 2-strip Technicolor....
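     If anyone wanted to preview the look before soldering any LEDs, here's a minimal sketch of the idea in Python/numpy. The dye colours are pure guesses on my part (not measured Technicolor values): drop the blue record entirely and print the red and green records in orange and cyan.

         import numpy as np

         def two_strip(rgb):
             # rgb: float array, shape (H, W, 3), values in [0, 1].
             # Two-strip cameras recorded behind red and green filters and
             # printed the two records in red-orange and blue-green dyes;
             # blue was never captured, which is why skies go teal and
             # skin goes peachy.
             red_record = rgb[..., 0]
             green_record = rgb[..., 1]
             orange = np.array([1.00, 0.45, 0.15])  # assumed dye colour
             cyan = np.array([0.10, 0.75, 0.70])    # assumed dye colour
             out = red_record[..., None] * orange + green_record[..., None] * cyan
             return np.clip(out, 0.0, 1.0)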
  5. C'mon; nothing beats TWO-strip Technicolor! Look at these stills from the 1925 production of Ben Hur! (Only selected scenes were in colour, and most prints were entirely in black and white. This is not "colorized," this is actual colour cinematography.)
  6. IMDb says this:

     Camera: Arri Alexa 65 (Prime 65 and Vintage 765 lenses)
     Cinematographic Process: ARRIRAW (3.4K) (6.5K) (source format); Digital Intermediate (2K) (master format)

     So why is it 3.4K if they're using the Alexa 65? I saw it last night at a special advance screening, on just a 2K projector, but in the recently refurbished Reading Rouse Hill (Sydney) cinemas. They've now got speakers all over the place: walls, roof, floor, front, rear, you name it. I don't think it particularly added anything to the overall experience, but a lot of people complained that they couldn't understand the dialogue a lot of the time. You always get this problem: "We spent a lot of money on this surround sound setup; if we don't crank the volume up, the management won't think we're getting our money's worth..."
  7. It was some years back and I can't remember exactly when, but I'd had a trying couple of days with a team of bozos wanting to bum equipment to make a student film. And as is so often the case, it was almost like they were expecting Panavision to pay them for the privilege of participating in this tour de force of fresh, exciting young film-making. Yep, the beret and arty cigarette image sums it up perfectly, along with the hipster beards back when it wasn't fashionable to have beards. But I nearly pissed myself laughing when, a couple of days after that, there was a documentary on the ABC (similar to the US PBS) about the history of Australian student film-making. There was footage from the early 60s, and I swear, the cinematic hopefuls in that looked identical to the bozos who had been making our lives difficult over the previous few days....
  8. Somewhere Stephen Williams once posted a short video clip where the camera was being dollied in on the subject's face, whilst the zoom was simultaneously widened to precisely cancel the enlargement of the subject's face. That dramatically showed the difference between a subject shot physically close up, and a similarly dimensioned shot taken from further away with a telephoto lens. The shape of the subject's face changes dramatically, and this is one of the several mechanisms your eyes and brain use to build 3-D images.

     For example, imagine a scene where an actor walks into a restaurant and spots someone he knows at a table far across the room. The audience aren't going to be able to tell exactly what he is looking at from the initial wide shot, so you would normally cut to a close-up of the table in question. That would normally be done with a long focal length lens from across the room, because that's more or less how the actor would be seeing it (the shape of people's heads, perspective, depth of field etc). But if the script then calls for the actor to walk over and talk to the people at that table, for that scene you would normally move the camera up to the table and switch to a shorter focal length lens. The change in the way the people's heads are captured then immediately suggests to your brain that the actor is now close to the table.

     This is the same reason tracking shots using a dolly are vastly preferable to using a zoom. With the tracking shot, there will be a whole swag of image dimension and perspective changes that suggest to your brain what is actually going on. With a zoomed shot, your brain doesn't really know what it is looking at. A really dramatic example of this is that in the Iron Man movies, the close-ups of Tony Stark's face while he's wearing the Iron Man suit were done with 65mm format cameras, to give that extreme claustrophobic effect but without distorting his face too much.
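     The arithmetic behind that cancelling trick is simple enough. Here's a minimal sketch (Python, thin-lens approximation, numbers purely illustrative): the subject's image size is roughly proportional to focal length divided by distance, so holding the framing constant means scaling the focal length with the distance.

         def focal_for_constant_size(f0_mm, d0_m, d_m):
             # Image size of the subject is roughly f / d (thin-lens
             # approximation), so keeping f / d constant holds the subject
             # the same size in frame while the perspective on everything
             # else changes dramatically.
             return f0_mm * (d_m / d0_m)

         # Framed with a 100mm lens at 6m, then dollying in to 2m:
         print(focal_for_constant_size(100.0, 6.0, 2.0))  # -> 33.3mm, a much wider lens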
  9. Yeah, THEY might be, because they're the ones who have to come up with the ideas that make a movie work. The other 99.9% of the people you see in the five minutes or so of credits at the end basically do what they're told...
  10. When you say "someone who has no idea" you've pretty much nailed the problem. The real world of movie and TV production bears almost no resemblance to what the bulk of the population think it's like, and they don't believe you when you try to tell them.

     You see that when stars of films or TV series with cult followings make the mistake of agreeing to attend fan conventions. Some fans simply cannot believe that the stars of those shows don't actually have an encyclopedic memory of every episode of every season, and in many cases don't even remember the storylines! I remember seeing one actor (I don't know who he was; it was just something I flicked to on TV) being dumbfounded when a fan asked him to reveal what happened between him and his co-star after the end of the last episode of the final series of the show. He eventually said something to the effect of: "Well ... somebody said 'Cut!' and 'That's a wrap!'. And then [his co-star] and I went off and had a couple of beers, and then we went home. It's a script, man; if the writer didn't write any more, then nothing else happened..."

     I came to Sammies/Panavision as a video technician, to handle their short-lived Betacam fleet in the late 80s. OK, I eventually got to work on vastly more expensive film cameras, and any time I wanted to, I could have borrowed some top-of-the-line kit, scrounged some just-expired film stock from Kodak and shot something, and got the film processed and scanned with a bit of horse trading, but I just wasn't interested. I also knew how to load the magazines and fit them to the camera for film tests, but it was easier to let one of the desperadoes downstairs do it! I did do some moonlighting on TV commercials, but that was all video, and purely to make some extra money.

     I remember the first time I attended a production meeting; I felt completely like a fish out of water with all the creative and marketing people jabbering away in their weird industry jargon. But then they needed some video equipment set up, and suddenly I was the expert and none of them had a clue. That's pretty much how it is on a film set: most of the people there sort of understand what everybody else does, but they really only feel comfortable with their own job.

     The other thing that used to surprise people was that most of us aren't "movie tragics"! Most of the time I have no idea what people are talking about when they go on about this director or that cinematographer. I generally only watch movies for enjoyment, not to make a personal statement. As a general rule, you'll find that people who waffle on with that sort of crap are never going to amount to anything in the real world. :rolleyes:
  11. Trey Parker and Matt Stone made the forerunner of South Park on VHS.... And Earth to all the posers, fantasists, wannabes and assorted crackpots posting here: until the advent of "proper" digital cinematography cameras, anybody who actually worked in this industry could easily tell the difference between film-originated and video-originated footage, even on VHS! Just the same as you could easily tell the difference between something shot on a 3-tube or 3-chip professional camera and a Video-8 consumer toy. It's the dynamic range that gives it away, nothing to do with resolution.
  12. So, what's the biggest-budget production you've ever worked on? As far as the DOP/camera operator is concerned, Arri, Aaton, Panavision, Moviecam and any other commonly-used brand of camera all work pretty much the same way; you get the exposure and framing correct, and basically push the "go" button when someone says "roll film" and the "stop" button when they say "cut". However, the magazines are all quite different, and loading those with expensive new film and unloading the irreplaceable captured footage is best left to an expert. The clapper/loader may not be the most highly-paid person on the set, but if they screw up they can cause just as much damage as the highest-paid person. Same with the focus puller and the people who push the camera dolly. They hire people to do all those (what many people imagine to be) "trivial" jobs, so other people can get on with theirs.
  13. Ahrrr yes.... http://www.cinematography.com/index.php?showtopic=3932&page=2&do=findComment&comment=28874 (Can't go any earlier; correct me if I'm wrong, but this forum appears to have only started in Jan 2004 :rolleyes:) Mind you, approximately the same discussions have been going on virtually since the first videotape rolled in 1956. Video technology has changed beyond recognition since then, and the various pos(t)ers have come and gone, but B.S. hasn't really changed since at least biblical times. But life was much slower then; such discussions had to be carried out via letters to the editors of paper magazines. (In 1956, that is, not biblical times.) Plus you had to get past an editor, who could often make a reasonably accurate call on whether their correspondents had even the vaguest clue what they were talking about, which is something you don't get on forums like this. I'd love to see some of these guys posting on CML :-) I still have the links to some of the more memorable responses from the those-who-don't-suffer-fools-easily brigade.....
  14. At the moment, in this country at any rate, the distributors still maintain their vice-like (death) grip over how and when their wares are displayed. If the cinema owners could make their own decisions about screening schedules, they could get far better utilization of their resources. For example, why not have "encore" screenings of features that have reached the end of their cinematic run, for the benefit of people who never got round to seeing them. I think that would be a lot better than continually playing the same movie over and over to a virtually empty cinema. I know an independent cinema owner up country who hasn't signed onto the "Digital Print Fee" scheme. (I thought it was compulsory, but apparently not). According to him most people can't tell the difference between a standard cinema distribution file and a blu-ray, and his projector will accept HDMI from a computer, Blu-Ray/DVD player, HD set top box or Kodi-type android streaming box. So, there's nothing physically stopping the showing of "reruns" in cinemas. One other possibility he'd thought of was running live screenings of major sporting events, and charging admission, but the networks are still way too arse-clamped to even consider such a notion. As far as flexibility goes, a film projector compared to an electronic projector is like comparing a steam locomotive to a sports car. Yet the content providers are still demanding that the sports car only be able to run on railway tracks....
  15. The usually quoted figure is about 1,500 lines across the screen width, which is equivalent to about 3,000 pixels, or 3K. And that's AFTER four stages of copying from the initial camera negative. The filthiest lie being promulgated by the Digi-Wankers was to the effect that: "Therefore, the 1440 x 1080 images from [George Lucas's favourite toy] the F900 would be indistinguishable from film." First of all, it was only 1.44K, which gives a horizontal resolution of about 700 lines, and then (at least before cinema owners had digital projection shoved down their throats) it went through much the same 3- or 4-generation duplicating process, which made the final film prints pretty diabolical. But (having an answer for everything) the same DWs would then loftily trot out the fat-headed line to the effect that "digital acquisition only works properly with digital projection", as if there were some magical digital "essence" that pervades both systems, which in reality have very little in common apart from not involving film. The correct answer was that digitally-shot footage is always going to look better on a cinema-quality electronic projector, simply because it then doesn't have to go through a film duplication chain! And the only fly in that otherwise excellent ointment is that if you do a scan off an original 35mm negative and project that, it's also going to avoid the same multi-generational degradation and look a lot better too!
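     For anyone wondering where those numbers come from, here's the back-of-envelope version (a Python sketch; the 2-pixels-per-line rule of thumb is my reading of the argument, allowing headroom above the bare Nyquist minimum):

         def pixels_for_lines(lines, pixels_per_line=2.0):
             # Nyquist minimum is 1 pixel per line (2 per line pair); a
             # practical ~2 pixels per line allows for aliasing and
             # filter roll-off in real systems.
             return lines * pixels_per_line

         def lines_from_pixels(pixels, pixels_per_line=2.0):
             return pixels / pixels_per_line

         print(pixels_for_lines(1500))   # 3000 -> "1,500 lines is about 3K"
         print(lines_from_pixels(1440))  # 720  -> why 1.44K gives "about 700 lines"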
  16. I don't doubt there would have been at least some pressure, but clearly it was over-ruled. The Genesis (like the F900) is just a tarted-up TV studio camera, and it shows. Of course the tests were shot in such a way as to show the weaknesses of digital at the time, but that's another matter. Well, why wouldn't you do that? It's what I would do. It's not like they were lying. The greatest irony is that in 2004 Warner Bros were hell-bent on using the Genesis for Superman Returns, which ended up looking like video game footage, yet all their latest DC comics features (Man of Steel, Batman, Wonder Woman etc) were shot on film!
  17. Not for the poor bastard who has to edit it for you! For multi-scene commercials we had a general rule: 5 takes of every scene, no more, no less, unless there was a particular reason for it. Then we'd all sit down afterwards, review all the takes one by one, and make notes about which we thought was the best one, which was second, and so on. Then we'd hand all that to the editor. "Excellent is the mortal enemy of good" didn't need to be spoken there. I can tell you, there's nothing that gladdens the heart of an editor more than being handed a cheap-arse project where they've shot at least 30 takes of each scene, and they're all awful....
  18. Interesting you should say that, because for a very long time, Columbia (owned by Sony) didn't release a single feature that wasn't shot on film. This is despite the fact that until Arri got the Alexa up and running, the bulk of digitally-shot features were shot with Sony cameras. But every time I pointed this out here, one of the then-current generation of twat-sperts would invariably pull some lofty expostulation from their posterior to the effect that that was purely a corporate decision, and Sony is a big corporation and they're different divisions etc etc. To me it just demonstrated that Sony have never really been serious about "digital cinematography". Their bread and butter was TV studio equipment, and because Columbia were pulling in far more money than their video manufacturing division, if their producers said film was going to deliver the better product, then film it was.... The most ironic thing was, when Columbia finally did start releasing features that were shot on video, most of the early ones were shot with Red cameras....
  19. True, but the relatively high cost of film at least tended to keep them confined to the barstool where they belonged. :rolleyes: The last time I shot anything for broadcast purposes was back in the early 90s on SP Betacam. I could get great images too, and in those days you had to get it right first time; there was none of this "fix it in post" nonsense. But I also knew that nothing I shot was ever going to look as good as something shot on film. In 1997 I went to a presentation by Sony to introduce their HDW750 HD Betacam, which was, sigh, touted as the "film killer". Yeah, right, we said. Then about 2 years later they announced the "F900", which George Lucas himself was going to use to shoot Star Wars Episode II. But oddly, there was no industry evening to introduce this particular new baby. And when we finally got to see one it was like: "Isn't that just an HDW750 with '150' added to the model number?" And the rest, and George Lucas's reputation, is history....
  20. Well of course they are. Didn't you know? "Digital" (whatever that actually means) is the magical hammer that's going to smash through the celluloid ceiling and make anybody, ANYBODY, no matter how untalented, no matter how ignorant, no matter how illiterate, no matter how clueless, no matter how microscopically little they actually understand about the movie-making process, no matter how little business sense they have, no matter how unaware they are of what an obnoxious jerk they come across as to other people. Doesn't matter; they are going to show the old guard a thing or two about making successful, interesting and, above all, relevant films! Sorry; I've been hearing this same load of old bollocks in one form or another from barstool producers and generally imbecilic wannabes for close to 40 years now.... :rolleyes:
  21. You may have the problem that molten metal emits a large amount of infrared, and ordinary filters won't block that. I once tried to take still photos of molten copper droplets and got some weird-looking results because of this. Eventually I knocked up an old-fashioned smoked-glass filter by exposing a sheet of clear glass to a candle flame (the same as you'd use for taking photos of the sun, but not as dark). It was a bit uneven shading-wise, but the exposure then came out a lot better. The photo was done to disprove something that was claimed in an insurance scam, so in that case we didn't care too much about the overall image quality. I have made better-quality filters using a kerosene lamp with the filter glass mounted about 3 feet above it, in a draught-free room. My plan at that time was to show that a carbon filter would solve the problems people were having with Red cameras and IR contamination when using ordinary ND filters, but I could never get hold of anyone with a Red camera willing to let me experiment with this :rolleyes:

     Another job I had was to shoot high-speed video of a shipment of matches where the heads would sometimes shatter during the striking process, sending flaming sparks everywhere, and in one case setting off the entire box, resulting in burns (and, naturally, a lawsuit). I initially did a quick test in my workshop, with the matches clamped in a vice, just dragging the side of the matchbox across the head to ignite them. I produced some fantastic close-up slo-mo shots that showed the match heads exploding into a perfect little fireworks display, but then the main flame would flare up and completely obliterate the image. So then I spent an hour or so setting up outside in the midday sun to reduce the contrast, and then spent a couple more hours fruitlessly trying to re-create my original shots. After countless attempts, mostly of the matches igniting perfectly with no sparking, or failing to ignite at all, or the heads breaking off and threatening to set fire to the piece of cloth I'd set up as a background, I ran out of matches and gave up. But anyway, the client's lawyer was happy with my original shots, and he said if anything it looked more dramatic :D
  22. I suspect the real problem is simply that when the original Blade Runner came out (and Alien, for that matter) there was really nothing else like it in the marketplace. Audiences these days are rapidly coming down with "CGI fatigue", which is one reason Disney took the more recent Star Wars films back to their pre-CGI roots. It's interesting to compare this to the 2009 "Astro Boy" movie, which bombed. You'd have to wonder exactly what they were thinking. It had very little in common with any of the various (Astro Boy/Astroboy) TV versions, so it wasn't like there was going to be any nostalgia angle to lure in adult audiences. It was basically a fairly routine animated feature, with nothing particularly special to attract the juvenile audience either. The original Blade Runner story was really nothing like the acclaimed Philip K. Dick novel it was supposedly based on, which wouldn't have helped its initial box office success either. It will be interesting to see what the new story is like.
  23. The "argument" (if you can call it that, and which has been going on for well over 10 years) is basically that in the same sentence you've quoted the raw Bayer resolution of the chip ("3.4K") and the upscaled product of the actual usable de-bayered resolution (2K x 2 = "4K", as though the two terms were interchangeable! Essentially, they've taken the 2K RGB end product of the Alexa chip, and sliced each RGB pixel into 4, and called it "4K". OK, I know that just makes it easier to feed into the post production system, and if that was all people said, there would be no argument. But the problem is, they're NOT saying that..... Also, I noticed one of you is still claiming that 2K is equivalent to 35mm film. Fine-grain 35mm film can resolve over 4,000 lines across a super-35 width. (I know, I've seen it under a microscope!) The practical limit is about 3,000 lines, so then you have chuckleheads stating that 35mm film has a resolution of "3K". But 3,000 lines is NOT 3K! You need at least a 6K scan to resolve 1,500 line pairs (ie 3,000 alternating black and white lines).
  24. Which just shows that knowing how to make movies doesn't necessarily mean you understand how the technology works. "The DCP was then made at 4K." He can "make" it anything he likes; you can't put back what wasn't there in the first place.... "The only 2K elements made for 'Blade Runner 2049' are for 3D projection." And er, what does that mean, exactly.... Where's George Lucas? He'll talk some sense....
  25. OK, so what's the horizontal resolution of the Genesis/F35 then? I could scan a piece of film footage of an early 1930s 30-line attempt at a TV broadcast at 1080p, and it will still be 30 lines. I wouldn't call it "debated" so much as mass-debated.... :rolleyes: