
Tyler Purcell

Premium Member
  • Posts

    7,477
  • Joined

  • Last visited

Everything posted by Tyler Purcell

  1. Well, it depends on the type of lossy compression you're using. In most cases, intraframe wavelet-based formats like JPEG2000 (REDCODE) and TIFF-based raw (CinemaDNG) are far better as source material than long-GOP formats like MPEG2/MPEG4, which is what MOST consumer cameras shoot.
  2. Ya know... after the Academy gave Life of Pi best cinematography and didn't even nominate Interstellar a few years later... it's clear there is something wrong with their decision-making process.
  3. Yea, IDK man... I think the rumors are probably correct, only time will tell.
  4. The sad part in all of this is that they COULD have done a limited week-long public release THIS WEEK, one week PRIOR to Star Wars, in New York and L.A. That's all they need to qualify for the Oscars. Then they could have put everything on snooze until AFTER the Star Wars mess and launched nationally sometime towards the end of January. Everything about this launch reeks of horrible organization. There was never a need to install 70mm projectors into cineplexes; that's a HUGE mistake. There are literally hundreds of theaters across the country with 35/70 projectors already installed. The problem is that most theaters don't know how to work them. So it's really simple to send a person out, get things cleaned up and find someone passionate about doing a good job to run the show. During a classic roadshow, prints move from theater to theater, so why not do the same thing here? Run it a week in one city, then pack up the whole thing and move it to the next. This way you could have a group of maybe 4-6 people who drive in a truck from city to city with the film prints and deal with everything from re-calibrating the projectors to running the films for that week. Do a Thursday-to-Sunday run, get in the truck and drive Monday through Wednesday to the next destination. Spend the money on marketing rather than striking prints. The rest of the theaters could run 35mm prints full-time if they have projectors. So my proposal would be to start the full roadshow experience sometime mid-to-late January, with permanent prints DONATED to art houses in Los Angeles, New York and Chicago. The rest of the 70mm screenings would be a traveling roadshow going from city to city. Heck, you could even bring complete 70mm projector heads and platter systems with the traveling show if necessary. They should have also struck a few 15/70 blow-up prints for IMAX theaters that still run film. That would have solved some of the venue issues as well. What a friggin' mess!
  5. I used to work with Apple on NLE development. Apple was losing money with FCP7, so even though the developers did make a 64-bit version of FCP7, they were let go: some of them went to Avid, others went to Adobe, a few went to Blackmagic. Final Cut X started as iMovie with a slightly different GUI. As it gained traction, they added features to the pre-existing iMovie engine and eventually replaced it with a much more powerful one. Yes, I'd agree that Apple developed FCPX for those people who refuse to learn the standards of editing. I've been editing for 20 years (film and linear) and modern NLEs have nothing to do with editing film. They're actually more akin to A/B/C-roll linear editing: a preview and program monitor with assemble, insert and cut functions; the J/K/L keys, which act as a jog shuttle; audio scrubbing like an analog tape machine. Tracks are what you need to test new ideas and add layers of effects. You need the ability to add 10 tracks of video on top of what you're working with, so when the client comes in, you can show them different ideas without switching sequences, simply by turning tracks on and off. Plus, when you're really editing, you're constantly isolating effects and tracks to test new ideas without the other ideas gumming up the works. This goes for audio as well; most of my audio tracks have a minimum of 2 effects per track and sometimes even more. Not per clip... per track! Remember, Avid is "track"-based editing and FCP is "clip"-based editing, very different worlds. Even so, FCP7 can work just like an Avid, which is pretty crazy. Plus, Avid and FCP7 are both keyboard-based editing software, so there is no need for dragging, dropping or, frankly, using the mouse. This makes for super fast cutting, far faster than any other NLE if you know how to use it. Yes, it's more keystrokes, but those strokes can be performed faster. You're talking about workflow, and I've been an advocate for changing the workflow.
However, in most cases you can't change it. Most films will be edited using proxy files no matter what. This is to cut down on cost, and I haven't seen a single example of anyone editing an entire mainstream feature film on FCPX using raw 4k/6k media. Now, I worked in the trailers, EPK and BTS industry for quite a while. Studios won't give you the raw media for security reasons, so you're yet again editing in proxy files, generally given as a Pro Res SD format. Every single trailer you've ever seen has been cut in proxy and then online edited at a lab where the master files are online. That's just the workflow, and that's not going to change, because storage is still very expensive and studios are not going to release feature films in 4k quality for people to cut with; that's never, ever, ever going to happen. "Focus" used 2k media, not 4k media. They proxy-transcoded everything as well. So we're not talking about a much different workflow than the rest of the industry already uses; a few pixels from 1080 to 2k. In this case, the NLE is almost irrelevant. In fact, they probably could have edited faster using Avid's Script Sync, which actually listens to your dialog and matches it up to the script in real time. Right, but Avid and FCP7 do all the same stuff. In fact, the Avid metadata is even stronger because of Script Sync. My workflow is to watch a clip and make markers. I then edit those markers using not just keywords, but also notes. Those notes are available to the search system, so whenever I'm looking for something, I can simply type in what I'm looking for and it will pop up. Avid 8 adds a "relevancy" system as well, just like FCPX, so you can adjust relevancy depending on keywords. The best thing is that Avid NAILS critical things like "reel" and timecode information. These are both super critical when going back to online media, or frankly if you work with ANYTHING ELSE but a modern digital video camera.
I honestly don't care about MOST of the metadata collected by FCPX. What kind of camera, what f-stop, what shutter speed, what lens, what card, etc... these things are irrelevant in most cases. You do need to know what "reel" (mag) it is. You do need to know where the file is stored (physical location of original media), frame rate, resolution, time/date, timecode, scene and take. There are a few plugins that can read slates, which are nice for auto metadata, but Script Sync does most of this for ya anyway. I wasted 2hrs of my day watching his videos. He clearly makes money off selling his products. He has some great ideas, but they're just ideas, and frankly, shooting a mid-budget feature film with stars in 2k and finishing in 2k, heck, even considering that as "finishing" quality, just proves how wrong he is. 4-perf fine-grain 35mm negative is almost 6k worth of resolution, and even in the cinemas, fourth generation, it's still around 3k worth of information. So all this new "digital" technology isn't any better than what we shot with for the last 20 years (ever since T-grain stocks came out 20 years ago). It's just that a lot of these "whiz kids" don't even contemplate those things. They're not concerned about final output quality, or that they're showing a 2k image on a 4k projector. Really? I mean, the whole thing is just silly and doesn't make any sense. He talks a good talk, but so far none of the numbers appear to back up his claims. Yes, it's easier on the filmmaker, but in the '40s, '50s and '60s, we made movies in less time, with less money, without VFX, without monitors, without instant replay, and some of them are considered some of the best movies ever made. Digital production is over-bloated with multi-camera shooting, with DITs, video village, take upon take, because there isn't any consequence to just running the camera. I don't share Michael's optimism and future take on technology whatsoever.
I think the digital age is failing miserably, and in 20 years, when MOST of the stuff we produce today can't be scaled to whatever our next resolution is, people will look back and say they messed up. It reminds me of the early digital films, stuff shot on SD video like 28 Days Later. Totally unwatchable today, and there isn't anything you can do about it! Forever, that film will look that way, unlike originating on celluloid, which is completely scalable for the future. Nobody thinks about those things.
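The marker-and-keyword search workflow described above is simple enough to sketch in code. This is just an illustrative toy in Python; the clip names, keywords and relevancy weights are all invented for the example, not taken from any real NLE's API:

```python
# Minimal sketch of a marker/keyword search like the Avid workflow described
# above: each marker carries keywords plus free-form notes, and a query is
# ranked by a simple relevancy score. All data here is invented.

def search_markers(markers, query, weights=None):
    """Return markers matching any query word, best matches first."""
    weights = weights or {}
    words = query.lower().split()
    results = []
    for m in markers:
        text = (" ".join(m["keywords"]) + " " + m["notes"]).lower()
        score = sum(weights.get(w, 1) for w in words if w in text)
        if score:
            results.append((score, m))
    return [m for score, m in sorted(results, key=lambda r: -r[0])]

markers = [
    {"clip": "A004_C012", "keywords": ["interview", "wide"], "notes": "boom dips into frame"},
    {"clip": "A004_C013", "keywords": ["interview", "close-up"], "notes": "best take, clean audio"},
    {"clip": "B001_C002", "keywords": ["exterior"], "notes": "sunset establishing shot"},
]

hits = search_markers(markers, "interview clean", weights={"clean": 3})
print([m["clip"] for m in hits])  # the marker matching both words ranks first
```

The weighting mirrors the "relevancy" idea in Avid 8 and FCPX: some keywords count more than others when ranking matches against your notes.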
  6. Right... "transcode to proxy" means you're not editing with the raw material, you're editing with proxy files. That's kind of my point when it comes to the FCP workflow. The last feature I worked on that was shot with RED had over 50TB of raw material. Good luck organizing and transcoding that in FCPX. They wound up sending it out of house and getting back Pro Res proxy files for editing. That's what MOST people wind up doing with R3D files, since Pro Res decodes efficiently across multiple CPU threads, while wavelet-based JPEG2000 leans heavily on the GPU. I've never seen this round trip actually work with anything but a short subject piece stored on a single drive. Everyone says it works great, but FCPX struggles to handle bigger projects using mass storage arrays. I've done lots of testing, and most of the time it just doesn't work. I'd love to see 5 layers of video and 20 audio tracks playing in real time using raw R3D material with no transcodes. With one video track and a few audio tracks, maybe... but even some facilities I've worked at haven't been able to get that far. Direct-attached storage helps a great deal, especially if it's fiber or Thunderbolt. It doesn't really matter what computer or software you use at that point. Once you get over 2 layers of video in any program that's "native" to R3D material, it generally stops working. I know fancy FCPX with all its bells and whistles doesn't really want you to work with multi-level video, but unfortunately when you're working on bigger projects, with clients sitting over your shoulder, that's how you've gotta work, unless you're cutting a multi-cam television show and simply use the editor as a switcher. Yep, most people will be working with proxy files, which is really the only way to work with R3D material. I'm watching the speech from Light Iron about "Focus" right now. I stopped because you mentioned editing on set whilst the filming was going on.
I hate to break it to you, but I was doing that 15 years ago when Pro Res first came out. We'd take the tapes shot on the F900 and ingest them using a portable recorder directly into our portable RAID array. We had 3 assistant editors working through the material, cutting individual scenes and organizing dailies on the spot. A drive would then be dropped off at the lead editor's place overnight, and in the morning they'd do a quick cut of the material, so by noon, during lunch, the director could sit down and watch a "cut" version of everything they shot the previous day. In 2007/2008 I developed an on-set editorial system, using FCP again, to capture media in real time directly off modern digital cinema cameras. It used a program called Picture Ready, which would store a 1080p squeezed (anamorphic) Pro Res file onto the RAID array based on the start/stop of the camera, integrating camera timecode. The editors would take that material and, in real time, they could start cutting with it, since the files were open and never closed. It allowed for INSTANT cutting of scenes, no transcode; you didn't even have to wait for the shot to be over, we could literally edit AS the camera was shooting! We built a little plugin for FCP that would de-squeeze the anamorphic material by simple drag and drop. We also had the 12 channels of raw audio integrated as well, so the editor could pick and choose their audio sources. Plus, when you were done with the cut, the timecode matched up perfectly to the original digital media coming out of the camera if you used timecode as your base. So none of this is new... in fact, I've worked on jobs that were shot on film, where the lab would process overnight, telecine, and we'd have cut scenes by lunchtime the next day. The problem is, everyone thinks this is all amazing new technology and it really isn't. Integrating media with effects artists located at different facilities is something we were doing 10 years ago!
Now sure, we worked with 1080p Pro Res 422 material on pretty much all of these projects, but that's only because the software didn't accept higher resolutions at the time. The cool part is that our fiber-based integration with all the effects houses would allow us to pick and choose final shots for VFX work, push them over to the effects house with our Pro Res file as a guide, and they'd integrate it into their compositing tools. Back then, a lot of people were still using Shake, but anyone using Smoke (which was new at the time) would have full integration with XMLs coming out of FCP. So we built a huge facility full of Smoke workstations, trained the compositors to use that software, and literally pushed XMLs and media over fiber between the different facilities. This system is still in use today here in L.A., and all of what this video talks about is stuff we did FAR before any of these guys thought about making this particular film. The problem I have with Apple and Final Cut Pro is that they've ignored the proper way to edit in their GUI. They said, hey everyone, you're going to edit like we want you to edit. They also hide a lot of the features from the public, making it another iMovie or iPhoto, which constantly break and are slow because the back end gets goofed up. The great thing about FCP7 was how simple it is to work with. Even Premiere is far less complex to work with, back-end wise, than FCPX. Once you learn Avid's back end, it too is very simple and easy to understand. Plus, the great thing about Avid is the AMA linking ability, which is actually 100% native with any JPEG2000 material like R3D. While I wasn't a fan of Avid by any means, and it took me years to adapt and start working with it on my own projects, I now absolutely swear by it as the best tool on the market. Yes, it has problems, but it's a logical program which makes sense. The problem with FCPX is that none of it makes sense.
If you're an editor and you wish to get a job editing, learning FCPX isn't going to help your career. However, Premiere, FCP7 and Avid are pretty much the industry standard, and operating knowledge of them is vital. As more and more companies make the switch to Premiere and Avid from FCP, it's going to be harder and harder to learn those programs when all you know is the Mickey Mouse workflow that's FCPX.
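The timecode matchback trick mentioned above, cutting against on-set proxies and then relinking to the original camera media, is plain arithmetic: convert HH:MM:SS:FF timecode to an absolute frame count and back. A minimal non-drop-frame sketch in Python, with made-up timecode values:

```python
# Convert non-drop-frame SMPTE timecode to an absolute frame count and back.
# This is the arithmetic that lets a proxy cut be relinked to the original
# camera files: same timecode, same frame. Drop-frame NTSC is NOT handled.

def tc_to_frames(tc, fps=24):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames, fps=24):
    ff = frames % fps
    ss = frames // fps
    return "%02d:%02d:%02d:%02d" % (ss // 3600, (ss // 60) % 60, ss % 60, ff)

# A cut point logged against the proxy...
start = tc_to_frames("01:00:10:12", fps=24)
# ...offset by the clip's start timecode gives the frame in the source file.
clip_start = tc_to_frames("01:00:00:00", fps=24)
print(start - clip_start)           # frames into the source clip: 252
print(frames_to_tc(start, fps=24))  # round-trips to "01:00:10:12"
```

As long as camera, sound and proxy all share the same timecode base, any cut made against the proxy resolves to an exact frame in the original media.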
  7. Actually, that is incorrect. JPEG2000, which is what REDCODE is based on, has the ability to down-scale pretty easily, so what you're editing with is actually a scaled-down version of your files. Feature films like "Focus" will edit from proxy files and then export an EDL/XML, to which the high-resolution material will be re-linked and colored. Nobody is using raw RED material for a feature film; it just doesn't happen. Also, the moment you add a few video tracks and effects to a sequence using REDCODE, you're taxing the GPU, CPU and storage so much, it will basically stop working. This is an even bigger problem off shared storage (NAS or fiber SAN), which means actually "editing", not playing around, but the physical act of multi-layer cutting, is impossible with R3D files. Sure, there are people who "experiment" and use single video tracks, but to edit a real product, you need more than one track and a super fast computer. Just for the record, setting up editing systems for professionals is kind of my business. I'm a professional consultant for the post production industry, and what you're saying really doesn't work for professionals. It may work for your personal material on your brand new MacBook laptop, but it won't in a professional environment, due to so many other issues like total online media amount, storage speed, CPU/GPU speed and of course complexity of the sequence. You can do quite a bit with down-scaled RED media on a single Thunderbolt drive with one video layer. However, most people won't have that configuration and/or will need more than one layer of video.
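To put rough numbers on why everyone cuts proxies, here's a back-of-the-envelope storage calculation in Python. The data rates and footage totals are ballpark assumptions for illustration only, not published camera specs:

```python
# Rough storage math behind the proxy workflow: camera-original wavelet raw
# vs. a modest offline proxy. All rates below are ballpark assumptions.

def shoot_terabytes(mb_per_sec, hours):
    """Terabytes (decimal) for a sustained data rate and total running time."""
    return mb_per_sec * 3600 * hours / 1_000_000  # MB -> TB

hours_of_dailies = 100   # assumed total footage for a feature
raw_rate_mbs = 110       # assumed ~110 MB/s for high-res wavelet raw
proxy_rate_mbs = 6       # assumed ~6 MB/s for an SD/HD offline proxy

print(round(shoot_terabytes(raw_rate_mbs, hours_of_dailies), 2))   # raw: tens of TB
print(round(shoot_terabytes(proxy_rate_mbs, hours_of_dailies), 2)) # proxy: a few TB
```

With assumptions in that range, the camera originals land in the tens of terabytes, the same order as the 50TB show mentioned above, while the proxies for the entire film fit on a single drive.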
  8. I agree David, and will say our modern heavy-effects films will probably not be seen 43 years from now. They're just not that interesting, and I bet 2001 will still be a must-watch then as well.
  9. Maybe it's old stock? Can you upload a video clip? It would be interesting to see how the pattern plays out. Generally bad stock can be identified quickly that way.
  10. Yep, it's all time! What makes Kubrick films so amazing is the fact he spent years developing every detail before shooting, years shooting and quite a while editing. On average it took him 5 years to make a film from conception through finishing and as he got older, it took even longer.
  11. Well, they did use Todd AO lenses on it for sure. They also absolutely used Mitchell cameras, almost all of the BTS images from the end of the film in that white room, are of Kubrick with a Mitchell body, same with the centrifuge scene. Maybe I'm mistaken about which shot that is... maybe it's the bug-eye shot of Hal looking back at the two of them? Which would be a Todd AO bug-eye lens. You could be right, the insert of the screen could have just been taken from a 65mm interview.
  12. I think the Mitchell was used for the 35mm insert shots of their discussion which are square. That's my thought on why that camera was there.
  13. Yea, they're the same codec... the standard XDCAM format is higher bandwidth than the EX format. XDCAM is a good format for television.
  14. I mean it all comes down to shooter, lens selection and post work. EX3 is an inexpensive ENG camera and it's small so you can shoulder it OR push it against your body for lower angle shots, making it ideal for reality work. I used one on a pilot once and really enjoyed working with it for TV stuff. But like any camera, if you're not a great shooter, it's not gonna come out good. The camera in that still is an optical disk camera. Not sure the model off the top of my head, but not part of the same family for sure.
  15. I'm sure he's referring to a particular film, it may have been a misprint in the magazine.
  16. Ohh interesting, it looked too good to be the A7, but I haven't seen anything from the A7RII before, so that's pretty sweet. The glass helped considerably and your Steadicam work was pretty darn good.
  17. I'm one of those strange ducks who likes old technology. Steam trains, old cars, analog tape recorders, 2-stroke engines and motion picture film. I grew up shooting films on motion picture film (and still do today) and of course watching them on the big screen in the big city of Boston where I grew up. My dad was really into the movies and, as a little kid, he dragged me to 70mm screenings of all sorts of things, from Indiana Jones and the Last Crusade to Terminator 2. One of my favorite childhood memories is of my dad holding me up in the projection booth to look at the 70mm film on the platter of Terminator 2; I was only 14 at the time. I'm the guy who is turned towards the projection booth, staring at the projectionist as he loads the film, waiting patiently to see it start, the rest of the audience wondering what this little kid was staring at. So of course, as a filmmaker in Hollywood, I've attended as many film screenings as I could, and since film as a theatrical format is dead, whenever someone works up the gumption to release something on film, I'm there. Last year it was Christopher Nolan's 'Interstellar' and this year it's Quentin Tarantino's 'Hateful Eight', both projected in 70mm and both finished photochemically. What Quentin is doing with Hateful Eight and the revival of the 70mm roadshow screening is quite impressive. The Weinsteins have helped pave the way for Quentin's vision, and after tonight's screening, I'm in awe of their accomplishments. Since the Hateful Eight is as much about the format as the story, the Weinsteins decided to have special screenings for Academy members, because DVD screeners won't do it justice. As a consequence, there are two weeks of screenings on both coasts of the US, and I chose the one at the DGA (Directors Guild of America) because I know how much they care about projection. The only negative thing about seeing the film so far before its release date is that I'm unable to discuss the movie itself.
This is eating me from the inside out, because all I want to do is shout and rave about the details of this movie, but alas, I can't. However, I do think it's OK to discuss the projection and what I saw on screen. Not to review the movie, but to simply explain what I saw, for fans of the 70mm format like myself. Hateful Eight is being distributed and shown in over 100 theaters, half of which had 70mm projectors installed for the occasion. This is a huge challenge, and the members of this forum may know that the great staff over at Boston Light and Sound (my home town) supplied the equipment; most of it will be shipped back to them after the screenings are over if there isn't another 70mm film to be shown in the next year. To help the projectionists in their task, the film will be shipped already spooled onto platters. This is not the first time this has happened; IMAX films have frequently been shipped this way. It helps with errors that may stem from poor splicing practices. As a consequence, one would assume the prints would be clean, without cigarette burns (changeover cues) in the upper right corner at the end of each roll. However, this was not the case for my screening; more about that in a minute. The DGA theater doesn't have a big screen for its size; it's very old school in that way. So the actual viewing experience wasn't optimal, but the projection sure was. The screening starts with an overture and a nice text card on screen saying just that. Reel one, on projector A, had some gate wobble AND the right side of the frame was going in and out of focus very slightly during the overture card. I was very concerned, because at the time I thought it was a platter projection, and likewise it would be that way for the entire presentation. The beginning of the film contains some great exteriors, and the issue continued for quite a while. However, the moment projector B kicked in, the problems went away and I never saw them again.
My guess is, there was something stuck in the gate, preventing it from closing on (squashing) the film properly, and the projectionist fixed it, because even the end credits were rock solid, no wobble or anything. This might sound disheartening to some people, but the projection was so good, it looked like digital. I've seen some IMAX presentations like this before, where it's so crisp, so clean, so grain-less, you'd swear it's the best digital projection ever. Yet this was the first 5-perf 70mm projection I've seen which could stand next to the very best IMAX 15/70 projection and hold its own. Most of that comes down to the top-notch projection, which was flicker-free, rock solid in its stability and silent (no whirr of a projector in the background). The other part came from the filmmakers' use of fine-grain stocks, like 50D for exteriors and 250T for a great deal of interiors. There were a handful of shots where you could see grain, but the vast majority were smooth as silk. Digital projection leaves most content lifeless through raised blacks and crushed highlights; film projection doesn't do that. Plus, this being a brand new print, perhaps shown two or three times prior, it was in immaculate condition. The detail in every shot was outstanding. You could see strands of hair and sweat in close-ups. You could see the crispness of the layers of snow; even though it was all white, there was still detail. One of the characters has a striped shirt on, and even in the dark scenes, you could differentiate between the brighter and darker sections of that shirt. To me, that's a great acid test for dynamic range, and I was more than impressed. I kept saying to myself during the screening, if everyone saw it like this, there may be a push for future 70mm releases. If there was to be a complaint about the technical side of Hateful Eight, it would be the interior lighting of the main location.
I knew it was going to be over-lit, thanks to the trailers and press stills, but it was way more over-lit than I even expected. Table tops so bright, the actors' faces were getting enough bounce for that to be the only light source. I understand the reasoning behind this lighting concept; it saved a lot of time and allowed the filmmakers to use slower-speed stocks and lenses. However, it's my only real beef with the film technically, and that's pretty good coming from someone who simply can't sit through our modern films due to how poorly they're made. Bob Richardson and his crew did a fantastic job at making a stage play interesting to watch on the big screen. The only other technical thing to discuss is the use of those fantastic anamorphic Ultra Panavision lenses, which were re-built just for this movie. It was a clever idea to tell this story in 70mm Ultra Panavision, because most of it takes place in one interior location. The filmmakers could use wider shots and achieve longer dialog scenes, which not only looks cool, but saves time. I think the standard 5-perf 70mm 2.20:1 aspect ratio may not be quite wide enough for this movie. The unbelievable bit is how little those old lenses breathed and how little anamorphic distortion there was. I was more than impressed with the look, never noticing the anamorphic lensing. There were a few shots using diopters as well, pretty slick stuff and cool looking, since nobody uses them anymore. It reminds me of films from the '70s and '80s, where you had to keep the lens wide open but wanted more depth of field. With all that said, Hateful Eight is a wonderful cinematic experience for the true cinephile. It was clearly made with the hearts of many others like myself, who strive to keep shooting film and keep it alive for future generations. The shooting crew, the team over at Panavision, the great finishing guys at FotoKem and the DGA projectionist all did an amazing job making this film.
Everything came together flawlessly; Quentin's vision (from the first time he saw that Ben-Hur chariot scene) was a complete and utter success. He's proved, without a shadow of a doubt, that 70mm acquisition and projection can be better than digital. There wasn't a moment watching the movie where I was taken out of the action due to a problematic technical error, like in so many modern films. Every frame was rich in color, painted with a master's paintbrush and projected with artistic flair of its own. What Quentin has shown us with Hateful Eight is that it takes a team of outstanding artisans, behind the camera, in the lab and in the projection room, to show something properly on the big screen. It doesn't require fancy modern digital technology to tell a story; it only requires technology from the '60s and a few people who care. If you are a cinephile, take some time out to see Hateful Eight in 70mm before it's too late. Even if you don't like the story (it's not everyone's cup of tea), go for the technical aspects alone, and next time you see something digitally, just remember how flat-out amazing GOOD film projection is to watch. Thanks for reading!
  18. Ohh, and for anyone in LA... LACMA is showing back-to-back roadshow 70mm versions of It's a Mad, Mad, Mad, Mad World and Hateful Eight, Sat Dec 19th starting at 1:30. This MAY BE the best place to see it, because those guys show a lot of 70mm and they have a dedicated projector, PLUS Mad Mad World is also in the same anamorphic format, so I'm sure it will be rock solid. It's not the best theater in the world, and I'll still be squeezing in an appointment at the Cinerama Dome if they show it, but it's well worth the screening! Tickets are on sale now: http://www.lacma.org/event/mad-mad-mad-world
  19. Poop, that's just wrong. Press screenings should be something very special, and that one looks like it wasn't. I mean, they may have had a bad anamorphic decoder element, which is something I'm sure everyone is talking about right now. Personally, I feel it was a mistake to shoot in anamorphic, because it adds one more variable to a system that really is barely working to begin with. Heck, even when I saw Interstellar on 70mm @ the Cinerama Dome, during the 70mm trailer for Inherent Vice that preceded it, the projectionist was adjusting the loop, framing and focus for a good portion of the trailer to ensure the feature (when it came on) would be perfect, and ya know what, it was!
  20. Nice, really enjoyed it. :) So which camera was it shot on? I'm guessing FS7.
  21. In other good news... reviews are coming out about the film, so far it's all good!
  22. Yea, it would be cool. It may be possible with a still image of a pattern or something.
  23. Ohh I wish you could do that, but unfortunately you can't sample. You can apply a LUT (look up table) of different print stocks. I used a Fuji print stock on my sample earlier. You can do pretty much everything with the free version, but you need one heck of a fast graphics card for it to work properly.
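For the curious, what a print-stock LUT does under the hood is just table lookup with interpolation; real print emulations are 3D cubes where R, G and B interact, but the principle is the same. A toy 1D Python sketch with an invented curve (not a real Fuji emulation):

```python
# A toy 1D LUT: input values in [0, 1] are mapped through a small table of
# output values, interpolating linearly between entries. Real print-stock
# LUTs are 3D (R, G, B interact), but the lookup idea is the same.
# The curve here is invented and does not model any real film stock.

def apply_lut_1d(x, table):
    """Map x in [0, 1] through `table` with linear interpolation."""
    n = len(table) - 1
    pos = min(max(x, 0.0), 1.0) * n
    i = min(int(pos), n - 1)
    frac = pos - i
    return table[i] * (1 - frac) + table[i + 1] * frac

# An invented "print stock" curve: lifted toe, gentle shoulder.
curve = [0.02, 0.10, 0.28, 0.55, 0.78, 0.92, 0.98]

print(apply_lut_1d(0.0, curve))  # toe: blacks lifted to 0.02
print(apply_lut_1d(1.0, curve))  # shoulder: highlights rolled off to 0.98
```

This is also why a fast graphics card matters: a grading tool applies this lookup (in 3D, per channel) to every pixel of every frame in real time.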