Everything posted by Tyler Purcell

  1. Well, neither camera is really professional due to the viewfinder (LCD displays are almost worthless for critical focus or in direct sunlight). Both shoot the only surviving non-professional film format, which has major glitches in its manufacturing, causing gate weave and other capture issues due to how the perfs are made. The max capture time @ 24fps is 2.5 minutes, which is too short for any real professional work. The Logmar (and I assume the Kodak) makes a little too much noise to be considered "silent" for critical sound recording. Dirt in the gate can be catastrophic in the final image because the frame is such a small area, and you can't really check and clean the gate like you can on a professional camera since there is a prism in the way. I don't know about that. 16mm is a FAR superior format and the cameras today are only a few hundred dollars. Plus, for the price of a Logmar system, you can buy a REAL S16 camera and prime lenses and have an actual silent sound-recording camera that shoots almost 12 minutes per load. S16 is almost the same price to shoot per minute, and because it has a far superior projection system, you can print and project it much more easily, delivering an excellent photochemical-process image. The problem with the Logmar camera is that they tried to make a consumer home-movie format into more than it is. Most people who shoot super 8 want that home-movie look; that's why they choose the format. People who want a more professional look will shoot on a professional format like S16.
  2. Nice find Miguel! Thanks for sharing. :)
  3. How have you seen this material, and what was the viewing format? Some of the stuff out there is graded at Rec709 and compressed to h.264 before presenting. I've seen the footage in DaVinci (RAW) on a 4k color grading monitor and it was excellent. The demo material included both dark scenes and really bright ones. I had no problem lifting the blacks and lowering the highlights while still keeping detail. The 4.6k imager has the soft-clipping behavior you see on the other cameras, which is more similar to Arri than to Red or Sony, both of which tend to clip harshly, as discussed many times. From my experience shooting Blackmagic exclusively for the last 2 years (2.5k, 4k and Pocket; RAW, 444, 422; 4k and 1080p) in pretty much every configuration imaginable... they deliver a nice warm image almost all the time. It's VERY easy to turn warm into cool; it's much harder to turn cool into warm. I like warm because everyone today grades things cool (blue/green) and it's boring. I don't feel the camera has missing channels in RAW mode. However, in 422 the blue and red channels are more muted and do bleed slightly. It's not so noticeable on a normal project, but it is when you're doing tests in controlled environments. The 422 color sampling just doesn't work well. Since most people shooting 4k would be working in 444, it's a moot point, because in full RGB mode the issue doesn't exist. Another thing to think about is how the camera is being colored. I've found the stock built-in Rec709 LUT in the camera, and the DaVinci LUT for the camera, to be warm in and of themselves. When you color without a LUT, that warmth isn't there at all. In fact, the camera has very muted colors naturally. It really takes a lot of work to build up the color in DaVinci, more akin to an Alexa than a Sony or Red. Pocket camera, 422 mode, no LUT, hand-colored: Pocket camera, 422 mode, DaVinci LUT.
  4. I mean, it really depends on your productions and the quality you seek. The great thing the Red and Blackmagic cameras have stock, which none of the MPEG-based cameras do, is the ability to shoot REAL raw, not faux raw. Raw is NOT an MPEG file with a flat profile; raw is capturing the imager data directly to disk, without subsampling. So where cameras like the FS7 have raw options, it's not the same raw you would get from an Arri, Red or Blackmagic. I'm not a fan of Red for many reasons, from the extremely proprietary file format and accessories to the form factor and inherent imager tint. Red also has a deal-killer for me personally: global shutter is an add-on. The Raven is in direct competition with the URSA Mini 4.6k; they've modified the feature set to go directly against it. However, it's lacking a lot of the KEY things that make the URSA so great, and of course the URSA Mini 4.6k is missing a few of the things that make the Raven great. So in my opinion it's really down to 2 cameras... the Red Raven and the URSA Mini 4.6k.

I would personally never own an MPEG camera, but that's up to you. With formats like Pro Res, you can adjust bandwidth for your shoot based on whether you really care about what you're shooting. So with the URSA Mini 4.6k you can shoot RAW at multiple compression levels and resolutions (like the Raven). You can shoot Pro Res XQ in Rec709 or RAW color space, again at multiple resolutions. You can shoot standard Pro Res 4444 at multiple resolutions, and Pro Res HQ, LT and Proxy as well. So the awesome thing is that you can determine a lot of these things in camera. Also, Pro Res is a far better-looking codec than MPEG. Pro Res XQ has been used to capture a myriad of theatrically distributed films, and with external recorders today, most people don't use the internal recorder on MPEG cameras anymore. So why buy an MPEG camera that you're just going to add a recorder to? Sounds like an expensive waste to me.

Now, yes... the URSA Mini 4.6k is a new camera, there is no doubt. However, I have done some testing with it to see if it has the same issues as the 4k imager, and so far it doesn't. One of the telltale signs of that garbage 4k imager is the baked-in imager noise; the all-new Blackmagic-designed 4.6k imager doesn't have it. Blackmagic has been delayed getting me a sample, since they're hard pressed to meet their delivery deadlines right now, but I expect we'll have one by April and I will post some test footage. I'm very good at tearing cameras apart and finding their issues, so it should be a piece of cake to figure this one out as well.

What makes me excited about the URSA Mini 4.6k: the form factor, the ability to run PL and B4 (broadcast) lenses with an adaptor, the industry-friendly capture formats and imager size, the high-end I/O without adaptors, the beautiful OLED viewfinder, standard V-mount batteries, and the ease of shoulder mounting. These things are so critical to me, and it appears Blackmagic is the only company to have figured it out, which is very strange. I will say one thing, and this may be a deal-killer for some people: it's nowhere near the sensitivity of the Red and Sony cameras. Blackmagic is more akin to Arri in their design, where it's more of a movie camera rather than a digital camcorder, which is what the FS7 is.

So it's a hard bargain, I know. What I'd do is wait to see what the Raven and URSA Mini look like when they're both out and people are shooting/testing. That won't be long now, and I think either of those cameras (depending on what you're doing) will be the right choice.
  5. Here is the required bandwidth for standard Pro Res 4444 4k: 1199.17Mbps = 149MBps (image below). So for ONE STREAM, as if you're editing a single track, you'd be OK with a two-or-more-drive RAID 0 on a high-speed data connection: USB3, eSATA, Thunderbolt or Fibre Channel. But at that rate, you're looking at 540GB per hour! OUCH! Which is why I recommended a lot of storage. However, since nobody really edits ONE TRACK at a time, you really need 2 times the bandwidth. Why? Because you'll have at least two sources: camera original and render files for things like transitions and effects. The moment you add another video layer, you need a lot more bandwidth to deal with that as well. Even though NLEs today are smarter than they've ever been, USB3 and eSATA aren't very good at multitasking; they get bogged down very quickly under high bandwidth requirements. These are the reasons I made my first recommendation, and the math is sketched below.
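If you want to sanity-check those numbers for your own codec and stream count, here's a minimal back-of-the-envelope sketch in Python. The 1199.17Mbps figure is the one quoted above; the two- and three-stream timelines are just illustrative assumptions, not a definitive sizing guide:

    # Rough Pro Res 4444 4k editing bandwidth/storage estimate.
    BITRATE_MBPS = 1199.17            # one 4k/24 Pro Res 4444 stream, megabits/s

    def megabytes_per_sec(mbps):
        return mbps / 8               # 8 megabits per megabyte

    def gigabytes_per_hour(mbps):
        return megabytes_per_sec(mbps) * 3600 / 1000

    one_stream = megabytes_per_sec(BITRATE_MBPS)          # ~150 MB/s
    print(f"One stream: {one_stream:.0f} MB/s, "
          f"{gigabytes_per_hour(BITRATE_MBPS):.0f} GB/hour")

    # Real timelines need at least two streams (camera original + renders),
    # and every extra video layer adds another stream's worth of bandwidth.
    for streams in (2, 3):
        print(f"{streams} streams: {one_stream * streams:.0f} MB/s sustained")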
  6. Huh, well that's interesting! Everyone markets them as S16 glass. http://www.keslowcamera.com/catalog/lenses/anamorphic-prime-lenses/hawk-v-lite16-anamorphic-16mm-lenses-1-3x-squeeze/ Ohh, they make 2 versions! One for S35 imagers and one for S16 imagers. I've never seen the S35 imager version in LA, only the S16 version.
  7. Hey Matthew, The 1.3x V-Lites are for super 16mm sized imagers. I'm pretty sure they won't cover the imager of the Canon C300 MKI. Most people use 2x anamorphic glass with S35mm imagers; that's the right way to go and the most common. The 1.3x lenses are available at Keslow in LA and Vantage in NYC. The 2x lenses are available from many more vendors; a Google search will provide you with a long list. One other unfortunate side effect of the Hawk lenses is the price. I was quoted $4500/week for a set of 1.3x anamorphic primes. Yes, forty-five hundred dollars for 7 days! The 2x primes aren't much less, around $3200/week. Those new Hawks are very expensive to purchase and aren't rented as often, so the rental houses are charging an arm, a leg and a firstborn for them! You can get other/older anamorphic glass with S35mm coverage; you just gotta call around and see what's available for those shoot dates. Since you live in Los Angeles, may I suggest contacting Panavision directly? They have a whole room full of Alexas and gobs of anamorphic lenses. They also love helping students with their projects. I bet their pricing with camera is less than just the Hawk rental fee! EEK! Hope that helps!
  8. Yep, Football will dictate for sure! LOL :)
  9. S'all good Manu! I know you're VERY passionate about Fury Road, and I don't mean to be insulting. I don't know a single person outside of yourself who would put Fury Road even close to Oscar contention, let alone nomination. It's a Saturday-afternoon popcorn film, nothing more, nothing less. How it got the nomination is exactly what's broken in this industry: people are blown away by spectacle and they vote/review based on emotion rather than logic. Here is the reality of your comment. Fury Road has an 8.2 on IMDB right now and an 89/100 Metascore. Well, 'Gone With The Wind' has an 8.1! 'The Big Lebowski' has an 8.1 and 69/100! 'Fargo' has an 8.1 and 85/100! 'No Country for Old Men' has an 8.1 and 91/100! 'There Will Be Blood' has an 8.1 and 92/100! 'Butch Cassidy and the Sundance Kid' has an 8.1 and 58/100! Those films are all right near each other on the IMDB top 250, amongst many other award-winning films from the history of cinema, alongside others like Buster Keaton's 'The General', arguably one of the top 10 films EVER MADE! So yes, in a lot of ways I believe people are simply BLOWN OVER by shiny things, violence and grotesque humor. Plus, Fury Road is a heart-pumping movie from start to finish, which also makes some people super excited. This is how amusement parks stay in business. People want that rush, and Fury Road delivers the perfect ride for those people.
  10. Funny you mention that: the guys who listened to me have plenty of storage and bandwidth. The guys who didn't are always scrambling for backup drives to dump data to. I guess things are entirely different on the other side of the pond. :shrug: BTW, my numbers are generally accurate. I double-check them against manufacturers' data sheets before posting, every time.
  11. Watch the interviews and read the ICG and AC articles. You can tell a lot through a person's body language.
  12. I couldn't wait to get it off my screen! Poop is too nice a word for the drivel produced in Fury Road. It's just eye candy for people who have no problem switching off their brain. Maybe a good thing? I honestly can't switch off my brain; it's always chewing on something, and from the first frame to the last I was contemplating what I could have been doing during those two hours. It is truly a movie with no purpose for existence, much like Jurassic World, where at least the filmmakers got a passing grade because it had a script, albeit a very thin one. Fury Road is on the Oscar list for one reason... it made a lot of money, and that's the sad reality about Hollywood. Had Beasts of No Nation been nominated for several categories like it SHOULD have been, life would be good! Let movies like Jurassic World and Fury Road come and go, and let the truly great films float to the surface. When I think Miller spent a decade prepping Fury Road, my head explodes. It feels rushed and nonsensical, as if they threw it together in a few weeks and hurried through post.
  13. Now you get why I'm so furious about Mad Max, Bill...
  14. I did forget about that, sorry. 10 bit 444 isn't even on my radar anyway; it's completely worthless in my book. It's "faux" raw; it's still not working directly with the imager data like real raw. The difference between 422 and 444 is generally unnoticeable to all but the most trained eye. The difference between 10 bit 444 and real raw is night and day. Sony just wanted to pretend they had something more powerful than they really do. It's not native, and the 3rd-party plugins have issues. I deal with Windows-made Pro Res files every day, and we've been forced to install Macs in places that create Pro Res due to the extremely slow encoding speed AND issues with playback speed. If you encode the same Pro Res on a Mac vs. a PC, you will see huge discrepancies in how they function; it's almost two different codecs.
  15. Interesting, I was part of the committee that wrote the book on motion TIFF sequences. The patents that Adobe holds for Cinema DNG were developed by the company I worked for! LOL. DPX, Targa and Cinema DNG are ALL 24 bit TIFF-style formats. I like to use the word "TIFF" because it's easy for someone with Photoshop experience to understand what's behind those three formats. Yes, there have been A LOT of modifications over the years, including CUDA support.

Yes, XAVC-I @ 60fps is around 600Mbps, but most people shoot at 24, 25 or 30fps, which is a more respectable 450Mbps. Remember, that's megabits per second, so not very much data. Yep, and raw is 24 bit and around 250MBps (megabytes per second) for 4k (not UHD). (Quick conversion sketch below.)

Sure, but the CUDA and OpenGL drivers are separate. Since there isn't a 444 10 bit format, I was merely using the phrase "10 bit" to differentiate between 12 bit or higher bit-depth formats and 10 bit and lower formats. The graphics card will always downsample, and you're right, most displays and editing software will sample to 8 bit. Avid and DaVinci can sample to 10 bit or full color space. The higher the bit depth of your material, the harder the GPU works to subsample. We've done thousands of tests on this, and it's very clear that subsampling via the GPU the display monitors are connected to makes a huge difference. In other words, the faster the CUDA GPU is, the faster the subsampling is. You can test this yourself by downloading sample clips of different codecs from the internet and forcing DaVinci or Avid to display in 10 bit mode. You will see a radical difference in real-time rendering speed based on the graphics card. We've installed K5000s on many machines for DaVinci as well and have seen HUGE performance leaps. It doesn't show up if you just play back footage through the machine, so as a "tech" you wouldn't see anything change in the monitoring system. As a creative, once you start using the real-time functions of software like DaVinci, that's when it becomes apparent. My clients sometimes have 20+ mattes in each shot rendering in real time at full resolution and full bit depth. That's impossible to do without an accelerated CUDA graphics card.

True, but now you're talking about the imager/hardware in camera converting to a compressed color space and lower bit depth and then expanding that in your editing software. You lose most of the imager's ability to deliver an image at that point.

I'd love for you to come out to Hollywood sometime and sit down for a session with my clients who spent millions on post production solutions. It's not because they wanted to; it's because they had no choice! I work with big clients every day, solving these problems one by one, and I tells ya, there is far more to it than meets the eye. I've been flabbergasted at some of these companies' inefficient workarounds, forced on them because their hardware and storage isn't fast enough. Mind you, working at home on a single computer, you can get away with a lot more. Modern/new computers do come with fast enough graphics cards; they're VERY fast compared to those of only a few years ago. So when I talk about upgrading cards, I'm really not talking about computers made in 2015... more like computers made in 2010, which is what MOST people in this industry still have.
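Since megabits and megabytes get mixed up constantly, here's a trivial conversion sketch using the figures quoted above; the side-by-side comparison is purely illustrative:

    # Mbps (megabits/s) vs MBps (megabytes/s): a factor of 8.
    xavc_i_24fps_mbps = 450        # XAVC-I @ 24fps, megabits/s (quoted above)
    raw_4k_mbytes_per_s = 250      # 4k raw, megabytes/s (quoted above)

    print(f"XAVC-I: {xavc_i_24fps_mbps / 8:.1f} MB/s")       # ~56 MB/s
    print(f"4k raw: {raw_4k_mbytes_per_s * 8} Mbps")         # 2000 Mbps
    ratio = (raw_4k_mbytes_per_s * 8) / xavc_i_24fps_mbps
    print(f"Raw carries ~{ratio:.1f}x the data of XAVC-I")   # ~4.4x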
  16. The camera is 10 bit, so the max color sampling is compressed 422. If you use an external recorder, it doesn't make any difference, because the actual output from the camera is still compressed 422. You'd have to move to an F5 to get 444 color, which is 12 bit. Pro Res is not natively compatible with PCs. There are hack plugins that let you use the codec, but they're inefficient (slow) and will always be slow to play back no matter what. Sony Vegas has a Pro Res plugin, but it only supports HQ. For Pro Res XQ or 4444, you will need a different editor.
  17. Film died due to big businessmen not understanding its value, no different than newspapers. It had very little to do with acquisition and everything to do with distribution; the amount of print stock produced far exceeded the amount of camera negative used. Same goes for newspapers: the distribution medium is the key, and internet distribution has little to no value. The businessmen didn't understand that at first, and now they're backpedaling like crazy. The precipitous move to digital distribution also killed many of the low-cost venues and raised pricing at others, which lowered total viewership. In fact, unique non-repeat viewership in cinemas has been steadily decreasing since the heights of 2009, and even though 2015 was a good year thanks to big franchise films coming back (Jurassic World, Mad Max, Star Wars), so far the data still puts viewership about even with 2014. No, I don't feel X will go away. Newspapers are actually slowly becoming more popular again because people are tired of internet advertisements. Motion picture film is having a resurgence because the businessmen are out of gimmicks; we've come full circle back to the early days of widescreen, when 70mm was the big way to bring people back to the cinema. We're charging people more money yet giving them 70-year-old technology... welcome to the entertainment industry! I actually feel this resurgence of film will be bigger than people initially let on, especially since most filmmakers have already done their "digital" experimentation and those who can shoot film appear to be going back. The big problem today is the ease of film processing and instant dailies. Once someone figures that out (more mobile lab trucks), there is no excuse not to shoot film and distribute in 70mm. For better or worse, Quentin set something in motion, and I truly feel it's going to be the future of high-end distribution.
  18. I so wish that were the case, but it's really not. The sad truth is that we don't have a lot of content finished in UHD. If there were, we'd see every BluRay available in UHD. We'd see HBO, Cinemax, Showtime and on-demand channels pushing UHD to customers (which they have run tests of). We'd see every sporting event broadcast in UHD as well. Sure, the Super Bowl will be in UHD this year, but that's the number one broadcast event of the entire year! Some of the infrastructure is there and works fine; it's the content which sets us back. Unfortunately, content owners aren't willing to go through their libraries once more and create new masters. A lot of them just did it for BluRay, and it will be A LONG TIME before they do it again for UHD. When you watch HD content on a UHD television set, you are actually losing quality; you "soften" the image substantially because you're using the monitor's built-in scalers. With CRTs, scaling isn't such an issue, but on digital displays all scaling does is lower the quality of the image, because the scalers aren't very good. It's far better to watch 1080p on a 1080p monitor without scaling than it is to watch 1080p on a UHD monitor.
  19. And AE is very OpenGL friendly, not CUDA, so depending on the type of graphics card, you will actually get a different feature set from AE. The software decoder for RedCode doesn't work well; 4k does require some sort of high-speed CUDA support. This is why I use fast CUDA cards in machines that work with DaVinci and Red files: it actually works! Put the old card back in, and Red won't play back in 4k anymore.

MPEG-1, MPEG-2 and h.264 have no requirement for hardware support, meaning they are designed to function without specialized hardware, which is the polar opposite of RedCode (JPEG2000) and the TIFF family (Cinema DNG, Targa, DPX), formats which were designed specifically for CUDA hardware support. This is why MPEG is so widely used on camcorders and low-end cameras: it doesn't demand a lot of power. Sure, you can accelerate anything, but it's unnecessary and not even worth discussing when it comes to editing, since h.264 isn't a professional camera codec. Today's 8 bit and 10 bit iFrame XAVC codec is so easy to play back that it works on pretty much anything, but it doesn't have the bandwidth necessary for professional use. In post production, most software is GPU-enhanced (Avid, FCP, Premiere, DaVinci). Any real-time effects will be CUDA-supported, so yes, if you're using a lot of real-time effects, it may be smart to invest in a faster graphics card. Software like DaVinci has strict requirements on hardware GPU acceleration; I've tested a myriad of cards and workflows with the other software, and you will see small advantages in real-time effect playback speed. However, that wasn't the original poster's question. He asked about playback, not about editing.

Actually, not true at all. In PC land, there are drivers for specific software packages. AE, for instance, has its own home-brew driver package for OpenGL support. It's installed when you run the Adobe installer, and we've found it to be poorly coded. We've done extensive testing with 3rd-party drivers, disabling the Adobe drivers, and have found that Windows 8 causes a whole bunch of support issues. So it's not that easy on the PC platform, and we've had to revert to Windows 7 on some machines in order to make the older, better drivers work properly. With the Macs, the drivers are built into the operating system and the software installers don't overwrite them. This is a HUGE benefit for quick setup; after spending months trying to get the PCs to play nicely and literally an hour getting the Macs to play nicely, my advice is that customers who use PCs be aware of these driver issues.

It sure does! Full RGB displays entirely differently than compressed RGB because it's greater than 10 bit. Most NLEs only support 10 bit playback, so it has to subsample on the fly to 10 bit for playback.

Yep, and that's a consumer format. In the professional world, it would be rejected by pretty much every facility I've worked at. To give you an example, the minimal requirement for HD is 10 bit 4:2:2 at DNxHD 115; most facilities require 220Mbps Pro Res HQ as their "base" camera format. For 4k work, everyone requires 12 bit 444 in true 4096x2160, because that's the same format used for DCP delivery. So if you wish to play in the sandbox with the kiddies, go ahead and use 100Mbps MPEG; it will play off a flash drive no problem. Professionals like Macks, who is asking about a 12 bit 444 format (probably Pro Res XQ), are clearly looking to do professional post production, which is my expertise.

Thanks for tearing apart my post, but all day long I travel around So Cal installing editing systems for companies that produce products you've seen on television, on BluRay and in theaters. :)
  20. Yea, and who would shoot ENG (TV), commercial or corporate event work on film? If your best-quality playback device is 1080p, film is completely irrelevant. The benefits of film (longevity, high resolution, better dynamic range, more color space and the film "look") are pretty much wasted on 8 bit 4:2:0 delivery formats. That's why I haven't invested in a 4k digital camera yet. Who is paying for me to store 4k files? Who is paying me to upgrade my computer and software to edit 4k, and who is going to watch anything in 4k? Heck, Pro Res HQ @ 23.98, 10 bit 4:2:2, is actually overkill for any broadcast or internet deliverable... which is why I have Pocket cameras. :) I deliver to broadcasters every week and, believe it or not, most of the guys I deal with are still 720/59.94 and won't take anything else. Most of the corporate stuff I do (educational videos) is actually 480, because they still burn DVDs and they don't want the high bandwidth usage on their servers. I had to teach them how to deal with 16:9 material! I'm like, guys... it's 2015 and HD has been around for two decades! I mean, longevity is the key with film. If your product is designed to be watched once or twice and then thrown away, what's the point? If it's a long-term product with long-standing impact, like a dramatic television series, feature film, documentary, anything that should stand the test of time, then film should be considered, especially if film will be used as a presentation method. Ohh, and the audience does care; nobody would have seen Hateful Eight had it not been in 70mm.
  21. It really depends on the kind of codec used. Pro Res is a multi-threaded codec, so it actually uses the CPU more than the GPU. Heavily compressed codecs like JPEG2000 (RedCode) and the TIFF family (Cinema DNG/Targa/DPX) are VERY GPU-intensive, and the CPU matters a lot less. MPEG is the least efficient and works on the CPU. The GTX 660 is a pretty decent card, but it really depends on the drivers and whether they have the right support for the different codecs. If you're on a PC, that's the biggest problem you'll encounter: codecs and drivers will kill your speed, and finding the right combo to work with different software packages is really challenging. Macs, on the other hand, don't have those issues because driver support is built in, and software/hardware manufacturers know this, so they make products that are very easily updatable. Pro Res XQ @ 4k 12 bit is pretty heavy, around 1660Mbps. I've done a lot of testing with Pro Res 4444 @ 4k and have found two drives in RAID 0 work fine. When working with raw Red files, you need a faster graphics card more than anything else; the data is a bit heavier than Pro Res XQ, but not by much. Still, I suggest either an 8Gb fibre direct-attached 8-drive RAID, or, if you have a Mac with 10Gb Thunderbolt, a simple 8-drive RAID will work fine as well. Where things get hairy is multi-track playback and dealing with throughput for FX and audio tracks, so you need a lot more overhead than you'd normally build in (rough numbers sketched below).
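To put rough numbers on that overhead, here's a small sketch. The 180MB/s per-disk figure and the 50% headroom reserve are assumptions for illustration, not measured values:

    # Can a given RAID 0 sustain a multi-track Pro Res XQ 4k timeline?
    PRORES_XQ_4K_MBPS = 1660                  # per stream, megabits/s (quoted above)

    def raid0_mb_per_s(disks, per_disk=180):  # assumed sequential MB/s per disk
        return disks * per_disk

    def streams_supported(raid_mb_s, stream_mbps, headroom=0.5):
        # Reserve half the throughput for FX, audio and render traffic.
        usable = raid_mb_s * (1 - headroom)
        return int(usable // (stream_mbps / 8))

    for disks in (2, 4, 8):
        total = raid0_mb_per_s(disks)
        n = streams_supported(total, PRORES_XQ_4K_MBPS)
        print(f"{disks}-disk RAID 0: ~{total} MB/s -> {n} XQ stream(s) with headroom")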
  22. I personally love the resolution game; it just shows how people are influenced by numbers and hype rather than reality. The reality is, we're a solid decade away from UHD becoming the norm. Display technology is making a huge transition in the next few years, away from LCD/plasma and into OLED. The quality and longevity of these display systems will get better over time; early generations will suffer from the typical high pricing and high failure rate of most new technologies. Buying an LCD UHD monitor isn't a great idea. Once displays are up to snuff, the next big hurdle is distribution. Currently the internet as a whole is not fast enough to stream 8 bit 4:2:0 UHD @ 50Mbps H265, which would be acceptable quality. I know that sounds insane, but with so many people hitting servers at once, we struggle to stream 15Mbps from sites like Vimeo and YouTube, which, as we all know, look like crap. So internet speeds are the big problem, and honestly it's both the servers AND the home connections which need to change. In the last 10 years, internet speeds in the US have increased by a measly 20%; the booming speed gains of the late 90's into the 2000's are over. The average home connection is still 25Mbps, which is unfathomable when it doesn't cost the internet providers any more money to provide a better service (quick comparison below). So that's the first big technology shift which will be needed for UHD presentation at home. Obviously we already have UHD BluRay, but only Sony has signed on for releases. Finally, and the biggest thing, is the content itself. As everyone well knows, the vast majority of content is produced and distributed in 1080i/60, not even progressive, not even 24fps (1080i/50, 25fps in Europe). So we're still doing 3:2 pulldown, we're still interlacing on broadcast and we're still lower than 2k resolution. Content makers need to buy all-new equipment: cameras, switchers, editing solutions and broadcasting equipment. Content owners and libraries need to spend billions doing the same thing THEY JUST DID with BluRay and re-scan tens of thousands of films. Heck, from 2000 to today, the vast majority of digitally acquired products were finished in 2k or less, so there is no way to get a real 4k final out of them. These are the reasons I don't see anything happening for a decade; I honestly think it will take that long before all three of these issues are resolved to the level where UHD makes sense. 4k and 8k at home with 4k and 8k sources? That's so far away, it's not even worth discussing. Let's get all the movie theaters upgraded with 4k laser projectors, let's get the minimum requirement for DCP set to 4k, and then we'll start building solutions for the future. Until then, we're living in a 1080 world... and I think we'll be here for A LONG TIME, as broadcasters are sick and tired of upgrading after the mandatory switch from 480i to 1080i.
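Here's the simple arithmetic behind that bandwidth bottleneck, using the figures quoted above; it's purely illustrative:

    # Average home connection vs. a single acceptable-quality UHD stream.
    home_mbps = 25          # average US home connection (quoted above)
    uhd_h265_mbps = 50      # 8 bit 4:2:0 UHD @ H265 (quoted above)
    hd_stream_mbps = 15     # typical Vimeo/YouTube 1080p stream (quoted above)

    print(f"One UHD stream needs {uhd_h265_mbps / home_mbps:.0f}x the average pipe")
    print(f"Even 1080p streaming eats {hd_stream_mbps / home_mbps:.0%} of it")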
  23. PT Anderson, Wes Anderson, Christopher Nolan, Steven Spielberg, Martin Scorsese, Joel and Ethan Coen, James Cameron, Alfonso Cuarón, Alejandro Iñárritu, David Fincher... these guys have figured it out. But I get it. There are so many moving pieces; getting everything to line up and hitting a home run every time you go to bat is nearly impossible. In order to get that great cast, crew and screenplay, you have to be passionate about the film you're making and understand what makes a truly great film. You can't just be a "director for hire", which is what Scott and Howard have been for so many years: the guys who just take on other people's projects, or read a good book and think it can be made into a great movie. Guys like Spielberg have gotten away with it many times, but only because he gets the best opportunities. When he comes calling, people jump at the opportunity, and I'm sure it's the same with Scott and Howard. Yet, for some odd reason, these two guys simply can't get their act together. I know it's not a talent issue; Scott and Howard absolutely have the same skills (proven time and time again) as anyone else on that list above. Yet their movies are, in many cases, lacking that elusive soul which makes a truly great product. Guys like Coppola have other issues, maybe too much passion and not enough common sense? Balancing those things is very challenging.
  24. Well, 18fps is really a projection-only format. Your 18fps footage will be telecined to 24fps with a pulldown applied, so you will get blurred, duplicate frames in the playback; the crispness of a real 24-frame-per-second shoot and edit won't exist. So I highly suggest shooting and editing @ 24fps. The cost difference up front is minimal compared to the quality loss on the back end (the duplication pattern is sketched below). Yep, and therein lies the biggest problem, really. The expense of shooting an 8-minute short film in B&W is pretty high. Once you add the gate wobble/weave and pressure-plate focus issues of most camera/cartridge combos, the end product looks very unprofessional. So you aren't using it on your demo reel, and years from now when you look back, you may think twice about the decision to shoot super 8 IF you wish to be a filmmaker.
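If you want to see why the transferred footage stutters, here's a tiny sketch of an 18-to-24 pulldown. The frame labels are made up; the 3-in, 4-out repeat pattern is the point, and real telecine machines may distribute the repeats differently:

    # 18fps -> 24fps: every 3rd source frame is held for an extra frame
    # (3 frames in, 4 frames out), which is what reads as judder on playback.
    def pulldown_18_to_24(source):
        out = []
        for i, frame in enumerate(source):
            out.append(frame)
            if i % 3 == 2:          # after every 3rd source frame...
                out.append(frame)   # ...duplicate it
        return out

    one_second = [f"s{i}" for i in range(18)]       # one second at 18fps
    result = pulldown_18_to_24(one_second)
    print(len(result))      # 24
    print(result[:8])       # ['s0', 's1', 's2', 's2', 's3', 's4', 's5', 's5']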
  25. Ya know, I look at Spielberg and Scorsese, two guys whose careers paralleled Ridley's, and those two guys have made amazing films for their entire careers. Sure, both of them had some films that weren't quite up to par with their absolute best, but not a single film anywhere near the lows of Ridley Scott. I guess my point is, if Spielberg and Scorsese can still make great stuff, there really isn't any excuse for guys like Scott and Howard.