Tyler Purcell

Everything posted by Tyler Purcell

  1. Ohh crap, I can see the feathers getting trapped! HOLY MOLY! I didn't see that my first time watching, my bad! It's in the first part of the move, you can see the feathers following the cable as the first glass moves, plain as day! Umm… so yea I'd use magnets, forget about the wire. Getting the right wire and making it work is probably more difficult than it's worth. Magnets are a lot easier; you can simply throw one in a cup and have one under the table, operated by one person. You do need a nonmetallic surface however, but that's pretty easy to find.
  2. I've used the pull back and zoom out method for years. I remember first seeing it in a Terry Gilliam film as a kid and was like, "wow, what a great trick". The whole idea is to keep the object not so close to the camera, but zoom into it all the way to get the closeness you're looking for. Then zoom back and dolly out to reveal your shot. I used trackless doorway dollies to achieve the effect most of the time, because you'll always get the track in the shot at some point. A crane arm on a doorway dolly helps change perspective from the first part of the shot to the last. Focusing on such a small object can be challenging as well. A cute trick could be to put a magnifying glass in front of it as a prop, ya know… throw it into the story. That's a commonly used device to solve problems like this.
  3. This material was shot with the most up to date super 8 camera on the market. There is no reason why the camera makers can't put a hard gate block on the opposite side of the gate and spring-loaded ones on the claw side. I don't think this will solve anything, it will just change the dynamic of the issue. In the case of this negative, if you were to align the scanner to the left side, you'd have even more gate weave. Just look at the side of the frame wobbling left and right, following the perf. The only way to stabilize the image is to throw it into After Effects, choose an object in the background that doesn't move and use that as the stabilizing location. Since the shot doesn't move, it would work well. I don't think EITHER side of the film, left or right, would fix any of the stabilization issues this frame has, and it absolutely wouldn't fix the rocking motion. Damn thing looks like it was shot on a boat!
  4. I couldn't care less about digital correction of film. Fixing something that's inherently wrong in post defeats the entire purpose of shooting on film. Might as well shoot digitally if your intention is to "fix it in post"; it saves a hell of a lot of money and applying a "film look" is a lot easier. You've proven my point though, there is a problem with Super 8 stock and this is a great example.
  5. Drill a small slot into the bottom of each glass and glue a wire into it. Piece of cake.
  6. Yep, that's what I said. It's the actual perf itself moving. Using the super 8 example, a registration pin in the gate would absolutely affect the location of the image being exposed based on the location of the perf. Usually it will twist the image very slightly, rather than give it a left/right issue. This is because the registration pins are usually below the gate. However, in the case of super 8, we see the rocking chair syndrome. There are many threads on this forum all about this issue. So either the perfs (which are clearly out of alignment) are the problem, or the stock itself isn't cut evenly. Yes, if you don't use the perf as a guide, then it's not a problem. However, cameras and projectors use the perf as an alignment tool with the claw. Actually, the image is moving all over the place. It doesn't coincide with the perf, that's for sure. However, it has the infamous rocking chair effect. Just take your cursor and put it on the corner of the sign on the right side of the image and watch the effect. This problem is related to the perf directly. As the claw drags the film down, it's dragging an off-set perf, so it twists the film ever so slightly. This "twist" is macroscopic as you said, but it's absolutely a problem with the perf or stock cutting. I doubt the gate is that poorly manufactured. My point is, the gate weave is horrible and it's not from the transfer, it originated in the camera. So forget the "digital imaging" aspect for a moment, focus on why the camera could produce such a horribly registered image, and that's when issues related to the stock come into play.
  7. It's wires for sure, you can see the feathers move… whoops! Just noticed that.
  8. I'm confused, you can see the perfs weaving left to right compared to the image. This isn't a trait you'll see with any other medium that I'm aware of. I scan 16/35mm all the time and have never seen so much fluctuation with the sprockets on new stock. People have been reporting perf issues with Super 8 film forever. I've shot tens of thousands of feet of super-8 film in my life and every single frame has gate weave. I've also shot tens of thousands of feet of 16 and have projected/scanned much of it without any gate weave. Now, it's a bigger image, so it's not as noticeable, I get that. However, you'd absolutely see it in a test like this. I personally don't think a registration pin would solve the problem, as the edges of the film seem to be OK since the image isn't moving too much, but the perf is all over the place. If you had a pin that locked into that perf for every exposure, the frame would be all over the place as well. The only thing saving the camera is the gate and backplate, keeping the frame itself from shifting. This is an inherent issue with Super 8: because everything is so small, the machining issues are more present than ever. You won't see this with standard 8 because it's originally a 16mm wide frame. I had hoped the Logmar camera would have reduced these issues, but it appears they still exist, even though it does eliminate the focus issues that plastic backplate cartridges have, which is nice. This is yet another reason why super 8 really isn't ready for primetime without A LOT of fixing in post. I'm happy people are shooting super 8, but for the format to really be professional quality, some of these issues need to be solved. The stock quality today is good enough, it's just a matter of the mechanics of the beast.
  9. Pro Res codecs are available for Windows, Linux, Mac and are free. Avid DNX codecs are only available for Windows and Mac and are tough to find without dealing with Avid. The only editing software that works with DNX is Avid. I've tried to make FCP and FCPX work with DNX, but the rendering engine has some issues. Pro Res is native to Avid, Final Cut Pro, Final Cut X, After Effects, DaVinci and Premiere, so it's a no-brainer. The render engine has been built to work with Pro Res and it does a good job. Honestly… I wouldn't edit "film" on anything else but Avid anyway. It's the only software that allows you to catalog and deal with keycode, which is critical for going back to the negative. Maybe not so important for today, but nice to have years down the road when your film is a cult classic and you wish to restore it. I edited my films on FCP and I can't go back to the negative to save my life. So to re-cut (up-res), I've gotta scan everything, which is a really costly endeavor if you have 20,000 feet worth of film and only need to scan 1700.
  10. Spirit 2k is fine, they work well. DNX is an Avid proprietary format. So if you're editing with Premiere, you'll want Pro Res, as your computer won't even play back DNX files. You will cut the show in Pro Res 2k 4444, consolidate (copy) all used media onto a drive using MXF or EDL and hand it over to the colorist. The colorist will import the MXF into DaVinci and do the color. They will then export three things:
      - Pro Res quicktime movie of the final
      - Pro Res colored shots (to replace the stock shots in your editing program)
      - DPX file for archiving
      You will then take that Pro Res quicktime, import it back to your timeline in Premiere, mate it with the sound and export your final product in Pro Res 2k 4444. I don't know if Premiere can export in 2k 4444, but it should be able to if you import in that format. You will need the proper Pro Res 2k codecs, which are not included with the operating system. I'm not sure if Premiere comes with them, but it might.
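For anyone budgeting the consolidate/hand-off drive in the workflow above, a rough size estimate is easy to sketch. The ~330 Mb/s figure below is Apple's published target rate for ProRes 4444 at 1080p30; actual files vary since the codec is variable bit rate, so treat these numbers as ballpark only.

```python
# Rough storage estimate for a ProRes 4444 consolidate/hand-off drive.
# 330 Mb/s is Apple's published target for ProRes 4444 at 1920x1080/30p;
# it's a ballpark, not a guarantee (the codec is variable bit rate).

def prores_storage_gb(minutes, mbit_per_sec=330):
    """Approximate file size in decimal gigabytes for a given runtime."""
    megabits = mbit_per_sec * minutes * 60
    return megabits / 8 / 1000  # Mb -> MB -> GB

# A 90-minute timeline, plus (say) 3 hours of consolidated selects:
print(prores_storage_gb(90))       # ~222.75 GB
print(prores_storage_gb(3 * 60))   # ~445.5 GB
```

So a single 1 TB drive comfortably covers a feature-length consolidate at this rate.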
  11. I think a lot of people think camera is the most important thing, so they always wanna work in the camera department. However, I personally think lighting/gaffing is the most important thing. Interfacing with cinematographers in that way, is a lot better then assembling and disassembling a camera or loading mags/cards. Most cinematographers work with their own AC's, so becoming a 1st assistant camera can be very challenging. However, if you're a good gaffer and stay locally, you'll meet tuns of cinematographers, you'll learn how each one of them works and there will be enough down time for you to watch how they work from a distance, picking up on tricks of the trade. Learning the camera is actually easy, learning how to properly light? That's VERY difficult, that's where the cinematographer makes their money. Sure, this road does require "breakthrough" work done outside of paid union film sets. You would need to get hired as a cinematographer on a small film outside of your normal work. However, you'd have a skill set that most AC's don't have and be paid well for it. Honestly, I look back on my friends who choose that path and I'm jealous because they're constantly working and a lot of my AC friends are barely able to find work because cinematographers tend to stick with the same AC's. Just my .02 cents. ;)
  12. With S16mm, I simply ask the lab for a 2k scan and deliver a 2k Pro Res 4444 file in RAW color space. You'll not only save a LOT of money by doing this workflow, but there is no need for DPX camera originals unless you plan on scanning the project back out to film. I've done a lot of experimenting with deliverables for cinema projection using DPX and Pro Res. I've found the 2k Pro Res 4444 mastering for S16mm to work very well. Once done with color, you can make a DPX file out of DaVinci and that can be your "archive master" of the film. In terms of 2k vs 4k… S16mm resolution is right around 2k. The less grainy the stock, the more perceptible resolution you'll have. So if you shot 50D for instance, you may have a tiny bit more visible res than 2k. However, if you shot a 250 or 500, you'll be capped out at around 2k of resolution. When you scan at 4k, all you're doing is introducing more noise into the image. I've done some camera tests with 2k and 4k scans and I would never scan S16 at 4k again. It just eats up bandwidth through flickering grain, which is unnecessary. Sure, if you plan on scanning back to 35mm, then you'd want that resolution. However, if your ultimate goal is DCP and internet distribution, I'd stick with 2k so your image is more "digital friendly" and you won't be wasting bandwidth on some reflective particulates. 'Rogue Nation' was shot on 35mm and finished in 2k. So if it's good enough for a feature film in 35mm, it's good enough for 16mm. ;)
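The bandwidth point above can be sanity-checked with back-of-the-envelope numbers. The frame sizes (2048x1152 vs 4096x2304, a 1.78:1 S16 extraction) and the 10-bit DPX packing of one pixel into 32 bits are illustrative assumptions, not any particular scanner's spec:

```python
# Back-of-the-envelope data rate of a 2K vs 4K uncompressed scan.
# Assumed frame sizes: 2048x1152 (2K) and 4096x2304 (4K), stored as
# 10-bit RGB DPX, which packs one pixel into 32 bits (4 bytes).

def scan_mb_per_sec(width, height, fps=24, bytes_per_pixel=4):
    """Uncompressed data rate in decimal megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

rate_2k = scan_mb_per_sec(2048, 1152)   # ~226 MB/s
rate_4k = scan_mb_per_sec(4096, 2304)   # ~906 MB/s
print(f"2K: {rate_2k:.0f} MB/s, 4K: {rate_4k:.0f} MB/s ({rate_4k / rate_2k:.0f}x)")
```

Doubling the linear resolution quadruples the pixel count, so a 4K scan carries 4x the data of a 2K scan of the same frame, grain noise included.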
  13. On HD, 24, 25 and 30 are acceptable worldwide; the player does the conversion, as the color standard is the same. On SD, 24 is acceptable worldwide (the player does the conversion if necessary), but 25 is PAL color only and 30 is NTSC color only.
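Those rules are compact enough to write down as a lookup. The function below is just an illustrative restatement of the post, not any real player's logic:

```python
# The HD/SD frame-rate compatibility rules from the post, as a lookup.
# Illustrative only -- not any actual player's behavior.

def playable(definition, fps):
    """Return the regions where content at this definition/rate plays."""
    if definition == "HD":
        # HD players convert 24/25/30 -- same color standard worldwide.
        return {"worldwide"} if fps in (24, 25, 30) else set()
    if definition == "SD":
        if fps == 24:
            return {"worldwide"}  # player converts if necessary
        if fps == 25:
            return {"PAL"}        # PAL color only
        if fps == 30:
            return {"NTSC"}       # NTSC color only
    return set()

print(playable("SD", 25))  # {'PAL'}
```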
  14. Yep, the Arri SR was the most balanced film camera I ever used. I shot documentary style for years with one and loved every moment of it. Clip-on matte box, zoom lens and rear mounted battery, just like a video ENG camera. I never had any problem with film cameras; even when I shoulder mounted a BL4 once, it wasn't a big deal, you just needed some muscle to hold it up. I use the pocket camera with a cheap Chinese rig sold in India by The Cine City. It's VERY cheap, it's pretty much indestructible and it balances very nicely. No viewfinders necessary, just an eye cup and you're good to go.
  15. Can you capture some of what it sounds like and post it here? Distortion is usually caused by a bad amplifier, rather than the pickup element itself.
  16. Directed by a Russian producer whose last film, "Abraham Lincoln: Vampire Hunter", was complete trash. :sigh: It's unfortunate people are unwilling to fund original ideas anymore.
  17. Yes, but not between the Alexa and HVX, as the Alexa is CMOS and the HVX is CCD. CCDs (charge-coupled devices) are actually very good imagers, but they struggle to get decent dynamic range. They also have issues with low light because the image is grabbed as pulses. CMOS (complementary metal–oxide–semiconductor), on the other hand, grabs images in scan lines, which means the processor plays a bigger role, but it also means the sensor can be used in low light and has better dynamic range. The biggest weakness of CCD is dealing with colors. You need to separate the colors on capture, which requires a beam splitter and three chips. This takes up a considerable amount of room, so producing a CCD camera that has a bigger sensor like S35 would be very difficult and bulky. So most CCD cameras use very small imagers, thus the technology (which is pretty good) doesn't hold a candle to the mega-sized possibilities of single CMOS sensors, which have multiple layers (like film) that capture the different layers of color without multiple chips. It's fairly easy to make CCD and CMOS 12 bit 4:4:4. However, the cost to do so can price products outside of the range manufacturers feel is acceptable. This is the biggest problem with manufacturers like Sony and Panasonic. They make cameras to fit particular price brackets, so they limit functionality in one way or another to meet those requirements. Companies like RED, Arri, Vision Research, Blackmagic, etc… they don't have that philosophy. They make a camera and whatever price range it fits, that's what they'll sell it for. They don't "hinder" the camera's technical ability purposely to hit a particular price. Finally, the big thing that allows us to capture high dynamic range (RAW) has flipped the industry on its head. Cameras without RAW capture won't have the wide dynamic range and, as a consequence, won't look as filmic. It's that RAW capture that truly separates the good cameras from the not so good. Most cameras don't capture raw in stock form, and those that claim to spit raw out the HDMI or SDI port (which is REC709) are just toys. You can't take high dynamic range material and compress it down to REC709 and expect it to be the same.
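On the "12 bit 4:4:4" point, the difference bit depth makes is plain arithmetic: levels per channel and total distinguishable RGB triplets at each depth. A quick sketch:

```python
# Code values per channel and total RGB combinations at common bit
# depths. Pure arithmetic: n bits gives 2**n levels per channel, and
# three full-resolution (4:4:4) channels give (2**n)**3 combinations.

def rgb_combinations(bits):
    """Return (levels per channel, total RGB triplets) for a bit depth."""
    per_channel = 2 ** bits
    return per_channel, per_channel ** 3

for bits in (8, 10, 12):
    levels, combos = rgb_combinations(bits)
    print(f"{bits}-bit: {levels} levels/channel, {combos:,} combinations")
```

Each extra pair of bits multiplies the per-channel levels by 4 and the total palette by 64, which is why 12-bit grading material holds up so much better than 8-bit delivery formats.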
  18. I can try to dumb it down a bit more from David's excellent, though technical, explanation. When television was invented, they used the frequency of the electricity to determine frame rate. In the USA that would be 60hz, so they used 60 fields per second. In the case of analog television, the glass tube image is made of little lines of information which are scanned top to bottom very fast. For standard definition in the USA that would be 525 individual scan lines from top to bottom. So the image was never solid, it was always changing, it was always scanning even if you had a still image. Each frame consisted of two fields, an upper and a lower. The television scanned the first line and left the 2nd line blank, scanned the 3rd line, then left the 4th line blank, all the way down. Then for the next frame, the same process happened where the 2nd line was scanned, then the 4th line, all the way down. Two fields to create one frame, but the whole thing is in constant motion and your brain makes it all work. Standard definition analog cameras initially captured the same way as the television, by scanning top to bottom in fields and then converting that signal into a frequency which is sent down the cables and over the airwaves. Later that system turned into a charge-coupled device, or CCD, which "flashed" a single frame at a time and then turned that image into the 525-line interlaced system. So this is one of the reasons why broadcast looks so different. By the way, we still broadcast a similar format today, but our televisions are "active" and don't have a set frame rate like tubes do, so the interlacing is barely seen. However, you can adjust the frequency of modern televisions from the 60hz of the old days to 120hz or 240hz, which makes the motion even smoother and will turn anything into a "television" look. Thus demonstrating that the look of television really starts with the scan frequency, or in this case, the frame rate.
#1 Frame rate = 24 progressive frames, no interlacing. The next big thing is color space. Television broadcast has a very limited range of colors. We have almost infinite colors in the real world, and film has an almost infinite range of colors as well, but television doesn't. There are three primary colors, red, blue and green: RGB. Television takes the blue and red signals and only captures half of the visible data in those two channels, filling in the rest with green. Then, it only captures a few thousand color combinations, instead of the infinite colors available. Plus, our eyes can see a broad dynamic range. Film can see a broad dynamic range, but television can't. To cram the signal down the line, the dynamic range has to be compressed. Digital cinema cameras capture a full RGB signal AND millions of colors. Plus they have a wide dynamic range like film does. #2 Color information and dynamic range. Things like sensor size (depth of field) are important, but not as critical. In my view, those two things I mentioned above are the two most important defining items.
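The field-weaving described above can be modeled in a few lines. This is a toy illustration using a handful of scan lines rather than 525, not broadcast code:

```python
# Toy model of two-field interlace: a full frame is built by weaving
# the odd-line field and the even-line field together, as described
# in the post above.

def weave(field_odd, field_even):
    """Interleave two fields (odd lines first) into one full frame."""
    frame = []
    for odd_line, even_line in zip(field_odd, field_even):
        frame.append(odd_line)   # lines 1, 3, 5, ... from field one
        frame.append(even_line)  # lines 2, 4, 6, ... from field two
    return frame

odd = ["line1", "line3", "line5"]
even = ["line2", "line4", "line6"]
print(weave(odd, even))  # ['line1', 'line2', 'line3', 'line4', 'line5', 'line6']
```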
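The "only captures half of the visible data" point about color is the standard chroma-subsampling trade-off. A small sample-count sketch (the frame size is arbitrary; the per-scheme chroma fractions are the usual J:a:b definitions):

```python
# Sample-count arithmetic for chroma subsampling: how many samples a
# frame carries relative to full 4:4:4. The chroma fractions are the
# standard definitions; the 1920x1080 frame is just an example.

def samples_per_frame(width, height, scheme):
    """Total luma + chroma samples for one frame under a scheme."""
    luma = width * height
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    chroma = int(2 * luma * chroma_fraction)  # two chroma channels
    return luma + chroma

full = samples_per_frame(1920, 1080, "4:4:4")
sub = samples_per_frame(1920, 1080, "4:2:0")
print(f"4:2:0 keeps {sub / full:.0%} of the 4:4:4 sample count")
```

That halving of total data (and quartering of the color channels themselves) is exactly why graded cinema masters are finished 4:4:4 while broadcast delivery is not.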
  19. 4k scanning is about double the price of 2k scanning. The biggest problem is: what do you scan? This film may have been edited and finished by scanning all the good takes in 2k during production. Then all they needed to do was take the editing bay's EDL, re-link to the original source media in DaVinci, do the color and prep for distribution. This is a FAR cheaper way to work than going back, re-scanning the entire film in 4k and doing the finishing that way. Sure, you scan MORE up front, A LOT MORE, but the cost of 2k scanning is so much cheaper, you ultimately wind up with a more streamlined workflow and spend less money. It also allows for higher quality dailies, as most films' dailies aren't scanned, they're telecine'd. Honestly, the best way to do all of this is:
      - Telecine the negative into raw color space Pro Res 1080p files
      - Edit
      - Cut/conform negative
      - Photochemically color time the negative (internegative)
      - Scan the color-timed negative in 6k
      That way the preservation print exists in the highest quality possible and the digital exists in the highest quality possible. Mind you, there are thousands of films that have used DI and are only 2k. It's just, in today's age you'd expect a big hollywood blockbuster to spend a bit more money and do it right, especially for IMAX purposes. I for one am very much against DI only, and vastly prefer a photochemically timed print for 'preservation' purposes, based on the original negative, rather than some computer data and a preservation print based on some digital representation of what the 'film' looks like.
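The cost argument can be made concrete with placeholder numbers. Only the "4k is about double the price of 2k" relationship comes from the post; the per-foot price and footage total below are made up purely for illustration:

```python
# Hedged cost sketch for 2K vs 4K scanning of the same footage.
# PRICE_2K and the footage amount are hypothetical placeholders;
# only the "4K costs about double 2K" ratio comes from the post.

def scan_cost(feet, per_foot):
    """Total scanning cost in dollars for a given footage."""
    return feet * per_foot

PRICE_2K = 0.30              # $/ft -- made-up placeholder
PRICE_4K = PRICE_2K * 2      # per the post: about double 2K

footage = 20000              # all the good takes, hypothetical total
print(scan_cost(footage, PRICE_2K), scan_cost(footage, PRICE_4K))
```

Whatever the real per-foot rate is, the doubling means the savings of scanning everything once at 2k scale directly with how much you shoot.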
  20. FotoKem says this film will be finished photochemically and the DCP will be made from a completed film.
  21. So we've finally got a trailer to watch! https://www.youtube.com/watch?v=gnRbXn4-Yis I'm upset they chose to cut that kind of teaser, because the film's pacing is going to be A LOT slower. At least it looks good! :)
  22. Well, here in the states, the average movie ticket is $12 and most theaters in the cities are upwards of $16 - $20 depending on the show. Since the vast majority of people have a family of more than one adult, imagine paying $40 for the movie (two adults) and then buying one small popcorn and two bottles of water. That total comes to around $50 bucks for two people simply watching 2hrs of entertainment. Forget about adding kids, forget about the "schedule" conflicts that force people to scramble around after work in order to make the cinema. So when the cinema offers identical quality to your home television (which is what it is today), then the only reason to pay $50 is to see first run content. The moment first run content is available at home within a week of theatrical release, people are going to analyze the option: either waste a lot of money on 2hrs of something that may be complete crap, or have dinner at home with the family and watch the movie at your leisure.
  23. Man, I'd love to have an A-Minima in a backpack for "anytime" use! WOW how cool is that. :D
  24. Audiences are only doing it because cinema has priced itself too high. Most people don't even think about going to the cinema because it's too damn expensive today. It's expensive because theaters had to pay millions to upgrade projectors so James Cameron could show Avatar, and they have to amortize that price down to the consumers. Plus, the distributors are greedy and are pulling even more revenues from the theaters for special poop like 3D. They're just greedy idiots killing their own industry.
  25. I know a few of the local labs have render farms for REDCODE because they get so much material that needs to be transcoded immediately, and the hardware cards DO help, but even with realtime decoding, it's still too slow. It's unfortunate RED has made a proprietary version of JPEG2000, because in the end, JPEG is an open source/Open GL compatible codec that works really well. Maybe not as "pretty" as Pro Res in terms of its packaging, but it still works well. To this day it depresses me that we're still discussing proprietary codecs at all, and RED is the only company unwilling to let people see behind the veil of secrecy.