Tyler Purcell

Everything posted by Tyler Purcell

  1. When you did the 6.5K imager swap, I'm certain it was not cheap. I've asked them before: going from an older 4K imager to the 6.5K with the required support contract was practically the same cost as a whole new BMD Cintel II. Plus, the bandwidth you need to deal with the larger files requires a whole new discussion about storage, which is grossly expensive. While I do like the modular, more open-source design, had BMD offered an upgrade for their black box, it probably would have been 1/4 the price of the scanner. There would also be little to no storage changes, since the Thunderbolt Macs already have enough storage bandwidth integrated into them, unlike some old Windows box. Having high-speed connectivity integrated into the bus is game changing. There is no integrated high-speed connectivity on x86 systems; they basically expect you to buy a Threadripper/Xeon with lots of lanes and use a NAS solution, which of course is not client friendly by any stretch of the imagination. Sure, capturing to ProRes? No problem, you can use slow drives. However, when working with DPX/TIFF/Targa or even .CRI files from the Cintel, high-speed storage is a must. This is why Macs are such a killer offering for creatives: even with older TB4, the 40Gb interface was just fast enough to get the job done. Today with TB5 it's a game changer, and working with high-res files like the 12K off my BMD URSA Cine is a piece of cake. That's impossible to do on an x86 system without extremely high-speed storage, which is, again, very costly. So in the end you wind up paying 3-4x more in the x86 world to deal with problems you literally never have to deal with in the Mac world. Mind you, no other scanner uses Macs, so it's hard to quantify how much savings there would be with something like a ScanStation, but having worked on both platforms for years, it's probably considerable. I was told most of the debayer happens in the magic black box on the scanner.
It's similar to how BRAW works, and it's why there is little to no way you can make adjustments AFTER you've scanned. The finished CRI file has your look baked in; there is no magical headroom or changes you can make post-scan, like you can with any other raw file type. Heck, even BRAW has "some" leeway, but very little compared to Red or Arri raw codecs. I also know playing back the CRI files that come out of the Cintel II is very easy for a potato computer, as I edited an entire UHD short film from a lowly Thunderbolt SSD on a 6-year-old Intel Mac laptop (shit computer) without ANY issues. It barely used any system resources to work with those files, unlike any other camera raw system, which would be debayering on playback. Also, when you're in preview mode and scanning, there is a delay when you make adjustments to the color, because the changes you make are being done in real time on the FPGA in the scanner.
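To put rough numbers on the storage-bandwidth point above, here's a back-of-envelope sketch (my own math, not BMD's figures; real DPX word packing differs slightly) of what an uncompressed 4K 10-bit RGB stream demands:

```python
# Rough data-rate math for uncompressed scan formats.
# Frame sizes are back-of-envelope estimates, not vendor specs.

def uncompressed_frame_mb(width, height, bits_per_channel, channels=3):
    """Approximate size of one uncompressed frame in megabytes (decimal)."""
    bits = width * height * channels * bits_per_channel
    return bits / 8 / 1e6

def stream_mb_per_s(frame_mb, fps):
    """Sustained write rate a scan or playback stream needs."""
    return frame_mb * fps

# A 4K (4096x3072) 10-bit RGB DPX-style frame at 24 fps:
frame = uncompressed_frame_mb(4096, 3072, 10)   # roughly 47 MB per frame
rate = stream_mb_per_s(frame, 24)               # roughly 1.1 GB/s sustained
print(f"{frame:.0f} MB/frame, {rate / 1000:.2f} GB/s at 24 fps")
```

That sustained rate is why a NAS over ordinary Ethernet chokes on DPX work while ProRes (a fraction of this rate) plays from slow drives.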
  2. Using electronics to join multiple motors together is fine, but with 16mm cameras especially, you can't use a sprocket intermittent to pull the film down; you will need some sort of pulldown claw. A sprocket won't work. The reason cameras use pulldown claws is that they can have very tight tolerances. A sprocket has very bad tolerances, which would cause lots of up-and-down movement in the image. Even if you were to reduce that with a spring-loaded side rail, it would still be an extremely unstable, unusable image. I know it runs through the camera OK, but those micro movements can't be seen by the naked eye. As Aapo suggested, the best thing to do is start with a movement from another camera, so that all your main components are made of metal already. I really like the ACL movement (minus the mirror), as it's mounted to a nice thick piece of metal that you could easily find a front housing for, perhaps even an ACL one. To save even more money, re-working a K3 movement is probably pretty straightforward. Shims, yeah, that's the way most cameras work. Arri has a few cameras where the entire movement can move back and forward, but for the most part on 16mm cameras it's done with shims between the lens mount and the body. The kit to measure would have a flat piece of metal for measuring that is the width of the 16mm gate; an FFD gauge, which usually has a flange that you push onto the lens mount; and a tester tool, so you can set the FFD gauge to zero before putting it onto the camera. FFD tools aren't horribly expensive and are available for sale, but the 16mm-width flat piece of metal may be harder to find. I had to buy one from another tech. You can't use the ground-glass method because it adds too much depth. A very thin piece of smoked paper can work on 35mm cameras to see if you're in the ballpark, because the image is so much bigger, but on 16mm cameras, with that small of an image, it's just impossible without proper FFD tools.
No matter what, to "dial it in" you will 100% need the tools; there is nothing you can do about that. A collimator could be used, theoretically, but they're more expensive.
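The shim logic described above is simple arithmetic once the gauge is zeroed on the tester tool. A hedged sketch, where the nominal depth and the gauge reading are made-up illustrative numbers, not from any service manual:

```python
# Hypothetical shim math for setting flange focal distance (FFD).
# The nominal depth and readings are illustrative only.

NOMINAL_FFD_MM = 52.00   # e.g. a PL-style nominal flange depth (assumption)

def shim_change_mm(measured_ffd_mm, target_ffd_mm=NOMINAL_FFD_MM):
    """Positive result: add that much shim; negative: remove that much."""
    return round(target_ffd_mm - measured_ffd_mm, 3)

# Gauge zeroed on the tester tool reads 52.02 mm on the camera body,
# so the mount sits 0.02 mm too deep and shim must come out:
delta = shim_change_mm(52.02)
print(f"shim change: {delta:+.3f} mm")
```

The .01mm-resolution gauges mentioned above are what make this workable; at coarser resolution you can't even see the error you're shimming out.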
  3. Cameras today have full-blown AI/ML engines in them, which use what, maybe 5 watts? You talk about frame registration, when FPGAs can literally dewarp in real time these days. Canon had that tech built into their original, 5-year-old R5; today with the Nikon Z8 and Z9, a single FPGA can automatically remove lens distortion at 60fps in 8K. Again, with a chip that uses almost no energy. So for someone doing restoration, having the AI engine on board deal with things like examining the outside of the frame line, looking for warping issues, then automatically de-warping as it's scanning: these are all huge advancements that nobody has done yet, and all the tech is available right now as programmable FPGAs. I mean, BMD built their FPGA-based scanner a decade ago; whether you think a decade-old scanner is going to deliver images comparable to a modern one is up to you. But generally the imagers are the issue, and if you want a better imager, you need a whole new system (computer, software, imager and back-end network) to really "update" anything anyway. So the fact that you can't update an FPGA easily is irrelevant when everyone is buying all of those bits every decade anyway. Yeah, and you need a Windows specialist to keep a system running when you are updating components all the time. It just isn't something the average consumer running a business can deal with. Not only that, but the computer needs to be a powerhouse, at a time when good GPUs are $4k each and Threadripper/Xeon are the only chipsets capable of having the lanes, which are grossly expensive in and of themselves. This isn't 2018; building a modern x86 desktop system today for this work costs more than a Mac Studio Ultra, and in the end all you get is something that will be out of date in 2 years. The great thing about a scanner that's plug and play is that your computer can literally be a toaster. You don't even need network storage.
The compressed CRI files the BMD scanner delivers are relatively small, and a day of scanning can easily be put onto a decent-sized Thunderbolt NVMe device. Then you can unplug it and carry it over to your finishing workstation. You save, oh gosh, tens of thousands of dollars as a business owner doing that. We scan DPX and TIFF/TARGA, which range from 10 bit to 14 bit. I have a lot of raw data to move around, so I'm stuck with high-speed network drives, unfortunately. But when I've used BMD scanners in the past, the Thunderbolt workflow is very good and saves a lot of money. It may be simple, but you've clearly been working on it for a while; the majority of people who are trying to pay their rent don't have that kind of down time. It's disingenuous to think people running restoration businesses are going to be writing their own code. This is why the scanner manufacturers do it. This isn't some BMD-only thing either; MOST scanners work this way, they deliver a finished image to the computer, you know this. They may use different hardware, but in the end it's even MORE locked in than an FPGA. Good luck updating a Spirit, Scanity, ArriScan or any of those line-scanning systems to anything modern, not gonna happen. Your "open source" scanner concept is great, I absolutely understand it, but the majority of people will not make that kind of investment. They need something reliable and capable of recouping their investment immediately, not in a decade.
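For a sense of scale on the "day of scanning fits on one NVMe" claim, here's a sketch; the per-frame compressed size is purely my assumption for illustration, not a published CRI figure:

```python
# Back-of-envelope: can a day of 35mm scanning fit on one NVMe drive?
# The per-frame compressed-RAW size is an assumption, not a spec.

FRAMES_PER_FOOT_35MM = 16   # 4-perf 35mm
ASSUMED_CRI_MB = 12         # hypothetical compressed frame size

def day_of_scanning_tb(feet_scanned, frame_mb=ASSUMED_CRI_MB):
    """Total data in decimal terabytes for a day's footage."""
    frames = feet_scanned * FRAMES_PER_FOOT_35MM
    return frames * frame_mb / 1e6   # MB -> TB

# A heavy day of 20,000 ft scanned:
print(f"{day_of_scanning_tb(20000):.1f} TB")
```

Under these assumptions even a big day lands in the single-digit-terabyte range, which is exactly why a portable Thunderbolt NVMe beats a network back end for this workflow.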
  4. So 3D-printed material, no matter what it is, will not have the tolerances for proper FFD; it's never going to happen. Manufacturers spend an ungodly amount of time engineering the gate-to-flange distance and making it perfect. The gate and lens mount really need to all be made of metal and somehow connected, so the tolerances are based directly on film channel/float. Some cameras have a pretty large film channel; this is the gap between the pressure plate and aperture plate. Metal gates are used because you can polish the side rails so the film can be at the same depth as the physical aperture. The pressure plate shouldn't actually be putting pressure on the film itself; it should be very smooth but held firmly in place by the gate laterally to prevent wobble. Yeah, you will need gauges that go down to .01mm, as the FFD range is .00-.03mm, depending on how much float the film has. Some cameras can be run at .03 and it's not a big deal; others need to be spot on .00, and that's the tricky part. Engineering the pulldown, the consistency of the FFD across the frame, the film channel, a spring-loaded side rail gate, the timing with the shutter: all of these things are very challenging to get right and I'm afraid there's no way a 3D-printed camera would actually create proper images. Someone tried it with a 2-perf 35mm 3D-printed camera and it was basically unusable. The tolerances are incredible; .01mm off and you go from working to not working. As someone who services film cameras for a living, it's a miracle any motion picture camera works at all. I commend the work though, I'm glad to see you playing around with it. I'd love to see if any images come out.
  5. Good movies can (though not always) have a lot of magic behind them: moments that weren't necessarily scripted a certain way, which just fell into place and worked. For the audience to know everything about production, including how those magical moments came to be, may undermine the filmmaker's skillset. It may make them seem less like a genius and more like someone who struck oil accidentally and accepted the revenue simply because they touched the ground. On the other hand, some productions are so tedious and difficult that when you hear about what the filmmakers went through, it kinda opens your eyes a bit. I almost prefer the horror stories, because you can learn what NOT to do. I generally don't purposely seek out production information, but I will read the trades if there is an interesting story. I haven't let those stories affect my enjoyment of a film.
  6. Or just use some tape to mask off the monitor, either way.
  7. The Cintel II uses manual focus on the outside. I assume when you change the optic, something about that focusing system changes and it's more complicated to set up right. I have not been under the hood, but they said it's much more complicated. It's an FPGA-based system, so the bandwidth of the system is limited; that's why they can't just change the way it works. They would need to change the FPGA entirely, which means updating everything, including software. This is the trap they got themselves into when developing it in the first place. They're still using TB2 for transferring data, which is extremely slow compared to the modern TB5 120Gb/s protocol, which they would probably move over to with any new hardware. The "black box" is fully replaceable, evidently. I assume their goal would be to offer an upgrade for older systems, like they did with the lamp replacement, updated gates and updated drive system. They've been very good about updating older systems. It would be a camera and black-box swap-out, which is basically the entire backbone of the scanner, yes. Nah, PCs are horrible at this work because they're just using raw power to chew through processes. FPGAs are night-and-day better; it's why everyone uses them for cameras. You can buy specific FPGAs built for tasks like processing the Bayer imager and encoding the raw file. Those tasks, done via software, require MASSIVE power in and of themselves. Have you worked with Red raw 8K before? It'll gobble up half your GPU just playing back at full res because there is NO optimization. Scanners like the ScanStation have/had (not sure if they've updated this or not) two dedicated GPUs to do what a basic modern cinema camera does in real time at 80fps. It's just a cost-reduction method. Blackmagic's concept is light-years better, but they are using decade-old tech; that's the problem. I'm not sure how the Director works, but the Spirit, Scanity, Imagica and ArriScan do "nearly" everything on the scanner.
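The TB2-vs-TB5 gap above is easy to quantify. A rough ceiling on link-limited scan speed, ignoring protocol overhead (the frame size is my estimate from earlier in the thread, not a Cintel spec):

```python
# Rough frame-rate ceiling imposed by the link, ignoring protocol
# overhead. Link rates are the nominal Thunderbolt figures.

def max_fps(link_gbit_s, frame_mb):
    """Frames per second a link can carry at its raw line rate."""
    link_mb_s = link_gbit_s * 1000 / 8   # Gb/s -> MB/s (decimal)
    return link_mb_s / frame_mb

FRAME_MB = 47   # ~4K 10-bit RGB uncompressed, illustrative estimate

print(f"TB2 (20 Gb/s):  {max_fps(20, FRAME_MB):.0f} fps ceiling")
print(f"TB5 (120 Gb/s): {max_fps(120, FRAME_MB):.0f} fps ceiling")
```

Even with compression shrinking the real frames well below this, the 6x jump in raw line rate is why a new-imager scanner would likely move off TB2.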
  8. As Dan said, they're entirely different imagers, so yea, there is a significant gap. Nobody is scanning vertical 35mm in 10k anyway, the point of the high resolution scanner is for VistaVision and 65mm formats like 5 perf and IMAX.
  9. Actually, it's literally just switching the optics. The reason why they can't have users do the swap is that there is calibration involved, and supposedly it's "buried" in the scanner, so they don't think it's something users can swap. I had a lengthy talk with their engineer at NAB 2024 about this, as it was already done and working at that point; I saw it running at NAB 2024. Yeah, they're using the full imager; they're just shooting the perforations, so the usable image area is of course less. The scanner works no differently in S8 mode than in 35mm mode; it's using ML to stabilize using the imager's data. It does this IN HARDWARE, not on the GPU of the client system like many scanners do. This way, they send a pre-stabilized image to the client system, which then dumps it to a drive. All of the corrections and adjustments the user makes are done in the actual scanner's hardware in real time; the stream off the scanner is fixed. This way of doing things, while brilliantly simple, also leads to major issues when you want to upgrade or change anything. This is the reason why BMD have been delayed on their new-imager scanner. I have been told YET AGAIN that the new imager is on its way, but not to expect it anytime soon. They're now expecting a 2-year lead time.
  10. Here in SoCal the "film" scene is dying fast. Not long ago, I had commercial agencies knocking on my door non-stop for shoots, even double booking sometimes by accident. Now all they want is digital, and they want it fast/cheap. It's really unfortunate. I actually had to buy a real digital cinema camera because otherwise I would get no work. It's really sad, but I hope to keep shooting film. I got Dehancer and a few other tools to help make the digital camera look more "filmic" and it's working, but inside I know it's not film.
  11. Couldn't agree with you more, the shorter the better. Tell a story that is simple and you'll win over the hearts of your audiences. Also, comedy works great. 🙂
  12. Yes, they do! If you ever need any encouragement or want me to read something, plz let me know. I'm a big proponent of making short films.
  13. Damn, yeah, I totally feel ya. This is a very common issue with creatives: not being able to prove their skills. I work with people all the time who are in the same boat; in fact, one of them just wrapped their first narrative production on 4-perf 35mm. Very excited for them, because I know that feeling, and just completing something you worked hard to make is a great feeling. Here in LA, it's almost comical how many people are doing "spec" shoots like this, just friends getting together with some actors making something quick and dirty to build a demo reel. If that industry stopped, I'd lose quite a bit of business, because we scan a lot of those films. With that said, I've been involved in the local community here in LA for a long time, so finding scripts is so easy it's almost like the streets are paved with them. So I understand how frustrating it can be if you can't find something to shoot. While I do write a lot of scripts, I've found my personal work to be overly complex to make, which is the main reason I haven't personally invested in any of them beyond multiple drafts before shelving them. With SUPER short films like you're talking about, I think other writers won't take you seriously. Sure, you can probably find something from one of the multitude of websites, but conversing with people directly to help may be more challenging. So if I were in your shoes, I'd take my time and write something. The main reason is simply your experiences and access to locations/people. As the filmmaker, your personal experiences should be in the story, and you should frame it around what you have available resource-wise. You may spin your wheels for weeks trying to find something shootable for the budget, crew, cast and locations you have available. It's far easier to make a list of things you know, places you can use and story concepts that work, and mix it all together, especially when you're talking about 5-7 pages.
Funny enough, I'm writing a story right now that has literally no on-screen audible dialog, partially because I'm going to be working with non-actors and partially because of shooting speed: you can do one take and move on when it's just physical actions happening on camera. I do like super-short subject films; I think it's the best way to get your feet wet, hold audiences' attention AND spread your money over multiple subjects, rather than doing one big film. I have made the mistake of making "epic" short films way too many times; they just don't play, because they're impossible to book at festivals and people's attention spans are too narrow these days. So staying under the 12-13 minute cut-off time is smart, and I would try to find a script that has little to no dialog, so you don't need to worry as much about actor quality OR dealing with flubbed dialog lines. Anyway, that's my $.02.
  14. The Cintel was not updated for NAB 2025. They made a slight alteration to the magnifying optic, which allows them to use the full imager for S16mm formats. They then made a slight change to the software which allows a crop-in for 8mm formats. It's not a new scanner at all; zero major changes were made. The diffuse HDR light source is not a recent addition; it's been around for at least 3 years. The 8mm gates debuted last year, FYI. They did this to increase the speed of scanning with a SINGLE-pass HDR mode. This was one of the biggest slowdowns in the older-generation scanners, having to re-scan for HDR. With AI tools, scratch removal is easy; PFClean does it without any human intervention. The updated versions of DRS Nova do as well. PFClean has the benefit of working natively with Apple Silicon and utilizing its massive AI potential to render in real time. This is a breakthrough for people who don't want to invest in huge workstations that crack this nut with sheer horsepower. The lower-cost solutions are getting substantially better, and PFClean is a very cost-effective monthly license for people who would actually use it.
  15. Yeah, the grain structure is challenging to judge on YouTube, especially when you don't know the exposure or post process. I work with 7222 all the time on my own scanner and system, so I have a better understanding of how it compares to other stocks. I find it to be pretty much in line with 250D in grain structure, but 50D is absolutely less grainy. However, you will have LESS black detail with 50D; it's a lost cause trying to get that detail, it just won't be there no matter what you do on set. Blacks are a big problem with these finer-grain stocks: you have to light everything, you can't let anything just roll off, it will be unrecoverable. Fine for a film noir, but not for anything else.
  16. The reason why 500T is such a great stock for low light is that they formulated it for dynamic range under middle grey; the toe is much more refined for night work. The daylight stocks, 50D and 250D, just struggle a lot below middle grey. You'll find it very difficult to retain much if any detail unless you over-expose the highlights and shift middle grey up a bit, almost like a pull process. 50D can take the over-exposure no problem, but it also means you need to light for 250 probably; that's about what I'd rate it at if you were attempting this. I generally over-expose 50D outside to help retain blacks, and 9 times out of 10 it works great. Film has such a great amount of highlight retention that you can't really screw it up too much unless you're dealing with direct sun or reflections of direct sun. While I haven't shot a lot of 7222, I have scanned a bunch, and I don't know why you think it's grainy. Maybe the films you've seen shot with it are older?
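The over/under-exposure talk above is just log math on exposure indices. A quick sketch, where the EI choices are examples rather than recommendations for any particular shoot:

```python
# Stops of over- or under-exposure when rating a stock off its box speed.
# Pure log math; the EI values below are illustrative examples.
from math import log2

def stops_over(box_speed, rated_ei):
    """Positive = overexposed relative to box speed, negative = under."""
    return log2(box_speed / rated_ei)

print(f"50D rated at EI 25:   {stops_over(50, 25):+.1f} stops")
print(f"500T rated at EI 250: {stops_over(500, 250):+.1f} stops")
```

Halving the EI is always exactly one stop of over-exposure, regardless of stock.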
  17. So Kodak sells film with a 6-month window as "fresh". Anything over 6 months in the world of motion picture is considered "expired". A stills guy may laugh at that, because people in the stills world generally shoot decade-old expired film without flinching. This is because 35mm stills and large-format negatives are so much larger than a 4-perf 35mm or 16mm frame that the excess layer of fog (a change in density) doesn't really faze them. Also, with stills, the consistency between each image doesn't matter quite as much as in motion picture. Storage is also critical. If the film is stored improperly, at too warm a temperature for instance, the fog layer will be stronger. You can help reduce the increase in fog by simply freezing the film, but when was it frozen? A year after you got it new? Was it already expired before being frozen? There are just too many variables to build a reliable guide. Each roll of film is unique, especially considering there is no way to tell a roll's exact age, unlike still film, where it's printed on the box. There is no such thing on motion picture film. You can get close, maybe within a 2-year window, but you can't figure out anything else; not even Kodak can. So in the long run, there is nothing you can do. If you want to shoot old film, the risk is that it won't come out at all, even if you over-expose, even if you do everything right. The remjet may be baked on because the film sat in someone's attic; you don't know. People aren't going to be truthful on the internet, so why bother? I know film is expensive, but if it doesn't come out, then you lost more than buying it new. So I always suggest to people that the best way to make sure you actually get results is to shoot somewhat fresh film. Year-old, properly stored (frozen) film from a reputable source? OK, go ahead. But most stuff that's cheap on eBay is junk. Straight 100% garbage. Mostly reversal stocks like VNF or older Ektachrome.
None of that is going to work. Even EXR and Vision 1 stocks I would stay 2 miles away from. Vision 3 started in 2007 I believe, but only with 500T. So any stock that would be worth shooting today would at minimum be Vision 3, and they did change the sticker, so you can tell if it's from 2017 or older just by examining the sticker.
  18. Yeah, you could easily replace it with ADR, because your actors can speed up or slow down to match in post. Shouldn't be a problem at all. Recording on site? No way. Not only will you never find 24fps anywhere on that dial without checking with a strobe, but it's actually not that consistent. Even the electric-motor-driven versions aren't that consistent.
  19. Hit up Cinelab in New Bedford MA. Robert has a cool 16mm recorder.
  20. Hey, so give us some information on what you'll be shooting, that would help a lot to define what camera system will work best.
  21. We have a really nice updated spool design. We've been struggling with printing it properly, but I think we may have cracked that nut recently. I'm gonna sic my partner on it and see if we can get it done in the next week or two. Plz keep an eye on our Instagram @tye1138 and the Aaton Facebook group; I will be updating people on those two platforms.
  22. Awesome! It's so rad that you're documenting all this and releasing great books. I love your Super 8 book and always use it as a reference for people when they come over and have questions about the format.
  23. 100% man, I think the stuff looks great on film. The highlight retention of film is second to none, so when you're shooting something that's literally white, it makes more sense to shoot in a format that has the DR to deal with it. Plus, most of the stuff deals with direct sun as well, which means harsh shadows, and to retain both highlight and shadow data, it's film or an Alexa.