Perry Paolantonio

Basic Member
Profile Information

  • Occupation
  • Location
    Boston, MA
  • My Gear
    Lasergraphics ScanStation 6.5k, 70mm 14k Sasquatch Archival scanner, Eclair ACL II, Pro8mm modded Max8 Beaulieu 4008
  • Specialties
    14k/70mm, 6.5k, 4k, UHD, 2k 8mm-35mm Film Scanning, Film Restoration

  1. That does not look like a file direct off the scanner. As you noted, it appears to have been exported from Resolve. You should ask for a DPX file direct off the scanner, and look at that. Perhaps they're doing a one-light grade in Resolve and exporting DPX from that, which is why the file looks this way. In any case, ensure you have a DPX file made by the scanner software directly, and then look at the metadata.
  2. DPX can be linear as well. It's pretty flexible. You can even make YUV DPX files if you want, which are definitely not log. Log scans exist as a way to match the characteristics of negative film to a digital format. Log simply isn't applicable to positive film. When you scan positive film to a DPX file, it's linear. If they're treating it as log (which is certainly possible), you would need to also interpret it as such upon playback. But they would be doing it wrong if they are. My bet is that the specs are just incorrect or incomplete.
  3. Scans from reversal aren't log. Log is only applicable to color negative scans. What you want to ask for is a flat scan, as suggested above. When we scan positive film on our ScanStation 6.5k, we typically set the black level to about 10% (roughly the same point as dmin on a log scan), and we reduce the white levels so that the perforations (where there is no film, thus maximum brightness from the light source) are at 100%. The whitest whites in the film fall where they may but will never blow out this way.
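For what it's worth, the flat-scan levels described above boil down to a simple linear remap. A toy Python sketch; the sensor-side dmin and perf values here are made-up illustration numbers, not actual ScanStation settings:

```python
def flat_scan_map(v, dmin_in, perf_in, dmin_out=0.10, perf_out=1.00):
    """Linearly remap a normalized sensor value (0.0-1.0) so the film
    base density (dmin) lands at ~10% and the clear perf area at 100%.
    Whites in the image fall wherever they fall, but never clip."""
    gain = (perf_out - dmin_out) / (perf_in - dmin_in)
    return dmin_out + (v - dmin_in) * gain

# Hypothetical sensor readings: base density at 0.05, open perf at 0.95.
black = flat_scan_map(0.05, dmin_in=0.05, perf_in=0.95)  # -> 0.10
white = flat_scan_map(0.95, dmin_in=0.05, perf_in=0.95)  # -> 1.00
```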
  4. That's the spec for the raw sensor. The camera that sensor is put into may impose further limits on speed/resolution/bit depth, depending on how it's configured. So the same sensor in two different cameras may produce slightly different image sizes, at different speeds, especially if one of the cameras is doing any kind of processing internally.
  5. Right. My point was more that you don't need Xeon or Threadripper to get the PCIe lane bandwidth to add a 25GbE NIC. The PC the ScanStation uses is nothing especially fancy, just a gaming rig really, with a mediocre CPU. A similar motherboard with an i9 should provide plenty of PCIe lanes. Of course, you also need the software to support whatever camera you put in it. I seriously doubt it's a simple drop-in replacement, so this whole tangent is kind of moot.
  6. But the camera can't. Sorry - you're right. I was using my calculator set to monochrome. For color it would be 3x larger, so a 12bit file would be about 55MB/frame. For 10bit it would be about 46MB/frame. You could never do 10bit DPX faster than about 13fps because of the speed of the camera interface, and even that's a huge stretch. Probably more like half that speed in real-world usage. It's in the spec sheet for the camera. This is almost certainly why you can't get faster speeds.

This is probably unnecessary. Our ScanStation's host PC has dual GTX 1070 cards, a 25GbE NIC (for the camera), a 10GbE NIC (network), a CameraLink card (optical track reader), and a RAID card (internal RAID), and it's only an i7-9800 with 32GB RAM on a decent gaming motherboard. We can capture single-flash 6.5k scans at 15fps on this and 2k scans up to 60fps. A good i9 should be sufficient.
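The frame-size figures above are easy to sanity-check. Here's a small Python helper, assuming simple unpacked RGB storage and a hypothetical 4096x3072 scan size; real DPX packing (e.g. 10-bit samples filled into 32-bit words) shifts the numbers slightly:

```python
def frame_mb(width, height, bits_per_sample, channels=3):
    """Approximate uncompressed frame size in MB (1 MB = 10^6 bytes)."""
    return width * height * channels * bits_per_sample / 8 / 1e6

# Assumed 4096x3072 frame; channels=1 reproduces the monochrome slip.
mono_12 = frame_mb(4096, 3072, 12, channels=1)  # ~18.9 MB/frame
color_12 = frame_mb(4096, 3072, 12)             # ~56.6 MB, near the ~55 above
color_10 = frame_mb(4096, 3072, 10)             # ~47.2 MB, near the ~46 above
```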
  7. There are a lot of things that could cause slowdowns, but my bet is the interface. That camera has a USB3.1 Gen1 interface, which is only 5Gb/s (gigabits - note the small 'b'). That's 625MB/second, assuming you're getting full saturation, which you almost certainly are not, because it's USB. A 4k 12bit file is a bit over 18MB/frame. So the USB 3.1 connection can move 33 frames per second, theoretically, assuming it's an RGB image off the camera onto the disk. I would bet you're getting maybe 50-60% of your available 5Gb/s bandwidth if everything is perfectly configured, which might get you near 20fps max in bursts. But if the PC is doing other stuff (even background stuff you're not aware of), that's going to affect performance too. That could include any image processing they're doing on the computer, whether that's GPU or CPU based.

This is why professional scanners don't use USB cameras. To get 4k files off the camera to the PC at speed, you need a proper interface on a PCIe card: CameraLink (though nobody is really using this anymore), CoaXPress, or 25GbE (5x faster than USB3.1 Gen1). And you need a motherboard/CPU combo with enough bandwidth to allow all that data to move back and forth. Also, that camera has a tiny image buffer onboard - it can only hold about 6-7 frames of 12bit 4k, so if the FPGA on the camera is doing any processing, that could be a choke point as well. The problem isn't your disk speed.
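The link-speed arithmetic above can be sketched the same way. Everything below is illustrative: the 18.9MB frame size and the 55% efficiency figure are assumptions, not measurements:

```python
def max_fps(link_gbit_s, frame_mb, efficiency=1.0):
    """Theoretical frames/sec a link can move, given frame size in MB."""
    bytes_per_sec = link_gbit_s * 1e9 / 8  # gigabits -> bytes
    return bytes_per_sec * efficiency / (frame_mb * 1e6)

ideal = max_fps(5, 18.9)                       # ~33 fps at full saturation
realistic = max_fps(5, 18.9, efficiency=0.55)  # ~18 fps with USB overhead
```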
  8. This is nowhere near the 2500MB/s you claim it requires. 10bit DPX needs about 1200MB/s, plus you want a bit of overhead. If you have RAIDs that can do 1500MB/s consistently (and I don't mean testing with BMD or AJA disk speed tools, because those barely generate enough data to get past the caches on most hard drives), then you should have no problem with this.

I think you're conflating things here. You started off talking about the disk speed needing to be 2500MB/s, and that's not true. 1500 should be sufficient, though a bit tight. Now you're talking about the speed of the camera, but the camera being USB-C is a separate issue. The camera almost certainly isn't outputting a DPX file; it's pumping out the raw data, and then the DPX is being made by the frame grabber (in this case, it seems that might be a software element rather than hardware). DPX isn't a format that machine vision cameras really understand. They can pass the raw sensor data (which is pretty lightweight if it's not debayered yet), or they can convert to an RGB bitmap file and pass that back to the software. But not DPX.

And if you're looking at the specs for the camera on the manufacturer's site, the top speed they rate is always based on an 8bit image. That's the industry standard for machine vision cameras: to use the 8bit speed as the advertised speed of the camera. So you need to dig deeper to find out what the camera's speed really is for a higher bit depth image. That should be in the datasheet for that camera.

What you are describing here is not a bug in how the OS handles the data; it's that this is a very inefficient and inherently slow way to do the work. If it has to write the data to disk and then verify that data, it's writing and reading a big file. That's going to slow things down by adding a bunch of unnecessary I/O. But that's not the fault of the OS, that's just a matter of the way the scanner software is handling the files.
There's no reason they can't do the verification of the data in memory and then write the file. GPU has nothing to do with it. Since the camera in this system is a Bayer sensor, you would probably be better off writing the raw data to the disk and then converting it in Resolve. What camera is used in the scanner you have? Exact manufacturer and model? It should be easy enough to tell in the System Information window in Windows, if you can't see it on the physical camera.
  9. What format are you scanning to? 4k 16bit DPX (a ridiculous format to use, especially if you're scanning color neg) is about 1700MB/s at 24fps. We easily get those speeds on our SAN, which is home-built using off-the-shelf WD Red drives. But even with the network removed, one could get enough speed to do 4k DPX on a local array with just 4-5 drives in RAID 0 (and RAID 0 is fine if the files are only going to be there temporarily). There is no issue with Windows slowing down DPX sequences. You've brought this up before, but it's simply not the case. Back in Windows XP it was a problem, yes, but it hasn't been for many, many years. Working with a DPX sequence on an NTFS RAID 0 is pretty straightforward stuff and is done all the time, even at high bandwidth. We have local RAIDs for caching in all of our Windows machines - some are NVMe or SSD caches, but some are just simple 4x 7200RPM spinning disks. All are NTFS, and all can handle image sequences.
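As a rough sizing sketch for the "4-5 drives in RAID 0" point: divide the target sequence rate by a per-drive sustained rate. The 250MB/s per drive used below is an assumed ballpark for a modern 7200RPM disk, not a measured figure:

```python
import math

def drives_needed(target_mb_s, per_drive_mb_s=250.0):
    """Minimum RAID 0 drive count to sustain a target data rate."""
    return math.ceil(target_mb_s / per_drive_mb_s)

# ~1200 MB/s (4k 10bit DPX at 24fps) -> 5 drives
# ~1700 MB/s (4k 16bit DPX at 24fps) -> 7 drives
```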
  10. Pardon? Please explain. All you need for consistent speed, if you're aiming for real time, is a closed-loop stepper motor, hybrid stepper/servo motor, etc. This is all cheap and readily available, as are sophisticated motor control systems to drive those motors. The flywheel in machines like the old Magnatech dubbers was there because the stepper motors they used had fairly coarse steps, and it helps to smooth out the motion. What made the motor go the exact speed was the stepper (and the encoder to ensure it wasn't drifting; if it was, the speed was adjusted). The flywheel just smooths it out.

It is not. Freely available open source code exists that makes this possible: OpenCV. There is no need for AI; it's a simple matter of pattern matching. You have a predefined image of the perfs, for example, then with each frame you find that pattern in the overscanned image and figure out how far away from the target location that perf is. You move the image on the X and Y axes by that amount, and you have a well-registered frame. If you wanted a standalone application to do this, sure, it would need to be written. But you can do the basics in a short Python script.

Resolve 18's tracker looks very promising. I haven't tried it yet, but sophisticated tracking has existed for a long time in tools like After Effects, Nucoda/Phoenix, Resolve, Diamant, Baselight, etc. It's not an especially complex problem to solve, but you have to define the inputs (good references on which to stabilize). And all of those should work with a ProRes file, without requiring a DPX sequence. We use Phoenix for restoration on a 28-core Xeon PC. 3fps is par for the course in our world with 4k+ files. We're doing just fine, having earned plenty of dimes from this system.
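The registration approach described above (match a known perf pattern in the overscanned frame, then shift by the offset) really is only a few lines. With OpenCV it would be cv2.matchTemplate plus a translation; here's a numpy-only toy version of the same idea. The brute-force SSD search and the wrap-around np.roll shift are illustration choices, not what a production tool would do:

```python
import numpy as np

def find_perf(frame, template):
    """Brute-force sum-of-squared-differences template match.
    Returns the (x, y) of the best-matching window."""
    th, tw = template.shape
    fh, fw = frame.shape
    best, best_xy = None, (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            d = np.sum((frame[y:y + th, x:x + tw] - template) ** 2)
            if best is None or d < best:
                best, best_xy = d, (x, y)
    return best_xy

def register(frame, template, target_xy):
    """Shift the frame so the matched perf lands on target_xy."""
    x, y = find_perf(frame, template)
    dx, dy = target_xy[0] - x, target_xy[1] - y
    # np.roll wraps pixels around the edges; a real tool would pad instead.
    return np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
```

Run per frame against a reference perf image, and each frame lands on the same target position, which is all well-registered means here.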
  11. Before they made film scanners, they made recorders to make still images from a computer on film (slides I think). This went on one of those devices.
  12. Rob's right. This has gone off the rails, and I'm sorry about that. After asking him not to contact me privately, Dan keeps doing it, so I've blocked him where possible. Too many distractions when there's actual work to be done.
  13. The hole is yours, my friend. I have the thread with you from that conversation, which contradicts all that you're saying above. I also have the emails with the client from after I figured out who you were talking about. Amicable is definitely the tone of the communications with the client, which began with a request from him for a new quote, not a complaint.

You began that PM by telling me "I’ve just had a complaint against your company Perry." (You have no relation to my company, so I'm not sure why you're the one they're talking to.) He has never once contacted us about a problem. Had he, my first course of action would have been to ask him to send the film back so we could look into it. We did advise the client, who brought up the idea of scanning one reel to start, then the other as his budget allowed. That was not the first time we've done work like that. We advised him on ways he could reduce the cost without having a detrimental effect on the image quality.

Your words, verbatim, in that thread: "Please if you ever have a collector who can’t afford a full scan with your company don’t do that again. That’s ridiculous. If you don’t want to do a full 4K scan of a 2hr movie for $1500 then send a customer to one of your competitors that will so that situation doesn’t happen again" (The "don't do that again" line is in reference to scanning one reel of a multi-reel film.)

The reality is that the only time you really need someone to be physically present is when the system is being installed, and travel is included in that cost. We had two people out here for 2 days when we got ours in 2013: one day to uncrate, assemble and calibrate the machine, one day for training. The cost included flights and hotels, and FWIW, Steve bought me lunch both days. In the years we've had the scanner we've never had a need for an on-site visit for support, and we've had a number of issues that had to be dealt with.
The scanner is designed such that most parts are user-replaceable and are modular in nature. For example, if you had to replace the camera for some reason, they will send you detailed step by step instructions on how to do so. It's a couple cables and a couple screws to remove the camera module from the machine. You send it to Lasergraphics and they repair or replace it. In some cases they will ship you the new part first (like a new camera upgrade), and you send the old one back to them in the box the new one came in. The scanner came with a set of tools required for assembly, and for future removal/reassembly of parts. The design principle of their machines is that you don't need a field tech because the parts are plug-and-play modules. All other support is done via email, phone and remote login on the machine. If there's a problem they may ask you to do a remote login session. You need to be at the machine to load film and do some things in meatspace that can't be done over the phone, but again, we've never had a need to have someone come out to fix something. This is different than how some companies do it - Arri, for instance, will send a tech (at your expense) to deal with things. I'm sure Lasergraphics could do that if necessary, but I don't know anyone who has had to do that.
  14. Nice. Yeah, it can be really hard to pull the color out of something that faded. There's a little bit left in there, but what remains is highly variable from film to film. So while two reels may look equally pink to the eye, there's a very high probability of never getting any color at all from one of them. I've found that Resolve's Auto Color feature is often a good starting point for this kind of thing, and I suspect BMD is doing something similar to what Lasergraphics is doing in that feature.

Yeah, ignoring is a possibility, and that's good advice. Normally that's exactly what I'd do. But there's a reason I respond to his posts: this site and some others come up as the first hit in a lot of Google searches about scanning, and on more than one occasion I've had to spend an inordinate amount of time dealing with potential (or existing) clients who have questions about incorrect information they read here and elsewhere. Some of these are Dan's posts.

Additionally, and this was in a private email thread on another forum, Dan accused us of being irresponsible to our customers because we scanned one reel of a film for a collector, instead of the whole film. This was the customer's choice, because of budget. His assertion was that we shouldn't accept work like that if we're not scanning the whole film, which is patently ridiculous. The customer only sent us one reel. Are we supposed to refuse to scan that on principle because it's an incomplete film? Many of our customers engaged in long-term restoration projects will send us a reel at a time, as they can afford to, and will work with what they have in the meantime, sometimes over many years. A lot of these projects are unpaid labors of love and take a long time to complete because of the costs and effort involved. This is an *extremely* common scenario.
He had been conversing with this client privately, and relaying information to me without telling me what the film was or who the client was, telling me the client felt "ripped off." (the customer never contacted us with any complaints). His justification for this? The ProRes files seemed too small, indicating the customer didn't get the resolution they requested. They did. It was a 4k Scan to ProRes 422HQ with HDR. In fact, the files were smaller because it was ProRes 422HQ, vs 4444, and they are, well, smaller. But it's exactly the specifications the customer asked for. He also accused us of not scanning the film in focus, telling me and the customer that we scanned it "too fast" causing focus issues (a completely cockamamie idea. It just doesn't work that way), and then that we should have focused on the edges of the perforations, instead of the image (this is not how you do it and itself will result in an out of focus image), but he also told our customer as much, sowing confusion and doubt. The print in question was of low quality, and was a reduction from 35mm to 16mm, several generations removed from the original. The optical reduction was slightly out of focus - you could clearly see the film grain in our scan, indicating that we were in focus on the film. That, as you well know, is the only way to ensure it's in focus. Meanwhile, I heard from that client, and we explained why everything was the way it was, and that seemed to end amicably enough that he asked for a quote on a new job. But it cost me half a day of digging through old files, emails and order forms, and a fair amount of stress, to ensure that we didn't screw something up. Then I had to explain to the customer the litany of things Dan was incorrect about. Unfortunately, it's hard to ignore the pathologically uninformed, who post stuff about you and about things they have no direct knowledge of on the internet, as if they're the expert. 
Because once bad information is up there, it's up there for good.
  15. And you are wrong. You are inferring that the tool is designed to eliminate post production, which is not what they're saying. Nobody is forcing you to use this feature, but it's there if you want it, and they will tell you straight up that you shouldn't use it for master scans. Again, as you say above, people know this already because it's an archival scanner, and grading in-scanner isn't how you're supposed to use it for master scans. So you're contradicting yourself here. If anyone's being contrived in their argument, it's you. I'd love to know what you think it explains, having never seen this particular reel of film before. In any case, you seem to live in a world based on assumptions, feelings and opinions, not facts. I wonder how that's working out for you?

1) 2k scan, SDR, done in about 5 minutes from an incredibly crappy reel of film that's badly faded. It's never going to look better than that, even if it's graded from the flat scan, because there is virtually no color left on the film. The intention is to show that the feature works as advertised, which it does.

2) @Daniel D. Teoli Jr.: This is in part why we don't post example scans, because every last "expert" on the internet will pick it apart without knowing anything or having seen the source material. It's not worth the time or effort to have to argue with every last yahoo on the internet. (And yet, here I am.)

Dan, honestly, what is your problem, man? Why do you insist on going on and on about stuff you so obviously know nothing about, even when multiple people prove you wrong time and time again? I simply don't understand what your deal is.