Perry Paolantonio

Everything posted by Perry Paolantonio

  1. Yes, they do. It's on the product pages for all the non-Director models. That's a ridiculous statement, sorry.
  2. Also, this is a terrible analogy. When you hire a photographer, you are hiring someone by the hour (even if it's a package price, they're most likely figuring it out by the hour). You are paying for someone's time and skill. You are not paying because of the camera they use. And if you are, you're going about finding a photographer the wrong way. The idea that anyone can buy a high-end camera and take a good picture is ridiculous. I could give my mother my good DSLR and she'd still take photos of people with their mouths open, eyes half closed, and their faces partially cropped. The same notion extends to film scanning. There are plenty of folks out there with good film scanners who do bad work. We've re-scanned a lot of that stuff over the years. Just because you have a tool doesn't mean you know how to use it.
  3. Seriously? Time is money. If you scan with the scanner in 2.5k mode you can get 60fps. If you scan at 5k it's half that speed. If you scan in 6.5k mode it's half that again. So 2k = 60fps, 5k = 30fps, 6.5k = 15fps. If you're doing HDR, halve each of those numbers. If you're outputting a 2k file from a 5k scan (Super2k), it still takes the same time as a 5k scan; the file the scanner generates is just smaller because there's less data there. None of this takes into account file formats. If the client wants a 5k scan as DPX sequences, the scanner can happily do that at the rated speed. It will, however, take 10x longer to copy the files off than it took to scan them. 20x if the client provided a USB drive. No. Because it costs more to work at higher resolutions at every level: scanning, file wrangling, rendering (if you're grading or restoring), etc. It's not a conspiracy or gouging (well, it may be by some), it's because we also need to make a living, pay our employees, rent, electricity, insurance, shipping, internet, etc.
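
To put rough numbers on that, here's a sketch of the arithmetic using the speeds quoted above; the reel length and frames-per-foot figure are just a hypothetical 16mm example.

```python
# Rough scan-time arithmetic using the speeds quoted above; the reel length
# and frames-per-foot figure are a hypothetical 16mm example.

MODE_FPS = {"2k": 60, "5k": 30, "6.5k": 15}  # single-flash scan speeds (fps)

def scan_minutes(frames: int, mode: str, hdr: bool = False) -> float:
    """Approximate scan time in minutes for a given frame count."""
    fps = MODE_FPS[mode] / (2 if hdr else 1)  # HDR halves the throughput
    return frames / fps / 60

frames = 1200 * 40  # a 1200 ft reel of 16mm is roughly 48,000 frames (40 frames/ft)
for mode in MODE_FPS:
    print(f"{mode}: {scan_minutes(frames, mode):.0f} min, "
          f"HDR: {scan_minutes(frames, mode, hdr=True):.0f} min")
```
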
  4. I run Resolve on a mid-2020 iMac and a 2019 MacBook Pro, and we have it running on Windows and Linux as well. If you're doing lots of mattes, noise reduction, etc., then yes, it's a resource hog, but for basic playback you pretty much just need to meet the minimum GPU requirements. His machine may not. An alternative (paid) is Scratch Play Pro. I assume since he doesn't have QuickTime, he's on Windows. So the obvious solution is to install QuickTime and use QuickTime Player. It's free and will run on very old hardware.
  5. Not well, though. It's buggy software and we try to steer clients away from it. Daniel: you need to be working with real software. Resolve will let you change the frame rate as I described. It is free software. Try it. It should work with TIFF. I don't know about JPEG; scanning to JPEG sequences isn't really a thing because it's not a good format.
  6. No it's not. What's in the file is metadata. You can edit that if you want, or you can bring it into Resolve (or Premiere or After Effects) and just tell the application to treat it as whatever frame rate you want. It maps the frames 1:1, so if you bring a 24fps scan into Resolve, tell it it's 16fps, and then drop it in a 16fps timeline, it's as if you scanned at 16.
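
A quick illustration of that 1:1 mapping, with made-up numbers: only the playback duration changes, never the frame count.

```python
# Reinterpreting the frame rate is metadata only: the same frames simply
# play back over a different duration. All values here are hypothetical.

frame_count = 2400   # frames in the scan
scanned_as = 24      # fps written into the file's metadata
treat_as = 16        # fps you tell Resolve/Premiere to use instead

print(frame_count / scanned_as)  # 100.0 seconds if played at 24 fps
print(frame_count / treat_as)    # 150.0 seconds at the intended 16 fps
# No frames are added or dropped, which is the 1:1 mapping described above.
```
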
  7. It doesn't need to be DPX to change the frame rate. You can do it with ProRes files or most others as well. GOP-based files might not work properly (MPEG variants, for example). As you said, 8mm and 16mm (non-pro, old home movies) are typically 16fps. Sound speed is 24, so 16mm shot on better/newer gear is usually 24fps. Super 8 could be 18 or 24, depending on the camera. Some older commercial films were done at 20, but not home movies. And hand-cranked film could be all over the place. The biggest variable is the camera. Most old home movie cameras (8mm, 16mm) were spring-wound. Those motors could easily fluctuate by a couple of frames per second in either direction. Nobody is really going to notice that unless you're trying to sync it with sound. You'd set it to 16fps, but it could be 15 or 17. Or 16. Or all three in the span of a few seconds. If people think they can tell some old home movies are off by 1 frame per second, I have a bridge I'd like to sell them. Or maybe some Monster Cable.
  8. Stock. I think. My ACL 1.5 has this handgrip. It's more comfortable than you'd think, since you can adjust the angle. I always wanted a wooden handgrip like the Aaton's, though. Each of those slides holds a Wratten filter that goes into the camera body behind the lens.
  9. This doesn't sound like a problem with the film; it sounds like a scanning issue. Most scanners are designed for color film and use color cameras. If you scan B/W film in color mode, it's going to have a subtle color tint to it (just like if you printed B/W to color stock). The easy fix is to just make your scan monochrome in your grading tool. I'd tend to do that as a first step anyway when grading B/W material.
  10. No it won't. At least not definitively. It might tell you if the lens is broken, but that doesn't seem to be the issue here. As I mentioned above, the correct lens choice depends on a number of factors, including the size of the sensor's photosites, the area of the sensor, and the magnification factor that's required. While you might be able to tell if something is grossly off with the lens by trying it on another camera, that doesn't sound like the problem here. Tyler has said that it's not as crisp as scanners like the ScanStation, which has a more expensive lens as well as a more complex focusing/rack system for that lens. I don't think this is a matter of the lens not working properly; I think it's probably just the lens that fit FilmFabriek's budget and mechanical constraints, and it is what it is.
  11. The full ScanStation is different. Also a Schneider, but a Makro-Symmar. And the lens in our 5k was different from the one in our 6.5k, even though they're both the same basic model. We're using a very similar Makro-Symmar in the 70mm scanner, but there are real differences between revisions of the same model lens. According to Schneider, there is only one variant in their lineup that is suitable for the sensor, pixel size, and film-to-sensor distance of the 14k camera we're using, and all the others would be soft in that setup. A different sensor or different pixel size would require a different lens (which is why the 5k and 6.5k ScanStations had different variations of the same lens). The point of my post above is that the manufacturer's choice of a lens is partly about what's available, but every one of those companies picked their specific lenses based on the mechanical and magnification requirements, the sensors being used, the film gauges they're optimized for, etc. That comes down to math, not guesswork and assumptions, of which there's been a ton in this thread. Just because a lens is good in one scanner doesn't mean it's the right choice for another. (I also have a 95mm Printing Nikkor 2.8 like the one in the Xena and the Arriscan, but it wouldn't work in our scanner due to mechanical and space constraints.)
  12. Then contact other vendors. I don't really see what the problem is here. You write one email and send the same one to 4-5 manufacturers and see what you get.
  13. Rather than guessing, why not contact a company like Schneider and ask for a recommendation? You will need to know the exact size of the sensor, the pixel size of the sensor (all available from the spec sheet), the distance from the film to the sensor, and the type of mount you're currently using. The calculation for a lens is just math: you pick the lens based on the enlargement you need, the space you have to work with, and the resolving power of the lens, factoring in the size of the photosites on the sensor. We chose our Schneider lens for our 70mm scanner after discussing with them exactly which model we'd need to cover everything from 35mm to 15-perf IMAX (it's a kind of macro/bellows setup with separate stages for lens and camera, so we can move both as needed). You can easily buy a lens with the same focal length, magnification factor, and model name on eBay, but if you get the wrong version it's not optimized for the sensor's photosite size. There are a lot of factors to consider, and this problem could be resolved with a couple of quick emails to a lens manufacturer. We also looked at lenses from Linos, Myutron, and Nikon (Rayfact), but ultimately went with Schneider. All were helpful in choosing the correct lens (though Schneider was the most helpful), but you need to give them the right information.
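
For what it's worth, here's a rough sketch of the two numbers that drive that conversation: the magnification the lens must deliver and the resolution it has to hold at the sensor. The sensor width, pixel pitch, and film gauge below are hypothetical placeholders, not the values for any particular scanner.

```python
# The two numbers behind the "lens choice is just math" point above. All
# values here are hypothetical placeholders; the real ones come from the
# sensor spec sheet and the film gauge you need to cover.

sensor_width_mm = 36.0    # active sensor width (hypothetical)
pixel_pitch_um = 3.76     # photosite size (hypothetical)
film_width_mm = 12.52     # e.g. the Super 16 camera aperture width

# Magnification the lens must deliver (image size / object size).
magnification = sensor_width_mm / film_width_mm

# To avoid the lens limiting the sensor, it should resolve at least the
# sensor's Nyquist frequency at the image plane, in line pairs per mm.
nyquist_lp_per_mm = 1000.0 / (2 * pixel_pitch_um)

print(f"required magnification: ~{magnification:.2f}x")
print(f"lens must resolve: ~{nyquist_lp_per_mm:.0f} lp/mm at the sensor")
```
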
  14. That does not look like a file direct off the scanner. As you noted, it appears to have been exported from Resolve. You should ask for a DPX file direct off the scanner and look at that. Perhaps they're doing a one-light grade in Resolve and exporting DPX from that, which is why the file looks this way. In any case, make sure you have a DPX file made by the scanner software directly, and then look at the metadata.
  15. DPX can be linear as well. It's a pretty flexible format. You can even make YUV DPX files if you want, which are definitely not log. Log scans exist as a way to match the characteristics of negative film to a digital format; log simply isn't applicable to positive film. When you scan positive film to a DPX file, it's linear. If they're treating it as log (which is certainly possible), you would need to interpret it as such on playback too, but if that's what they're doing, they're doing it wrong. My bet is that the specs are just incorrect or incomplete.
  16. Scans from reversal aren't log. Log is only applicable to color negative scans. What you want to ask for is a flat scan, as suggested above. When we scan positive film on our ScanStation 6.5k, we typically set the black level to about 10% (roughly the same point as dmin on a log scan), and we reduce the white levels so that the perforations (where there is no film, and thus maximum brightness from the light source) are at 100%. The whitest whites in the film fall where they may, but they will never blow out this way.
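
The post above describes setting those levels on the scanner itself; purely as an illustration of where the values end up, the same placement can be written as a linear remap. The measured values below are hypothetical.

```python
import numpy as np

# Illustration only: map the darkest film content to ~10% and the clear
# perforation area (no film, maximum light) to 100%. Values are hypothetical;
# in practice this is set on the scanner, not in post.
scan = np.random.rand(1080, 1920).astype(np.float32)  # stand-in for a 0-1 frame

darkest_film = 0.04   # measured darkest value within the film image
perf_level = 0.92     # measured brightness inside a perforation

remapped = 0.10 + (scan - darkest_film) / (perf_level - darkest_film) * 0.90
remapped = np.clip(remapped, 0.0, 1.0)  # film whites stay below the perf level, so nothing blows out
```
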
  17. That's the spec for the raw sensor. The camera that sensor is put into may impose further limits on speed/resolution/bit depth, depending on how it's configured. So the same sensor in two different cameras may produce slightly different image sizes, at different speeds, especially if one of the cameras is doing any kind of processing internally.
  18. Right. My point was more that you don't need a Xeon or Threadripper to get the PCIe lane bandwidth to add a 25GbE NIC. The PC the ScanStation uses is nothing especially fancy, just a gaming rig really, with a mediocre CPU. A similar motherboard with an i9 should provide plenty of PCIe lanes. Of course, you also need the software to support whatever camera you put in. I seriously doubt it's a simple drop-in replacement, so this whole tangent is kind of moot.
  19. But the camera can't. Sorry - you're right. I was using my calculator set to monochrome. For color it would be 3x larger, so a 12bit file would be about 55MB/frame. For 10bit it would be about 46MB/frame. You could never do 10bit DPX faster than about 13fps because of the speed of the camera interface, and even that's a huge stretch. Probably more like half that speed in real-world usage. It's in the spec sheet for the camera. This is almost certainly why you can't get faster speeds. This is probably unnecessary. Our ScanStation's host PC has dual GTX 1070 cards, a 25GbE NIC (for the camera), a 10GbE NIC (network), a CameraLink card (optical track reader), and a RAID card (internal RAID), and it's only an i7 9800 with 32GB RAM on a decent gaming motherboard. We can capture single-flash 6.5k scans at 15fps on this and 2k scans up to 60fps. A good i9 should be sufficient.
  20. There are a lot of things that could cause slowdowns, but my bet is the interface. That camera has a USB 3.1 Gen1 interface, which is only 5Gb/s (gigabits - note the small 'b'). That's 625MB/second, assuming you're getting full saturation, which you almost certainly are not, because it's USB. A 4k 12bit file is a bit over 18MB/frame. So the USB 3.1 connection can move 33 frames per second, theoretically, assuming it's an RGB image going off the camera onto the disk. I would bet you're getting maybe 50-60% of your available 5Gb/s bandwidth if everything is perfectly configured, which might get you near 20fps max in bursts. But if the PC is doing other stuff (even background stuff you're not aware of), that's going to affect performance too. That could include any image processing they're doing on the computer, whether that's GPU or CPU based. This is why professional scanners don't use USB cameras. To get 4k files off the camera to the PC at speed, you need a proper interface on a PCIe card: CameraLink (though nobody is really using this anymore), CoaXPress, or 25GbE (5x faster than USB 3.1 Gen1). And you need a motherboard/CPU combo with enough bandwidth to allow all that data to move back and forth. Also, that camera has a tiny image buffer onboard - it can only hold about 6-7 frames of 12bit 4k, so if the FPGA on the camera is doing any processing, that could be a choke point as well. The problem isn't your disk speed.
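
Here's a back-of-the-envelope version of that bandwidth math. The frame dimensions are a hypothetical ~4K sensor, not the actual camera's; the real numbers come from its spec sheet.

```python
# Back-of-the-envelope version of the USB bandwidth math above. Frame
# dimensions are a hypothetical ~4K sensor; real numbers come from the spec sheet.

def frame_mb(width: int, height: int, bit_depth: int, channels: int) -> float:
    """Approximate uncompressed frame size in MB (ignores headers/padding)."""
    return width * height * channels * bit_depth / 8 / 1e6

usb31_gen1_mb_s = 5_000 / 8  # 5 Gb/s link speed -> 625 MB/s

raw_12bit = frame_mb(4096, 3072, 12, channels=1)  # ~18.9 MB, undebayered sensor data
rgb_12bit = frame_mb(4096, 3072, 12, channels=3)  # ~56.6 MB, debayered RGB

for label, size in [("raw 12bit", raw_12bit), ("RGB 12bit", rgb_12bit)]:
    theoretical = usb31_gen1_mb_s / size
    realistic = theoretical * 0.55  # ~50-60% of the bus in practice
    print(f"{label}: {size:.1f} MB/frame, "
          f"{theoretical:.0f} fps theoretical, ~{realistic:.0f} fps realistic")
```
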
  21. This is nowhere near the 2500MB/s you claim it requires. 10bit DPX needs about 1200MB/s, plus you want a bit of overhead. If you have RAIDs that can do 1500MB/s consistently (and I don't mean testing with BMD or AJA disk speed tools, because those barely generate enough data to get past the caches on most hard drives), then you should have no problem with this. I think you're conflating things here. You started off talking about the disk speed needing to be 2500MB/s, and that's not true; 1500 should be sufficient, though a bit tight. Now you're talking about the speed of the camera, the camera being USB-C. The camera almost certainly isn't outputting a DPX file; it's pumping out the raw data, and the DPX is being made by the frame grabber (in this case, it seems that might be a software element rather than hardware). DPX isn't a format that machine vision cameras really understand. They can do things like pass the raw sensor data (which is pretty lightweight if it's not debayered yet), or they can convert to an RGB bitmap and pass that back to the software. But not DPX. And if you're looking at the specs for the camera on the manufacturer's site, the top speed they rate is always based on an 8bit image. That's the industry standard for machine vision cameras: to use the 8bit speed as the advertised speed of the camera. So you need to dig deeper to find out what the camera's speed really is for a higher bit depth image. That should be in the datasheet for that camera. What you are describing here is not a bug in how the OS handles the data; it's that this is a very inefficient and inherently slow way to do the work. If it has to write the data to disk and then verify that data, it's writing and then reading a big file. That's going to slow things down by adding a bunch of unnecessary I/O. But that's not the fault of the OS, it's just a matter of how the scanner software is handling the files. There's no reason they can't verify the data in memory and then write the file. The GPU has nothing to do with it. Since the camera in this system is a Bayer sensor, you would probably be better off writing the raw data to disk and then converting it in Resolve. What camera is used in the scanner you have? Exact manufacturer and model? It should be easy enough to tell in the System Information window in Windows, if you can't see it on the physical camera.
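
For reference, the sustained disk rate behind that "about 1200MB/s" figure works out as below, assuming a hypothetical ~4K RGB frame at 24fps.

```python
# The sustained write rate behind the "about 1200MB/s for 10bit DPX" figure,
# assuming a hypothetical ~4K RGB frame at 24fps.

width, height = 4096, 3072
frame_mb = width * height * 3 * 10 / 8 / 1e6   # 10bit RGB, ~47 MB/frame
sustained_mb_s = frame_mb * 24                 # ~1130 MB/s at 24fps

print(f"{frame_mb:.0f} MB/frame -> {sustained_mb_s:.0f} MB/s sustained, "
      "plus headroom for headers, metadata, and filesystem overhead")
```
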
  22. What format are you scanning to? 4k 16bit DPX (a ridiculous format to use, especially if you're scanning color neg) is about 1700MB/s at 24fps. We easily get those speeds on our SAN, which is home-built using off-the-shelf WD Red drives. But even with the network removed, you could get enough speed for 4k DPX on a local array with just 4-5 drives in RAID 0 (and RAID 0 is fine if the files are only going to be there temporarily). There is no issue with Windows slowing down DPX sequences. You've brought this up before, but it's simply not the case. Back in the Windows XP days it was a problem, yes, but it hasn't been for many, many years. Working with a DPX sequence on an NTFS RAID 0 is pretty straightforward stuff and is done all the time, even at high bandwidth. We have local RAIDs for caching in all of our Windows machines - some are NVMe or SSD caches, but some are just simple 4x 7200RPM spinning disks. All are NTFS, and all can handle image sequences.
  23. Pardon? Please explain. All you need for consistent speed, if you're aiming for real time, is a closed-loop stepper motor, a hybrid stepper/servo motor, etc. This is all cheap and readily available, as are sophisticated motor control systems to drive those motors. The flywheel in machines like the old Magnatech dubbers was there because their stepper motors had fairly coarse steps, and the flywheel helps smooth out the motion. What made the motor go the exact speed was the stepper (and the encoder to ensure it wasn't drifting; if it was, the speed was adjusted). The flywheel just smooths it out. It is not. Freely available open source code exists that makes this possible: OpenCV. There is no need for AI; it's a simple matter of pattern matching. You have a predefined image of the perfs, for example, then with each frame you find that pattern in the overscanned image and figure out how far away from the target location that perf is. You move the image on the X and Y axes by that amount, and you have a well-registered frame. If you wanted a standalone application to do this, sure, it would need to be written. But you can do the basics in a short Python script. Resolve 18's tracker looks very promising. I haven't tried it yet, but sophisticated tracking has existed for a long time in tools like After Effects, Nucoda/Phoenix, Resolve, Diamant, Baselight, etc. It's not an especially complex problem to solve, but you have to define the inputs (good references on which to stabilize). And all of those should work with a ProRes file, without requiring a DPX sequence. We use Phoenix for restoration on a 28-core Xeon PC. 3fps is par for the course in our world with 4k+ files. We're doing just fine, having earned plenty of dimes from this system.
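
A minimal sketch of that perf-matching approach, using OpenCV's template matching; the filenames and the target perf position are hypothetical, and a real pipeline would do this per frame in memory rather than through files on disk.

```python
# Minimal sketch of perf-based registration with OpenCV template matching.
# Filenames and the target perf position are hypothetical.

import cv2
import numpy as np

frame = cv2.imread("overscanned_frame.png", cv2.IMREAD_GRAYSCALE)
perf_template = cv2.imread("perf_template.png", cv2.IMREAD_GRAYSCALE)
target_xy = (120, 240)  # where the perf should sit in a registered frame

# Find the perf in the overscanned image.
result = cv2.matchTemplate(frame, perf_template, cv2.TM_CCOEFF_NORMED)
_, _, _, found_xy = cv2.minMaxLoc(result)  # best-match top-left corner (x, y)

# Shift the frame so the perf lands on the target location.
dx = target_xy[0] - found_xy[0]
dy = target_xy[1] - found_xy[1]
M = np.float32([[1, 0, dx], [0, 1, dy]])
registered = cv2.warpAffine(frame, M, (frame.shape[1], frame.shape[0]))

cv2.imwrite("registered_frame.png", registered)
```
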
  24. Before they made film scanners, they made recorders for outputting still images from a computer to film (slides, I think). This went on one of those devices.