Perry Paolantonio

About Perry Paolantonio

Profile Information

  • Occupation
    Other
  • Location
    Boston, MA
  • My Gear
    Lasergraphics ScanStation 6.5k, 70mm 14k Sasquatch Archival scanner, Eclair ACL II, Pro8mm modded Max8 Beaulieu 4008
  • Specialties
    14k/70mm, 6.5k, 4k, UHD, 2k 8mm-35mm Film Scanning, Film Restoration

Contact Methods

  • Website URL
    http://www.gammaraydigital.com

  1. No, they're not. No optical disc is "archival." For one thing, optical drives haven't shipped as standard equipment on computers in years. Good luck finding a working drive to read those discs in 20 years, and you'd better hope that implanted supercomputer you'll be using in 2075 has USB ports.

     Second, the claims about how long the data on these discs will last are based on theoretical testing, since recordable optical media (in 5" disc form) has only been around since the 90s. Nobody can say definitively that the discs will still be readable; it's extrapolation. And the companies offering the guarantees will only refund you the cost of the media, not the cost of the lost data, assuming they're still around at all.

     Optical disc technology is not reliable, and never was. Just because it says "archival" on the package doesn't mean it is; that's marketing. The idea that you can put your files on any one form of media (other than film, and even that has obvious limitations) and expect it to be readable in 20, 30, or 50 years is completely unrealistic.
  2. Yes. RAID drives can fail sooner, especially if they're not enterprise-grade drives. But all hard drives are susceptible to failure. No hard drive is a reliable long-term storage solution. Full stop. This isn't really up for debate: it's a well-understood problem that affects drives used daily as well as drives used only occasionally. Advising people to put a ton of data on a drive as a long-term solution is, IMO, irresponsible.

     MTBF, by the way, is typically way off from reality and is mostly marketing, like the supposed 100-year lifespans of some optical discs. A while back Seagate stopped using MTBF as a benchmark and switched to AFR, the annualized failure rate of drives. There was an article on their site about it, and if I recall correctly, they said manufacturer MTBF numbers were about 2x more optimistic than real-world results, and that MTBF was often just a guess based on statistical extrapolation, not field data. The temperature inside the enclosure of an operating drive can have a huge effect on its lifespan, and MTBF projections can't account for the wide range of operating conditions in the field. Will the drive sit in a data center kept at 68 degrees, or in a server closet in a small studio that's closer to 90? Is it being regularly shipped cross-country, getting dropped and banged around? So a metric based on actual failure rates is more accurate.

     Backblaze is an important resource for this kind of data, though it's worth keeping in mind that they pull their numbers from drives in a RAID or RAID-like (probably ZFS) setting. https://www.backblaze.com/blog/backblaze-drive-stats-for-2024/ Different drive mechanisms have different AFRs, but as you can see they ranged from 0.22% to 11.38% in 2024 (scroll down for the annual chart). That tracks with what we've seen in-house. At the moment we have approximately 150 drives in operation in our studio at any given time, including portable drives we use as shuttles. We see a higher failure rate than Backblaze does, in part because our drives are not in as controlled an environment (a very large, monitored data center), and in part because a lot of our drives are regularly plugged into and unplugged from machines. We go through about 12-14 drive failures a year on average: roughly an 8% annual failure rate, which is probably more representative of real-world use than Backblaze's numbers. In my book that's more than enough reason not to trust drives for long-term storage. We only advise clients to use hard drives if they have a 3-5 year data migration strategy in place, and we recommend keeping at least 2 copies at all times in different physical locations.
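     For anyone who wants to sanity-check their own fleet, the AFR arithmetic is simple. Here's a minimal sketch, using the rough in-house numbers above as example inputs (the drive-days variant is how Backblaze normalizes for drives added or retired mid-year):

     ```python
     # Annualized Failure Rate (AFR): failures per drive-year, as a percentage.
     # Inputs are the rough in-house figures from the post, used as examples.
     drives_in_service = 150   # drives running at any given time
     failures_per_year = 13    # midpoint of the observed 12-14 failures/year

     print(f"AFR: {failures_per_year / drives_in_service:.1%}")  # ~8.7%

     # Backblaze computes AFR from drive-days, which accounts for drives
     # added or retired mid-year: AFR = failures / (drive_days / 365).
     # With a constant fleet this gives the same answer.
     drive_days = drives_in_service * 365
     print(f"AFR (drive-days): {failures_per_year / (drive_days / 365):.1%}")
     ```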
  3. Hard drive average lifespans are typically well under a decade. This is not me making a number up; it's well documented. You may have some very old ones that still work, but that's the exception, not the rule.

     You're correct that data migration is important. Nobody, ever, should be thinking about any storage format as if it's going to last more than 5 years. It might, but if you care about your data, you're moving it to new formats on a regular schedule. LTO can last a long time, but it's still tape, and tape is delicate. And newer LTO formats aren't as backward compatible with previous generations as older LTO formats were, so you need to maintain more (and very expensive) hardware, which itself fails. Ask me how I know. I like LTO, but tape is also susceptible to erasure if it gets close to magnets, which can be a problem. And if the tape mechanism is damaged, the tape itself can get jammed in the machine, requiring delicate surgery to extract it and almost certainly causing data loss.

     Consolidating lots of data onto larger drives is a bad strategy: you're putting all your eggs in one basket, and if that drive fails you've lost that much more. It's better to spread the data across many smaller drives so that a single failure doesn't take out a big chunk of your collection (see the sketch below).

     Of course, we all have drives and tapes that are old. A couple of years ago we finally migrated a bunch of DLT tapes from 20-25 years ago to LTO; an entire box of them fit on one LTO-8 tape. Nothing on there was critically important at this point, but it was beyond time, just to be sure. About 1/3 of the tapes were unmountable. Same with old LTO-1 backup tapes from roughly the same era. Hell, I've got ZIP disks from the mid 90s that still work, and floppies from the 80s too. But that's just dumb luck; most of them are unreadable now.

     We have about 50 dead hard drives in house, heading for recycling soon. Would the data be recoverable? Maybe, but if a drive physically crashed, the head hit the platter, and the damage from that can be pretty catastrophic. (Instead, we destroyed the drives by disassembling them. The platters now sit in a stack, and the metal cases and electronics will be recycled. I added two new dead ones to the pile just last week.)

     Don't rely on hard drives. Don't rely on any storage format to last. Always be migrating.
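     Here's a back-of-the-envelope sketch of that eggs-in-one-basket math, assuming (purely for illustration) the same independent 8% annual failure probability for every drive:

     ```python
     # One big drive vs. the same data spread over several smaller drives.
     # Assumes, for illustration only, an 8% annual failure chance per drive.
     p_fail = 0.08
     total_tb = 20

     # One 20 TB drive: any failure loses everything at once.
     expected_loss_one = p_fail * total_tb                 # 1.6 TB/yr

     # Five 4 TB drives: each failure loses only a fifth of the data.
     expected_loss_five = 5 * p_fail * (total_tb / 5)      # also 1.6 TB/yr

     # The expected loss is the same, but the worst single failure drops
     # from 20 TB to 4 TB: no one failure wipes out the whole collection.
     print(f"{expected_loss_one:.1f} TB vs {expected_loss_five:.1f} TB expected loss/yr")
     ```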
  4. Mbps means megabits per second and nothing more. It *can* correlate with quality, but it doesn't always. All that number tells you is how many bits per second are allocated to the picture. A good encoder can do a lot with a little, and by tweaking other settings in the same encoder, you can make two files at the same bit rate that look dramatically different. The quality of the encoding algorithm, whether it's single-pass or multi-pass encoding, which codec, which codec settings, etc., all affect the picture quality.

     Back in the early DVD days, you couldn't make a high-quality disc at a bit rate below about 6-7 Mbps, because the MPEG encoders of the late 90s and early 2000s were relatively primitive. "Superbit" DVDs were just DVDs encoded at a higher bit rate to get around this problem. By about 2005 there were software-based DVD encoders, like Cinemacraft, that could make DVDs matching the quality of those old Superbit discs at half the bit rate. Same footage, same quality, same file format, half the data rate (and thus half the size, letting you fit more on the disc without sacrificing quality). So Mbps isn't a measure of quality; it's a measure of bandwidth.
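     To make the bandwidth-vs-size relationship concrete, here's a minimal sketch of the arithmetic, using the DVD-era bit rates above purely as examples:

     ```python
     # File size follows directly from bit rate and duration. Mbps is
     # bandwidth, not quality; the rates below are the DVD-era examples.
     def size_gb(mbps: float, minutes: float) -> float:
         """Approximate file size in GB for a given bit rate and runtime."""
         bits = mbps * 1_000_000 * minutes * 60
         return bits / 8 / 1_000_000_000   # bits -> bytes -> gigabytes

     runtime = 120  # a two-hour feature
     print(f"6.5 Mbps (early encoder): {size_gb(6.5, runtime):.1f} GB")              # ~5.9 GB
     print(f"3.25 Mbps (same quality, better encoder): {size_gb(3.25, runtime):.1f} GB")  # ~2.9 GB
     ```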
  5. Mbps is not a measure of quality, so there's no way to answer your question.
  6. To answer your question: yes, but it depends on the scanner. Lasergraphics can do it, and I believe the Northlight could as well, by modifying the config files accordingly. I don't know about other scanners. That said, I don't think we've ever once been asked to do this. Even in the scanner, it's done after the image is captured and before it's written to disk, so it's happening in software, probably on the GPU for speed. So there's no real advantage to doing it in the scanner.
  7. It's not a matter of what I "feel" they want. It's what they have told us they want. An archive looking to make a digital preservation copy generally isn't interested in making up picture where it was missing; that's counter to the whole idea of preservation.

     Different clients have different goals, though. Often the goal is just to improve the film for projection, so we do a light dustbust and then manually touch up the really egregious defects. But many clients want pristine cleanups, which means a fair bit of manual work, typically about 2-3 weeks for a feature-length film. Even then, a line is typically drawn at recreating entire frames. In some cases we'll be asked to simply duplicate frames to fill a gap, so it's less distracting (see the sketch below). For 3 frames this is pretty easy, and depending on the amount of camera motion, sometimes it's not even noticeable. You're not making up new picture, but you're maintaining an image on screen without affecting the audio sync, so it's less distracting than several frames of black. We don't generally de-grain/re-grain either, for the same reasons.

     Most of our library and archive customers require two copies of the deliverables: one is the flat scan, ungraded and unrestored; the other is the final graded and restored version. That way they have a high-res scan for future use if they need it, plus a version restored with today's tools for presentation. And even if they don't ask for this, we make it a point to recommend it, again, just in case.

     That being said, we have had the occasional (rare) client who is working with archival footage in a new production, and they often have different goals. For those clients we have done things like degraining, aggressive dustbusting and smoothing, then regraining. Personally, I don't like the way this looks, but it's what works for their production. I'm sure they would be fine with something AI-generated. But we don't do enough of that kind of work to justify paying for those features.
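     As a rough illustration of that frame-duplication fix, here's a minimal sketch for an image-sequence workflow (the directory, file naming, and gap range are all hypothetical):

     ```python
     # Fill a short gap in a scanned image sequence by duplicating the last
     # good frame: audio sync is preserved and no picture is invented.
     # The directory, naming scheme, and frame numbers are hypothetical.
     import shutil
     from pathlib import Path

     seq_dir = Path("scan/reel1")      # hypothetical sequence directory
     last_good = 1041                  # last intact frame before the gap
     missing = [1042, 1043, 1044]      # the 3-frame gap to fill

     src = seq_dir / f"frame_{last_good:07d}.dpx"
     for n in missing:
         dst = seq_dir / f"frame_{n:07d}.dpx"
         shutil.copyfile(src, dst)     # duplicate, don't synthesize
         print(f"duplicated {src.name} -> {dst.name}")
     ```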
  8. Yeah, my point is that our archive clients don't want that. MTI makes great software and I'm sure it looks good, and this is a useful tool in certain situations. But for a film archive that's doing preservation, using AI to generate entire frames is problematic, at best.
  9. I spent more time compositing the text onto the image than I did cleaning this up in Phoenix.

     Scratch Target: default settings, but with increased column aggressiveness, which detects more.
     DryClean: default settings.

     3 minutes of work to clean this up in Phoenix. It could be a little better, but I didn't feel like spending the time on it. https://www.dropbox.com/scl/fi/jlepb1kp28gttafzqkl2c/Tarkovsky-GRD.mov?rlkey=a0b7uw6layizbql7g3buqtrzv&st=vlbmph7e&dl=0
  10. You're making my point. Those scratches are transient (different on every frame), so any decent restoration tool can clean them up easily, using the same method they've all been using for the past 25 or more years. With the correct parameters, DryClean would probably get those as well, because they're not persistent scratches; they're intermittent. The result is seamless because these tools all do basically the same thing: they look at the surrounding frames and pull in actual picture data to replace what they've detected as scratches. The background in this shot is perfectly static, too, which makes it even easier, because there's no complicated motion in the image to throw off the algorithm. I don't think the same result could be achieved if the camera were mid-pan, mid-zoom, or mid-dolly. Also, it's Tarkovsky, not Tarkofsky.
  11. DryClean is a dustbusting tool. You're expecting it to do something it isn't designed to do. Scratches that weave back and forth are scratches that don't match the background, and Phoenix's scratch tools clean those up nicely. Scratches that don't weave match the background by definition, because they ARE the background. Automated tools can't clean those up using the kind of algorithm DryClean or MTI's DRS tools use (looking at surrounding frames, figuring out what's transient by analyzing the motion in the image, and pulling pixels from surrounding frames to seamlessly infill). What they can do is fake it by making up something that wasn't there. AI can probably do this reasonably effectively if you don't care about preserving the original image, but now you're inventing picture instead of pulling actual data from surrounding frames to conceal a defect. That's a non-starter for us.
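      For the curious, here's a toy sketch of the temporal-infill idea those tools share. It's a bare-bones version that assumes a static shot; real tools like DryClean and DRS add motion analysis on top, and the frames and defect mask here are purely hypothetical:

      ```python
      # Toy temporal infill: replace pixels flagged as transient dirt with
      # real picture data from neighboring frames. Assumes a static shot;
      # production tools add motion compensation before sampling neighbors.
      import numpy as np

      def conceal(prev, cur, nxt, defect_mask):
          """Infill masked pixels of `cur` from the temporal median."""
          out = cur.copy()
          # A median across three frames rejects anything present in only one.
          median = np.median(np.stack([prev, cur, nxt]), axis=0)
          out[defect_mask] = median[defect_mask]
          return out

      # Hypothetical example: a flat gray shot with a white speck on frame 2.
      frames = [np.full((480, 720), 128, dtype=np.uint8) for _ in range(3)]
      frames[1][100:104, 200:204] = 255        # transient defect
      mask = frames[1] != frames[0]            # naive detection, for the demo
      fixed = conceal(frames[0], frames[1], frames[2], mask)
      assert (fixed == 128).all()              # defect replaced with real data
      ```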
  12. I would appreciate it if you would refrain from speculating about our pricing in a public forum. People read stuff like that and think you're posting our actual prices. Trust me, it happens. You're way off base with your guess. When we do restoration, we charge per hour (labor), not per frame. If we were using a tool like this, it wouldn't cost you much of anything as part of a larger restoration project; we'd just fix it when we got to it. But we don't use that tool, and I haven't seen it in action, so I don't know how good a job it does. I'm very hesitant to use tools like this because our clients are mainly archives and libraries, and archives and libraries generally don't want material invented by an AI in their film restorations.

      The scan is the wrong place in the workflow to do this kind of thing. We occasionally use the "find grade" tool in the scanner, but only for a secondary file, never for a master scan; that is, for an access copy generated as a low-res MP4 or something similar, alongside a high-resolution flat scan. And even on a faded reel, it's never a one-grade setting for the whole reel. We scanned a faded 16mm reel last week that was almost entirely pink, but there was one scene (not spliced in; it was part of the same contiguous strip of film) that was basically unfaded. Super weird, but it happens. Setting a single grade for the reel would make that shot look bad. You do this kind of thing post-scan, in grading software, for a reason. Otherwise you're baking more problems into the scan that you have to deal with later, and depending on the grade applied at scan time, you may not be able to fully recover the image afterwards. That's why flat scans are a thing.

      DryClean isn't a scratch removal tool, so I'm not sure why you'd bring it up. It's a dustbusting tool. Dust and scratches are different kinds of defects with different attributes, and they require different algorithms to detect and conceal. Phoenix has scratch concealment tools as well, which work pretty well but are not perfect. In our experience, nobody makes a scratch removal tool that works in all cases, because scratches vary wildly depending on a number of factors: how much lateral movement the scratch has, whether it's on the base or the emulsion, whether it's wide or narrow, sharp or fuzzy, etc. Hiding scratches effectively usually involves a laborious combination of matte tracking, grading, scratch concealment, and some manual dustbusting to fix the artifacts left behind by all that manipulation of the image. I've yet to see an automated tool that works in most, never mind all, situations without leaving some other undesirable artifact behind, and I've been doing digital film restoration for 20+ years.
  13. I can only speak to our experience, which was absolutely abysmal. It managed to crash so consistently, and in such a way, as to obliterate tens of hours of saved work. When it crapped out, it would take the project database files with it, so there was no way to recover work you had done (and saved). Truly awful.

      It was also massively resource-intensive, in that almost any fix had many, many tiny files associated with it. A project that consisted of 6 QuickTime files might balloon into several hundred thousand tiny files, which made backups painfully slow. It also left a lot of that behind in the project folder, so when it crashed, new fix files were created alongside the old ones, resulting in even more junk. Doing an LTO backup of a PFClean project was an all-day affair, even though it wasn't that much data; it was just a truckload of tiny files that wreaked havoc on the filesystem.

      We used it on Windows, which at the time was the preferred platform. And we built our machine to their spec, which they then changed on us, telling us they wouldn't support the system we had built, even though it was the one they recommended for the version we were running. It was really something. I'll never buy anything from them because of the way they treated us, and I will yell from the rooftops any time I hear of anyone else thinking of it!

      I will say the "fix frame" tool (I think it was called) worked some kind of crazy miracles on footage we had with greasy fingerprint marks only in the blue channel. I don't know what it did, but it wiped those out instantly, and the fingerprints were on either side of every splice in a feature. Other than that: junk, as far as I'm concerned.
  14. MTI's software is great for manual work, but they're way behind the curve on the automated side of things. Phoenix DryClean is really good at that part, though the licensing sucks. Manual cleanup in MTI is miles ahead of anyone else, and once you get into a rhythm you can really move fast. But because the app only works with DPX files, it's a bit of a clunker and requires a ton of disk space. Most of the other applications work with ProRes as well as DPX (or EXR, or other formats), and since the vast majority of the work we do comes in as ProRes, with only a few exceptions, that's really important and has kept us from going back to MTI. If this has changed, I'm all ears! I learned on MTI Correct 20+ years ago.

      The fact is, there is no single tool that does everything well. If I had to pick two to use regularly, it'd probably be MTI and Phoenix. With our permanent (older) Phoenix Touch license we can do almost everything we need, but the full version of Phoenix gets you access to the grading tools, which are useful for things like scratch concealment. PFClean is utter garbage, and the company behind it is even worse. They won't admit when they have problems, and they won't stand behind their software. We lost tens of thousands of dollars in completed work because of the crash fest that was PFClean. An absolute nightmare. The only serious contenders here are MTI, Phoenix, and Diamant.
  15. They sell it outright (permanent license), and it's about $20k. The annual fee Todd is talking about is probably the support contract (which presumably includes software updates, when available), which I think is in the $3k range. That's a little different from an annual "subscription." We looked into getting it in 2016 and checked in on the pricing recently, because I'm not very happy with the direction Filmworkz (Phoenix) is heading with their subscription-only pricing. In a subscription model, you lose the software when you stop paying. With a permanent license plus a support contract, you buy it outright and get updates and support for your annual fee. Diamant uses a physical USB dongle, but they also have a license server, so they presumably offer a monthly version as well (as a lot of these applications do, in case you only use them occasionally).
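      A quick sketch of the license-cost arithmetic, using the $20k purchase and $3k/yr support figures above; the subscription rate is purely hypothetical, since I don't know what a subscription would actually cost:

      ```python
      # Cumulative cost: permanent license + support vs. subscription.
      # $20k purchase and $3k/yr support are the figures from the post;
      # the $6k/yr subscription rate is a hypothetical comparison point.
      permanent, support = 20_000, 3_000
      subscription = 6_000  # hypothetical

      for years in (3, 5, 10):
          owned = permanent + support * years
          rented = subscription * years
          print(f"{years:>2} yrs: permanent ${owned:,} vs subscription ${rented:,}")
      # And when you stop paying, the permanent license keeps working;
      # the subscription doesn't.
      ```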