
Perry Paolantonio

Basic Member
  • Content Count

    597
  • Joined

  • Last visited

  • Days Won

    21

Perry Paolantonio last won the day on November 28 2018

Perry Paolantonio had the most liked content!

Community Reputation

82 Excellent

About Perry Paolantonio

  • Rank

Profile Information

  • Occupation
    Other
  • Location
    Boston, MA
  • My Gear
    Eclair ACL II, Pro8mm modded Max8 Beaulieu 4008
  • Specialties
    5k, 4k, UHD, 2k Film Scanning, Film Restoration, Blu-ray and DVD Authoring

Contact Methods

  • Website URL
    http://www.gammaraydigital.com

  1. I'm not so sure about this. The fact is, UHD screens are cheap now, and as 1080p screens die (which they do, because they're cheap), they're going to be replaced with 4k simply because there aren't as many 1080p screens on offer anymore. Take a look at Best Buy: the number of cheap 4k screens is pretty high, while 1080p screens aren't much cheaper and the selection is limited. To me, 4k in the home makes some sense. In our living room, you wouldn't be able to tell the difference between it and 1080p, simply because we're far enough from the screen that it won't matter. But if we had a bigger screen, you'd definitely notice. As for transmission - well, it's just a matter of time. I used to have this argument with someone I worked with who insisted that streaming would never happen, so we had nothing to worry about with our DVD and Blu-ray business... Compression keeps getting better, and for 20+ years now each new generation of codecs has delivered either the same quality at half the bit rate, or better quality at a similar bit rate. With ATSC's digital bandwidth at around 20Mbps for MPEG2, a better compression format would run rings around it without requiring more data. The cable boxes and transmission equipment just have to catch up, which they will, in time.
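The "same quality at half the bit rate per codec generation" rule of thumb above can be made concrete with some quick arithmetic. These are illustrative ballpark numbers based on that rule, not measured encoder results:

```python
# Rough rule of thumb from the post: each codec generation delivers similar
# quality at roughly half the bit rate. Starting from ATSC's ~20 Mbps MPEG-2
# budget, estimate the equivalent bit rate after each generational step.
atsc_mpeg2_mbps = 20.0

generations = ["MPEG-2", "H.264/AVC", "H.265/HEVC", "H.266/VVC"]
rate = atsc_mpeg2_mbps
estimates = {}
for codec in generations:
    estimates[codec] = rate
    rate /= 2  # halve the budget for the next generation

for codec, mbps in estimates.items():
    print(f"{codec}: ~{mbps:g} Mbps for comparable quality")
```

By this (admittedly crude) yardstick, an HEVC stream at a quarter of the MPEG-2 bit rate lands in the same perceptual ballpark, which is why existing broadcast bandwidth isn't the long-term obstacle.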
  2. The unfortunate thing is that they're targeting 8k towards the consumer. It really makes no sense in a consumer context -- to see a difference between 4k and 8k screens, your screen would need to be massive and you'd need to be sitting far too close to it. That said, 4k+ acquisition (cameras and scanners) makes a lot of sense to me because it opens up framing flexibility as well as the option to oversample and get better results. We see this all the time with Super 2k scans - same concepts apply at higher resolutions, just on a larger scale: https://www.gammaraydigital.com/blog/case-super2k
  3. This is correct, though the frame area is really about 4k for Super 8, not 3k.
  4. I think there's a fundamental misunderstanding here, and also some marketing that's probably confusing matters. The scanner has a 10k sensor. That means if you use every pixel on the sensor in the output image, your file is 10k. However, the scanner is taking a picture of more than just the film frame. It's capturing the perfs, frame lines, and everything out to the film edge (in the case of 8mm). With 35mm, it's going into the perfs, but I don't think it goes all the way to the film edge. This is an optics thing, and it's by design. With 35mm, there are 8 perfs to work with, four on each side. So the scanner doesn't need to get more than about half of each perf, and it has plenty to work with in order to align the frame to where it needs to be (digital/optical pin registration). But in order to capture the perfs, the film frame itself has to be less than 10k. In the case of the Director 10k, the scanned frame size is more like 9k, within that 10k overscan file. With 8mm, the camera and lens have to physically move closer to the film. In order to get an acceptable image at both extremes (8mm and 35mm), the camera has to move a fair bit. If you notice, the Director's camera module is about 18" long; the ScanStation's is about 36" long. They use different lenses, which is part of the reason why. But the Director is primarily geared towards 16mm and 35mm. That it does 8mm at all is actually new, added with the 10k version. There's another issue with Super 8: in order to deal with the sloppy perf positions from the factory, the scanner has to cover the whole *film* from edge to edge, not just some of the perfs, as with 35mm. This is because the scanner uses a combination of the perf and the film edge to stabilize the image. That means you need to pull the camera farther back, so that you get the film edge plus some white to the right of it, where there is no film, in order to align that edge to a known position.
This is the digital equivalent of a spring-loaded edge guide in a camera. The more you pull the camera back, the smaller the frame gets on the 10k sensor. So while you can certainly output a 10k file with Super 8 film, the area of the film frame within that 10k scan is definitely not 10k. Nobody at the show could give me an exact dimension, but my guess would be that it's in the 5k+ range for 8mm/S8mm; 16mm is something like 8k on this machine. It's worth noting that any scanner that uses optical registration (Kinetta, Lasergraphics, MWA, and others) will have the same limitations. Also, from what I gathered at the show, there is only one Director 10K with an 8mm gate out there, and it's at an archive in the Czech Republic.
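The overscan math above is easy to sanity-check: the usable frame resolution is just the sensor width scaled by the fraction of the field of view the frame occupies. The film and field-of-view dimensions below are approximate illustrations, not Lasergraphics specifications:

```python
# Back-of-the-envelope: usable horizontal resolution of the film frame
# inside an overscanned capture = sensor_width * (frame_width / field_of_view).
SENSOR_H_PIXELS = 10_000  # the "10k" sensor

def frame_resolution(frame_mm, field_of_view_mm, sensor_px=SENSOR_H_PIXELS):
    """Horizontal pixels covering the film frame within the overscan."""
    return round(sensor_px * frame_mm / field_of_view_mm)

# Super 8: ~5.7 mm frame, but the scanner must see the full ~8 mm film
# width plus margin past the edge to register against it (illustrative 9 mm FOV).
super8_px = frame_resolution(5.69, 9.0)

# 35mm: ~24.9 mm frame; the overscan only needs part of each perf row
# (illustrative 27.5 mm FOV).
thirtyfive_px = frame_resolution(24.9, 27.5)

print(super8_px, thirtyfive_px)
```

With these assumed numbers, 35mm comes out around 9k of true frame resolution inside the 10k file, and Super 8 somewhere in the 5k-6k range, consistent with the estimates in the post.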
  5. I will check with Lasergraphics at NAB, but my understanding is that really small-gauge film (8/S8, for example) uses a crop of the sensor, and that limits its max resolution. The Director doesn't have the same optical path as the ScanStation, which can move the camera and lens over a range of a couple feet. The Director is significantly more compact, and the form factor hasn't changed. My understanding is that there is no change to the Director or ScanStation at this year's NAB, other than the addition of some software features. I'll know more on Monday.
  6. Not sure where you're seeing any pricing on their site - it's not there; you need to talk to their sales rep to get that information. Tyler's pricing is basically correct - it varies depending on the options you get. We're about $180-$200k into our ScanStation 5 years on, once you factor in all the upgrades we've done over the years (adding gauges, audio formats, upgrading the camera, support packages, etc). I'm not aware of any used ScanStations for sale right now, and if there were, you'd be looking at a minimum of about 10x your budget. Maybe in 20+ years you might find a used one on eBay for $5k, like you can with old Ranks and other telecines now. Maybe.
  7. There's no "technology" per se. On the Northlight, it's done by altering the color of the light going into the scanner with a physical filter. This eliminates the orange cast. The scanner is also calibrated to the density of the film, typically by looking at an area of unexposed film (frame line, or the area between perfs). On the ScanStation, the color of the light can be adjusted because it uses discrete R-G-B LEDs to create a white light source. So when you select Negative, you see a slightly bluer light than when you work with positive film. You could also remove the orange cast digitally afterward, in Photoshop. There are plenty of how-tos out there that cover this. If you have large batches, you might look into doing it with something like ImageMagick, which lets you script these sorts of color adjustments. But for desktop software, I'd agree with Rob - check to see if SilverFast works with that scanner. It's pretty widely used for connecting to old scanners, and may do what you need at scan time.
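For the digital route, the usual approach is to sample the orange base (an unexposed area such as a frame line) and invert each channel relative to that base. Here's a minimal sketch of the arithmetic on normalized 0.0-1.0 RGB values; the sample numbers are made up for illustration:

```python
# Remove the orange mask by scaling each channel by the sampled base
# density, then inverting. Clamp to the legal 0.0-1.0 range.
def invert_negative(pixel, base):
    """Invert a color-negative RGB pixel against the sampled orange base."""
    return tuple(
        min(1.0, max(0.0, 1.0 - (p / b)))  # divide out the mask, then invert
        for p, b in zip(pixel, base)
    )

orange_base = (0.85, 0.55, 0.30)  # a plausible-looking mask sample (made up)

# A pixel identical to the base comes out black, as unexposed negative should:
print(invert_negative(orange_base, orange_base))  # (0.0, 0.0, 0.0)
```

Tools like ImageMagick do essentially this (plus per-channel curves), just faster and in a scriptable batch pipeline.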
  8. I think it'll be well worth your time to learn Resolve. It's actually not as steep a learning curve as you think - it's divided into different windows, and for your purposes you'd only use the Media window to load in files, the Edit window to set up a timeline, the Color window to do color work, and the Deliver window to export. You can completely ignore Fairlight and Fusion. While it may seem overwhelming, it's worth it, and there are a ton of videos out there on basic Resolve use. I'd highly recommend spending some time with it. Because... Working with 16 bit files is insane. The camera isn't 16bit, and those files are going to be impossible to deal with in a meaningful way. If you can manage to get a CinemaDNG RAW sequence out of the camera, it'll load up in Resolve, which will do the debayering, and a 4k 12bit CinemaDNG sequence requires about the same bandwidth as a 10 bit 2k DPX. That's roughly a quarter of the data to store and move around. Believe me, 16bit isn't worth it. (And if you're making 8bit TIFFs, then you're losing a ton of color information from the get-go.) I think trying to do any color correction in the scan doesn't make sense either, because you don't have proper monitoring or scopes. You're best off using a simple histogram to ensure you're not clipping or crushing anything in the scan, then dealing with the color in the color/NLE system of your choice.
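The bandwidth claim is straightforward to verify with per-frame arithmetic. A CinemaDNG frame stores one Bayer sample per photosite, while DPX and TIFF store three RGB samples per pixel (resolutions below are illustrative, and container overhead is ignored):

```python
# Per-frame sizes behind the claim: 4k 12-bit CinemaDNG (raw Bayer) is in the
# same ballpark as 2k 10-bit RGB DPX, while 4k 16-bit RGB TIFF is ~4x larger.
def frame_megabytes(width, height, bits_per_sample, samples_per_pixel):
    bits = width * height * bits_per_sample * samples_per_pixel
    return bits / 8 / 1024 / 1024

cdng_4k = frame_megabytes(4096, 2160, 12, 1)  # raw Bayer: 1 sample per pixel
dpx_2k  = frame_megabytes(2048, 1556, 10, 3)  # RGB: 3 samples per pixel
tiff_4k = frame_megabytes(4096, 2160, 16, 3)  # 16-bit RGB

print(f"4k CinemaDNG: {cdng_4k:.1f} MB/frame")
print(f"2k DPX:       {dpx_2k:.1f} MB/frame")
print(f"4k 16b TIFF:  {tiff_4k:.1f} MB/frame ({tiff_4k / cdng_4k:.1f}x the CinemaDNG)")
```

The CinemaDNG and 2k DPX frames land within about a megabyte of each other, while the 16-bit RGB TIFF is exactly 4x the CinemaDNG (48 bits per pixel vs 12), which is where the "quarter of the data" figure comes from.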
  9. This assumes you're talking about 16:9. If you're scanning film at 2k at the actual aspect ratio, the difference in pixel count can be more than 100%: 1440x1080 (4:3 inside a 16:9 frame) = 1.56MP vs 2048x1556 (4:3 at 2k) = 3.19MP
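The comparison is quick to reproduce:

```python
# Pixel counts behind the comparison: 4:3 pillarboxed inside a 16:9 HD
# frame vs a full-width 2k 4:3 film scan.
hd_4x3  = 1440 * 1080  # 4:3 image area inside a 1920x1080 frame
scan_2k = 2048 * 1556  # 2k full-aperture scan

print(hd_4x3 / 1e6)      # ~1.56 MP
print(scan_2k / 1e6)     # ~3.19 MP
print(scan_2k / hd_4x3)  # just over 2x the pixels
```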
  10. I was responding to the assertion you made above. Grading and image quality evaluation are different things. You don't need a 10 bit display to do color grading. If you're QCing the image quality, you would see an advantage in a 10 bit display for sure. But the OP's question and your response to it are about color evaluation, for which you don't strictly need a 10 bit display.
  11. The quality of the output is more a function of the bit depth of the source footage, I believe, than of the processing inside the screen. If you have a monitor that's capable of displaying the whole color space, the only real downside of an 8-bit screen is banding, because it just doesn't have enough colors available for simultaneous display. That's where Flanders excels - by dithering nicely so you don't really see that. Cheaper monitors meant for computer display won't bother with the hassle of adding dithering, because they don't need to for 99.9% of their intended market. Color space defines the parameters - the range of available colors. Bit depth defines the number of colors available for display within that color space at any given time. Will a 10-bit pipeline with a 10-bit display look better than an 8-bit pipeline with an 8-bit display? Probably, in some cases, with certain types of footage. But as long as both screens have a gamut big enough to display the standard color spaces we're concerned about, the question becomes: "do you mind seeing a bit of banding once in a while, knowing that it's likely just you hitting a limit of your display?" If so, get a Flanders or a higher-bit-depth monitor. But banding doesn't mean an 8-bit screen can't be accurate within a given color space, as long as that space is within its gamut. The 10-bit argument has become a bit of a religious one in some circles. It really depends on the end user's requirements whether it makes financial sense to step up to a significantly more expensive 10-bit screen for what usually amounts to a nominal improvement.
  12. With an 8 bit monitor, the screen can't show as many colors simultaneously, but this has nothing to do with the color space. All the colors you can show are still within the specified color space for that screen, there are just fewer of them on screen at the same time. The potential downside of this would be banding, where the screen has to average color values that are too similar to render smoothly as a gradient, resulting in areas where you might get the same colors over several pixels, causing a banding effect. Here's a great (simple) explanation of why 8 bit is fine for grading: https://www.liftgammagain.com/forum/index.php?threads/clarifing-notions-10bit-gamut-32bpc-settings.12140/#post-121144 Flanders Scientific 8-bit screens are perfectly good for many grading scenarios. They also do some clever dithering to make banding artifacts pretty much disappear. I got a demo at NAB a few years ago from them with the same frame on both an 8 and a 10 bit screen of the same size, side by side. The differences were imperceptible. You could really only tell there was a variation if you brought up the onscreen color picker and went to the same screen coordinates to see the numeric difference in displayed color values. Their AM210, for example, can easily display Rec709, SMPTE-C and DCI P3. At $2000, it's not a bad price either, considering you get lifetime factory recalibrations, and it's highly capable in terms of inputs and tweaking.
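The banding mechanism described in the last two posts is just quantization: fewer bits means fewer distinct steps available to render a smooth gradient. A quick sketch, quantizing the same ramp at both depths:

```python
# Why 8-bit can band where 10-bit doesn't: quantize a smooth gradient at
# both depths and count the distinct levels that survive.
def quantize(values, bits):
    levels = (1 << bits) - 1  # e.g. 255 steps for 8-bit, 1023 for 10-bit
    return [round(v * levels) / levels for v in values]

# A finely sampled 0..1 ramp, like a sky gradient.
ramp = [i / 4095 for i in range(4096)]

levels_8  = len(set(quantize(ramp, 8)))   # collapses to at most 256 steps
levels_10 = len(set(quantize(ramp, 10)))  # keeps up to 1024 steps

print(levels_8, levels_10)
```

With 4x fewer steps per channel, adjacent 8-bit steps are farther apart, which is exactly the visible banding a dithering display like the Flanders hides by distributing the error spatially.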
  13. Resolve is far superior to Premiere. Also, if you're familiar with FCP 7, it's about as close as you'll get to that these days. It's a solid editing system and the color correction tools are hard to beat. Fusion and Fairlight are dicey at this point - they're relatively recent integrations into the application, but if you stick primarily to the edit and color windows, it's a good choice. Get yourself a really good GPU (GTX1070 at minimum) and you'll have a good system for color correction. For editing you don't need the GPU power so much.
  14. Very cool. If you want, PM me, and I can share some code with you for controlling an Arduino from custom software. When I was using an Arduino to control an Imagica 3000V film scanner that I gutted, I built a custom app using Xojo (RealBasic). The camera had a companion control board and an API for taking the images, and my software would send commands to the Arduino as simple human-readable phrases over a USB serial connection, such as "Advance1Frame". The Arduino basically just listens for incoming commands, does its thing, and returns a response. It should work with any software that can communicate over serial connections. In my setup, there were stepper motors and a ton of sensors, but the basic idea would be the same for yours, with some modifications. I'm in the midst of another film scanner build right now and got a 100W RGB LED COB to experiment with. Can't wait to see how bright that thing is. It's a monster.
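The command/response loop described above can be sketched in a few lines. This is a host-side model of the dispatch logic, not the actual code from the post; the command names and response strings ("Advance1Frame", "OK frame=N") are hypothetical stand-ins, with only "Advance1Frame" taken from the post:

```python
# Sketch of a human-readable serial protocol: the host sends a phrase,
# the controller looks it up in a handler table, acts, and replies.
def make_controller():
    state = {"frame": 0}

    def advance_one_frame():
        state["frame"] += 1          # in hardware: pulse the stepper, wait for sensor
        return f"OK frame={state['frame']}"

    handlers = {
        "Advance1Frame": advance_one_frame,
        "GetFrame": lambda: f"OK frame={state['frame']}",  # hypothetical command
    }

    def handle(command):
        """Dispatch one newline-delimited command, as the Arduino loop would."""
        handler = handlers.get(command.strip())
        return handler() if handler else f"ERR unknown command: {command.strip()}"

    return handle

handle = make_controller()
print(handle("Advance1Frame"))  # OK frame=1
print(handle("Rewind"))         # ERR unknown command: Rewind
```

On the Arduino side the same structure works with `Serial.readStringUntil('\n')` feeding the dispatch, and the plain-text responses make the protocol trivial to debug from any serial monitor.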
  15. I haven't looked too closely at these. High CRI (high 90s if you can find them) is best, because you know you'll get more consistent results. You can also get RGB models that let you control the channels separately, but in one neat package - you can then mix your own white from that. As for color temp, it probably doesn't matter a ton as long as you're calibrating your camera to it and there's enough light to get through the densest film. With white light and enough diffusion, you can just point the light at the film and stick the diffusion between, then look for hot spots with the camera (adjust the exposure time way down and stop the lens down as well; this will reveal hot spots). If you see these, then maybe use an integrating sphere of some kind - it could be the mirror that was used in the projector, or something similar. A box with mirrored surfaces on the inside works: the light bounces around in there and you get better diffusion. Also look into holographic diffusers (which may be easier) - you can get a small piece online and use it in line with your regular photo diffusion material.