
Film Resolution equivalent


Adam Ray


Again, the confusion partly comes from talking about optimal scanning resolution versus measurable detail in the image. No one seriously thinks that 8mm film has 4K resolution in the sense that it out-resolves an ARRI Alexa (which would also mean that 35mm is something like 12K!).


Again, the confusion partly comes from talking about optimal scanning resolution versus measurable detail in the image. No one seriously thinks that 8mm film has 4K resolution in the sense that it out-resolves an ARRI Alexa (which would also mean that 35mm is something like 12K!).

 

You're right. The problem here is the meaning of "resolution" -- in film and optics, it means something different from what it does in the digital realm, where "resolution" isn't about how much detail can be visually resolved, but simply about pixel dimensions. I can make a 4k digital file in which fine detail can't be resolved, for example by applying too much compression or defocusing the lens. It's still a 4k digital file, though.
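
To make that concrete, here's a rough sketch, not anything from an actual workflow, just an illustration using Pillow with a placeholder filename, of two ways to end up with a "4k" file that doesn't contain 4k worth of detail:

```python
# Rough illustration (not from the blog post below): make "4k" files that don't
# actually resolve 4k worth of detail. Assumes Pillow is installed and that
# "frame.png" is any reasonably sharp source image; both are placeholders.
from PIL import Image, ImageFilter

src = Image.open("frame.png")

# Simulate a defocused lens: blur heavily, then save at 4k pixel dimensions.
soft = src.filter(ImageFilter.GaussianBlur(radius=8))
soft.resize((4096, 3072), Image.Resampling.LANCZOS).save("soft_4k.png")

# Or keep the frame but crush the detail with aggressive JPEG compression.
src.resize((4096, 3072), Image.Resampling.LANCZOS).save("crushed_4k.jpg", quality=5)

# Both outputs are "4k" by pixel count; neither contains 4k worth of resolvable detail.
```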

 

Here's a very simple example of exactly that: http://www.gammaraydigital.com/blog/case-super2k

 

This is from our blog, and is based on some tests we did when we upgraded one of our scanners from 2k to 5k. The "Regular 2k" scan is made on the same sensor as the 5k scan; the difference is that in 5k mode, the scanner is oversampling and then outputting a 2k file, while in 2k mode it's a 1:1 mapping of pixels. The difference in resolved detail is substantial. Yet they're both 2k scans from the same film.

 

And it also shows that scanning 16mm at 4k does in fact make sense, because the scanner is able to resolve (in the optical sense) more of what's on the film, and therefore more detail ends up in the final scan. The same idea applies to 8mm film.

 

In fact, I'd argue that 8mm film has better (optical) resolution in the real world than Super 8, even though Super 8 has a bigger image area by a not-insubstantial amount. Why? Because most 8mm cameras were modified 16mm cameras, with better transports, pressure plates, and lenses. Super 8 was about convenience, and a lot of quality went out the window with it: no pressure plate, cheaper lenses on cheaper cameras made with more plastic (partly a function of the era in which Super 8 was introduced), etc. The result is that we see older regular 8mm film that looks better than a lot of more modern Super 8, because the cameras were better and produced sharper images.

 

-perry


Perry, when the scanner is downsampling from 5k to 2k, does it use a resizing algorithm that introduces additional sharpening?

 

The 100% view of the Super 2k scan looks a little artificially sharp, especially the grain in the white tile area in the top right corner.

 

I might just have an inaccurate idea of how the grain should really look, though.


There is a sharpening/blurring adjustment in the scan software, which we always leave at 0 (off). That being said, I had the same question when we did this test, so I tried some examples with the adjustment set to full soften (it doesn't do much), and still saw more detail than in the regular 2k scan, even one with sharpening cranked all the way up. Bottom line: there is no artificial sharpening on the resulting image.

 

There are approximately 4 times more photosites being used on the sensor when the scanner is in 5k mode (~19.6 MP) than in 2k mode (~4.9 MP), so the initial pre-scaled image has substantially more data, which means finer resolution of the grain on the film to begin with, and a better downsample to 2k.
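
If anyone wants to sanity-check that, here's the quick arithmetic. The sensor dimensions below are just assumptions picked to land near those megapixel figures, not numbers from the scanner's spec sheet:

```python
# Back-of-the-envelope photosite count. The sensor dimensions are assumptions
# chosen to land near the megapixel figures above, not numbers from a spec sheet.
mode_5k = (5120, 3840)   # assumed active photosites in 5k mode
mode_2k = (2560, 1920)   # assumed active photosites in 2k (1:1) mode

mp_5k = mode_5k[0] * mode_5k[1] / 1e6   # about 19.7 MP
mp_2k = mode_2k[0] * mode_2k[1] / 1e6   # about 4.9 MP

print(f"5k mode: {mp_5k:.1f} MP, 2k mode: {mp_2k:.1f} MP")
print(f"ratio: {mp_5k / mp_2k:.1f}x more photosites sampling the same frame")
```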

 

Also probably worth noting: the film used in this test was Vision2, not Vision3, so the grain is a bit different from current film stocks.



 

 

Someone had to figure out an "optimal" resolution, which really turned out to be more of a minimum than anything.

 

The notion that 8mm = 480 is completely ridiculous. Have you ever seen a 2k or 4k scan of 8mm? There's so much more on the film than you might think.

 

I'm no expert on this at all, and I guess this is verging on philosophy, but isn't there a difference between what would be a scan with discrete pixels and the random quality of the particles in film? I'm assuming, again without knowing, that the particles in the film are of infinitely varying sizes, whereas an illuminated pixel either is or isn't. I think some people have called it "implied" resolution? Because I seem to feel a different physiological response to film images (of course they're coming through digital scans, not talking about theatre projection) that I can't explain otherwise.

 

I don't mean to derail, and of course this is really partly a psycho-perception issue, separate from the technical question of what resolution is needed to scan film and capture the highly random information I'm arguing it contains.

Edited by Alain Lumina

 

I'm no expert on this at all, and I guess this is verging on philosophy,

It's really simple, actually, and people try to make it out to be more difficult than it is. I can scan a faded Polaroid at 1200 dpi and it doesn't make it any better; just because I can now zoom in further on a portion of the picture doesn't mean it has "hidden resolution," "fine detail," or anything else.

 

Based on human 20/20 vision, seated at a minimum distance from the source, at what point does overscanning become overkill and a waste of time and money?
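
For what it's worth, the usual back-of-the-envelope way to put a number on that is the 1-arcminute rule of thumb for 20/20 vision. A rough sketch (the screen size and viewing distances are just example values, not recommendations):

```python
import math

# Back-of-the-envelope acuity check: how many pixels across can a 20/20 viewer
# actually resolve? Assumes the textbook ~1 arcminute of angular resolution for
# 20/20 vision; screen width and viewing distances are example values only.
ARCMIN = math.radians(1 / 60)   # one arcminute, in radians

def resolvable_pixels(screen_width_m: float, viewing_distance_m: float) -> int:
    """Horizontal pixel count beyond which finer detail can't be seen at this distance."""
    element = 2 * viewing_distance_m * math.tan(ARCMIN / 2)   # smallest resolvable element
    return int(screen_width_m / element)

# A 55" 16:9 screen is roughly 1.22 m wide.
print(resolvable_pixels(1.22, 2.5))   # ~1700 pixels from 2.5 m: HD territory
print(resolvable_pixels(1.22, 1.2))   # ~3500 pixels from 1.2 m: closer to 4k
```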

 

The first list here is all that is needed.


It's really simple, actually, and people try to make it out to be more difficult than it is. I can scan a faded Polaroid at 1200 dpi and it doesn't make it any better; just because I can now zoom in further on a portion of the picture doesn't mean it has "hidden resolution," "fine detail," or anything else.

 

If you scan a Super 8 film at 480 and view it on an HD screen, you're blowing that image up to several times its original (scan) size. You are introducing artifacts when you do this. The scaling algorithm has to make up image data where there was none before. If you scan the film to HD, the same film will look substantially better than the 480 version. If you scan at 2k, 4k, 8k, whatever, and you scale down, it will look substantially better than the 480 version. Try this sometime.
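
If you don't have film handy to rescan, you can mock up the comparison on any sharp still. This is only a rough simulation (Pillow, a placeholder filename, and 4:3 frame sizes to roughly match small-gauge film), but it shows what the scaler is being asked to do:

```python
# Mock-up of the point above, not a real scan comparison. Assumes Pillow;
# "film_frame.png" is a placeholder for any reasonably sharp, high-res still.
from PIL import Image

frame = Image.open("film_frame.png")

sd = frame.resize((640, 480), Image.Resampling.LANCZOS)      # stand-in for a 480 scan
hd = frame.resize((1440, 1080), Image.Resampling.LANCZOS)    # stand-in for an HD (4:3) scan

# Viewing the 480 scan on an HD screen: the scaler has to invent the missing pixels.
sd.resize((1440, 1080), Image.Resampling.LANCZOS).save("480_scan_blown_up_to_hd.png")
hd.save("hd_scan.png")

# Compare the two outputs side by side; the blown-up 480 version is visibly softer.
```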

 

 

It's a simple formula: if you view the 480 scan on an NTSC screen, it's going to look good. If you view the HD scan on an HD screen, it's going to look good. If you view the 480 scan on an HD screen, it's going to look soft. And if you view it on a 4k screen, it's going to be incredibly soft.

 

 

But again, it all comes back to the erroneous comparison between digital resolution (x by y pixel counts) and optical resolution. Trying to assign a pixel count to a film gauge basically makes no sense when you factor in all the viewing variables (screen resolutions, projection sizes, etc.) and all of the variables that went into making the film (lens quality, camera quality, operator skill, exposure settings, etc.).

Edited by Perry Paolantonio

You are saying scan at 480, then blow the 480 up to HD. Of course anything blown up from 480 will not be great. If I were viewing on 1080, then I would scan at 1080. The issue with money is at the top end, overscanning 35mm film. When you're budgeting for backing up digital media every 36-48 months for life, things get expensive.


You are saying scan at 480, then blow the 480 up to HD. Of course anything blown up from 480 will not be great. If I were viewing on 1080, then I would scan at 1080. The issue with money is at the top end, overscanning 35mm film. When you're budgeting for backing up digital media every 36-48 months for life, things get expensive.

 

Have you priced it recently? Scanning doesn't cost what it did 10 or even 5 years ago.


You are saying scan at 480, then blow the 480 up to HD. Of course anything blown up from 480 will not be great. If I were viewing on 1080, then I would scan at 1080. The issue with money is at the top end, overscanning 35mm film. When you're budgeting for backing up digital media every 36-48 months for life, things get expensive.

 

What you're saying is basically: "scan at the resolution you need now."

 

What I'm saying is "scan at the resolution you think you might need in the future." Yes, there is a small increase in cost when you scan at a higher resolution than you need right now, but the benefits far outweigh the costs:

 

1) If you scan at 4k (for example), and you need HD now, you have all you need for a master up through 4k.

2) If you scan at 4k and you need HD, you get a better image than if you scan natively at HD, because you're oversampling and then scaling down.

3) The cost to scan at HD now and then rescan at 4k later is substantially more than simply doing it at 4k now.

4) The cost to store an HD scan vs a 4k scan is negligible. Hard drives cost less than HDCAM SR tapes, for example, so it's actually cheaper than ever to have multiple copies. You don't need to store them in a paid storage facility if you're smart about it: make 3-4 copies on drives and store them in reasonable conditions in different physical locations. Occasionally copy these files to new drives. Or use LTO tape, which is also reasonably inexpensive.
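
To put some very rough numbers on point 4 (the frame sizes, DPX packing, and one-hour example below are illustrative assumptions, not anyone's actual project):

```python
# Rough storage math for point 4. The frame dimensions, 10-bit DPX packing, and
# the one-hour example are illustrative assumptions, not figures from this thread;
# real file sizes vary with bit depth, packing, audio, and overscan.
BYTES_PER_PIXEL = 4   # 10-bit RGB packed into a 32-bit word, as in typical DPX scans

def dpx_frame_mb(width: int, height: int) -> float:
    """Approximate uncompressed frame size in megabytes."""
    return width * height * BYTES_PER_PIXEL / 1e6

hd_mb = dpx_frame_mb(1920, 1080)   # roughly 8.3 MB per frame
k4_mb = dpx_frame_mb(4096, 3072)   # roughly 50 MB per frame (assumed 4:3-ish 4k scan)

frames_per_hour = 24 * 60 * 60     # 86,400 frames at 24 fps

print(f"HD: {hd_mb:.1f} MB/frame, about {hd_mb * frames_per_hour / 1e6:.1f} TB per hour")
print(f"4k: {k4_mb:.1f} MB/frame, about {k4_mb * frames_per_hour / 1e6:.1f} TB per hour")
# Roughly a 6x difference: noticeable, but at hard drive prices it is still a
# small fraction of what a full rescan would cost later.
```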

 

-perry

