
What is the resolution of 35mm film negative and 35mm film prints shown in cinemas?



Chemophotographic resolving power is not measured in digital terms. What gets quoted is the number of line pairs per millimeter (or per inch) that are just on the verge of being discernible. The image format, and with it the film size, is secondary: the calculation is line pairs per unit of length applied across the image area. Negatives typically resolve 150 or 200 line pairs per millimeter. Print stocks have higher resolving power, so the negative is the weakest link in the chain, which is logical because taking stocks are roughly ten times more sensitive than print stocks. Today's taking lenses often resolve twice as much as the negative, and projection lenses are theoretically similar. In practice, a lot gets lost.
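As a back-of-the-envelope illustration of that calculus (the two-pixels-per-line-pair Nyquist factor is standard, but the frame widths and arithmetic here are my own sketch, not from any test):

```python
# Back-of-the-envelope: line pairs per mm -> equivalent horizontal pixels.
# Nyquist needs at least 2 samples (pixels) per line pair; frame widths
# are approximate camera-aperture figures (my assumption).

def lp_mm_to_pixels(lp_per_mm: float, frame_width_mm: float) -> int:
    """Theoretical horizontal pixel count needed to sample lp_per_mm."""
    return round(2 * lp_per_mm * frame_width_mm)

for fmt, width_mm in [("Super 16", 12.52), ("Super 35", 24.89)]:
    for lp in (150, 200):
        print(f"{fmt} at {lp} lp/mm ~ {lp_mm_to_pixels(lp, width_mm)} px wide")
# Super 35 at 150 lp/mm ~ 7467 px: a theoretical ceiling that the full
# chain (lens, focus, motion, printing, projection) never reaches.
```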

The video should be projected to the same size as a print, if you want to find out.


There's a notable difference scanning even 16mm at 4K vs 2K, and the rule of thumb back in the day was that S16mm was roughly 1080p. 35mm 3-perf at 4K will resolve significantly more than 1080p, though I know some folks think 500T at box speed doesn't really improve at 4K. Unfortunately there isn't much in the way of public tests on this matter, and even if there were on YouTube, the compression would probably screw up the results. 35mm negative in 3-perf is over three times the size of S16mm negative.

It's also complicated by the fact that if you overexpose film, you'll expose smaller crystals in the negative and thus generate more detail that can be extracted in a scan. This is why, to do a true comparison, you'd want to be dealing with the theoretical maximum line-pair resolution of a given film vs a given sensor. You also have the fact that not all pixels are the same, and thus not all 4K is the same in terms of the detail we get out of it.

I haven't dealt with prints, but I know the typical process for making them from digital files is a 2K transfer back to film. I don't know if that's because 4K isn't really worth it on a film print cost-wise, or because the print stock can't actually resolve 4K when projected.




From all of the tests I saw when the digital transition started happening, S35mm appeared to resolve approximately 3k in horizontal resolution. More than 2k, less than 4k.

If I'm remembering correctly, most of those comparison tests were shot on 200 ASA stocks, so you'd probably resolve a little more on low-grain 50 ASA stock and a little less on 500 ASA.

But at 200 ASA, the grain structure stopped revealing any additional detail at about the 3k mark.

I think this is part of the reason the Alexa was so effective at replacing celluloid, because you weren't really giving up anything in capture resolution.

From my own tests comparing scanned 35mm full-frame film to early DSLR cameras, I found that at 200 ASA, a 12-megapixel digital sensor effectively matched a scanned frame of 135 for resolvable detail. That was the point at which I bought my first digital camera (the Nikon D300).

If you run the math, 12 megapixels (approximately 4,300 pixels wide) at a full-frame aperture scales down to approximately 2,900 pixels wide at a S35mm aperture. So my tests (on stills stock) would seem to align pretty well with the tests on motion picture stock.
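That scaling argument in miniature (the widths are nominal aperture figures; treat this as a sketch, not a measurement):

```python
# The scaling argument above, spelled out (nominal widths, my assumption):
full_frame_mm = 36.0   # 135 still-frame width
s35_mm = 24.89         # Super 35 camera-aperture width (approx.)
ff_pixels = 4300       # horizontal pixels of a ~12 MP sensor

s35_pixels = ff_pixels * (s35_mm / full_frame_mm)
print(round(s35_pixels))  # -> 2973, i.e. roughly "3K" across a S35 frame
```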

Now obviously there are some added elements to all this. You can blow up a S35mm celluloid image much further than a digital image of equivalent resolution, because the grain structure doesn't break down into nasty individual pixels the way a digital image does. So to compare them in terms of resolution you really do need to talk about "resolvable detail", and on that front about 3K pixels wide for a S35mm frame does seem to be roughly the point of equivalence between the two.

 


I can't remember where I heard this (probably some dumb Google autofill thing), but I read somewhere that it resolves up to 5.5K? That's the basis for the resolution I choose when I commission illustrations, to future-proof things.


2 hours ago, Max Field said:

I can't remember where I heard this (probably some dumb Google autofill thing) but I read somewhere it resolves up to 5.5K? When I commission illustrations that is the basis for the resolution I choose to future proof things.

I've never seen anything like those numbers.

I have heard of some scanning-related considerations - something like "you have to scan at 6k to resolve the full 3k". But I don't think I've ever seen S35mm out-resolve 4k.


Most of those results from "back then" are now completely obsolete:

Vision2 has been replaced by Vision3.

Most scans were done with single-chip CCD sensors where the manufacturers cheated (e.g. on a 12 MP sensor, only the green channel had the full resolution; the red and blue channels only had half as many photosites). That's where those "double the scan resolution" hints come from.

Details of the film have been lost in the debayering step.
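(For reference, here is how a generic Bayer mosaic divides its photosites; this is the textbook layout, not any particular scanner's sensor:)

```python
# Generic Bayer mosaic: each 2x2 cell holds 2 green, 1 red and 1 blue
# photosite, so the colour channels are sampled much more sparsely than
# the advertised pixel count suggests; the gaps are interpolated away
# in the debayering step.

def bayer_samples(total_photosites: int) -> dict:
    return {
        "green": total_photosites // 2,
        "red":   total_photosites // 4,
        "blue":  total_photosites // 4,
    }

print(bayer_samples(12_000_000))
# {'green': 6000000, 'red': 3000000, 'blue': 3000000}
```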

Many of the "visible grain structures" turned out to be sensor noise, compression artifacts, or side effects of features like digital edge enhancement or resharpening.

The achievable resolution isn't only limited by the film, but also by the lenses, effects like motion blur, and the skill of the camera operators.

The achievable resolution is also limited by the scanner used (sensor, lenses, …).

There's a point after which increased resolution doesn't add detail, but instead only creates a more lifelike digital representation of the grain structure. So if you want a more film-like experience, you will have to go way beyond 4K; but this is more for archival purposes or very large screens. For most homes, even UHD is already overkill (size of the TV + distance to the TV + resolution of the human eye).

Etc.
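To put a rough number on the point above about UHD being overkill in most homes (the ~1 arcminute figure for normal visual acuity is a common rule of thumb; the screen size and distance are my assumed examples):

```python
import math

# Rough check of the "UHD is overkill at home" point, assuming the
# common ~1 arcminute figure for normal visual acuity (rule of thumb)
# and an example 65-inch 16:9 UHD panel.
diagonal_in = 65.0
h_pixels = 3840

width_in = diagonal_in * 16 / math.hypot(16, 9)
pixel_pitch_in = width_in / h_pixels

one_arcmin = math.radians(1 / 60)
distance_in = pixel_pitch_in / math.tan(one_arcmin)
print(f"~{distance_in * 0.0254:.2f} m")  # ~1.29 m
# Beyond ~1.3 m a typical eye can no longer separate adjacent pixels
# on this set, so extra resolution past UHD buys nothing visible.
```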

Besides: Blu-ray uses a lossy codec. Because of this, and because HD video has to be upscaled on a UHD TV, "4K" transfers of films usually look much better than "HD" transfers.

 


Well, I've got some LED lighting tests coming up that are for my own purposes and not under NDA, so I may post them publicly. The plan is to compare S16 4K scans with Ultra 16 lenses vs 35mm with Ultra Prime lenses, under traditional tungsten vs Creamsource vs Aputure (and possibly SkyPanels). It will be on 200T and 500T shot at box speed and in 1/3-stop increments of overexposure up to one stop over. The plan is to use FotoKem's Scanity in 4K for all scans, and if I can swing it I want to send the same film to be scanned on a Lasergraphics ScanStation in HDR mode to compare. The 35mm will be 4-perf just because that's what I've got.

If anyone has a preferred resolution chart let me know


I feel like the average projection of a print, back when most theaters showed prints, was around HD resolution, but a good print could exceed that (by a bit). 1080p video generally feels sharper (not necessarily in a good way) than film prints, imo, perhaps because of digital sharpening. I think large parts of Transformers etc. had VFX done at sub-2K resolution. The idea that HD is sharp enough to be comparable to S35 projection is not a lie; it's just not that simple.

The most confusing issue, I think, is that the MTF curves of digital and film behave differently. And sharpness is determined more by the area under the MTF curve than by where it reaches extinction at 0% or <20%. We're looking at entire systems and the combined MTF curves of the whole chain (including, potentially, digital sharpening), so there are a million answers and none are right. In my experience scans are usually sharper than prints for stills (digital prints sharper than Cibachrome), and I assume film prints are similarly softer than the best scans of negatives.
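For what it's worth, the textbook way to state that is that the stages multiply, so every soft link drags the whole curve down:

```latex
% System MTF is the product of the component MTFs at each spatial
% frequency f; perceived sharpness tracks the area under this curve
% more than the frequency where it finally reaches extinction.
\[
  \mathrm{MTF}_{\text{system}}(f) =
  \mathrm{MTF}_{\text{lens}}(f)\,
  \mathrm{MTF}_{\text{negative}}(f)\,
  \mathrm{MTF}_{\text{print}}(f)\,
  \mathrm{MTF}_{\text{projection}}(f)
\]
```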

In my experience, a very good S35 4K scan, if you zoom way in, has more resolution than 2.8K ArriRAW or 3.2K Arri ProRes, but not by much. Yet a good 2K DCP projection is probably significantly sharper than your average theatrical print from the 90s (hard to say for sure) because of all the other variables involved. An 8K S35 scan should yield significantly more texture in the grain than 4K, but imo no more meaningful resolution in the image.



A private entity did a study in the early 2000s and found that 4th-generation material (OCN>IP>IN>Print) resolved between 600 and 800 lines.

http://tye1138.com/stuff/35mm_resolution_english.pdf

It makes sense, honestly; I have never seen a 4th-gen standard print that looks good. Answer prints, made directly off the negative, are generally 1400 lines, so a marked improvement. Many show prints look outstanding.

These tests were done using early Vision stock, though; today, with our finer-grain stocks, I have a feeling the resolution would be a bit higher, especially on the negative, which only resolved 2100 lines, and that seems a bit low to me. I'd expect it to be closer to 2800 lines with modern 50D Vision3 stock, which would of course be the highest-resolution stock available.
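Those numbers hang together if each photochemical printing generation keeps roughly 70% of the lines; that retention figure is my illustrative assumption, not something from the linked paper:

```python
# Illustrative generation-loss arithmetic. The ~70% line retention per
# photochemical printing step is my assumption, chosen to show how the
# cited figures could hang together; it is not from the linked paper.
negative_lines = 2100
retention = 0.70

answer_print = negative_lines * retention       # OCN -> print (1 step)
release_print = negative_lines * retention**3   # OCN -> IP -> IN -> print
print(f"answer print:  ~{answer_print:.0f} lines")   # ~1470 (cited ~1400)
print(f"release print: ~{release_print:.0f} lines")  # ~720  (cited 600-800)
```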

Still a far cry from anything digital projection can do. I've been present for the setup of Dolby Cinema projectors now, thanks to my job, and holy crap: my vision isn't good enough to discern the actual lines of resolution, it can display that tightly. Mind you, it's still a DLP grid, which creates aliasing, but if you discount that issue, it's WAY sharper than 35mm projection of any kind. Even 5-perf 65mm isn't that great, because there is so much loss simply in projecting. I'd say that 5-perf is close to modern-day 4K on screen when prints are struck from the negative, as Nolan's movies have been. They're pretty damn good and worth the upgrade over 35mm for the resolution and brightness alone.


On 1/8/2024 at 2:48 PM, Tyler Purcell said:

It makes sense, honestly; I have never seen a 4th-gen standard print that looks good. Answer prints, made directly off the negative, are generally 1400 lines, so a marked improvement. Many show prints look outstanding.

According to some ARRI research from 2009, new photochemical facilities are so advanced that "generation loss" is a thing of the past. I remember reading a lengthy, 60-page document that compared a digital intermediate and a traditional photochemical print (from tests shot on Vision2 200T) and found that the print contained much more detail and sharpness. I haven't done any tests myself, so I'm just quoting what that article said.



Old first-gen MP film scanners used tri-linear CCDs that were 3K/4K and later 5.7K. This was true RGB scanning: three lines of pixels, one each for R, G, and B, with color filters over each line, typically made by Kodak's CCD division. This was usually in conjunction with a pin-registered Oxberry transport, and it took 20-30 seconds (or much more) to scan each frame. This is the "Jurassic Park" era, when only some shots would be scanned for VFX, to be recorded out and inserted into a negative cut.

So 35mm and even 16mm could be scanned on Cineon scanners in the 1990s at 4K, and that worked pretty well.

Later scanners like the Northlight-2 got the 8K (or 9K?) Kodak tri-linear sensor and could do 2x oversampling or actual 8K scans. The later-generation Spirit scanners (2003-2009) had three 4K Kodak CCDs with dichroic color filters for real-time 2K and 7 fps 4K true RGB scanning. Those scans still look really great today. The Arriscan has a 3K ALEV sensor and does a piezo shift to make 6K true RGB with true HDR, downsampled to 4K.

None of the above was in any way available to the regular public to fuss over; these machines lived in big-iron post houses and got fussed over by the engineering staff.

CFA scanners like the ScanStation only really got going fairly recently. The first-gen ScanStation used a 2K or 3K Kodak CCD in quad-tap mode, and it had a lot of issues with showing the tap balance, so there would be obvious quadrants; they went for speed over quality, as a lot of places wanted the speed. Things only really got somewhat modern when CMOSIS introduced a 5K, 30 fps-capable CMOS sensor used in the ScanStation and GPUs got fast enough to process it, about five years ago.

Now there are some new sensors, like the Sony 6.5K used in the ScanStation (color) and Director (mono sequential RGB) for up to 13K scans, and DFT has put the new Gpixel 9.4K x 7K mono sensor in their Polar scanner and the Sony 14K mono sensor in their OXScan for 65/70mm.

I don't really think there are any definitive public tests on "what res" film is, as it is an organic medium being pushed into a grid pattern. In general I think Nyquist oversampling at 2x the intended res should be the goal, and true RGB still beats CFA in detail, color fidelity, and separation. I think both 16mm and 35mm really benefit from a 4K scan, negatives more so than prints.
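A sketch of that 2x rule of thumb (the finish resolutions are just common DCI examples, my choice):

```python
# The 2x Nyquist-oversampling rule of thumb from above: scan at roughly
# twice the resolution you intend to finish at, so the sampling grid
# doesn't alias against the grain.

def scan_width_px(finish_px: int, oversample: float = 2.0) -> int:
    return int(finish_px * oversample)

for label, finish in [("16mm to a 2K finish", 2048), ("35mm to a 4K finish", 4096)]:
    print(f"{label}: scan at ~{scan_width_px(finish)} px")
# 16mm to a 2K finish: scan at ~4096 px
# 35mm to a 4K finish: scan at ~8192 px
```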

As the sensors get better and higher-res, capturing the whole film image (which is made of clouds of grains) at reasonably higher res shows an improvement in the final results, and with that resolution come the other horsemen of the emulsion: the full density range and color separation, especially in overexposed negatives.

I think everyone who does this work as a job sees the benefits of scanning 16mm and even 8mm at 4K, and it is trivial to do now on fast color scanners and even true RGB scanners; I can see 35mm having some benefits from 8K scans too as they become more easily available. Ultimately, is this a pixel-peeping tech obsession or realistic best practice for the intended delivery format? Most work is destined for compressed online viewing or UHD and HD video, and rarely (for most) a return to 35mm prints or a finely finished 4K DCI distro.


20 hours ago, Joerg Polzfusz said:

Super8 at 4K (as ruined by YouTube compression):

Super8 at 6.5K (but scaled down to 4K for YouTube):

Now compare the image size of Super8 with 35mm…

That's good enough for TV, if I may be so bold. Too bad that film is getting too expensive, and too bad that photographers don't have cinema scanners available to them. Photo scanners are really not that great, save for maybe drum scanners or Flextights, but those aren't fast.


The resolution varies, but most theatrical prints are 1.5K-2K in resolution, while a good negative might greatly exceed 4K. A good show print, though, might have 4K resolution as well; it all depends on the quality of the print itself. I've seen 35mm prints that look almost as good as 70mm when projected.

Also, it's not all about resolution: the dynamic range of print exceeds that of Bayer digital even now.

On 1/8/2024 at 6:16 PM, Joerg Polzfusz said:

Most of those results from "back then" are now completely obsolete: […]

Joerg, you are my new best friend! Absolutely correct!!

Quote

Most scans were done with single-chip CCD sensors where the manufacturers cheated (e.g. on a 12 MP sensor, only the green channel had the full resolution; the red and blue channels only had half as many photosites). That's where those "double the scan resolution" hints come from.

Details of the film have been lost in the debayering step.

Yep, older CCD sensors had limited dynamic range; they couldn't scan print well at all (the scanners didn't even have a setting to scan print: remember you used to have to get a special low-contrast print made for telecine transfer, which cost considerably more than a normal projection print, and they were normally 16mm but could be 35mm), and the Bayer versions were even more limited.

On 1/8/2024 at 6:16 PM, Joerg Polzfusz said:

Many of the "visible grain structures" turned out to be sensor noise, compression artifacts, or side effects of features like digital edge enhancement or resharpening.

YES!!! Most older scanners have aggressive artificial sharpening that cannot be disabled. Even brand new scanners today come with artificial sharpening, though thankfully it can be disabled. To put it the way a mate of mine recently did: artificial sharpening in the initial scanning stage doesn't add anything that can't be done better later. Most "film grain" is digital noise, or enhanced by it.

On 1/8/2024 at 6:16 PM, Joerg Polzfusz said:

The achievable resolution isn't only limited by the film, but also by the lenses, effects like motion blur, and the skill of the camera operators.

Resolution is one thing, dynamic range is another. While the newer Sony imagers are amazing even with a Bayer filter, they can't capture the same dynamic range as true RGB, and they are softer from crosstalk. That's not necessarily a "bad" thing, as many older movies get scanned and come out far, far sharper than they were ever intended to look in the cinema. The filmmaking and cinematography process took into account the intended look of the film once printed to projection prints, and scanning the original negative can bring out details that were previously obscured: fine wires become visible, makeup effects and matte paintings are more obvious, there is more detail in the shadows, and so on. Any decent restoration will use a reference print for grading anyway, but the original negative itself was not color-timed and is usually much sharper than the final print.

The other thing is that the older scanning systems are still in use today. Most scanners reached end of life in terms of development many years ago, like the Northlights that were discussed in another thread. As for the others (Arriscan, DCS, Lasergraphics, Filmfabriek, and DFT), they have many different models, and in some cases every single scanner may be unique.

On 2/26/2024 at 5:24 PM, Robert Houllahan said:

Old first-gen MP film scanners used tri-linear CCDs that were 3K/4K and later 5.7K. This was true RGB scanning: three lines of pixels, one each for R, G, and B, with color filters over each line, typically made by Kodak's CCD division. This was usually in conjunction with a pin-registered Oxberry transport, and it took 20-30 seconds (or much more) to scan each frame. This is the "Jurassic Park" era, when only some shots would be scanned for VFX, to be recorded out and inserted into a negative cut.

Yeah digital scanners, but they had telecines that were a lot faster than that!

On 2/26/2024 at 5:24 PM, Robert Houllahan said:

Later scanners like the Northlight-2 got the 8K (or 9K?) Kodak tri-linear sensor and could do 2x oversampling or actual 8K scans. The later-generation Spirit scanners (2003-2009) had three 4K Kodak CCDs with dichroic color filters for real-time 2K and 7 fps 4K true RGB scanning. Those scans still look really great today. The Arriscan has a 3K ALEV sensor and does a piezo shift to make 6K true RGB with true HDR, downsampled to 4K.

None of the above was in any way available to the regular public to fuss over; these machines lived in big-iron post houses and got fussed over by the engineering staff.

Exactly. The price has come down like anything. A decade ago the going rate for a good 4K scan was in cents per frame, not cents per foot like it is now. The scanner manufacturers had to compensate for limitations in both lighting and imager tech; newer LED lighting has solved the problems of using xenon bulbs and splitting into R/G/B for sequential scanning, and many feel the 2019 Sony-chip cameras deliver true-CCD quality without the limitations of CCD.

Not only that, but because the older tech took so much more engineering and was so complicated, most of the older scanners are mechanically unreliable and it's impossible to self-service them. I know that's not the case for every scanner, but it's a big difference compared to a modern Lasergraphics, which is a literal workhorse and never complains or breaks down.

As you say, this has opened the door to larger markets. With your example of Jurassic Park you are right: digital scanning was once pretty much exclusively for Hollywood special effects. Then in the '00s it served post-production and film restoration as well, in the '10s it expanded into archive markets, and now it's affordably accessible to the general public, who can scan their home movies on the same ScanStation a film restoration was done on, for not much more money than scanning on a Tobin or a RetroScan. I think MovieStuff is on its last legs (Clive retired many years ago), so that really just leaves Filmfabriek for that market (yes, I know there are others like Ventura Images, but honestly, for the same price as one of those you can buy a Pictor Pro).

On 2/26/2024 at 5:24 PM, Robert Houllahan said:

CFA scanners like the ScanStation only really got going fairly recently. The first-gen ScanStation used a 2K or 3K Kodak CCD in quad-tap mode, and it had a lot of issues with showing the tap balance, so there would be obvious quadrants; they went for speed over quality, as a lot of places wanted the speed. Things only really got somewhat modern when CMOSIS introduced a 5K, 30 fps-capable CMOS sensor used in the ScanStation and GPUs got fast enough to process it, about five years ago.

It was 2015, actually, when they put the JAI camera in, so nine years ago. As you say, it would have solved the CCD area-imager tap-balance problems, but the camera itself doesn't have as good dynamic range.

Blackmagic are still using the same camera they launched with in 2015 or 2016 (I think prototypes went out in 2015 and the retail launch was 2016?). Regardless of the details, it remains amazing value for what you get, but development is glacial because they don't have the R&D budget, due to selling it so cheap and not charging for a support contract. They announced the 8mm gate last year and it still hasn't hit the market! Obviously the Cintel will never be for finishing scans of 8mm, but basic support would be welcomed by users so they can make quick proxy inspection scans without tying up time on their proper 8mm scanner, or just to catalogue what they have, etc.


On 2/11/2024 at 4:48 PM, Deniz Zagra said:

According to some ARRI research from 2009, new photochemical facilities are so advanced that "generation loss" is a thing of the past. I remember reading a lengthy, 60-page document that compared a digital intermediate and a traditional photochemical print (from tests shot on Vision2 200T) and found that the print contained much more detail and sharpness. I haven't done any tests myself, so I'm just quoting what that article said.

Yes, a DI can be much sharper, especially if you record out using an ARRI 4K laser recorder. You'd record directly to internegative and strike prints directly from that. I'd say they could be upwards of 3K, which is a marked improvement.



I think there is some misinformation about early scanning in this thread.

None of the telecine machines, even the very old ones, had "sharpening" (coring and aperture correction) that could not be turned off. Even the old Bosch Quadra and other early CCD telecines had these controls, and the colorists tended to ride them to taste. The original SDC2000 Spirit (one 1920-pixel mono line and three 960-pixel RGB lines), the first real "DataCine", used on "O Brother, Where Art Thou?" with HIPPI data and a Pogle, could make quasi-2K scans that looked fairly good when set up correctly. The same goes for the Cintel/ITK Millennium and DSX flying-spot 2K/4K DataCine machines.

The Spirit SDC2000/2001, the Millennium and DSX, and especially the later Spirit 2K/4K machines never had problems scanning prints or other positives like Kodachrome or Ektachrome; the need for a low-con print was really a standard-def issue, not one for any of the HD telecines. People would make a low-con print to give more room in grading in the early HD days, but it had become a legacy practice by 2005.

The Spirit 4K was the fastest scanner for 4K and 2K until after 2010, and the fastest 4K/2K true RGB scanner until the Scanity replaced it; it ran 2K in real time and 4K at 7.5 fps.

There were no real 2K+ resolution Bayer-mask scanners until well into the 2010s. The first ones used Kodak 2K and 3.3K CCDs in the ScanStation, Kinetta, Xena, etc., some single-tap and some with various levels of tap-balance issues. None of these machines were major-market scanners.

The Jurassic Park scans, done on a Kodak Cineon 4K tri-linear CCD scanner, hold up well today, 30 years later; it just took 4 minutes per frame to scan them to some beast of an SGI machine. Quantel had the Domino system with similar performance; both were based on the Oxberry shuttle.

Great data scanning has only just become available to everyone. The Cineon technical specification that Kodak built, which became the basis for all data scanning, was and is a very high technical spec that has worked for thirty years. It is separate from, and concurrent with, scanning film to video for broadcast, which is low-DR Rec. 709 and what telecine was built for.

 


18 hours ago, Dan Baxter said:

YES!!! Most older scanners have aggressive artificial sharpening that cannot be disabled. Even brand new scanners today come with artificial sharpening, though thankfully it can be disabled. To put it the way a mate of mine recently did: artificial sharpening in the initial scanning stage doesn't add anything that can't be done better later. Most "film grain" is digital noise, or enhanced by it.

 

Sharpening is for dilettantes*. Change my mind.

*Astrophotography has different needs, and is almost the opposite of standard photography.


17 hours ago, Robert Houllahan said:

There were no real 2K+ resolution Bayer-mask scanners until well into the 2010s. The first ones used Kodak 2K and 3.3K CCDs in the ScanStation, Kinetta, Xena, etc., some single-tap and some with various levels of tap-balance issues. None of these machines were major-market scanners.

The first Muller HD model was released in 2011, and MWA had a range of small-format Bayer scanners as well. I wouldn't say that they and the ScanStation, Kinetta, DSC Xena, and the Blackmagic Cintel weren't "major market"; I'd say they were market disrupters.

8 hours ago, Karim D. Ghantous said:

Sharpening is for dilettantes*. Change my mind.

*Astrophotography has different needs, and is almost the opposite of standard photography.

Well, 5K on RGB is far sharper than 5K on Bayer, so depending on what you want, you may want to sharpen a Bayer scan in post to make it look more like an RGB scan.
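For example, a plain unsharp mask applied in post (a generic sketch using scipy, not any scanner's built-in sharpening; the sigma/amount values are arbitrary starting points):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, sigma: float = 1.5, amount: float = 0.5) -> np.ndarray:
    """Add back a fraction of the high-pass detail (single channel, 0-1 range)."""
    img = img.astype(np.float32)
    blurred = gaussian_filter(img, sigma=sigma)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Usage (hypothetical loader): scan = load_gray_scan_as_float01("reel.dpx")
# tightened = unsharp_mask(scan, sigma=1.2, amount=0.4)
```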

