Screen pixel size


Paul Bruening

Recommended Posts

  • Premium Member

I was doing a little math on 2K scans. It comes out to 1/4"-sized pixels on a 40 ft. wide screen. Seems kind of big, even when you consider the viewing distance in an average theater. 4K puts the pixel at 1/8". Still kinda big, but more passable at 24 fps. I wonder if some kind of shifting-pixel scheme could divide those 1/4" pixels up across the screen, spread over the 24 frames per second. Could that create the illusion of a higher presentation resolution? I guess it would all have to be managed by software from scan to show, or it could get quirky.
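A back-of-the-envelope check of those figures, assuming a 40 ft wide screen and the usual DCI horizontal pixel counts (2048 for 2K, 4096 for 4K):

```python
# Pixel pitch on a 40 ft wide screen for 2K and 4K scans.
screen_width_in = 40 * 12  # 480 inches

for name, h_pixels in (("2K", 2048), ("4K", 4096)):
    pitch = screen_width_in / h_pixels
    print(f"{name}: {pitch:.3f} in per pixel")
# 2K: 0.234 in per pixel (just under 1/4")
# 4K: 0.117 in per pixel (just under 1/8")
```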


  • Premium Member

People have considered using a predefined stack of virtual grain fields; that is, defining a grain pattern for each of several successive frames and sampling one RGB primary for each virtual CMY grain.

 

The thing is, to do this you need one of two things: a sensor that can alter the shape and position of its photosites, which we don't have, or simply a higher-resolution sensor whose output you can mathematically interpolate into your virtual grain pattern, plus a higher-resolution output device, such as a DLP chip, to recreate the grain pattern.

 

That we could do, but at that point you're just using it as a way of compressing images, and one which would probably not give results as good as the other mathematical transforms already in use, unless you weight it heavily in favour of a filmlike grain pattern.

 

But at the end of the day, the acuity of the normal human eye is only about 1/60 of a degree, at which point, if you're 40 feet from the screen, you can't see anything much smaller than:

 

2 × distance × tan((1/60)° / 2) = 2 × 40 ft × tan((1/60)° / 2) = 0.0116 ft ≈ 0.13"

 

...which is somewhere between 1/4 and 1/8".
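The formula above is easy to check numerically (at such small angles, 2·d·tan(θ/2) is essentially d·θ). The exact result comes out at about 0.14":

```python
import math

# Smallest resolvable feature for 1/60-degree (one arc-minute) acuity
# at a 40 ft viewing distance: 2 * distance * tan(theta / 2).
distance_ft = 40.0
theta = math.radians(1.0 / 60.0)  # one arc-minute, in radians

size_ft = 2 * distance_ft * math.tan(theta / 2)
size_in = size_ft * 12
print(f"{size_ft:.5f} ft = {size_in:.2f} in")  # 0.01164 ft = 0.14 in
```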

 

P


  • Premium Member
The thing is, that to do this, you need one of two things: a sensor which can alter the shape and position of its photosites, which we don't have,

 

 

I wonder....

 

Did you know that the Arri scanner achieves its 6K resolution by doing multiple scans and *physically* shifting the sensor to capture the extra resolution? Up to 4 scans per frame if you use the extra dynamic range mode. Quite a feat, and different from traditional software-based pixel shifting. Depending on the mode and hardware (drives etc.), it also scans at something close to 12 fps... not too far off 24 fps?
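The principle is easy to see in a toy 1-D sketch: two passes with the sensor shifted half a photosite between them capture twice the samples of either pass alone. This is purely illustrative, not the actual ARRISCAN mechanics:

```python
# Toy 1-D model of capture-side pixel shifting: the "scene" has finer
# detail than one sensor pass can resolve.
scene = [(i * 7) % 13 for i in range(16)]  # 16 positions of fine detail

def scan_pass(scene, offset, step=2):
    """One pass: the sensor's photosites sample every `step`-th position."""
    return scene[offset::step]

pass_a = scan_pass(scene, 0)  # unshifted pass: 8 photosites
pass_b = scan_pass(scene, 1)  # sensor shifted by half a photosite pitch

# Interleaving the two passes doubles the effective sample count:
combined = [v for pair in zip(pass_a, pass_b) for v in pair]
print(len(pass_a), len(combined))       # 8 16
print(combined == scene)                # True
```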

 

jb


  • Premium Member

I've walked right up to the screen and had the projectionist rack slowly through focus. The optics don't resolve those 1/4" pixels, which is a good thing for two reasons. First, you need a reconstruction filter; it's part of sampling theory. Second, if they were razor sharp, they'd moiré against the sound perforations in the screen.
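The moiré risk follows directly from sampling theory: detail finer than the sampling grid doesn't disappear, it aliases down to a low-frequency beat pattern unless it's filtered out before (or blurred after) sampling. A sketch with illustrative numbers:

```python
import math

# A pattern at 22 cycles per unit, sampled on a grid of 24 samples per
# unit (Nyquist limit: 12 cycles), is above the limit and must alias.
sample_rate = 24.0
detail_freq = 22.0  # above Nyquist

samples = [math.sin(2 * math.pi * detail_freq * n / sample_rate)
           for n in range(48)]

# The samples are indistinguishable from a slow 2-cycle beat
# (24 - 22 = 2), with inverted phase:
beat = [-math.sin(2 * math.pi * 2.0 * n / sample_rate) for n in range(48)]
max_diff = max(abs(s - b) for s, b in zip(samples, beat))
print(f"max difference: {max_diff:.2e}")  # effectively zero
```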
 
-- J.S.


I've walked right up to the screen and had the projectionist rack slowly through focus. The optics don't resolve those 1/4" pixels, which is a good thing for two reasons. First, you need a reconstruction filter; it's part of sampling theory. Second, if they were razor sharp, they'd moiré against the sound perforations in the screen.

-- J.S.

 

I guess this depends on the theater then. I've been in rooms where the pixels are patently obvious from the back row.


  • Premium Member

If a managed pixel-shift technology worked as an illusion, current file sizes could remain the same while perceived resolution improved. A little difference in the digital projector design, no difference in the optical projector (manage the film-out instead), the same pixel shift but at a higher sensor resolution in the scanner, and some embedded code throughout the data stream... hmmmm.
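The display side of that idea is similar in spirit to DLP "wobulation": one detailed frame is delivered as several lower-resolution subframes shown with a sub-pixel offset, and the eye integrates them over time. A purely illustrative 1-D sketch:

```python
# One detailed frame, split into two half-resolution subframes that a
# projector would show in quick succession, offset half a pixel.
detailed_frame = [10, 50, 20, 80, 30, 60, 40, 70]  # 8 "pixels" of detail

subframe_0 = detailed_frame[0::2]  # shown at the nominal grid position
subframe_1 = detailed_frame[1::2]  # shown shifted half a display pixel

# Each subframe carries only half the data of the detailed frame...
print(len(subframe_0), len(subframe_1))  # 4 4

# ...yet temporal integration of the two recovers all of it:
perceived = [v for pair in zip(subframe_0, subframe_1) for v in pair]
print(perceived == detailed_frame)  # True
```

So the per-subframe payload stays at the lower resolution, which is the file-size argument made above; the catch is that it only works for detail that holds still across the subframes.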


But at the end of the day, the acuity of the normal human eye is only about 1/60 of a degree, at which point if you're 40 feet from the screen you can't see anything much smaller than:

 

2 × distance × tan((1/60)° / 2) = 2 × 40 ft × tan((1/60)° / 2) = 0.0116 ft ≈ 0.13"

 

...which is somewhere between 1/4 and 1/8".

 

P

 

Sorry to nitpick Phil, but as the self-appointed Imperial unit guru on this forum, 0.13" is, to two significant figures, 1/8" or exactly 7/40". It's not your fault that you got it wrong though; it's the damned EU's attempt to brainwash the British into forgetting their Imperial heritage ;-)

 

In seriousness, I am actually surprised that it's as small as 1/4" (6350 µm). I'd have thought, without actually crunching the numbers, that it'd be over an inch, hell, over a couple.

 

What's the resolution of the human eye again, 100-something megapixels equivalent? Considering the distance you sit from the screen, dust in the air, and the optical imperfections in the glasses or contact lenses most people wear, I'm surprised I'm able to pick up that digital projection is "not right" compared to normal 35mm, with dots that small.

 

Just curious Paul, as I haven't crunched the numbers here either, what's the size of your average 35mm grain after 3 generations of copying? For simplicity's sake let's say it's 35mm silent academy, so 18x24mm or about 3/4" x 1". . .

