
How Something Is Filmed and How Something Is Shown


Recommended Posts

If I decide to film something in 4K, and some years from now TV channels around the world decide to broadcast their content in 8K as the new standard, how will my 4K film appear on an 8K screen?

 

What happens in the reverse case, if I show a higher-resolution video on a lower-resolution screen?


  • Premium Member

There is a maximum perceivable resolution, and screen size and viewing distance matter more than pixel count. My old 32" Sony XBR television with a laserdisc source looked great. However, take that same source and project it up to 60" from the same viewing distance, and it looks like a big mess of noise. Take that same 60" HD monitor and put a standard 1080i broadcast signal on it, and the difference is night and day. So that's a good example of how smaller monitors kind of cover up resolution issues.

 

So then you have to ask yourself, what will 4k bring us? Perhaps larger monitors, but will people want those in their houses? With the advent of OLED and stick-on displays in the very near future, we may see larger viewing systems coming around BEFORE 4k is widely used at home. But people's displays will be limited to the size of the wall in that case, and in the vast majority of cases it will NOT take up a full wall, and the viewing distance will be greater than it COULD be, because people are used to watching things within a smaller device window.

 

What does all this gibberish mean? Really, 4k vs 8k means nothing in the long run, because very few setups will ever be able to display that perceivable difference due to the physical constraints of screen size versus seating position.

 

Now to the whole 4k vs 8k debate. For the record, broadcast television is 1080i and that standard will stay for quite a while. The reason is quite straightforward: cost. Streaming a 1080p signal at 29.97 or even 23.98 takes up WAY more bandwidth than 1080i 59.94, which is what most terrestrial broadcast is in the States. There are still a lot of 720p and 480i broadcasters here. Plus, everyone recently invested in HD equipment; it's only been 6 years since it became mandatory. So it's going to be at least a decade before any major change is made, and I predict most big broadcasters won't be around by then anyway. Then you've got the HUGE problem of backwards compatibility, of which there is none. Unlike 1080i broadcasts, which can be received by 720p displays, UHD broadcasts use a different encoding system than what we're using today, so even our modern TVs can't read the signal. Sure, broadcasters can add another channel, but that's super costly. Satellite providers will have to cut back the bandwidth of other channels to make room for UHD broadcasts, which is already a problem today.

 

Web streaming will be the only real way to watch UHD content, and honestly, it's a huge problem as well. Even with today's standards, streaming UHD material on the internet is a lesson in futility. Netflix caches/buffers most of the content in advance before playback because they know it won't stream in real time. Part of the reason is server infrastructure; it's just not strong enough. The cost to set up 4k streaming is huge, and you can only compress the signal so much before it looks like crap. Far better to run a 1080p signal and compress it less; it will look better in the long run.

 

Finally, you've got Sony's failed UHD BluRay and streaming service. None of the studios are signing on because they don't have UHD masters. It's a complete failure because nobody cares, and honestly, I feel the same is true of UHD across the board. See, the government forced everyone to buy new TVs, even when people were perfectly happy with their old ones. Most people are sick and tired of constantly upgrading, and honestly 1080i is so much better than what they're used to that upgrading isn't even on the radar. So without consumers jumping on board in droves like they were forced to for the HD rollout, UHD is going to be a long road.

 

In the end, if you shoot something in 4k, it will most likely never be shown in 4k. You will make a DCP and be asked to make a 1080p version. It's that lower-resolution 1080p version which will be seen everywhere forever. The 4k version will be lost over the decades, like so many digital-only films have been already. By the time UHD is standard, finding your 4k masters will be difficult, and since you can't go back to a camera negative to re-scan at 4k, you'll be stuck with whatever quality you have. Trust me, this happens so much it's not even funny. Forget about 8k at home; that's a pipe dream you'll be telling your grandkids about when you talk about technology failures.

 

If you shoot and produce in 2k, you will save a lot of money and never see any difference with broadcast, BluRay or web distribution.


As has been said by other people, if you want a production to have a future, make something that's worth watching. In the UK a comedy programme called "Dad's Army", made in the late 1960s and early 1970s, still gets shown in prime time on the BBC. This was shot using standard-definition colour studio TV cameras (most likely EMI 2001s) with 16mm film inserts. So, I wouldn't get too hung up on 4k v 8k.


As has been said by other people, if you want a production to have a future, make something that's worth watching. In the UK a comedy programme called "Dad's Army", made in the late 1960s and early 1970s, still gets shown in prime time on the BBC. This was shot using standard-definition colour studio TV cameras (most likely EMI 2001s) with 16mm film inserts. So, I wouldn't get too hung up on 4k v 8k.

 

I knew I made a mistake when I mentioned 4K and 8K specifically. I was too lazy to correct it. The point wasn't the specific values, but what will happen when you show something shot in a lower resolution on a higher-resolution screen. I'm not getting hung up on it. :) I just wanted to know what it will look like.

 

Then there’s that other thing Tyler Purcell mentioned, which is the matter of storage. We’ll leave that for some other time. :)


As has been said by other people, if you want a production to have a future, make something that's worth watching. In the UK a comedy programme called "Dad's Army", made in the late 1960s and early 1970s, still gets shown in prime time on the BBC. This was shot using standard-definition colour studio TV cameras (most likely EMI 2001s) with 16mm film inserts. So, I wouldn't get too hung up on 4k v 8k.

It's a shame the 16mm inserts are forever limited by the telecine of the time, because obviously the originals are gone. If they could rescan them now, they would probably look as good as the video. They could even get rid of the dirt and tidy up the tape splices.


It's a shame the 16mm inserts are forever limited by the telecine of the time, because obviously the originals are gone. If they could rescan them now, they would probably look as good as the video. They could even get rid of the dirt and tidy up the tape splices.

And it's not only the 16mm that's gone; the videotapes are too. That must be why the Beeb keeps showing the same 20 or so episodes again and again.


In those days video tapes were expensive and bulky, and probably reused a lot.

Assuming 4K becomes standard everywhere it would seem there is a better chance now of TV films being shown as they were intended, well into the future ...?


Up-res the footage to 8k, crop a 4k window out of it, and you will see how it will look on an 8k monitor (given the same ppi and viewing distance).

 

As far as I am concerned, you have to down-res footage shot with Bayer-sensor cameras to get good results, so obviously I'm not a big fan of upscaling. Digital imaging is at its very beginning; it will get much better in the future as cameras with more and more resolution are made. I think stuff shot at 8k and down-resed to 2k is kind of a benchmark for a good digital 2k picture, so of course in the future, when people are shooting crazy-high-resolution footage and displaying it on 4k or 8k monitors, stuff shot on today's 2k or 4k CMOS sensors won't be able to compare in quality.

 

Film is a future-proof alternative. It can be scanned at any resolution and look good at any size (IMO).


 

I knew I made a mistake when I mentioned 4K and 8K specifically. I was too lazy to correct it. The point wasn't the specific values, but what will happen when you show something shot in a lower resolution on a higher-resolution screen. I'm not getting hung up on it. :) I just wanted to know what it will look like.

 

Then there’s that other thing Tyler Purcell mentioned, which is the matter of storage. We’ll leave that for some other time. :)

 

The question is... do you walk out of a theater that has a 2K projector... if not... then there is no real problem... In terms of resolution, I think Film film is around 6K, under the 'best' of conditions. So 8K would be beyond that, and 4K is 'close enough' for most purposes... and as we know... 2K is quite popular throughout the world as we speak...


  • Premium Member

Film is a future-proof alternative. It can be scanned at any resolution and look good at any size (IMO).

Yep, that's 100% correct. But I've found most people who ask about resolution are digital people, they wouldn't even contemplate shooting on 35mm to begin with.


Yep, that's 100% correct. But I've found most people who ask about resolution are digital people, they wouldn't even contemplate shooting on 35mm to begin with.

 

It is true that one can get enough 'resolution' out of Film film to produce an electron microscope level of scan... but does that actually add anything to the usual presented visual image on a screen, especially a screen that is 2K or so, in the home...

 

I think realistically 6K is about it for 'useful' resolution from Film film.

 

In most of these discussions, for cameras that I can afford... like the BMPCC... I'm fine with 2K, and my wish-list item, as products keep getting cheaper, is something with 12 bits in each channel of R, G, B... (aka 36-bit image data...)

Edited by John E Clark

Why do you write Film film? Is there any other film (speaking about film as a medium, not film as a movie)?

 

Because there is Digital film as a medium. I am not now, nor have I ever been (really), interested in Video. Ok... I did quite a bit of image processing using video camera equipment in the 80's. And of course that's when I had the epiphany of the future of Digital film as a replacement for Film (and obviously I was not alone, unique, etc. in seeing that future).

 

ARRI and RED have both made cameras specifically to replace silver-based film. Whereas Sony et al. have 'video' cameras which can be, and have been, used in situations where Film film was used.

 

And in the still world, one never refers to a digital SLR as an Electronic SLR... despite that being the capture method via a CCD/CMOS sensor.

 

About the only thing I do find useful from the 'video' world is the waveform display... it saves on having to make graphs of step wedges by hand... although I can do that too via a program called "IMAGE-J"... a free image processing tool... working from either a series of images aka a 'movie' or a still...

 

And of course there is something called Digital Video... which uses video terms and encodes digital data such that the resulting captured data is usable by 'video' processing equipment, conforming to Rec 709 or the older Rec 601 color spaces, etc.

Edited by John E Clark

Thanks for the elaboration; it makes sense if you see it that way.

 

***

 

BTW, I went and made a crop of an approximately 8k scan of a darkroom print from a cheap plastic point-and-shoot camera:

 

[image: crop of the ~8k print scan]

 

And here is the same image down-resed by a factor of 2 (therefore to approx. 4k) and then upscaled back to 8k:

 

[image: the same crop down-resed to ~4k and upscaled back to 8k]

 

There isn't much difference, but the grain is more pleasant looking (and, most importantly, more accurate) in the original version. I know it's not directly comparable to motion picture film, given that 35mm still film has more resolution, but the picture is more than 20 years old (when film had less resolution, I suppose?), shot with plastic lenses and printed on paper, and even under these conditions it looks better at 8k than at 4k. I don't know at what point there would no longer be any difference in quality between the resolutions.

Edited by Peter Bitic

I remember reading an article about the usefulness of 4K as being mostly a better way to get 2K: if you want better-quality 2K, shoot 4K and downscale. So wouldn't that be true of 6K and 8K as well, just to get a better, true 4K, if and when 4K laser projection becomes affordable and more common?

 

I see increased resolution in capture being more about the ability to do things like handheld shots that can later be smoothed out using warp stabilizer and "cropped" in on without a lot of resolution loss. That kind of thing. Because I also agree that presenting in anything greater than 4K is likely far into the future. But if Sony's laser projection technology is ever made available to consumers, it could hopefully speed that up.


  • Premium Member

I think realistically 6K is about it for 'useful' resolution from Film film.

In terms of standard 4-perf 35mm negative, yeah, it's only really capable of around 4k.

 

But 5-perf 65mm camera negative is around 8k. 15-perf 65mm is just north of 12k.

 

Sure, once you strike prints and project them, the resolution does cut in half. Most theatrical 35mm prints have around 2k worth of perceivable resolution. So it's true that standard 4-perf 35mm, which isn't even using the full potential of the format, is about equal to today's 2k cinema projectors. I guess my earlier point is that digital cinema technology hasn't really changed much in the last 10 years. Sure, it's become more popular, but we're still stuck with most theaters being 2k, most films being finished in 2k (which means there is no high-resolution negative to go back to for a UHD release), and people discussing 8k as if it's right around the corner. From my point of view, film is the only fool-proof way to make your movie future proof. Otherwise, it's always going to be a debate about which is better... Alexa 4k or RED 8k. I say throw the K's away, suck it up, and shoot it on film. :)

 

Ohh and we're very much on the same page about digital technology; 12-bit RAW is the lowest I'll ever go as well. I see no point in going lower if you're making the investment. I'd rather have all that beautiful color space in 1080p than highly compressed 4k with the lower-end offerings from MOST manufacturers. You'd be surprised how acceptable 1080p looks in a normal theater when done right.


Hopefully I am not retreading something I missed in a reply above, but the ANGULAR resolution of the human eye would not resolve 8K on a TV-sized screen.


You have to realize that the resolution of the human eye is finite, and even 4K is pushing it in terms of appreciable quality improvement. Not that this stops people from selling things, or stops technologies (like "20 megapixel camera phones") from catching on anyway.

You would have to sit close to a 4K screen to even see the difference between 4K and 2K (high-def 1080p) anyway.


Star Wars was projected digitally at only 2K. There is absolutely no need, whatsoever, to worry about "8K TV."


Tyler:

Prior to the digital intermediate, not to nitpick too much, but it was higher.

I want to say around 2500 lines, so around 2.5K. That's for a fourth-generation print. A show print would be pretty close to 3K, I'd say. I agree for the most part with your large-format numbers, though.


  • Premium Member

Hopefully I am not retreading something I missed in a reply above, but the ANGULAR resolution of the human eye would not resolve 8K on a TV-sized screen.

Well, it depends on the size of the screen. I can configure my home theater into a 14-foot-wide screen with a viewing distance of around 10 - 14 feet. Full-height, floor-to-ceiling displays are already in development and should be out WAY before broadcasters make the switch to UHD. With OLED technology, the cost will be super low for this type of display, so the average joe can afford it. This is where higher-resolution images come into play and why anyone is discussing UHD or 8k. In reality, on a standard 60" TV viewed from, let's say, 5 - 10 feet away, it's impossible to resolve anything greater than 4k. Since most people sit further back than that, there is really no way any of this high-resolution technology has any value for the average consumer.

 

Ohh and in terms of 35mm print quality, the data sheets I've read from independent research firms have put 4th-generation photochemical film prints at between 700 and 1200 vertical lines, depending on the graininess of the origination stock. Now, the tests were done years ago, and there is no doubt in my mind that if you shot an entire feature on 50 ASA 1.33:1 full aperture, you'd probably get a bit more resolution out of it. But from my research, and even from viewings of movies over the years, I'd say 2k is about right for a standard 4th-generation 35mm print.

 

I have yet to see a study on 5-perf 70mm print resolution, but it's more than double the resolution of 35mm.


"K's" are horizontal, so your figure isn't necessarily in disagreement. I remember getting in a long-winded argument with a projectionist who INSISTED that 2K was "twice the resolution" of 1080i, and this guy, in addition to running film had an engineering degree, I think.

My observations of contact vs. DI prints are in agreement, too: With the exception of Super 35 blowups, the digital intermediate process hurt the resolution of film prints in theatres.

Anecdotally, I know a projectionist who knows NOTHING about photography, not even the f-stops, and when I explained this to him he said something along the lines of "so THAT's why the movies aren't crisp when they're in focus anymore!"


I saw a study, mind you this is from memory half a decade ago now, that said even 4K was beyond what most people could see on a TV at normal viewing distance.

And a lot of TV shows are/were mastering at 2K. Has 3D TV caught on yet? ;-)

Edited by Ari Michael Leeds

This notion of "film is X resolution" is something I've been chewing on for some time. I've come to the conclusion that tying a resolution to a film gauge is totally bogus. There are a few issues at play here:

 

1) Film is analog. There are no pixels, so assigning a pixel value to the film doesn't make any sense; apples and oranges, basically. Because the grain on the film is what makes up the image, more pixels sampling that grain means more sharply resolved grain, which means a truer representation of the image that's on the film. This is especially the case when projected large. Would you rather capture your audio at 22kHz or 96kHz? It's the same idea. Does it mean you're going to get a sharper image if you scan at 4k vs 2k? Not necessarily. Factors like camera optics and film stock are what will determine the sharpness of the image. But a higher-resolution scan does mean that you're going to be starting from a better place in terms of your digital pipeline, because you're getting a better digital representation of the film that way.

 

2) If you scan a film at 2k and view it at 2k, and you scan the same film at 4k and view it at 4k will you see a difference? If you're the correct distance from the screen for the room you're in, you shouldn't. But if you're close to the screen it's a totally different story. Given that most people want monstrous screens in their homes, this is an actual scenario that happens all the time. Just look at HD on an 80" screen from 6 feet away. It's pretty terrible. Step back 10 feet and it looks like HD again. More pixel density means better clarity when viewing from close up.

 

3) Saying an nth generation print is equivalent to a specific reduction in digital resolution again doesn't make any sense. Is there generation loss? Of course. But to equate it to a resolution like 2k or 2.5k is pretty arbitrary. There are too many factors involved to make such a blanket statement, such as the sharpness of the original film and the fineness of the grain in that film, and the fineness of the grain in the intermediate and print stocks. The quality of the lab work matters too. And if there was a digital intermediate done at 2k, with a filmout, then you've hobbled the resulting print somewhat due to the lower resolution. Any intermediate format should be higher resolution or higher quality (optically that means finer grained than the OCN, digitally it means more pixels) than the source, so as not to reduce the quality of the image. This applies to the audio world too, and to digital still photo work for print, and for a host of other things where the original is being manipulated a lot before being output to another format.

 

4) If one wants to show a film at 4k and one scans that film at 2k, then the only way to get it to 4k is to scale it up. This is a process that requires creating something that wasn't there before (2k to 4k means 4x the amount of data). Sure, 2k to 4k will look better than SD to HD, but you're still going to see marked softness, no matter how good the scaling algorithm is. Why not just scan at the higher resolution and scale down if you need to (no ill effects going in that direction), and have that 4k version in your back pocket for when you need a 4k master?

 

Number of pixels is only one factor here, and it is overblown somewhat, partly because it's used by the industry as a shorthand for "quality." Dynamic range, color reproduction, etc are arguably more important. But pixel count is actually an important measure of quality in many scenarios, and it has to be factored in. Considering the future of the project after the immediate needs is important, and in most cases the logical conclusion to be drawn is that you're going to get better results with more pixels in the mix...

 

-perry

 

 


Film has a real, finite resolution. Line pairs per millimeter.

What do you want us to do, apologize that they are not convenient to digital terms and metrics? ;-)


You get to a certain point with a scan where you don't see any appreciable increase in quality for the time and data expended. Film has a very real and finite resolution that corresponds to a megapixel count, whether you want to believe me or not.

I have been scanning film for a long time, and there are charts available that show line pairs per millimeter if you want to independently confirm my observations.


I do agree that the importance of resolution is overblown compared with color space and dynamic range.


  • Premium Member

Well yes Perry, which is why when discussing these things, we use terms like "camera original negative" and "4th generation photochemical finish", to try and define things a bit more. It's also true that film doesn't have pixels, but resolution is very easy to define with film using a standard ol' line based resolution chart. We can extrapolate theoretical pixel count from the line based resolution chart tests.


We can extrapolate theoretical pixel count from the line based resolution chart tests.

 

In Film film there is a 'well'-defined measure of resolution, the Modulation Transfer Function (MTF): a sinusoidal target is captured, the 'contrast' of the recorded sinusoidal image is measured, and at the point where it drops to 30% of the full-scale sine value one reads off the line pairs per millimetre. (I've also seen 50% contrast used as well...)

 

Otherwise it is anyone's guess as to how small a contrast range someone can somehow see their way to 'seeing'...

 

In any case, what one sees on a negative is actually a combination of lens MTF and negative material MTF... or a 'system' MTF... added to the projection print (of any generation...) and the lens of the projector... and the lenses in the eyes of the viewers... yields the overall MTF...

 

In the case of digital sensors, one can still speak of MTF, but now there is a problem: what if the sinusoidal pattern is not quite lined up with the sensor's pixel pitch...

 

But in any case an MTF can be found and so, one can speak of the resolution of the digital sensor.

 

As for comparisons with Film, one also has to indicate whether the film reference is Black and White, where the 'smallest' capture element is a small clump of silver crystals aka grain, or Color, where the small clump of silver creates a 'color' fuzz ball, which is also called 'grain'... but depending on the dye, the color fuzzball can be perhaps 10x the volume of the small B&W crystal. And obviously these things are really 2-D projections of 3-D objects... (And of course processing color film such that the silver is not removed will retain some of that 'sharper' grain pattern of B&W...)

 

From what I've read on the subject, it seems that somewhere between 4K and 8K, 'digital' scans of Film film reach the maximum useful resolution. Higher-resolution scans may result in better 'grain' capture, but that will be beyond the MTF of the film for photographic purposes, and again, scanning negatives captured in a camera will include the MTF of the lens involved... as well as any blur from misfocusing, or motion of the camera or subject. So it is really a system MTF and not just an absolute lab assessment of the negative material.

Edited by John E Clark

Let's step back a bit. I'm not talking about the *picture* on the film, whether that's actors, landscapes or line charts. What I'm talking about is the film: the physical material that contains the image information. The more pixels used to digitally represent this image, the more accurate a representation of that film you will get. It's one (important) metric of quality in the final digital image, but not the only one.

 

Using a line chart to determine the "max resolution" of a digital representation of a frame of film in lines/mm makes little sense to me, because the creation of that line chart is subject to the vagaries of the camera lens, the film stock, how well the image was focused, etc. That is, the film/camera's ability to resolve those lines has little to do with the scanner's ability to scan the film at higher and higher resolutions. At a certain point, you hit the limits of even the best lenses and film stocks in making an image.

 

My whole point is that if one is mixing analog and digital in a DI, for example, the best practice would be to do all your scans at resolutions higher than those you will be outputting, so as not to cripple the image in any way. In the same way one uses extremely slow, fine-grained film stocks for intermediates and prints, one should scan at higher resolutions than what conventional wisdom says are the "limits" of the film, in order to ensure that no damage is done to the image in the digital realm (that also buys some flexibility compositionally, having a larger scan to crop or scale down).

