1k, 2k, 4k, Numbers Numbers Numbers


Jonny Brady

Right... I've tried to work this out from various articles on the net, and I enquired about it at a cinematographers' convention, but got a different answer here and a different answer there...

 

Until recently I was under the impression that 1K meant 1080p (as 1080 is roughly... 1K, surprise surprise); that 2K is... double that, and the equivalent of the output resolution of the print that goes out to cinemas; and that 4K is just a ridiculously high-res version that's too high-res to edit with anyway, so it ends up getting compressed to 2K.

 

Until.

 

Someone at the British Society of Cinematographers' show at Elstree told me that the Sony CineAlta shoots 2K, which is apparently 2K HORIZONTAL (1920, sometimes 2048 or something, with 1080 vertical). Hence it shooting to HDCAM (which rather shocked me)...

 

What! Is this true? So cinema projections are essentially 1080p?

 

But then the next day I read something that differentiated 1080p from 2k, which is what I originally thought. And if 2k IS 1080p, why isn't 1080p just called 2k then?

 

And elsewhere in this forum I've seen somebody compare 2k with 16mm and 4k with 35mm. I thought 35mm was hypothetically about 12 or 16k vertically (can't remember which), since it's a negative - molecular!

 

Ever so confused. Can anybody iron this out for me?

 

Thanks!


Numbers such as 1K, 2K and 4K have typically referred to digital film scans, where the image width is kept constant. For example, a 4-perf 35mm frame is scanned at 2048x1556, giving an aspect ratio of 1.33:1. If the same frame of film were cropped to 1.77:1, the image size would be 2048x1157.
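
To make the arithmetic concrete, here's a quick Python sketch (the helper function is just for illustration; the numbers are the ones quoted above):

```python
# Sketch of the scan arithmetic: the width stays constant and the
# height follows from the aspect ratio you crop to.

def scan_height(width_px, aspect_ratio):
    """Height in pixels for a given scan width and target aspect ratio."""
    return round(width_px / aspect_ratio)

print(2048 / 1556)              # ~1.316, i.e. roughly 1.33:1 for the full 4-perf scan
print(scan_height(2048, 1.77))  # 1157 -> 2048x1157 when cropped to 1.77:1
```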

 

In HD, the numbers used describe the vertical resolution of the image. I'm not sure why this is; possibly because the early HD sensors were recording 1440x1080 images and then stretching them out to 1920x1080.

 

35mm film probably works best around the 3.2K to 4K area. I've only seen one 16mm/HD comparison, and aside from grain the images looked pretty much identical, so I don't think you'd gain much scanning 16mm higher than 2K - although I've never seen what a fine-grained 16mm stock looks like compared to HD footage.

 

So yes, while 1080p and 2K aren't that different in image size, most people tend not to refer to HD footage as 2K, or to 2K as 1080p.


These numbers refer to the horizontal figures, not the vertical. HDTV is 1920 across - the Sony cameras shoot 1920 rather than 2K, and the difference is basically ignored during shooting since it's only a small percentage. 2K is the agreed standard for cinema production, intended to match the resolution of 35mm film prints, and it's the usual standard for digital theatrical projection to give a 35mm equivalent.

 

The 1080 figure is the vertical pixel count that gives the 16:9 aspect ratio when using 1920 horizontally - 2K, as used for cinema, is 1.85. If you change the vertical pixels while keeping the horizontal pixels the same, the frame changes shape - e.g. 2K x 2K would be a square frame.



HD is usually 1920 x 1080, which could be called 1.92K in terms of horizontal pixel resolution. Film scans can be at 2K or 4K, but even then, 2048 is the width of a "2K" scan of Super-35... if you scan Academy / 1.85 / anamorphic at "2K", it's 1828 pixels across.

 

So 1920 pixels across is more or less 2K.

 

However, the way that information is handled, stored, etc. is a bit different for HD video versus RGB data. Color may be handled differently, the HD may be more compressed, etc. So 1920 x 1080, 10-bit 4:4:4 HD (1080P) is similar but not exactly the same as 2K RGB uncompressed data. 2K digital cinema projection is also a different format, a specific color space, level of compression, etc. But yes, 2K digital cinema projection is similar to HDTV projection in terms of resolution.

 

Generally 35mm negative should be scanned at 4K (4096 across for Super-35) to capture all the grains and fine detail on the negative without aliasing or stairstepping. See:

http://digitalcontentproducer.com/mag/vide...cial/index.html

 

Now, one thing to keep in mind is that pixel resolution is not necessarily the same thing as the level of detail - sharpness, measurable resolution. Just because you should scan 35mm negative at 4K doesn't mean it always resolves 4K worth of detail; I'd say a lot of people are getting more like 3K out of it. However, you'll find some people saying that particularly sharp, fine-grained 35mm photography, like anamorphic, should be scanned at 6K. This is somewhat controversial, though.

 

At the opposite end of the argument are those who feel there's no reason to scan 35mm at more than 2K, since 35mm print projection can barely achieve even that level of resolution. I tend to disagree, since you want to start out with more resolution if possible to survive the degradation of copying, printing, etc. However, if we had no print projection anymore, only 2K projection, I suppose that would argue for just doing all the work in 2K - other than the problem that you may be creating a digital master that doesn't contain the maximum possible detail for future digital presentation formats that may be 4K.

 

HDCAM, by the way, is 1440 x 1080 pixels. HDCAM-SR is 1920 x 1080.


The F23 is 1920x1080, and it records to HDCamSR, which is the same. HDCam is an older tape/data format that is used on cameras like the F900. In order to save bandwidth, it takes a 1920x1080 image and squishes it horizontally in order to store it as 1440x1080. The image is still 16:9, but now it has a pixel aspect ratio of 1.33:1 in order to stretch it back out to 1920x1080 on display. So that's a fairly significant quality hit, especially if you're going to be doing any sort of post work on it.
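
A minimal sketch of that squeeze, assuming nothing beyond the numbers in the post:

```python
# HDCAM stores a 16:9 frame 1440 pixels wide; on playback each stored
# pixel is shown 1.33x wider to restore the 1920-wide picture.

display_w, display_h = 1920, 1080   # what you see (square pixels)
stored_w, stored_h = 1440, 1080     # what HDCAM actually records

pixel_aspect_ratio = display_w / stored_w
print(pixel_aspect_ratio)                        # 1.333... i.e. 4:3 ("1.33:1") pixels
print(stored_w * pixel_aspect_ratio, display_h)  # 1920.0 1080 after the stretch
```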


Ahhh, I see, of course - I never considered aspect ratios.

 

David - so the Sony F23 shoots in 4:3? Why, I never knew that - I thought HDCAM would be widescreen? I was told the F23 had a 16:9 chip...

 

What makes you think I said it shoots 4x3? The Sony F23 shoots 16x9 1920 x 1080 on HDCAM-SR.

 

HDCAM -- which is what the Sony F900 records to -- also shoots 16x9 but records it as 1440 x 1080, with non-square pixels, I guess, since it becomes 1920 x 1080 on display.


  • 2 weeks later...

Back in the days of analog TV, the TV guys got in the habit of counting vertically from top to bottom, because the horizontal was analog. So, we had 483 active lines for NTSC, 576 for PAL, etc.

 

The use of scanning for feature films started with line arrays that had a fixed number of samples across the frame, and could be clocked as many times as you wanted vertically, so it was natural for them to go with horizontal counting.

 

So, when things converged, we got this mess where 1080 and 2K really mean nearly the same thing.

 

Original old-time HDCAM uses 1440 across instead of 1920 because at that time they couldn't get the full 1920 compressed and onto the tape. They were doing 1080i, which has only about 60-65% of the vertical resolution of 1080p, so there was no real need to record a full 1920 across. Human vision has equal horizontal and vertical resolution, and we perceive an image with unequal H&V resolution as having the lower of the two, provided the ratio isn't much more than 2:1. (If it gets bigger, we notice that the sides of things are sharper or softer than the tops and bottoms.)
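
As a toy encoding of that rule of thumb (the 60-65% interlace factor and the ~2:1 tolerance are the figures above, not measured values):

```python
# Toy version of the rule of thumb: we see roughly the lower of the two
# axis resolutions, as long as the H:V mismatch stays under about 2:1.

def perceived_resolution(h_res, v_res, tolerance=2.0):
    """Return the perceived limit and whether the mismatch itself shows."""
    lo, hi = min(h_res, v_res), max(h_res, v_res)
    return lo, (hi / lo) > tolerance

effective_1080i_vertical = round(1080 * 0.62)   # ~670 lines, mid-range of 60-65%
print(perceived_resolution(1440, effective_1080i_vertical))
# -> (670, True): vertical is the limit either way, so recording 1440
#    across instead of 1920 gave up very little at the time.
```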

 


-- J.S.


  • 4 years later...

Jonny (the OP), even though this thread is about 4 years old, you may get this if you subscribe.

 

It's worth noting that Lawrence of Arabia got an 8K scan to get all the detail off its 65 mm negative; so did Baraka. Although both lost some quality in theaters after going through all the in-between film stages, it's nice to digitally capture all the detail that's there on the negative. As digital slowly replaces photochemical photography, I hope they eventually standardize all cameras and displays at 8K -- or better!



Again, don't confuse optimal scanning resolution with actual picture resolution -- because of the Nyquist limit, an image with actual 8K worth of detail would in theory have to be sampled at 16K to avoid aliasing.

 

So even though it was a good idea to scan "Lawrence of Arabia" at 8K, that doesn't mean it's an 8K image in terms of measurable detail on the original 65mm negative. In fact, it's probably more in the 4K-to-6K realm. If modern 35mm stocks seem to resolve a bit more than 3K, maybe 3.5K, of detail, then a negative twice as wide would resolve a bit more than 6K, maybe 7K (assuming the 65mm lens was as sharp as the one used on the 35mm camera). And "Lawrence" was shot on 50-year-old color negative and optics technology. Even if the 65mm negative was resolving near-6K quality, a 70mm print would then be more like 4K on the big screen.
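
A back-of-the-envelope version of that reasoning (the factor of two is the Nyquist criterion; the resolved-detail figures are the rough estimates from these posts, not measurements):

```python
# Nyquist in both directions: capturing detail needs sampling at twice
# its finest frequency, and a scan can cleanly hold at most half its
# sampling rate in real detail.

def scan_needed_k(detail_k):
    """Scan resolution needed to capture a given level of real detail."""
    return 2 * detail_k

def detail_held_k(scan_k):
    """Real detail an alias-free scan of this resolution can represent."""
    return scan_k / 2

print(scan_needed_k(8))   # 16 -> true "8K detail" would need a 16K scan
print(detail_held_k(8))   # 4.0 -> an 8K scan cleanly holds ~4K of detail
print(2 * 3.5)            # 7.0 -> a neg twice 35mm's width (~3.5K) resolves ~7K, lenses permitting
```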

 

8K cameras and digital projection would be closer to IMAX quality, which would be cool, though I'm not sure every actor wants to be recorded at that level of detail...


Thanks for the info. I didn't know about the Nyquist limit, although now that you mention it, it sounds familiar.

 

To me, modern 35 mm film in a theater looks like it has about the resolution and acutance of the best 70 mm prints from 65 mm negatives of the past, although the projection doesn't seem to have the brightness of the old carbon-arc lamps. There were a few softer-looking 70 mm films back then, like 2001: A Space Odyssey (people shots only; the man-apes and the space gear looked fine), 80 Days, and Oklahoma! (the latter two may have used inferior lenses, but the sense of depth was striking). The 70 mm equipped theaters I frequented often had seats that went right down to very near a large (sometimes curved) screen, so the image on the beholder's retina was larger than is typical in 35 mm projection today - which is why I'm not sure of my subjective detail comparison.

 

My puzzlement is over some of the Blu-ray versions of the old large-format films that don't do them justice. Lawrence does seem detailed enough, even though the home Blu-ray version is only about 2K (given the discussion in the early posts in this thread). Ben-Hur is an odd case: we hear it was scanned at 6K, but while the faces are fine, the long shots are much blurrier than I remember in 70 mm or 35 mm, and they seem worse than in the Lawrence BD... yet the grain is sharply reproduced and, sadly, larger and more apparent than in Lawrence.



"Ben Hur" was shot in 65mm anamorphic to get a 2.7 : 1 aspect ratio, so in lower light levels, the lenses wouldn't have been as sharp as the spherical lenses used in "Lawrence of Arabia" -- plus "Ben Hur" was shot on 25 ASA film stock, not 50 ASA stock, which was available when "Lawrence of Arabia" was shot, so it would have even harder to light interiors to a deeper stop. And "Lawrence" has a lot more real daylight exteriors working in its favor.

