2K master on the new Star Trek movie


Hey guys

 

Have any of you noticed the 2K print on the new "Star Trek" movie?

 

I don't understand. Why would a big Hollywood production with a budget of $150,000,000 settle for a 2K print? If it were up to me, the movie would definitely be finished in 4K.

 

The movie is playing at almost every theatre around the world. As this is a big sci-fi movie, I think the extra resolution would definitely give it that extra touch, especially on the biggest cinema screens.

 

What do you think?


The difference on the screen is much less than the numbers would make you believe. Good 2K (film and digital projection) is plenty. One of my customers did a 4K DI of a feature, his next film is back to 2K. Better to spend the money elsewhere, better script, better lighting, better grading etc. Of course, there is no excuse for bad 2K.


  • Premium Member

I agree. I find it perplexing as well. My analysis of 2K revealed a very poor representation of film. The pixels are so large at 2K that there seems to be little point in originating on film in the first place. With storage so cheap and render farms aplenty, why shortchange a big movie like this with big pixels? Is there some technical thingy we littlings don't know about? Does 2K hide bad CG better? Does 4K-8K actually create problems in FX simply because you can then distinguish crummy CG with the human eye? Maybe it's something that simple.

 

Or maybe it's a marketing thing. If the big screen is nothing more than a billboard for ancillaries, then 2K-for-home-screen is all you'll really need.


  • Premium Member

The difference is not as extreme as you think once you make a print. On a couple of projects, I've asked to see a 4K film-out and projected it next to a 2K film-out, and the improvement -- though it exists -- was so subtle that it was hard to justify the extra cost (though that's in my case, you'd think a "Star Trek" movie could justify it).

 

Part of the problem is that a first generation contact print is barely 2K once projected, so it's harder to see if the film-out was done at 4K.

 

Now I think 4K scanning is much more important -- a 4K scan downsampled to 2K preserves more fine detail than straight 2K scanning.
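
A toy way to see why the oversampled scan helps: detail too fine for the 2K grid gets filtered out by the downsample instead of aliasing into false detail. This is just a 1-D numpy sketch of that one effect, not a model of any real scanner:

```python
import numpy as np

# "Ground truth": fine 1-D detail, standing in for grain/texture.
n_fine = 4096                 # the "4K" sampling grid
n_coarse = n_fine // 2        # the "2K" grid (Nyquist = 1024 cycles)
x = np.arange(n_fine) / n_fine
detail = np.sin(2 * np.pi * 1500 * x)   # 1500 cycles: beyond the 2K Nyquist limit

# Straight coarse scan: point-sampling folds the unresolvable
# frequency down into a false lower one at full strength.
direct_2k = detail[::2]

# Scan fine, then average pairs before decimating (a crude box
# filter): the unresolvable frequency is strongly attenuated
# instead of masquerading as picture detail.
from_4k = detail.reshape(n_coarse, 2).mean(axis=1)

print("aliased energy, direct 2K scan :", np.sqrt(np.mean(direct_2k ** 2)))
print("aliased energy, 4K->2K process :", np.sqrt(np.mean(from_4k ** 2)))
```

A real downsampler would use a better filter than pair-averaging, which only strengthens the effect.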

 

A straight-across 4K D.I. would be ideal, if only for archival reasons... but I think there are diminishing returns once you get to the printing and projection stage.


Guest Stephen Murphy

I saw a digital print of the movie last night and thought it looked incredible.

S


Wait a minute! They only did a 2K DI?? You've got to be kidding me. Why on earth would they bottleneck a brand-new, huge-budget picture like this at 2K?

 

Does this have anything to do with FX? To me, this is madness. They should be future-proofing a movie like this at 4K, IMO.

Edited by Tom Lowe

Part of the problem is that a first generation contact print is barely 2K once projected, so it's harder to see if the film-out was done at 4K.

 

I completely disagree with this statement. In my experience, contact prints (optical prints are a completely different story, mind you; we are talking about emulsion-to-emulsion copying) lose practically no important information. What you are throwing out are extreme highlight details that may count as resolution on a chalkboard, but aren't desirable anyway.

 

I'd say a traditionally printed film must have better bit depth (which isn't measured as resolution, but should be) than a DI film, so herein lies the problem with 2K: not the resolution, but the colors.

 

I'd say, considering you're degrading that 2K at least once (usually through four generations of copying), 3.2K should be the bare minimum unless you're recording directly to print film. 4K would be preferable.

 

For something shot anamorphically, though, 4K should be the bare minimum and 6.4K would be preferable.
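
The arithmetic behind that figure, with my own rough numbers (the ~3.2K value for what a good neg resolves is an assumption, not measured data):

```python
# Back-of-envelope behind "6.4K for anamorphic". Both inputs are
# rough assumptions, not measurements.
film_detail_k = 3.2   # assumed horizontal detail a good 35mm neg holds, in "K"
squeeze = 2.0         # anamorphic optical squeeze factor

# The anamorphic lens packs 2x the horizontal field into the same
# negative width, so capturing the full desqueezed detail needs a
# scan with 2x the horizontal samples:
needed_scan_k = film_detail_k * squeeze
print(f"Scan needed to hold the anamorphic detail: ~{needed_scan_k:.1f}K")
```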

 

Look at it this way: Isn't it sick that Star Trek VI, technically, had better resolution, even though it was shot S35 and optically blown up, than an anamorphically-shot film made in 2009?

 

Isn't it a shame that you're throwing out so much of the resolution shooting anamorphic that you might as well just shoot S35mm (unless all you care about are friggin' lens flares) anyway?

 

I mean, would you still say that 2K is good enough if they were shooting VV or 5-perf 65mm?

 

 

I am really hoping that, come 2010, we are going to enter a new decade of digital quality standards, not the tired old 2K crap that should have been banished to the 1990s, frankly. . . :blink:

 

At the same time, even though I could tell it was a DI, I thought Star Trek looked quite nice.

Edited by Karl Borowski

  • Premium Member

Hey, I'm just reporting on what I've seen in tests at the lab. The weak link is really the printing process and the projector.

 

Like I said, I could see an improvement with all-4K once you go to print, but it was not 4X better, not like the difference in data levels would suggest.
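
To put numbers on why it isn't 4X better: the pixel count quadruples, but the sharpness your eye reads tracks linear resolution per axis, which only doubles (using the standard DCI container sizes):

```python
# Why "4K" isn't 4x sharper than "2K": the pixel *count* quadruples,
# but linear resolution per axis only doubles.
# Sizes below are the DCI 2K and 4K containers.
w2k, h2k = 2048, 1080
w4k, h4k = 4096, 2160

pixel_ratio = (w4k * h4k) / (w2k * h2k)
linear_ratio = w4k / w2k

print(f"Pixel count ratio : {pixel_ratio:.0f}x")   # 4x the data
print(f"Linear resolution : {linear_ratio:.0f}x")  # only 2x the detail per axis
```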

 

It makes more sense for higher-quality anamorphic photography. It's harder to see the difference between a 2K versus 4K film-out for Super-35 and HD (and RED) material. For scanning, yes, 4K is a must I think.

 

And a lot of spherical 35mm photography these days seems more like 3K. I'm doing a D.I. right now and I've been a bit perplexed why everything isn't sharper, even in my contact prints off of the original negative, no D.I. at all. It's all 5219, Primo lenses, no diffusion, Super-1.85. But it's not as crisp as I'd like. Partly due to the softer light of Canada, partly because I've been used to anamorphic and HD photography, but I don't know. But it's not the fault of the D.I. process because I've looked at the original footage. I think part of the problem is that modern stocks are so wide in latitude that they lack enough edge contrast to make things snap.
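
That "snap" is largely edge contrast (acutance) rather than resolution per se. A toy unsharp-mask on a soft edge shows the idea: the transition gets steeper, reading as sharper, without any new detail being added. All numbers here are arbitrary illustration:

```python
import numpy as np

# A soft step edge: low edge contrast, like a flat, wide-latitude neg.
x = np.linspace(-1.0, 1.0, 200)
soft_edge = 1.0 / (1.0 + np.exp(-20.0 * x))

# Unsharp mask: add back (signal - blurred signal). The transition
# steepens -- more perceived "snap" -- with no new detail.
kernel = np.ones(15) / 15.0
padded = np.pad(soft_edge, 7, mode="edge")        # avoid border artifacts
blurred = np.convolve(padded, kernel, mode="valid")
sharpened = soft_edge + 1.5 * (soft_edge - blurred)

print("max edge slope, soft     :", np.max(np.diff(soft_edge)))
print("max edge slope, sharpened:", np.max(np.diff(sharpened)))
```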

 

I don't really think that 2K should be the standard for D.I. work, but I am saying that the improvement by going to an all-4K process is not going to dramatically improve some movies, especially if they are going to be released in prints made from dupe negs. Because the originals themselves are more like 3K. So it gets harder to justify the extra expense of a 4K film-out when you show a comparison test between 2K and 4K to the producers and director and they can barely see a difference, yet are told it would add $30,000 to the film-out (and another $30,000 to the scanning). These are rough numbers I've heard floating around. One facility told me that it would add about $50,000 to our D.I. work to stay in 4K.

 

For that increase, I'd almost rather make an extra 2K digital negative from the master, so that more release prints can be struck from that rather than from an IP/IN.

 

But costs are coming down slowly -- scanning at 4K is becoming more common every year, because they all bought the scanners anyway, and the scanners keep getting a little faster.

 

For archival purposes, I think 4K should be the standard for 35mm material.

 

Now in terms of "Star Trek", we don't really know if it was an all-2K process or a 4K-to-2K process. The D.I. I'm doing right now, I was told that it was a 2K D.I. so I asked if they scanned at 2K and they said "no, we normally scan at 4K and then downsample to 2K."


Now in terms of "Star Trek", we don't really know if it was an all-2K process or a 4K-to-2K process. The D.I. I'm doing right now, I was told that it was a 2K D.I. so I asked if they scanned at 2K and they said "no, we normally scan at 4K and then downsample to 2K."

 

You would think, David, that they would have to go higher than 2K with the scans for IMAX right?

 

I may go see it in IMAX tonight, and if so will report back.

 

The film I saw, a brand new 35mm print, looked quite sharp though, so even if it was 2K it was a good DI. Probably, yes, 4K knocked down to 2K.

 

I agree that the weak link is generational copying. What I wish they would do is 3.2K-or-better scans (4K would be better, to pick up the roughly 3.2K of information on the film), and then go 2K or better straight to print stock or INs.

 

The problem I have is that 1080p TVs are now, technically, better than what you're seeing on the big screen. To quote Billy Joel's "Captain Jack": "and that's SO WRONG."


David is right,

 

it makes no sense to do the whole chain in 4K and then strike a contact IP and DN to print from. Much better to stay in 2K and strike the prints from the recorded 2K negative. We do that all the time and customers are happy. A second 2K negative doesn't cost more than a contact IP/DN from the first recorded negative; a typical contact printer has to be very well adjusted to resolve anywhere near 80 lp/mm, so by eliminating two steps (IP/DN) you remain closer to the original 2K quality. Some labs still insist on striking IP/DN for various reasons.
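
A sketch of that trade in MTF terms. The per-pass response below is an illustrative guess, not a measured figure; the point is only that responses multiply per generation, so two extra copying stages compound:

```python
# Each photochemical copy multiplies in its own MTF, so skipping the
# IP/DN stages keeps more of the recorded detail. The per-step value
# (response at a spatial frequency around the 2K detail limit) is an
# assumed, illustrative number.
step_mtf = 0.80   # assumed response of one contact-printing pass

# Route A: recorded 2K negative -> IP -> DN -> release print (3 passes)
via_ip_dn = step_mtf ** 3

# Route B: recorded 2K negative -> release print directly (1 pass)
direct = step_mtf ** 1

print(f"Through IP/DN  : {via_ip_dn:.2f} of original response")
print(f"Direct to print: {direct:.2f} of original response")
```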


Umm, didn't I just say that too?

 

If you're just recording 2K and copying it even once, why even bother projecting on film then?

 

There's no point if you're just making 2K files softer... :ph34r:


You would think, David, that they would have to go higher than 2K with the scans for IMAX right?

You would think so, but it's actually backwards. The DMR process to bring 35mm films to IMAX basically runs the whole thing through noise reduction via wavelets, and then upsamples it significantly and I think adds new grain. So basically when you see it in IMAX, you see less of the original picture information.
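
DMR itself is proprietary, so purely as a toy illustration of the general shape described (a wavelet-style denoise to suppress the grain, then a blow-up), here's a one-level Haar sketch in numpy; none of the numbers pretend to match the real process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a scanned frame: smooth picture plus grain-like noise.
n = 256
yy, xx = np.mgrid[0:n, 0:n]
clean = np.sin(xx / 20.0) + np.cos(yy / 25.0)
frame = clean + 0.3 * rng.standard_normal((n, n))

# One-level 2-D Haar split (the crudest possible "wavelet" transform).
a = frame[0::2, 0::2]; b = frame[0::2, 1::2]
c = frame[1::2, 0::2]; d = frame[1::2, 1::2]
low   = (a + b + c + d) / 4.0   # approximation band (the picture)
det_h = (a - b + c - d) / 4.0   # detail bands (where grain lives)
det_v = (a + b - c - d) / 4.0
det_d = (a - b - c + d) / 4.0

# "Denoise": zero out small detail coefficients.
thr = 0.3
for band in (det_h, det_v, det_d):
    band[np.abs(band) < thr] = 0.0

# Inverse Haar, then a dumb 2x nearest-neighbour blow-up.
rec = np.empty_like(frame)
rec[0::2, 0::2] = low + det_h + det_v + det_d
rec[0::2, 1::2] = low - det_h + det_v - det_d
rec[1::2, 0::2] = low + det_h - det_v - det_d
rec[1::2, 1::2] = low - det_h - det_v + det_d
blown_up = rec.repeat(2, axis=0).repeat(2, axis=1)

print("grain RMS before:", np.sqrt(np.mean((frame - clean) ** 2)))
print("grain RMS after :", np.sqrt(np.mean((rec - clean) ** 2)))
print("blown-up size   :", blown_up.shape)
```

Note the cost: some genuine fine detail sits in those same detail bands and gets thrown away with the grain, which is why you see less of the original picture information.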

 

IMO, they would be incredibly foolish to post and finish at 2K with no 4K FX and master 4K.

For a film that probably had (I'm just going to pull a number out of a hat) $50 million in VFX, doing the whole thing at 4K is a huge additional expense and would probably add several months to the post schedule. Yes, storage and processing power improve all the time, but even doing 2K for hundreds of really complex shots already stretches resources to the breaking point.

On the film I just finished working on, we had ~550 shots, all at 2K, and we were severely strained for storage space and processors. I actually ended up rendering almost everything on my workstation, tying it up for over an hour at a time, rather than submit my shots to the queue. Had I needed to work in 4K, each shot probably would have taken me 50% longer to do, given the additional attention to detail and the vastly longer render times.

It's likely that the producers considered 4K, looked at the additional cost and scheduling, compared it to the somewhat marginal increase in image quality, and decided it wasn't worth it.
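
For a rough sense of the scale, here's a back-of-envelope in Python; every figure (frame size, shot length) is an assumption for illustration only, and final frames are just a fraction of the working data on a show:

```python
# Back-of-envelope for why 4K VFX strains a pipeline. All figures
# are illustrative assumptions (frame size varies with format and
# bit depth; shot lengths vary wildly).
shots = 550                  # the show size mentioned above
frames_per_shot = 120        # assume ~5 s average at 24 fps
mb_per_2k_frame = 12         # assumed ~10-bit 2K DPX frame
mb_per_4k_frame = 4 * mb_per_2k_frame   # 4x the pixels -> ~4x the bytes

storage_2k = shots * frames_per_shot * mb_per_2k_frame   # in MB
storage_4k = shots * frames_per_shot * mb_per_4k_frame

print(f"2K finals: ~{storage_2k / 1_000_000:.1f} TB")
print(f"4K finals: ~{storage_4k / 1_000_000:.1f} TB (plus ~4x the render time per pass)")
```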

For most major blockbusters, like Terminator 4 or Transformers 2, would the effects typically be done at 2K or 4K, generally speaking? I thought they usually did effects at 4K?


  • Premium Member
Usually 2K. It's still pretty rare to go beyond that. I know that Flags of Our Fathers was done at 4K, and sometimes the odd shot here or there will be done higher, but as far as I'm aware, the vast majority of effects are still done at 2K.

 

They sometimes do a few shots at 4K when they have to hold up better on the big screen, like a slow-moving wide landscape / cityscape shot from a helicopter in the daytime. But your typical effect in an action movie is only on screen briefly, and there's SO much motion blur that resolution isn't as critical.


  • Premium Member
What about the big pictures, though, like Transformers?

 

"Transformers" is what I was thinking of -- so many cuts in those movies are so short that you could get away with sharpened sub-2K if you had to. With a movie like that with 500 or more effects shots, there's no way they are doing all the work at 4K.

 

Even "Spider-Man 2", which went through a 4K D.I., did all the effects at 2K.

 

The only big movie I can think of where they did a lot of effects work at higher resolution was "The Dark Knight". But that movie actually isn't wall-to-wall digital effects, a lot of it is classic stunt work with just digital work to paint out rigs, etc.


  • Premium Member

What you need to remember with FX is that working at 4K rather than 2K multiplies not only storage and render time by roughly four, but also the demands on processing power. This directly affects the creative process: you have to spend more time working in proxies, more time estimating rather than dealing with problems practically, and so on. Basically it is not just a matter of money but also of creativity and time. You might have a squillion-dollar budget, but that doesn't mean you want to wait months on end to see a shot that doesn't work, for what is in essence a smallish improvement, especially considering that films are, rightly or wrongly, always going to be watched more at home than at a cinema. One or two effects shots at 4K (or above) are fine, but a whole FX-laden movie would be seriously hampered by doing all the FX at that resolution, however much cash and however many facilities were at its disposal. Then again, in a few years it could change, but in reality we are only just getting comfortable working at 2K without staring at endless render bars.

 

Keith


Even "Spider-Man 2", which went through a 4K D.I., did all the effects at 2K.

 

The only big movie I can think of where they did a lot of effects work at higher resolution was "The Dark Knight". But that movie actually isn't wall-to-wall digital effects, a lot of it is classic stunt work with just digital work to paint out rigs, etc.

 

What I was hoping "Dark Knight" and SM2 would do, though, was help push the big render farms into getting ready for 4K. Once the infrastructure is in place, they ought to be able to drop the price for subsequent films.

 

The hard part of filmmaking is being the *first* to do something, because usually you have to pay an arm and a leg to design new equipment or a new workflow. The first person pays for it, but after that, others should be able to get the same equipment and workflow for a relative bargain.

 

I thought "Dark Knight's", what, 3.2K or 4K scanning, plus its 8K scans of the 70mm material, would have gotten that equipment bought and made all of this commonplace.

 

It's a real shame that that has not happened.

 

As for doing CGI at 2K, I'm fine with that. In fact, maybe it's better that way, because the CGI in Trek was still fake as all h*ll.

 

Honestly, they need to get some models for classic ship flyby/beauty shots. Those ships looked like cartoons.

 

Give me "Wrath of Khan" style effects over that crap any day. Even if model shots are harder to do, they still look a whole hell of a lot better.

 

If only they had had a movie budget for the script "Yesterday's Enterprise". . .

