Varicam for feature?


Chuck Hartsell


720p progressive video is a significant improvement over 1080i interlaced video. However, the 720p format fails not in picture quality but in marketing, where consumers prefer the higher numbers of 1080i. Consumers, thinking they are getting higher resolution, buy 1080i cameras, and salesmen fan the flames with marketing buzzwords like "full HD 1080" and "only 1080i is real high definition". Salesmen care nothing about picture quality, only about how many cameras they can sell, and it is a proven fact that higher numbers sell.

 

720p may not be a marketing success, but it is nevertheless a format superior to 1080i, and it cannot be dismissed as having "that video look"; rather, it has that awesome progressive-scan look.

 

Let's face it: the video look has long been associated with that flickering 480i electronic-news-gathering look we see on old-fashioned cathode-ray-tube televisions, with those ugly black scanning lines that make it look like you are watching television through a set of venetian blinds.



720p progressive video is a significant improvement over 1080i interlaced video. However, the 720p format fails not in picture quality but in marketing, where consumers prefer the higher numbers of 1080i...

 

Thomas,

 

I'm not being facetious at all when I ask: do you own a Varicam, or have stock in Panasonic? I'm just looking for a disclaimer so that readers are aware of any bias you may have against 1080i and the reasons for it. Thank you. :)


Thanks David. That makes sense. Of course now that I think about it, I have been on 1080 24p sets.

 

I don't want to be rude, but why is everyone so stuck on resolution?

 

Yesterday, today, and tomorrow (at least for the next couple of years), we will not be able to see in a theatre near us (meaning around the world, not the USA only) anything better than a normal 35mm projection.

 

Have any of you done a simple test?

 

Take a resolution file like this one: http://www.bealecorner.com/trv900/respat/respat.ps . Import it into Photoshop, scale it to 4K, have your favourite lab print several feet of it as a repeated frame, make a positive print, and project it in your favourite theatre.

 

You will be amazed by the result: you will see only about 700 to 800 lines!
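If anyone wants to reproduce the chart-prep step without Photoshop, here is a minimal sketch in Python with Pillow; the filenames and the 4096 x 3112 film-out raster are my own assumptions, not part of the test above.

```python
# Minimal sketch of the chart-prep step, assuming the PostScript chart has
# already been rasterized to a PNG (e.g. with Ghostscript:
#   gs -sDEVICE=png16m -r600 -o respat.png respat.ps ).
from PIL import Image

chart = Image.open("respat.png").convert("L")  # grayscale test pattern

# Scale to a hypothetical 4K film-out frame; the exact raster depends on
# the lab's film recorder.
chart_4k = chart.resize((4096, 3112), Image.LANCZOS)
chart_4k.save("respat_4k.tif")  # hand this to the lab as a repeat frame
```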

 

I know many filmmakers who shot 1080 24p and blurred the image at the final stage, like Benigni on Pinocchio!

 

So from my point of view, things like "latitude", "tonal reproduction", "35mm bokeh" and others (even a good scenario or outstanding photography) matter more in audiovisual masterpieces than resolution.

 

And since we are in a Varicam sub-forum, it is my opinion that it is much more productive to focus on how we can maximize the performance of our tools than to lose our way in technicalities like resolution, since the Varicam has the latitude and the resolution (see the argument above) to cover a theatrical release.

 

And as for the subject matter (since all this discussion looks more like television than cinema to me): the European Broadcasting Union (EBU) created an earthquake two years ago, when a technical committee of numerous broadcast engineers from around Europe concluded that the preferable HDTV format for the EBU would be 720 50p, not the Sony-proposed 1080 50i, because of the better visual experience it offers...

 

Regards,

 

PS. For the record, the EBU is at www.ebu.ch

http://www.ebu.ch/CMSimages/en/tec_text_r1..._tcm6-16462.pdf



The problem with citing the resolution figures of print projection is that if, as you say, there is resolution loss in the chain of events culminating in print projection, then starting out with less resolution means ending up with less resolution, because the degradation is cumulative, not equalizing.

 

If a 4K 35mm negative becomes less than 800 lines by the time of projection, imagine what happens when you start out at 800 lines...
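To put toy numbers on that: the standard rule of thumb is that a chain's response at a given spatial frequency is the product of each stage's response (the MTF cascade), so a sharper original always comes out of the same chain ahead. A quick sketch, with invented per-stage values purely for illustration:

```python
# System response at some fixed spatial frequency is roughly the product
# of each stage's response (MTF cascade rule). Per-stage values below are
# invented for illustration only, not measurements.
stages = {"IP": 0.90, "IN": 0.90, "release print": 0.90, "projector lens": 0.70}

def through_chain(original_mtf):
    net = original_mtf
    for mtf in stages.values():
        net *= mtf
    return net

print(through_chain(0.80))  # ~0.41 -- e.g. a sharp 35mm original
print(through_chain(0.40))  # ~0.20 -- a lower-resolution original
```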

 

Look at standard definition television, which is even lower in resolution, yet we shoot 35mm for commercials and high-end television shows because we can see the difference in quality.

 

I've spent the last month seeing 1080P-shot features on the big screen like "Zodiac", "The Lookout", "Reign Over Me", and "What Love Is" in the local cinemas, and I can see the resolution difference between those and the releases shot in 35mm. Not significant, but visible on a typical movie theater screen.

 

And watching the release prints of my own work in "Akeelah and the Bee", shot in 35mm anamorphic, I can see the difference between the 2K digitally-corrected shots and the ones that didn't go through a D.I., and I can definitely spot some 1080P HD material cut into the movie.

 

And I just did a 2K D.I. for a Super-35 movie and we had to use some 1080P stock footage material, and it is visibly softer on the big screen.

 

You CAN see the differences between the different formats on the big screen. You can see the difference between Super-35, Super-16, 35mm anamorphic, standard 1.85, 2K D.I., 4K D.I., show prints, release prints, 1080P, 720P, fast film, slow film, Cooke lenses, Zeiss lenses, zoom lenses, prime lenses, etc. Sometimes the differences are small, sometimes other factors are more significant, and the truth is that when an entire movie is at a lower resolution your eyes get used to it because you have no frame of reference. But that doesn't mean that resolution differences don't matter on the big screen.

 

I don't have a problem with 720P material being used for cinema release, especially for some low-budget movie that cannot afford otherwise, but that's a big difference from saying that it doesn't matter what resolution you shoot in. Let's not make the standard for cinema release based on the worst-case projection scenarios.


720p progressive video is a significant improvement over 1080i interlaced video. However, the 720p format fails not in picture quality but in marketing, where consumers prefer the higher numbers of 1080i...

 

720p may not be a marketing success, but it is nevertheless a format superior to 1080i, and it cannot be dismissed as having "that video look"; rather, it has that awesome progressive-scan look.

 

Thomas, surely you must realize that most of the people here on this board are film and video professionals who deal with HD in all its flavors on a regular basis. We're not the average misinformed consumer, nor are we all going by pure numbers. Most of us discussing this here and now have direct experience with 720P, 1080i, and 1080P. Opinions expressed here are likely based on that experience, not salesman BS.

 

Same goes for the "video look" -- most of us participating in these discussions have seen enough HD 60i and 60P to have formed opinions on how much like "video" it appears to us. Of course those opinions are partly informed by experience with lower-res video, but what difference does it make? If a professional DP looks at a 60P image and says, "it looks like video to me," then that's his assessment. Your assessment may be different, and that's fine.

 

Professional directors of photography are charged with evaluating imaging systems and their characteristics for a particular use. If a professional, experienced DP isn't qualified to evaluate the most appropriate frame rate and resolution for a given project, then who is?


I don't want to be rude, but why is everyone so stuck on resolution?

 

I don't mean to be defensive, but why pick on me? :D I have been trying to get my facts straight, that's all.

 

For what it's worth, I think the numbers game gets old pretty fast, and quite honestly it is usually fruitless. I think for the vast majority of people here, the numbers that end up dictating a project are the ones with a $ in front of them. I could argue all day long to a producer that HDCAM SR is superior to DVCPRO HD or that 1080i is better than 720p, but if the money ain't there, it ain't there. If the best we can afford is a Varicam (or maybe somebody's cousin works at a rental shop), then that's what we shoot. I don't mean to be apathetic; it's not that I don't care about all the cool numbers (1080, 720, 35, etc.), but generally speaking it comes down to what is most cost-effective. I realize this means something different for each of us, but you can't ignore it.

 

I thought "Zodiac" looked great.

 

Sorry to go off topic.


There is no spit of difference between shooting 1080/60i and 720/60P.

 

Both are HDTV broadcast formats, so 1080/60i when broadcast on a 720P station just gets converted to 540/60P (fields turned into frames), 1920 horizontal gets downrezzed to 1280 and 540 vertical gets uprezzed to 720.

 

And 720/60P when broadcast at 1080/60i just gets the P frames split into fields, and 1280 horizontal is uprezzed to 1920 and 720 vertical is downrezzed to 540.

 

And there is NO difference either way in motion reproduction, and the net resolution is about the same. Either way, 1080/60i or 720/60P, motion is sampled 60 times per second: 60 images captured per second. Once 1080/60i is converted to 720/60P, it basically looks the same -- there are no interlaced-scan artifacts. And once 720/60P is converted to 1080/60i, it also looks exactly the same, as if you had shot at 60i in the first place. Not "sort of", but exactly the same. 60 motion samples per second -- the only difference is the display format.
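A back-of-envelope count, using only the standard broadcast rasters (a quick sketch I'm adding for illustration, not from any test), supports this:

```python
# Per motion sample: one 1080/60i field versus one 720/60P frame.
field_1080i = 1920 * 540   # one field = one motion sample: 1,036,800 px
frame_720p = 1280 * 720    # one frame = one motion sample:   921,600 px

print(field_1080i / frame_720p)  # ~1.13 -- net resolution per sample is
                                 # about the same either way
print(60, "motion samples per second in both formats")
```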

 

As for 24P, that's used for its unique film-like motion characteristics, since 24 fps is the standard for film production. If 60 fps were the standard for film, then digital would also have to shoot at 60 to match that look. But it isn't. Has nothing to do with right or wrong, or best or worst, it's just about the look you want to achieve and audience expectations.

 

Trying to compare resolution between 1080/24P and 720/60P is pointless because they aren't used for the same types of projects. And you can't draw a clear connection between pixel resolution and temporal resolution, which not everyone even accepts as a type of real resolution anyway. Not to mention that the majority of 720P production is shot at 24P and only recorded to 60P with redundant frames.

 

 

Go to Hell, David!!!! You just spoiled it for the 90,000 wannabe DPs out there who thought it was all about formats and numbers. How dare you ruin the fantasy. A fantasy that manufacturers have been using to sell cameras to people who really have no need for them, but who think that if they have the most resolution, the 'best' format, the coolest camera, the coolest way of recording, etc., they will actually feel better about themselves and maybe someone will tell them that their existence is okay. Man, this board is turning from a bunch of web-searching amateurs who used to act like qualified DPs into a bunch of qualified DPs who've known all along that none of these formats make a difference. None are better. Yes, (SPOILER ALERT), none are better!!!!!!!!

 

I hope your mother is proud of you!! :)

 

Seriously...

 

It's posts like yours that prove who is qualified to talk here and who needs to be asking more questions rather than cutting and pasting numbers. Now if only the ones who act like web-searching know-it-alls could stop trying to be cool, humble themselves enough to get into legitimate discussions, and maybe ask questions instead of doing web searches and then spitting back numbers like they mean anything, we'd probably have the best single site on the web for cinematographers of all categories and experience levels to talk. Instead we have constant arguments between the know-nothings and those who actually use the stuff in the real world and simply see through all the crap the rest spew. Sorry to sound so down after the last smiling paragraph, but what you said should be posted on every other page of every magazine article and on amateur sites like DVXUSER. Man, these people just don't get it. HD is 'H', but there are far too many conditions in both acquisition and distribution to say that any one number means any more than another.

 

I'm not against folks who are new to this field or who are learning. I love having them in the mix and on sites like this. Lord knows I spend a lot of time helping and teaching through articles, DVDs, lectures, and constant email questions in my spare time, and sometimes on time I don't have. But some of these guys need to simply sit back and watch instead of trying to run the 24 Heures du Mans with little or no driving experience.

 

I had a realization last night. After nearly a year of watching both HD channels and SD channels on a nifty HD set, I'm starting to have a real tough time telling what is HD and what is not. Of course, I'm sitting 9 feet from my set, so I really shouldn't be able to see much of a difference anyway. The contrast of the set alone is good enough to fool 90% of the people watching into thinking it's HD when it's not. And if I didn't have the screen to tell me what resolution I'm watching, as it does when I switch channels, I'd never know the difference. Guess the hype wore off, but then again that is what we said when HD came out: eventually you have sharper pictures and no one notices anymore. Now try to explain that sensation to all those who wear numbers like 1080 and 720 on their sleeve as some sort of validation. That must really confuse them. That should spur some web searching and number spewing...
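The 9-foot observation checks out against simple visual-acuity arithmetic. A rough sketch, assuming (hypothetically) a 50-inch 1080-line set, since the post doesn't give the screen size:

```python
import math

# Assumed 50-inch 16:9 1080-line set; viewer at 9 feet.
diag_in = 50.0
width_in = diag_in * 16 / math.hypot(16, 9)   # ~43.6 in wide
pixel_pitch_in = width_in / 1920              # ~0.0227 in per pixel
distance_in = 9 * 12                          # 108 in

pixel_arcmin = math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60
print(pixel_arcmin)  # ~0.72 arcmin -- below the ~1 arcmin resolving
                     # limit of normal vision, so individual HD pixels
                     # are invisible from that seat
```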



Numbers matter... but only in the context of real-world conditions, unless you are an engineer working in a lab. But even just looking at the numbers alone will tell you that 720/60P and 1080/60i are not that far apart.

 

I have a blood disorder called Thalassemia Minor, in the same category as sickle-cell anemia: a blood disorder that evolved in Mediterranean countries, possibly Asian countries too, probably as a defense against tropical diseases like malaria.

 

It means that my bone marrow releases new red blood cells prematurely. So my individual counts are all off -- I have too many red blood cells (which can be fatal due to crowding), except that they are too small (so no crowding), and they don't have enough hemoglobin in each cell (normally bad), except that I have so many cells! So everything averages out to normal. When I was a kid, the doctors thought I had anemia even though I was healthy.

 

Anyway, the specs of these cameras and formats are a little like that sometimes -- one thing counteracts another, or simply does some things better than other things.

 

What ultimately matters are two things: that the camera person's skills can take the quirks and weaknesses of various cameras and formats into account and either mask them or exploit them; and second, what your eyes can see in terms of those weaknesses and then making judgement calls regarding the relative importance in the overall scheme of the production. I'm not going to apply the same standards shooting a video interview meant for a DVD versus something that is going to be projected on a 70' screen -- and it has nothing to do with caring less about the smaller DVD project, it's merely a matter of caring about the things that matter, about the things that can be seen.

 

I remember this one DP who was proud that he managed to get all the color layers of his film emulsion to perfectly track by using minor color-correction filters, and some lab manipulation... but the scene was not shot that well, could have been lit more interestingly. So it's a matter of putting your efforts in areas that can be seen by the viewer -- and sometimes resolution is something that can be seen and it matters for that particular presentation method, but other times, it's not as big an issue. "The Devil Wears Prada" does not have to look as sharp as "2001" (in fact, it would probably be a bad thing if it were.) A tiny figure in a vast landscape needs more resolution to, well, resolve such fine detail on a big screen, but a big close-up of a middle-aged actress probably should not be too detailed.


Just for the record, I agree with just about everything that David Mullen has said about these issues, and he has said it well.

 

However, since this is the Varicam forum and I own a Varicam....

 

Last month I shot a test of the Varicam and had Efilm make 35mm anamorphic prints from an F-900R and a Varicam. Both cameras made beautiful prints that were very hard to tell apart. But neither print had the detail of a 35mm film original print, especially a 35mm anamorphic original.

 

That said, the test print, in my opinion, had as much detail as many of the release prints I've seen at the local multiplex. The reason, I believe, is that release prints have gone through multiple optical print/negative generations, and the beautiful 35mm detail that we love in the dailies, answer prints, and studio screening prints is lost through the generations. If one can make release prints from multiple filmed-out original negatives from HD originals, then the HD original has a fighting chance in a showdown at the multiplex.

 

Regarding shooting in 60p, I'd love to shoot and release in that format. I kind of like the look (with a shutter speed longer than 1/120 sec, though). But it's not practical, because 60p does not convert well to 50i or 25p for non-US distribution. And come to think of it, 60p converted to 60i would have all the interlace movement artifacts that could have been avoided by shooting at 30p and converting to 60i. Not to mention that it's just too much data for internet distribution, where 24p would be more practical.

 

I thought Showscan looked awesome, except for the unnatural lack of motion blur when each frame was shot at 1/120 sec... When someone waved their arm, it looked like watching 5 freeze frames of the arm at once.

 

-bruce


That said, the test print, in my opinion, had as much detail as many of the release prints I've seen at the local multiplex. The reason, I believe, is that release prints have gone through multiple optical print/negative generations, and the beautiful 35mm detail that we love in the dailies, answer prints, and studio screening prints is lost through the generations. If one can make release prints from multiple filmed-out original negatives from HD originals, then the HD original has a fighting chance in a showdown at the multiplex.

 

Sure, but 35mm would also benefit from not going through an IP/IN generation loss. You have to compare apples to apples, you can't compare 35mm through two extra generations to HD-to-35mm with no IP/IN step. You could do a 4K or even 2K D.I. of 35mm and make multiple "original" IN's for printing just as easily as you could with HD-to-35mm.

 

One of the better HD-to-35mm prints I saw was for "Flyboys", and I believe they made multiple "original" negatives for printing, which led to a discussion between me and Daryn Okada about how 2K origination with direct release prints from the original negative seems comparable to 4K origination with release prints made from an IP/IN step.

 

But ultimately the goal should be multiple original IN's at 4K for release printing.

 

Trouble is that most people shoot HD to save money because their budget is lower, and therefore, the distributor is not going to pay for multiple original negatives to be output. Any HD feature I shot that had more than a few prints made (such as "D.E.B.S.") went through an IP/IN generation for the release prints, despite me asking that they just output more "original" printing negatives. In fact, I have yet to get any of my projects that were shot in HD or went through a D.I. to get the distributor to output more IN's from the digital master.

 

I've had about five 1080P HD features transferred to 35mm and they all looked fairly sharp on their own; it's only in direct comparison to 35mm material that you can really spot the lower resolution. But I don't think that justifies dropping 35mm as the gold standard for resolution for theatrical releases. Truth is that we shouldn't have dropped 65mm for big-budget projects, and now we're fighting to keep 35mm resolution as the standard, so I certainly don't want to see 1080P and 720P established as the new standard for cinema release. The goal should be 4K origination ultimately, and if possible, 4K digital projection so we can start to actually improve the quality of movies in the theaters. Digital technology gives us the potential for improvement in standards if we don't start settling for less. Of course, I realize that for practical reasons, we choose to work at all sorts of quality levels.

 

There are measurable resolution differences between 720P, 1080P, 2K, and 4K. And with newer digital camera technology coming, we don't have to settle for the quality level of the Sony F900 and the Panasonic Varicam for theatrical release work; we can and should demand better tools. Imagine if, back in 2000, Hollywood had decided that the F900 HDCAM format was "good enough" for all theatrical releases. We'd look pretty foolish today.

 

Bruce, would you be happy if you never shot 35mm again and all your work was done on the Varicam? My experience has been that my 35mm work always looks better than my HD work -- I keep hoping for an affordable digital camera to come along where I don't have to feel I'm making too many compromises. I actually LIKE shooting digitally, I like seeing what I'm doing on a big HD monitor, I like not dealing with inaccurate dailies. But I prefer the results I get with 35mm.


I just finished watching American Idol in high definition, shot with the Varicam at 720p60. It looked really good, had no compression artifacts, and it had a very sharp live-theatre look. And it completely refutes the argument that 720p is not real high definition, so you might as well shoot in standard definition because there is only a 20 percent quality improvement between standard definition and 720p. In actuality the quality improvement is more like 400 percent rather than the 20 percent the naysayers love to quote. The problem is that the naysayers count only vertical resolution when counting pixels. Anyway, it is obvious that 720p will hold up on the big screen, but there could be a problem with multigenerational losses, just like when you make a Xerox of a Xerox of a Xerox. But with HD-DVD or Blu-ray distribution there are no generational losses, because these are digital formats, and they also support 720p60, which would make the Varicam an excellent choice for filming action movies.
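For what it's worth, the pixel arithmetic behind those percentages looks like this (a quick check I'm adding; the exact figure depends on which SD raster you count against):

```python
# Total pixels per frame, not vertical lines alone.
sd_ntsc = 720 * 480    # 345,600 px
hd_720p = 1280 * 720   # 921,600 px

print(hd_720p / sd_ntsc)  # ~2.7x total pixels (3x against 640x480) --
                          # well beyond a 20 percent bump, whichever
                          # way you count it
print(720 / 480)          # 1.5x counting vertical lines only
```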


I just finished watching American Idol in high definition, shot with the Varicam at 720p60. It looked really good, had no compression artifacts, and it had a very sharp live-theatre look.

 

But who would want a movie to look like "American Idol"???

 

Anyway, you seem to be arguing with yourself -- most of us here agree that 720P is an accepted high-def format.

 

Sure, it may be interesting to release an action movie on HD-DVD/Blu-Ray that was shot in 720/60P for the smoother "hyper-real" motion as an experiment, although some people are naturally going to think it looks video-ish, like a live TV broadcast, rather than a "movie".

 

And again, anyone watching it on a 1080i monitor is going to think it was shot in 60i. The same goes when it gets broadcast at 1080/60i or 480/60i. You can't control whether your movie gets sold to an HD channel with a 1080i standard versus a 720P standard.


Unless you are making a movie about American Idol. But even then, that movie, if it is ever produced, will probably be shot in 24p, with the exception of the mock live performances, which will be shot in 60p; so it will have hybrid frame rates, since it is a movie about a television show.


it completely refutes the argument that 720p is not real high definition, so you might as well shoot in standard definition because there is only a 20 percent quality improvement between standard definition and 720p. In actuality the quality improvement is more like 400 percent rather than the 20 percent the naysayers love to quote. The problem is that the naysayers count only vertical resolution when counting pixels.

 

 

Thomas,

 

No one here has said that 720p is not 'true HD' (whatever that means). Could you please tell us who these 'naysayers' are, then go and have this boring, repetitive, misinformed argument with them, and leave the rest of us in peace?


My experience has been that my 35mm work always looks better than my HD work -- I keep hoping for an affordable digital camera to come along where I don't have to feel I'm making too many compromises. I actually LIKE shooting digitally, I like seeing what I'm doing on a big HD monitor, I like not dealing with inaccurate dailies. But I prefer the results I get with 35mm.

 

 

Hear, hear. It simply hasn't been equaled in its abilities, and needless to say, that's why Hollywood hasn't moved on from it. And part of it has to do with the fact that we are analog creatures, and analog simply appeals to us. Sort of like the other thread here about the difference between digital noise and film grain.

 

That said, the Varicam can make a great camera for features. Part of the result has to do with the talent behind the camera, part with the method and the people responsible for getting it to whatever finish you choose. I've always thought this comparison method of discussing cameras was fruitless, as we don't normally watch a film that was shot on two different cameras; for the most part, it's about working with the tool you choose (or can afford) and making the best of it. Of course there are situations, such as the film Collateral, where you see different cameras in action and (on the big screen) it becomes clearly noticeable (even though the finishing process tries to balance it all as best as possible). But content drives most of your audience, so work on complementing a good story visually, and no one will care at the end of the day what format you used, if it's good.

Edited by WALTER GRAFF

The goal should be 4K origination ultimately, and if possible, 4K digital projection so we can start to actually improve the quality of movies in the theaters.

 

That is the goal. I don't think anyone is currently stating that it isn't. It's also unaffordable right now to all but the largest studio pictures, for many reasons - as are multiple printing negatives, when compared to the considerably less costly, and much faster to create IP's and IN's. I really don't think anyone is taking their eye off the ball in our industry, it's just that current technology has limitations, very real ones. It's very easy to say something should be done a certain way. It's another thing to justify that in terms of increased revenue if it's going to cost considerably more to achieve it. Unfortunately, show business is, well, a business. And it will continue to be run like a business - a dysfunctional business, but a business nonetheless.


That is the goal. I don't think anyone is currently stating that it isn't. It's also unaffordable right now to all but the largest studio pictures, for many reasons - as are multiple printing negatives, when compared to the considerably less costly, and much faster to create IP's and IN's. I really don't think anyone is taking their eye off the ball in our industry, it's just that current technology has limitations, very real ones.

 

I'm as aware of the financial restrictions as anyone.

 

I just react every time I run across the old "good enough" argument that gets trotted out, usually starting with the phrase "by the time you see something in the theater, it's only 800 lines anyway..." Like I said, the logic problem there is that degradation between post and release doesn't mean one should start out at 800 lines of resolution just because you're only going to end up with 800 lines.

 

It goes back to what I said about NTSC broadcast being able to differentiate between 35mm and Super-16 even though it's only 720 x 480, well below the resolution levels of those film formats. There are advantages to oversampling.
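A toy one-dimensional sketch of that oversampling advantage, with invented numbers, purely for illustration:

```python
import numpy as np

# Capture a fine bar pattern on a dense grid and filter down to 800
# samples, versus sampling at 800 points directly with no pre-filter.
fine, coarse = 4000, 800
x = (np.arange(fine) + 0.5) / fine
pattern = np.sign(np.sin(2 * np.pi * 300 * x))   # 300-cycle bar target

# Path A: oversample, then box-average down (a crude anti-alias filter;
# a proper filter would do even better).
oversampled = pattern.reshape(coarse, -1).mean(axis=1)

# Path B: sample coarsely up front, no pre-filter.
x_lo = (np.arange(coarse) + 0.5) / coarse
direct = np.sign(np.sin(2 * np.pi * 300 * x_lo))

# 'oversampled' keeps a softened but honest version of the bars;
# 'direct' aliases the square wave's harmonics into spurious
# low-frequency beats -- "detail" that was never in the scene.
```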

 

It's also similar to the argument that if HD broadcast is so heavily compressed, then the original photography can use the same level of compression.

 

We don't originate at the resolution (or compression level) of the final worst-case distribution format.

 

I actually do run across a number of people in the industry who don't have their eye on the ball and have fallen into a "2K is good enough, 4K is ridiculous" mentality, rather than a "2K is what is practical now, 4K is the future" mentality. I worked with an efx supervisor who swore that there was no difference between 2K and 4K, and then proceeded to deliver some of the blurriest visual efx I have ever seen, barely 1K-looking at times! So, as you can imagine, I decided that the problem was that he couldn't see the difference.


I think I have to clarify a few things...

 

Phil Gerke, sorry: instead of pressing the quote button I pressed reply...

 

David, as my signature states, above all I'm an electronics engineer with more than 20 years of R&D behind me, the last 7 of them in the AV industry (TV & film), so my engineering background doesn't allow me to talk without measurements and numbers. The art of cinema wouldn't be here if the art of engineering hadn't enabled it in the first place, many years ago. So cinema has evolved along with engineering through the years (first B&W, then sound, then colour, etc.), and since the engineering comes first, the numbers are always relevant.

 

BUT the art of MAKING CINEMA is a different thing, and there I agree with you.

 

The storytelling has nothing to do with engineering numbers (except financial numbers, which always apply!).

 

We have to try for the better and bigger every time, so I agree that we have to pursue 4K acquisition and projection (and even 3D). I'm waiting to order my RED as well...

 

So when it comes to "How are we going to light that scene?", I back off, watch the art happening in front of my eyes, and let the maestro (the DoP) do his magic*.

 

But when we engineers are trying to make things bigger and better, subjective evaluation is NOT sufficient. There are things we have to measure, for instance things where physics is the limiting factor, like projection systems and particularly projection lens systems.

 

Did you know that on my test print with the resolution chart, the limiting factor was the projection system and not the generations? I examined my negative under a microscope and had 1700 lines visible, and 1600 lines in my positive!!! And in projection, 700**!

 

So NO, the film process isn't the problem...

 

So let us now discuss something more critical. The 35mm aperture is one very small slot through which a tremendous light flux has to pass uniformly, to be projected 80 m away onto a 400-square-metre screen that has to show at least 2000:1 contrast to be adequate for theatrical viewing.

 

The projection lens system has a big problem to solve. It's a give-and-take game: the more light power, the less MTF, aka less resolution. That's why, when you see a screening at the lab, which is always on a much smaller screen than your nearest multiplex, everything looks more detailed. In the real theatre, where our clients (the audience) see our works of art, there are considerable obstacles that don't allow us to see more resolution: old projection machines, projection lenses not replaced every six months (the lenses get burned out by the light flux), bad focus, etc.

So it is easy to understand that however much resolution we pump into the projectors, they will not be able to project it, unless some major lens technology comes out and solves the problem.

 

Until then, a 65mm projector will deliver much more resolution for our eyes to see.

 

And we are coming back where we started:

 

"So from my point of view, things like 'latitude', 'tonal reproduction', '35mm bokeh' and others (even a good scenario or outstanding photography) matter more in audiovisual masterpieces than resolution."

 

And on that I think we all agree, since the current state of the art already reaches the 700 lines that a normal projector can show.

 

In my facility, on the same machines, I have done 2K DIs from 35mm film as well as Varicam transfers. I can easily say that the difference is only subjective when we compare like with like: 1.85:1 HD against 1.85:1 3-perf 35mm film, with similar lighting conditions and subject (except for the bokeh).

 

But remember, to do that you have to handle the Varicam in the correct way.

 

"... in order to expose correctly with the Varicam you NEED a grey card and a SPECIAL exposure chart from Panasonic. And, when you colour grade, you need to apply a special Cineon curve (again sourced from Panasonic) in order to do the reverse-telecine transform on the fly in the LOG colour space (YES, the Varicam records images in a LOG way)."

 

As I wrote in another post

 

These are my two cents...

 

Regards,

 

* In a very good cinematic HD package I don't think a DIT has a place on the set; just a light meter, and light as if you had positive stock in the camera, not negative. That's my opinion.

 

** Anamorphic projection has more detail because it uses a bigger aperture than 1.85:1; instead of a 24mm x 13mm gate you have 24mm x 18mm = more light, better MTF.


David, additionally:

 

I agree that oversampling is for the best, but where and when it's needed, not everywhere indiscriminately.

 

Because it's a waste of resources, and sometimes producers want to put more money and resources into costumes, for instance, or a better set, or even better VFX, rather than into wasted 4K processing done just for the eyes of the hard disks the footage will lie on until it is erased for the next project...

 

Everything has to have a justification, and I sympathise with what has happened to you in the past, but it is not the best case for justifying the 4K-everywhere argument.

 

We have to give storytelling a chance and not waste money on big, bigger, and biggest without meaning.

 

I don't have anything personal against you, David; on the contrary, I read a lot of your writing carefully and I admire your work, but this is my professional opinion.

 

Regards,


Bruce, would you be happy if you never shot 35mm again and all your work was done on the Varicam? My experience has been that my 35mm work always looks better than my HD work... But I prefer the results I get with 35mm.

 

David, thanks for the great reply!

 

Would I be happy if I never shot 35mm and had to shoot everything on a Varicam? Yes, if that meant I always had a job :D But seriously, shooting with an 8-bit camera where you have to nail the exposure and lock in the look on the set is not really how I like to work. I'd much rather shoot in 4K with a raw workflow and view the image via look-up tables on the set. I really like the way digital images can look when they are of high quality. Given the choice between shooting a picture on 35mm film and the Varicam, I would usually choose the film. With 16mm film, I would think about it and weigh each alternative.

 

I guess my original point was that the Varicam is a great tool and that the image quality possible on the big screen is not very different from many release prints. And that this has much to do with the poor quality of the release prints. Of course, an HD feature that goes through all the optical generations will look even worse at the theater. I saw Zodiac last week and I thought the film print lacked resolution and looked like mud for most of the picture. I did like the movie though :)

 

Shooting 8-bit HD with a Varicam or a Sony F-900 is a good choice for projects with a small chance of theatrical release and limited budgets. The images from these cameras will play very well when distributed in HD and hold up well enough that a theatrical distribution is possible. I think the Varicam in particular is a good choice for truly independent projects because of the DVCPRO HD postproduction workflow.

 

I really miss the days when the major 35mm films were released on 70mm prints (at least at a few cinemas). I remember watching "The Right Stuff" in 70mm and being overwhelmed by the quality of the print and the photography. I think this practice ended when they found a way to play surround sound from a 35mm print, and it's really a shame. If we want people to pay $10 and up at the cinema, we should at least be offering the visual experience that was available more than 20 years ago.

 

I saw "Fame" when first released in 70mm at the Zigfeld theatre in NYC. It just looked awesome on the very giant screen with the surround sound and the music....That experience might be why I'm not an economist today :blink:

 

-bruce


I just react every time I run across the old "good enough" argument that gets trotted out, usually starting with the phrase "by the time you see something in the theater, it's only 800 lines anyway..."

 

 

If I had a nickel for every time a producer asked me if their shaky handheld footage from the HDV will cut together with the F900...

 

There used to be something called "broadcast standard", which was never clearly defined. But now, when you've got so many different formats and little cameras being thrown around by directors who think they are qualified cameramen, or cheap PAs who are young auteurs in their own heads, it is difficult to continue the fight for quality. In one sense, they hire me because they know what I can do and like the final product. But then they'll undermine the project (and ultimately what I'm doing) by casually throwing in an additional "little cheap camera" operated by the aforementioned auteur.

 

So, I guess the point is that it isn't just the technology which "threatens" overall quality, but it is also the prevailing attitude that ANYTHING with a lens anymore is accepted as being "good enough." We (I) can strive to get the best, but ultimately, when someone else is paying the bills and making those choices, all I can do is make sure that I've dotted my i's and crossed my t's with my own work. I presume that's all any of us can really do. The only other option is to walk away from the job.


I feel that I have to add some more info.

 

It's not that I'm biased toward the Varicam, but I think very few DoPs in this forum have worked with the Varicam properly (after all, we are in the Varicam sub-forum), probably because of a misunderstanding, a lack of documentation on Panasonic's side, or misleading info from various sources.

 

Given the choice between shooting a picture on 35mm film and the Varicam, I would usually choose the film. With 16mm film, I would think about it and weigh each alternative.

According to our tests, if I were to score every format based on how it looks in a 35mm projection, it would be:

 

35mm DI, 4K scan to 2K process = 10,000

35mm DI, 2K scan to 2K process = 9,750

RGB cameras = 9,300 - 9,600

Varicam, 10-bit Cinema Log DI to 35mm = 9,100

F900R, 10-bit Cinema Log DI to 35mm = 8,900

35mm DI, 2K telecine scan to 2K process = 8,600

Varicam tape, Cinema Log DI to 35mm = 8,500

F900R tape, Cinema Log DI to 35mm = 7,800

35mm, old analogue way = 7,600

16mm DI, 2K scan to 2K process = 5,250

16mm DI, telecine HD to HD process = 4,500

DV from DVX100 to 35mm = 2,700

16mm, old analogue way to 35mm blow-up = 2,500

Digital Beta to 35mm blow-up = 1,000

 

This is our assessment of the formats.

 

If we add cost to the benefit graph, only the Varicam is the winner. An observer can see that the differences at the top are so small that they fall within the margin of error.

 

Shooting 8-bit HD with a Varicam or a Sony F-900 is a good choice for projects with a small chance of theatrical release and limited budgets.

We will discuss the 8-bit issue later.

 

We are based in Greece, a country of 10 million people. A Greek blockbuster sells no more than 1.2 million tickets, and that happens once every few years. The absolute record is Titanic, with 1.8 million. The typical ticket count is between 10,000 and 300,000, and the profit to the producer is about $2.70 per ticket. So the A-class movies have budgets of 300k to 800k (the top end is very risky) and the B-class 50k to 250k. The productions with budgets over a million can be counted on one hand in the history of Greek cinematography.
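To make the arithmetic explicit, a quick check using only the numbers above:

```python
# Break-even ticket counts at ~$2.70 producer profit per ticket,
# using the budget figures quoted above.
profit_per_ticket = 2.70
for budget in (50_000, 250_000, 300_000, 800_000):
    print(f"{budget:>7} budget -> ~{budget / profit_per_ticket:,.0f} tickets to break even")

# A 300k budget already needs ~111,000 tickets -- near the top of the
# typical 10,000-300,000 range, which is why bigger budgets are risky.
```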

 

Two thirds of the world's cinematographers live and work in this type of market.

 

When you discuss things like this in a global medium like the internet, imagine what a Romanian can do, or a Nigerian, a Vietnamese, a Dutchman, or even a Greek. The whole world is not just the USA and the English-speaking regions.

There are people who shoot only two takes per scene in order to make it on 35mm, because they are trying to make "FILM" with a 50k budget for film costs (including rentals)...

 

So the evolution of digital cinematography will help all these people express themselves much better, and it is our duty as engineers to assist them, raising the quality of their storytelling by teaching them how to maximize the performance of tools like the Varicam.

 

But seriously, shooting with an 8-bit camera where you have to nail the exposure and lock in the look on the set is not really how I like to work.

To shoot with the Varicam (at least the way we set up the camera), you have to use your light meter as if you had positive film (with negative it is better to overexpose one stop, but HD behaves like positive, so it's the other way around), tungsten or daylight (the settings we load are pre-balanced), at EI 640 (-3dB, MG 0.35, DL 500%); keep the iris between 4 and 2.8; do no white balance, only black balance; and look at the monitor just to frame and check focus. Light as if it were a film camera with over 10 stops of latitude (YES, the Varicam has over 10 stops of latitude!), and never light by the monitor. Better to use your eyes (HD monitors don't have the latitude; they are like the video assist on film cameras). Use lenses as if it were 16mm, and that's all; the rest happens in the DI suite. You can sleep confident that everything is OK.

 

Do the editing in your favourite Avid. Export an OMF from the timeline and bring it along with the original tapes to us.

 

In the DI suite we rebatch the whole project from the original tapes as uncompressed 10-bit at the native 1280x720 resolution. We reconstruct the whole timeline in FCP and then send it to Final Touch 2K. We work in a 10-bit log colour space by loading a special Cineon reverse-telecine curve in the CFX room of FT. We colour grade it like a 2K film DI, with Cinespace and our custom LUTs. And the trick comes from that point on: we render using RGB colour space instead of YUV, at 32 bits per colour, while at the same time up-resing the images to 2K. This technique forces FT to resample everything at 32 bits per colour. At that point the image gains tremendous tonal clarity and the noise is compressed. It is as if we applied pre-emphasis while recording to tape and de-emphasis while rendering, the way Dolby has done in audio for years. At the final stage, during film printing, we add sharpening, because detail was turned off during shooting.
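Panasonic's reverse-telecine curve is a proprietary LUT, so as a stand-in illustration of what such a log decode does, here is the standard Kodak Cineon 10-bit log-to-linear conversion (my example, not the actual Panasonic curve):

```python
import numpy as np

def cineon_to_linear(code, black=95, white=685):
    """Standard Kodak Cineon 10-bit log-to-linear decode: 0.002 density
    per code value, 0.6 display gamma, black at code 95, white at 685.
    Shown only to illustrate the shape of a reverse-telecine log curve."""
    code = np.asarray(code, dtype=np.float64)
    gain = 1.0 / (1.0 - 10.0 ** ((black - white) * 0.002 / 0.6))
    offset = gain - 1.0
    return gain * 10.0 ** ((code - white) * 0.002 / 0.6) - offset

print(cineon_to_linear([95, 470, 685]))  # ~[0.0, 0.18, 1.0]
```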

 

If you record uncompressed 10-bit while shooting, the results are far better.

 

I think very few of you have worked like that on your HD projects, and this is the basis of all your problems with HD for cinema shooting.

 

Waiting for comments.

 

Regards,


35mm DI, 4K scan to 2K process = 10,000

35mm DI, 2K scan to 2K process = 9,750

RGB cameras = 9,300 - 9,600

Varicam, 10-bit Cinema Log DI to 35mm = 9,100

F900R, 10-bit Cinema Log DI to 35mm = 8,900

35mm DI, 2K telecine scan to 2K process = 8,600

 

I'm not going to get into a prolonged argument with you since I didn't see your tests, but I have a hard time believing that a 2K 10-bit Log RGB scan of 35mm color negative under-performs something shot in 8-bit HD, regardless of how you processed the information.

 

8-bit HD camcorder photography has: (1) less exposure information, (2) less color information, (3) less resolution, and (4) more compression than a 2K RGB scan of 35mm -- that's not my opinion, those are just basic facts.

 

You can't create information that didn't originally exist, and color negative is capable of capturing more information. So in theory, whatever special processing you do to improve 8-bit HD origination should work even better when starting out with more information all-around. So I can only theorize that the 2K scanning of the 35mm frame for your test was poorly done, or there was some other flaw in the post chain.

 

I mean, a 2K scan of 35mm means each color record is 2048 x 1556, whereas a Varicam frame recorded to 4:2:2 DVCPRO-HD is something like 1280 x 720 (actually, I think less than 1280 horizontal) just for green, and half that for red and blue (640 x 360). Not to mention the compression of DVCPRO-HD versus uncompressed for the 2K scan.
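Putting rough sample counts on that comparison (the film-scan figures are standard; the Varicam figures follow the post, and DVCPRO HD actually records 960 x 720 luma for 720p, so the real gap is wider still):

```python
# Samples per frame, before even counting compression.
scan_2k = 3 * 2048 * 1556               # RGB film scan: ~9.56M samples
varicam = 1280 * 720 + 2 * (640 * 360)  # Y + Cb + Cr, per the figures above

print(scan_2k / varicam)  # ~6.9x more samples per frame for the 2K scan
```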

