HDV on the Big Screen


Guest Charlie Seper


  • Premium Member
Well-shot hi-def looks every bit as good as film, whether on a TV set or a big screen. The differences are so minimal that it would take quite a sissy to squawk about it.

Considering your department, you must forgive me if I dismiss this statement of yours as uninformed and worthless. But on the other hand, you've just managed to insult a forum full of camera people...


  • Premium Member

Somewhere between the "all video is crap" and the "HD is as good as 35mm even for the big screen" lies the reality.

 

HD can look very good on the big screen -- for example, "Sin City", "Revenge of the Sith" -- but it has its own look and in measurable ways, it is not as good as 35mm color neg in terms of resolution, exposure range, color depth, plus film doesn't have inherent digital artifacts. On the other hand, film has a number of its own artifacts, some of which can be considered negative depending on who you talk to.

 

But anyone who says that HD and 35mm are equivalent in technical quality is ignoring fairly basic technical measurements of quality! Not to say that HD's deficiencies can't be masked or minimized if care is taken, and its more limited exposure range compensated for in lighting, exposure, filtering, etc.

 

On the other hand, anyone who dismisses HD as a crappy-looking format is expressing a bias. I've seen what would be considered traditionally "beautiful" images shot on HD. It just doesn't look exactly the same as film, so it depends on how much that fact bothers you. For some, the more "plasticky" clean look of digital images is not a problem and the grainier look of film is.

 

But none of this gets around the fact that HD falls short of matching 35mm in many technical ways.

 

Now if you're not shooting for theatrical projection, but for electronic presentation on home systems, you have more leeway to choose digital over film, although the differences are still visible.


Somewhere between the "all video is crap" and the "HD is as good as 35mm even for the big screen" lies the reality.

 

HD can look very good on the big screen -- for example, "Sin City", "Revenge of the Sith" -- but it has its own look and in measurable ways, it is not as good as 35mm color neg in terms of resolution, exposure range, color depth, plus film doesn't have inherent digital artifacts.

 

Yep.

 

Film and digital video are different beasts. It's going to take a few dozen Siths and Sin Citys for people to start appreciating digital cinema as what it is - a new format.

 

Right now the trend is to make video footage look as much like film as possible. This is natural, as more or less all high-profile work has been done on film so far. We've been trained to watch film for the last century.

 

So, we do all sorts of stuff to make video look like film - reduce the frame rate, add grain and scratches, adjust the color curve, etc. In practice, we try to mimic film's artifacts.
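To make that concrete: a minimal sketch of such a "film look" pass (Python/NumPy; the function name, parameters, and curve are my own illustrative choices, not anyone's actual pipeline):

```python
import numpy as np

def film_look(frames, grain_strength=0.03, seed=0):
    """Toy 'film look' pass: halve the cadence, add grain, push the
    midtones with an S-curve. Frames are HxWx3 float RGB in [0, 1]."""
    rng = np.random.default_rng(seed)
    out = []
    for frame in frames[::2]:  # e.g. 50 fps -> 25 fps cadence
        # Film-style noise on top of the clean video image.
        grainy = np.clip(frame + rng.normal(0.0, grain_strength, frame.shape), 0.0, 1.0)
        # S-curve: darker blacks, lighter whites, more midtone contrast.
        curved = 0.5 + 0.5 * np.tanh(4.0 * (grainy - 0.5)) / np.tanh(2.0)
        out.append(np.clip(curved, 0.0, 1.0))
    return out
```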

 

If it were the other way around (all high-profile work had been done on video, and all reality shows had been shot on film for the last 100 years), we'd probably be discussing the poor motion reproduction of 24fps film, its ugly graininess, dust and scratches, etc.

 

And people would shoot 60 fps (video cadence), very slow-speed (no film grain), small-format film (bigger DOF) to get it to look like video... and the video dudes would say "it'll never look as good" ;-)

 

***

 

Anyway, one thing worth taking into consideration is the quality of the prints we actually see at theaters. Here in Finland, we probably get copies of the blockbusters at least one generation poorer than you do in the US, because of the added subtitles.

 

So, when I said that, e.g., Harry Potter's technical quality (sharpness etc.) didn't look that much higher than my HDV-to-35mm print's, it may very well be because of that generational loss - I'm pretty sure my spot was a first-generation positive from the film-out.

 

When it comes to films I've seen in theaters, "Revenge of the Sith" as a digital projection was totally in its own class when it came to apparent technical quality. Very sharp, very low noise, etc. So, one could argue it was BETTER than film.

 

It'll be interesting to see the future development of digital cinema - there are already cameras that shoot at 4K resolution, and really high-def, really high-dynamic-range stuff is waiting just around the corner...

 

 

***

 

BTW, we also did a "Guess if this is film or video or 3D" thingy at a company I used to work for -

 

http://www.matheus.fi/Quiz/QuizNav01.html

 

The examples are in PAL D1 resolution. All the film examples in there are from 35mm scans; all the video examples were shot by me on DVCPro50, 50i, de-interlaced in post. All the 3D is LightWave - by me too. It'd be interesting to hear if anyone got them all right on the first go. ;-)


  • Premium Member

1.5 wrong, and I didn't spend long. So there is a visible difference even looking at JPEG stills. Not to say one or the other looked better in those examples. Dumb anything down enough and it will all look the same, and SD TV presentation is dumbing down both film and HDV. A Goggomobil is as fast as a Lamborghini in a traffic jam, which logically leads to the conclusion that the former is just as 'good'. On an open road...


1.5 wrong, and I didn't spend long. So there is a visible difference even looking at JPEG stills. Not to say one or the other looked better in those examples. Dumb anything down enough and it will all look the same, and SD TV presentation is dumbing down both film and HDV. A Goggomobil is as fast as a Lamborghini in a traffic jam, which logically leads to the conclusion that the former is just as 'good'. On an open road...

 

Cool - what was the telltale sign that made you think "video"??

 

BTW, none of the video stuff was HDV (HDV didn't exist as a format when these were made a few years back); it was all originally interlaced D1, shot with DVCPro50, so it wasn't actually dumbed down at all...


  • Premium Member

The trouble with the argument -- that since duping film through multiple generations for release prints and running it through typically mediocre projectors means a loss of resolution (which is true), HD is therefore just as good as 35mm -- is that it does not take into account that if you START with less resolution by using HD, you end up at an even LOWER resolution once the HD image also goes through the same steps to get to movie theaters.

 

So your bad print of "Harry Potter" would look even worse if it had been shot on HD and then gone through the same degrading steps as the 35mm image did, because you'd be starting out at a lower point to begin with. Bad post and print presentation does not act as some sort of equalizer on the big screen for various formats. If you start out with a better image, it might survive the degradation a little better.

 

So you can't use the resolution of final print presentation (sometimes less than 1K) as a guide for what resolution to capture the image at.
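To put rough numbers on that argument, here's a back-of-envelope sketch using the old quadrature rule of thumb for cascaded imaging stages (resolving powers combine as 1/r² sums). Every per-stage figure below is an illustrative guess, not a measurement:

```python
import math

def system_resolution(*stages_k):
    # Quadrature rule of thumb for cascaded stages:
    # 1/r_sys^2 = sum(1/r_i^2), each r in "K" of horizontal resolution.
    return 1.0 / math.sqrt(sum(1.0 / r ** 2 for r in stages_k))

# Guessed duplication chain: interpositive, internegative, print/projector.
dupe_chain = (2.5, 2.5, 2.0)

print(f"~4K 35mm neg through the chain: {system_resolution(4.0, *dupe_chain):.2f}K")
print(f"~1.9K HD camera through the chain: {system_resolution(1.9, *dupe_chain):.2f}K")
# Both land well below where they started, but the higher starting
# point still ends up measurably higher -- the chain is no equalizer.
```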

 

Plus some of us -- like me -- actually make the effort to see movies in good theaters and I'll be pissed if the film industry decides that the quality level of the worst cinemas in the world is the level they should be aiming their work at.

 

I'm not against HD, having shot eight features in that format, but I'm well aware of its weak areas compared to 35mm. I think it can look pretty good if shot, posted, and presented well -- but then, the same argument can be made for Super-16. That doesn't mean I'd rather shoot in HD or Super-16 instead of 35mm just because, if I work hard enough, I can get decent enough results to fool a lot of viewers. I'd rather work hard to make 35mm look like 65mm, you know what I mean? Rather than work to make HD and Super-16 look like 35mm.

 

But there is also an aesthetic bridge here between digital and film images that some people won't cross. Maybe that's good, in the sense that it forces digital manufacturers to keep working on improving the technology to compete with 35mm better, rather than simply saying "just love it for what it is".

 

But I suspect that digital will keep getting better and better, yet it won't ever look EXACTLY like 35mm. So in the end, people will take that last step towards accepting the digital look because by then it may equal or surpass 35mm resolution and be nearly equivalent in exposure range and color depth, but still be very clean-looking, with the high-tech plastic quality that some people hate -- so they'll either have to get rid of their prejudices or find a way to keep shooting on film.

 

Oddly enough, talking to both Fuji and Kodak sales reps, sales of motion picture negative stock in 16mm and 35mm have been rising lately, not falling. Kodak's indie division is selling almost double the camera stock it was a few years ago. Fuji is up 30% over previous years. Even they aren't sure why more people are shooting film, in higher volumes than ever.

 

I'm starting to form a theory that the general demise of film is a decade or so in the future, but instead of being really gradual, it will reach a tipping point and the switch to all-digital will be very fast once the infrastructure favors all-digital production, post, and distribution. And even after that, I don't think film will completely die off -- it will just become more of a boutique item used by serious art types rather than mainstream production.


  • Premium Member
When it comes to films I've seen in theaters, "Revenge of the Sith" as a digital projection was totally in its own class when it came to apparent technical quality. Very sharp, very low noise, etc. So, one could argue it was BETTER than film.

I have also seen ROTS projected in 2K and it was definitely not better than the 35mm projection that I am used to.

 

 

Cool - what was the telltale sign that made you think "video"??

 

Very poor handling of highlights. Always a dead giveaway.


  • Premium Member

 

Cool - what was the telltale sign that made you think "video"??

 

 

 

Very poor handling of highlights. Always a dead giveaway.

 

And the edge enhancement, and lack of shadow detail.


Guest Charlie Seper

"All you are showing is its possible for a major film to look bad. Not all DP's are equal!"

 

I'd be more than happy to do another test using film stills from nothing but motion pictures that won high awards for cinematography and leave them uncompressed. The so-called "poor highlights" and "edge enhancement" should be a cakewalk for you people, unless you're stupid enough to make claims your superman eyes can't back up. How about it?

 

And the tiny bit of compression artifacting in what I've already posted would have nothing whatsoever to do with making film stills look less like film. Compression has nothing to do with edge enhancement and highlights, if that's your beef. Compression at worst would make certain things look a tad pixelated, and you'd have to blow up the stills to see even that to any degree worth mentioning. If I shot video at 60i, didn't bother doing any gamma adjustments, deinterlacing, etc., and posted it next to a still from 35mm film, you would see the differences quite well no matter what amount of compression I used. Compression should have nothing whatsoever to do with being able to tell film from video unless you blow it up. In short, if you can't tell which stills are from film and which are from video in my test at 50% compression, you'll never be able to do it at 100%.


Guest Charlie Seper

"Anyway, one thing worth taking into consideration is the quality of prints we actually see at theaters."

 

That's another thing that makes me cringe when these fruits and nuts want to argue their lives away about how damn great film is. When was the last time you went to a theatre, looked up at the projection booth, and didn't see a filthy window? Put that beside the fact that some prints are better than others and that different projectors make films look different (even the bulb used makes a difference), and all the money you wasted on film becomes a non sequitur. Oh, and there are fewer and fewer people going to cinemas too. Most people will see your movie on their TVs, and that's a trend that's going to grow with the advent of home theatre systems.

 

"... there are already cameras that shoot at 4K resolution, and really high def, really high dynamic range stuff is waiting around the next corner..."

 

I keep wondering what their argument will be (oh, but if they have a vested interest in film they'll surely come up with one) when video can outperform film. That day can't be far off.

 

I also hear people talk about all the color that film has to offer, but I've never seen anyone who could tell 4:4:4 from 4:1:1. I could go out and shoot stills with my Olympus at either 4:2:2 or 4:1:1 and post them uncompressed. I'll never believe anyone could see any difference at all. So much of what film does is simply overkill. The majority of differences between film and video are things no human eye could begin to see.


  • Premium Member
I keep wondering what their argument will be (oh, but if they have a vested interest in film they'll surely come up with one) when video can outperform film. That day can't be far off.

 

I also hear people talk about all the color that film has to offer, but I've never seen anyone who could tell 4:4:4 from 4:1:1. I could go out and shoot stills with my Olympus at either 4:2:2 or 4:1:1 and post them uncompressed. I'll never believe anyone could see any difference at all. So much of what film does is simply overkill. The majority of differences between film and video are things no human eye could begin to see.

 

Hi,

 

HD video today is better than 16mm film from the 1970s. Film, lenses & telecines have improved over the last 30 years. Film is a moving target!

 

Try making a bluescreen test at 4:4:4 vs. 4:2:2 vs. 4:1:1. The keyer will see the difference even if you can't!

 

Stephen


  • Premium Member

I'm sorry, Charlie, but these arguments of yours are getting really tedious. If you can't see the difference or don't care, don't expect everyone else to lower their standards to your level. If you are happy with the way HDV looks, fine. But if I were you, I'd seriously wonder why most films are still shot on 35mm, despite the added cost.


Charlie,

 

As a "video guy" who has learned a lot about lighting and production from these "film guys," your utter rejection of reality makes me ashamed to be a "video guy." You're giving us a bad name, and I hope people don't hold it against me (us).

 

Josh


And the edge enhancement, and lack of shadow detail.

 

...

 

Very poor handling of highlights. Always a dead giveaway.

 

Mmm... that's pretty much it. But...

 

Actually, in these examples, those are all partly things I could be blamed for as the colorist/post dude, rather than faults of the medium.

 

The originals were rather low contrast; there was plenty of detail in the shadows, and a bit more in the highlights too. It's just that the high-contrast look is/was the trend of the day - so it was deliberately graded to look this way in color correction.

 

The same mostly goes for the sharpening - I always shoot with very low detail settings and sharpen in post. On a video monitor, that edge enhancement is less visible - the stuff just looks crisp.

 

Since these were made, the arrival of HD/HDV and better sharpening algorithms has luckily reduced the need for unsharp-mask-type sharpening of video. Which is good. Very good.

 

One cool thing about those 35mm shots in the examples was that I got two transfer passes to DigiBeta - there was a special "highlight" pass that was otherwise underexposed. I then used hand-drawn masks to blend between the two. I must admit, that's something that would have been nearly impossible to get with video (without altering the lighting).

 

I have also seen ROTS projected in 2K and it was definitely not better than the 35mm projection that I am used to.

 

I guess that depends on taste - I happen to like the "high-tech plastic quality". Film grain etc. is something I consider an "effect" that can be added in post if desired - probably because I've mostly worked on stuff that originates on video and ends up on broadcast TV.


  • Premium Member
Actually, in these examples, those are all partly things I could be blamed for as the colorist/post dude, rather than faults of the medium.

 

The originals were rather low contrast; there was plenty of detail in the shadows, and a bit more in the highlights too. It's just that the high-contrast look is/was the trend of the day - so it was deliberately graded to look this way in color correction.

We're not going to blame this on the colorist now, are we? Highlights are a problem with video. Period. No postproduction can fix the ungraceful clipping of highlights that video exhibits.
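A toy model of why (Python/NumPy; both transfer curves are invented for illustration, not real camera or film characteristics):

```python
import numpy as np

def video_transfer(exposure):
    # Linear video response: anything past "white" clips hard to 1.0.
    # Once values sit at the clip, no grade can separate them again.
    return np.clip(exposure, 0.0, 1.0)

def film_shoulder(exposure):
    # Toy film-style shoulder: highlights roll off asymptotically,
    # so detail above "white" is compressed rather than destroyed.
    return 1.0 - np.exp(-exposure)

over = np.linspace(1.0, 3.0, 5)   # exposures from "white" up to 3x over
print(video_transfer(over))       # [1. 1. 1. 1. 1.] -- flat, detail gone
print(film_shoulder(over))        # still rising -- gradations survive
```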


  • Premium Member
Anyway, one thing worth taking into consideration is the quality of the prints we actually see at theaters. Here in Finland, we probably get copies of the blockbusters at least one generation poorer than you do in the US, because of the added subtitles.

 

The prints shown in Finland are usually laser subtitled, so there is no additional generation of duplication or bi-pack subtitle printing required compared to Hollywood features shown in English-speaking countries. Prints may be made by Finnlab (Helsinki) or other European labs (e.g., Technicolor London/Rome/Spain, Deluxe London/Rome, etc.), or even recycled from US theatres.


  • Premium Member
Prints may be made by Finnlab (Helsinki) or other European labs (e.g., Technicolor London/Rome/Spain, Deluxe London/Rome, etc.), or even recycled from US theatres.

 

Since Harry Potter gets released at the same time worldwide, you can bet it is a brand-new print.


I also hear people talk about all the color that film has to offer, but I've never seen anyone who could tell 4:4:4 from 4:1:1. I could go out and shoot stills with my Olympus at either 4:2:2 or 4:1:1 and post them uncompressed. I'll never believe anyone could see any difference at all.

 

Oh, you can see it too. Shoot a car's red taillights at night, or anything else that has bright red or blue details against black. At 4:4:4 they look OK; at 4:1:1 you'll clearly see very blocky colors.

 

4:1:1 will fool the eye most of the time, but not always.

 

Another thing is keying, as mentioned. There's a world of difference there. I should know - I'm currently working on a 200+ shot blue/greenscreen production, shot with HDV.

 

Luckily, this will be mastered on PAL D1, so after scaling the HDV down we have more or less 4:4:4 color, and the keys look quite good.

 

4:1:1 (or 4:1:0) material can be processed to look decent at full rez (e.g., by median filtering the color channels), but that only works if you keep it at 4:4:4, or at least 4:2:2, from there on.
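A sketch of that kind of cleanup (Python with NumPy/SciPy; the helper names are mine, and it assumes an HxWx3 float YCbCr array -- an illustration of the idea, not production code):

```python
import numpy as np
from scipy.ndimage import median_filter

def subsample_411(ycbcr):
    # Crude 4:1:1 simulation: keep 1 of every 4 chroma samples per row
    # and repeat it across the group (nearest-neighbour reconstruction).
    out = ycbcr.copy()
    w = ycbcr.shape[1]
    for c in (1, 2):  # Cb and Cr planes
        out[:, :, c] = np.repeat(ycbcr[:, ::4, c], 4, axis=1)[:, :w]
    return out

def clean_chroma(ycbcr, size=(3, 5)):
    # Median-filter only the chroma planes to soften the blocky colour
    # edges; luma (and thus apparent sharpness) is left untouched.
    out = ycbcr.copy()
    for c in (1, 2):
        out[:, :, c] = median_filter(ycbcr[:, :, c], size=size)
    return out
```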

 

As far as that 4K camera goes, I was talking about the Dalsa Origin:

 

http://www.dalsa.com/dc/origin/origin.asp

 

4K resolution, 12 stops of latitude, takes cine lenses directly. This one should already be technically better than 35mm film negative, AFAIK, approaching 65mm film.


We're not going to blame this on the colorist now, are we? Highlights are a problem with video. Period. No postproduction can fix the ungraceful clipping of highlights that video exhibits.

 

Well, *I* was the colorist, and I do take at least half the blame for the highlight clipping in these examples. I deliberately modified the gamma of the video with the notorious "S-curve", increasing contrast in the midtones, darkening blacks and lightening whites - that was the desired look.

 

Also, not all video cameras are equal in this respect. There's been a lot of progress from 5-10 years ago.


  • Premium Member

35mm is roughly 4K equivalent, if not 6K -- 65mm would be more like 8K.

 

It's only because Hollywood has compromised by using 2K for most special effects work that people have started to think 35mm is only 2K, when tests clearly show that it is capable of resolving much more -- hence the move to start doing D.I.'s at 4K.

 

Even right now, I'm doing 2K digital effects for a 35mm anamorphic feature that I'm answer-printing, and I can see the resolution hit of doing all the digital work at 2K when I project the results at the lab.

 

Take a look at this article:

 

http://millimeter.com/mag/video_digital_ci...cial/index.html

 

As for the Dalsa, it's 4K but using a single CCD with a Bayer filter, which means that it has to process the RAW signal to derive 4K RGB, so in practical reality, it's more like 2 to 3K in resolution. However, since it is a grainless image, it looks similar to a 4K scan of a fine-grained 35mm stock. And in terms of graininess, yes, it would resemble 65mm. But not in terms of resolution.
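The back-of-envelope version of that Bayer argument (the 1/√2 and 1/2 factors just follow from the sampling pattern; the real effective resolution depends on the demosaic algorithm):

```python
import math

sensor_k = 4.0                     # Dalsa Origin: ~4K photosites across
green_k = sensor_k / math.sqrt(2)  # green at half the sites -> 1/sqrt(2) linear
red_blue_k = sensor_k / 2.0        # red/blue at a quarter -> 1/2 linear
print(f"green (carries most luma detail): ~{green_k:.1f}K")  # ~2.8K
print(f"red and blue detail: ~{red_blue_k:.1f}K")            # 2.0K
# Consistent with "more like 2 to 3K" of true RGB resolution.
```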

 

Now if we could digitally project 4K from a 4K scan of 35mm, or a 4K digital camera, then THAT would seem more like 70mm print projection.

 

In other words, the Dalsa is the first camera to come close to matching 35mm resolution. And it takes a mini-fridge-sized recorder that can only hold two hours of footage at that quality level...

 

As for Charlie's argument that 35mm is overkill in quality for theatrical features, I'm not sure why he is so desperate to argue for a lowering of standards. We should be working to raise digital cameras to the level of what we currently have with 35mm, not settling for half the resolution and a quarter of the color information and making THAT the standard for cinema. Why in the world should we do that? We'll only end up regretting it.

 

These sorts of oddly militant arguments tend to be made by people who can't afford to shoot with better technology -- which is fine, there's nothing wrong with working within a tight budget -- but who, rather than admit any technical shortcomings in their approach, would rather argue either that (1) the technical shortcomings don't actually exist or (2) they don't actually matter. Sometimes they even try to make BOTH arguments, which is somewhat contradictory.

 

Any visual artist worth their salt doesn't use the standards of the common viewer as a guide to the quality level they wish to work at. If they care about their own artform, they set personal standards that are much higher than the average person's. The average person won't say they care whether a face is well-lit as long as they can see what's happening on screen, but even average filmmakers care about such things -- that's the nature of art and entertainment: we put a lot of unnoticed craftsmanship into our work so that the viewer can just enjoy it. We create the ILLUSION that it was all easy. I really don't care if the average person can see the difference between 1K, 2K, 4K, or the difference between 100 ASA and 500 ASA stock, or the difference between 1.85 and 2.35, or the difference between whether a camera is slightly above or below the actor's eyeline, or whether the scene is slightly cool or warm, etc. The point is that WE care so that the audience doesn't have to.

 

So why even go into filmmaking if you don't care about the quality of what you do beyond the level of a non-filmmaker??? WE'RE supposed to be the experts here, not the audience.


So your bad print of "Harry Potter" would look even worse if it had been shot on HD and then gone through the same degrading steps as the 35mm image did, because you'd be starting out at a lower point to begin with.

 

Don't get me wrong - it didn't look "bad". It looked rather good, just as good as any other 35mm print I've seen, and the theater I saw it at was about the newest in town, so it should supposedly be technically good. As others have pointed out, there are NO additional generations due to subtitling - that was my mistake.

 

It's just that the release print is always inferior in quality to the original - and there's really nothing that can be done about that.

 

Bad post and print presentation does not act as some sort of equalizer on the big screen for various formats.

 

I must disagree with you to some extent: the reduced quality of actual prints DOES work as an equalizer - though not as much as scaling down to standard-def video does.

 

There probably is a threshold where additional "original quality" ceases to affect the output in any discernible way. For example, I really doubt anyone can see the difference between a 4K digital intermediate and a 2K digital intermediate after they're scaled down to standard-definition TV. In both cases there's plenty of oversampling. For film release prints, I suppose the difference can be seen. But the difference between 4K and 8K would probably be lost.
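The Nyquist back-of-envelope for that example (PAL D1 numbers; a clean downscale still wants some margin, but the point stands):

```python
# PAL D1 carries 720 luma samples per line -> at most 360 cycles of
# horizontal detail (Nyquist). How far above that do the DIs sit?
sd_cycles = 720 / 2
for name, width in (("2K DI", 2048), ("4K DI", 4096)):
    print(f"{name}: {(width / 2) / sd_cycles:.1f}x the detail SD can carry")
# 2K DI: 2.8x, 4K DI: 5.7x -- both get thrown away down to the same
# 360 cycles, so the 2K vs 4K difference is invisible on SD television.
```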

 

If you remember my original argument, it was that in that theater, some shots done with HDV, cut within the film, would probably have passed unnoticed by the audience.

 

***

 

I totally understand what you're saying about rather trying to make 35mm look like 65mm... to me, making MiniDV look like DigiBeta is just as cool. I simply love trying to stretch each medium as far as it can go. You know, turning turds into diamonds ;-)


  • Premium Member
As far as that 4K camera goes, I was talking about the Dalsa Origin:

 

4K resolution, 12 stops of latitude, takes cine lenses directly. This one should already be technically better than 35mm film negative, AFAIK, approaching 65mm film.

 

Although Dalsa claims that the Origin is 4K, it seems that what you get is effectively more like 3K. 35mm is AT LEAST 4K, with more latitude, so it is still better than any existing digital camera. And we're not even getting into color depth, another area where film is vastly superior. If digital really were so good, then why aren't more people using it? Especially those who have the choice because their budget allows them to use either format.


35mm is roughly 4K equivalent, if not 6K -- 65mm would be more like 8K.

 

...

 

Take a look at this article:

 

http://millimeter.com/mag/video_digital_ci...cial/index.html

 

Yep, I've seen the article - a good one. The reason I said the Dalsa surpasses 35mm wasn't so much the resolution (which, to me, does look just slightly better visually than a 4K film scan), but its lack of grain and higher exposure latitude (correct me if I'm wrong, but isn't film negative somewhere around 9-10 stops at best?).
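Just to put numbers on those latitude figures (each stop is a doubling, so N stops means a 2^N : 1 usable contrast ratio; the 9-10 stop figure is the one being asked about above, not an established spec):

```python
# Stops -> usable scene contrast ratio: each stop doubles the light.
for stops in (9, 10, 12):
    print(f"{stops} stops = {2 ** stops}:1")
# 9-10 stops is 512:1 to 1024:1; the Dalsa's claimed 12 stops is
# 4096:1 -- two extra stops is a 4x jump in contrast ratio.
```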

 

That's why I said it's "approaching 65mm"... as it sits somewhere between the two quality-wise.

 

Now if we could digitally project 4K from a 4K scan of 35mm, or a 4K digital camera, then THAT would seem more like 70mm print projection.

 

There actually are 4K projectors already - not in common use though...

 

In other words, the Dalsa is the first camera to come close to matching 35mm resolution. And it takes a mini-fridge-sized recorder that can only hold two hours of footage at that quality level...

 

Yep. It's a bit clumsy right now - but in 5-10 years... I'll probably be editing 4K video shot with my cell phone, on my cell phone ;-)

 

These sorts of oddly militant arguments tend to be made by people who can't afford to shoot with better technology -- which is fine, there's nothing wrong with working within a tight budget

 

I hope I haven't sounded militant - I wasn't aiming for that at all. Believe it or not, I'm currently involved in post-production of a job that's done entirely in 4K, and I love it... I do like quality too.

 

The stuff that's shown in theaters will probably evolve to something of higher quality than now - not because the artists demand more quality, but rather because HDTV-quality home theaters are getting too close... something must be done to keep the audience.

 

Anyway, to me HDV is kind of a revolution - much in the same way desktop publishing was in the early '90s, digital audio studios and CGI on home machines were a bit later, and MiniDV was in the standard-def video world.

 

It's something that will allow Joe Average to get technically "close enough" to the big time with cheap off-the-shelf equipment - the current generation of it is right at the edge; the next gen will probably be over it already. So the difference won't be who has the big bucks, but rather who's good.

 

And, as I mentioned earlier, I was pleasantly surprised at the quality when watching it on a 35mm print.

 

All this said - in my daily work, which is SD for television, HDV has replaced DigiBeta and DVCPro50 as the shooting format of choice. Not because it's cheaper (that's a non-issue with commercials etc. - I'm not the one paying the bills), but rather because it simply looks better.

