Star Wars 9 to be shot on 65mm film



JJ & Dan Mindel talked at length in the AC article about how much attention they paid to keeping the look in line with the OT, and it shows. The "Gareth Edwards directs circles around JJ" bit made me smile. I have no idea why some people have such a hard-on for Rogue One. It's just an opinion, but not only does The Force Awakens' beautiful anamorphic 35mm lensing wipe the floor, imo, with Fraser's flat, low-contrast, jarring digital look (what a mistake to shoot a film set right before ANH digitally), TFA is also a far superior film in terms of direction (enough of that handheld aesthetic), craftsmanship, characterization (can you look at me with a straight face and tell me any of the characters in Rogue One are more memorable than Rey, Finn, Poe or Ren?), pretty much everything really.

 

Rogue One being a spin-off, I just can't consider it in the same way as a film from the main trilogies; it's expendable. I'm curious about the Han Solo one, since Phil Lord & Chris Miller are great, although it being shot digitally makes me groan once more, but yeah.

 

I used to love the prequels as a kid, and they're not as bad as some make them out to be, but they're surely not anywhere near the level of the OT. ROTS can come close at times, TPM has some great moments, but AOTC is flat-out mediocre.

 

The characters in TFA are terrible.

 

Han had his character development mangled. It's clear that the only reason that he's smuggling and flying solo again is because that's what people remember and liked about Han in the OT. He pays lip service to having changed, but if he had really changed he wouldn't have left Leia when the going got tough like he did in ANH when he got his reward.

 

Carrie Fisher can no longer act. At all. She can barely move her face and she sounds like a man who's been in a coal mine for 90 years. So her character is unsalvageable just due to the casting.

 

Finn COULD have been good; his premise is good. But JJ doesn't do anything with "soldier brainwashed/indoctrinated from infancy". He gets over it way too quickly and acts like any old deserter, not one who has been drilled from infancy and needs to detox.

 

Rey, as so often pointed out, has things happen too easily. Luke needed Ben to show him the mind trick, and even then it took him until Return of the Jedi to do it. Rey just somehow knows both what a mind trick is and how to do it. And that's just the tip of the iceberg; I can elaborate upon request. And she's a bad actress IMO. Much worse than Felicity Jones.

 

Poe has no character.

 

Kylo is almost a good character, but things are still out of place simply because of how poor a filmmaker JJ is. For instance, the timing of his helmet-removal reveal is wrong (it should have been on the bridge with Han).

 

Oh, and they character-assassinated Luke for the sake of their original-trilogy pastiche. Luke is now the hermit/Yoda expy for Rey, and he fills that role at the cost of his entire character. Luke would never cut off contact with Leia and everyone else he loves and go sit on a rock. Especially not when he should be able to sense, or even just know about, the First Order and the danger to the New Republic.

 

And as for your comment about the prequels: no, all three are much better than Return of the Jedi, TPM and AotC are on par with ANH, and RotS is easily on par with Empire. (ROTS is far and away the best for me, its one big loose end/screw-up being the resolution of the Yoda/Sidious fight.)

 

And lastly, Jyn is a better character than Rey, and Cassian is a better character than Finn, so Rogue One wins there too. Also, RotS was digital, and the ending of that meshes with the opening of ANH very well.


I saw a 15/70 IMAX film print of Rogue One and thought it was very high contrast; interesting that you thought it was low contrast.

The film out adds contrast no matter what you do. The digital release of Rogue One was very flat and unappealing.


The film out adds contrast no matter what you do. The digital release of Rogue One was very flat and unappealing.

 

Wouldn't that depend on the projection quality?

I saw Rogue One on a very new projector and it looked good to me.


The characters in TFA are terrible.

Meh, they're terrible in Rogue One as well.

 

Even though I like Felicity Jones, her character has no reason to exist. She doesn't DO anything; how does she survive? Why does she even care about this mission? It's like all of a sudden a switch flipped and she wants to be part of something bigger. Give me a break.

 

Cassian? Just a Han Solo knock-off, another smuggler-type character to help Jyn. I knew what was going to happen before they even met; it was so played out, so been-there-done-that.

 

K-2SO? Comedy relief for children? Give me a break, just flat-out uninteresting and bland dialogue. Nothing like C-3PO.

 

Ohh, then the stupidity of the entire plot. The Death Star has hyperdrive capability? Give me a break. The whole point of the Death Star is that it was incapable of moving fast. Otherwise, you'd just park it somewhere way far away from anyone else when not in use, but for some reason it's always found and attacked... gimme a break. Then somehow magic happens and they survive the attack of the Death Star, really? I mean, that's all in the first act... Then they've gotta go to this stupid library thing with LTO tapes and steal one of them? Wow, really, is this the year 2000? What genius came up with that ridiculous decision?

 

Again, nothing about Rogue One was "better" than TFA, it was just "different", and seeing as I'm not a fanboy, I can look at it from the point of view of a "movie", rather than how good a job they did at translating or filling in gaps within the Star Wars universe. Yes, TFA had some stupid things... the creatures on Han's ship, the planet falling apart at the end, poop like that. However, I'm more invested in the TFA characters because they were part of the original trilogy, to a certain extent.

 

Anyway, we could go on about this forever. I can just say, as an outsider who doesn't read the books or comics and just enjoys the movies for what they are, that TFA was far better at achieving the look and feel of a Star Wars movie. Rogue One just felt like a made-for-Netflix movie.


Wouldn't that depend on the projection quality?

I saw Rogue One on a very new projector and it looked good to me.

I saw Rogue One with the latest 4k Dolby laser projection system. If low-contrast is your thing, I guess that's what you get.

 

I'm a member of the projectionist forum as well; a lot of projectionists called up the distributor to confirm the super-flat look was "normal". That's how freaked out people were about it.


Ohh, then the stupidity of the entire plot. The Death Star has hyperdrive capability? Give me a break. The whole point of the Death Star is that it was incapable of moving fast. Otherwise, you'd just park it somewhere way far away from anyone else when not in use, but for some reason it's always found and attacked... gimme a break. Then somehow magic happens and they survive the attack of the Death Star, really? I mean, that's all in the first act... Then they've gotta go to this stupid library thing with LTO tapes and steal one of them? Wow, really, is this the year 2000? What genius came up with that ridiculous decision?

 

 

?

The Death Star has hyperdrive in the first movie. That's how it gets to the rebel base. The fact that it then takes like 15 minutes to mosey out from behind the planet in the way is its own writing issue.


Oy, I had no intention of seeing Rogue One, but now I'm going to have to rent it just to get some kind of clue. I've heard it's way better than TFA and I've heard it's way worse. I thought TFA was well done for the most part, visually and aurally speaking anyway. I agree that the script missed a lot of opportunities by trying too hard to recreate ANH, and despite what everybody says about it, there's a lot of CG for the sake of CG. I'm fine with effects when they serve a purpose, but it seems like about half the movie had some kind of "enhancement" that didn't really need to be there.

I have no intention of watching the Han Solo movie either. That reminds me; I really need to get my laserdisc box set sold before Disney makes everybody sick of Star Wars.


That reminds me; I really need to get my laserdisc box set sold before Disney makes everybody sick of Star Wars.

I bought up 4 of the CAV original-transfer versions for $25 at a local shop before they went under and sold them on eBay for $150 each!

 

Now I just have the stupid redux 1997 box set; it's AC3 audio and all, but it's the same as the DVDs, so it has no value.

 

I do have A New Hope, letterboxed, original 1981 pressing though. That I'm never going to sell.


I have that old laserdisc box set too.

 

The Death Star would have to have a hyperlight drive unless it gets built in the solar system of the planet it is destroying; otherwise it would take decades, if not centuries, to move to another solar system.


The Death Star would have to have a hyperlight drive unless it gets built in the solar system of the planet it is destroying; otherwise it would take decades, if not centuries, to move to another solar system.

I always assumed it was constantly in motion and just moved very slowly... they never really explained it.


"George Lucas's myopic vision"?

Rogue One was shot digitally, and in my opinion was much more visually appealing than The Force Awakens, though some of that might be due to the fact that Edwards can direct circles around J.J.

 

And many of the best directors and cinematographers of today are very on board with digital shooting. (Fincher and Deakins, just to name 2 superstars.)

"George Lucas's myopic vision"?

Episode 2 was captured in 1440 x 800 resolution by so-called "Cine Alta" cameras, basically tarted-up HD Betacams.

1440 x 800 is acceptable if projected directly by a digital projector, but at the time these were very few and far between; the vast majority of screenings were of standard film prints.

With a movie shot on film, the standard four-generation duplicating process produces an acceptable quality print, but that's because you're starting with a much higher resolution negative. With Episode 2, they were starting out the 4-generation chain with a digital master that actually had less resolution than the average cinema print, so the projected pictures were pretty dreadful.
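A toy way to see why the duplication chain punishes a low-resolution master: the sharpness of the whole chain is roughly the product of the MTFs of every stage, so whatever contrast the first generation throws away never comes back. The per-stage values below are invented purely for illustration, not measured data.

# Rough sketch: system MTF is (approximately) the product of per-stage MTFs.
# All numbers here are assumptions chosen just to show the shape of the problem.

def chain_mtf(stage_mtfs):
    """Multiply per-stage MTF values (0..1) to get the contrast the chain retains."""
    total = 1.0
    for m in stage_mtfs:
        total *= m
    return total

# Hypothetical contrast retained at some mid-range spatial frequency:
film_chain = chain_mtf([0.75, 0.85, 0.85, 0.80])  # camera neg -> IP -> IN -> release print
hd_chain = chain_mtf([0.35, 0.85, 0.85, 0.80])    # HD film-out master -> IP -> IN -> print

print(f"film-originated chain keeps {film_chain:.0%} of the contrast")
print(f"HD-originated chain keeps {hd_chain:.0%} of the contrast")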

Very few people followed George's lead; most of them stuck to film origination. All George really demonstrated was several compelling reasons to stay with film, and none to switch to video if you had the budget for film. That was back in 1999! Film origination for big-budget productions didn't actually start to seriously decline until around 2016...

 

Yes, the Alexa is a long way ahead of the cameras GL used but it took a damned long time to get there!

 

This whole retarded "Film vs Digital" argument was mostly propagated by clueless wannabes, unshakably convinced that the hardest and costliest part of making a movie is centered around the camera system, and that video cameras are somehow vastly easier and cheaper to use than film cameras, and by some unexplained means, this will fling open the doors of Hollywood to any person who fancies him or herself to be a cinematographer.


"George Lucas's myopic vision"?

Episode 2 was captured in 1440 x 800 resolution by so-called "Cine Alta" cameras, basically tarted-up HD Betacams.

1440 x 800 is acceptable if projected directly by a digital projector, but at the time these were very few and far between; the vast majority of screenings were of standard film prints.

With a movie shot on film, the standard four-generation duplicating process produces an acceptable quality print, but that's because you're starting with a much higher resolution negative. With Episode 2, they were starting out the 4-generation chain with a digital master that actually had less resolution than the average cinema print, so the projected pictures were pretty dreadful.

Very few people followed George's lead; most of them stuck to film origination. All George really demonstrated was several compelling reasons to stay with film, and none to switch to video if you had the budget for film. That was back in 1999! Film origination for big-budget productions didn't actually start to seriously decline until around 2016...

 

Yes, the Alexa is a long way ahead of the cameras GL used but it took a damned long time to get there!

 

This whole retarded "Film vs Digital" argument was mostly propagated by clueless wannabes, unshakably convinced that the hardest and costliest part of making a movie is centered around the camera system, and that video cameras are somehow vastly easier and cheaper to use than film cameras, and by some unexplained means, this will fling open the doors of Hollywood to any person who fancies him or herself to be a cinematographer.

 

As far as "it took a really long time to get there" goes, Episode 3 looks drastically better than Episode 2 (and it's not just because Episode 2's Blu-ray has lots of issues; I have a much less mangled HD version of Episode 2, and Episode 3 still looks much better).

Episode 3 on a digital projector looks very very good.

 

For my money, Revenge of the Sith looks about as good as Prometheus, shot on RED in 2012. (Other than resolution. I am aware of the resolution difference, but let's face it, when even the resolution that Revenge of the Sith was shot at looks great and crisp on a regular movie theater screen, it's really in no danger of becoming obsolete, unless we all start having IMAX screens in our houses.) Which leads to my other point: I think 2016 is a bit off as far as the digital switch goes. Hell, Cameron shot Avatar digitally in 2009, X-Men went digital in 2011 (Singer himself much earlier, with Superman in 2006), Fincher went digital in 2007, Deakins made his comments a few years before 2016, etc...
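For what it's worth, here's a back-of-the-envelope check on the "still looks crisp on a regular screen" point. It assumes roughly 1 arcminute of visual acuity and an arbitrary 12 m wide screen; both figures are assumptions for illustration, not anything from the posts above.

import math

screen_width_m = 12.0               # assumed screen width
horizontal_pixels = 1920            # roughly what Episode 3 was originated at
acuity_rad = math.radians(1 / 60)   # ~1 arcminute of visual acuity

pixel_m = screen_width_m / horizontal_pixels
# Distance beyond which one pixel subtends less than 1 arcminute:
distance_m = pixel_m / acuity_rad

print(f"pixel pitch on screen: {pixel_m * 1000:.1f} mm")
print(f"pixels blend together beyond roughly {distance_m:.0f} m "
      f"({distance_m / screen_width_m:.1f} screen widths back)")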


"George Lucas's myopic vision"?

Episode 2 was captured in 1440 x 800 resolution by so-called "Cine Alta" cameras, basically tarted-up HD Betacams.

1440 x 800 is acceptable if projected directly by a digital projector, but at the time these were very few and far between; the vast majority of screenings were of standard film prints.

With a movie shot on film, the standard four-generation duplicating process produces an acceptable quality print, but that's because you're starting with a much higher resolution negative. With Episode 2, they were starting out the 4-generation chain with a digital master that actually had less resolution than the average cinema print, so the projected pictures were pretty dreadful.

Very few people followed George's lead; most of them stuck to film origination. All George really demonstrated was several compelling reasons to stay with film, and none to switch to video if you had the budget for film. That was back in 1999!

 

 

Episode 2 was shot in 2000 and released in 2002. It was Episode 1 which was released in 1999, and that was shot on film other than that brief scene shot in 1080/50i HD.

 

I didn't think Episode 2 looked that bad, sharpness-wise -- I shot maybe six low-budget features on the same camera, the Sony F900. Episode 3 was a big improvement technically, recording 4:4:4 video in true 1920 x 1080 with low compression, compared to 1440 x 1080 3:1:1 video with a lot more compression. The problem with both movies was mainly an excess of CGI environments, giving the movies a video game quality. I have the same issue with "Avatar", more than I do with the use of HD cameras in the live-action portions.

 

I also think there were a number of good-looking movies shot on the Viper, the Panavision Genesis, and the Sony F35. I still like watching the blu-ray of "Zodiac". The other day, "Apocalypto" was on TV and I had forgotten how film-like many of the scenes were.

 

Doesn't mean I don't like film just because I can acknowledge that some nice images have been shot digitally over the decade.

 

Anyway, the debate is really over, isn't it? Digital is the primary image acquisition technology for narrative features and television.


I think 2016 is a bit off as far as the digital switch goes.

That was the first year when more than 50% of "mainstream" cinema release movies were shot on video instead of film.

There was a rather egregious bet made here about 10 years ago by a certain character who some people still seem to think highly of, for God knows what reason. He reckoned it was going to happen by 2009 if I remember correctly. Even Jim Jannard got involved at one point.

I was told by an industry insider that it was actually expected to happen somewhere around 2015, and I even posted that back then and was cackled at by the usual jackasses, but lo and behold, he was right!

But that was probably because he knew about the upcoming timetable to kill off film projection, and once that went, origination wasn't going to be too far behind.

 


 

Anyway, the debate is really over, isn't it? Digital is the primary image acquisition technology for narrative features and television.

Of course it is, but if you listened to George Lucas, it was all going to happen 17 years ago. The infrastructure simply wasn't there then, and nobody was particularly interested in re-equipping and re-skilling everything just because he reckoned it was a good idea...

I heard almost the same spiel back in 1988, when NHK decreed that in the "very near future" all movies would be shot in their abysmal "Hi-Vision" format!

Hence the "myopic vision" comment. I thought it was hilarious when of all movies, they announced Episode 7 was going to be shot on film....

 

Anyway few of my comments have ever had much to do with merits or otherwise of the various image capture technologies, they've mostly been concerned with my opinion about which end of their alimentary canal certain posters' words were coming from. :rolleyes:


Of course it is, but if you listened to George Lucas, it was all going to happen 17 years ago. The infrastructure simply wasn't there then, and nobody was particularly interested in re-equipping and re-skilling everything just because he reckoned it was a good idea...

I heard almost the same spiel back in 1988, when NHK decreed that in the "very near future" all movies would be shot in their abysmal "Hi-Vision" format!

Hence the "myopic vision" comment. I thought it was hilarious when of all movies, they announced Episode 7 was going to be shot on film....

 

Anyway few of my comments have ever had much to do with merits or otherwise of the various image capture technologies, they've mostly been concerned with my opinion about which end of their alimentary canal certain posters' words were coming from. :rolleyes:

 

You seem to think that Lucas thought that film would die in less than a year or something. 17 years isn't a long time at all, and Attack of The Clones clearly started the trend.

And as for The Force Awakens being film, I counter that Rogue One was digitally shot. The new trilogy being shot on film is purely a nostalgia move, like how the "PRACTICAL EFFECTS!!!! REAL SETS!!!!" marketing for The Force Awakens was. Ironically, The Force Awakens used CG where it really would have counted to go practical: spaceships. And things like Maz Kanata and the Rathtars not being puppets. Or Maz's Castle being CG. Etc., etc. They don't care about film and they don't care about practical effects.

 

The Han Solo movie is also being shot digitally.


"George Lucas's myopic vision"?

Episode 2 was captured in 1440 x 800 resolution by so-called "Cine Alta" cameras, basically tarted-up HD Betacams.

I wouldn't even go that far. They essentially changed the lens mount and modded a PAL chip set for 24p capture. 3-CCD cameras are designed to correct for chromatic aberrations via sensor placement while cinema lenses do it internally. Lucas, of course, said that the new "superior" format showed off the "shortcomings" of cinema lenses. He also neglected to realize the image quality of a lens changes depending on the frame size. Like, a lens designed for 35mm frames will yield a better image on a 24mm wide frame than it would for a 10mm wide frame, while a similarly priced lens made for a 10mm wide frame would be sharper but won't cover a 24mm wide frame. I mean, it DOES require 6x as much glass to cover, say, S35 as it does 2/3". I think it's good people are willing to try new technology and techniques, but it shouldn't be reported by somebody that doesn't understand it (as mainstream news shows us time and time again)... or maybe he understood but was spreading propaganda.

 

 

This whole retarded "Film vs Digital" argument was mostly propagated by clueless wannabes, unshakably convinced that the hardest and costliest part of making a movie is centered around the camera system, and that video cameras are somehow vastly easier and cheaper to use than film cameras, and by some unexplained means, this will fling open the doors of Hollywood to any person who fancies him or herself to be a cinematographer.

 

Yep. I remember the first feature I worked on that was shot in HD video. It took special set design, special lighting, special clothing, art direction, etc. to make sure everything fit within the limited palette of HD video. It probably cost MORE to shoot on video in the end. I also remember when the first P2-based camera hit the market and everybody was excited to not have to deal with tape any more... yeah, but now you have to have somebody follow you everywhere with a MacBook to off-load clips every few minutes. They solved all that by recording everything at 24 Mbps, adding a lot more image processing and improving automatic lenses, so now everybody IS a Hollywood cinematographer at the low, low price of $800!
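To put some rough numbers on the data-rate side of that gripe: here's what a few bitrates work out to in storage per hour. The bitrates are illustrative only, not tied to any particular camera.

def gb_per_hour(mbps):
    """Convert a video bitrate in megabits/second to gigabytes per hour."""
    return mbps * 1e6 * 3600 / 8 / 1e9

for label, rate in [("24 Mbps consumer codec", 24),
                    ("100 Mbps HD acquisition codec", 100),
                    ("440 Mbps 10-bit intermediate codec", 440)]:
    print(f"{label}: {gb_per_hour(rate):.1f} GB per hour")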

 

 

 

For my money, Revenge of the Sith looks about as good as Prometheus, shot on RED in 2012. (Other than resolution. I am aware of the resolution difference, but let's face it, when even the resolution that Revenge of the Sith was shot at looks great and crisp on a regular movie theater screen, it's really in no danger of becoming obsolete, unless we all start having IMAX screens in our houses.) Which leads to my other point: I think 2016 is a bit off as far as the digital switch goes.

Am I the only one that sees Red as hot-rodded DSLRs? While I'm here, I can't seem to convince anybody that Blackmagic cameras are cheap mirrorless cams with pro CODECs and some added features. I can tell you, having worked with both film and REAL professional video cameras, BMs aren't even close.

Anyway, I think Hollywood went primarily digital before 2016, but most IMAX theaters, like conventional theaters, are 2K video, so who cares?

 

 

I didn't think Episode 2 looked that bad, sharpness-wise -- I shot maybe six low-budget features on the same camera, the Sony F900. Episode 3 was a big improvement technically, recording 4:4:4 video in true 1920 x 1080 with low compression, compared to 1440 x 1080 3:1:1 video with a lot more compression. The problem with both movies was mainly an excess of CGI environments, giving the movies a video game quality. I have the same issue with "Avatar", more than I do with the use of HD cameras in the live-action portions.

I agree. I always thought the professional CCD-based cameras are quite good. My heart sank when we replaced our CCD field cameras with large, single-chip CMOS cameras (for that more "cinema-esque" look that's far less film-like in every regard except DoF, IMO). Anyway, the cameras in Avatar and E2/E3 hardly had any screen time. What time they did have was heavily manipulated. They probably could have shot on DigiBeta and hardly anybody would know the difference.

 

 

I also think there were a number of good-looking movies shot on the Viper, the Panavision Genesis, and the Sony F35. I still like watching the blu-ray of "Zodiac". The other day, "Apocalypto" was on TV and I had forgotten how film-like many of the scenes were.

A lot less of Apocalypto was shot on the Genesis than they admit (they used many cameras) and more of it was film than they admit. Nevertheless, I thought the Genesis was a good camera for some things, but most people seemed to use it as an excuse not to light scenes well, which leads to a very "video-like" image.

 

 

I was told by an industry insider that it was actually expected to happen somewhere around 2015, and I even posted that back then and was cackled at by the usual jackasses, but lo and behold, he was right!

But that was probably because he knew about the upcoming timetable to kill off film projection, and once that went, origination wasn't going to be too far behind.

I think the use of digital intermediates also really helped push things in that direction. Producers want to be able to manipulate stuff; CG is not the last resort it originally was but the first resort, second resort, third resort and last resort, and if it's completely implausible to use CG, they put the release on hold until it becomes possible to use CG. In the meantime, scanning film is expensive, and the optics in the scanners, regardless of capture resolution, leave their own imprint on the image, the way an optical film printer degrades the image just by adding another lens to the path. Film is also much lower resolution than it used to be. Kodak became so obsessed with reducing grain (done by making emulsions very thick, allowing light to scatter more within them) that 35mm film went from about 5K res to maybe 3K.
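For context on the 5K-vs-3K claim, here's the usual rough conversion between resolving power quoted in line pairs per mm and a digital "K" figure. The lp/mm values are illustrative assumptions, and the frame width is a nominal Super 35 camera aperture.

# Nyquist: you need at least 2 pixels per line pair, so
# pixels across the frame ~= 2 * lp/mm * frame width in mm.

frame_width_mm = 24.9   # approximate Super 35 aperture width (assumption)

def k_equivalent(lp_per_mm):
    pixels = 2 * lp_per_mm * frame_width_mm
    return pixels / 1000

for lp in (60, 80, 100):
    print(f"{lp} lp/mm over {frame_width_mm} mm is roughly {k_equivalent(lp):.1f}K")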


Am I the only one that sees Red as hot-rodded DSLRs? While I'm here, I can't seem to convince anybody that Blackmagic cameras are cheap mirrorless cams with pro CODECs and some added features. I can tell you, having worked with both film and REAL professional video cameras, BMs aren't even close.

I've done quite a bit of shooting with the Epic MX and Dragon cameras recently. The Dragon looks pretty good, far better than any DSLR I've ever used, even in still mode. The color science is a bit whack, but with good warm glass and the right white balance setting, I was able to get some decent images out of the RED, stuff that's absolutely impossible to get from a DSLR even in RAW mode.

 

You can't compare Blackmagic Design to RED, Arri or Sony when it comes to the professional world. Blackmagic cameras are less than half the price of the competition for like features, and as of the URSA Mini Pro 4.6K, they've finally built a Swiss Army knife of a camera that, even with its limitations, is worth the money compared to the competition.

 

I shoot a lot of film these days and honestly, unless you spend gobs of money on a high-resolution scan from a monochrome imager, digital still kicks the ass of film every day of the week. The problem today with film is that good scanning is still too expensive. It's still too labor intensive and it's still not capturing exactly what's on the film itself. So sure, you've got this awesome image on film, but getting that image into the digital world to look better than, let's say, a RED Dragon is very expensive. So while it's nice to pick on cheaper cameras for being a bit restrictive in their abilities, trying to compare them to film is kinda silly. A $10,000 URSA Mini Pro package is "free" to shoot with after you get that 10k back.

 

Anyway, I think Hollywood went primarily digital before 2016, but most IMAX theaters, like conventional theaters, are 2K video, so who cares?

We're up around 60-70% 4K digital adoption these days, but the studios still release much of the content in 2K. This is because some theaters have older servers that struggle to play back 4K material, and it costs A LOT LESS MONEY to finish VFX in 2K. So when a film is rushed out the door, a lot of times it's just finished in 2K.

 

BTW, Hollywood went digital prior to 2013. Even most non-science IMAX theaters had switched over to digital by then.

 

I always thought the professional CCD-based cameras are quite good. My heart sank when we replaced our CCD field cameras with large, single-chip CMOS cameras (for that more "cinema-esque" look that's far less film-like in every regard except DoF, IMO). Anyway, the cameras in Avatar and E2/E3 hardly had any screen time. What time they did have was heavily manipulated. They probably could have shot on DigiBeta and hardly anybody would know the difference.

CMOS is a garbage in-between technology. CCDs just don't have the dynamic range of CMOS. The industry also wanted to use the same glass and mount system, so they just went for a technology and we're stuck with it. Honestly, CMOS is more filmic than CCD, it's just lacking in the color depth that you get with a 3-CCD block.

 

Film is also much lower resolution than it used to be. Kodak became so obsessed with reducing grain (done by making emulsions very thick, allowing light to scatter more within them) that 35mm film went from about 5K res to maybe 3K.

Umm... just the opposite, actually. T-grain is so much smaller/finer that it's actually higher resolution, because there are more grains per frame than ever before. This is a huge increase in resolution, to the point where modern 35mm prints look like they're digital, nice and crisp all the time, even if they've gone through a photochemical finishing process.

 

I watch A LOT of film prints and I've never seen anything out of the '90s or older that even holds a candle to the modern stocks' capabilities. Remember, in the early 2000s everything went to 2K finishing and laser-outs, so the quality decreased substantially. When you compare a print of a photochemically finished movie... Interstellar, for instance... to a movie made 20 years ago, the difference is astronomical. The '90s prints look like standard definition in comparison to the modern prints. That's because our camera negatives and print stocks are so much better; the resolution has increased substantially.

 

I just hope Kodak invests in making a new stock. I've been told they are doing it right now and we should have some new Vision color negative stocks hitting the market in the next two years; that would be so awesome. I think a low-noise, naturally rated 1000 ISO stock would do really well; it would make people less concerned about lighting.


I've done quite a bit of shooting with the Epic MX and Dragon cameras recently. The Dragon looks pretty good, far better than any DSLR I've ever used, even in still mode. The color science is a bit whack, but with good warm glass and the right white balance setting, I was able to get some decent images out of the RED, stuff that's absolutely impossible to get from a DSLR even in RAW mode.

To be fair, I said "hot-rodded" DSLR. The color is very strange, it has that overt sharpness to it, and it still has a rolling shutter. I wasn't intending to start a war.

 

 

 

You can't compare Blackmagic Design to RED, Arri or Sony when it comes to the professional world.

I wouldn't even try, yet many people swear they are absolutely amazing. They have poor IR handling, no anti-aliasing filter, it's virtually impossible to get accurate color out of them (spending hours correcting in post doesn't count), rolling shutter, etc., and when you rig up everything it takes to handle one comfortably, it isn't cheaper than the competition anymore. I actually seriously considered a BMPCC for a while, despite being really disappointed in the original BMCC, and found it unusable without major modifications and constraints on how to shoot with it. Plus, they don't have nearly the ISO or dynamic range they claim. I can only imagine they assume the user will apply noise reduction in post to rescue shadow information. On that note, I have not used a single piece of BM equipment that was reliable. Again, not trying to start a war, just going on my own experience. Compare that to, say, a run-of-the-mill Canon or Panasonic that does what it claims and works every time. I ultimately decided not to own a semi-pro video camera because there weren't any in my range that didn't irritate the snot out of me. That wasn't the case 15 years ago, strangely.

 

 

CMOS is a garbage in-between technology. CCDs just don't have the dynamic range of CMOS. The industry also wanted to use the same glass and mount system, so they just went for a technology and we're stuck with it. Honestly, CMOS is more filmic than CCD, it's just lacking in the color depth that you get with a 3-CCD block.

Single-chip tech is every bit as capable of being film-like regardless of being CCD or CMOS. The two main problems are 1) the best dyes are patented, and 2) lesser dyes allow increased ISO at the cost of color purity. As for the dynamic range of CMOS exceeding CCD, that's done by artificial means: noise cancellation and other image-processing tricks. Now Sony has recently released a CMOS chip with analogue memory, like CCDs have, and IMO it's the best of both worlds. It has fairly good color science, native global shutter and larger pixel area like CCDs, but at the cost of CMOS. Sadly, there aren't any conventional cameras using this technology yet.

There's also quantum film right around the corner, which looks really promising, but I suspect it will wind up being an excuse to cram even more pixels into an unreasonably small area.

 

 

Umm... just the opposite, actually. T-grain is so much smaller/finer that it's actually higher resolution, because there are more grains per frame than ever before.

Check out the MTFs. Earlier T-grain films resolve quite well, but as emulsions kept getting thicker with each generation, the MTF drops more and more abruptly above 20 cycles/mm.

 

 

The '90s prints look like standard definition in comparison to the modern prints. That's because our camera negatives and print stocks are so much better; the resolution has increased substantially.

I suspect a lot of that has to do with the sharpening effect of modern lens coatings and other processes. The resolution isn't greater, but there's a little "bump" in the MTF curve of a lot of film stocks before it drops like a rock, which adds to the perceived sharpness.

 

 

I just hope Kodak invests in making a new stock. I've been told they are doing it right now and we should have some new Vision color negative stocks hitting the market in the next two years; that would be so awesome. I think a low-noise, naturally rated 1000 ISO stock would do really well; it would make people less concerned about lighting.

I agree. I spoke to a Kodak engineer a few years ago and she said they stopped all R&D. Also, I would love to see a film DESIGNED for 16mm rather than a stock designed for 35mm and merely slit for 16mm. When I spoke to the engineer, she said there were a few things they could do to make 16mm better, but since 35mm was the biggest seller, they couldn't justify the cost of a purpose-made 16mm stock.


To be fair, I said "hot-rodded" DSLR. The color is very strange, it has that overt sharpness to it, and it still has a rolling shutter. I wasn't intending to start a war.

But what is a hot-rodded DSLR? A 5D Mark IV with open software? Still super limited bandwidth and bit rate. Gotta get up over the 200 Mbps range to make a decent 1080p image, and over the 400 Mbps range to make a decent 4K image.
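As a quick sanity check, those rule-of-thumb bitrates work out to roughly the following in bits per pixel per frame at 24 fps. This is just illustrative arithmetic on the numbers quoted above, not a statement about any particular codec.

def bits_per_pixel(mbps, width, height, fps=24):
    """Average bits available per pixel per frame at a given bitrate."""
    return mbps * 1e6 / (width * height * fps)

print(f"200 Mbps at 1920x1080, 24p: {bits_per_pixel(200, 1920, 1080):.1f} bits/pixel")
print(f"400 Mbps at 3840x2160, 24p: {bits_per_pixel(400, 3840, 2160):.1f} bits/pixel")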

 

I wouldn't even try, yet many people swear they are absolutely amazing.

They are amazing for the money. I work mostly in post production, but I bring up shots for people to see what my pocket camera is capable of doing, and everyone who's seen it has been impressed. I'm also impressed by what that little $998 camera can achieve in just normal ProRes HQ shooting. Switch it to raw and do a serious color pass in DaVinci, and the camera looks really good for what it is. I have yet to use another camera anywhere near its size with similar results. Both the Panasonic and Sony "alternatives" are twice the money, for less quality.

 

Here is a little bit of footage I shot from a feature I worked on last year. A very basic grade done in DaVinci, but this is exactly what the camera looks like: https://www.dropbox.com/s/xonfgpua4u2kqyn/Blackmagic%20Pocket%20with%20Zeiss%2012-120.mov?dl=0

 

They have poor IR handling, no anti-aliasing filter, it's virtually impossible to get accurate color out of them (spending hours correcting in post doesn't count), rolling shutter, etc., and when you rig up everything it takes to handle one comfortably, it isn't cheaper than the competition anymore. I actually seriously considered a BMPCC for a while, despite being really disappointed in the original BMCC, and found it unusable without major modifications and constraints on how to shoot with it. Plus, they don't have nearly the ISO or dynamic range they claim. I can only imagine they assume the user will apply noise reduction in post to rescue shadow information. On that note, I have not used a single piece of BM equipment that was reliable. Again, not trying to start a war, just going on my own experience. Compare that to, say, a run-of-the-mill Canon or Panasonic that does what it claims and works every time. I ultimately decided not to own a semi-pro video camera because there weren't any in my range that didn't irritate the snot out of me. That wasn't the case 15 years ago, strangely.

Well, my only digital cinema cameras are Blackmagic, so I have a different perspective. I've shot with pretty much everything on the market today and, honestly, I agree with many of your comments. At the same time, I've been shooting with my pocket cameras since the day they came out. They have traveled around the world with countless other filmmakers (friends) who absolutely adore them. Two of the films that were shot are features, one of which is almost done with post; it will be out in theaters later next year. I personally have shot with them in extreme heat (120°F), extreme cold (-22°F), extreme rain/wind, on dive boats, floating in a pool with no protection, and in super dusty situations (motocross/dirt bike videos), and the cameras have never once had a single glitch, and I have two of them. Both look like they've been through a war; all the markings are gone and I've had them both apart to tighten the tripod mount a few times because I never bought a cage for 'em. Needless to say, with a viewfinder adaptor and a PL mount adaptor, I can put my Super 16 cinema glass on them and create some fantastic imagery. Both cameras, a kit of Rokinon EOS-mount primes (8mm, 12mm, 24mm, 85mm), wireless audio, batteries, cards, adaptors, bag, tripod... discounted pricing but still new, the entire kit ran me $3k. I shot two projects and it paid for the cameras right away; now I'm just in the profits every time I use them on a paid gig.

 

Now, I've used comparable Sony and Panasonic cameras, but they aren't really any better. Sure, they have bells and whistles, but at what cost? External recorders to capture 10-bit 4:2:2 in anything other than that Long-GOP MPEG disaster of a codec? Sorry, not interested. The GH5 is the first Panasonic camera I've ever really been interested in, and I talked my friend into buying one, so I'll be using it quite a bit in a few months once he receives it. I'll do a full write-up, but needless to say my S16 glass won't work on it, so there go all those well-laid plans! LOL :)

 

I've not scientifically tested any of the Blackmagic cameras. I'm just a cinematographer: I go out and shoot poop, I come back and I edit it. Yeah, I'm not happy with a lot of the skin tones and how the camera deals with certain unusual situations. However, I know how to work around those issues in post and how to light properly to help avoid them. All digital cameras have their little issues, the RED and Arri cameras do too; it's just that they're so much more money, people buy them not realizing they too have issues.

 

Single-chip tech is every bit as capable of being film-like regardless of being CCD or CMOS. The two main problems are 1) the best dyes are patented, and 2) lesser dyes allow increased ISO at the cost of color purity. As for the dynamic range of CMOS exceeding CCD, that's done by artificial means: noise cancellation and other image-processing tricks. Now Sony has recently released a CMOS chip with analogue memory, like CCDs have, and IMO it's the best of both worlds. It has fairly good color science, native global shutter and larger pixel area like CCDs, but at the cost of CMOS. Sadly, there aren't any conventional cameras using this technology yet.

There's also quantum film right around the corner, which looks really promising, but I suspect it will wind up being an excuse to cram even more pixels into an unreasonably small area.

This is a discussion for another thread, cuz we could go on about CCDs vs CMOS all night long! :)

 

Suffice it to say, nobody has made a CCD look as cinematic as CMOS yet, artificial or not; the high-resolution cinema camera version doesn't exist. So until it does, it's all theory and discussion.

 

Check out the MTFs. Earlier T-grain films resolve quite well, but as emulsions kept getting thicker with each generation, the MTF drops more and more abruptly above 20 cycles/mm.

I haven't seen MTF values for Vision3 in a while. I'll have to look and compare.

 

However, the proof is in watching a print. I've watched dozens of prints over the last few years and haven't seen any non-Technicolor print as crisp as the modern stocks. Dunkirk is supposed to have a lot of 35mm prints, all done photochemically; that will be a great thing to sample.

 

I suspect a lot of that has to do with the sharpening effect of modern lens coatings and other processes. The resolution isn't greater, but there's a little "bump" in the MTF curve of a lot of film stocks before it drops like a rock, which adds to the perceived sharpness.

Ehh... maybe? Honestly, I don't think so, because a lot of movies are shot with more vintage lenses even on film. Also, just watch a 4K laser-projected version of something modern shot on 35: it's so crisp and beautiful, it really is another world compared to the restoration transfers I've seen of movies even from the early 2000s.

 

I agree. I spoke to a Kodak engineer a few years ago and she said they stopped all R&D. Also, I would love to see a film DESIGNED for 16mm rather than a stock designed for 35mm and merely slit for 16mm. When I spoke to the engineer, she said there were a few things they could do to make 16mm better, but since 35mm was the biggest seller, they couldn't justify the cost of a purpose-made 16mm stock.

No way would they ever make a purpose-built 16mm stock. The cost would bring the price so high, they'd never sell any. Honestly, 16 is very expensive these days because short ends don't exist and Kodak doesn't really give deals since they don't sell that much. 35mm sells so much, you can get recans for cheap and Kodak practically gives away stock if you need it. They've treated me really well with 35mm stuff; I've been very happy.

 

35 is really here to stay, and so is large format.


I really don't think you can design a stock that is "better" for 16mm yet isn't better for 35mm, because if the 16mm is better, then 35mm shooters are going to want to use it. Better as in sharper and less grainy? There isn't a market for that in 35mm?


CMOS is a garbage in-between technology. CCDs just don't have the dynamic range of CMOS. The industry also wanted to use the same glass and mount system, so they just went for a technology and we're stuck with it. Honestly, CMOS is more filmic than CCD, it's just lacking in the color depth that you get with a 3-CCD block.

 

 


I think you're somewhat confused.

There have been single-sensor CCD 35mm-sized cameras, the Panavision Genesis being the most notable example.

There are also 3-Chip CMOS (prism) cameras.

Apart from the superior (and non-patentable) colour separation you get from dichroic prism systems, a 3-sensor camera delivers displayable colour images almost directly from the sensors, which means there is minimal processing delay, vital for live television. Single-chip cameras can only deliver a fairly basic "live" image; most of the spectacular results you see from those are the result of an enormous amount of inter-frame jiggery-pokery in post-production.

 

The industry also wanted to use the same glass and mount system, so they just went for a technology and we're stuck with it.

Pretty egregious statement. Do you have any idea how much money is tied up in 35mm cine lenses? Far more than there is in cameras.

Also, it's virtually impossible to design a lens/sensor system for a 3-chip camera that can reproduce the depth of field characteristic of 35mm cine cameras. Panavision had several goes at this and the results were all laughable.
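A rough illustration of why: for the same field of view and the same final image size, depth of field scales roughly with the crop factor between formats, so the small 2/3-inch prism block needs a much faster aperture to match 35mm. The sensor widths below are nominal assumptions.

s35_width_mm = 24.9         # approximate Super 35 aperture width
two_thirds_width_mm = 9.6   # nominal 2/3-inch sensor width

crop = s35_width_mm / two_thirds_width_mm
t_stop_on_s35 = 2.8

print(f"crop factor: {crop:.1f}x")
print(f"to match T{t_stop_on_s35} depth of field on Super 35, "
      f"a 2/3-inch camera needs roughly T{t_stop_on_s35 / crop:.1f}")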

If they weren't able to make single-sensor 35mm-sized digital cameras, virtually all movies would still be shot on film.

George Lucas & Co were a classic example of the tail wanting to wag the dog.

 

 


 

 

A lot less of Apocalypto was shot on the Genesis than they admit (they used many cameras) and more of it was film than they admit. Nevertheless, I thought the Genesis was a good camera for some things, but most people seemed to use it as an excuse not to light scenes well, which leads to a very "video-like" image.

 

They certainly used a mixed bag :rolleyes:

 

Aaton A-Minima

Arriflex 35-IIC, Panavision Primo Lenses

Arriflex 435, Panavision Primo Lenses

Ikonoskop A-Cam

Panavision Genesis HD Camera, Panavision Primo, Lightweight and Nikon Lenses


I really don't think you can design a stock that is "better" for 16mm yet isn't better for 35mm, because if the 16mm is better, then 35mm shooters are going to want to use it. Better as in sharper and less grainy? There isn't a market for that in 35mm?

In any case, film is actually made in wide sheets that are sliced up as required. The manufacturing process would be exactly the same; you'd just get more footage if you were cutting it to 16mm vs 35mm.
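A rough illustration of that point, using an assumed master-roll width (real coating widths vary):

master_roll_width_mm = 1372   # assumed ~54-inch wide coating roll, for illustration only

strips_35mm = master_roll_width_mm // 35
strips_16mm = master_roll_width_mm // 16

print(f"35mm strips across the roll: {strips_35mm}")
print(f"16mm strips across the roll: {strips_16mm}")

# Each foot of 16mm also holds more frames than a foot of 4-perf 35mm:
print("frames per foot: 4-perf 35mm = 16, standard 16mm = 40")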


I think the other thing is that digital has its own unique aesthetic as well. Look at Collateral by Michael Mann; you couldn't do that movie that way on film.

 

So I think both film and digital should co-exist, but I don't think film is necessarily superior.

I get the feeling that a couple of you guys would rather 100% of movies were still shot on film (correct me if I'm wrong).

