Epic and Scarlet release in 2010


Emanuel A Guedes


I don't get it, Keith.

I can throw any 4K RED material onto Premiere CS5 and - without any special hardware beyond a 350 Euro Nvidia card - I can edit it like DV or DVCPRO HD. Easy, smooth, realtime.

 

What seems to be the problem (when they don't force you to work on FCP or AVID)?

 

Frank

 

Often in broadcast post-production you need to have a workflow in place that enables various programs and processes to speak to each other. These workflows aren't based around Premiere, and the editors don't use it either - and they are the important element in post-production. What works for you on your own may not be that simple in a more complex environment with tight schedules.

 

AVID seems to have caught up and is becoming more friendly towards ingesting all these new codecs.


Why is resolution such a big deal? 80%+ of the viewing public don't have 20/20 vision. People can't resolve 4K+ of resolution. Sharpness, contrast, latitude and refresh rate are more important for great images than sheer number of pixels. We know this yet ignore it. For the majority of applications HD/2K is ideal in terms of image quality, data management etc. for presentation.

 

Another thing, how many RED shoots master their projects in 4K? I imagine very few. In Hollywood, sure, but for independent features they mostly finish in HD/2K, right?

 

 

Practically speaking, yes, 2K/HD seems to satisfy a lot of people currently (maybe not me, so much...) but even so, there are a lot of advantages to oversampling, especially if you want less of an electronic look. If you are going to end up showing something in 2K/HD, extra resolution allows you to use a heavier Optical Low Pass Filter, it allows you to use noise reduction software, it needs less sharpening, etc., all of which help give you a detailed but smoother, less-sharpened look.


I'm not sure how the 1.6K figure comes about. Assuming that Arri are going to debayer for HD, the camera should be capable of about 1.9K, which I calculate from the 2700 (Bayer) photosites in use when shooting HD. The Arri table uses the same figures for RAW and HD, even though the latter covers a very slightly smaller area on the sensor. http://www.arridigital.com/technical/cameraspecs

 

But if you have a 1.9K recording, wouldn't you want to avoid resolving exactly 1.9K in a line chart? Wouldn't that create moire? I wouldn't think any HD camera would actually want to resolve 1.9K. Anyway, from what I've heard, most pro HD camera recordings are in the 1.6K-1.8K range in terms of measurable resolution.
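The moire concern above can be illustrated numerically: a chart frequency just beyond what the sampling grid can represent folds back and shows up as a coarser false pattern. A toy 1D model, not a measurement of any camera:

```python
import math

# Aliasing sketch: sample a 1D "line chart" pattern whose frequency sits
# just above the Nyquist limit of a 48-sample grid.  The detail folds back
# and is indistinguishable from a lower-frequency (moire) pattern.
n = 48                               # samples across the "sensor"
nyquist = n // 2                     # 24 cycles is the most it can hold
f_in = 26                            # chart frequency, above Nyquist
samples = [math.sin(2 * math.pi * f_in * i / n) for i in range(n)]

# At these sample points the 26-cycle input matches a sign-flipped
# 22-cycle pattern exactly: the excess detail became a false coarse one.
folded = [-math.sin(2 * math.pi * (n - f_in) * i / n) for i in range(n)]
max_diff = max(abs(s - f) for s, f in zip(samples, folded))
```

This is why a recording system is deliberately softened (via the OLPF) before it can resolve right up to its own pixel count.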


But if you have a 1.9K recording, wouldn't you want to avoid resolving exactly 1.9K in a line chart? Wouldn't that create moire? I wouldn't think any HD camera would actually want to resolve 1.9K. Anyway, from what I've heard, most pro HD camera recordings are in the 1.6K-1.8K range in terms of measurable resolution.

 

Indeed moire would be a feature to be avoided.

 

I think the point I was trying to make is that the Arri sensor should, on paper, manage a tad more in HD mode, since its pixel count doesn't drop off that much from 2k RAW. If they soften it to avoid moire effects in HD, that's another matter.

 

It's not surprising the 2/3" cameras are in that resolution range.


But comparing HD out of the Alexa to 5K out of the Epic is a bit like comparing the worst option out of one camera to the best option out of the second, even if that's a likely scenario.

 

Comparing these cameras overall might be more like comparing a Porsche Carrera with its "little" 385hp engine to an old Mustang built up with a 600hp engine.


I have completely the opposite opinion -- to design a digital cinema camera around 1920 x 1080 systems and make it beholden to 10-year-old HD tape workflow technology is completely assbackwards.

Don't get me started on the assbackwardness of the broadcast TV industries.

We're stuck with a whole lot of compromises that were based on the assumption that most cameras and display devices would be forever based on vacuum-tube technology, or on devices designed to emulate their operation.

The reality is that the TV infrastructure is what it is, currently largely based on the best that Sony could come up with 15 years ago. They are not going to just bulldoze the entire structure into a gigantic dumpster and re-cable and start from scratch.

 

I heard almost exactly the same stupid arguments about 25 years ago, when Matsushita announced that the industry was required to switch to all-component operation, based around their MII analog component format. There were the same clueless MII zealots making much the same assertions and arguments, although much more slowly without the Internet :lol:

 

In fact, while it seems inevitable that most of the world (ie the 96% that is not the USA) will eventually switch to digital TV transmission, most of the studio infrastructure seems set to remain PAL/NTSC for a very long time indeed.

 

Regarding the RED, giving the option of realtime HD delivery OR "4K" hardly seems "assbackwards" to me. The 5D is the standout elephant in the room here.


I don't get it, Keith.

I can throw any 4K RED material onto Premiere CS5 and - without any special hardware beyond a 350 Euro Nvidia card - I can edit it like DV or DVCPRO HD. Easy, smooth, realtime.

 

What seems to be the problem (when they don't force you to work on FCP or AVID)?

 

Frank

Edit AND render?

Besides, I don't know you.

I haven't been to any wrap parties with you.

I haven't worked successfully with you before on my many Emmy-winning projects.

 

You might be a great guy, resourceful and unflappable at 3AM when the heat is on.

 

Or you might be a complete flake, with an irritating speech impediment, with hidden chemical dependence problems, and a tendency to lecture everybody on how the USSR failed Marxism when financial problems crop up.

 

Or you might be a steady competent worker, but you only take a bath once a week. Or month. Or you might be plain, agonizingly, excruciatingly BORING!

 

You might be none of these, you might be all of these, who knows?

 

But the guy with the old-fashioned tape-based workflow - OK, I know he's a closet queen, with a gambling problem, and he listens to Abba on his iPod and eats raw liver sandwiches - but he gets the job done, and we can have a beer afterwards.

 

This is what's known as my comfort zone.

 

Most industry people aren't totally averse to taking risks, it's just that they tend to have this irritating habit of only wanting to take risks that benefit THEM :P

 

 

 

Get the picture?

Link to comment
Share on other sites

These urban myths in digital cinematography are hard to destroy... So I'll try it again:

 

Typical test charts are problematic because they operate under unrealistic conditions (high contrast, black/white, alignment along photosites) and easily suggest inflated resolution numbers (2k, 2.4k, 2.8k, 3.2k...28k!)

http://provideocoalition.com/images/uploads/ResDetail.jpg

That's a sharpened RED sample @"4k" with an excellent lens and not strictly horizontal/vertical patterns - the only one I've found (hopefully it's properly done) - and it barely reaches 2.5k ("1250" on the chart) without aliasing (just because you see lines doesn't mean they're actual detail).

But this is only half the picture; actual resolution numbers (color, lower contrast, less obtrusive sharpening) should be even lower - especially compared to images not derived from Bayer filtering, compression or an OLPF (film, for example).

But that's still just resolution; what actually matters in the resolution/sharpness debate for our perception is MTF at a given frequency - resolving a certain amount of detail at a given contrast - and even regular testing equipment can be "tricked" by adding excessive sharpening.

That's the basic theory.
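The MTF point above (contrast at a given frequency, and how sharpening can inflate it) can be sketched in one dimension; the 0.4-contrast sine and the 3-tap sharpening kernel are arbitrary choices for illustration:

```python
import math

def michelson(signal):
    """MTF-style contrast figure: (max - min) / (max + min)."""
    return (max(signal) - min(signal)) / (max(signal) + min(signal))

n = 64
# A low-contrast sine "chart": mean level 0.5, amplitude 0.2 -> contrast 0.4.
sine = [0.5 + 0.2 * math.sin(2 * math.pi * 4 * i / n) for i in range(n)]

# Crude sharpening: 3-tap kernel [-0.5, 2.0, -0.5] (sums to 1, so the mean
# level is preserved), applied with wraparound at the ends.
sharp = [2.0 * sine[i] - 0.5 * sine[i - 1] - 0.5 * sine[(i + 1) % n]
         for i in range(n)]

mtf_chart = michelson(sine)        # 0.4: the "honest" figure
mtf_sharpened = michelson(sharp)   # higher: sharpening inflates the chart
```

The sharpened signal measures higher contrast at the same frequency even though no new detail exists, which is the sense in which test equipment can be "tricked".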

 

Actually defining what pixel count you want is not a matter of production technology or development skill in the sensor department (sensor cost is mostly R&D plus some size-related production costs, and has nearly nothing to do with pixel count) - it's a trade-off, a compromise:

Higher pixel count (with lower pixel pitch) results in higher MTF (also because of a weaker OLPF) but, given similar technological standards in sensor design (ALEXA has >70% fill rate, already top-notch for CMOS; I don't expect more from RED), lowers DR (saturation minus read noise) and increases noise (lower photon count due to the smaller photosite area). A specific problem in cinematography is the fixed output size (1080 or 2k or 4k, but not 3k or 5k) and the data rate: 3k (16:9 aspect ratio; EPIC is a bit wider) at 16 bit and 24fps is 2Gbit/s, and 5k is 5.6Gbit/s - impossible to manage with today's technology.
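The data rates quoted above check out arithmetically; a minimal sketch, assuming 16:9 frame heights (1728 and 2880 lines are my assumption, derived from the stated widths) and one 16-bit sample per photosite:

```python
# Uncompressed sensor data rate = width * height * bits_per_sample * fps.
# One Bayer sample per photosite, 16 bits, 24 fps, as in the post.
def data_rate_gbit(width, height, bits=16, fps=24):
    """Return the uncompressed data rate in Gbit/s."""
    return width * height * bits * fps / 1e9

rate_3k = data_rate_gbit(3072, 1728)   # 3K at 16:9 -> ~2.0 Gbit/s
rate_5k = data_rate_gbit(5120, 2880)   # 5K at 16:9 -> ~5.7 Gbit/s
```

Which matches the ~2 Gbit/s and ~5.6 Gbit/s figures given, and shows why heavy compression is unavoidable at 5K.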

 

ALEXA has a pixel pitch of 8.25µm (= 68µm² photosite area), which results in 3072 actual pixels across 25.3mm of sensor width (the sensor itself is nearly 29mm wide).

EPIC has a pixel pitch of 5.4µm (= 29µm²), which results in 5120 actual pixels across 27.6mm of sensor width (the sensor itself is ~30mm wide).
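The geometry above is easy to cross-check; a small sketch using only the pitches and pixel counts quoted:

```python
# Active sensor width in mm from pixel pitch (µm) and photosite count.
def active_width_mm(pitch_um, pixels):
    return pitch_um * pixels / 1000.0

alexa_mm = active_width_mm(8.25, 3072)   # ~25.3 mm, as stated
epic_mm = active_width_mm(5.4, 5120)     # ~27.6 mm, as stated

# Photosite area ratio (ALEXA vs EPIC): 8.25^2 / 5.4^2, i.e. the ALEXA
# photosite is "more than twice as large", as the post says later.
area_ratio = (8.25 / 5.4) ** 2
```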

 

ALEXA as a real 4k camera (although even 1.5x oversampling is not perfect) would need a 6k sensor and RAW data - theoretically that would mean a 2-stop loss in DR and more noise (although the higher pixel count compensates for that to a certain degree) with enormous data rates. They decided against it for a good reason, just like Panavision generated 1080p from a 12MP sensor and not 4k like RED.
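The "2 stops" figure above follows from photosite area; a first-order sketch that ignores read noise and fill-rate differences:

```python
import math

# Going from a 3K to a 6K sensor of the same width halves the pixel pitch,
# quartering the photosite area.  Full-well capacity (and thus highlight
# headroom) scales roughly with area, so the saturation point drops by
# about log2(area_ratio) stops.  A simplification, not a sensor model.
pitch_3k = 8.25                      # µm, the ALEXA pitch quoted above
pitch_6k = pitch_3k / 2              # same sensor width, double the pixels
area_ratio = (pitch_3k / pitch_6k) ** 2   # 4x smaller photosites
stops_lost = math.log2(area_ratio)        # ~2 stops
```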

 

Judging from RED's 4k image, a 4k image downsampled from a native 5k EPIC image (compressed as well) will only gain very little MTF (and complete suppression of aliasing) in comparison to a downsampled 2k image - hard to justify given a four times higher data rate and high costs.

 

So they're basically both cameras for 1080/2k output, but because ARRI never intended to create 4k+ (that's what 35mm is for), they could use photosites more than twice as large (given similar fill rate) on the sensor! And they can record uncompressed RAW (or is 2k not ready yet, just ARRIRAW with 2880 pixels?).

That's what it's about - actual IQ (not just resolution, but MTF, color, rendition, DR, lack of artifacts...) in a 2k workflow, reliability and usability in everyday work - not 1080p, 2k, 4k, 5k or 28k or other marketing claims.

 

If we don't appreciate these sometimes difficult-to-communicate design decisions, the digital cinematography market will become a shareholder-value and marketing-oriented business! More gimmicks, lower standards, bugs and less real innovation. Worse for everyone - just like NASA having to use cheap Foxconn-made notebooks which have to be thrown away every few weeks, because the market lacks high-end alternatives.

 

Epic is made in the US?

So they have trained (3+ years) technicians for machining and assembly like ARRI now?

That would be really surprising in a good way - or is it just the prototypes? None of their previous products were even assembled in-house, not even in the US. Don't get me wrong, major economic value is supplied by international specialists in all digital cameras (sensors made in US or Israeli fabs with German/Dutch technology, for example) - but the majority of the ALEXA is not made with slave labour.

Edited by georg lamshöft

 

So they're basically both cameras for 1080/2k-output but because ARRI never intended to create 4k+ (that's what 35mm is for),

 

Well, I think that's the basic problem with your argument, that digital cinema does not need to attain 4K resolution because that's what 35mm is for. There are all sorts of practical reasons for 2K becoming a standard for digital cinema, but that doesn't mean it's a good thing.

 

Philosophically, I think that cinema should try to be 4K at least and home video can be 1080P/2K. So the goal of digital cinema cameras should be to get closer to 4K than 2K. We can quibble about the actual resolutions recorded by the cameras (you can shoot your own tests and come to your own conclusions), but this is a more fundamental question: if digital is going to eventually replace 35mm, shouldn't it strive to match 35mm more closely in resolution, dynamic range, color reproduction, etc.? Saying "just shoot 35mm when you want 4K" is missing the point; we may not have 35mm around for much longer.

 

ARRI deciding to stick to 2K, in my mind, can only be a temporary, interim solution. Eventually, that level of resolution will be deemed only acceptable for television work. I hope ARRI sees that, and I think they will.

 

Now obviously resolution is only one element of overall image quality, so I don't want to overemphasize it, but I think anyone who can test these cameras is going to find the resolution difference fairly easy to measure, that's just common sense looking at the specs of the sensors and the recording formats. 2.8K RAW converted to 1.9K RGB (1080P) is not going to have as much resolution as 5K RAW converted to 5K RGB, or 4K RGB, no matter what the actual measurements show you (i.e. whether 5K RAW ends up measuring at 4.5K, 4.2K, 3.9K, whatever). Even 2.8K RAW converted to 2.8K RGB isn't going to have the same level of resolution, it's probably going to be about a third lower.

 

If that doesn't matter to you, fine, but don't say that the differences don't exist just because you don't care about that difference.


We didn't end up with 2K RGB as a standard for D.I. work because everyone agreed it was the best format, it was a compromise based on the storage and processing speed limitations of the technology of the day. And even today, it will always be faster, cheaper, and easier to process 2K rather than 4K, being a 4X difference in data.

 

So, in my mind, to design new digital cinema technology around the notion that the gold standard is now 2K instead of 4K is backwards-thinking. 2K was a compromise and it will always be a compromise. It may be a practical and affordable compromise, it may be a compromise of little concern compared to other needs of the project, but it is a compromise nevertheless. I'm sure there can be creative reasons for wanting lower resolution images, but I'm talking about cinema in general: the average cinematic image of a wide shot in daytime made with no diffusion on the lens.

 

Now some of you are particularly unfair because you simultaneously complain that the Red One is not "true 4K" (to which the solution is to record 5K RAW and higher) and then say that 2K is a smarter idea and 4K is ridiculous. Well, if you really like 2K, then a 4K sensor is a great idea because it's a simpler downsample (every two pixels becomes one) and it allows for less aliasing because you can get away with a heavier anti-aliasing filter on the camera. But don't challenge Red to make a "true 4K" camera only to turn around and make fun of them for trying to make higher resolution cameras because you think that 2K is "good enough".
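The "every two pixels becomes one" downsample mentioned above can be sketched as a simple 2x2 box filter (an illustration only; a real resampler would use a better-windowed kernel, and this ignores the Bayer step entirely):

```python
# A 4K -> 2K downsample where each non-overlapping 2x2 block of samples
# is averaged into one output pixel (box filter).  Image is a 2D list of
# luma values; dimensions are assumed even.
def downsample_2x(image):
    """Average non-overlapping 2x2 blocks of a 2D list."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A tiny 2x4 "frame" becomes a 1x2 one.
half = downsample_2x([[10, 30, 50, 70],
                      [10, 30, 50, 70]])
```

The averaging step is also what suppresses aliasing energy, which is the argument for a 4K sensor even when the target is a 2K finish.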

 

Personally, I think that digital cinema is going to have to go to 4K as a standard if only to differentiate itself from HDTV in the home.


Wait, we need to capture at least 4K for digital systems to match 35mm film? I don't buy that. I think HD is clean and sharp enough as it is.

 

A lot of HD cameras around today oversample for the best images to output 2K/HD (F35 & Genesis both have 5760 photosites across the horizontal, Alexa from 3.5K etc.), the RED isn't any different. Maybe it should be an aesthetic choice to shoot at HD instead of 4K in the same way one may choose Super16 over 35mm.

 

Does it mean to say that any number of films and television shows which have been shot in HD are lesser when compared to material that originated as 4K? If a film has great lighting, production design and the like, what does it matter to capture it at a resolution that people can't see with their own two eyes? I'm not saying 4K is a bad thing, simply that it should be used for the right reasons, not just for its own sake. The majority of users don't need higher resolutions beyond simply wanting them. As the saying goes: use the right tool for the job.

 

HD is not 35mm. It is not objectively better or worse. It's - something else.

 

As far as tapeless capture is concerned, it's the way forward for productions in film and television, especially with recorders like the Cinedeck and Codex. Despite this, Sony is upgrading HDCAM-SR to 12-bit 2K by the end of the year, which doesn't rule out tape wholly just yet.


Wait, we need to capture at least 4K for digital systems to match 35mm film? I don't buy that. I think HD is clean and sharp enough as it is.

 

That's your opinion. Objective measurements generally get around 3.2K-ish for Super-35 but anamorphic photography would have more vertical resolution than Super-35 cropped to 2.40. So if 35mm isn't 4K, it isn't 2K either and I'd rather round up than down.

 

My point is that IF digital cinema ends up replacing 35mm, then we can't settle for 2K/HD as becoming the new gold standard for resolution or else we are going down instead of up, we would be settling for less.

 

Haven't you been reading all the endless complaints here about 2K D.I.'s being a compromise for 35mm photography?

 

You could look at this article, which demonstrates why 2K is not quite adequate for 35mm scanning:

http://digitalcontentproducer.com/mag/video_digital_cinemas_special/index.html

 

We already can watch HD at home, I'd want to get something more than that on a 100' theater screen!

 

Honestly it would depress me to think that all cinema of the future, at home and in theaters, was going to be HD.

 

Really, is that what everyone here wants? Cinema to all be 2K???


HD is not 35mm. It is not objectively better or worse. It's - something else.

 

You can be very objective as to whether it's better or worse. Each pixel or grain is a piece of information. HD has 2,073,600 of them per frame. How many does a frame of 35mm have?
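Counting pieces of information per frame, as above, is a one-liner per format (the 35mm grain count is left out deliberately; there is no single agreed figure for it):

```python
# Pixels per frame for common delivery formats.  HD really is 2,073,600
# "pieces of information"; DCI 4K carries exactly 4x the DCI 2K count.
hd = 1920 * 1080        # 2,073,600
dci_2k = 2048 * 1080    # 2,211,840
dci_4k = 4096 * 2160    # 8,847,360
```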

 

You can say that neither is better subjectively, but if you want to be objective choose a metric. The problem is that one format beats the other in one metric, while the opposite is true for another metric. What is better is largely a matter of cost-benefit analysis. And that's subjective.


As for the notion that HD photography looks "sharp", sure it's very sharp, sometimes horrifyingly sharp. But it's an electronic sort of sharpness, not the detailed but smoother look of film, which I think will only be achievable with oversampling so we can feel free to use heavier anti-aliasing technology while retaining fine detail. We can get plenty of sharpness with 1080P but it requires using as much resolution as the system can offer, so as a result, there tends to be a lot of edginess and crispness to detail, a certain twittery, aliasing knife's edge to shapes that looks very electronic.

 

Just look at still photography and compare a 2MP picture (which is what an HD frame is) to, let's say, an 18MP picture that is downsampled; the higher pixel count image looks a lot more pleasant in terms of edge detail.


My point is that IF digital cinema ends up replacing 35mm, then we can't settle for 2K/HD as becoming the new gold standard for resolution or else we are going down instead of up, we would be settling for less.

 

Thank you, David, for that emphasized IF. The fight isn't over yet.

 

I agree that 4K is certainly better than 2K and I definitely want my cinema experience - when it has to be digital - to be better than my home theater experience. But I think that Arri have gone down the right road given existing technology and infrastructure (in terms of gear more than theater outfitting) by opting for larger photosites for more dynamic range with less noise. I think the day where we can get that out of 4K is yet to come, but to that end I'm glad that Red are out there pushing for it, hopefully to soon be joined by the likes of Aaton.

 

I wonder though, just given the physics of capturing light, does it make any sense for these companies to manufacture 65mm format cameras with 4K resolution? I mean (DoF issues notwithstanding), the technology exists to handle the resolution without increasing the size of the cameras, so the major hurdle would be availability of lenses, right? Or am I missing something?


 

I wonder though, just given the physics of capturing light, does it make any sense for these companies to manufacture 65mm format cameras with 4K resolution? I mean (DoF issues notwithstanding), the technology exists to handle the resolution without increasing the size of the cameras, so the major hurdle would be availability of lenses, right? Or am I missing something?

 

Well, by coincidence, that's been around for a few years: the Phantom 65, a 4K 65mm sensor. The main problem, as always with larger sensors, is the lack of cinema optics to cover it. But the advantage of giving yourself more room to work with is the reason the 5K Mysterium-X sensor is 30mm wide and the 6K Montro sensor will be 36mm wide (FF35 / VistaVision size), which definitely would allow larger photosites in that example.

 

I agree that there are immediate practical advantages to what ARRI is doing with the Alexa; I'm speaking about the bigger picture, so to speak: where we are headed if 35mm goes away, and why I think we need to start thinking of 4K as the new digital cinema standard, at least for origination and mastering, even if projection stays at 2K for a while, because even that will change.

 

And certainly even if 4K becomes the standard in movie theaters, 2K/HD will remain the standard for TV for awhile, and probably smaller features, so Alexa will get plenty of work before it ever becomes obsolete. But whatever the generation is after Alexa will hopefully be 4K+.

 

ARRI believes that 35mm is better than 2K, by the way. In fact, the reason the ARRI D20 got that name supposedly was because ARRI thinks that HD/2K is somewhere between 16mm and 35mm in terms of resolution, a fictional "20mm" format (sort of an in-joke I think.)


David does that make the D21 the "Super 20mm" format ;)

 

I agree that in the future, which I still think will be peppered with film origination, we're going to need to look at 4K at least for our theater experiences and our D-cinema cameras. However, I also honestly think that we are approaching a limitation in the technology where we really start to diminish our returns. Yes, we get more resolution, but we lose sensitivity in our cameras. And, given the nature of this industry, I think we will see a synthesis of both: 2K systems with better low-light capabilities (like high-speed film stocks) used in situations where you need that performance, and 4K systems (like mid-range stocks) where you have the ability to light more. In truth, how much of that overall resolution really reaches the screen, and how much will an audience notice? My opinion, of course.


But I've shot with the M-X Red One and 800 ASA is no problem, and 1600 ASA is fine for night work... and that's a 5K sensor, so how much more sensitivity do you really need???

 

Not to mention if you have a 4K file and you only need to end up with a 2K master, then you can get away with noise reduction, so an M-X image at 3200 ASA, for example, noise reduced and then downsampled to 2K, might not look any noisier or less detailed than an Alexa image at 3200 ASA at 2K.
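The noise-reduction-by-downsampling argument above can be simulated directly: averaging each 2x2 block of independent noise halves the noise standard deviation. Synthetic Gaussian noise stands in for sensor noise here; real camera noise is not fully independent per pixel, so the real-world gain is somewhat smaller:

```python
import random
import statistics

# A 4K -> 2K downsample averages 4 samples per output pixel, so
# independent noise drops by a factor of sqrt(4) = 2 (about one stop).
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(100_000)]
binned = [sum(noise[i:i + 4]) / 4.0 for i in range(0, len(noise), 4)]

sigma_full = statistics.pstdev(noise)     # ~1.0 at "4K"
sigma_binned = statistics.pstdev(binned)  # ~0.5 after the "2K" finish
```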


Quite true. And for me personally, I'd be interested in having a 6400 ISO sensor ;) but that's just for personal pleasures. I don't think you'd need much more than 1000 ASA for 90% or so (just a guesstimate) of productions.


That's your opinion. Objective measurements generally get around 3.2K-ish for Super-35 but anamorphic photography would have more vertical resolution than Super-35 cropped to 2.40. So if 35mm isn't 4K, it isn't 2K either and I'd rather round up than down.

 

My point is that IF digital cinema ends up replacing 35mm, then we can't settle for 2K/HD as becoming the new gold standard for resolution or else we are going down instead of up, we would be settling for less.

 

 

Yours and Jim's (btw, we all know he's reading these lines) point. Whether we like it or not, we can't say Jim doesn't have a successful business strategy. It's posts like yours that make the industry move ahead, digital included. All that digital vs. film stuff is just plain rubbish.


Yours and Jim's (btw, we all know he's reading these lines) point.

Hi Jim!! How you been? How are the dogs doing? You never talk about them... Why not post some Epic footage of a couple of those? I hear even YouTube supports 4K now! B)

 

Whether we like it or not, we can't say Jim doesn't have a successful business strategy.

 

And you know this how?

OK he's sold cameras with serial numbers up to 7,000 or so, which doesn't necessarily mean he's actually sold 7,000.

Has he made any money?

On average, how many of his customers have made money?

 

Who knows?

Still I guess if it's OK to redefine the terms:

 

  • "Will/is" to mean "expect/Hope to"
  • "K of resolution" (meaning "horizontal RGB pixels in a film scan") to mean "number of original monochrome pixels per row of a colour masked sensor"
  • "Resolution" to mean "number of original monochrome pixels of a colour masked sensor" (or alternatively "the square of the original meaning of resolution")
  • AS WELL AS the meaning and definition of the terms "resolution" and "lines" themselves.
  • "No noise reduction" to mean "noise reduction that-is-not-called-noise-reduction"
  • "Raw" (originally meaning "completely unprocessed image data direct from the sensor"), to mean highly compressed but otherwise "completely unprocessed image data direct from the sensor"
  • "Flesh coloured" to mean "Brown" etc etc,

Yeah, I guess it's OK to redefine "successful business strategy". :lol:


Judging from David Mullen's quote of the Digital Content Producer article about 2K versus 4K scanning, an argument can be made that 4K scanning is superior, but the law of diminishing returns sets in and what we are left with is a very small improvement. When comparing a 2K versus a 4K scan I do not see the four times increase in resolution that is promised, but rather a slightly smoother image with the outlines slightly less jagged and slightly reduced aliasing artifacts. Therefore a compelling argument can be made that for a 35mm film negative scan 2K is good enough and 4K is overkill, and that in order to really take advantage of 4K digital projection we must have access to 65mm film origination; otherwise 4K is just a waste of money.


When comparing a 2K versus 4K scan I do not see the 4 times increase in resolution that is promised

 

Not surprising you don't see a four times increase, since that's only a two times increase in resolution, as has been pointed out many, many times, here and on other forums, to no avail. A flock of 10,000 galahs that have been taught to screech out the same B.S. statement over and over doesn't make it correct.

 

And as is also clearly pointed out in my copy of Donald Fink's seminal 1940 textbook "Principles of Television Engineering"! (No, not about the birds; about the resolution.)

 

To get twice the resolution, which is for the benefit of the viewer, you need four times as much data, which is a liability for the originator/distributor, not an actual benefit as some people seem fond of implying. "Making a virtue out of a necessity" as the old saying goes.
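The resolution-versus-data arithmetic in this post can be stated in two lines (resolution taken, as here, to be a linear measure along one axis):

```python
import math

# Resolution is linear; data grows with area.  Doubling resolution
# quadruples the data, and a 4x jump in pixel count buys only a
# sqrt(4) = 2x resolution increase.
def data_factor(resolution_factor):
    return resolution_factor ** 2

def resolution_gain(pixel_count_factor):
    return math.sqrt(pixel_count_factor)

double_res_cost = data_factor(2)   # 4x the data for 2x the resolution
quad_data_gain = resolution_gain(4)  # 2x the resolution for 4x the data
```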

