
RedRay a reality?


Keith Walters



I'm looking forward to seeing this product.

 

I can see this device as being great for film festivals, assuming it can play back a variety of sources loaded into it. What sort of output does it use to go to what sort of 4K projector? What was used at the Red party? The Sony 4K projector?



 

The production unit will have Quad DVI (or HDMI) and Quad HD-SDI outputs. Use 4 for 4K, 2 for 2K or 1 for 1080P output to any current display. We used a Sony 4K projector at the reduser party.

 

Jim

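Jim's 4-link / 2-link / 1-link tiering checks out arithmetically. A minimal sketch, using the usual published frame sizes rather than anything from Red's spec sheet:

```python
# Rough sanity check (not Red's spec): a 4096x2160 "4K" frame tiles
# exactly into four 2048x1080 "2K" quadrants, so one display link per
# quadrant gives the 4 / 2 / 1 link tiering described above.

def pixels(width, height):
    return width * height

k4 = pixels(4096, 2160)   # 4K frame
k2 = pixels(2048, 1080)   # 2K frame
hd = pixels(1920, 1080)   # 1080p frame

assert k4 == 4 * k2       # 4K = four 2K tiles -> four links
assert k2 < 2 * hd        # a 2K frame fits easily on two 1080p-class links
print(k4 / hd)            # ~4.27x the pixels of a single 1080p display
```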

In a year I am going to say that compression is so '90s and 2000s.

 

Honestly, it's a cop-out. Come up with a way to fit more information into smaller spaces. I haven't seen a single form of compression, except truly lossless compression formats, that I am satisfied with.


 

Looks like you are stuck with it until someone can fit more info into smaller spaces. Why not vote for "fit the most important info into the smallest space" in the meantime?

 

Jim


I can't imagine any technical reason why this codec couldn't be used for material from other sources.

Neither can I, and neither presumably can thousands of other people, but I prefer knowing to imagining.

I couldn't imagine why the RED One couldn't output live 1920 x 1080 either, but eventually the information became available.


Conventional wisdom is usually based on conventional thinking and current methodology. Jim

Ugh. That just sounds so-o-o-o 1970s ;)

 

But this is correct; sometimes these threads read like something out of a 19th-century copy of Scientific American, where they would occasionally try to paint a picture of what the 20th century would be like. You know, the sky is full of private airships, all well-to-do homes have their own telegraph key, and San Francisco harbour is crossed by a gigantic copy of the Brooklyn Bridge...

 

Considering the blazing speed with which the standard home TV screen went to 42" full 1920 x 1080, at prices way below what people used to pay for fairly ordinary large-screen SD CRT sets, I can't see 4K screens being that far away. The manufacturers will just need some reason why we need to replace our perfectly good existing equipment.

 

And a lot of producers are going to have a lot of explaining to do. :lol:


Though film doesn't have a single fixed pixel grid, the range of scanning structures that make sense is not unlimited. Even if you shoot 5201 with master primes, upping the scan from 32K to 64K is only going to get you a marginally better look at the grain of the camera original, not a better image of the original scene.

 

-- J.S.

Huh?

Do you mean "32K" as in 32,768 pixels across?

Why on earth would you want to do that?

I was only talking about going from 2K to 4K (or maybe even 8K, which is already available, although expensive and currently pointless).


Do you mean "32K" as in 32,768 pixels across?

 

Yup. My point is that someplace between 4K and 32K, scanning film with more K stops making sense. You're not seeing the picture better, you're just seeing the film grain better. We'll find that point when we pass it. ;-)

 

 

 

 

-- J.S.


In a year I am going to say that compression is so '90s and 2000s.

 

As long as storage and bandwidth cost money, we'll have compression. The price/performance points will shift around, but the fact remains that we already have compression systems that work to the satisfaction of the overwhelming majority.

 

Consider SR -- a $90K machine using IIRC 2.7:1 compression. Is it worth $243K to do the same thing without compression? Or $270K paralleling three of them as bit buckets? It ain't gonna happen.

 

 

 

 

-- J.S.
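The dollar figures in that comparison are just the compression ratio applied linearly to the quoted deck price. Spelled out (figures from the post, not current list prices):

```python
# The post's arithmetic: dropping SR's ~2.7:1 compression means buying
# ~2.7x the recording capacity, at roughly linear cost.

sr_price = 90_000                  # quoted price of one SR deck
ratio = 2.7                        # "IIRC 2.7:1" compression from the post

print(round(sr_price * ratio))     # 243000 -> the "$243K" figure
print(3 * sr_price)                # 270000 -> three decks paralleled
```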


In a year I am going to say that compression is so '90s and 2000s.

 

Honestly, it's a cop-out. Come up with a way to fit more information into smaller spaces. I haven't seen a single form of compression, except truly lossless compression formats, that I am satisfied with.

 

It's not just that you need to come up with a way to fit information into smaller spaces. That happens like clockwork. The issue is that there are significantly diminishing returns as one moves from highly compressed to totally uncompressed. The differences between a Blu-ray disc (25-30 Mb/s) and its uncompressed master (989.36 Mb/s) are already subtle enough that most people wouldn't notice them. Once you back off from ~35:1 compression to 12:1 compression (Redcode 28), you've already got footage that holds up pretty well as an acquisition format, as Red has demonstrated. Moving to 5:1 compression will get you a little more. Moving to 2:1 compression will get you very little more. And moving from 2:1 to uncompressed gets you nothing, or almost nothing.
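A quick way to see those diminishing returns is to apply each ratio to the ~989 Mb/s uncompressed master rate quoted in the post:

```python
# Bit rate remaining at each compression ratio discussed above, starting
# from the ~989.36 Mb/s uncompressed HD master rate quoted in the post.

uncompressed = 989.36  # Mb/s

for ratio in (35, 12, 5, 2, 1):
    print(f"{ratio:>2}:1 -> {uncompressed / ratio:7.1f} Mb/s")

# A 28 Mb/s Blu-ray stream works out to roughly 35:1 against that master.
assert round(989.36 / 28) == 35
```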

 

At the same time, at a given data rate, a compressed signal is (assuming you're not misusing your compression algorithm) always going to look better than an uncompressed signal that has been downscaled or (especially) acquired at a lower resolution. Conceptually, in this context, the best approach is to think of downscaling as an extremely primitive and ineffective compression algorithm, which is being compared to much more advanced algorithms. The same is true of saving space by reducing bit depth, subsampling color, etc.

 

Combine these two factors and you can derive the criteria under which uncompressed makes sense. It doesn't really make sense until you have enough resolution, etc. that nobody could reasonably want more (or no sensor can acquire more) and technology has advanced to the point that it's almost trivial to work with such footage uncompressed.

 

For instance, uncompressed SD is nearly trivial now. But compression hasn't died yet, because nobody was happy with the resolution provided by SD; they want HD (or better), and uncompressed HD is still annoying.

 

At some point, people will presumably decide that 16-bit-per-channel 8K (or whatever) is really enough. And at some later point, advances in storage technology will make it practical to work with that format uncompressed. And only then will lossy video compression start to disappear.
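Some back-of-the-envelope uncompressed rates for the formats mentioned. The frame sizes, bit depths and frame rates below are illustrative assumptions, not any particular broadcast standard:

```python
# Uncompressed data rate in Mb/s for a full-raster RGB signal.
def mbps(width, height, channels, bits, fps):
    return width * height * channels * bits * fps / 1e6

print(mbps(720, 480, 3, 8, 30))      # SD-ish: ~249 Mb/s, trivial today
print(mbps(1920, 1080, 3, 10, 24))   # HD: ~1493 Mb/s, still annoying
print(mbps(7680, 4320, 3, 16, 24))   # 16-bit 8K: ~38,000 Mb/s (~38 Gb/s)
```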


Looks like you are stuck with it until someone can fit more info into smaller spaces. Why not vote for "fit the most important info into the smallest space" in the meantime?

 

Jim

 

That is a fair assertion. At the same time, "uncompressed systems" (like IMAX) continue to exist, and I will continue to use them, and patronize them.

 

I wouldn't touch a JPEG with a 10-foot pole.


As long as storage and bandwidth cost money, we'll have compression. The price/performance points will shift around, but the fact remains that we already have compression systems that work to the satisfaction of the overwhelming majority.

 

Sometimes ignorance really is bliss, isn't it? I honestly want to know nothing about compression. It is depressing, and I avoid it whenever at all possible.


Yup. My point is that someplace between 4K and 32K, scanning film with more K stops making sense. You're not seeing the picture better, you're just seeing the film grain better. We'll find that point when we pass it. ;-)

 

There are very, very few legitimate people who would claim that 35mm needs to be mastered at anything above 4K. It's just a waste. Now, one could argue that the film should be scanned at 6K and finished at 4K. That makes some sense, depending on the particular project. But IMO, 4K is going to be the gold standard for film and digital for the decade ahead.

 

And the ascendance of 4K is coming sooner than most realize. Only a couple of days ago I was here on the General forum saying that 4K was coming soon, and I was mocked by Walter Gaff and others as someone who promotes "science fiction." Then, only one day later, REDray 4K at 10 Mbits/sec blew everyone away in Vegas and put 4K very close to consumer distribution. I wonder if Walter and those guys on that thread still think I'm talking "science fiction"? ;) :lol:


I have no particular knowledge of exactly what Red's intentions are, but I'd be surprised if they were planning to turn RedRay into a mass-market consumer device, which among other things would require an interactive layer, elaborate copy protection, and lots of dealmaking with studios. Oh, and a bunch of 4K TVs, to really take advantage of it. I can see 4K computer displays happening in the not-too-distant future, mostly because the extra resolution would be really nice for text rendering. But there isn't all that much of a use case for 4K TVs; most people sit too far away to even really benefit fully from 1080p.

 

Rather, I'd expect that RedRay will mostly be a production tool (if things haven't changed, it can also play back Redcode Raw from CF cards and hard drives, making it quite useful on set, at the production office, in the DP's living room... heh) and an exhibition device of some interest to the festival circuit and art-house theaters (being much cheaper and more accessible than DCI-spec hardware, software and workflows).


I finally figured it out. Not sure why it took me so long... Cliff Clavin. :-)

Jim

Yes, but Cliff Clavin is only a character in a TV show; he's not a real person.

Are you suggesting I am not a real person?

 

Could be. After all, I'm pretty convinced that you're just a figment of some advertising copywriter's imagination. Your alleged life story is almost as strange as mine, and it went off in a bizarre new direction at almost the same time as mine. Maybe an alien spaceship disguised as a meteor crash-landed near Masterton, New Zealand thirty years ago. Or am I thinking of Green Lantern...?

Green, red, what's the difference?

 

Or maybe I'm just like the guy in "Life on Mars", and you're just something I dreamed up to annoy Panavision (which you've done very well, incidentally).

 

But here's the really strange thing: until now I'd never heard of Cliff Clavin, in fact I've never watched a single episode of "Cheers". I had to look it up on the Wikipedia.

Hey! That's probably why it took you so long!

 

We will control the Horizontal!

We will control the Vertical!

We will control the market for fashion sunglasses and handgrips.

 

Nintendo Wii?

 

Nurse! Where's my damned meds?!

Waddya mean this is Burger King?!

 

Waugghhh! Waugghhh!! The world is getting complex again!

 

BTW, CAN RedRay be used with other video sources?


There are very, very few legitimate people who would claim that 35mm needs to be mastered at anything above 4K. It's just a waste.

 

Umm, no, it wouldn't. To take advantage of, say, 3.2K of resolution, a 6.4K scan would be necessary. 6K is actually slightly too little if you are scanning a 4-perf OCN.

 

Then there's VistaVision, 5-perf 65mm and 15-perf IMAX...


Yes, but Cliff Clavin is only a character in a TV show; he's not a real person.

Are you suggesting I am not a real person?

 

 

Well, using a digital photo that isn't so obviously a face Photoshopped onto a scalp would help your credibility immensely, for starters.

 

:blink:


Well, using a digital photo that isn't so obviously a face Photoshopped onto a scalp would help your credibility immensely, for starters.

 

:blink:

 

:blink: yourself.

 

Which photo are you talking about? The 64 x 64 avatar, or the ones from the animated GIF in my profile?

I can give you any one of them with the full 3.2 megapixel original resolution.

Or would you like a quicktime of one of my commercials?

References from some of my former employers perhaps?

 

Sorry if I came across as disrespectful before, but be reasonable, if I was respectful to you I'd have to be respectful to everybody.

 

As the old REM song goes:

 

"I don't want to disappoint you,

But I'm not ... here to anoint you."

 

See, unlike you, I can produce a certificate that proves I'm sane :lol:

 

It's like when my therapist told me to go out and make three new friends, saying I would be amazed at the results.

Well guess what? I'm not amazed, and what the hell am I going to do with these damned friends?

 

Hey, (bonk,bonk) is this thing on...?


Yeah, your Avatar is obviously fake at 64x64. No need to send me a bigger file that looks more fake.

 

You can treat me any way you want, but you've lost my respect with this latest tirade.


Yeah, your Avatar is obviously fake at 64x64. No need to send me a bigger file that looks more fake.

When you reached that conclusion, were you whipping off your sunglasses to the sound of an old song by The Who?

 

You can treat me any way you want, but you've lost my respect with this latest tirade.

I had your respect?!

NOW you tell me!!

Sheesh....


Yup. My point is that someplace between 4K and 32K, scanning film with more K stops making sense. You're not seeing the picture better, you're just seeing the film grain better. We'll find that point when we pass it. ;-)

-- J.S.

Ye-e-e-s.... My point was that 4K scanning (and certainly 2K) does not extract anywhere near the theoretical maximum resolution actually stored in a 35mm negative. With an extremely good, well-focussed lens, 50 ASA film can record in excess of 160 lines per millimeter, which is about 4,000 lines across the active width of a Super-35 frame. The lines are clearly visible under a microscope, but that doesn't necessarily mean current scanning or post-production technologies can actually make use of that resolution, nor is there currently any particular use for it even if they could.

 

The other fact willfully ignored (or, more likely, simply not understood) by many is that to accurately scan 4,000 physical lines you need a minimum of 8,000 photosites (and preferably 10,000, which is available from some Imagica scanners but rarely, if ever, used). On that basis, 10K is all that would ever be required.
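The Nyquist arithmetic behind those numbers, using the figures quoted above (160 lines/mm; roughly 24.9 mm of Super-35 active width is my assumption for the gate):

```python
# Resolvable film lines across the gate, and the minimum photosite count
# needed to sample them (Nyquist: at least two samples per line).

lines_per_mm = 160      # resolving power claimed above for slow, fine-grain stock
gate_width_mm = 24.9    # approximate Super-35 active width (assumed figure)

film_lines = lines_per_mm * gate_width_mm   # ~3984, i.e. the "4,000 lines"
min_photosites = 2 * film_lines             # ~7968, i.e. the "8,000 photosites"

print(round(film_lines), round(min_photosites))
```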

 

Also, film "grains" are all basically the same density, it's the grain structure that produces the variations in brightness that make up the image, just like a "half-tone" newspaper photo. On that basis, recreating the actual grain structure captures more of the tonal range, so "just seeing the film grain better" is not the whole story.

 

But yes, as long as the vast majority of movies are still projected from a third- or fourth-generation film print, none of this is going to be of any relevance. Then again, we are talking about a 19th-century technology....


The other fact willfully ignored (or, more likely, simply not understood) by many is that to accurately scan 4,000 physical lines you need a minimum of 8,000 photosites (and preferably 10,000, which is available from some Imagica scanners but rarely, if ever, used). On that basis, 10K is all that would ever be required.

 

 

This is nonsense. You're not shooting a chart on a tripod in a lab; you're shooting wide open, with 500T, out here in the real world.

 

I would say 6.4K (6K) is more than enough for anything ever shot on 4-perf. 4K gets pretty damned close, to my humble set of eyes, to the look of a real optically printed film.

 

2K is sh*&. The bare minimum for scans really ought to be 3.2K, not 2K. 2K is great for TV, and not for anything else really.


This topic is now closed to further replies.
