I like the concept, not sure what the R1 FANBOYS will say


Stephen Williams


I'm disappointed.

All I really wanted was a 2K+ S35-sized sensor in a cheap, small package (something like the Canon 5D Mk II, but recording REDCODE RAW), and there isn't anything like that in this release.

To get S35 in the Scarlet you have to go to 5K, which is overkill (for me, anyway) and twice the price of the original Scarlet concept (without any of the required accessories). Still half the price of a Red One, though, which is amazing.

 

With a "2K" sensor you will only be getting about 720p resolution, especially in low light. You need 3K or 4K to be able to downsample to 2K or 1080p and be left with a pristine image. Scarlet FF35's 6K will result in beautiful "True 4K" images. Even the S35 Scarlet is 5K, enough to produce good-looking 4K results.


Even the S35 Scarlet is 5K, enough to produce good-looking 4K results.

 

Hi Tom,

 

The old Red One produces pretty good images at 3K or 4K; only DR is an issue, IMHO. The 3K Scarlet should be adequate for many applications, and the S35 Scarlet should be very good indeed. I don't know why people are getting so hung up about resolution.

 

Stephen


Hi Tom,

 

The old Red One produces pretty good images at 3K or 4K; only DR is an issue, IMHO. The 3K Scarlet should be adequate for many applications, and the S35 Scarlet should be very good indeed. I don't know why people are getting so hung up about resolution.

 

Stephen

 

Stephen, I tend to believe that 4K is going to be the new 1080p, and sooner than many suspect. 4K Bayer-pattern cameras do not hold up at 4K, pixel for pixel. Some claim that a 20% spatial downsampling is needed for pristine Bayer images. Others put the amount of downsampling needed much higher. If you are shooting in low light, the need for downsampling is increased even further. So a 5K camera with a large sensor like the S35 Scarlet should create a nice "True 4K" final product. The 6K FF35mm camera, an even more beautiful 4K.

I think that a clean 4K will essentially "future proof" any project you can shoot over the coming decade(s). Once you move past 4K as a presentation format, you lose 90% or more of all the movies and TV shows and music videos that have been shot over the last 100 years, so it seems unlikely that consumers will be buying flat screen TVs or projectors with greater-than-4K resolution in the foreseeable future.

 

(Note: I do understand that recent 35mm features, anamorphic pictures, and older VistaVision pictures could scan out at 6K, for example, but that is a small number of pictures and not enough to justify a jump to 6K LCD TVs, when those TVs would have to upsample the vast majority of films and TV shows, some of which will barely even hold up at 4K, IMHO.)

 

But yeah, at some point, wrangling and managing all that data becomes a serious pain in the ass for the shooter. I think the S35mm and FF35 Scarlets are at kind of a technological sweet spot. By the time they come out, the computers and storage devices needed to handle the data will be easily available.
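
For a rough sense of the data load, here's a back-of-the-envelope Python sketch. The 16:9 frame sizes, 12-bit depth, and 10:1 compression ratio are all assumptions for illustration, not actual REDCODE data rates:

```python
# Back-of-the-envelope storage estimate for compressed raw footage.
# Frame sizes, bit depth and compression ratio are illustrative assumptions.

def gb_per_hour(width, height, bit_depth=12, fps=24, compression=10.0):
    # Raw Bayer data: one sample per photosite, before debayering.
    bits_per_frame = width * height * bit_depth
    bytes_per_hour = bits_per_frame / 8 * fps * 3600 / compression
    return bytes_per_hour / 1e9

for label, w, h in [("4K", 4096, 2304), ("5K", 5120, 2880), ("6K", 6144, 3456)]:
    print(f"{label}: ~{gb_per_hour(w, h):.0f} GB per hour of footage")

# ~122 GB/hr at 4K, ~191 at 5K, ~275 at 6K -- a lot to wrangle, but well
# within what desktop RAID arrays can handle.
```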


Hey Tom,

 

I agree with you that perhaps 4K will eventually become THE mastering format standard, but I have a hard time believing people will be going out to buy 4K televisions and projectors when it seems most people don't even care about HD!

 

I could post lots of links to articles that discuss this, but I'm sure you've seen them yourself. A lot of people thought that once the HD DVD/Blu-ray battle ended, everyone would rush out to get HD televisions and Blu-ray players, but they haven't. And with the economy the way it is, I don't believe anyone will be buying this stuff even for the holidays.

 

Regardless of the economy, most people don't see the difference, or at least don't think the difference in picture quality justifies the cost. Yes, costs are going down, but do you really think people are going to replace their entire DVD collections, and spend all that money all over again, just for the boost in clarity?

 

I'm getting a little off topic. Sorry!

 

 

Jay


Jay, I understand all the reasons you listed for why it might take a long time for 4K to become the new "gold standard" for presentation, but even keeping those factors in mind, I think the shift to 4K will happen faster than most suspect. Video games alone are starting to drive the manufacture and sale of LCDs with higher and higher resolutions, beyond 2K.

 

The audio/videophiles who spend tens of thousands on their home theaters are constantly looking for the latest and greatest, and they're willing to spend whatever it takes. And there are millions of them around the world, even in places like China now.

 

As for HDTVs, I can tell you this: people are still buying them like hotcakes in the United States. It's one of the biggest items people here desire.


As for HDTVs, I can tell you this: people are still buying them like hotcakes in the United States. It's one of the biggest items people here desire.

Yes, but what are they really buying?

 

HDTV sets, or huge-screen flat-panel displays that can be mounted on a wall?

The price is currently in free fall. Just last Friday I was looking at a new 42-inch Sanyo 1920 x 1080 LCD with an analog/digital tuner, which will retail here for $A1499, an insane price compared with just a couple of years ago.

 

More and more manufacturers are offering full HD 1920 x 1080 models, but there are still plenty of "half HD" 1440 x 768 panels being sold, and (believe it or not) quite a lot of 820 x 480 plasma panels, at bargain-basement prices.

820 x 480 might sound hopelessly antiquated, but in most cases, if it has a digital tuner, it's still going to be a vast improvement over the CRT TV it replaces.

And it's still a flat panel that you can mount on the wall.

 

Now all we need is enough true HD programming to make it worthwhile :lol:

Edited by Keith Walters

Guest Glen Alexander
Now all we need is enough true HD programming to make it worthwhile :lol:

 

 

just turn on the footy.... Go the Crows!!


Stephen, I tend to believe that 4K is going to be the new 1080p, and sooner than many suspect. Once you move past 4K as a presentation format, you lose 90% or more of all the movies and TV shows and music videos that have been shot over the last 100 years, so it seems unlikely that consumers will be buying flat screen TVs or projectors with greater-than-4K resolution in the foreseeable future.

 

There really is no need for 4K home presentation. The home theater environment is too small for 4K to be of any real advantage; the farther you sit from the screen, the less you see even the advantage of 1080.
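
The viewing-distance point is easy to quantify. A rough sketch, assuming 20/20 acuity of about one arcminute per resolvable detail and a 16:9 screen (a simplified model, not a complete psychovisual one):

```python
import math

# Rough sketch of the viewing-distance argument, assuming 20/20 acuity
# (~1 arcminute per resolvable detail) and a 16:9 screen.

def max_useful_distance_ft(diagonal_in, horizontal_px):
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_pitch_in = width_in / horizontal_px
    one_arcmin = math.radians(1.0 / 60.0)
    return pixel_pitch_in / math.tan(one_arcmin) / 12.0

for px, label in [(1920, "1080p"), (3840, "4K")]:
    d = max_useful_distance_ft(50, px)
    print(f'50-inch {label} screen: individual pixels vanish beyond ~{d:.1f} ft')

# 1080p: ~6.5 ft, 4K: ~3.3 ft. Sit farther back than that and the extra
# resolution is invisible to a 20/20 eye, which is the point made above.
```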

 

Film doesn't have a fixed pixel grid, because film isn't digital, so it isn't limited to a fixed resolution like 4K. Other factors determine the level of usable resolution that can be scanned from a film negative, primarily film speed and exposure density. Prior to the 1980s, film speeds didn't go above 100 ASA. It's likely that such fine-grained films would scan pretty well at high resolutions.

 

 

By the time they come out, the computers and storage devices needed to handle the data will be easily available.

 

What is going to change with computers and storage devices a year from now?


With a "2K" sensor you will only be getting about 720p resolution, especially in low light. You need 3K or 4K to be able to downsample to 2K or 1080p and be left with a pristine image. Scarlet FF35's 6K will result in beautiful "True 4K" images. Even the S35 Scarlet is 5K, enough to produce good-looking 4K results.

 

The numbers don't really work out the way you have presented them. How would a 2K sensor net you only 1280x720 resolution, when a 1920x1080 sensor will give you 1920x1080 resolution?


The numbers don't really work out the way you have presented them. How would a 2K sensor net you only 1280x720 resolution, when a 1920x1080 sensor will give you 1920x1080 resolution?

 

Hey Tenolian,

 

It's because the Red uses a Bayer-filter CMOS sensor. The video has to be debayered, which causes some of the resolution to be lost, so 2K ends up being less than 2K. In order to get a solid 2K, you shoot at something around 3K and bring it back down to 2K in post.
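
To make the debayering loss concrete, here's a toy NumPy sketch of bilinear demosaicing, the simplest method (real cameras use smarter algorithms, so this overstates the softening a bit):

```python
import numpy as np
from scipy.signal import convolve2d

# Toy illustration of debayering: each photosite records only one of
# R, G, or B, and the two missing channels at every site are interpolated
# from neighbours. Bilinear interpolation (shown here) is the simplest
# demosaic; some softening is inherent, which is why 2K RAW ends up
# looking like less than 2K.

H, W = 8, 8
scene = np.random.rand(H, W, 3)  # stand-in for a real image

# RGGB Bayer masks: each photosite keeps exactly one channel.
r_mask = np.zeros((H, W)); r_mask[0::2, 0::2] = 1
b_mask = np.zeros((H, W)); b_mask[1::2, 1::2] = 1
g_mask = 1 - r_mask - b_mask

mosaic = (scene[..., 0] * r_mask + scene[..., 1] * g_mask
          + scene[..., 2] * b_mask)

# Classic bilinear demosaic kernels.
k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

def interpolate(mask, kernel):
    return convolve2d(mosaic * mask, kernel, mode="same")

demosaiced = np.dstack([interpolate(r_mask, k_rb),
                        interpolate(g_mask, k_g),
                        interpolate(b_mask, k_rb)])

print("real samples per channel: R %d%%, G %d%%, B %d%%"
      % (100 * r_mask.mean(), 100 * g_mask.mean(), 100 * b_mask.mean()))
# -> R 25%, G 50%, B 25%: everything else is an interpolated guess.
```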

 

I'm really tired of all these K's. :P

 

 

Jay

Edited by Jay Taylor

The numbers don't really work out the way you have presented them. How would a 2K sensor net you only 1280x720 resolution, when a 1920x1080 sensor will give you 1920x1080 resolution?

 

Because it depends on how that 2K or 1080p is achieved.

 

The Panavision/Sony Genesis, for example, uses a cinema 35mm 12.4-megapixel CCD chip (5760x2160) with some kind of proprietary RGB color-striping system, which results in a "True 1080p."

 

CMOS Bayer-pattern sensors (like Red One's), on the other hand, only have 70% to 80% spatial efficiency. So in order to get a "True 2K" you need roughly a 3K Bayer-pattern CMOS sensor. In other words, you have to downsample the image by 20% to 30% to get a pristine image. People who shoot on DSLRs are familiar with this.
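
Running that arithmetic in reverse (a quick sketch, treating the 70-80% efficiency figure above as the assumption):

```python
# The efficiency argument, run in reverse: how wide a Bayer sensor do you
# need for a given "true" delivered resolution? The 70-80% spatial
# efficiency range is the assumed figure quoted above.

def required_sensor_width(target_px, efficiency):
    return round(target_px / efficiency)

for target, label in [(2048, "True 2K"), (4096, "True 4K")]:
    best = required_sensor_width(target, 0.8)
    worst = required_sensor_width(target, 0.7)
    print(f"{label}: need a sensor roughly {best}-{worst} px wide")

# True 2K -> ~2560-2926 px (i.e. roughly a 3K sensor); True 4K ->
# ~5120-5851 px, which is the 5K/6K Scarlet territory discussed earlier.
```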


Yes, I fully know the difference between CCD and Bayer-filter sensors. My point is that it does not work out the way you described; 2K does not become 1280x720 because of Bayer filters.

 

Resolutions (720, 1080) are simply a fixed pixel grid. You can place any video source you choose into the grid. The question comes down to whether that video source has enough real information to adequately fill the grid without exhibiting digital artifacts.

 

 



Yes, I fully know the difference between CCD and Bayer-filter sensors. My point is that it does not work out the way you described; 2K does not become 1280x720 because of Bayer filters.

 

Resolutions (720, 1080) are simply a fixed pixel grid. You can place any video source you choose into the grid. The question comes down to whether that video source has enough real information to adequately fill the grid without exhibiting digital artifacts.

 

We're just talking about the difference between pixel dimensions and measurable resolution. If you want HD out of a single-sensor Bayer-filtered camera, it helps to start out with more pixels on the sensor than you need in the final HD recording. So 3K RAW Bayer is good for getting 1080p, whereas it gets harder to get good 1080p from only 2K RAW Bayer.

 

Obviously, though, there are other factors at work, and some single-sensor cameras that start out with only 1920 x 1080 pixels still produce good HD pictures; they just have to perform more tricks to improve sharpness.


Yes, I agree that the more information you start out with, the better the downsampled 1080 image in the end. But there are compromises.

 

Advantages and compromises can be shifted in differing ways. A 12MP sensor outputting 1920x1080 with little to no compression could produce an image visually comparable to an 8-10MP sensor recording 4K with significant compression.
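
To put rough numbers on that trade-off (bit depth, frame rate, and compression ratio here are picked purely for illustration, counting one 12-bit sample per pixel for simplicity):

```python
# Rough data-rate comparison of the two hypothetical pipelines above.
# Bit depth, frame rate, and compression ratios are illustrative
# assumptions, not any camera's published specs.

def mb_per_sec(pixels, bit_depth=12, fps=24, compression=1.0):
    return pixels * bit_depth / 8 * fps / compression / 1e6

downsampled_1080 = mb_per_sec(1920 * 1080)               # uncompressed
compressed_4k = mb_per_sec(int(8.8e6), compression=8)    # ~8-10MP, 8:1

print(f"12MP sensor down to 1080p, uncompressed: ~{downsampled_1080:.0f} MB/s")
print(f"8-10MP sensor at 4K, 8:1 compressed:     ~{compressed_4k:.0f} MB/s")
# ~75 MB/s vs ~40 MB/s: a comparable ballpark, but the bits are spent in
# very different places -- fidelity per pixel vs more pixels.
```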

 

 



Pardon? So...what is the difference then?

 

Since this has been answered, I'm not entirely sure why it has to be answered again. But I'll give a real-world working example.

 

The Arri D21 uses a 2880 x 2160 Bayer CMOS sensor that outputs 1920x1080. It was recently used to shoot "The Bank Job" and "RocknRolla," which both looked pretty good.



This might pre-answer some questions:

 

http://en.wikipedia.org/wiki/Image_sensor

 

Don't you hate posters that put in Wikipedia links?

 

To me, it makes sense to run a 3-chip CMOS. I'm not keen on Bayer; it works well enough, but the active, lower-bit processing of a monochrome chip x 3, out to 3 synced UDMA mini drives, would be pretty useful and fast enough to maintain high and true pixel resolutions. But, what do I know? People from Mississippi are good for little more than eating fried foods and mating with their siblings.

