
How does the camera compare to Red?


Adam Smith


Guest Glen Alexander
Anyone else notice the "Appeal to Authority" logical fallacy here? I detect a Linux fanboy...it's no use reasoning with fanboys of any sort.

 

Notice the ignorance of people who can't or don't do anything themselves, but only doubt the people who have.

 

There is a difference between wannabes like you and people who have done the work and are professionals.

 

Watch out before you start slinging words like 'fanboy' around. You 'sound' like someone who probably couldn't detect their own a_s from a hole in the ground.

 

How does your detector work with your knuckles dragging the ground anyway?


But why not a 35mm-sized SI-2K? Why didn't you build one in the first place?

 

Hi Evangelos,

 

The imaging specs on the Altasens sensor are very good, and it does produce some stunningly good image quality with great dynamic range, color fidelity, and low noise. As you've noted, the 35mm DOF issue is one downside of using 2/3" sensors, but compared to anything else on the market, the Altasens sensors are far and away the best CMOS sensors that money can currently buy for digital cinema production without starting from complete scratch, with all the uncertainty and cost of a new custom sensor design. Obviously, as demonstrated by Arri, RED, and others, 35mm-sized sensors can be made, but those are all custom designs. With the SI-2K that was not an option for us.
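To put the DOF point in rough numbers (the sensor widths below are approximate, generic figures, not official SI specs): a 16:9 region of a 2/3" sensor is about 9.6 mm wide versus roughly 24.9 mm for Super 35 full aperture, which works out to a crop factor of about 2.6x. A quick sketch:

```python
# Back-of-the-envelope DOF comparison between a 2/3" sensor and Super 35.
# Sensor widths are approximate, generic figures -- assumptions, not SI specs.
super35_width_mm = 24.9        # Super 35 full-aperture width (approx.)
two_thirds_width_mm = 9.6      # 16:9 active width of a 2/3" sensor (approx.)

crop = super35_width_mm / two_thirds_width_mm
print(f"crop factor: {crop:.2f}x")                       # ~2.59x

# To match the depth of field of an f/2.8 lens on Super 35, the 2/3" sensor
# needs an aperture wider by roughly the crop factor:
print(f"DOF-equivalent aperture: f/{2.8 / crop:.2f}")    # ~f/1.08
```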

 

We actually do have some 35mm-sized sensors in camera heads we have built, but they are not digital-cinema quality.

 

Thanks,

 

Jason


With most Linux/BSD distros you can obtain source code and recompile.

 

The source code for the drivers we're using is not available. Also, since the drivers were designed for Windows, simply having the source code would not make it as easy as a recompile.

 

We are not using Linux simply due to the lack of drivers for our required hardware. And this is not a "simple" problem. Even for video drivers, Nvidia's Detonator drivers are reportedly 15 million lines of code . . . we're talking about some very complex software here, which large companies spend lots of money making sure is stable on a specific OS platform.

 

Having to deal with bad drivers is much worse than the fairly petty problems we've encountered with threading on the Windows kernel. For instance, Linux can have kernel panics, which are the same thing as a Windows blue-screen . . . since we don't have blue-screen issues, I don't see why we would want to trade away that underlying OS/hardware stability for the perception of a "stable" platform in Linux that, due to bad or buggy drivers, would actually be more prone to crashing at the kernel level.

 

We haven't counted out Linux, and if that platform gains the hardware support we need in the future, then we will definitely entertain development for that platform. But for right now, it's not feasible.

 

Thanks,

 

Jason



Out of curiosity, what hardware?



 

First, the gigabit ethernet interface: we use a proprietary UDP/IP protocol to make sure that you can get 100 MB/s across the line with no drop-outs, and with DMA (so very low CPU usage).
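As a quick sanity check on that figure (my arithmetic, not SI's): gigabit ethernet signals at 1 Gbit/s, i.e. 125 MB/s, so a sustained 100 MB/s payload means running the link at roughly 80% utilization after protocol overhead.

```python
# Link budget for a 100 MB/s camera stream over gigabit ethernet.
line_rate_bps = 1_000_000_000                     # raw gigabit signalling rate
line_rate_MBps = line_rate_bps / 8 / 1_000_000    # = 125.0 MB/s
payload_MBps = 100                                # sustained stream from the post
print(f"link utilization: {payload_MBps / line_rate_MBps:.0%}")  # -> 80%
```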

 

Secondly, the Intel GMA950 doesn't support the ARB_Fragment shader extensions for OpenGL that we need, so we have to use DirectX. A dedicated GPU (e.g., Nvidia) in the SI-2K would consume way too much power.
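To make the constraint concrete, here is a minimal sketch (illustration only, not SI's code) of how an application can probe the installed OpenGL driver for the fragment-program extension before committing to a rendering path. It assumes the Python glfw and PyOpenGL packages:

```python
# Probe the OpenGL driver for ARB fragment-program support.
# Hypothetical helper for illustration; assumes glfw + PyOpenGL are installed.
import glfw
from OpenGL.GL import glGetString, GL_EXTENSIONS

def has_fragment_program_support() -> bool:
    if not glfw.init():
        raise RuntimeError("could not initialize GLFW")
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)       # invisible probe window
    window = glfw.create_window(64, 64, "probe", None, None)
    if window is None:
        glfw.terminate()
        raise RuntimeError("could not create an OpenGL context")
    glfw.make_context_current(window)                # extensions need a live context
    extensions = glGetString(GL_EXTENSIONS).decode().split()
    glfw.terminate()
    return "GL_ARB_fragment_program" in extensions

if __name__ == "__main__":
    # On chipsets whose drivers lack the extension (as described above for the
    # GMA950 era), this returns False and the app must pick another API path.
    print("ARB_fragment_program:", has_fragment_program_support())
```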

 

Thanks,

 

Jason


First, the gigabit ethernet interface: we use a proprietary UDP/IP protocol . . .

Not to split hairs, but that's not hardware drivers, just software.

Secondly, the Intel GMA950 doesn't support the ARB_Fragment shader extensions for OpenGL . . .

 


???

 

You are making no sense here. Linux has offered such extensions for years. I can grab a Linux 3D expert for you if you'd like.



OK, I double-checked. For the dedicated bandwidth, that's a configuration issue for Linux, and using one of the several RT kernels is more than possible. As for ARB_Fragment, support for the GMA950 is in late beta, according to my friend. So you are correct about stability.

 

However, Linux is not the only game in town. Have you ever looked into Solaris, which does support both the ARB_Fragment extensions and guaranteed-bandwidth needs in an embedded environment?


Hi Nate,

 

Sorry if this is confusing, but we're not using the TCP/IP stack of Windows, which means we also wouldn't be using the TCP/IP stack in Linux . . . our cameras involve custom hardware that uses a standard gigabit ethernet signal pathway, but rather than TCP/IP, they run a custom protocol via UDP/IP on a custom transport-layer stack to transmit the signal, with DMA support for less than 1% CPU usage.
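For the curious, here is a minimal, hypothetical sketch of what the receive side of a UDP-based video transport looks like at the application level. The port, packet size, and header layout below are invented for illustration; the actual SI protocol, framing, and DMA path are proprietary and live in a driver, not in Python:

```python
# Hypothetical receive loop for a UDP-based video stream (illustration only).
import socket
import struct

PORT = 5000                      # assumed port, not the camera's
PAYLOAD_SIZE = 8192              # assumed datagram payload size
HEADER = struct.Struct("!IHH")   # assumed header: frame no., packet no., flags

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# A large kernel receive buffer keeps brief scheduling hiccups from dropping
# packets; a real driver would instead DMA packets straight into app memory.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 16 * 1024 * 1024)
sock.bind(("0.0.0.0", PORT))

frames: dict[int, dict[int, bytes]] = {}
while True:
    data, _addr = sock.recvfrom(HEADER.size + PAYLOAD_SIZE)
    frame_no, packet_no, _flags = HEADER.unpack_from(data)
    frames.setdefault(frame_no, {})[packet_no] = data[HEADER.size:]
    # A real implementation would reassemble complete frames here and hand
    # them off; UDP's lack of retransmission is what keeps latency bounded.
```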

 

As for the ARB_Fragment support: yes, Intel has been increasing their Linux support lately, and by extension the OpenGL support in their display drivers, but 2-3 years ago when we started (and back then we had the GMA900), none of this support existed . . . and even now the latest stuff is in "beta" . . .

 

Suffice it to say the Windows kernel is giving us very good stability, and we have the software flexibility to add features and support for some very innovative developments. For instance, is there another camera system on the market that provides full 64-point internal custom user-definable 3D LUT support and film-stock emulation? For 3D shooting, is there an integrated camera system with the capability we previewed at NAB this year to take two streams and combine them into a single live 3D stereo image? And how about a camera system that natively shoots to QT and AVI and can be natively ingested (i.e. no proxies or conversions) into two of the most popular NLEs on the market, as well as a number of other media-related apps? Add on top of that only 3.5:1 compression and 4:4:4 decoding, basically the equivalent of an HDCAM-SR deck.
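For readers unfamiliar with 3D LUTs: the camera stores a small lattice of output colors indexed by input RGB and interpolates between lattice points for every pixel. Here is a toy sketch of the general technique using trilinear interpolation (the lattice size is arbitrary here; this is not SI's implementation):

```python
# Toy 3D LUT with trilinear interpolation (illustration of the technique only).
import numpy as np

N = 9  # hypothetical lattice size per axis, chosen small for illustration
grid = np.linspace(0.0, 1.0, N)
# Identity LUT: lut[r, g, b] = (grid[r], grid[g], grid[b])
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_lut(rgb, lut):
    """Map one RGB triple in [0,1] through a 3D LUT with trilinear interpolation."""
    n = lut.shape[0]
    idx = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = idx - lo                        # fractional position inside the cell
    out = np.zeros(3)
    for dr in (0, 1):                   # blend the 8 surrounding lattice points
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                out += w * lut[(lo[0], hi[0])[dr],
                               (lo[1], hi[1])[dg],
                               (lo[2], hi[2])[db]]
    return out

print(apply_lut((0.2, 0.5, 0.9), lut))  # identity LUT returns ~(0.2, 0.5, 0.9)
```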

 

There are a lot of great development platforms out there . . . hopefully people can look at the features we're able to offer and see the benefit of the development platform we've chosen.



I've used several kernels over the years for similar projects, and Windows could give us stability only when there were no other factors, in a completely enclosed environment. Otherwise I predominantly worked with Linux or QNX's kernel. Admittedly, we were working on some unusual solutions, and Windows just could not handle the distribution system at all reliably. This was back in '98-'99, but to my knowledge the NT kernel has not changed much since then.

 

I avoid Windows development due to the major headaches it gives me, especially the EULA, under which they can pull your license at the drop of a hat. I, for one, do not like letting another company have that level of control over a product. I have seen companies use that to kill off competition to their "prized" partners before, including with my last computer venture. However, my past is not your past, so I wish you luck and shall keep an eye on things to see how they progress.


  • 7 months later...
Obviously, as demonstrated by Arri, RED, and others, 35mm-sized sensors can be made, but those are all custom designs. With the SI-2K that was not an option for us . . .

Doesn't Dalsa sell their chips?

