Everything posted by Jason Rodriguez

  1. Yes, we're seeing the same thing too . . . modern S16 zooms and good primes (like the Superspeeds or the Optars) definitely resolve more than enough detail.
  2. Hi Landon, The basic premise behind CineForm editing is that "offline" is a thing of the past. What basically happens is that we have the RAW file encapsulated in the codec, which runs at around 12-13MB/s for 2K RAW. This is roughly a 5:1 compression ratio. On a fast Pentium 955EE or Core 2 Duo machine, the codec is able to decode and render the RAW data to 4:4:4 in real-time. Since the data is decompressed into memory in real-time, it can be moved and manipulated very quickly, with multi-stream effects, etc.
So you're not pulling uncompressed 4:4:4 RGB from the hard-drive; instead you are decompressing the RAW file in real-time into memory. Once the data is in memory it can be accessed very quickly, and quite a bit can be done with it, considering that the memory bandwidth on a 925 chipset like the Pentium 955EE uses is around 10GB/s. So by combining high memory (not disk) bandwidth with processing power, the CineForm stream can be manipulated in real-time, and that means a less expensive computer system, since you're not paying for gobs of hard-drives and speed for uncompressed editing. For more information you should visit their website at www.cineform.com
But again, just remember that we have the compressed RAW file on disk at a low data-rate, it's decompressed in real-time to 4:4:4 RGB in your computer's memory subsystem, and the data is processed and manipulated in real-time from memory by your computer's fast processor and then sent to your screen or your video card (there's a rough data-rate sketch after this list). So hopefully that demonstrates how you can get such fast performance with 4:4:4 RGB editing without spending $100K on a computer system that needs hundreds of megabytes per second of disk bandwidth.
  3. You use the QuickTime version of CineForm RAW :)
  4. Zeiss Superspeeds and the newer Ultra Primes or Cooke S4-16's (Cooke has some 16mm models out now) are great, and can easily resolve HD resolutions.
  5. The Arri's actually a 3K camera when you work in the RAW data mode. I know it's not a full 4K, but it's also a lot more than 1920x1080 HD.
  6. 2K Digital Cinema Camera Streamlines Movie and HD production
Silicon Imaging and CineForm team to shoot, edit and encode using Intel® Core™ 2 Duo Processors
Hollywood, CA, November 1, 2006 - Silicon Imaging, with help from CineForm® and Intel Corporation, today unleashed a 2K Digital Cinema camera system set to revolutionize movie and HD production. The Silicon Imaging SI-2K will shoot footage directly to disk, at either 1920x1080P HD or 2048x1152 cinema resolutions, which can be instantly edited using CineForm's Prospect 2K™ real-time visualization and colorization software running on motherboards integrated into the camera or on laptop computers powered by Intel® Core™ 2 Duo processors. Afterwards finished projects can be directly encoded for theatrical distribution to digital cinemas or on-demand Internet download.
"The transition to digital cinematography started when George Lucas first shot 'Star Wars: Episode II' in HD," states Ari Presler, Silicon Imaging CEO. "Today, we have completely eliminated the need for film or tape and now offer the ability to shoot at even higher resolutions than HD, edit while each shot is taken, and have a production ready for distribution within weeks instead of months."
The Silicon Imaging SI-2K is the first 2/3" digital cinema camera with 10-bit CineForm RAW™ and 12-bit uncompressed 2K direct-to-disk recording. It offers unprecedented image quality, over 10 f-stop dynamic range, a touch-screen interface and IT-friendly connectivity. It is also the first to deliver in-context 3D color-corrected "look" visualization and a complete raw post-production workflow. Unlike modern HD cameras, which develop and compress colorized imagery inside the camera, the SI-2K streams images as uncompressed raw "digital negatives" over a standard gigabit Ethernet connection. An Intel Core 2 Duo processor-based laptop computer, located up to 300 feet away, processes the digital negatives, where they are non-destructively developed and colorized for preview, using the cinematographer's desired "look" for the scene. The digital negatives and "look" metadata are simultaneously recorded to a 2.5" notebook hard drive using the CineForm RAW™ visually perfect codec. Up to 4 hours of continuous footage are captured on a single 160GB notebook drive; this is the equivalent of 14 reels of 35mm film which has an associated cost exceeding $25,000 for materials and processing.
"The Silicon Imaging camera is truly amazing," states Cinematographer Geoff Boyle. "The SI-2K MINI remote camera head is small enough to be placed directly in a scene for point-of-view shots, used on robotic arms for model photography or incorporate two side-by-side for stereo 3-D. We used it on 'Mutant Chronicles', a Sci-Fi feature with over 1500 visual effect shots. We shot flames and explosions using the 72fps slow motion mode and the recordings display smooth tonal gradations and natural colors. Finally, the director can immediately see full-resolution instant replays, without having to wait for film dailies or color correction."
CineForm's Prospect 2K software, optionally bundled with the SI-2K, provides real-time, multi-stream editing of CineForm RAW files with look metadata using Adobe Production Studio, and supports both Windows AVI and Apple QuickTime formats. Further details on MacOS workflow including Final Cut Pro will be disclosed in the coming weeks.
"Together we are enabling a major paradigm shift in the entertainment industry for the way films are shot and edited," states David Taylor, CineForm CEO. "Our combined efforts allow CineForm RAW recording and Prospect 2K editing to retain the many benefits of shooting raw, including increased visual fidelity, while reducing costs during both production and post, and encouraging faster project completion."
"With the power of multi-core ready application running on dual core Intel Core 2 Duo processors, Silicon Imaging and CineForm have streamlined the digital high resolution movie-making workflow," said Elliot Garbus, general manager of the Global Developer Relations Division, Intel Corporation. "The Silicon Imaging and CineForm solution promises cost and time saving benefits for cinematographers and directors."
The SI-2K DVR with an embedded Intel Core 2 Duo processor and hot-swap drive cartridge system plus the removable MINI camera head is $20,000, or bundled with Prospect-2K for $22,000, with estimated shipment in January. The SI-2K MINI is $12,500 and shipping in December.
About Silicon Imaging: Silicon Imaging, Inc. is a leader in IT-centric high-definition digital camera and RAW workflow solutions. Silicon Imaging's products incorporate disruptive technologies to deliver a new generation of products outperforming traditional HD cameras while increasing flexibility and lowering cost. Please visit: www.siliconimaging.com.
About CineForm: CineForm, Inc., develops software products for use by media professionals in high-resolution acquisition and post-production applications. Powered by CineForm Intermediate, CineForm's acclaimed products provide unmatched visual quality and real-time performance, without the need for specialized hardware, while enabling an online digital intermediate workflow that runs on affordable desktop PCs. Please visit: www.CineForm.com.
  7. Hi Gavin, It will be platform agnostic, as the .look LUT file will actually be a "sidecar", sort of like XMPs are right now with Adobe's software, and Iridas software is supported on Mac, Linux, and Windows. While one could say there are some downsides to this method, there are also a number of advantages, including access to the LUT at any given point in time without having to be a programmer. Basically we're putting pointers in the AVI header for what the LUT file should be, but not actually packing the LUT into the AVI. The main reason for this is that we're using a 64-point cube, and as a result the file is too big to fit into the AVI or QT header . . . no other program would be able to read the file since it would be looking for video information and not getting it. The LUT itself is fairly straightforward . . . it's an XML file with some settings for the Iridas software and then a big 64-point cube. It's all human-readable, although the cube is stored as 4-byte IEEE floating-point words in hexadecimal (inside a tag in the XML, so you don't need a hex editor), but it's not hard to decipher (there's a rough decoding sketch after this list). Also there are some un-announced features in Speedgrade 2006 for supporting additional 3D LUT formats, so by no means are you "locked in" to the system.
BTW, Mike, there is one nice thing about the 3D LUTs for your "shoot it flat" approach: it's actually quite accommodating of that too. Since the 3D LUT also encompasses gamma corrections, you can make a log-like curve or whatever you desire as your transfer curve from the basic 12-to-10-bit LUT that we have in the camera. Also the LUT is non-destructive and is at floating-point precision, so there are no concerns about clipping over or under values unless you transcode to a different codec from the CineForm RAW . . . if that's the case, then we'll be adding the ability to "turn off" the LUT, or you can use a program to adjust the LUT for exactly the look you want before you "bake it in" during the codec transcode. (Obviously, since we're doing management at the codec level, you can't take the LUT with you when you go to another codec, although you can turn the LUT off, then import the new file into Speedgrade HD or DI and re-apply the .look file, or import the LUT into another application for re-application in that environment, say a Lustre, etc.)
  8. Actually no it's not :) Speedgrade On-Set will be packaged with the camera.
  9. Yes. This is more than just a fancy image in the EVF. We are applying color-correction to the actual camera as non-destructive metadata. That means you color-correct the camera, not through some confusing on-camera matrix (which then bakes the "look" into the camera), but through the very intuitive Speedgrade interface. Once the look is applied, that is the "look" of the camera . . . i.e., when you open up or play back a recorded file, the look you made in Speedgrade is the actual "look" of the file itself. So "looks" are handled at the codec level. The huge advantage to this approach, though, is that because look files are completely non-destructive, you can edit and replace the "look" for a file anywhere through the post-production chain. This is much more integrated than using Truelight throughout the post-production pipeline, although I guess that could be done. This is actually color-correcting the camera with non-destructive metadata that describes how files recorded off the camera will look, but can be adjusted at any point in the RAW workflow chain.
  10. Graeme, you're right, and I'm sorry . . . I should have thought about what I was posting before I did, and that wasn't very mature of me . . . and now for some reason I can't edit it out :(
  11. Hi Keith, Direct Mac support is coming via a QuickTime codec that CineForm will be releasing in the coming months . . . QT support will trail AVI support by a little bit, but not much. In the meantime you can always use After Effects on a Windows box to transfer the AVI over to a QT format for use on the Mac . . . with Prospect HD, you have full control over the RAW file including the 3D LUTs, so there's quite a bit you can do there before transcoding the RAW data into a normal QT codec for use on the Mac. (Tongue-in-cheek) Now I'm sure EVERYONE will be VERY excited about having to take our AVIs through After Effects to make color-corrections and then transfer to QTs, I mean that is the BEST workflow in the WORLD! (my jab of the day at the REDCINE workflow model :D )
BTW, just in case anyone is curious, the reason it takes time to get QT support is not because QT is so hard to support . . . it's not, really; it's making sure that when you import a QT file into an application, it's able to receive the greater-than-8-bits of data in our CineForm RAW files. Each program seems to handle this differently, and as a result you have to make sure everything works correctly, which unfortunately takes time when a program behaves in a way that you don't expect. Plus there's always the issue of Apple not being willing to support third-party codecs in Final Cut Pro . . . that's a major pain. To help with that, if you want to see full Final Cut Pro support (you can bring in the QTs and edit in FCP, but "full" means full parity with Prospect HD on the PC), tell Apple . . . we talk with Apple, other manufacturers apply pressure on Apple, but it seems to do no good. I think they need customers telling them "we want this" before they budge. So far all their "third-party support" codecs like DVCProHD and XDCAM are actually codecs they wrote themselves and licensed from Sony or Panasonic . . . but anyways, users bugging Apple is always a good thing.
  12. http://www.iridas.com/press/pr/20061025.html I'm excited, and really, this is big news because it means our camera is now a complete "blank slate" for you to make whatever color adjustments you want, load that into the camera, and have complete color management from the camera head through post. Nothing like this exists anywhere on the planet. There is no camera on the market that can take 3D LUTs and use them as the native colorimetry of the camera, shoot with that 3D LUT, and manage it as non-destructive metadata all the way through post. You're typically either constrained to the annoying confines of the on-board matrix and colorimetry of the camera, or you can use the 3D LUT technology in the Iridas toolset, but then you have to preview the look on a monitor that can load 3D LUTs, like an expensive Cinetal, or something of that nature.
Speedgrade On-Set has some amazing tools to create any color look you can imagine . . . virtual film-stocks, etc. If you don't like the way the camera looks, make it look like *anything* you want! Seriously! The reason I say this is that 3D LUTs map one set of RGB triplets to another set of RGB triplets (see the small 3D LUT sketch after this list). As a result, a single 3D LUT can model absolutely ANY non-linear color correction, be it gamma correction, curves, saturation, selective color corrections, etc. Also, our implementation of IRIDAS's 3D LUTs is completely non-destructive, meaning that if you clip the whites in the 3D LUT but there's still "over-white" information, you can simply dial that information back in during post correction in a 32-bit floating-point environment. We're going to be packaging a bunch of generic looks with the camera to do stuff like bleach bypasses, wide-dynamic-range looks, low-con looks, etc. There'll be more information coming out soon, but we just wanted you all to be aware of the exciting new developments happening :)
  13. Hello everyone, For our first "official" post here on this forum, we'd just like to send out an invite to anyone who's interested in a product demo/seminar Steve Nordhauser will be giving using one of our prototype cameras at the Abel Cine location in LA on Nov 3rd (the day after HD-Expo). If you're interested, please RSVP with Ari at ari@siliconimaging.com Also, we will be in the Adobe booth at HD-Expo, so if you're at the show, please come visit us! Thanks :)
  14. Whoa, didn't see this thread! Yes, Mitch is right, we have cameras out in the field right now as beta-testing units for early adopters. These are people who are willing to go through the beta-testing cycle with us (and its pains), and have presented comprehensive plans for what they will be shooting . . . i.e., Steve and Ari are approving beta-testing on a per-project basis with the hope that we learn a lot, fine-tune the workflow, and nix any bugs. These are not "general-use/sale" cameras going out. But those are coming soon . . . I can't say exactly what we will announce on Nov 2nd at HD-Expo, but please don't come with the expectation that you will be able to walk away with a camera :) We will be making announcements pertaining to shipment, though, so we're not going to be leaving anyone in the dark. And remember I said "announcements" . . . that means more than one! ;) We've been pretty quiet lately, but Nov. is shaping up to be a "louder" month. Again, sorry for the vagueness of my post; there's a whole lot I can't talk about just yet, but suffice it to say, we're listening and things are rolling along very nicely. The input of people like Mitch/Abel Cine and the other beta testers is invaluable, and we feel that the best way to get a working camera on the market is to field test it, get it in the hands of people who are ready with productions, and make sure that everything works the way it should.
  15. I think you need to make a distinction between dynamic range and actual S/N ratio. Is it 66dB of S/N? . . . if so, that's VERY high for an HD camera . . . most of the sensors on the market today advertising 66dB of dynamic range only have actual S/N ratios in the low 50s, which makes them somewhat noisy in comparison to a sensor with an actual S/N ratio of 66dB (there's a quick dB-to-stops conversion after this list).
  16. We're planning the first "see-it-all" at NAB. So right now the website is limited to that one teaser picture.
  17. Hi Everyone, Just wanted to give a quick heads-up on a new direct-to-disk digital cinema camera from Silicon Imaging in partnership with CineForm and Adobe. We'll be at NAB in the Adobe booth, #SL3732, doing demonstrations. This camera is NOT vaporware :) In a nutshell, it's a new single-sensor Altasens 3570 2/3" CMOS camera (12-bit A/D), has a pretty neat RAW workflow by recording to a new 10-bit RAW codec from CineForm, and records to anything you want that is USB 2.0 compatible. Of course there is on-board recording with a removable 2.5" HDD cartridge, and you can put any off-the-shelf notebook hard-drive in there that is at least 5400RPM. Compression of the RAW data is wavelet-based (like the other CineForm codecs), and is a mild 5:1 compression. You can mix and match the CineForm RAW data with any other CineForm CFHD files on a Premiere Pro timeline, all in real-time; the RAW data integration is seamless. You can see more at www.siliconimaging.com/DigitalCinema/ The press release is at www.siliconimaging.com/DigitalCinema/news.html, and should hit the news wires sometime this morning. Looking forward to seeing everyone at NAB :) Jason Rodriguez Silicon Imaging (my sig for today ;)
  18. Also add to that list the RaveHD: http://www.spectsoft.com/wiki/Website/HomePage
  19. Hi Mike, Does FCP uncompress the DVCProHD stream? If so, does that mean you're losing a generation? Thanks, Jason
  20. I think Dalsa is misleading people in the same manner that every other still-camera manufacturer is misleading their customers. Your points are well taken. And the images on the Dalsa site are half-resolution, so you can't see the "edge" artifacts you talk about. All I can say is this: almighty film, scanned at 4K, only resolves about 1,800 line pairs across the frame *after* enhancement, and about 1,530 before. This is from a test by Kodak themselves; you can read it at this site. So, if that "film" scan qualifies as "true" 4K resolution, and the Dalsa, which can also resolve a similar number of line pairs (according to their white paper), is not "4K", then I'm confused about what constitutes what, and why film should be considered "right" all the time when there are obviously some shortcomings. If Dalsa is a "liar", then so is Kodak IMHO. BTW, if the Dalsa were "only" 3K, that would still mean about 1,500 line pairs (at least in perfect theory), which is still comparable in resolution to a "4K" film scan. The biggest problem with the Dalsa is that there's no way to record their data in an efficient manner (and it's a huge camera). I think it's a waste of time to slam them as being unethical.
BTW, what lens did you use with that 20D? Your lens looks soft in general, and there's quite a bit of motion blur (or it may be lens blur if you're near the edge of the frame). I can attest that a good L-series lens is much better at resolving fine detail compared to the "normal" line of Canon lenses. That shot looks like it was taken with a 28-135 or one of the 24-85 or 28-105 series of "stock" lenses that come with the kits. If so, then again a lot of what you're seeing comes from the crappy nature and low contrast of those lenses rather than the resolution deficit you're talking about, although I'm not denying that it exists. I'm just saying that your specific example may have more to do with lenses and/or in-camera JPEG modes.
  21. I think this whole argument is a moot point when we're talking about the fact that most 3-chip sensor cameras are going through an inordinate amount of prefiltering and compression-to-tape; after all is said and done, the "4K" Dalsa is definitely on top. Go over to the Dalsa site and look at the images: guess what, they look good, better than any other digital camera I've seen. And furthermore, I don't see any Bayer artifacting. Also, these crappy resolution numbers that people keep quoting are complete garbage unless all you're doing is a very dumb 3x3 interpolation. There are a number of adaptive interpolators that get you far above 4:4:4 2K resolution (approaching the 3K mark for "true" resolution). The fact is, if you know the chromaticity spectral sensitivity of the sensor, then every pixel actually picks up a certain amount of light, and you can use every pixel for luminance calculations (weighting by whether they are green, red, or blue pixels, and again assuming overlap in the spectral sensitivities of the three channels) . . . there's a simple luma-from-Bayer sketch after this list. So that gives you a "4K" luma sensor if you intelligently and adaptively interpolate the raw Bayer data. Again, the images look good, so I think in the end this is all moot and a lot of hand-waving over nothing.
  22. That is, until you want to do something with it in post . . . :o
  23. It would be great to get some frame grabs when you get the time.
  24. After seeing the Arri D20 in person with their new footage on display in the booth at NAB, I too can vouch that the D20 is much better-looking than anything I've seen so far from the Drake. Of course, you now also have the option of mobility with the D20 via the Venom as well (from Thomson/GV).
  25. I love the Digiprimes, and have found them much sharper, especially on the wider focal lengths, than the Fujiprimes. At the longer focal lengths, the performance delta between the two diminishes quite a bit though, or when they are stopped down. But wide open or wide angle, the Digiprimes definitely have the advantage.
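
A rough back-of-the-envelope version of the data-rate argument in post 2 above. The raster size, bit depths, frame rate and exact 5:1 ratio are illustrative assumptions rather than official CineForm or Silicon Imaging figures.

```python
# Rough data-rate arithmetic for the CineForm RAW workflow described in post 2.
# All numbers are illustrative assumptions, not official specifications.

WIDTH, HEIGHT = 2048, 1152   # assumed 2K raster
FPS = 24                     # assumed frame rate
RAW_BITS = 10                # one Bayer value per photosite
RGB_BITS = 10                # per channel after decode to 4:4:4 RGB
COMPRESSION = 5              # assumed ~5:1 wavelet compression

raw_rate = WIDTH * HEIGHT * RAW_BITS / 8 * FPS / 1e6        # uncompressed RAW, MB/s
compressed_rate = raw_rate / COMPRESSION                    # what actually comes off disk
rgb_rate = WIDTH * HEIGHT * 3 * RGB_BITS / 8 * FPS / 1e6    # uncompressed 4:4:4 RGB, MB/s

print(f"uncompressed RAW   : {raw_rate:6.1f} MB/s")
print(f"~5:1 CineForm RAW  : {compressed_rate:6.1f} MB/s  (close to the quoted 12-13 MB/s)")
print(f"uncompressed 4:4:4 : {rgb_rate:6.1f} MB/s  (what an uncompressed workflow would pull from disk)")
```

In other words, the disk only has to sustain the ~14 MB/s compressed stream; the ~210 MB/s of 4:4:4 RGB only ever exists in memory after the real-time decode, which is a small fraction of the ~10 GB/s of memory bandwidth mentioned in the post.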
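For post 7, here is a minimal sketch of reading the kind of .look sidecar described there: plain XML carrying a block of 4-byte IEEE-754 floats written as hexadecimal words. The element names (`size`, `data`), the byte order and the triplet packing are all assumptions for illustration; the real Iridas schema would need to be checked against an actual file.

```python
# Hypothetical reader for a .look-style sidecar: XML carrying a 3D cube stored
# as hex-encoded 4-byte IEEE-754 floats.  The tag names ("size", "data") and
# the big-endian byte order are assumptions, not the real Iridas schema.
import struct
import xml.etree.ElementTree as ET

def read_look_cube(path):
    root = ET.parse(path).getroot()
    size = int(root.find("size").text)        # e.g. 64 points per axis (assumed tag)
    words = root.find("data").text.split()    # whitespace-separated 8-hex-digit words (assumed tag)
    floats = [struct.unpack(">f", bytes.fromhex(w))[0] for w in words]
    # Group into output RGB triplets: size**3 entries in total.
    cube = [tuple(floats[i:i + 3]) for i in range(0, len(floats), 3)]
    assert len(cube) == size ** 3, "expected size^3 RGB triplets"
    return size, cube
```

The point is only that the format is ordinary text, so the cube can be inspected, converted, or re-applied in another environment without special tools.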
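Post 12's central claim is that a 3D LUT is just a mapping from one RGB triplet to another, which is why a single cube can represent any non-linear grade. A toy nearest-neighbour lookup makes that concrete; real implementations interpolate (typically trilinearly), and the 64-points-per-axis size is taken from the posts while everything else is illustrative.

```python
# Minimal 3D-LUT application: map an input RGB triplet (0..1 range) to an output
# triplet by looking it up in a cube with `size` points per axis.
# Nearest-neighbour is used for brevity; real tools interpolate trilinearly.

def apply_lut(rgb, cube, size=64):
    r, g, b = (min(max(c, 0.0), 1.0) for c in rgb)          # clamp to the cube's domain
    ri, gi, bi = (round(c * (size - 1)) for c in (r, g, b)) # nearest grid point per axis
    # Assumed memory order: blue fastest, then green, then red (varies by format).
    return cube[(ri * size + gi) * size + bi]

# Identity cube: output equals input, so applying it changes nothing (up to grid spacing).
identity = [(r / 63, g / 63, b / 63)
            for r in range(64) for g in range(64) for b in range(64)]
print(apply_lut((0.25, 0.5, 0.75), identity))   # ~(0.254, 0.508, 0.746)
```

Note that this toy version clamps out-of-range values; the non-destructive behaviour described in the posts comes from keeping the LUT as metadata over floating-point data rather than baking a clamped result into the file.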
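To make the S/N point in post 15 concrete, here is the standard dB conversion spelled out; the 52 dB value is just a stand-in for "low 50s".

```python
# Convert a dB figure to a linear ratio and photographic stops.
# 20*log10(ratio) dB  <=>  ratio = 10**(dB/20); one stop is a factor of 2.
import math

def db_to_stops(db):
    ratio = 10 ** (db / 20)
    return ratio, math.log2(ratio)

for db in (66, 52):   # the 66 dB claim vs. an assumed "low 50s" real S/N
    ratio, stops = db_to_stops(db)
    print(f"{db} dB -> {ratio:6.0f}:1, about {stops:.1f} stops")
```

So a sensor that genuinely delivers 66 dB of S/N gives roughly eleven clean stops, while one that only measures in the low 50s is closer to eight and a half, which is why it looks noticeably noisier even if both advertise the same dynamic range.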
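Post 21 argues that every photosite contributes to the luminance image, so a Bayer sensor recovers more luma detail than a naive per-channel view suggests. The sketch below averages a 3x3 neighbourhood per channel and then forms luma, so the pixel's own sample always participates; the RGGB layout and Rec.601-style weights are assumptions, and real adaptive demosaickers are far more sophisticated than this.

```python
# Toy luma-from-Bayer sketch: every photosite's own sample feeds the luminance
# estimate at that pixel, so the full photosite grid contributes to luma detail.
# The RGGB layout and Rec.601-style weights are illustrative assumptions, and a
# plain 3x3 average stands in for the adaptive interpolators mentioned in post 21.

RGGB = [["R", "G"], ["G", "B"]]                  # assumed Bayer pattern
WEIGHT = {"R": 0.30, "G": 0.59, "B": 0.11}       # rough luma weights

def bayer_channel(y, x):
    return RGGB[y % 2][x % 2]

def luma_at(mosaic, y, x):
    """Estimate luma at (y, x) from a 3x3 neighbourhood of raw Bayer samples."""
    h, w = len(mosaic), len(mosaic[0])
    sums = {"R": 0.0, "G": 0.0, "B": 0.0}
    counts = {"R": 0, "G": 0, "B": 0}
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                c = bayer_channel(ny, nx)
                sums[c] += mosaic[ny][nx]
                counts[c] += 1
    # The pixel's own sample is always included for its channel, so no photosite
    # is thrown away when building the luminance image.
    return sum(WEIGHT[c] * sums[c] / counts[c] for c in "RGB" if counts[c])
```

An adaptive interpolator would additionally steer the averaging along edges rather than across them, which is where the "approaching 3K true resolution" claim in the post comes from.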