Everything posted by John Sprung

  1. And each grain is either exposed or not -- they don't have in-between states. So each grain is basically a binary digit. If we had a more precise handle on how many there are in a frame, that would give us an upper bound on the real information content of a frame. For instance, 800 million grains would be 100 Mega Bytes. A 4k x 3k scan would be about 12 Mega Pixels, which would mean we have about 8 Bytes' worth of film grain data per pixel. (A back-of-the-envelope version of this arithmetic is sketched after these posts.) I don't see any immediate useful conclusion from that, but it sure is interesting. It feels like maybe some day somebody much smarter than me might connect some dots here..... -- J.S.
  2. Thus has it always been. I remember some 30 years ago a guy like that who called me up after the shoot and fired me. Then he called again and re-hired me. Then he fired me again and re-hired me again. By the end of that night, I had been fired eight times and re-hired seven.... ;-) Take heart that so far everybody who's ever told me "You'll never work in this town again" has never worked in this town again themselves. -- J.S.
  3. Ah, but with film there are really only two possible outcomes: Either it's fine, or you blame it on the lab. ;-) Seriously, though, knowing what you have on film is a skill that you develop with experience. A few months working as an assistant and going to dailies, and you pretty much have it. A few more years of the same, and by the time you move up to DP it's as much a part of you as knowing how to breathe. -- J.S.
  4. Uncertainty? What uncertainty? Once in a blue moon you'd get a major screwup like a mis-labeled short end, and you're shooting 45 when you thought you had 98. I like the Plus-8 gear where they've added those latched clear plastic covers. You can see how things are set, and be confident that nothing has been touched by accident -- or will get touched by accident. The more important point is that it makes no sense for me to be tweaking the color of the purple flowers while the cast and crew are on the clock, if it could be done in post. It's actually kinda rude in addition to being wasteful. Time is money, but the cost of time is variable. As a general principle, complexity should migrate away from high cost time to low cost time. -- J.S.
  5. > They have to come in at under $3000 a day to compete with Dalsa. Not really. Dalsa was at the show, too. They don't have a production camera. They don't even have a prototype. What they have is an engineering demonstration-of-principle unit. It's roughly the size and shape of the front half of a BNCR. It has to be cabled to an offboard rack of electronics. They put lenses intended for 35mm film on it, and get severe vignetting -- both sides, not just the corners -- because their chip is actually too big. Lenses from the 65/70 system should work on it. The Panavision Genesis could cost more to rent and less to use just because it's easier to use. When you work 18 hour days and 6 day weeks, ergonomics matter a lot. Time is money. Panavision's first competition will likely be the Arri D-20. They weren't at CineGear, but they showed some tests at HPA in February. Edit: Oops, I was wrong. I remember now that Arri was at CineGear with some nice new film cameras. It was just that they didn't bring the D-20. -- J.S.
  6. > Most of the time you wouldn't want purple reproduced in that way, True, the purple flowers do pop out severely. I wouldn't want that unless the purple object were important to a story point or the punchline of a joke. In the test I saw, they grab your eye on both film and digital. In both cases, I'd want to fix them in post. Either that or just frame the damn things out in the first place. (But for this test, they clearly wanted them in.) They're especially an issue in this test because they're one of very few places where you can pick out a JND. > and few would want to go through the trouble of correcting it later, if there were an option that you didn't have to in the first place. I really don't like the idea of burning production time on messing with internal camera tweaks, and the risk of not getting it un-tweaked correctly for the next setup and the rest of the day. But for some productions, I spoze you could go that route on Genesis. On film, you'd have to send a note to telecine to take the curse off those flowers. -- J.S.
  7. The very much larger CCD is definitely not an easy thing to make. I guess it's just a matter of opinion how important DOF is, but getting it back to a comfortable place is important to me. Nominal 2/3" CCD's require an image that's actually in between the size of 16mm and Super-8. Making lenses to produce a decent image for that tiny chip is extremely difficult. It should be a no-brainer to leverage the worldwide investment in high quality lenses for 35mm film. Getting rid of the prism block removes another significant constraint or two in lens design. You can put the back element on a film camera or single chip camera closer to the focal plane because you don't have that optical block in the way. 3CCD cameras have a brick wall at f/1.45 because of the block. Separation is a technique that seems to show up early in the development of an imaging technology, to be replaced later by single-surface methods. We saw this in still photography, in three strip Technicolor, and now in digital. Anyhow, to me this is a significant advance over all the nominal 2/3" 3CCD technology. We had a pilot that shot with the Viper cabled to a rack-mounted SR recorder, so that could be done with the little SRW1. Putting it on-board would just be a bunch of bracketry and adapters, just a matter of getting the engineers to do it. The tragedy of the Viper is that a reasonable way to record its output finally showed up, but attached to a camera that renders it obsolete. -- J.S.
  8. Ambient life -- I was thinking more like those big moths that get attracted to the lights on night exteriors. As for the motion problems that Alessandro saw, I think Phil got it. It's MPEG compression artifacts. Either that or a broadcast signal problem like multipath or weak signal. That's more likely if it varies from day to day. The magic number 19 Mbps comes from the ATSC broadcast standard adopted by the FCC. It's the max bitrate they felt they could reliably cram thru the same size (6 MHz) channel as existing analog NTSC TV. Uncompressed HD is more like 1.55 Gbps = 1550 Mbps. (A rough version of that comparison is sketched after these posts.) On satellites, they can do something called statistical multiplexing. When they have several feeds going at once, they can divvy up the bit rate on the fly to give more to some and less to others depending on content. So your show may be given less than the average bit rate while it's sitting on a lockoff shot, and more than the average when you do a swish pan. Terrestrial broadcasters could do that, too, when they're running multiple SD channels. Some have tried squeezing one HD and one SD into the 19 Mbps, but stat mux really does well when you get into dozens of streams. As for displays, the direct view CRT is the Achilles heel of HDTV. The big problem is that the phosphor triads or stripes and the shadow mask impose yet another resampling on the image, just like looking thru a screen door. And the best of them are about 900 holes across the screen, so the thing can't possibly show you the detail that exists on HD tape. One-to-one pixel displays like DLP or D-ILA will give you a better idea of what's really there. That being said, if you're doing something for TV, it would be wise to also look at a downconverted feed on an NTSC monitor. It's just like how the mixers on the dub stage work with the good speakers, but also listen on the 3" cheapie speakers to make sure everything will play for the majority of the audience. -- J.S.
  9. Prior to the Genesis camera, it had always been relatively easy to tell what was shot HD and what was film, even when the film was transferred with the objective of making things match. This latest test I saw at CineGear is the first time that I really couldn't tell; I tried to guess, and guessed wrong. By far the biggest difference between them was on the falling dominoes. The Genesis operator let them fall quickly out of a lockoff shot. The film operator made a quick tilt to catch them bouncing around on the tabletop. Some people have noted the slight difference in color on the purple flowers. This really doesn't mean much, since both elements went thru digital color correction. They could probably go back to cc and make them match better. The flowers were well within the color gamut of both systems. Likewise for the even smaller differences in highlight and shadow detail, there is no sane reason to prefer one over the other based on dynamic range. Trying for an even more perfect match would only be a pointless engineering exercise. As for the film being "dumbed down" to match Genesis, I doubt it. It doesn't look that way to me. These tests were shot and timed by Allen Daviau. I've met him before this, and we talked about other digital cameras. My impression is that he's an honest guy who'll tell you the pros and cons as he sees them. IMHO, Genesis is a significant step forward in cinematographic technology, on the order of the first postwar Arriflexes 50 years ago or the first Panaflexes 30 years ago. It renders all three-chip cameras as obsolete as three-strip Technicolor. It puts 2/3" cameras out of the running for anything but low budget productions. The Panavision and Sony people have really done themselves proud on this one. -- J.S.
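
A back-of-the-envelope sketch of the grain-as-bits arithmetic in post 1. The 800-million grain count and the 4k x 3k scan size are the illustrative figures from that post, not measured values; the only point is to show where the "about 8 Bytes per pixel" number comes from.

```python
# Grain-as-bits arithmetic (illustrative figures from the post, not measurements).
grains_per_frame = 800_000_000          # assumed grain count for one frame
frame_bits = grains_per_frame           # each grain is exposed or not: 1 bit
frame_bytes = frame_bits / 8            # 800e6 bits -> 100e6 bytes = 100 MB

pixels = 4096 * 3072                    # a "4k x 3k" scan, about 12.6 megapixels
bytes_per_pixel = frame_bytes / pixels  # works out to roughly 8 bytes per pixel

print(f"{frame_bytes / 1e6:.0f} MB of grain 'bits' per frame")
print(f"{pixels / 1e6:.1f} Mpixels in the scan")
print(f"{bytes_per_pixel:.1f} bytes of grain data per pixel")
```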
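A minimal sketch of the bit-rate comparison behind post 8. The 1920x1080, 30 fps, 10-bit, 4:2:2 parameters are typical HD assumptions, not numbers from the post, and they count only the active picture; the full HD-SDI link rate with blanking and ancillary data is about 1.485 Gbps, in the ballpark of the ~1.55 Gbps figure quoted there.

```python
# Uncompressed HD vs. the ~19 Mbps ATSC payload (assumed, typical HD parameters).
width, height, fps = 1920, 1080, 30
bits_per_sample = 10
samples_per_pixel = 2                  # 4:2:2 -> one luma + one chroma sample per pixel, on average

uncompressed_bps = width * height * fps * bits_per_sample * samples_per_pixel
atsc_bps = 19_000_000                  # ATSC payload squeezed into a 6 MHz channel

print(f"Uncompressed active picture: {uncompressed_bps / 1e9:.2f} Gbps")
print(f"ATSC broadcast payload:      {atsc_bps / 1e6:.0f} Mbps")
print(f"Compression needed:          roughly {uncompressed_bps / atsc_bps:.0f}:1")
```

However you slice it, the broadcast encoder has to discard well over 95% of the original bits, which is why MPEG artifacts show up on difficult motion.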