Everything posted by Andy Jarosz

  1. Inexpensive DLP projectors don't play well with camera shutters, and LED/LCD projectors are the same story, just with different illumination sources.
  2. Unless the projector is the only light source in the room, it will be very difficult to shoot. If you plan on using other lights to simulate brake lights/streetlights, it will be nearly impossible to avoid affecting the screen. We used 20k-lumen projectors for rear projection on The 4400 last year, for daylight driving scenes, and we were still having trouble with light spill.
  3. Most of the time screens are replaced with VFX. I'm not sure if it's possible to genlock a MacBook; it's definitely not possible for a phone.
  4. You're not doing anything wrong, the refresh of the screen simply isn't synchronized to the camera shutter. Even if they're running at the same frame rate, if the phase is offset, you will get double frames. If you can't sync the screen and camera via genlock, there's nothing you'll be able to do. (Rough sketch of the phase problem below.)
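     A minimal sketch of why phase matters even at matched rates. All the numbers here (24 fps, a 180-degree shutter, a 10 ms offset) are hypothetical, purely for illustration:

     ```python
     # Hypothetical numbers: a 24 fps camera with a 180-degree shutter
     # filming a display that also refreshes at 24 Hz, but out of phase.
     CAMERA_FPS = 24.0
     SHUTTER_ANGLE = 180.0        # exposure = (angle / 360) / fps
     DISPLAY_HZ = 24.0
     PHASE_OFFSET = 0.010         # display refresh lags the shutter by 10 ms

     exposure = (SHUTTER_ANGLE / 360.0) / CAMERA_FPS   # ~20.8 ms
     refresh_period = 1.0 / DISPLAY_HZ                 # ~41.7 ms

     for frame in range(3):
         t_open = frame / CAMERA_FPS
         t_close = t_open + exposure
         # Which display frame is on screen when the shutter opens/closes?
         first = int((t_open - PHASE_OFFSET) // refresh_period)
         last = int((t_close - PHASE_OFFSET) // refresh_period)
         print(f"camera frame {frame}: sees {last - first + 1} display frame(s)")
     # With PHASE_OFFSET = 0.0 every exposure sees exactly one display frame;
     # with an offset, exposures can straddle two -- the "double frames."
     ```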
  5. Phil--are you taking into consideration the dual-gain sensor path? It's likely achieved from two 60 dB-ish data pathways merged together. (Rough arithmetic below.)
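     For a rough sense of the arithmetic -- a sketch only, not the actual sensor design -- each ~60 dB path covers about 10 stops, and merging a high-gain and a low-gain readout extends the total by roughly however far apart the two gains sit:

     ```python
     import math

     # Illustrative numbers only -- not a real sensor spec.
     PATH_DB = 60.0              # dynamic range of one readout path
     GAIN_OFFSET_STOPS = 4.0     # assumed gain difference between the paths

     stops_per_path = PATH_DB / (20 * math.log10(2))   # ~10 stops
     # The low-gain path keeps highlight detail above where the high-gain
     # path clips, so the merged range is roughly one path plus the offset.
     merged = stops_per_path + GAIN_OFFSET_STOPS
     print(f"one path: {stops_per_path:.1f} stops, merged: ~{merged:.1f} stops")
     ```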
  6. The PC component market is in a rough state nowadays. The silicon shortage has driven prices up and availability down. It may not be that those companies are ignoring you--they might literally just not have product to sell. Anyway, nowadays it's very easy to build your own PC--it only takes about 30-45 minutes to get all the hardware installed, and there are tons of great build guides on YouTube.
  7. Currently, it's Unreal--that wasn't my point. The point was that the tools being used now are being used out of necessity, not because they're good. In 5-10 years, we'll be using dedicated engines and software packages specifically for virtual production. To continue my analogy, we need a few years before the RED ONE or Alexa is developed.
  8. I work in virtual production, both on set as a consultant and on the tools-development side. Asking if cinematographers will learn Unreal in the future is a bit like asking, after AotC came out in 2002, whether they would all become familiar with tape broadcast cameras. The sentiment is correct, but the implementation will be wrong. Unreal is a fantastic tool; it's very powerful. But it was never meant to do the kinds of things we're doing with it in VP. The more you get into large setups with multiple clusters and multiple editors, the more evident this becomes.
  9. I don't really understand people who are mad at this camera...if you don't like it, don't buy it. Why waste time hating it? The other thing people are missing is they've already said there will be a Ranger version of this camera. If that's not everything you want in a production camera, I don't know what is!
  10. Apple regularly makes anti-consumer design decisions and they're the most profitable company in the world. Strong decision making will always be rewarded. In many ways, RED *is* a hobby--they make cameras to see what can be done, to push the envelope, and figure out the limits of imaging. If you agree with that mindset, great! If not, we are lucky enough to live in a time with many other camera options. I always laugh at the old-school "companies only exist to serve the consumer" mindset. That simply isn't true nowadays. Companies like SpaceX, Oculus, and Looking Glass were all founded simply as excuses to make cool stuff. It just so happens others think it's cool too.
  11. You forgot one key aspect: it uses CFexpress media, of which only one card is actually certified right now. The reason being, no other card can handle the data rates. They're not expecting everyone to agree with their design decisions because this camera isn't *for* everybody. If it works for you, great. If not--a larger, Ranger-style version of this camera will apparently be coming soon as well. RED simply doesn't care if you agree with them or not--they're going to make the best camera possible in the smallest body possible, and you either go along for the ride or buy something else.
  12. If you're going to be doing realtime rendering with Unreal, it's more complicated than just playing back a video. You'll need 2mm-pitch LED tiles (ROE Black Pearl 2) as well as a Brompton SX40 system to drive them. The Brompton system scales with the screen size, so you'll need to talk to them about how many you need--that's what allows you to genlock the camera to the screen. For Unreal, you'll need workstation PCs with NVIDIA Quadro graphics cards as well as the Quadro Sync card. Again, that's what lets you genlock the PCs to avoid tearing. Depending on the screen resolution, you may need several PCs (rough node-count sketch below). If the camera is going to be moving, you'll need a motion tracking system like OptiTrack or Mo-Sys. Generally you're looking at a $500,000-1,000,000 budget. On Set Facilities sells everything you need in one easy-to-purchase kit: https://onsetfacilities.com/product/led-xr-stage/
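     A back-of-the-envelope way to gauge the render-node count -- every number here is an assumption, not a spec:

     ```python
     # Hypothetical wall size and per-GPU pixel budget for realtime Unreal.
     WALL_W, WALL_H = 7680, 2160        # total LED wall resolution (assumed)
     PIXELS_PER_NODE = 4096 * 2160      # rough realtime budget per render node

     total_pixels = WALL_W * WALL_H
     nodes = -(-total_pixels // PIXELS_PER_NODE)   # ceiling division
     print(f"{total_pixels / 1e6:.1f} MP wall -> ~{nodes} render node(s)")
     ```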
  13. Don't underestimate the ability to shoot RAW. Same Alexa look, but with a ton more information for more aggressive grades.
  14. No, color bars are generated digitally by the camera and are independent of any look. Bars help you calibrate everything to an empirically correct standard, so you know all of your equipment is starting from the same point.
  15. Meal penalties are usually in the union contracts, so if you're nonunion they only have to do what they say they will in your deal memo.
  16. What Mark and I are trying to say is that by trying to boil the spectrum of a light down to one number, you're always going to be fundamentally lacking critical information. Averaging these values together just gets you an even less specific, less informative number. If you're a professional, you really shouldn't be looking for shortcuts like that--you should want to know the exact performance of the tools you use so you can make informed decisions about how to use them. For instance, if you know the camera you're using is biased more towards magenta, then choosing a light that has higher spectral content in the green part of the spectrum will result in a more neutral image. That is not a decision you can make based on singular numbers--including SSI, though that is the one that will get you closest. Think about cameras: different cameras absolutely are more sensitive to different colors than others. That's a part of their "color science." So why is there no singular CRI- or TLCI-type number for cameras? Because cinematographers test them thoroughly, shooting charts and test subjects under different conditions in order to understand how the camera performs. When the time is taken to do things properly, the shorthand numbers become increasingly useless. (Toy example below.)
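     A toy example of why a single score hides this: recorded color is spectrum times sensitivity, summed across wavelengths, so two lights with the same average output can render very differently through the same camera. All values are made up:

     ```python
     # Five coarse wavelength bands: ~450, 500, 550, 600, 650 nm.
     light_a = [0.8, 0.9, 1.0, 0.9, 0.8]      # smooth, daylight-ish spectrum
     light_b = [1.4, 0.5, 0.6, 0.5, 1.4]      # green-deficient, same average
     cam_green = [0.1, 0.5, 1.0, 0.4, 0.1]    # hypothetical green sensitivity

     def response(light, channel):
         # Recorded value = sum of spectral power x channel sensitivity.
         return sum(l * c for l, c in zip(light, channel))

     print("average output  A:", sum(light_a) / 5, "  B:", sum(light_b) / 5)
     print("green response  A:", response(light_a, cam_green),
           "  B:", response(light_b, cam_green))
     ```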
  17. I would say that is the only way to truly empirically evaluate a light source. I don't want a "colored" number or one that is tailored for any specific capture device, be it a camera or a human eye. When I hear the Skypanels have low color rendering scores but look great on Alexa cameras, I don't go "what's wrong with the light?" I go "What's wrong with the camera?" As a creative, you want a light to be as unbiased as possible so that your big choices--camera and lens--are consistent and predictable across every shooting scenario. A light that only looks good on one kind of camera is not that, and so the color rendering numbers are suitably low. If look is important to you, it's in your best interest to want to control the variables as much as possible. Only empirical measurements give you enough information to form these decisions.
  18. I should have mentioned this in my other post, but TLCI is not the best metric to evaluate lights on. Like CRI, it can be "gamed" through particular tuning of the light. SSI is my preference.
  19. If you want a value that measures how lights look through a camera specifically, that's what TLCI is for.
  20. You're going to get accumulated error by averaging all the scores--which each have their own margin of error--into a single score. The error of one value on its own is not that big a deal, but you could potentially get a misleading value when they're all factored together. Unless you are ensuring your instruments are calibrated and accurate, this will be a problem. What is the issue with SSI? Is it just that the numbers aren't high enough for LEDs? That's kind of the point, the numbers aren't supposed to favor any one light source--they're just supposed to tell it like it is. Regardless, I don't think any single-number value will ever tell the full story. Simply look at the spectrum overlaid on top of daylight, and you'll immediately be able to evaluate how the light will perform. (Error-propagation sketch below.)
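     A sketch of standard error propagation for an unweighted average -- the scores and per-metric uncertainties below are made-up values:

     ```python
     import math

     scores = [95.0, 88.0, 72.0, 81.0]    # e.g. CRI/TLCI/SSI-style scores
     sigmas = [2.0, 3.0, 5.0, 4.0]        # assumed error of each measurement

     mean = sum(scores) / len(scores)
     # For independent errors, the uncertainty of the mean is
     # sqrt(sum(sigma_i^2)) / N. That shrinks random error, but it does
     # nothing for shared bias -- e.g. an uncalibrated meter skewing
     # every score the same way, which is the calibration point above.
     sigma_mean = math.sqrt(sum(s ** 2 for s in sigmas)) / len(scores)
     print(f"averaged score: {mean:.1f} +/- {sigma_mean:.1f}")
     ```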
  21. 1000% test, and try to break it in your test--shoot close to the screens, at sharp angles, etc. to account for every scenario. If you're forced to go without genlock, use a 144-degree shutter and limit vertical camera moves. Make sure the screens you use are high quality--ROE BP2 or BO2 ideally--and that they have a low scan rate. Video panels usually run at either 1920 or 3840 Hz (they know that's confusing), so you have refresh rate for days, but the actual time the screen takes to refresh is a large factor. The shutter "actuation" speed of film cameras may not play nice with it. For what it's worth, we did the video that is the background of loledvirtual.com without genlock on an 8K Weapon and it turned out fine, but there were some shots with flicker that had to be fixed in post. (Shutter math below.)
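     The 144-degree recommendation has clean arithmetic behind it -- a quick check using the panel rate mentioned above (actual on-set rates may differ):

     ```python
     # A 144-degree shutter at 24 fps, filming panels driven at 3840 Hz.
     FPS = 24.0
     SHUTTER_ANGLE = 144.0
     PANEL_HZ = 3840.0

     exposure = (SHUTTER_ANGLE / 360.0) / FPS   # = 1/60 s, ~16.7 ms
     cycles = exposure * PANEL_HZ               # panel cycles per exposure
     print(f"exposure: {exposure * 1000:.1f} ms, panel cycles: {cycles:.0f}")
     # 1/60 s captures a whole number of cycles (64 here), so each frame
     # integrates the same amount of panel light even without genlock.
     ```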
  22. On the RED KOMODO: I think the camera is still in "open beta," as it were. It seems that they're holding off on releasing specs until they're confident they're not going to change. Which is VERY un-RED--I'm glad to see them being a little more conservative this time.
  23. In this case, it has everything to do with the new CFA. Other cameras will either line bin (pixel skip) to get lower resolutions using the full sensor size, or window the sensor--both have major drawbacks. There are also cameras that will downscale in camera, but you can't do that in a RAW workflow. The new CFA is symmetrical across the sensor, so rather than skipping lines, they can skip clusters of photosites. And since the sensor is so dense, there is essentially no penalty for doing so--using just 4K worth of photosites on this 12K sensor essentially gives you the same pixel pitch that a native 4K sensor would have had anyway. It's very clever, and it means that they've made what could be the last camera they need to make for the next 10 years. Buy it now, shoot 4K. In a few years, you already have 8K. A few years later, you're already set for 12K. (Rough pitch math below.)
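     Rough pitch math for that claim -- the 27 mm Super 35 width is an approximation used only for illustration:

     ```python
     SENSOR_W_MM = 27.0    # approximate Super 35 sensor width (assumed)
     FULL_RES = 12288      # "12K" photosite columns
     SUB_RES = 4096        # 4K worth of photosites across the full width

     native_pitch_um = SENSOR_W_MM * 1000 / FULL_RES     # ~2.2 um
     sampling_pitch_um = SENSOR_W_MM * 1000 / SUB_RES    # ~6.6 um
     print(f"native pitch: {native_pitch_um:.1f} um, "
           f"4K sampling pitch: {sampling_pitch_um:.1f} um")
     # ~6.6 um is in the same ballpark as a native 4K Super 35 sensor,
     # so sampling 4K from the 12K grid carries essentially no pitch penalty.
     ```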