Phil Rhodes

Everything posted by Phil Rhodes

  1. I'll be the first to accept that I'm a reviewer and I'm likely to be getting cameras that they know work, but I've not seen huge problems with Ursas. I have two here right now (A G1 Pro and an original 4.6K) and both have seen quite a bit of action. I wouldn't be surprised to discover that the ability of the modern internet to concentrate certain types of information is making failures seem more common than they are. Yes, some more IR filtering is probably a good idea; some of the aftermarket OLPFs have included it. I would second the commentary on Red. In my view, the company has behaved questionably ever since its earliest claims of performance and release dates, which were quite clearly impossible and which it failed to achieve, then congratulated itself for achieving. All of that is quite apart from its interaction with the patent system. I view the way patents are being issued and used at the moment as largely broken and anti-useful and I think several companies, particularly including Red, are profoundly abusing the situation to the detriment of film and TV workers worldwide. For a long time I avoided criticising Red on the basis of its pictures but it was always clear that much was being sacrificed in pursuit of easily-marketable resolution figures, and let's not forget that the company managed to redefine a major piece of industry terminology to its advantage. What surprises me is that those resolution figures are not special anymore, and it's not even particularly affordable equipment. I have no idea why the company and its products are taken so seriously. That said, you can exchange my advice on Ursa for something like FS7 or FX9, EVA-1 or any other midrange camera, all of which will very significantly outperform an Alexa EV (that is, the original Alexa) on more or less every technical basis.
  2. Many more modern cameras beat it by a significant margin in more or less every department. An Ursa Mini 4.6K G2 is smaller, lighter, less power hungry, starts faster and has higher resolution, higher frame rates, around a stop more sensitivity, better sound, onboard raw, arguably has a better viewfinder and can control EF lenses. Alexa has the name and to some extent the look, but I'm not convinced how identifiable that look really is. Possibly the rolling shutter is less visible. You might want to get the optional aftermarket OLPF to go in the Ursa, and Wooden Camera have a couple of breakout boxes that add features which will be useful to people on high end single camera sets, but beyond that, I think Alexa is an increasingly difficult choice unless you can get one for next to nothing. Personally I'd wait for wider reactions to the Ursa Broadcast G2, which is basically the Ursa Mini 6K from what I can see of it. That might be a very welcome development. P
  3. That's kind of a broad question. Do you have any information on what it'll be used for, the kind of venues it'll be in, what you want to do with it?
  4. I'm sitting here listening to Zimmer's score to Dune, which is a piece of music I barely recognise because the first time I heard it, it was so loud I had my fingers in my ears. Went to see Tenet. Quite apart from the terrible dialogue mix, my lasting impression is of everything being so loud it was actively unpleasant, and I felt the need to protect my ears. Went to see Dune. Was massively distracted by the unpleasantly over-loud sound; almost painful. Horrible experience. Considered going to see No Time to Die. With enormous regret, decided that I'd rather see it under circumstances where I wasn't likely to suffer hearing damage. I want a volume control. I am not alone, and I'm not kidding; I strongly suspect Dune has the potential to create hearing damage. Now, I am fully aware that anyone from the audio-related disciplines of film and television production will be replete with excuses for this. It's the director's intention. It's the movie theatre. It's fine, you're just being silly. Sorry, no. This is now so bad that it's actively preventing people from going to movie theatres because the sound experience is positively disagreeable. It's not entertainment, it's something you want to be over because it is actually nasty. It's bad. There is no justification for this, and people who try to make those justifications need to be slapped around until they realise that they're not mixing sound for themselves and their own state of hearing that's long been ruined by constant exposure to ultra-loud mixing stages. This has become completely insane.
  5. I think the idea is that it has such overwhelming resolution that the sharpness compromises of a mosaic sensor are overcome and even 8K material is so oversampled that it approaches the Nyquist limit of the frame dimensions in terms of sharpness and detail. In many circumstances, the factor limiting resolution is not the sensor nor the quality of the lens, but the diffraction limit of the lens, which is geometrically determined and therefore exactly the same for a low-cost stills zoom as it is for a Leitz Cine Prime. The point isn't really 12K. The point is it has much more than you need for any reasonably foreseeable circumstance. Yes, smaller photosites are noisier, which compromises dynamic range, but that's sometimes a rather misunderstood metric. If you take an HD area from the 12K sensor, it will be physically smaller than the HD area of a 4K sensor, and it will be noisier. But if you take the whole 12K sensor and scale it to HD, you gain back much of the loss. Not all of it, of course, because more pixels have more gap around them, so your fill factor becomes lower and you're using less sensitive area overall, but this isn't quite as simple as is often assumed. The big problem with the 12K is that using it to its full potential locks you into a Blackmagic Raw workflow. This, again, is not quite as simple a problem as we might think; generating 12K ProRes would be an impractically-huge processing load even if anything supported it, and scaling that vast sensor down to saner resolutions is hard work too. I do think they could have made slightly more effort in this regard, though; I think this is what puts most people off. Blackmagic Raw or bust, really.
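For a sense of scale, here's the back-of-envelope version (my own numbers, not anything from a manufacturer): the diffraction-limited spot size is set by the Airy disc diameter, which depends only on wavelength and f-number.

```latex
% Airy disc diameter (first dark ring) for a diffraction-limited lens:
\[ d \approx 2.44\,\lambda N \]
% e.g. green light (\lambda = 550\,\mathrm{nm}) at f/8:
\[ d \approx 2.44 \times 0.55\,\mu\mathrm{m} \times 8 \approx 10.7\,\mu\mathrm{m} \]
```

Compare that 10.7-micron spot with a photosite pitch of something like 2.2 microns on a 12K Super-35 sensor and it's clear that, by the middle of the aperture range, the glass rather than the silicon is doing the limiting.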
  6. "Yeah, 'cause, like, that retro thing, that's like, really in right now, isn't it. Yeah. Let's do it. We know what we're doing"
  7. I made a pop-up diffuser from a collapsible laundry bin, and coiled some high-quality LED strip inside. Works great.
  8. StudioBinder, FrameForge. I've known people use Sketchup. None of them have the expression of a proper sketch, of course, though they do have one advantage: with a sketch you can draw anything you like, whereas a proper 3D environment enforces only showing things which are actually possible to shoot. P
  9. It may be worth adding that almost anything you can get from the theatre and live events world is likely to need careful testing for flicker, too. As Harrison says, though, I wouldn't be entirely discouraged by that reality as it might be possible to save a really enormous amount of money and make yourself popular. In order to avoid wasting a really enormous amount of money and making yourself unpopular, proper testing for flicker will be required. I note this here because I've been caught out myself, but there are a few easy things to do which can make this easier.
     - Don't just point a camera at an open LED emitter. The actual emitters are extremely intense and likely to clip the camera, perhaps in strange ways depending on their colour, which can make subtle flicker hard to see (also you'll spend the next ten minutes seeing a little row of dots). Put them behind a diffuser of the type you're intending to use, or aim them at a diffusing surface, perhaps just a sheet of white paper.
     - Don't just rely on a monitor. Look at them on a monitor to detect banding on rolling-shutter cameras, if any, but also look at a waveform monitor as this makes flicker much easier to see as a bouncing in the waveform.
     - Check for long-term shutter phase problems. Frame up a shot, note where the exposure is on the waveform, and leave it like that, checking back every few minutes or so. This is mainly an issue on low-frequency lighting like iron-ballasted fluorescents or metal halide lights, often in 50Hz localities shooting frame rates near 24 which are almost half the mains frequency, but it's not impossible to imagine it here.
     - Don't test them at full power. Or rather do, but also test them at 25%, 50%, and 75% as well. At full power the pulse width modulation duty cycle may be 100%, which means they don't actually switch off at all, and thus won't flicker.
     - And naturally, test at whatever frame rates and shutter speeds you anticipate will be used. In a broadcast context this is likely to involve a discussion with a senior vision engineer.
     With something like WS2812, as Harrison says, you will very likely be able to find a setup which will make it flicker, probably at high shutter speeds. A pain. But again, you could save a massive pile of gold. P
  10. If you want to go really budget, there is, as you'll have realised, very inexpensive programmable LED strip which could potentially be used for something like this. The part number you're looking for is WS2812, which describes the tiny controller chip that's in each individual LED emitter. Various DMX and ArtNet-to-WS2812 devices are available. This would be extremely inexpensive compared to almost any other imaginable way of doing it, it would fit easily inside almost any set piece and it would be fully programmable. If you don't need it to be a beautiful keylight to make the talent look wonderful, which you don't, you could probably make it work. They may not end up looking on camera like they do in person, but that's fairly normal. The concerns will be longevity and flicker. These things are based on individual red, green and blue emitters which generally have a longer life than white-emitting LEDs, and I would encourage any design for this sort of scenario to run the LEDs at no more than two thirds maximum power to extend their life, and that would go for an Astera tube or a length of WS2812 strip. In a news studio sort of situation I would expect a good chance of avoiding flicker problems at conventional frame rates and shutter speeds. They look OK here:
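If you wanted to bench-test a length of it before committing, something like the sketch below is all it takes on an Arduino with the FastLED library. The pin number, strip length and colour are my own placeholder choices, and the setBrightness() call is just an easy way to observe the run-it-at-two-thirds suggestion above.

```cpp
#include <FastLED.h>

// Placeholder values - adjust for your own strip and wiring.
#define DATA_PIN 6        // Arduino pin driving the strip's data line
#define NUM_LEDS 60       // number of WS2812 emitters on the test length

CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  FastLED.setBrightness(170);                        // roughly two thirds of 255, for longevity
}

void loop() {
  fill_solid(leds, NUM_LEDS, CRGB(255, 180, 120));   // warmish test colour
  FastLED.show();
}
```

None of which tells you anything about flicker behaviour, of course; that still wants testing on a waveform monitor as described elsewhere.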
  11. From what I've heard a lot of filmout-and-scan-back services are using sub-super-35 areas of the film anyway; otherwise, it can suffer from excessive subtlety. Possibly this is because at least some places have to use intermediate stock, since the film recorders won't go dim enough for camera neg, but I know at least some places have started putting sunglasses on their ArriLasers for that reason. P
  12. I've heard it said this is a good reason for motion control, even if it's easier to just track the shot to markers in the end (which obviates a lot, if not all, of the messing about with lens rectification and other maladies).
  13. I think one of the best things about the current situation is that it really isn't. Material delivered across the internet routinely looks better than that delivered over the air, and so the big streamers (who spend a lot of money on bandwidth) clearly could reduce that bandwidth if they really cared to. There is naturally a limit beneath which things would start to attract complaints, but that's lower than where they are now. Perhaps this is an example of competition working well; Netflix don't want to start looking blocky compared to Amazon, I guess.
  14. I have some good friends in that part of the world and much of it is familiar - haven't been since pre-pandemic, more's the pity. Where did you stay? P
  15. They did. We had one, a Panasonic NV-FS88 with the barcode reader in the corner of the remote control. It was pretty silly. You could scan a barcode, or you could select "Tuesday." I was never very clear on what the point was.
  16. I can only refer you to my previous thoughts on the subject. Even once it's in your bank account, the deal is not sealed. P
  17. I won't knock anyone who wants to talk about how things were done. There's certainly a wide variety of expectations as regards budget but there's always something to learn. More is now possible with less than ever before, in terms of gear. The problem with achieving good results now is access to locations, which has probably become harder even as the gear has become easier to get. At some point, a big light on a cherrypicker still requires a cherrypicker.
  18. I've never heard that term; the UK terminology is telerecording, and it's through this process that we have some examples of early Doctor Who, for instance. I wasn't aware that it had been used to create theatrically-released material. In PAL-world, at least, it involved one film frame per video field, so the results were at 50fps. Some interesting work has been done since on monochrome telerecordings of material which actually contained composite colour video information, visible in the telerecording as dot crawl. Modern computer techniques have made it possible to reconstruct the colour information from those images. Because the colour burst is not visible in a telerecording, the hue angle must be manually set. It's sort of entertaining. P
  19. Something that's worth looking out for is the widget that I think was called Sharp Max, which was designed to clamp on the front of DigiPrimes (or any other B4 lens) and present a target at optical infinity for back focus adjustment. When I saw DigiPrimes on set, the Sharp Max was routinely used at every lens change. I'm not sure how essential that really was, but it's a nice tool to have. With regard to JK's thoughts, I would take the position that for nearly $600 each, shipped, you could buy any of a wide variety of classic stills primes which would probably behave better than a converted DigiPrime on modern cameras, cover full frame, be much faster, etc. You'd be doing it for the build.
  20. The problem isn't the cost. Doing it in post is cheaper in the real world, which is why all the best work with digital firearms effects is actually done on YouTube. The problem is that even the best simulated firearms effect often isn't completely convincing. It looks cheap. Obviously, big productions find ways to make everything expensive, but that's politics and corporate power plays more than anything else. The other issue here is that this is a ludicrous knee jerk reaction. There are lots of things on film sets that kill way more people than guns. This got a lot of attention because it involved a famous name and a conventionally attractive woman. It's no disrespect to either of them to point out that the reaction is disproportionate. Has anyone here heard of Mark Milsome, for instance? Nobody famous was involved, and he was a middle aged man. Nobody's calling for an end to vehicle stunts. I don't think it's very fair to end the livelihoods of the world's armourers, who overwhelmingly do a brilliant job, for no better reason than that Alec Baldwin is famous.
  21. I looked into this and I'm not sure it's a great idea if you don't already own the lenses, or unless you can get a fantastic deal. You'll lose two stops in the converter, so they'll end up being roughly T/4, which isn't spectacular. Furthermore, as you've noted, the converter will cost you something, particularly in terms of aberration. Some are better than others, but at least some of the poor reputation of B4 zoom lenses on Super-35 cameras is due to the converter, not the lens. B4 broadcast zooms generally don't have the same design goals as, say, DigiPrimes, and they tolerate different compromises, but the converter certainly adds a layer of... well... lack-of-ideality. Good converters are expensive. If I tripped over a set of Digiprimes in a box I'd use them happily, but I wouldn't pay a fortune for them now. Most of the sets I've seen go for more than I'd pay for this sort of solution. I think the world overvalues them at this point. DigiPrimes are good. DigiPrimes on a converter are going to be considerably less good. Personally I'm not convinced most people are really that concerned over sensor size; this bigger-is-better insanity is not often very considered. The advantages of smaller sensors are huge, in that you can use great-yet-inexpensive lenses. Obviously, if you've got a client that wants 4K, you're stuffed, but you could always get a Blackmagic 12K and get 4K out of an area I suspect converted DigiPrimes would cover. P
  22. This can probably be handled by ignoring switch events until the relevant key strokes have been sent to the recorder; the project I found had a 10ms delay between the key down and key up events, which should be adequate for debouncing.
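In Arduino terms the lockout might look something like the fragment below. Purely illustrative: the pin and the timing are my own placeholder values, and the keystroke-sending part is left as a stub.

```cpp
// Illustrative debounce-by-lockout: once we've acted on an edge, ignore further
// edges until the keystrokes have gone out. Pin and timing are placeholder values.
const unsigned long LOCKOUT_MS = 50;
volatile bool edgeSeen = false;
unsigned long lastAction = 0;

void onEdge() { edgeSeen = true; }      // interrupt handler just sets a flag

void setup() {
  pinMode(2, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(2), onEdge, FALLING);
}

void loop() {
  if (edgeSeen) {
    edgeSeen = false;
    if (millis() - lastAction >= LOCKOUT_MS) {
      // ...send the key-down code, wait ~10 ms, send the key-up code here...
      lastAction = millis();
    }
  }
}
```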
  23. I'd agree about putting something in series to limit current, but bear in mind the ATMEGA328 has internal pullups (set it as an input and then set it high as if it were an output). So, yes, do that, maybe, and pull the pin down.
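For what it's worth, this is the classic way of switching the internal pull-up on, plus the modern shorthand (pin 2 is just an example):

```cpp
void setup() {
  // Old-style trick: configure the pin as an input, then "write" HIGH to enable
  // the ATmega328's internal pull-up resistor on that pin.
  pinMode(2, INPUT);
  digitalWrite(2, HIGH);

  // On current Arduino cores this is the same thing in one line:
  // pinMode(2, INPUT_PULLUP);
}

void loop() {}
```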
  24. Happy to. Bear in mind that everything I'm saying here is based on my theoretical, not practical, understanding of the camera and the recorder, so I could be wrong. From my understanding this is how it works:
     - Detect when the contacts on the camera close. I would buy an off-the-shelf flash port connector cable and cut it in half. Connect one side of the connector to the +3.3V power [actually, no, connect it to ground], and the other to an interrupt-capable GPIO pin. Configure the GPIO pin as an input with interrupts enabled [and the pullup set]. Write an appropriate interrupt-servicing routine. When the interrupt fires, you need to take some action.
     - If the GPIO pin is currently set, start the camera, otherwise stop it. This will involve sending serial bytes to the recorder. The bytes you need to send are known from this open-source project. From what I can see there you need to send a "key down" code for the record button, then wait a few tens of milliseconds, then a "key up" code. Then, to stop, the key down code for the stop button, then wait, then the "key up" code. You may need to mess about with this a bit to make it behave as you want.
     - The Arduino has a serial port. This is used for debug communications back to your computer, but you don't necessarily need that for this. I'd buy an off the shelf 2.5mm 4-pole jack cable, and cut it in half. Connect the Arduino serial output pin to the appropriate pin on the 2.5mm remote control port jack. The other pins on that jack will supply power and ground (you are running the Arduino from the Zoom recorder's power supply). P
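Put together, and with the same caveat that this is theory rather than something I've run against a real recorder, the whole thing might look like this. The key codes, the key-up value and the baud rate are all placeholders to be replaced with the values from that open-source project.

```cpp
// Sketch of the flow above, for an Uno-class board. All protocol values are
// placeholders - substitute the real key codes and baud rate from the
// open-source project mentioned in the post.
const byte PIN_FLASH  = 2;      // flash-contact input (interrupt-capable pin)
const byte KEY_RECORD = 0x01;   // hypothetical record-button code
const byte KEY_STOP   = 0x02;   // hypothetical stop-button code
const byte KEY_UP     = 0x00;   // hypothetical key-up code

volatile bool contactChanged = false;
bool rolling = false;

void onContact() { contactChanged = true; }   // interrupt-servicing routine

void sendKey(byte key) {
  Serial.write(key);            // "key down"
  delay(30);                    // a few tens of milliseconds, as described above
  Serial.write(KEY_UP);         // "key up"
}

void setup() {
  pinMode(PIN_FLASH, INPUT_PULLUP);   // contact closure pulls the pin to ground
  attachInterrupt(digitalPinToInterrupt(PIN_FLASH), onContact, CHANGE);
  Serial.begin(2400);                 // placeholder baud rate for the recorder's remote port
}

void loop() {
  if (contactChanged) {
    contactChanged = false;
    delay(20);                        // crude debounce before reading the settled level
    bool closed = (digitalRead(PIN_FLASH) == LOW);
    if (closed && !rolling)      { sendKey(KEY_RECORD); rolling = true;  }
    else if (!closed && rolling) { sendKey(KEY_STOP);   rolling = false; }
  }
}
```

Whether the recorder wants separate record and stop codes, or the same button twice, is exactly the sort of thing you'd have to mess about with, as above.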