
Evan Richards

Everything posted by Evan Richards

  1. But if the UHD version has an extra layer of blacks for more contrast, then it wouldn't be as simple as a LUT. You'd have to do some tone mapping, I guess (for going from Rec 2020 to Rec 709)?
  2. Thank you, this is very useful as this is exactly what I am trying to do. To my eye, though, the color of the "simulated" images seems far enough away from the original Bluray that it makes me wonder which image more closely represents the DP's vision. I mean, people have been watching blurays on standard HD TVs for years, but does the fact that the UHD blurays look so different mean that standard blurays never truly represented what the DP wanted you to see?
  3. Yes. I suppose that's true. I had forgotten that as we always have the show LUTs built into Nuke and RV that automatically convert our images to the proper viewing space without us having to do anything.
  4. Hmmm. That's interesting. I haven't found that on the blurays I've examined, but I guess I haven't looked at very many. The ones I examined were "The Post", "Atomic Blonde", and "Moonlight". If what you say is true, that most are SDR made from 2k sources, it seems like you'd be able to apply a Rec 2020 to Rec 709 LUT to make the 4k bluray footage match the HD footage? But I haven't found that to be the case. Check out these 4 images from Atomic Blonde (slight nudity warning): https://imgur.com/a/1jZD3jm The first image is taken from the bluray; the second image is taken from the UHD bluray and had the color tweaked as it was being extracted. They don't look too similar. The UHD images look much more colorful, contrasty, and overall brighter. It seems like the way the UHD looks would reflect the original intention for how it should look, but both of these images are backed down to JPGs, which means they COULD have gotten this look in the HD bluray but chose not to. So...maybe the way the HD bluray looks IS more or less how the film was expected to look?
  5. I'm not sure I understand what's going on with 4k blurays. So, these films are somehow scanned at 4k resolution in High Dynamic Range using Rec 2020 colorspace? How is this accomplished? I've mainly worked in VFX, and I know that the films I've done VFX for did not have effects shots that were high dynamic range. A lot of matte paintings, sky replacements, etc. were decidedly low dynamic range. I can understand that if all the negatives were re-scanned you could capture something with higher dynamic range than was baked into the digital version found on a standard bluray, but how can SDR footage be converted to HDR? I'm asking out of curiosity because I've been trying to get 4k stills out of movies as reference images, but they all end up looking very different from the regular HD/bluray. I don't know much about the conversion process, but it SEEMS like you should be able to take 10bit HDR footage and make it look like the 8bit SDR HD footage. Any thoughts? (A rough sketch of this kind of HDR-to-SDR conversion appears after this list.)
  6. I think I've noticed that about Tony Scott's movies. I think Ridley uses long lenses a lot too. I'll have to take a look at "Revenge". Thanks!!
  7. Do you know what the idea is behind this? Was it to increase the background movement in a long-lens camera move? Or was it to, say, compress the distance of the buffalo stampede to make them appear closer to the actor and more dangerous? What effect is he achieving by using this technique?
  8. I'll check these out. Thanks David! Yes! That's the one! Will take a look. Thanks!
  9. I'm looking for some examples of scenes shot on extremely long lenses (200+ mm). I know there was a shot in "Tinker, Tailor..." that was shot on a 2000mm lens, I think "The Alamo" had some 400+ mm shots (I'm trying to find the exact scene). Can anyone recommend any other scenes in a movie that were shot with extremely long lenses? Thanks!!
  10. Sorry, this is going to be a little vague, but I suspect someone on this forum will be able to point me in the right direction. I read an article (or watched a video) where Dean Semler talked about shooting some shots in "The Alamo" with really long lenses. If I recall, the lenses were 400+ mm, maybe even as long as 800mm? Can anyone point me towards a reference that talks about this? And which scene(s) were shot with these lenses? Thanks!!
  11. Yes. I suppose that makes sense. When you put it that way, it seems so logical. But then I think about it more...and it starts to confuse me. Maybe my definition of what a "stop" is is off. I had always thought of it as doubling or halving the light: if you increase your aperture by one stop, you double your light; if you drop in a .3 ND (1 stop), you cut the light in half. So it seems like, if you had two cameras with different dynamic range, as long as you start with the same exposure on both cameras, doubling the light (increasing by one stop) on each camera would be the same increment regardless of sensor. Double is double. Different dynamic ranges allow one camera to shoot MORE stops than another, but the stops themselves would be the same. How am I thinking about this wrong?
  12. In my mind I guess I was thinking exposure would be an objective measurement and a one-stop increase would be the same irrespective of which camera is being used. So, it seems like what you're implying is that maybe the best way to measure where the exposure lies would be to shoot a test and view it on a waveform monitor: expose a gray card to 50% (just for ease of measurement), then expose up one stop at a time until it gets to white and expose down a stop at a time until it gets to black. Thanks David!
  13. There has to be a correlation, right? Let's say I shoot an 18% gray card and it is sitting at 50 IRE on a waveform monitor. What would an increase of 1 stop read? A decrease of 1 stop? I'm trying to create some customized false color LUTs just for my own purposes to analyze images in Resolve. But I'd like to be able to say ok, gray = middle gray, one stop over that should be yellow, two stops over that should be orange, three stops over that should be red, etc., but I'm struggling with knowing how to measure the exposure gains. (See the false-color sketch after this list.) Thanks!
  14. I understand. I think. So if you were grading a 10 bit image, anything over 940 or under 64 will be blown out or crushed after it is saved as a rec709 image. Would be nice if they gave you the option to change the scale to IREs. Most other monitors I've seen use that scale. Perhaps it's not useful for grading 10 bit images though. But it does go up to 120 and down to -40. Seems like that would give you some more latitude. Thanks!
  15. OK. That makes sense. Do other applications or monitors measure it that way? I've never seen it measured on that scale before. Thanks!
  16. I thought waveform monitors measured IRE, and I thought IRE was measured from -40 to 120. What's the deal? What does that number represent? What scale is it using to measure? (See the code-value sketch after this list.) Thanks!
  17. Very true. I have a 5D mk II, so those are the files I'm used to working with. So, when you say 1600 is just 800 with 1 EV gain, does that mean it's no different than adding the gain in post? (See the gain-in-post sketch after this list.) I've read something like that but didn't know if it was true. Because if that WERE the case, there would never be any reason to change your ISO from 800. Good to know about the 5D mk III raw. I'll bet the mark II and mark III CR2 files have more dynamic range than the Magic Lantern raw.
  18. Right. I knew it was advertised at 13 stops. It's interesting because if I'm reading this right, this chart would seem to indicate it would be best practice to shoot at 1600 ASA in brighter light because you'd get more latitude in your brighter areas. Still doesn't answer how this compares to Canon RAW CR2 files though. Hmmm. Thanks!
  19. Which is why I specifically asked about raw stills in the title and body of my post. Actually it would be a folder full of DNG files. They are similar because they are both raw image file formats that can both be opened and manipulated with adobe camera raw. I was asking about the comparison in dynamic range which should be an easy correlation to make.
  20. I've shot a lot of stills and have some experience with Canon raw files. How do these compare with Blackmagic raw video? The stills are higher resolution, obviously, but is the dynamic range comparable? Does Blackmagic raw have better dynamic range? Please advise. Thanks!
  21. Ok, so I'm a relative noob to cinematography in the sense that I've basically only ever shot with the 5D mk II and other cameras that shoot in h.264 formats. I'm interested to see how far I can push better footage: 4:4:4 and 4:2:2. So I went on Kinefinity's site and downloaded some of their test footage. Correct me if any of the following information seems wrong; I'm just trying to figure things out. I brought the raw CineForm .mov footage into After Effects. In case it matters, I'm specifically using this clip: https://www.dropbox.com/sh/r8a92kcey52p7ql/AABEGqhWEQjugWrYbHFN4dJxa?dl=0&preview=BLN-0001-028-A1-5426.zip I set the project settings to 16 bit with working space set to "none". I realize the footage isn't 16 bit, but that should give me the color leeway I need to make sure I don't lose any information, which would happen if I set the project settings to 8 bit (right?). OK. So I have my raw footage. Raw footage is by its nature 4:4:4, right? The chroma subsampling is always a result of the compression, correct? You'd never get a subsampled image from the sensor, would you? I bring it into After Effects and drop it into a timeline. I then export it three different ways: a ProRes 4444 (with export depth set to trillions of colors), a ProRes 422 (with export depth set to trillions of colors), and an Apple Intermediate Codec file, which I've read is a 4:2:0 codec (with export depth set to millions of colors). Then I import these three clips into my timeline. So now I have 4 tracks, each one the same shot: 1. my original file, which is an uncompressed CineForm .mov; 2. an Apple ProRes 4444 (hopefully this won't look much if any different from the source file); 3. an Apple ProRes 422; 4. an Apple Intermediate Codec file (4:2:0 I believe). When I start pushing these around, the only difference I notice is that on the 4:2:0 shots the red colors seem a little more saturated. Some colors are slightly shifted as well, more green it seems. When I adjust the exposure, though, it doesn't seem to give me any more latitude in the car headlights in the raw footage than it does in the 4:2:0 footage. When I crank up the saturation, the 4:2:0 footage does seem to be a little blockier than the raw footage. But overall the difference between 4:4:4 and 4:2:0 seems to be pretty negligible, and I know that can't be right. The difference between editing the QuickTime movies I've shot on my 5D and editing a raw photo is night and day, and I have to imagine with a 14-16 stop dynamic range camera like the Kinefinity 6K the difference would be even greater. The only reason I used After Effects is because I know how to use it pretty well and it was the only software I have that would read all the different formats (Resolve wouldn't read the Apple Intermediate Codec, and I couldn't export an h.264 at the 6K size, and I didn't want the variables introduced if I resized anything). So what's going on? What am I doing wrong? Is there a better way to test this out? (See the chroma-subsampling sketch after this list.) Please advise. Thanks!! Evan
  22. And I suppose the more flatly and evenly your cards are lit, the more accurate your test will be.
  23. Awesome! Thanks for your expertise, David! Will give it a shot. Cheers
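
A minimal sketch of the HDR-to-SDR conversion discussed in posts 1-5, assuming the UHD source is HDR10 (PQ-encoded, Rec 2020) and using a 100-nit SDR reference white, a 1000-nit assumed mastering peak, and an extended-Reinhard highlight rolloff as stand-ins for a real tone-mapping operator (something like BT.2390). The function names pq_to_nits and hdr10_to_sdr are invented for illustration, not any tool's API; a proper pipeline would handle out-of-gamut colors and the BT.1886 encode more carefully.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

# Linear-light Rec 2020 -> Rec 709 primaries (BT.2087 matrix)
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def pq_to_nits(e):
    """Decode PQ-encoded values in 0..1 to absolute luminance in nits."""
    p = np.clip(e, 0.0, 1.0) ** (1.0 / M2)
    return 10000.0 * (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def hdr10_to_sdr(rgb_pq, sdr_white_nits=100.0, peak_nits=1000.0):
    """Crude HDR10 -> SDR Rec 709 conversion for an HxWx3 array in 0..1:
    decode PQ, convert gamut, roll off highlights, gamma-encode."""
    lin = pq_to_nits(rgb_pq) / sdr_white_nits        # 1.0 = nominal SDR white
    lin = np.clip(lin @ BT2020_TO_BT709.T, 0.0, None)
    w = peak_nits / sdr_white_nits                   # level that should land at 1.0
    mapped = lin * (1.0 + lin / (w * w)) / (1.0 + lin)   # extended Reinhard rolloff, applied
    mapped = np.clip(mapped, 0.0, 1.0)                   # per channel for simplicity (assumption)
    return mapped ** (1.0 / 2.4)                     # rough BT.1886-style display encoding
```

Even with a careful version of this chain, the result generally will not match the HD Blu-ray frame for frame, because the SDR disc is normally a separate trim pass graded on its own monitor rather than a mechanical conversion of the HDR master, which is really the answer to the "which one is the DP's intent" question.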
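
On the stops questions in posts 11-13: "one stop = double the light" holds in scene-linear light, but a waveform shows encoded code values, so the size of a one-stop step in IRE/percent depends on where you are on the transfer curve. Below is a small sketch of a stop-based false color, assuming a plain 2.4 power gamma as a stand-in for the actual encoding (with log or Rec 709 footage you would substitute that curve's inverse); the thresholds, colors, and the helper names stops_from_mid_gray and false_color are all arbitrary choices for illustration.

```python
import numpy as np

MID_GRAY = 0.18  # 18% gray card in scene-linear light

def stops_from_mid_gray(display_rgb, gamma=2.4):
    """Estimate stops over/under middle gray for a display-referred image.
    Assumes a plain power-law encoding as a stand-in; for log or Rec 709
    encoded footage, substitute that curve's inverse here."""
    luma = display_rgb.mean(axis=-1)              # crude luminance proxy
    linear = np.clip(luma, 1e-6, None) ** gamma   # undo the display gamma
    return np.log2(linear / MID_GRAY)             # 0 = mid gray, +1 = one stop over

def false_color(display_rgb):
    """Paint each pixel by its distance in stops from middle gray
    (bands and colors chosen arbitrarily for illustration)."""
    s = stops_from_mid_gray(display_rgb)
    out = np.zeros(display_rgb.shape[:2] + (3,), dtype=np.uint8)
    out[s < -2]               = (60, 60, 255)   # deep shadows: blue
    out[(s >= -2) & (s < -1)] = (0, 200, 200)   # cyan
    out[(s >= -1) & (s <= 1)] = (0, 200, 0)     # around mid gray: green
    out[(s > 1) & (s <= 2)]   = (255, 255, 0)   # yellow
    out[(s > 2) & (s <= 3)]   = (255, 128, 0)   # orange
    out[s > 3]                = (255, 0, 0)     # approaching clip: red
    return out
```

With this stand-in curve, an 18% gray card encodes to about 49% (close to the 50 IRE in the post), but one stop over lands near 65% and one stop under near 37%, so each stop covers a different number of points on the scope; that is why there is no fixed IRE-per-stop answer, and why shooting the gray-card test described in post 12 is the practical way to calibrate the bands.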
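
On the waveform scale in posts 14-16: the -40 to 120 IRE range comes from analog NTSC, where sync sits at -40 IRE, while a digital scope counts code values, and in 10-bit narrow (legal) range video reference black is code 64 and reference white is code 940, with footroom below and headroom above. A tiny sketch of the mapping; the function name code10_to_percent is invented for illustration.

```python
def code10_to_percent(code, narrow=True):
    """Map a 10-bit code value to a 0-100 'IRE-like' percentage.
    Narrow (legal) range: black = 64, white = 940; full range: 0..1023."""
    black, white = (64, 940) if narrow else (0, 1023)
    return 100.0 * (code - black) / (white - black)

print(code10_to_percent(64))    # 0.0    -> reference black
print(code10_to_percent(940))   # 100.0  -> reference white
print(code10_to_percent(1023))  # ~109.5 -> headroom above nominal white
```

So keeping the signal between 64 and 940 is about staying inside the legal range of the deliverable, not a limit of the 10-bit container itself.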
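
On the "1600 is just 800 with 1 EV gain" question in posts 17-18: in linear raw data, a one-stop push is just a multiplication by 2, which is trivial to do in post. Whether that is actually equivalent to setting the higher ISO in camera depends on where the camera applies its gain (analog gain before the ADC changes the noise floor, so cameras like the 5D are not fully ISO-invariant), so the sketch below, with the made-up helper push_raw, only shows the arithmetic, not an equivalence claim.

```python
import numpy as np

def push_raw(linear_raw, ev, clip_point=1.0):
    """Push linear (demosaiced, black-subtracted) raw data by `ev` stops.
    One stop = x2 in linear light; values driven past clip_point are lost,
    which is why a push in post cannot recover highlights the sensor clipped."""
    return np.clip(linear_raw * (2.0 ** ev), 0.0, clip_point)
```

This is also consistent with the latitude chart mentioned in post 18: raising the ISO setting mostly shifts where the camera's fixed total range sits relative to middle gray rather than adding stops.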
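
On the 4:4:4 vs 4:2:0 test in post 21: chroma subsampling throws away color resolution, not tonal latitude, so pushing exposure on clipped headlights looks the same in every version, and the damage shows up mainly along saturated color edges (and when heavy saturation pushes or keying make the 2x2 chroma blocks visible). The sketch below simulates 4:2:0 on an RGB frame with NumPy; simulate_420 and its box-filter down/up sampling are invented simplifications of what a real encoder does, ignoring the codec's actual filtering and compression.

```python
import numpy as np

KR, KB = 0.2126, 0.0722  # BT.709 luma coefficients

def simulate_420(rgb):
    """Simulate 4:2:0 chroma subsampling on an RGB frame (H, W, 3) in 0..1.
    Luma stays at full resolution; Cb/Cr are averaged over 2x2 blocks and
    replicated back up (a rough stand-in for the encoder's behavior)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = KR * r + (1 - KR - KB) * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))

    def down_up(c):
        h, w = (c.shape[0] // 2) * 2, (c.shape[1] // 2) * 2   # crop to even size
        blocks = c[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        return np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)

    cb, cr = down_up(cb), down_up(cr)
    y = y[:cb.shape[0], :cb.shape[1]]
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / (1 - KR - KB)
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

# Hypothetical usage: with `frame` as an HxWx3 float array in 0..1,
#   degraded = simulate_420(frame)
#   error = np.abs(degraded - frame[:degraded.shape[0], :degraded.shape[1]])
# the error concentrates on saturated color edges and is near zero in flat
# areas, which is why the side-by-side comparison in the post looks so similar.
```

The grading-latitude difference the test was looking for comes mostly from bit depth and the amount of compression, not from subsampling, and anything already clipped or baked into the CineForm source will be missing from every export regardless of codec.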