
Posts posted by Evan Richards

  1. 7 minutes ago, David Mullen ASC said:

    Considering that DPs often color-corrected the home video master, and that since D.I.'s the same correction used for theaters was applied to home video once trimmed for Rec.709, I think the standard blu-rays represented what the DP wanted you to see IF they supervised it.

    Keep in mind that if you do a D.I. for theatrical release in a DCP,  it's in P3 color space / gamma which is not far removed from Rec.709 display gamma for monitors.  It's only for some special IMAX and Dolby Cinema presentations using laser projectors that you can even make an HDR version for cinema.

    It could be argued for older titles, it's the HDR version that looks different than the movie ever looked in the past.  A contact print projected may have had the black levels closer to an HDR release but it never had the image brightness of HDR because projection was standardized to 16 foot-lamberts and was usually a bit lower.  And HDR is capable of displaying all 14 to 15 stops of information on the film negative, which a film print or a P3/Rec.709 version didn't if they wanted decent contrast.

    I wouldn't think of HDR vs. SDR in terms of which is more "accurate" but just as another way of looking at images.

    Thank you David. This clarifies a lot of my questions.

  2. On 4/18/2020 at 3:40 PM, David Mullen ASC said:

    You might find this interesting, it's about putting HDR frames up for a blu ray review website when they know people are not watching on HDR monitors.

    http://www.dvdbeaver.com/film/DVDCompare3/saving_private_ryan_UHD_blu-ray.htm

    Thank you, this is very useful as this is exactly what I am trying to do. To my eye though, the color of the "simulated" images seems far enough away from the original Bluray that it makes me wonder which image more closely represents the DP's vision. I mean, people have been watching blurays on standard HD TVs for years, but does the fact that the UHD blurays look so different mean that standard blurays never truly represented what the DP wanted you to see?

  3. On 4/18/2020 at 3:28 PM, Tyler Purcell said:

    I don't know how one would grab stills from a UHD BluRay source. Also, very few UHD BluRays are HDR; most of them are SDR made from 2k sources.

    Hmmm. That's interesting. I haven't found that on the blurays I've examined, but I guess I haven't looked at very many. The ones I examined were "The Post", "Atomic Blonde", and "Moonlight". If what you say is true, that most are SDR made from 2k sources, it seems like you'd be able to apply a Rec 2020 to Rec 709 LUT to make the 4k bluray footage match the HD footage? But I haven't found that to be the case. Check out these 4 images from Atomic Blonde (slight nudity warning). https://imgur.com/a/1jZD3jm

    The first image is taken from the bluray; the second image is taken from the UHD bluray and had the color tweaked as it was being extracted. They don't look too similar. The UHD images look much more colorful, contrasty, and overall brighter. It seems like the way the UHD looks would be the original intention for how it should look, but both of these images are backed down to JPGs, which means they COULD have gotten this look in the HD bluray but chose not to. So...maybe the way the HD bluray looks IS more or less how the film was expected to look?

  4. I'm not sure I understand what's going on with 4k blurays. So, these films are somehow scanned at 4k resolution in High Dynamic Range using Rec 2020 colorspace? How is this accomplished? I've mainly worked in VFX, and I know that the films I've done VFX for did not have effects shots that were high dynamic range. A lot of matte paintings, sky replacements, etc. were decidedly low dynamic range.

    I can understand that if all the negatives were re-scanned you could capture something with higher dynamic range than was baked into the digital version found on a standard bluray, but how can SDR footage be converted to HDR?

    I'm asking out of curiosity because I've been trying to get 4k stills out of movies as reference images, but they all end up looking very different from the regular HD/bluray. I don't know much about the conversion process, but it SEEMS like you should be able to take 10bit HDR footage and make it look like the 8bit SDR HD footage. Any thoughts?
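    For what it's worth, the mechanical part of going the other direction (HDR down to SDR) is easy to sketch; it's the creative trim pass that isn't. Here's a deliberately crude example of why a naive conversion looks wrong. The PQ constants are the published SMPTE ST 2084 ones and the matrix is the standard Rec.2020-to-Rec.709 conversion, but treating SDR white as a flat 100-nit clip is my simplifying assumption:

    ```python
    import numpy as np

    # PQ (SMPTE ST 2084) EOTF constants
    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    # Rec.2020 -> Rec.709 primaries, applied in linear light
    M_2020_TO_709 = np.array([[ 1.6605, -0.5876, -0.0728],
                              [-0.1246,  1.1329, -0.0083],
                              [-0.0182, -0.1006,  1.1187]])

    def pq_to_nits(e):
        """Decode a 0-1 PQ signal to absolute luminance in nits."""
        p = np.power(e, 1 / M2)
        return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

    def naive_hdr_to_sdr(pq_rgb, sdr_white=100.0):
        """Clip everything above SDR white and re-encode -- the crudest possible
        conversion, which is exactly why real SDR trim passes are graded by hand."""
        linear = np.clip(pq_to_nits(pq_rgb) / sdr_white, 0, 1)
        rec709_linear = np.clip(linear @ M_2020_TO_709.T, 0, 1)
        return np.power(rec709_linear, 1 / 2.4)  # rough display-gamma encode
    ```

    Every highlight above 100 nits lands on the same clipped white there, which is nothing like what a colorist would choose, and I assume that mismatch is a lot of what I'm seeing in my extracted stills.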

  5. On 10/12/2019 at 2:25 PM, Satsuki Murashige said:

    Lots of Tony Scott movies - ‘Top Gun’, ‘Revenge’, ‘Days of Thunder’, ‘Man on Fire’, etc.

    In ‘Revenge’ one of my favorite long lens shots is where Costner’s character is driving down a long dirt road and meets Madeleine Stowe’s character riding her horse. The compression works really well there, have always wanted to recreate that shot.

    I think I've noticed that about Tony Scott's movies. I think Ridley uses long lenses a lot too.

    I'll have to take a look at "Revenge". Thanks!!

  6. 2 hours ago, David Mullen ASC said:

    Semler did that for “Dances with Wolves” too — keep in mind that both were shot in anamorphic where the focal length has a 2X wider view, so a 400mm anamorphic would be like a 200mm spherical.

    Do you know what the idea is behind this? Was it to increase the background movement in a long-lens camera move? Or was it to, say, compress the distance of the buffalo stampede to make it appear closer to the actor and more dangerous?

    What effect is he achieving by using this technique?

  7. 2 hours ago, David Mullen ASC said:

    A lot of exterior scenes in “A Bridge Too Far” used 10:1 anamorphic zooms with doublers plus telephoto lenses. But they are mostly b-roll cutaways.

    There is that silhouette dialogue scene on the beach swing set in “Tequila Sunrise” with long lenses for the coverage but I don’t know the focal length.

    I'll check these out. Thanks David!

    1 hour ago, Dan Hasson said:

    Is this the Tinker scene you're talking about here?

    Yes! That's the one!

    1 hour ago, Dan Hasson said:

    The film Too Late was shot in five 22-minute unedited take segments. The first segment used a 30mm-1200mm Angenieux Super Tele.

    Will take a look. Thanks!
     

  8. Sorry, this is going to be a little vague, but I suspect someone on this forum will be able to point me in the right direction.

    I read an article (or watched a video) where Dean Semler talked about shooting some shots in "The Alamo" with really long lenses. If I recall, the lenses were 400mm or longer, maybe even as long as 800mm. Can anyone point me towards a reference that talks about this? And which scene(s) were shot with these lenses?

    Thanks!!

  9. If every camera had the same increase in signal for every stop of exposure, that would mean they all had the same dynamic range.

    Yes. I suppose that makes sense. When you put it that way, it seems so logical.

     

    But then I think about it more...and it starts to confuse me. Maybe my definition of what a "stop" is is off. I had always thought of it as doubling or halving the light. If you increase your aperture by one stop, you double your light. If you drop in a .3 ND (1 stop), you cut the light in half. So it seems like if you had two cameras with different dynamic range, as long as you start with the same exposure on both cameras, doubling the light (increasing by one stop) on each camera would be the same increment regardless of sensor. Double is double. Different dynamic ranges allow one camera to shoot MORE stops than another, but the stops themselves would be the same. How am I thinking about this wrong?
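    Writing the definition down as code, just to check my own logic (a trivial sketch, nothing camera-specific):

    ```python
    import math

    def stops_between(e1, e2):
        """Stops between two exposures; a stop = a doubling of the light."""
        return math.log2(e2 / e1)

    print(stops_between(100, 200))  # 1.0 -- doubling is always exactly one stop
    print(stops_between(100, 50))   # -1.0 -- halving is always one stop down
    ```

    By that definition a stop is camera-independent, and dynamic range would just be how many of those doublings fit between the noise floor and clipping.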

  10. Depends on the camera and the gamma setting you are measuring -- ARRI Log-C takes the 14 stops of DR of the Alexa and spreads it so that when pointing at an 11-step grey scale, black falls around 15% and white falls around 65%, leaving the range above and below to record extra shadow and highlight information (on cameras with a narrower DR like the Genesis/F35, in log mode, black was around 10% and white around 70%).

     

    And it's a log curve, not a straight line, so a 1-stop change in exposure isn't going to equal the same percentage of IRE values if near the bottom or the top of the scale. But in the middle, it seems to work out that 5%-ish equals a stop on an Alexa? 6%? But switch to measuring Rec.709 and probably each stop is more like a 10% change.

     

    In my mind I guess I was thinking exposure would be an objective measurement and a one-stop increase would be the same irrespective of which camera is being used.

     

    So, it seems like what you're implying is that the best way to measure where the exposure lies would be to shoot a test and view it on a waveform monitor: expose a gray card to 50% (just for ease of measurement), then expose up one stop at a time until it gets to white, and expose down a stop at a time until it gets to black.
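    Actually, the published curve should let me check the numbers without even shooting a test. A quick sketch using what I believe are the LogC3 (EI 800) constants from ARRI's white paper (worth double-checking against the original document):

    ```python
    import math

    # ARRI LogC3 (EI 800) constants, quoted from ARRI's published white paper
    CUT, A, B = 0.010591, 5.555556, 0.052272
    C, D = 0.247190, 0.385537
    E, F = 5.367655, 0.092809

    def logc_encode(x):
        """Map relative scene exposure (0.18 = middle gray) to a 0-1 LogC signal."""
        if x > CUT:
            return C * math.log10(A * x + B) + D
        return E * x + F  # linear toe below the cut point

    for stop in range(-4, 5):
        signal = logc_encode(0.18 * 2.0 ** stop)
        print(f"{stop:+d} stops from gray -> {signal * 100:5.1f}% signal")
    ```

    Running that puts middle gray around 39% and mid-scale stops roughly 7% apart, with the spacing shrinking toward the bottom of the curve, which matches the "log curve, not a straight line" point.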

     

    Thanks David!

  11. There has to be a correlation, right? Let's say I shoot an 18% gray card and it is sitting at 50 IRE on a waveform monitor. What would an increase of 1 stop read? A decrease of 1 stop?


    I'm trying to create some customized false color LUTs just for my own purposes to analyze images in Resolve. I'd like to be able to say: OK, gray = middle gray, one stop over that should be yellow, two stops over that should be orange, three stops over that should be red, etc. But I'm struggling with knowing how to measure the exposure gains.
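    Something like this is the logic I'm after. The stop-to-signal levels below are placeholders in roughly LogC-shaped positions; I'd replace them with values measured off a gray-card waveform test for the actual camera:

    ```python
    # Placeholder stop -> signal levels (0-1), to be replaced with measured values
    STOP_LEVELS = [(-3, 0.20), (-2, 0.27), (-1, 0.32), (0, 0.39),
                   (1, 0.46), (2, 0.54), (3, 0.61)]
    COLORS = {-3: "dark blue", -2: "blue", -1: "cyan", 0: "green",
              1: "yellow", 2: "orange", 3: "red"}

    def false_color(signal):
        """Return the color of the highest stop band this signal level has reached."""
        reached = [s for s, level in STOP_LEVELS if signal >= level]
        return COLORS[reached[-1]] if reached else "black"

    print(false_color(0.40))  # green -- right around middle gray
    print(false_color(0.62))  # red -- three stops or more over
    ```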


    Thanks!

  12. Generally when you work in Resolve, you're grading in full range data. In a 10 bit file, the whitest white is 1023 and the blackest black is zero. This is reflected on the waveform. If you output to "video levels", or 64-940, the transform from full range to video levels will be done upon rendering. If you need video levels for your display, it is a selection in the preferences.

     

    So one can view the waveform as normal.

     

    If you are grading in LOG and outputting LOG for a filmout, then things could be different as one would be grading viewing through a film emulation LUT, and the scopes would be showing a LOG image. If I remember correctly, film white was around 800 and blacks 125, but I really don't remember. But, today, who grades solely for a film out? I haven't made a grade for filmout since 2011. My last filmout was done in 2013 and we graded for DCP and converted that version to LOG for a filmout with some adjustments made to the conversion to make a nice negative.

     

    So Evan, nothing to see here. Go back to work :)

    I understand. I think. So if you were grading a 10-bit image at video levels, anything over 940 or under 64 will be blown out or crushed once it is saved as a legal Rec.709 image.
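    For my own notes, my understanding is that the full-to-video transform itself is just a linear rescale, something like this (a sketch of the idea, not Resolve's actual code):

    ```python
    def full_to_video_10bit(code):
        """Linearly map a full-range code value (0-1023) into video/legal range (64-940)."""
        return round(64 + code * (940 - 64) / 1023)

    def video_to_full_10bit(code):
        """Inverse map; super blacks/whites outside 64-940 clip at the full-range ends."""
        return round(min(1023, max(0, (code - 64) * 1023 / (940 - 64))))

    print(full_to_video_10bit(0), full_to_video_10bit(1023))  # 64 940
    ```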

     

     

    Would be nice if they gave you the option to change the scale to IREs. Most other monitors I've seen use that scale. Perhaps it's not useful for grading 10-bit images, though. But the IRE scale does go up to 120 and down to -40. Seems like that would give you some more latitude.

    Thanks!

  13. Stolen from the internet.

     

    In order to get a resulting clip with a range between 0 and 1023, a 10-bit video format needs to be selected as the transcode's format.

     

    In YCC 709 (ITU 709 is the norm for HD televisions, except some very, very old ones that followed SMPTE 240), black is mapped to 16/64 and white to 235/940 in 8/10 bits. Values below 64 (10-bit) are called super black and above 940 super white. The "super" colors are mainly used for color correction as they won't be noticeable on the television.

     

    So, it makes me think about the DV waveform scale: this would mean legal video should be kept inside the 64-940 range?

    Also, I read that the very top and bottom code values of the 0-1023 10-bit scale are reserved for data only, with no image data recorded... Would this be true for the DV WFM scale?

    OK. That makes sense. Do other applications or monitors measure it that way? I've never seen it measured on that scale before.

    Thanks!

  14. Now, not to be rude or anything, but your question doesn't really make sense, seeing as how the dynamic range of "Canon RAW CR2 files" differs from camera model to camera model.

    Very true. I have a 5D mk II so those are the files I'm used to working with.

     

    So, when you say 1600 is just 800 with 1EV gain, does that mean it's no different from adding the gain in post? I've read something like that but didn't know if it was true. Because if that WERE the case, there would never be any reason to change your ISO from 800.
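    Conceptually, the "gain in post" version would just be this (assuming the gain really is purely digital and the analog chain is untouched, which I'm not sure of):

    ```python
    import numpy as np

    def push_ev(linear_raw, ev):
        """Apply `ev` stops of digital gain to linear raw data (1 EV doubles values).
        Anything pushed past the top of the range clips, just like rating higher in camera."""
        return np.clip(linear_raw * 2.0 ** ev, 0.0, 1.0)
    ```

    Though if the camera applies real analog gain before the A/D converter at higher ISOs, shooting there could still win on shadow noise, so maybe it isn't pointless after all.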

     

    Good to know about the 5D mk III raw. I'll bet the mk II and mk III CR2 files have more dynamic range than the Magic Lantern raw.

  15. Canon doesn't do camera raw in "video" mode, only in still mode.

    Which is why I specifically asked about raw stills in the title and body of my post.

     

    Blackmagic has Cinema DNG Raw which is a folder full of Tiff files.

    Actually it would be a folder full of DNG files.

     

    They're not very similar actually... Cinema DNG isn't the same style of RAW at all.

    They are similar because they are both raw image file formats that can be opened and manipulated with Adobe Camera Raw. I was asking about the comparison in dynamic range, which should be an easy correlation to make.

  16. Ok, so I'm a relative noob to cinematography in the sense that I've basically only ever shot with the 5D mk II and other cameras that shoot in h.264 formats. I'm interested to see how far I can push better footage: 4:4:4 and 4:2:2 footage. So I went on Kinefinity's site and downloaded some of their test footage.

     

    Correct me if any of the following information seems wrong. I'm just trying to figure things out.

     

    I brought the raw CineForm .mov footage into After Effects.

    In case it matters I'm specifically using this clip: https://www.dropbox.com/sh/r8a92kcey52p7ql/AABEGqhWEQjugWrYbHFN4dJxa?dl=0&preview=BLN-0001-028-A1-5426.zip

     

    I set the project settings to 16 bit with the working space set to "none". I realize the footage isn't 16 bit, but that should give me the color leeway I need to make sure I don't lose any information, which would happen if I set the project settings to 8 bit (right?).

    OK. So I have my raw footage. Raw footage is by its nature 4:4:4, right? The chroma subsampling is always a result of the compression, correct? You'd never get a subsampled image from the sensor, would you?

    I bring it into After Effects and drop it into a timeline. I then export it three different ways: a ProRes 4444 (with export depth set to trillions of colors), a ProRes 422 (with export depth set to trillions of colors), and Apple Intermediate Codec, which I've read is a 4:2:0 codec (with export depth set to millions of colors).

     

    Then I import these three clips into my timeline. So now I have 4 tracks, each one the same shot.

    1. My original file, which is an uncompressed CineForm .mov

    2. An Apple ProRes 4444 (hopefully this won't look much if any different from the source file)

    3. An Apple ProRes 422

    4. An Apple Intermediate Codec file (4:2:0 I believe)

     

    When I start pushing these around, the only difference I notice is that on the 4:2:0 shots the red colors seem a little more saturated. Some colors are slightly shifted as well; more green, it seems.

    When I adjust the exposure, though, it doesn't seem to give me any more latitude in the car headlights in the raw footage than it does in the 4:2:0 footage.

    When I crank up the saturation, the 4:2:0 footage does seem to be a little blockier than the raw footage.

    But overall the difference between 4:4:4 and 4:2:0 seems to be pretty negligible, and I know that can't be right. The difference between editing QuickTime movies I've shot on my 5D and editing a raw photo is night and day, and I have to imagine that with a 14-16 stop dynamic range camera like the Kinefinity 6K the difference would be even greater.
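    To isolate the subsampling from everything else the codecs do, maybe a direct simulation is a fairer test. A rough numpy sketch of the 4:2:0 idea (real codecs filter the chroma more carefully, and then quantize on top of it):

    ```python
    import numpy as np

    def simulate_420(ycbcr):
        """Crude 4:2:0: average Cb/Cr over 2x2 blocks, then repeat them back up to
        full resolution. Y (channel 0) keeps its full resolution, as in the real scheme."""
        out = ycbcr.astype(float)
        h, w = out.shape[:2]
        h2, w2 = h - h % 2, w - w % 2  # crop to even dimensions for 2x2 blocks
        for ch in (1, 2):  # Cb, Cr
            blocks = out[:h2, :w2, ch].reshape(h2 // 2, 2, w2 // 2, 2)
            sub = blocks.mean(axis=(1, 3))
            out[:h2, :w2, ch] = sub.repeat(2, axis=0).repeat(2, axis=1)
        return out
    ```

    Comparing a frame against `simulate_420(frame)` right at the saturated edges (like those headlights) should show the chroma blockiness directly, without the codec variables muddying things.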

     

    The only reason I used After Effects is because I know how to use it pretty well, and it was the only software I have that would read all the different formats (Resolve wouldn't read the Apple Intermediate Codec, I couldn't export an h.264 at the 6k size, and I didn't want the variables introduced if I resized anything).

     

    So what's going on? What am I doing wrong? Is there a better way to test this out?

     

    Please advise.

    Thanks!!

    Evan

  17. Put a white card and a black card on each side of an 18% gray card -- under and overexpose in whatever increments you want until you can't see a difference between the white and gray card at the overexposed end and between the gray and black card at the underexposed end. Make sure you shoot at whatever is the widest dynamic range recording format the camera offers (raw, log, cine gamma, hypergamma, etc.) but also test it in the narrower Rec.709 display gamma range just for comparison.

    And I suppose the more flatly and evenly your cards are lit, the more accurate your test will be.

  18. Put a white card and a black card on each side of an 18% gray card -- under and overexpose in whatever increments you want until you can't see a difference between the white and gray card at the overexposed end and between the gray and black card at the underexposed end. Make sure you shoot at whatever is the widest dynamic range recording format the camera offers (raw, log, cine gamma, hypergamma, etc.) but also test it in the narrower Rec.709 display gamma range just for comparison.

    Awesome! Thanks for your expertise, David!

    Will give it a shot.

     

    Cheers
