Everything posted by Jon Pais

  1. Hi Boris, There are a number of videos on YT shot with the Lishuai Edge lights. Here are a couple of my own. The first video is HDR, so it will look awful if you watch in SDR. I haven't done any side-by-side comparisons with other lights.
  2. I fail to see how they’re copying the companies you mention. Sounds like you’re not really interested in them anyhow. Do you have evidence that all these companies’ lighting is made in the same factory? I’ve got two of their lights, the C-700RSV and C-1500RSV Edgelights. They’re not as color-accurate as lights from some other manufacturers today. I’ve used the smaller of the two for shooting models outdoors at night – it’s very lightweight, battery life is reasonable, and it produces a flatteringly soft light for portraiture.
  3. Download a comparison between Dehancer film grain and Resolve film grain here. The file is BT.2020 PQ. I don't add grain to YT projects because YouTube's aggressive compression destroys high-frequency detail, but I will occasionally offer clips like this as downloads to my subscribers. It’s well-nigh impossible to judge how the plug-in compares to Resolve on something like the MacBook Pro's Liquid Retina XDR mini-LED, with its small, low-resolution display, which is why I recommend throwing the clip on the timeline of your favorite NLE, setting it up for HDR PQ and viewing on an external UHD monitor or television set.
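If you'd rather script the Resolve setup than click through Project Settings, here's a minimal sketch using DaVinci Resolve's external scripting API. The SetSetting keys and values below are assumptions on my part – they differ between Resolve versions – so verify each one against a dump of project.GetSetting() before relying on it.

```python
# Minimal sketch: configure a Resolve project for HDR PQ review via the
# scripting API. Requires Resolve running with external scripting enabled.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")  # attach to the running Resolve instance
project = resolve.GetProjectManager().GetCurrentProject()

# Assumed key/value strings for a color-managed Rec.2100 PQ pipeline;
# confirm the exact spellings with project.GetSetting() in your version.
settings = {
    "colorScienceMode": "davinciYRGBColorManagedMode",  # enable DaVinci RCM
    "colorSpaceTimeline": "Rec.2100 ST2084",            # grade in PQ
    "colorSpaceOutput": "Rec.2100 ST2084",              # monitor/deliver in PQ
}
for key, value in settings.items():
    if not project.SetSetting(key, value):
        print(f"SetSetting failed for {key!r} – check the key in your Resolve version")
```

You'll still need Blackmagic I/O hardware to get a clean PQ signal out to the external display – the GUI viewer isn't a reference.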
  4. This is an excerpt. For the full post, click here.

“If you want to use an OLED for HDR mastering, I highly recommend using P3-D65 as your working color space. That’s the best choice in such pipelines. Don’t use BT.2020 with an OLED, only BT.2020 limited to P3-D65. I see this mistake all the time.” – How to choose an HDR Monitor for Color Grading, Tim Yemmax, Colorist

Colorists are virtually unanimous in recommending BT.2020 (P3-D65 limited) for grading HDR content, and for good reason: DCI-P3 happens to be the color gamut most widely used by the film industry; Netflix requires deliverables to be P3-D65 ST.2084; reference monitors in post-production houses support the full P3 color space; and even ubiquitous mobile devices support as much as 99% of the P3 gamut. In fact, it would be no exaggeration to point out that billions of people are carrying in their pockets a display with greater color accuracy and higher brightness than those found on the sets of most multi-million dollar productions.

It goes without saying, however, that few colorists regularly upload anything but tutorials to YouTube; that precious few have ever uploaded a single HDR video in their lives; and that relatively few colorists have any experience at all grading HDR professionally; furthermore, it’s no secret that more than a few are even openly hostile toward high dynamic range video. We only bring this up because it may help to explain colorists’ stunned disbelief upon learning that on Google’s own YouTube help page, using the P3 color space is strongly discouraged. Curiously, neither on his English-language YouTube channel nor on his German-language one does Tim Yemmax recommend P3-D65 in his otherwise excellent tutorials – even though video monitoring settings, color processing modes and a calibrated display are all mutually interdependent and need to match for accurate results.

Numerous thoughts passed through our minds when we decided to embark on this undertaking. Initially, we thought it might be entertaining to conduct an online experiment to see whether anyone apart from a seasoned professional like Tim Yemmax could instantly recognize when a YouTube video was carelessly graded in BT.2020 instead of the industry-standard P3-D65. Naturally, we’d want to upload a few videos to YouTube to discover whether the color shifts were real – but how? And perhaps most importantly of all, would switching to P3-D65 help revitalize our waning YouTube subscriber base?

Acquisition Format

Any serious discussion of color spaces must surely consider the acquisition format. That is, if the color processing mode is going to be P3-D65, would we be better off recording in a smaller color space like S-Gamut3.Cine? Or should we instead be capturing the widest available gamut – S-Gamut3 – thereby preserving as much of the sensor’s native color space as possible and future-proofing our timeless masterpieces of cinematic art for posterity? Dolby weighs in:

“Where content is going to be color graded, it is recommended to use source material that best retains the native color gamut and original dynamic range of the content at the time of its origination in order to create high quality Dolby Vision content. Where content will be directly edited and then aired with just some minor color corrections or even without a color grading process, capturing in a format that matches to the color gamut and dynamic range of the delivery requirements can enable a simpler workflow”. – Dolby Vision

S-Gamut3 it is then.
Or, so you’d think: S-Gamut3.Cine still enjoys the greatest popularity among Sony shooters.

Hardware

When it comes to grading, the very first requirement is inarguably a mastering monitor large enough to assess UHD image quality at a viewing distance of 1.5 times picture height (a rule of thumb: the smaller the monitor, the better things look – only when they’re really bad do problems become apparent) and capable of being calibrated to Rec.2020 PQ P3-D65. “That’s child’s play,” you’re probably thinking, “just adjust the settings on the LG CX!”

That there is a hidden menu on the LG CX with Colorimetry, EOTF, Mastering Peak and Mastering Color, MaxCLL and MaxFALL, P3-D65, Rec.2020 as well as various flavors of SDR goes to show just how wildly popular LG OLEDs have become among both colorists and filmmakers, but information about how exactly to configure these settings is all but non-existent, and we found that changing the colorimetry or the mastering color had no discernible effect on the picture whatsoever. Not only that, but regardless of whether HDR Rec.2020 PQ or Rec.2020 PQ P3-D65 was selected on the color management page of Resolve, our TV obstinately detected BT.2020 as the incoming signal. While the LG CX display itself is said to cover as much as 96% of the P3 gamut, apparently the HDMI protocol doesn’t technically support signaling P3 color primaries – which must be what the warning on the YouTube help page is referring to when it says that P3-D65 is not a supported format for delivery to consumer electronics.

That being the case, in order to feed the display other color spaces, you’ll need to invest an additional $1,295.00 in something like the Teranex Mini SDI to HDMI 8K ($1,600.00 here in Vietnam). So, we’re looking at a total expenditure of around $4,000.00 including an LG OLED TV and UltraStudio 4K Mini with Thunderbolt 3 – all for a non-monetized YouTube channel with fewer than fifteen hundred subscribers, of whom only 7% watch our content on a regular basis!

According to Ted Aspiotis, “The P3D65 selection in the HDMI Override menu is just a cosmetic option; it’s the same as REC.2020. LG hasn’t enabled it yet, and it’s pending to be enabled in the future (or never, as it’s been over two years of waiting already)”. Tyler Pruitt adds, “You can calibrate the TV to HDR 2084 P3 gamut using Calman. A lot of facilities are doing this, and usually they do one picture mode, say HDR Cinema mode, to 2020 HDR, and a second HDR picture mode, like HDR Filmmaker mode, to P3 HDR. This is a very common practice when using LG TVs in post-production for HDR”.

Evidently, there are hordes of hapless filmmakers out there who sincerely believe that they’re grading HDR Rec.2020 PQ P3-D65 – when they’re not!
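To make “BT.2020 limited to P3-D65” concrete, here's a toy sketch using the colour-science Python package. The hard clip below stands in for proper gamut mapping and everything is scene-linear RGB – both simplifications – but it shows the idea: the colorimetry that travels over HDMI is BT.2020, while the colors themselves never exceed P3-D65.

```python
# Toy illustration of "BT.2020 limited to P3-D65" with colour-science.
import numpy as np
import colour

SGAMUT3 = colour.RGB_COLOURSPACES["S-Gamut3"]
P3D65 = colour.RGB_COLOURSPACES["P3-D65"]
BT2020 = colour.RGB_COLOURSPACES["ITU-R BT.2020"]

camera_rgb = np.array([0.9, 0.2, 0.05])  # a saturated red, scene-linear S-Gamut3

# Express the color with P3-D65 primaries; anything the P3 gamut can't
# represent shows up as negative (or > 1) components.
p3 = colour.RGB_to_RGB(camera_rgb, SGAMUT3, P3D65)

# "Limited to P3-D65": a hard clip so nothing outside P3 survives.
# (Real pipelines use gentler gamut compression than np.clip.)
p3_limited = np.clip(p3, 0.0, 1.0)

# Re-encode the limited color with BT.2020 primaries -- the container the
# display actually signals, since HDMI has no P3 colorimetry flag.
bt2020_container = colour.RGB_to_RGB(p3_limited, P3D65, BT2020)
print(bt2020_container)  # in-gamut for BT.2020 by construction
```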
  5. I think you're confusing the sheer number of pixels with perceived sharpness: whatever the resolution of film is as measured in a laboratory, the fact remains that the overwhelming majority of movies are presented in 2K theatrically, and no one ever complained that the picture was blurry. That's how I've watched movies my entire life. I set sharpness at 50 in Dehancer and the picture looks just fine on a 55" OLED display. At home, in order to see the difference between 1080p and 4K, you'd have to be sitting no more than four feet from a 55" UHD display, which few people in real life do (see the back-of-envelope calculation below). I should add that I upload all my videos in HDR, and the difference in clarity and detail HDR makes is orders of magnitude greater than that between 1080p and 4K. While you've got to sit inches away from a screen to tell the difference between, say, 8K and 4K, anyone can instantly see the improvement in picture quality of HDR from across the room. Dynamic range and richness of color are of far more consequence than the number of pixels.
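Here's the arithmetic behind the four-foot figure, assuming the common one-arcminute rule of thumb for 20/20 acuity (individual pixels stop being resolvable once they subtend less than one arcminute):

```python
# Distance at which one pixel of a 16:9 panel subtends one arcminute --
# roughly the closest distance at which the extra resolution still matters.
import math

def acuity_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)     # panel width from diagonal
    pixel_pitch_in = width_in / horizontal_px           # size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12   # inches -> feet

print(round(acuity_distance_ft(55, 3840), 1))  # ~3.6 ft: 4K pixels resolve
print(round(acuity_distance_ft(55, 1920), 1))  # ~7.2 ft: 1080p pixels resolve
```

Sit much past seven feet from a 55" set and even 1080p is already at the acuity limit; the full benefit of 4K only arrives at about three and a half feet.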
  6. The smallest details of film cannot be smaller than the grain size, which is why it’s recommended to always lower the resolution parameter accordingly.
  7. While Dehancer 5.1 failed to deliver on its promise of double the performance over version 5.0, Dehancer 5.2.0 delivers with a new quality selector, letting the user choose between normal (fast) and high (slow) settings, each of which offers a significant improvement in playback speed. On our 16″ 32-core M1 Max MacBook Pro (late 2021) with 4K 24p ProRes 4444, we measured real-time playback in normal quality mode and 18.5 fps in high quality mode, either of which is a noteworthy upgrade over the painfully slow 14 fps of the previous update. Memory optimizations in Dehancer Pro 5.2.0 also dramatically reduced memory pressure when rendering four stacked 4K 24p ProRes 4444 clips in DaVinci Resolve Studio 17.4.
  8. This is absolutely massive! Cullen Kelly has released a FREE Kodak 2383 print film emulation LUT that works in both ACES and DaVinci Wide Gamut, in SDR or HDR. We tried it on some clips and it’s the real deal! Head on over to his website and pick yours up now.
  9. Not seeing any speed improvement here. Prior to the update we got around 14 fps, and after the update, no change whatsoever on the 32-core M1 Max.
  10. One year ago, almost to the day, we published a video entitled Why Are HDR Shows So Dark?, which turned out to be one of our most successful rants on YouTube. We’ve since identified no fewer than half a dozen contributing factors: shows continue to be lit in an SDR environment; fewer than 1% of productions are even monitored in HDR, and not infrequently the very first time anyone sees the picture in HDR is in the colorist’s suite – from there, you can bet it’s already a foregone conclusion that the HDR version of the show won’t depart radically from the SDR one; moreover, colorists have been known to compromise the grade in order to mitigate motion artifacts; and lastly, it’s no secret that more than a few filmmakers and colorists are either ambivalent or even openly hostile toward HDR, preferring the low-con look of traditional film instead.

In and of themselves – apart from the absence of the bright specular highlights that constitute one of the signature characteristics of high dynamic range video – none of these various assaults on HDR would necessarily make the image any darker than SDR, except that SDR’s relative approach to gamma means the picture can always be made brighter in a room with high ambient light levels, whereas PQ-based ST.2084 HDR is an absolute standard where neither the EOTF nor the peak brightness can be increased, with the result that the SDR version could very well end up brighter than the HDR one.

Enter Michael Zink, Vice President of Emerging & Creative Technologies at WarnerMedia, who, during the course of a highly illuminating interview on The Display Show, proposes several more reasons why HDR shows might be too dark:

“So, content metadata is this concept of describing technical parameters of the content itself so that that information can be provided to a display and the display can make better choices, especially when it comes to things like tone mapping. Now, you might recall that in the HDR10 format, for instance, most people focus on the fact that it is using SMPTE 2084, the PQ curve, in terms of the encoding, but it also includes SMPTE 2086, which is mastering display metadata. Now, that metadata describes what mastering monitor I used, but that doesn’t really say anything about the content. So, you can have the most sophisticated mastering monitor and maybe I’m creating a piece of content that is in black and white. Describing the monitor, while helpful, doesn’t really tell me the full story. So, what we ended up doing was to come up with additional metadata that can go along as part of the HDR10 format, and at the time it was really this notion of: let’s at least create some sort of what we call static metadata that describes the two terms MaxCLL and MaxFALL. MaxCLL is maximum content light level, which essentially describes the brightest pixel in the entire film, and MaxFALL is maximum frame average light level – you can equate that to an APL, essentially: what is the brightest overall frame in the entire movie. Now, the reason we invented that, as I mentioned earlier, is that as part of the discussions inside the Blu-ray Disc Association at the time, there was this distinction between how bright you need content to be for speculars versus for the entire frame, so these two parameters were developed – Mike Smith, one of my collaborators here at Warner, came up with them at the time – to really help describe those content parameters.
Now, where they become really useful – let me give you a real-world example – is that let’s say you have a display that is capable of 750 nits, yet you’ve mastered on a Pulsar, which is capable of mastering at 4,000 nits. If you, for instance, only have a piece of content that isn’t very bright – maybe the MaxCLL is only 500 nits – the display actually doesn’t need to do any sort of tone mapping. Yet if you don’t use that metadata and you don’t look at the content metadata itself, and instead you just look at what mastering monitor was used, you would take whatever is in the master, assume it’s mastered to 4,000 nits, and map it all the way down to 750 nits, which means the brightest pixel in that piece of content – 500 nits – will now be displayed much lower than that, and the content will end up looking very dark. And I think we’ve seen a lot of complaints early on when it came to HDR from consumers saying HDR looks too dark; and I think a lot of those instances were caused by those types of bad judgments – probably the wrong term – but certainly by using the wrong type of information. So I think it’s always helpful to use as much information as possible, and I think it would be great for display manufacturers to really pay attention to the different types of metadata that are available, and we wanted to make sure we’re providing information about what content metadata is there. Now, as I said, static metadata is static: it just describes one snapshot over the entire feature film. There’s obviously much richer metadata – dynamic metadata – that describes the content frame by frame; and for a lot of content, that metadata is available as well; and I think manufacturers should choose to use that, simply because it gives them more information. From a display perspective, more information is typically better if you want to maintain the creative intent”.

Michael Zink then goes on to explain how so-called outlier pixels – very bright pixels that are unintended – can skew MaxCLL metadata, in turn distorting tone mapping; and how some television manufacturers are simply ignoring metadata altogether.
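Zink’s numbers are easy to work through in code. Here’s a minimal sketch of the decision he describes – trust MaxCLL, the brightest pixel actually in the content, rather than the mastering monitor’s peak – with a naive linear scale-down standing in for a real tone-mapping curve:

```python
# Tone-mapping decision driven by content metadata (MaxCLL) versus
# mastering-display metadata (SMPTE ST 2086) alone.
def displayed_nits(pixel_nits: float, max_cll: float, mastering_peak: float,
                   display_peak: float, use_content_metadata: bool) -> float:
    reference = max_cll if use_content_metadata else mastering_peak
    if reference <= display_peak:
        return pixel_nits  # everything already fits: no tone mapping needed
    return pixel_nits * display_peak / reference  # naive linear compression

# 4,000-nit Pulsar master, 750-nit display, content that peaks at 500 nits:
print(displayed_nits(500, max_cll=500, mastering_peak=4000,
                     display_peak=750, use_content_metadata=True))   # 500.0
print(displayed_nits(500, max_cll=500, mastering_peak=4000,
                     display_peak=750, use_content_metadata=False))  # 93.75
```

Ignore MaxCLL and the 500-nit highlight lands at under 94 nits – which is exactly the “HDR looks too dark” complaint.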
  11. It’s recommended to blur the image slightly when adding grain.
  12. That's why his website is called celluloid dreaming.
  13. The most esteemed filmmakers and colorists on the planet use film print emulation LUTs. Cullen Kelly has spoken quite eloquently about our rich heritage of film color:

“The final fundamental to grading photographically is to use a good print stock. This concept is largely forgotten today, but for a century or more, print stock played a key role in defining the look of a film, providing a consistent baseline of creative contrast and color imagery and helping visually unify the images”. – Cullen Kelly

“[…] I really do feel, as a hardcore devotee of traditional film print emulation and of borrowing from the incredibly detailed and fine work that’s been done over the course of the last century with color science as regards film negative and film prints and that whole system, that’s the best way for mastering aesthetically pleasing images that we’ve ever come up with as a species, by far. So we have a huge debt that we owe to those things, and we owe a lot of diligence in terms of understanding how that worked and using it as our baseline, instead of saying ‘I’m gonna write my own script and play around with my lift and gamma and gain’ – all the work’s already been done, and until we can do it better, that’s really where we owe our diligence, in my opinion”. – Cullen Kelly
  14. The print options in Dehancer Pro 5.0.0 are beautiful. Here's a short video I just uploaded to YT in HDR using Konica Centuria 100 DNP and Kodak 2383. And here's more information about how I color corrected the picture, along with my settings in Dehancer.
  15. And here's another free download of Sony a7s III ProRes RAW HQ 4.2K 23.976 HDR footage.
  16. The footage we've seen from the DJI 4D looks amazeballs; and that DJI has committed to working with the following DPs on their upcoming works with the 4D is an arrangement unprecedented in our experience: Rodney Charters, ASC, CNSC, NZCS; Takuro Ishizaka, JSC; Rachel Morrison, ASC; XiaoShi Zhao, CNSC; and Academy Award winners for Best Cinematography Erik Messerschmidt, ASC, Claudio Miranda, ASC, and Peter Pau, HKSC. Something else that's unprecedented in the business is for anyone to evaluate a pre-production camera in light of its HDR capabilities. For certain, if there are any weaknesses in a system – whether it's sharpness, noise in the shadows, banding, weak dynamic range or poor handling of highlight roll-off – those problems are unquestionably going to be exacerbated in HDR. So it was with great interest that we listened to DP Erik Messerschmidt – an HDR evangelist – talk during the launch event about how the DJI footage looked in Dolby Vision: "[…] BaseLight, which opens it natively. It seemed to have plenty of dynamic range. It was sharp, we didn't see any aliasing issues, we didn't see any banding or breakup. It's clearly got plenty of bit depth and dynamic range. And we were looking at it in HDR actually, so we looked at it in Dolby PQ in DCI-P3 and it looked fabulous, it was on par with anything else, for sure". – Erik Messerschmidt
  17. Having seen the footage and watched a bunch of reviews, I'm going to have to pass on the Sony a7 IV as a second body to my a7s III. The poor rolling shutter, the 60p crop, worse color, and less underexposure latitude make the latest alpha body one to skip. On the other hand, the a7s III is pretty spectacular once you balance the color; and I've yet to see any footage from the a1 that surpasses it. Download ProRes RAW HQ 4K 23.976 HDR sample footage.
  18. Dehancer OFX 5.0.0 with Kodak Vision 2383 is awesome, Anfisa. I hope you add Sony a7s III S-Log3 S-Gamut3 (not S-Gamut3.cine) to the growing list of supported cameras.
  19. First of all, 'the guy' has a name: Stephen Perera. Secondly, Perera specifically says "the cinematography was great aside from a few needless anamorphic flares that’s pandering to the YouTube generation that equates flares with great camera work haha". Linus Sandgren is not pandering to anybody. I suggest you cool your jets.
  20. DP Linus Sandgren chose the Panavision G Series anamorphic primes for the Panaflex Millennium XL2 because he was inspired by Allen Daviau's work on E.T. the Extra-Terrestrial. Says Sandgren, "I wanted a lens that would be a workhorse but also have beautiful flares that weren't overly dramatic..." Nothing whatsoever to do with pandering to anybody, let alone the YouTube audience.
  21. I'm not confusing anything, Tyler. Getting HDR to display correctly, whether on my TV or my phone, has never been an issue. Over 30% of North American homes have 4K HDR television sets, and over 200 models of smartphone have OLED HDR panels. More than a quarter of the devices connected to streaming giant Netflix are configured for HDR. The reason most HDR shows are flat is that they continue to be lit in an SDR environment, they're monitored in SDR, and the very first time someone sees them on an HDR display is in the grading suite. And it doesn't end there. While colorists do have access to the tools to see HDR, that alone is no guarantee that the master file will end up preserving all the dynamic range, tonality and color envisioned by the director and the DP. For example, the decision in post-production to constrain the levels in the HDR pass to maintain consistency with the SDR version can prevent HDR from taking wing. Not infrequently, a project gets the green light for an HDR master after the fact; both the post-production house and the producer preemptively rule out a version that dramatically departs from the SDR one, the result being that HDR turns out to be little more than a marketing gimmick. Another seldom-discussed issue is that colorists are resorting to compromising their HDR grades in order to avoid judder artifacts. Getting back to the topic, several reviewers have reported their extreme joy at seeing the jet blacks and bright highlights of No Time to Die in a Dolby Cinema, with projectors using dual 4K Christie lasers putting out 31 foot-lamberts, not to mention the incredible audio, which the director himself was thrilled with.
  22. I don't like SDR. I see everything in HDR. SDR is flat and lifeless. I've read that the HDR version of the Bond film was sensational.
  23. So nothing whatsoever to do with YouTube. I watch YouTube, even occasionally upload to YouTube and the films I watch don't abuse flare. You might as well blame the tooth fairy or Santa Claus.