Erik Krambeck

Everything posted by Erik Krambeck

  1. Hi, my question: which settings and workflow should I choose for HDR grading in DaVinci Resolve if my HDR TV can display HLG, HDR10 and Dolby Vision? I have some experience with SDR grading in DaVinci Resolve 16 Studio. I have now bought a new 4K HDR Sony AG9 TV in 55 inch and I am interested in finding out how to grade my own footage (S-Log3) in HDR and play it back on the Sony HDR TV. Today I tried to grade three S-Log3 shots to HDR in DaVinci Resolve and was confused by the different options (HDR10, HLG, Rec.2020 etc.). I was, however, able to output an H.265 HDR file that I could play back on the Sony TV from a USB stick, and it rendered the picture more brilliantly than SDR. I could also connect the Sony AG9 to the DaVinci Resolve PC (i9 9900K & GeForce 2080 graphics card) via HDMI as a second monitor, although I am not 100% sure whether a 10-bit signal is really being transmitted. I am aware that the TV is not a class 1 monitor, and I do not want to do professional HDR grading; I just want to playfully gather HDR experience with material I have shot myself and play it back on the TV. An "island solution", so to speak. Do you have any tips on which HDR settings and workflow I should choose in DaVinci Resolve (see the project-settings sketch after this list)? Thanks, Erik
  2. Hi, as a cameraman I have already graded a TV show in HDR for ZDF (the second German TV station). Today I ordered an HDR TV (Sony KD-55AG9) so that I can also experience HDR at home. The Sony AG9 does not yet have HDMI 2.1, only 2.0. My question: how can I play HDR masters created at the grading house on the HDR TV at home? In which file format should the post-production house deliver them? I think H.265? In which container? Can it be played from a fast USB stick? The TV specifications say that HEVC can be played from USB, which I believe is H.265 (see the metadata-check sketch after this list). As an alternative playback station I have a PC (i9 9900K with a GeForce 2080 graphics card); the connection on the graphics card is HDMI 2.0b. I read that Nvidia has lifted the 10-bit limitation (see the link below). Which player on the PC (Win10) could play an HDR file from post-production, and how can I be sure it is transferred via HDMI in 10 bit? Thanks for your help! Greetings, Erik https://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/News/Treiber-hebt-10-Bit-Limitierung-unter-OpenGL-auf-1295825/
  3. Hi Satsuki Murashige! Thanks for your reply! The noise in 12800 ASA S-Log was visible even with a Rec.709 LUT applied, but it depends on the resolution and size of the monitor; my example with a 43-inch UHD display is certainly extreme. My impression is that S-Log3 at 12800 ASA (with a LUT) absolutely needs noise reduction in post-production, otherwise the later compression for streaming or YouTube looks terrible, because the compression algorithms can't cope with the noise. Hence the desire to reduce the high base ISO to 5000 or 6400, in order to be able to do without noise reduction in post-production. On my older Mac Pro, the grade plays out in 4K at 12 fps, and with noise reduction at 0.5 fps. For me, noise reduction is a "pain in the ...". Greetings, Erik
  4. Hi, I have owned the FS-5 & Atomos Inferno for four years for gimbal work, but mostly I work with ARRI cameras on bigger projects. Last weekend I tested an FX6 (from a rental company) and I think it is a huge step forward from the FS-5, a very interesting camera! But I was wondering why the high base ISO in S-Log3 is 12800 ASA. I watched the S-Log3 material (12800 ASA) in 4K via HDMI on a 43-inch LG UHD monitor, and you could see the grain "dancing". In Custom Mode in Cinetone, with the same 12800 ASA base but set to 5000 ASA, it looks much better, with hardly any visible grain. I think in most cases 5000-6400 ASA (in S-Log3) would be quite sufficient, and easier to work with because you don't have to use noise reduction in post; 5000 ASA is only about 1.4 stops below the 12800 base (see the quick calculation after this list). I think the 12800 ASA high base in S-Log3 applies too much electronic gain, and that causes the grain. I wish Sony would "downgrade" the high base ISO to 5000 or 6400 ASA in a future firmware update, similar to the FS-5, which shipped with an absurdly grainy S-Log base ISO of 3200 ASA that was changed to 2000 ASA in firmware 4.0. Who thinks similarly? Or is there a reason why Sony did this? Marketing, or to take advantage of the full dynamic range?
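On the settings question in post 1, a minimal sketch, assuming DaVinci Resolve Studio with its scripting API available (run from Resolve's own console, or with its scripting module on the Python path). The setting keys and values used here ("colorScienceMode", "colorSpaceTimeline", "Rec.2100 HLG") are assumptions that vary between Resolve versions; the same options live under Project Settings > Color Management in the UI.

```python
# A minimal sketch, assuming DaVinci Resolve Studio with scripting enabled.
# The setting key/value names are assumptions; print the current values
# first and confirm the exact names for your Resolve version.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Inspect what the project currently uses before changing anything.
print(project.GetSetting("colorScienceMode"))
print(project.GetSetting("colorSpaceTimeline"))

# Assumed values: switch to Resolve color management with an HLG timeline,
# then grade against the TV and export an H.265 HDR file as before.
project.SetSetting("colorScienceMode", "davinciYRGBColorManaged")
project.SetSetting("colorSpaceTimeline", "Rec.2100 HLG")
```

For a TV-only "island solution", HLG is the simplest target, since the TV supports it natively and, unlike HDR10, no static mastering metadata needs to be carried in the file.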
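On the delivery-format question in post 2, one way to sanity-check whatever the post house sends is to read the stream metadata before copying the file to the USB stick. A minimal sketch, assuming ffprobe (part of FFmpeg) is installed and on the PATH; "master.mp4" is a placeholder filename:

```python
# Minimal sketch: confirm an HDR master is 10-bit HEVC with Rec.2020 and
# PQ (or HLG) signaling. Assumes ffprobe is on PATH; filename is a placeholder.
import json, subprocess

out = subprocess.run(
    ["ffprobe", "-v", "quiet", "-print_format", "json",
     "-show_streams", "-select_streams", "v:0", "master.mp4"],
    capture_output=True, text=True, check=True,
).stdout

stream = json.loads(out)["streams"][0]
print("codec:     ", stream.get("codec_name"))       # expect "hevc" (H.265)
print("pixel fmt: ", stream.get("pix_fmt"))          # expect 10-bit, e.g. "yuv420p10le"
print("primaries: ", stream.get("color_primaries"))  # expect "bt2020"
print("transfer:  ", stream.get("color_transfer"))   # "smpte2084" = PQ/HDR10, "arib-std-b67" = HLG
print("matrix:    ", stream.get("color_space"))      # expect "bt2020nc"
```

If the transfer characteristic reads "smpte2084" or "arib-std-b67", the TV should recognize the file as HDR when played from USB.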
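And on the base-ISO discussion in post 4, the quick arithmetic behind the proposed ratings:

```python
# How many stops below the 12800 high base are the proposed ratings?
from math import log2

base = 12800
for iso in (5000, 6400):
    print(f"{iso} ASA is {log2(base / iso):.2f} stops below the {base} base")
# 5000 ASA is 1.36 stops below the 12800 base
# 6400 ASA is 1.00 stops below the 12800 base
```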