Everything posted by AJ Young

  1. Thank you so much! I was incredibly lucky to get both great actors and great wardrobe, plus an amazing set. Our shoot schedule lost a week because of a COVID outbreak, but the actors were so well prepared that we finished the film with everything we wanted.
  2. I shot a feature adaptation of Antigone and it's currently playing through a limited virtual performance. You can watch it here: https://bit.ly/antigone-virtual Here's the trailer:

     My old college hired me to shoot this feature for them. Because of the pandemic, their theatre department couldn't produce any live performances. The department decided to film a play, but instead of shooting a live (or, in COVID's case, "live") performance the way Hamilton did, they wanted to shoot an actual single-camera movie. They reached out to their alumni network to hire former students, and I was one of the alumni available. Here's a story from NPR about the production: https://kjzz.org/content/1671861/mesa-community-college-premiers-first-feature-film-antigone

     We shot the film on my DIY anamorphic lenses, a combination of Schneider Cinelux and BH16 anamorphosers with Nikkor and Russian sphericals. You can find stills from the film here: http://ajyoungdp.com/articles/narrative/antigone/

     A final interesting note: we utilized Blender for pre-vis, and all of the prep work was done remotely. The art director sent me the blueprints for the set, and I built it in Blender to work out our shotlist, blocking, and lighting, as in the sketch below.
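     For anyone curious what a scripted previs camera looks like, here's a minimal bpy sketch of the idea. The focal length, sensor width, and resolution are illustrative placeholders, not the production's actual values:

     ```python
     # Minimal Blender previs camera sketch (illustrative values only).
     import bpy

     cam_data = bpy.data.cameras.new("previs_cam")
     cam_data.lens = 50                  # spherical focal length, mm (placeholder)
     cam_data.sensor_width = 23.2        # hypothetical sensor width, mm

     cam = bpy.data.objects.new("previs_cam", cam_data)
     bpy.context.scene.collection.objects.link(cam)
     bpy.context.scene.camera = cam

     # A 2x anamorphic squeeze doubles the horizontal field of view;
     # Blender models this with a pixel aspect ratio on the render output.
     scene = bpy.context.scene
     scene.render.resolution_x = 1920
     scene.render.resolution_y = 1608    # 1920*2 / 1608 ~ 2.39:1 after desqueeze
     scene.render.pixel_aspect_x = 2.0
     scene.render.pixel_aspect_y = 1.0
     ```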
  3. I ran into a problem concerning negative pixel values and ACES 1.1 (ACEScct) on a project in DaVinci Resolve 17. With the help of Tim Kang, we found the solution to my problem: an incorrect pipeline set-up. Additionally, Elvis Ripley, Paul Curtis, and Nick Shaw gave incredibly valuable insight about this issue on the Cinematography Mailing List. For posterity's sake, here are the problem, the solution, and that insight.

     Problem: negative pixel values on an image in DaVinci Resolve 17 using ACES 1.1 (ACEScct). To oversimplify, negative pixel values are pixels that go below 0 but appear as bright, distinct noise. They can happen because pixel values (RGB) are floating point in color grading programs, and somewhere along the line those negative values are displayed as positive, showing up as noise. Here's a screenshot of what negative pixels can look like:

     Solution: in this particular case, I had an incorrect pipeline set-up. The movie files (clips) are 10-bit Apple ProRes 422 from the Panasonic GH4 in V-Log/V-Gamut. My first mistake was letting DaVinci automatically choose the data levels of the file. Apple ProRes can be either legal (video) or full-range data depending on the camera: Sony, Canon, and Panasonic cameras record full range, while ARRI LogC is legal (video). DaVinci defaults to choosing legal data levels, so my Panasonic file was being interpreted incorrectly. (A toy illustration of this mistake follows this post.) My second mistake was using an incorrect ODT (Output Device Transform). Because I'm coloring this project on a traditional computer monitor (circa 2019), the ODT should be sRGB; I had mine set to Rec709. Those two mistakes combined created my particular negative pixel problem. Once I corrected them (set the data range to full and the ODT to sRGB), my negative pixel values disappeared.

     However, this may not always fix the problem, because a negative pixel value can occur even with a correct pipeline. One solution is to use the gamut compress created by the ACES community, available here: https://github.com/jedypod/gamut-compress

     For anyone who stumbles on this post in the future, I hope it helps! To those who helped me fix this problem, thank you!
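     To make the data-levels mistake concrete, here's a toy Python sketch. The 10-bit code points (full range 0-1023, legal/video range 64-940) are the standard values, but the function names are mine, not Resolve's actual internals:

     ```python
     # How a data-levels mismatch manufactures negative pixel values.
     # 10-bit codes: full range spans 0-1023, legal (video) range 64-940.

     def normalize_full(code):
         # Correct interpretation for a full-range file (e.g. Panasonic V-Log)
         return code / 1023.0

     def normalize_legal(code):
         # What software does if it assumes legal-range data
         return (code - 64) / (940 - 64)

     for code in (0, 32, 64, 512, 1023):
         print(f"code {code:4d}: full={normalize_full(code):+.3f}  "
               f"legal={normalize_legal(code):+.3f}")
     ```

     Decoded as legal range, every full-range code below 64 lands below zero, which is exactly the kind of stray negative float that can resurface later as bright, distinct noise.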
  4. Hi Raj! If you're getting 2160, you should be getting 4k (4096x2160). Are you getting any errors? Could you also post a screenshot/picture of what the Shogun is telling you?
  5. This all looks fantastic. How does your plugin work with ACES?
  6. Very interesting! I've been shooting with a lot of DIY anamorphic lenses and they utilize variable diopters to pull focus. If I'm not mistaken, I believe a lot of mid- and high-end anamorphic lenses do the same (e.g. Cooke, Atlas, but please correct me if I'm wrong). I would think adding a diopter is another way to distort the lens, albeit a very subtle one. It's another piece of glass that is most likely not as refined as the glass in the lens, so its aberrations could be... very interesting.
  7. Thank you, all! Phil, I still have an old trailer on my website. It's basically the same trailer, but with the old title: http://www.ajyoungdp.com/articles/narrative/noise_color/videos/noise_color_trailer02.mp4
  8. A feature I shot in 2017, then called Noise and Color, has been picked up for distribution by Gravitas and re-titled Exodus. It will be released on Apple and in a limited theatrical run on 3/19/21. Here's the trailer:

     And here's where you can watch it: https://itunes.apple.com/us/movie/exodus/id1555414930

     Some info about the movie:

     Camera: Red Epic Dragon
     Lenses: Kowa Prominar Anamorphic - 40mm, 50mm, 75mm, and a doubler for the 75mm
     ~1.5 Ton GE Package

     A lot of our cinematography inspiration came from The Rover, Blue Ruin, Out of the Furnace, and Mad Max: Fury Road. We shot in two areas: the region surrounding St. Louis, MI, and the desert outside of Albuquerque, NM. The crew put their all into it and I am beyond thankful for their hard work! If you're curious, I can post some BTS stills here.
  9. Of course it's more expensive, the Venice is newer. The struggle with owning a camera is that you're entering the rental game whether you like it or not. Productions and DPs are always going to ask for the latest and greatest; you might miss out on jobs simply because the F55 is too old for productions. (Not that you can't make great images with it, but they would rather rent newer equipment.) BUT, if you're getting the camera for personal projects, then it'll be just fine.
  10. To be honest, the Venice will blow the F55 out of the water in both operation and image quality. It's simply a newer camera that has improved upon the F55's issues. However, the F55 can still put out some beautiful images. If you're looking to invest in a camera package to make money with it (or be hired because you have it), then go with the Venice. If you just want a good camera for personal projects, then the F55 will be great.
  11. If I'm not mistaken, "multi-gamma function" is just a marketing term referring to the different output displays the camera can record for. Rec709 has its own gamma, V-Log has its own gamma, etc. Choosing from multiple gammas means choosing to record for multiple outputs. If you're shooting for cinema and a color grade, then V-Log captures the entire dynamic range of the camera and gives you the choice in post of how to apply the gamma. If you're shooting for broadcast, then you need to pick the gamma that matches your broadcast spec. Please correct me if I'm wrong!
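     As a rough illustration of what "multiple gammas" means in practice, here's a short Python sketch encoding the same linear values two ways. The Rec709 curve below is the published BT.709 OETF; generic_log is just a stand-in for a camera log curve, not Panasonic's actual V-Log math:

     ```python
     # Two transfer functions ("gammas") applied to the same linear light.
     import math

     def rec709_oetf(linear):
         # ITU-R BT.709 opto-electronic transfer function
         if linear < 0.018:
             return 4.5 * linear
         return 1.099 * linear ** 0.45 - 0.099

     def generic_log(linear, stops=14):
         # Illustrative pure-log encode spreading `stops` of range over 0..1
         # (a stand-in for a real camera log format, not actual V-Log)
         floor = 2.0 ** -stops
         return math.log2(max(linear, floor) / floor) / stops

     for lin in (0.01, 0.18, 0.5, 1.0):
         print(f"linear {lin:>5}: Rec709={rec709_oetf(lin):.3f}  "
               f"log={generic_log(lin):.3f}")
     ```

     Same sensor data, different placement of the code values: the log curve spends far more of its range on the shadows and highlights, which is why it preserves the full dynamic range for grading.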
  12. Owning your own camera to rent and owning your own camera to shoot projects are two very different things. If you're looking to get hired because you have a camera, then you'll need the top name brands: ARRI or RED. At that point you're joining the carousel of the rental business; you'll always need the latest version of the name brands, and it becomes a bit of a losing battle for small owner/operators. If you're looking for a great camera for personal projects, then get the one that will meet the final delivery demands of the projects you want to make (most likely 4K DCI, I assume). At that point, the world is your oyster. The Komodo seems like it'll tick the boxes you're looking for, and RED's color science has gotten substantially better over the years, particularly with IPP2. My recommendation: get the Komodo!
  13. Martin Frohlich demonstrated using Blender's real-time engine, Eevee, in a virtual production environment with green screens. You can watch his talk from BlenderCon 2019 where he discusses it here (start the video at 9:05 for virtual production):

     A quick summary of the video:

     - Frohlich made two films, each shot in two versions: traditionally on location, and exclusively virtually via green screen. The goal was to make the two versions of each film look identical; Frohlich wanted to see if the audience could tell the difference between on-location and virtual production.
     - The virtual production used Eevee as a real-time, on-set composite for the DP and director to view.
     - The virtual production was done on green screen with no LED volume. No composite was burned into the footage on set; they treated it more like a LUT, an estimate of what they'd expect in post-production.
     - They used a motion capture system to record the camera's position and rotation. However, they received a lot of "noise" during the real-time capture because the infrared mocap system was also picking up the infrared light emitted by the movie lights on set. To see this "noise", look at the background plate at 19:20; it's bouncing around because the mocap data is being interfered with by the real movie lights.
     - Game engines like Unreal Engine (UE4) and Unity were much faster than Eevee at the time of his presentation in late 2019.

     Personal thoughts: one thing game engines currently do better than Eevee is baking lighting into the virtual set, known as lightmapping. Rather than calculating the light rays every frame, game engines paint the shadows and bounced light onto the virtual set ahead of time on anything that doesn't move (static geometry). This means the computer only needs to render lighting in real time on things that are dynamic, like a moving light or a moving character. UE4 and Unity perform lightmapping faster than Blender, and the tools for doing so are built into their engines. In fact, Eevee doesn't do lightmapping at all; it's performed by a different render engine in Blender, Cycles. Cycles isn't real time and is more comparable to Pixar's RenderMan, Arnold, etc., so it takes longer for Blender to render lightmaps than UE4/Unity. If a DP/director wanted to change some aspect of the virtual set lighting, it would take significantly longer to re-render the lightmaps in Blender. (A minimal baking sketch follows this post.)

     To speed up lightmapping, Blender users can use add-ons like BakeLab 2. This speeds up the tedious setup, but the choke point is still the Cycles rendering. Eevee does use something like lightmapping, called irradiance volumes: instead of painting the virtual set, Eevee places "probes" across the area to record what the lighting should do in terms of bounce light. This video describes it best: https://www.youtube.com/watch?v=9SD0of-mOHo The downside to Eevee's irradiance volume method is that the lighting begins to flicker when you move the camera or objects, because the engine is updating the shadows.

     For those keeping track of these posts, CML's first web discussion of the year will be on virtual production, scheduled for Jan 6, 2021. I'll post a link to more details when CML does.
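     Here's that minimal baking sketch, using Blender's Python API (bpy). It assumes a hypothetical static object named "Set" that already has a UV map and an image texture node selected as the bake target; it's a sketch of the technique, not a production setup:

     ```python
     # Bake static lighting (a lightmap) for one object via bpy.
     import bpy

     scene = bpy.context.scene
     scene.render.engine = 'CYCLES'   # lightmap baking runs through Cycles, not Eevee

     obj = bpy.data.objects["Set"]    # hypothetical static set piece
     bpy.context.view_layer.objects.active = obj
     obj.select_set(True)

     # Bake direct + indirect diffuse lighting into the selected image node,
     # leaving out base color so only light and shadow are stored.
     bpy.ops.object.bake(
         type='DIFFUSE',
         pass_filter={'DIRECT', 'INDIRECT'},
         margin=4,
     )
     ```

     Every lighting change means re-running this bake, and since Cycles is an offline path tracer, that wait is exactly the bottleneck described above.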
  14. Make sure to clean every roller and not just the tension band. It seems tedious, but necessary. I started at the platter and worked my way through the film path. In your case, start at the beginning and clean through the film path. Should only take you about 5-10 minutes.
  15. I don't recommend lubricating the film with anything that isn't designed for archival use. Any chemical that sits on the film, even only on the sprockets, will affect the preservation of the print. FilmGuard seems safe, but I can't speak to its archival qualities (I've never used it). If the prints are valuable to you, I'd recommend spending the money on a lubricant designed for film.

     I do recommend lubricating the tension bands of the projector. In the US, I used Tri-Flow on our 35mm projectors when I was a projectionist for Harkins Theatres. A little goes a long way.

     Do you clean your projector between prints? We would clean our projectors between each movie with q-tips (cotton swabs) and denatured alcohol. For really messy parts, like the tension bands, 409 cleaner worked really well; if we used it, we would then clean off the residue with denatured alcohol. 409 is great for removing emulsion from the projector, but it leaves a residue, and the denatured alcohol removes the leftover 409. As a final step, I would use the dry end of the q-tip to dry everything off.
  16. Very interesting. Restoration of old prints or remastering becomes a whole new problem, as you said, let alone accounting for the streaming byproduct after the fact. It's also an interesting artistic choice: degrain/denoise an entire movie, then add the noise/grain back in. I think it's getting harder to keep the image consistent from format to format now. The Blu-ray looks great, but people are streaming it. The 70mm print looks amazing, but the Blu-ray can't quite compare.
  17. If a project shot on film and needed an HDR grade, how would they correct the "sizzle" from the grain? Denoise and then add grain?
  18. The ACS released a fantastic video on virtual production. Definitely worth a watch!
  19. This is becoming an interesting development with streaming. There are so many devices, each with its own operating system and internet speed, that the same stream can display better, worse, or simply different quality from one setup to the next. A byproduct of streaming compression is noise reduction. Some devices are faster computers, some are slower, and some programs are faster than others even on the same device (e.g. a web browser vs DaVinci).
  20. Just watched the movie last night! I sadly couldn't see it in IMAX because of COVID, but the blu-ray was great! Nolan movies really give me a good head buzz. I wonder what the movie would be like if we played it in reverse?
  21. A feature I shot before the lockdown earlier this year was just released on Amazon Prime, distributed by Indie Film Rights: https://www.amazon.com/gp/video/detail/B08QM77DLZ/ref=atv_dp_share_cu_r It's called Mandao Returns, a sequel to the film Mandao of the Dead. (I shot both) The first film played at San Diego Comic Con in 2019 which helped fuel the fundraising campaign for the sequel. It's a horror comedy film that takes place around Christmas. You can watch the trailer here:
  22. Looks super fun! Please share the final film!