
Frank Hegyi


Posts posted by Frank Hegyi

  1. I just watched the whole thing. First off, great job! It's a good story with a beginning, middle, and end; and you told it in a way that makes sense.

     

    My favorite shot is the walking shot at 1:15 where the homeless person asks for change. The camera switches sides, relegating the homeless person to the corner of the frame. I'm sure everyone knows what it's like to look at a homeless person out of the corner of your eye, while trying to avoid a conversation. Great use of camera movement to put the audience in the shoes of your protagonist.

     

    Some constructive criticism:

    1. The color grade wasn't consistent throughout. For example, the colors at 7:15 and 7:20 don't match. You may have been using the colors to indicate a "return-to-reality," but you should have gone further with it. It's subtle to the point of looking like a mistake.

     

    2. The Edgar Wright-style quick morning routine at 0:48 felt out of place within the context of the rest of the film, but that's just my opinion.

     

    3. The only shot that distracted me was the wake-up bed shot. It looks like you pushed the colors very blue in post to play for night-time, but the bright light coming from behind the window shade says morning. Also, on a color-grading note, the blacks in this shot are very cyan. While it's not necessarily "wrong" to have non-neutral blacks, it's rare to see in professional productions.

     

    Once again, great job overall.

  2. I saw a couple of little things that caught my eye. Quasar Science had color-accurate daylight and tungsten LED practicals. They claim they won't flicker on normal AC dimmers.

     

    Really Right Stuff had a tripod head with some kind of magical spring in it that kept the head completely still at any point in the tilt range.

     

    The Blackmagic Resolve Mini/Micro color panels are really nice. They felt great in person.

  3. Personally, I like the look. It's unique (in a good way), but if you're going for heightened naturalism, the grade is probably a little too far. The first thing to do is neutralize the shadows in shot number one. Any color in the blacks looks very unnatural.

     

    Secondly, I'm guessing here, but it looks like you've brought the midtones way up and then crushed the shadows back down. It's a cool look, but maybe dial that back a little. I think it's most noticeable on shot number 3. The trees almost look alien.

     

    Like I said, I like the bright midtones. It's the core of the look. Just dial it back a little and I think you'll be closer to what you're going for.
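
    A rough sketch of the kind of curve I'm describing, in case it's useful. This is just my own toy math for illustration, not a reconstruction of your actual grade, and the gamma and toe values are made up:

```python
import numpy as np

def bright_mids_crushed_shadows(x: np.ndarray,
                                gamma: float = 0.7,
                                toe: float = 0.08) -> np.ndarray:
    """Toy version of the look: lift the midtones with a gamma < 1, then pull
    everything below `toe` back down toward black.

    `x` is image data normalized to 0..1; `gamma` and `toe` are made-up
    example values, not anything taken from the actual grade.
    """
    x = np.clip(x, 0.0, 1.0)
    lifted = x ** gamma                      # gamma < 1 brightens the midtones
    crush = np.clip(lifted / toe, 0.0, 1.0)  # 0 at black, 1 at/above the toe
    return lifted * crush                    # shadows roll back down to 0

# Example: a grey ramp before and after the toy grade
ramp = np.linspace(0.0, 1.0, 11)
print(np.round(bright_mids_crushed_shadows(ramp), 3))
```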

  4. It also brings up another issue, similar to current issues with separate timings for 3D, different display gammas that the material is being viewed on, etc., which is the question of whether it is a good idea to have different-looking versions of the same work of art. Maybe that's just an old-fashioned idea that has to be abandoned, but I've never been fond of the notion of people watching versions of my movies with different looks involved.

     

    That's a very good question that will probably never be completely answered, but maybe we can see a bit of the future by considering other mediums that have been through similar situations.

     

    The web design world went through this 5-10 years ago with the rise of smartphones and tablets. Pre-iPhone, web designers could almost guarantee their work would be viewed at a screen resolution of at least 1024x768px, so the large majority of websites had a fixed width of 960px. That all changed with smartphones. It was obvious the design community had to accommodate the narrower screen sizes, but there were competing schools of thought on how. One was to use metadata from the user's device to serve them a completely separate "mobile" website, optimized for the narrow screen width but with limited functionality. That strategy has largely faded away. Today, the generally accepted practice is called "responsive web design." I'm sure you've noticed websites that change layouts as you change the size of your browser window. Boston City Hall even got into the game recently. Try it out: https://www.boston.gov/ It was a pretty massive mindset shift for the design community to accept that they'll never be able to control how their work is actually viewed.

     

    Another medium that comes to mind is music. I don't know if you've ever sat in on a "mastering" session for an album, but it's kind of similar to the HDR situation. "Mastering" is the final part of the recording process. It determines how loud a recording is (compression), the mix between bass, mids, and treble (kinda like color timing), and sometimes reverb. One of the jobs of the mastering engineer is to make sure the recording sounds good on any type of speaker. A good mastering room has a bunch of different speakers built into the wall, everything from professional monitors that cost thousands of dollars to a pair of Apple headphones. The mastering engineer will be constantly switching back and forth between all the different speakers in an attempt to strike a balance between all the different listening possibilities. There will even be the obligatory car test, where you burn a CD and drive around the block while listening. You can hear the effect this process has had on music over time. Steely Dan made records designed to be played on a record player in your living room with a pair of relatively nice speakers. Modern bands don't know where you're going to listen to their music. If you buy a Muse album on vinyl, it sounds pretty terrible. It's not because sound engineers in the '70s knew something we don't know today; it's because the Muse album was engineered to be listened to on the bus. If you try to listen to Steely Dan on the bus with Apple headphones, you can't hear anything. It's not loud enough. Now that vinyl sales have made a pretty significant comeback, you'll see a lot of bands doing separate mastering passes for digital and vinyl. Not because the formats need it, but because the listening environments are so dramatically different.

     

    The last example I can think of is video games. If you play any games on your computer, you're familiar with the resolution settings. A lot of games have fine control over the graphics settings to allow them to be played on older computers. You can go into the menus and individually adjust things like shadows, rain, lightning effects, and frame rate. That has a huge effect on what the game looks like. I also heard an interesting interview recently with a video game preservationist who is trying to figure out how to deal with software updates, which have become so prevalent. The Super Mario Brothers cartridge that shipped to stores never changed. The game was the game. But nowadays, games are constantly updated through software updates. The Rocket League that was released on day 1 is totally different from the Rocket League that exists today. But which version is the "real" version? And how are you supposed to preserve something that's constantly changing? An even bigger problem is games that are run like services. World of Warcraft only works while the company keeps its servers up and running. That game is a huge cultural artifact, but when the company inevitably shuts down the servers some day, will that artifact be lost forever? What even is World of Warcraft if no one is playing it? It's a social game. It derives its value from the other players playing it.

     

    All hard questions.

  5. My gut reaction to HDR viewing...

    1. A small bright screen in a very large dark space will be difficult to watch for long periods, whether from the back of a theater or even on a 50" screen in a dark living room. So, for a theater, the screen will need to fill much of the peripheral vision. For the home, some ambient light behind the screen may be necessary to avoid fatigue while watching.

    2. It's not ever required to use the entire HDR range in an image. A little can go a long way.

    3. HDR will require a separate pass for color grading, and maybe a special version for theaters vs home HDR as well.

    4. I think, when shooting for HDR release, I'm going to test out a bunch of frosty filters for the lens. If I'm getting really good blacks, I may wish to see some light spill into larger dark areas of the frame.

     

    All good points. I was definitely getting some eye fatigue after only 1.5 hours with the HDR in a dark room.

     

    From the little bit that I've seen, the best uses of the extra dynamic range save the large majority of the top end for highlights (reflections, light sources, clouds, etc.). But all this discussion has me imagining a replay of the advertising "loudness wars," when all the TV advertisers over-compressed their audio tracks in an attempt to be the loudest commercial. I can definitely imagine watching a stunning episode of Planet Earth 2 on Discovery Channel HDR, but then a Tide commercial comes on where they've taken a low-contrast scene and stretched it out over the entire dynamic range of the TV, burning everyone's eyeballs out of their heads in a misguided attempt to be the brightest commercial.

  6. So while the technology is cool, I don't see it gaining ground like 1080p has. The broadcasters are still using old 8-bit 4:2:0 19 Mbps transport-stream technology, and updating to 50 Mbps for a 10-bit 4:2:2 4K transport stream is going to be a challenge. There is a limit to how much bandwidth you can squeeze down a pipe, and currently the cable, satellite, and over-the-air broadcasters are maxing it out, throttling back certain content in order to get higher 38 Mbps streams on other networks for special events. Unfortunately, streaming from satellite, cable, and over the airwaves is where H.265 struggles. However, over the internet with buffering, H.265 works great and it has a Rec. 2020 provision. So we'll see HDR content streaming online first. The big question is... who will pay for all that bandwidth?

     

    I'm curious to see what the traditional broadcasters do. Looks like Netflix and Amazon are jumping on the 4K HDR train. It won't be good for the traditional broadcasters if Netflix can say, "We're cheaper, more convenient, and the only way to use the features on your fancy new TV."
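
    To put some rough numbers on why those bitrates are such a squeeze, here's a quick back-of-envelope calculation. The assumptions (UHD at 60p, ignoring blanking, audio, and container overhead) are mine, not from the post above:

```python
# Back-of-envelope: uncompressed UHD 10-bit 4:2:2 vs. a transport-stream budget.
# These are illustrative assumptions (UHD at 60p), not measurements.

width, height, fps = 3840, 2160, 60
bit_depth = 10
samples_per_pixel = 2.0   # 4:2:2 -> one luma sample plus half-rate Cb and Cr

uncompressed_bps = width * height * fps * bit_depth * samples_per_pixel
print(f"Uncompressed: {uncompressed_bps / 1e9:.1f} Gbps")  # ~10 Gbps

for budget_mbps in (19, 38, 50):  # figures mentioned above
    ratio = uncompressed_bps / (budget_mbps * 1e6)
    print(f"{budget_mbps} Mbps stream -> roughly {ratio:.0f}:1 compression")
```

    Even at 50 Mbps that's on the order of 200:1 compression, which is why codec efficiency matters so much here.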

  7. I saw some HDR monitors and TVs at a Sony event last night. They looked really incredible. The difference between SDR and HDR is almost like wiping the fog off a window. We can finally see the dynamic range of all these awesome cameras.

     

    It's also going to change the way we shoot. One example: in SDR, blown-out practicals at 100 IRE aren't that much brighter than the rest of the scene, but HDR displays get sooo bright that practicals look like a real light source. It actually hurts your eyes.
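
    For some very rough numbers (ballpark assumptions on my part, not specs from the Sony demo): SDR grading white sits around 100 nits, while current consumer HDR panels peak somewhere around 1,000 nits, so a clipped practical can read several stops brighter than it ever could in SDR:

```python
import math

# Ballpark: how much brighter a clipped practical can read in HDR vs. SDR.
# 100 nits is the usual SDR grading reference white; 1000 nits is a rough
# peak for current consumer HDR panels. Both are assumptions, not specs
# from the demo.

sdr_white_nits = 100.0
hdr_peak_nits = 1000.0

extra_stops = math.log2(hdr_peak_nits / sdr_white_nits)
print(f"A clipped highlight can read about {extra_stops:.1f} stops brighter in HDR")
```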

     

    Anyone else have any HDR thoughts?

  8. Files are in HD, 2K, and 4K, cropped and overscanned for 2K and 4K. All are flat, ungraded scans in ProRes format (because it's just too clunky to do DPX sequences on Dropbox).

     

    Thanks for the demos, Perry. Quick newbie question: is there a standard 709 conversion LUT to use with the scans? The Alexa 709 LUT seems to work pretty well. I also get similar results with "Cineon to linear" and then "linear to 709."
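
    In case it helps anyone else following along, here's the rough math behind that "Cineon to linear" then "linear to 709" chain, using the textbook constants (ref black 95, ref white 685, 0.6 negative gamma, and the BT.709 OETF). This is just my sketch of the standard formulas, not the LUT that ships with the scans:

```python
import numpy as np

# Standard Cineon log constants: 10-bit ref black 95, ref white 685,
# 0.6 negative gamma. These are the textbook values, not anything
# specific to these scans.
REF_BLACK, REF_WHITE, NEG_GAMMA = 95.0, 685.0, 0.6

def cineon_to_linear(code_10bit: np.ndarray) -> np.ndarray:
    """Map 10-bit Cineon log code values (0-1023) to linear, with white = 1.0."""
    offset = 10.0 ** ((REF_BLACK - REF_WHITE) * 0.002 / NEG_GAMMA)
    gain = 1.0 / (1.0 - offset)
    return gain * (10.0 ** ((code_10bit - REF_WHITE) * 0.002 / NEG_GAMMA) - offset)

def linear_to_709(lin: np.ndarray) -> np.ndarray:
    """BT.709 OETF: 4.5x below 0.018, 1.099x^0.45 - 0.099 above."""
    lin = np.clip(lin, 0.0, 1.0)
    return np.where(lin < 0.018, 4.5 * lin, 1.099 * lin ** 0.45 - 0.099)

# Example: black point, a mid-grey-ish code, and white point
codes = np.array([95.0, 445.0, 685.0])
print(np.round(linear_to_709(cineon_to_linear(codes)), 3))
```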

     

    P.S. I've got my first Bolex coming in the mail next week, so I'll probably be bringing some little projects your way the next couple months.
