Chris Nuzzaco

Basic Member
  • Posts

    25
  • Joined

  • Last visited

Everything posted by Chris Nuzzaco

  1. I've colorized about 90% of the projects I've worked on as a DP. The big difference between myself and other DPs, however, is that instead of sitting next to the colorist... I am the colorist. My advice is to invest time into learning the various tools of the trade, and you'll have better leverage as a DP. As for the power struggle, it's up to the director. If you don't like that, then I suggest networking with directors who are really big into artistic delegation when it comes to things like lighting and color - these guys tend to be actor/performance focused directors.
  2. Some people actually do just that. Not everyone has the money though, so it can be very unrealistic, and it would be of more interest to a DP than a director. If that's the case, I would request some stuff that's been shot and already polished in post to view. The issue with viewing Red proxies is that they haven't been corrected, and they have a generic viewing LUT (Redspace) that most likely doesn't show the look the DP intends the footage to have. You see where I'm going here? If you get handed .r3d files, you're entering a very strange forest at the moment if you want to see the quality they can have. I highly suggest requesting a 1080p demo of finished Red footage to view if you just wanna watch it and see if you're impressed by it. I understand it's 1/4 the resolution of Red, but it is the same amount of data to deal with, which is my point. The files actually take a few seconds to load, so no, I don't expect it to work like greased lightning at 4K. You can actually view the .r3d files in RedCine, so apparently there is a way to work natively with them; I just wish that was available more readily in the other applications people use when dealing with the Red. Again, this is just a temporary hangup, and I know the support is coming. I just can't believe how "super beta" the workflow was when they released the camera.
  3. You know, this is something that was mentioned to me during my "Miami Mayhem Color Correction" (LOL). I was told most people have just color corrected in FCP via the proxies and then mastered from that. That notion kinda blew me away. I don't know how true it is, but man, it kinda sucks if people have to resort to that. As for just viewing the footage myself to check it, I have a big hangup with the whole Redspace thing being the default for proxies. I want to view the footage in a flat-looking, "raw" view. Redspace will apparently blow out highlights that still have detail in them. I understand why they made it that way (it kinda creates this "illusion" that Red has "overhead"), but to an experienced raw shooter like myself, it's kind of silly. Yes, it's good for a client to see a "nicer" image, but the DP's gotta see what they need to see too. I'm still awaiting the day I can edit and load in my own custom viewing LUTs. I don't recall this feature being implemented yet (but it IS available on my Andromeda mod :P).
  4. Kinda, but not really. He wants to see what the Red can really do, and that means going beyond the lame QuickTime proxies and looking at the .r3d files in a superior format, which means some kind of conversion. It's the conversion process that partly killed me; the other aspects certainly didn't help either (the conform in Red Cine just not working, period, for example). I don't see much use, for example, in watching underexposed footage that I can't somehow tweak and examine. I would love to play back the files at full quality, right away. I know that's hard and comes with certain technical complexities, but it isn't "impossible". My Andromeda DVX100 system, while lower res (1K), actually pumps out the same amount of data per second as the Red (it's fully uncompressed 10-bit data, and the storage is about 2GB a minute, just like Red), which means it should be equally tough for a computer to debayer and play back the file natively in QuickTime at full quality. The kicker is this: I can. A user made a QuickTime component that does just that. The bigger issue, though, is how you deal with gamma, sharpening, or color balance. Right now this Andromeda component does nothing: it's straight-up linear gamma (a dark-looking image on a 2.2 gamma monitor), the colors are locked off to whatever the camera captures, and it's totally unsharpened. It is a bit of a drawback, but at least I can quickly work with the files at full quality, and in any good color correction program I can get to work right away - no more use of the rendering program. It would be great if Red had this option in addition to the lower res proxies. The proxies absolutely have their uses in the post workflow; I just want this other option to be available as well. It sounds like it will happen sometime in the future now that Adobe is on the scene, but realistically, how long will it be before they release their solution? That's the million-dollar question.
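The "same amount of data per second" comparison above is easy to sanity-check with quick arithmetic. A minimal sketch follows; the frame dimensions and frame rate here are purely illustrative figures I chose so the math comes out near the 2GB/minute cited, not the actual Andromeda sensor geometry.

```python
def gb_per_minute(width, height, bits_per_sample, fps):
    """Uncompressed single-channel (bayer) data rate in GB per minute."""
    bytes_per_frame = width * height * bits_per_sample / 8
    return bytes_per_frame * fps * 60 / 1e9

# Illustrative ~1-megapixel 10-bit frame at 24fps lands near the 2GB/min figure:
rate = gb_per_minute(1200, 900, 10, 24)
print(f"{rate:.2f} GB/min")  # roughly 1.94 GB/min
```

The point the post makes is that data rate scales with photosite count and bit depth, not with the marketing resolution, so a 1K uncompressed stream and a compressed 4K stream can be equally punishing to play back.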
  5. Hey Guys, I recently flew down to Miami to colorize a film I DP'd with the Red camera back in August. It was a film shot mostly in the wilderness, using natural sunlight, underexposed, and I used some bounce cards whenever possible. I have to say, as much as I was impressed with how well the footage held up under some very brutal color correction work, the whole post workflow freaking SUCKS. I'm sorry, but it's true. We were "prescribed" a certain workflow that was regarded as being solid. I won't get into all the nitty-gritty details, but essentially when I arrived, what I found was a disaster that ended up forcing me to stay in Florida an extra week to get the job done (not that staying in Miami is "bad" lol....). Why? Let's just say the game plan sorta failed in multiple ways. To be more clear, the OFFLINE editing of Red files in FCP seems to be fairly decent; it's the online editing that is the major pain, and also where most of the issues seem to lie. In a nutshell, we had to render out 1080 10-bit uncompressed 4:2:2 QuickTime files, which were quite large - to be expected - but what slayed me was the reason. I couldn't get the 10-bit HQ ProRes files (the QuickTime codec that everyone seems to swear by) that were rendered to work in After Effects at the proper bit depth (I suppose I now have beef with Apple?). Why? I have no clue. Same with RedCine basically going AWOL on us, so I had no way whatsoever to utilize the white balancing tools in Red Cine, which I think are decent enough, and far better than trying to white balance via curves (I personally hate doing that). If it weren't for Red Line, and a homemade FCP to After Effects EDL script available online that *actually works*, we would have been dead in the water.
So in other words, Red made a camera that can put out a mean image (when you are mindful of certain things like using IR cut filters, etc...), but they kinda leave you dead in the water if you wanna actually see and work with that image at its full quality level, as opposed to the crappy proxies that apparently use Redspace. I don't wanna sound like I'm totally ragging on the Red One - I think the camera can do some great stuff, I just hated the post workflow. THIS is the reason why they persuaded Adobe to support the Red camera lineup, in my opinion. Yeah, it's kind of a rant, but can you blame me? I guess the silver lining is that I got to spend an extra week in Miami... Oh, and by the way, I highly recommend getting a Mac Pro station that is pretty loaded if you want to do some serious work on Red footage. It just comes with the territory man.
  6. Actually it won't. The neat thing about this type of technology is actually DOF control. If you read up on it more, you'll find that you can actually stop the camera down when shooting in bright sunlight, then in post, narrow the DOF. Simply amazing stuff.
  7. Hey Guys. I've read a lot of discussion about the new big sensors from Red, especially that huge 28K monster and the medium format one. Most of the argument centered around lenses, and rightfully so - especially focus pulling. Here's my theory: they might be laying a sensor groundwork for light field cinematography, a moving version of this stuff: http://www-graphics.stanford.edu/papers/lfcamera/ http://www.notcot.com/archives/2008/02/adobe_lightfiel.php This type of imaging DEMANDS larger sensors (read, higher resolution) to really work at a professional level that's actually useful. What I find so darn ironic about this, assuming it's one of their targets, is that all the focus pulling arguments people are laying out are more or less vaporized! Another thing I noticed was Adobe's development with the technology, and their newfound chummy devotion to supporting Red's cameras. Makes you wonder.
  8. The irony is that for you to have good 2K results using bayer filtered sensors, you actually DO need sensors around the 5-6K resolution. It's the whole interpolation of color channel resolution thing that causes a 5K image at 100% to not really display a true 5K worth of res. The 4K Red One is more like a solid 3K, but if you want it to be really nice in terms of color channel quality when down-resing, 5K and 6K are more appropriate. It does have drawbacks - the bigger the sensor, the harder it is to have super fast frame rates, etc... I just hope it has a global shutter this time around....
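The "4K is more like 3K" claim above follows from how the photosites are divided in a Bayer mosaic. A minimal sketch, with one loud caveat: the 0.75 factor is a commonly quoted rule of thumb for demosaiced luma resolution (figures from roughly 0.7 to 0.8 circulate), not a measured value for any particular camera.

```python
def bayer_counts(width, height):
    """Photosites per channel in a standard RGGB Bayer mosaic:
    half the sites are green, a quarter each red and blue."""
    total = width * height
    return {"green": total // 2, "red": total // 4, "blue": total // 4}

# Rule-of-thumb factor for effective resolution after demosaicing (an assumption):
ROUGH_DEBAYER_FACTOR = 0.75

def effective_width(bayer_width, factor=ROUGH_DEBAYER_FACTOR):
    """Rough effective horizontal resolution of a demosaiced Bayer image."""
    return bayer_width * factor

print(bayer_counts(4096, 2304))
print(effective_width(4096))  # 3072.0 - i.e. a "4K" mosaic resolves roughly 3K
```

This is why oversampling (a 5-6K mosaic downscaled to a 2-4K deliverable) gives cleaner per-channel color than delivering at the mosaic's nominal resolution.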
  9. For those interested in seeing just how flexible the image is without downloading that huge zip folder: Linear (unsharpened): http://www.chrisnuzzaco.com/special_uploads/linear.tif LOG / "Viper"-like (unsharpened): http://www.chrisnuzzaco.com/special_uploads/shadowlift.tif
  10. Like many other people here... I can't figure out how to edit my posts! LOL Here is the image file sequence (source footage for the clips). It's big and beefy... 463MB: http://www.chrisnuzzaco.com/special_uploads/filter_test.zip
  11. That's interesting... Lately I've seen lots of "less than stellar" posts at Red addressing all kinds of issues like image noise when shooting under tungsten, etc. The original camera's lens mount had issues that were posted prior to and after the new mount. Granted, a lot of "issue" posts are more related to operation and lack of experience working with RAW cameras, but I've seen plenty of posts regarding QC issues like knobs, connectors etc. that haven't been deleted. I did once get a bit harassed by a moderator in the Off Topic forum for posting some of my Andromeda research there, which many found interesting and exciting because it showed just how powerful RAW data is, even from a DVX100 (FYI, you can check out the latest improvements at the Reel Stream Forum and here under the DVX forum). Supposedly some people wanted to keep the Off Topic forum "Red Only", but if you have ever been to that forum, you know as well as I do that it's far from that. So I suspect I was being a little targeted, but I was happy to see several members post in my defense.
  12. http://www.chrisnuzzaco.com/special_uploads/filtertest.tif That is a cropped screen shot from AE. I'm not sure how much information gets captured during a screen shot though, so it might not be a full 16-bit file, but it's much nicer than the QuickTime I compressed. Remember to view with a proper gamma of 2.2; your browser might display it wrong. This is an 8-bit Animation file, uncompressed. Remember... it's 8-bit, and I shot in 10-bit, so if you try to mess with this file you are missing a full 2 bits of data (that's a lot of extra data), and this has already been tweaked by me anyway. About 220 MB: http://www.chrisnuzzaco.com/special_uploads/filter_test.mov For those wanting to give editing the footage a whack... Uncompressed 16-bit tiff image file sequence. LINK UPLOADING These frames were captured linearly at 10-bit and rendered linearly into a 16-bit space, so the images will look dark on your average 2.2 gamma monitor, but all the data is there. It was also rendered without sharpening. I chose to use more advanced sharpening methods in After Effects; Sculptor HD's sharpening abilities are a bit rudimentary and they tend to accent signal noise. These frames have no lateral CA compensation applied either. A little bit of information about the shot: this was shot using a special manual white balance setting I created that forces the blue and red channels to equal the green channel's gain. Basically it's a "uni-gain" setting. This effectively turns your camera into a Viper Filmstream-like system, where optical white balance brings the best results. The camera is natively balanced for daylight, so I had to use an 80A filter on top of my 90ccM filter. If you load the frame into Photoshop and isolate each channel, you will see that all three channels have the exact same amount of noise, which is very low; this is due to the optical white balance. The shot was rendered unsharpened.
In After Effects I broke the shot up into individual channels and compensated for the lens's lateral CA, then I applied an unsharp mask to the footage, making sure not to accentuate signal noise. Sharpening is much easier when you have footage with a very low noise floor. It looks even better when you see it play :)
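The "linear frames look dark on a 2.2 gamma monitor" point above is just the display transfer function at work. A minimal sketch of the mapping, using a plain 2.2 power curve as the post describes (real display pipelines like sRGB add a small linear toe, which I'm ignoring here):

```python
def linear_to_display(v, gamma=2.2):
    """Map a linear-light value in [0, 1] to a gamma-2.2 display value.

    Linear data viewed without this mapping looks dark because mid-tones
    sit much lower numerically than where the eye expects them."""
    return v ** (1.0 / gamma)

# 18% linear grey (a typical mid-grey reference) lands near 0.46 on the display:
print(round(linear_to_display(0.18), 3))
# Black and white are unchanged by a pure power curve:
print(linear_to_display(0.0), linear_to_display(1.0))
```

This is also why a "raw" linear TIFF holds all its data even though it previews dark: the encoding is lossless, only the viewing transform is missing.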
  13. Wow, that's really cool, but I think that's more of a research camera than a cinema camera. I used to shoot high speed video for an equine research hospital; they really could have used that little bugger. Thanks for the link.
  14. David, I'm well aware of this :) Thanks for diving into it more, I just skimmed over my basic approach. As for noise varying shot for shot, I think that's more an issue of always exposing for the highlight, but not bothering to check the shot's contrast, and thus you end up changing the ISO in post (for those wondering, I have one "pet" LUT that I use for almost everything at the moment, so I don't change LUTs during a scene unless there is some kind of special shot). For example... When I light a scene, I create a contrast game plan of sorts: I figure out how far under the highlight I want things in a scene to always be, and then I light from there. I might want to always keep one particular actor's face 2 stops under the brightest highlight, the walls at a constant 3 stops under, etc. I do make exceptions though; specular highlights are usually left as is. Funny you mention the clipped headlights... Headlights are very similar to specular highlights if you can't mess around with them at all. The contrast, especially at night, can just be really out there. Sometimes you have to let stuff blow out in order to get the rest of the shot to show up due to sheer contrast. Claudio could have exposed for the headlights, but man, everything else in the scene would have looked like junk. Better to just let them blow out, and if they have the post funds, mess with the headlights in post - at least that's what I would have done (unless I could somehow dim or put ND on them). Thanks,
  15. Harsh highlight clipping is the big issue so many people do not seem to understand about digital cinema. I've been shooting RAW HD for almost a year now, and one of the big things I've learned about good RAW HD - well, any HD really - is figuring out what the ultimate contrast is going to be AFTER you color correct the image. Ultimately, in post, you will probably need to kick the footage's shoulder up a bit in order to create the necessary "roll off" into the highlights. Because of this, you do need to be very careful about where you place your brightest highlight exposure-wise. I totally understand exposing to the right, but I also understand when I've gone too far, and how far is too far all depends on the final image and the subject matter you are shooting. Another often confusing thing about shooting RAW HD in particular is the fact that you basically have two different responses for the camera. One is linear, which provides the max amount of data you can capture, and the other is your footage's ultimate "look" and the subsequent curve needed to create it. You can certainly shoot in a manner that uses the camera's full linear range and create contrast in post, but odds are you will need to do more sophisticated work to pull it off if you do not want clipped highlights or to lose all of your shadow details but still have a nice bold black. Just because you can do that in post doesn't mean you should, especially if you have a limited budget, in which case you should really just light the shot in a manner that looks good once the final "look" or curve is applied to the footage, and I can guarantee you the image will be much cleaner as well. Even with RAW HD, the more you jerk footage around, the worse it can possibly end up looking. Make sense? As for metering, I base my ISO ratings off clipping in the green channel. The green channel is the first to blow out most of the time, so that's why I base the rating off of that channel.
To expose a scene, I simply find the brightest object in the scene and then use the f/stop (or t/stop) that my meter gives me. I light and meter from the highlights down. Easy as pie! Hope that helps.
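The "stops under the highlight" bookkeeping described above (and in the lighting game plan a few posts back) is straightforward exposure arithmetic: each stop halves the light. A minimal sketch of that relationship:

```python
def relative_luminance(stops_under):
    """Luminance of a scene element placed N stops under a reference
    highlight, as a fraction of that highlight's luminance.
    Each stop is a factor of two."""
    return 2.0 ** (-stops_under)

# A face held 2 stops under the brightest highlight sits at 25% of its level;
# walls held 3 stops under sit at 12.5%:
print(relative_luminance(2))  # 0.25
print(relative_luminance(3))  # 0.125
```

Metering from the highlight down, as the post describes, amounts to fixing the top of this scale first and then placing everything else at a chosen fraction of it.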
  16. http://forum.reel-stream.com/viewtopic.php?t=850 As many of you know, the Andromeda/Hydra technology (performance enhancing modifications for the DVX100 and HVX200) is being sold and is in danger of no longer being an option for filmmakers. Go to the above link to help save this amazing technology!!! The link will take you to a Reel Stream thread that has instructions on how you can help. Thanks,
  17. I suppose someone should get this thread back on track.... I actually tried it all out. It's pretty slick, but it's certainly a beta build. The interface looks a lot like Lustre, or any Autodesk-type program really - in other words, the program takes over your monitor, LOL. There is a general agreement over at Reduser that a more user-friendly way of minimizing the program needs to be developed, but at the moment it's not that big a deal, just one of the small luxuries desired really. All the "meat and potatoes" functionality is there, and that's what matters. The program will only improve if people actually provide some feedback, and believe me, people over at Reduser are picking it apart.
  18. An interesting thing about shooting linear digital HD (which I do using the Andromeda DVX - not as nice as Red, but still linear and 10-bit) is that you really need a DP who understands what is required to create a "roll-off" in the highlights. This is something you really need to do in post, typically via curves. The catch, however, is that many people shooting HD expose "to the right", which is good for reducing signal noise, but they don't take into account how post will affect the upper end, and once they get there, they find that creating the roll-off loses details they wanted to keep. I guess what I'm saying is that they didn't light for the post roll-off effect. Expose to the right, but also be aware that the highest stop or so worth of highlight detail may become lost when you generate the roll-off effect. This same principle applies to the shadow end as well, though it's much more forgiving. I've heard the camera has a good 11.5 stops to work with, but as far as I'm concerned, it's more like 10 or 9 if you really want that film-like tonal response. So far, it seems many shooters aren't quite realizing that yet, but I think they will eventually catch on and start working within the "usable" range that actually makes it to the screen. As for being too clean, I hear this argument day in and day out, and it's an easy fix: post grain. I'm sure I'll hear people say post grain doesn't look like real grain; I would say prove it to me. Effects post houses sample and emulate film grain from all kinds of stocks to match HD and CGI effects into film-acquired shots all the time. It's called "match grain" in After Effects (that was used for a few shots in "The Departed"), and I'm sure there are even more powerful proprietary post grain tools out there as well. The funny thing with grain matching is that it works best with very, very clean source footage. As for losing a full 4 bits, that also makes a big difference.
If you load any image into, say, After Effects and then open up curves, you can pull the curve shorter, while keeping it linear, and create harsh highlight clipping. I'm not saying they did that, but some people use that trick to get rid of magenta highlights. I have no idea why anyone would do that with Red, as it has a cool highlight correction code in Red Alert!, but you never know... they might not have a good grasp on how to process the RAW data yet. I've read that some future Red owners (I'm guessing these people are a minority) haven't even shot with a RAW DSLR before... I'm just saying! I can clearly see a learning curve that is being climbed when I watch some of the footage.
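The highlight roll-off discussed in the post above is usually built as a tone curve that stays linear through the mids and bends softly near white instead of clipping. A minimal sketch of one such shoulder shape - the exponential form and the knee value are my own illustrative choices, not the curve any particular grading tool uses:

```python
import math

def shoulder_rolloff(x, knee=0.7):
    """Tone curve: identity below the knee, exponential soft shoulder above.

    The shoulder compresses the top of the range so highlights ease toward
    white instead of clipping; output never reaches 1.0 for finite input."""
    if x <= knee:
        return x
    span = 1.0 - knee
    return knee + span * (1.0 - math.exp(-(x - knee) / span))

# Mids pass through untouched; the brightest input is compressed, not clipped:
print(shoulder_rolloff(0.5))             # 0.5
print(round(shoulder_rolloff(1.0), 3))   # about 0.89
```

Note the trade-off the post describes: the top stop or so of linear highlight detail gets squeezed into a narrow output range by the shoulder, which is why lighting with the final curve in mind matters.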
  19. WOW! Ok, there must be some confusion here... I really don't know what to say. I'm the guy who shot that footage, and it was shot months ago, using my DVX100 Andromeda. Since we're all looking at it, I might as well say a few things about it... This is beta footage. In other words, it's not the best footage I've pulled out of the system. It's actually some of my first footage shot using the system. I now have more than three months of extensive testing under my belt, and the LUT I originally used to shoot that footage is no longer in use. I now use my own custom LUT, which provides, in my opinion, much better performance in terms of noise. I now also use a magenta filtration/white balance trick that further decreases my footage's noise floor and also extends the usable highlight dynamic range. I can capture a full 9.5 stops, for those wondering. I also don't recommend using the full range if you can avoid it, as you'll get better results noise-wise. On top of the custom LUT and the magenta filtering, I have also created lateral chromatic aberration correction templates for various focal lengths, which has also helped the footage drastically. As for this footage specifically, it's certainly compressed for the web alright - not as nice as the uncompressed version, but again, it's beta testing footage I shot when first learning about the camera. I don't know if this is an honest mistake, a joke, or someone trying to prove a point, but I had nothing to do with it. Once again... This is NOT Red One footage. Thanks and sorry for the confusion,
  20. Ok, I was looking over my After Effects project settings, and this current animation was all posted at only 8-bit @$%@%^! So it's technically not at its best. I'm working on making a newer one for the nitpickers like me out there... at 16-bit... But still, the current 8-bit animation is better than the web compressed version.
  21. Thanks. By the way, the animation file didn't upload all the way last night, so I tried again, and it was successful this time. If you downloaded the corrupt one, try again - it's all there now :)
  22. This is a finished and polished piece shot with a DVX100 Andromeda using my newest custom record LUT "nuzzaco_contrast_3" and rendered out using its companion render LUT. All I did in post was tweak the colors and add additional space on the sides for a more widescreen look. ZERO post noise reduction used on this. I've decided to do something unusual this time... Uncompressed HD Animation (178MB): http://www.chrisnuzzaco.com/special_upload...d_animation.zip Web HD Clip (16MB): http://www.chrisnuzzaco.com/special_upload...lerichardad.mov I highly recommend looking at the animation to see this clip's full glory, and man is it beautiful - almost brings a tear to my eye :) Here's a web compressed teaser frame grab: http://www.chrisnuzzaco.com/special_uploads/moleadgrab.jpg By the way, I'm a huge fan of Mole-Richardson lighting equipment, gotta love that old school Hollywood look :)
  23. Actually my LUT doesn't crush the blacks; you can lighten them back up. This is color corrected stuff. It's quite amazing how elastic it really is. I can drag the range out in post to like 9.5-ish stops if I want, or I can squeeze it down again. I chose to use some heavy contrast for this clip. I've noticed that a wider range usually has more noise in it, at least with this system. This shot is available light, outside (very dark shade in the background...). I've taken to lighting to a more narrow range, and slight remapping of the tones via curves in post, and the results are spectacular. I'll post some links when they are uploaded.
  24. http://www.chrisnuzzaco.com/special_uploads/leaf.mov I shot that with my Andromeda using my own custom record and render LUTs. This is actually a beta test, but the results are extremely good. I think I'm on the right path now :)
  25. I've shot with the M2, and let me say this much: TWEAK IT! The M2 never comes perfectly calibrated. You literally need to adjust the ground glass in order to get your lens focus marks to line up right, AND you need to set your HVX f/stop to something around 2.8 if you want better focus. Also, look into the SGPRO Rev.2 - it's supposed to be one of the better adapters out there performance-wise, and it's really quite cheap...