Tyler Purcell

Sustaining Member
Tyler Purcell last won the day on April 14

Tyler Purcell had the most liked content!

Community Reputation

411 Excellent

About Tyler Purcell

  • Birthday 07/28/1978

Profile Information

  • Location
    Los Angeles
  • My Gear
    Aaton XTR Prod +, Aaton 35III 3 perf, Bolex EBM, K3, Blackmagic Pocket Camera
  • Specialties
    Cinematography (digital cinema and 16/35mm) and post production (DaVinci/Avid/Final Cut Pro)

Recent Profile Visitors

51899 profile views
  1. Well yes, you want your final re-framed scan to be higher resolution than your output. With that said, you can achieve this with a 5K scanner where the frame nearly fills the imager. You may end up with a 4.2K file when it's all said and done, after cropping and adjustment, and you'll still have a tiny bit of room for zooming as well. I was merely saying that 8K as an acquisition and delivery format is silly; I wasn't thinking about underscanning something by thousands of pixels. Good scanners have the optics so you don't need to underscan. A scanner like the Blackmagic, which has no optical path adjustment, doesn't classify in my book as a usable product for anything but its native format. Obviously, with formats like Super 8 you may not have a choice, but Super 8 is such a niche part of the industry that it's pretty much irrelevant to discuss. Scanners need to natively handle Super 16mm, Super 35mm and 5-perf 65/70mm; the other formats (8mm, Super 8, 9.5mm, etc.) simply need to fit within the optical path of those main formats.
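To put rough numbers on the oversampling headroom described above, here's a minimal sketch. The 5K and 4.2K figures follow the post; the exact scan width (5120) and crop fraction (0.84) are illustrative assumptions, not measurements from any particular scanner:

```python
def crop_headroom(scan_width, crop_fraction, delivery_width):
    """Return (effective_width, zoom_headroom) after cropping a scan.

    crop_fraction: fraction of the scan width kept after edge cropping
    and reframing (e.g. 0.84 keeps 84% of the pixels).
    A headroom value > 1.0 means there is still room to punch in.
    """
    effective = int(scan_width * crop_fraction)
    headroom = effective / delivery_width
    return effective, headroom

# A hypothetical 5K scan cropped to ~84% leaves roughly a 4.2K file,
# with about 5% headroom over a 4096-wide delivery.
width, zoom = crop_headroom(5120, 0.84, 4096)
print(width, round(zoom, 2))  # 4300 1.05
```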
  2. Yea, it will do 30, 25 and 24p as well, but of course the imager is stuck at 1280x720, and since it has a fixed-bit-rate recording system, you might as well stick with the 100 Mbps 720p signal. Now that we've figured all this out, I still don't understand the original question, because the HPX2700 doesn't really record a log file, so you wouldn't be using LUTs to do anything really. Bruce's explanation of the knee adjustment is a great trick, but it's still a Rec 709 color space no matter what you do. Getting detail out of both the highlights and the blacks is going to be difficult, even if you mess with the camera's internal gamma curve. The dynamic range of the imager is still limited, so while you could get a bit more out of it, I doubt it would be worth the effort.
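The knee adjustment mentioned above is essentially a piecewise transfer curve that rolls off highlights. A toy sketch of the idea (the knee point and slope here are made-up values, not the HPX2700's actual settings):

```python
def apply_knee(x, knee_point=0.8, knee_slope=0.25):
    """Compress highlights above knee_point with a gentler slope.

    x is a normalized signal level; values above the knee are rolled
    off so more highlight detail fits under the clip point, at the
    cost of flattening contrast up there.
    """
    if x <= knee_point:
        return x
    return knee_point + (x - knee_point) * knee_slope

# A 120% highlight that would otherwise clip gets folded back under 1.0;
# mid-tones below the knee are untouched.
print(round(apply_knee(1.2), 3))  # 0.9
print(apply_knee(0.5))            # 0.5
```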
  3. I totally forgot the HPX2700 was a 720p camera. AVC-Intra 100 is fine for 720p really; even at 60fps that's not a lot of bandwidth. That's the reason it works natively with 10-year-old software: it's just not that taxing.
  4. What kills me is that modern 4K sets look like crap. I have a beautiful color grading monitor that's 10-bit 4K 17:9, and it doesn't matter what media I put on it, the darn thing looks great. However, if you take that same media and put it on a normal TV set, the 1080p looks super soft and the 4K looks overly sharp. It's a pretty dramatic difference, and they do this purposely so people go "ohh gosh, the 4K is so much better," but in reality it's just a horrible scaling chipset in the TV. What I keep telling people is that outside of Netflix, Amazon and UHD Blu-ray, there is no UHD at home. There are only a handful of UHD Blu-rays made from 4K sources. Netflix and Amazon "originals" need to be UHD, but that content is only a limited share of total programming. Standard ol' television is still 1080i, and nearly all of the content is finished in 16:9 1080p because nobody wants to spend the money on upgrading. Outside of a few sports games per year there just isn't any UHD broadcast, and very few people are even capable of receiving the signal. While we have seen a shift in the last year to 4K, 6K and 8K sources for theatrical, the vast majority of movies are still finished in 2K, and NOBODY is finishing in 6K or 8K, not even IMAX. So I do think the standard Lasergraphics ScanStation is fine for everything. I don't see us ending up in an 8K world with film; it's just not going to happen. One could argue that 5- and 15-perf should be scanned at higher resolution than 5K, but which one of us is doing that work? I mean, if you're shooting those formats, you can afford to pay someone extra to deal with it. Super 8, 16mm and 35mm are the main formats, and that's what would-be owners need to focus on in my opinion. Heck, even VistaVision would be nice to have, but 65mm is a whole other expense. Ohh, and 8K? Yea, so stupid.
  5. Helicopters work great when piloted from the ground. Just don't stand underneath one. lol
  6. Ohh the AJ-HPX2700, hehe. Yea, you should have said "ENG Classic Varicam" because I didn't even know the camera had log capabilities.
  7. Yep, AVC is an MPEG-4 format. There are a few variants, like Long GOP or I-frame, 8-bit and 10-bit, 4:2:0 or 4:2:2, but it's still MPEG. The main non-MPEG single-file formats are ProRes, DNx and JPEG 2000. ProRes and DNx are DCT-based intra-frame codecs, while JPEG 2000 uses wavelet compression technology; REDCODE is wavelet-based as well, and ARRIRAW is uncompressed sensor data. CinemaDNG is TIFF-based, but it's still similar. Funny enough, DVCPro has nothing to do with AVC. Panasonic makes you want to think they're similar, but they are not. AVC is just h.264. https://en.wikipedia.org/wiki/AVC-Intra
  8. Basically, all it does is flatten the image out so the MPEG file holds greater dynamic range than it normally would. You then use a LUT to bring it into the color space you're grading in; whether that's Rec 709, Rec 2020 or DCI-P3, the color space is chosen in the grading program as a "base" and you grade to that space. I've given up using pre-programmed LUTs because I've found most of them don't give me the look I'm going for in-camera. AVC-Intra 100 is OK for 1080p, but not for anything else. If you're shooting 4K, you really need 400+ Mbps at the source, and I'm not sure the Varicam can do that. I think one of the reasons so many people discount it is that the codec is very limited. There is a software version that can do ProRes, but I think it's only HQ.
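Conceptually, applying a LUT to a flattened/log image is just a per-value table lookup. A minimal 1D sketch, where the log encoding is a made-up stand-in (not Panasonic's actual transfer function), and real grading tools use 3D LUTs over RGB triples:

```python
def build_lut(size=1024, display_gamma=2.4):
    """Tabulate log-code -> display value for `size` input codes."""
    lut = []
    for i in range(size):
        log_code = i / (size - 1)
        # invert a toy log encoding log10(1 + 9x), then apply a display gamma
        linear = (10 ** log_code - 1) / 9
        lut.append(linear ** (1 / display_gamma))
    return lut

def apply_lut(code, lut):
    """Nearest-entry lookup for a normalized code value in [0, 1]."""
    return lut[round(code * (len(lut) - 1))]

lut = build_lut()
# Black maps to black, full-scale log maps to full-scale display:
print(round(apply_lut(0.0, lut), 3), round(apply_lut(1.0, lut), 3))  # 0.0 1.0
```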
  9. Yes, sorry... my bad. I didn't bother checking the spec, and I remembered it being measured in megabytes, not megabits. Thanks for the correction.
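The megabits/megabytes mix-up above is a common one; the conversion is just a factor of 8:

```python
def mbps_to_megabytes_per_sec(megabits_per_second):
    """Convert a bit rate in megabits/s to megabytes/s (8 bits per byte)."""
    return megabits_per_second / 8

# AVC-Intra 100's nominal 100 Mbps is only 12.5 MB/s to disk,
# or roughly half a megabyte per frame at 24 fps.
print(mbps_to_megabytes_per_sec(100))                 # 12.5
print(round(mbps_to_megabytes_per_sec(100) / 24, 3))  # 0.521
```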
  10. It's really shot nicely and very professional in both image and sound design. I guess my only issues are the length and the translation. The same story could easily be told in 10 minutes, and it totally loses my attention a few minutes in because I really don't know the importance of what I'm seeing. The "poetic" nature of the VO, and hence the subtitles, can be challenging for non-native speakers. Poetry is one of the most difficult things to translate, and reading poetry vs. hearing it, how it comes off the tongue, especially in a sensual setting, just doesn't work in my opinion. If the poetry were re-written in English and then spoken, replacing the native language, I think it would work much better for me. It's one thing to subtitle dialog with normal sentence structure, where our brains can easily read the emotion in the characters' faces and absorb the subtitles. But where the heartbeat of the story is driven exclusively by VO, things are very different. Just my two cents.
  11. Ahh ok, yea, I feel ya. I'm the polar opposite: everyone here is PL. I have adapters for Nikon and Canon lenses for my digital cameras, and nobody has ever asked for them, ever. All anyone wants or cares about is PL glass, because that's what the rental houses use. But PL is a premium product, and in many cases you can get the same glass cheaper in a Nikon or Canon mount, so I understand why someone would stay away from PL due to cost.
  12. Nearly all consumer-grade cameras shoot in h.264 or h.265, depending on how new they are. The normal variety of these codecs is long-GOP compression (you can google this to learn more), which doesn't have "frames" per se. The way they work is: they capture one full frame and then store only the differences from that frame for the next X consecutive frames. X varies widely, from 8 to even 40 in some cases, depending on how much motion is in the scene. The little processor in the camera is chewing on this encode in order to make the files small, so the editing software has to pre-read the whole group of frames into memory, extrapolate the missing frames, and somehow make an edit. It's very challenging for the software AND the hardware. Most people who shoot with these h.264/h.265 cameras make proxy files inside Premiere, which are DNx. This process works great because when you're done, you can throw the whole mess into DaVinci and grade from the original files, not the transcodes, which can bake in a look if you aren't careful. I only shoot with cameras that record I-frame codecs like DNx, ProRes, CinemaDNG, REDCODE, ARRIRAW, JPEG 2000, etc. These codecs exist to deliver higher quality in post production, albeit with an associated cost: you still need a decent computer to work with them, and the cameras that shoot them are more expensive, of course. www.bhphotovideo.com has a great "tech" page for everything they sell; if you wanna know what codec a certain camera shoots, find the camera on their website, click technical, and you can see the codecs. Nearly all of them will be h.264 or MPEG-4, which are basically the same concept, tho different implementations.
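The "one frame plus differences" idea above can be sketched as a toy delta encoder. Real long-GOP codecs like h.264 do motion-compensated prediction rather than simple per-pixel subtraction, so this is only a conceptual sketch:

```python
def encode_gop(frames):
    """Store the first frame whole (the 'I-frame'), then only the
    per-pixel differences for each following frame ('P-frames')."""
    i_frame = frames[0]
    deltas = []
    for prev, cur in zip(frames, frames[1:]):
        deltas.append([c - p for p, c in zip(prev, cur)])
    return i_frame, deltas

def decode_gop(i_frame, deltas):
    """Rebuild every frame by accumulating deltas from the I-frame.
    Reaching frame N means walking the whole chain, which is why
    long-GOP files are slow to scrub in an editor."""
    frames = [list(i_frame)]
    for delta in deltas:
        frames.append([p + d for p, d in zip(frames[-1], delta)])
    return frames

# Tiny 4-pixel "frames": a static scene produces near-zero deltas,
# which is exactly what compresses well.
clip = [[10, 10, 10, 10], [10, 11, 10, 10], [10, 12, 10, 10]]
i_frame, deltas = encode_gop(clip)
assert decode_gop(i_frame, deltas) == clip
print(deltas)  # [[0, 1, 0, 0], [0, 1, 0, 0]]
```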