
cole t parzenn

Basic Member
  • Posts

    287
  • Joined

  • Last visited

Everything posted by cole t parzenn

  1. Then, you'd have a pixel count just slightly higher than cropped 1080. :D Better colors, though! It could be but just being on this forum is awfully niche...
  2. Very sad. I don't like most cinematography but I liked his.
  3. I was really just referring to digital steps after photochemical capture. I much prefer film, too, but I don't think digital's inherently bad.
  4. Then what meaningful difference is there between "constructs generated by a computer" and "actual images?"
  5. Not shooting unevenly lit brick walls wide open, free lensing, and then cropping from one of the corners - some people do other things, right? ;) Yes, a bit abstract.
  6. Thanks for the reply. This helps me a bit, conceptually. How does this apply when not shooting unevenly lit brick walls? Film?
  7. Go on... How do the Nyquist limit and aliasing apply to image capture? Light likes to pretend it's a wave but aren't we really just counting the number of photons to strike a certain area? And how is film immune to this? What is the maximum possible signal to noise ratio?
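On the photon-counting point, here's a quick numeric sketch of the usual shot-noise argument (the counts and seed are illustrative, not from any real sensor): if exposure is really counting photons, arrivals follow Poisson statistics, so for a mean count of N per pixel the noise is sqrt(N) and the best achievable SNR is N / sqrt(N) = sqrt(N).

```python
import numpy as np

# Shot-noise sketch: photon arrivals are Poisson-distributed, so at a
# mean count of N photons per pixel the standard deviation is sqrt(N)
# and the maximum possible SNR is sqrt(N).
rng = np.random.default_rng(0)

for mean_photons in (100, 10_000):
    counts = rng.poisson(mean_photons, size=1_000_000)
    snr = counts.mean() / counts.std()
    print(f"{mean_photons:>6} photons/pixel: SNR ~ {snr:.1f} "
          f"(sqrt(N) = {mean_photons ** 0.5:.1f})")
```

So quadrupling the light only doubles the SNR, and this limit applies to any detector that counts photons, film grains included.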
  8. I love 1.33/1.375. Wally has apparently retired from lighting (his commercials aside). Selfish *******! ;)
  9. True. Maybe I'm not understanding how they're getting from the digital masters to the HD release. Taking 2001 as our example (a 1960s 65mm negative probably resolved about 4k, I figure), a pixel from my blu-ray represents four from the DCP, before chroma subsampling. Why isn't the grain smoothed out?
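The four-into-one arithmetic can be checked numerically (a toy sketch, treating grain as uncorrelated per-pixel noise and the downscale as a plain 2x2 box average - real scalers and real grain are messier): averaging 4 independent pixels should cut the grain's standard deviation by sqrt(4) = 2, so any grain that survives the downscale undimmed must be correlated across more than one 4k pixel.

```python
import numpy as np

# Toy model: uncorrelated unit-variance "grain" at 4k (2160 x 3840),
# downscaled to HD (1080 x 1920) by averaging each 2x2 block.
# Averaging 4 independent samples halves the standard deviation.
rng = np.random.default_rng(1)
grain = rng.normal(0.0, 1.0, size=(2160, 3840))

hd = grain.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))  # 2x2 box filter
print(f"4k grain std: {grain.std():.3f}, HD grain std: {hd.std():.3f}")
```

If the visible grain clumps span several 4k pixels, the 2x2 average barely touches them, which would be one explanation for grain not smoothing out in the HD release.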
  10. Interesting. I haven't tried USB but I've noticed similar things. Thanks again.
  11. Interesting. My blu-rays are sharper than the DCPs I saw? Could you elaborate on this?
  12. Hi, all. Been lurking for a while - love what you've done with the place.

      I saw the 4k restoration of Citizen Kane and it was as gorgeous as you could hope for (not having the OCN). I bought the blu-ray and it's grainy: Kane 1 Kane 2

      I saw 2001: A Space Odyssey in 4k and it was... only ok looking (is it just me or does Warner Brothers over-compress their DCPs?). But it wasn't as grainy as the blu-ray: 2001 1 2001 2 The seams between the "Dawn of Man" sets and screen were much less conspicuous, too.

      I haven't seen Apocalypse Now on the big screen yet but I expect it's less grainy than the blu-ray: A. Now 1 A. Now 2

      What's going on? I know that film can look grainier than it is when scanned at low resolutions, but this is film looking grainier than it is when shown at a low resolution. Additionally, I've seen this in S16-originated HD video online, compared to S16-originated HDTV - same stocks and display resolution (give or take a little compression). It's the kind of grain I hate most, too - analog noise, basically. I can live with soft but I hate noisy.

      A few weeks ago, I saw something else that I don't understand. I was watching the (according to IMDB) S35-originated True Detective pilot and saw moire-ing on a piece of wardrobe. I myself can't think of a particular reason film couldn't moire, but conventional wisdom holds that it won't and, despite every darn exhibition being digital now, I've never seen it from film-originated material before. It occurs to me that IMDB could have been wrong about the format (with all the compression, it's hard to be certain, but the images did look pretty Alexa-y) but, assuming it wasn't, why would I see moire and why hadn't I seen it before? (If you don't mind a tangent, does the Alexa moire, and if so, why? Shouldn't oversampling prevent that?)

      Thanks for the knowledge.
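For the moire question, here's a one-dimensional sketch of the standard aliasing explanation (the frequencies are made up for illustration): a regular pattern, like a fabric weave, at frequency f sampled by a sensor grid at rate fs < 2f folds back to |fs - f|, a much coarser beat pattern, which is the moire you see on wardrobe.

```python
import numpy as np

# Aliasing sketch: a 90-cycle/unit "weave" sampled by a 100-sample/unit
# grid (Nyquist limit 50) folds back to |100 - 90| = 10 cycles/unit.
fs = 100.0            # sampling rate (the sensor's photosite pitch)
f_pattern = 90.0      # pattern frequency, above the Nyquist limit fs/2

x = np.arange(0, 10, 1 / fs)
samples = np.sin(2 * np.pi * f_pattern * x)

# Find the dominant frequency actually present in the sampled signal:
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(samples.size, 1 / fs)
print(f"pattern at {f_pattern} Hz shows up at {freqs[spectrum.argmax()]:.0f} Hz")
```

Film's grain positions are random rather than a regular grid, which is the usual hand-wave for why it doesn't moire: irregular sampling turns aliasing into broadband noise instead of a clean false frequency.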