
Mei Lewis

Basic Member
  • Content Count

    351
  • Joined

  • Last visited

  • Days Won

    1

Mei Lewis last won the day on March 30 2014

Mei Lewis had the most liked content!

Community Reputation

6 Neutral

About Mei Lewis

  • Rank

Profile Information

  • Occupation
    Other
  • Location
    UK

Contact Methods

  • Website URL
    http://missionphotographic.com
  1. I think you're a bit wrong about what is happening with your camera. The RAW and mRAW settings are for still photos only; your camera doesn't record raw video, only .mov (H.264). Definitely check out working with proxies, by the way. The way they work in Premiere is very convenient once you get the hang of it.
  2. Does anyone have the .pdf that was linked, please? The link on the Kodak site is dead.
  3. For me the only noticeable difference between those two images is the bright hairlight on the Alexa shot, which makes it feel much more balanced.
  4. https://vfxblog.com/2017/10/26/lets-move-to-la-in-2049/ This interview has blurred my understanding of what a cinematographer is. I think many people would watch BR2049 and credit Roger Deakins with the look of the cityscapes, but it seems that's not (entirely) the case.
  5. It seems to me the point of this forum is to discuss, learn about and advance cinematography. Certainly opinions and subjectivity come into it, but if we're to progress we need to be able to accurately and unambiguously describe the aspects of cinematography we're talking about. From that point of view, saying an acquisition or display system is 'soulless' is useless. That statement can't be acted on. A digital sensor or projector maker can never make their system less 'soulless' because you can't tell them what it is you mean by 'soulless'. We're not talking about magic, we're talking about recording two dimensional images inside small rectangles, where each point within the rectangle can be completely specified by three continuous variables (for example, R, G and B values). There is nowhere in there for a soul to hide.
  6. Thanks everyone. I like Adrian's idea about an extra shot.
  7. Shooting a short film today, the director and I disagreed about which shoulder we should come over when looking at the phone, to match the wider shot. So we shot both. I think the matching shots are 1 and 3. The 'line of action' is between the girl and her phone, and by coming over her right shoulder to look down at the phone we don't cross that line. Who was right? I'm still nowhere near as good at this stuff as I should be. Any links to resources would be appreciated. Thanks.
  8. Maybe not how it was done in these examples, but it's easy to do in post: just run the lens distortion correction tool in the wrong direction (there's a rough code sketch of the idea after this list of posts).
  9. The best option is to use a studio with a built-in cyc/infinity backdrop. Failing that, David's suggestion of a white paper roll is easily the next best, especially if you need a full-length shot. Lighting any kind of sheet evenly, from the front or the back, with no seams and without overexposing it to the point of causing contrast-reducing flare, is very tricky. The smooth, flat, matt surface of paper is much easier to light evenly.
  10. I really liked this film too.
  11. I really think there's something in this. I don't think cinematographers are using waveforms to judge composition, or even much about lighting beyond exposure levels. But by 'measuring' the images with a waveform monitor, some of their structure makes itself apparent. (That's true whether we're looking at images from the camera or at final shots as they appear in the film.) The waveform shows something about the geometry of an image, and often the images people like from a cinematography standpoint are very clear and simple: the viewer knows exactly how to interpret what's going on. Look at this list: https://www.buzzfeed.com/danieldalton/there-will-be-scrolling?utm_term=.umdYd0gv5#.drmeVqQmp (I'm not saying I agree with the list, but they are the sort of shots many people like). Most of them are very simple and could be sketched without too many pencil lines. (There's a minimal waveform sketch in code after this list of posts.)
  12. Why are you showing "Low-con encoded video", "<<ITU709 video>>" and "ITU709 display" as having the same range? Aren't "Low-con encoded video" and "<<ITU709 video>>" just data, which doesn't have a range? (It's inside a computer.)
  13. I think you _can_ talk about the number of stops a display has: it can be measured with a light meter as the ratio between its white and black levels, expressed in stops (a worked example is after this list of posts). But that range doesn't really have much to do with the range of the camera or the original scene, and I think that's where Jan is having difficulty.
  14. This is my current understanding of what's going on here. When an image is displayed, white in the image data is usually mapped to white on the display, and black to black, so CU=DU and CL=DL, but that doesn't have to be the case. If CU is mapped to some value higher than DU then the whites clip; if CL is mapped to somewhere lower than DL then the blacks clip. (This is sketched in code after this list of posts.)
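A rough sketch of the "distortion correction in reverse" trick mentioned in post 8. It assumes OpenCV and NumPy and an illustrative camera matrix; none of the names or values below come from the original post, and which sign of k1 gives barrel versus pincushion will depend on the footage and convention.

```python
# Hypothetical sketch: cv2.undistort assumes its input already carries the
# distortion described by dist_coeffs, so feeding it a clean frame bakes the
# inverse of that distortion into the output.
import cv2
import numpy as np

def bend_frame(img, k1=-0.25):
    h, w = img.shape[:2]
    f = float(w)  # rough focal-length guess in pixels, purely illustrative
    camera_matrix = np.array([[f, 0, w / 2.0],
                              [0, f, h / 2.0],
                              [0, 0, 1.0]])
    # [k1, k2, p1, p2]; flip k1's sign to bow straight lines the other way
    dist_coeffs = np.array([k1, 0.0, 0.0, 0.0])
    return cv2.undistort(img, camera_matrix, dist_coeffs)
```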
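To illustrate the "measuring images with a waveform" point in post 11, here is a minimal sketch of how a luma waveform can be built from an RGB frame. The Rec.709 luma weights are standard; the function name and the NumPy approach are my own assumptions, not anything from the post.

```python
import numpy as np

def luma_waveform(rgb, levels=256):
    """rgb: H x W x 3 float array in [0, 1]. Returns a levels x W count image
    where each column is a histogram of the luma values in that image column."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma weights
    h, w = luma.shape
    rows = np.clip((luma * (levels - 1)).astype(int), 0, levels - 1)
    cols = np.broadcast_to(np.arange(w), (h, w))
    waveform = np.zeros((levels, w), dtype=np.int64)
    np.add.at(waveform, (rows.ravel(), cols.ravel()), 1)
    return waveform[::-1]  # flip so bright values sit at the top, like a scope
```

A frame with a few clean tones and simple geometry produces a waveform with a few clear bands, which is roughly the "could be sketched with few pencil lines" observation.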
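A worked example for post 13: a display's range in stops is the base-2 log of its white-to-black luminance ratio. The nit values here are assumptions, purely for illustration.

```python
import math

def display_stops(white_nits, black_nits):
    # stops = log2(peak white luminance / black luminance)
    return math.log2(white_nits / black_nits)

print(display_stops(100.0, 0.1))  # a 1000:1 display -> about 10 stops
```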
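And a sketch of the mapping described in post 14. The variable names CU/CL/DU/DL follow the post; the linear transfer itself is an assumed simplification of whatever the display pipeline actually does.

```python
import numpy as np

def map_to_display(code_values, CU, CL, DU, DL):
    """Linearly map camera code values in [CL, CU] onto the display range,
    then clip whatever falls outside what the display can show."""
    scaled = DL + (code_values - CL) * (DU - DL) / (CU - CL)
    return np.clip(scaled, DL, DU)

# Mapping CU below the top of the data range pushes highlights past DU,
# so they clip: here 0.9 and 1.0 both land on 1.0.
vals = np.array([0.0, 0.5, 0.9, 1.0])
print(map_to_display(vals, CU=0.8, CL=0.0, DU=1.0, DL=0.0))  # [0. 0.625 1. 1.]
```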