Everything posted by Chris Millar

  1. A couple of preliminary thoughts: although the camera might give you images with 4K pixel dimensions, I really doubt that the lens in that camera can resolve that much detail, and even if it could, it's unlikely it was installed correctly anyway. Next thought: anamorphic lenses project a relatively square image, so you're going to lose most of that 'resolution' in the nature of the process. Not saying don't do it, please do, it's good to get involved any way possible :). Just don't expect 4K.
  2. Whatever eventuates, it will be the net result of mostly negative forces - ignorance born largely of laziness on the part of those who don't see anything to gain, combined with the purposeful misuse of engineering terminology for marketing purposes by those who do. I applaud those who educate themselves and then disseminate that information (as abstractly as any forum post can afford) ... and there certainly is something to be said for the feeling one gets from being around (at least online) others who 'understand'. But I do wonder, perhaps ignorance is bliss :rolleyes:
  3. Quite a few threads in the grip section touch on that topic... Not sure how fragmented you want a forum structure to be though?
  4. Lots of people simply push a(ny cheap digital) camera up to the eyepiece and get reasonable images from that... It's also a reasonable way to keep light from coming back through the finder and prism onto the film. First step IMO would be pushing your phone camera up to it and seeing what happens :) It's just for checking framing, huh? If so, is resolution a major concern? Unless you mean for focus, in which case I'd do that optically, i.e. focus the ground glass image with your eye...
  5. It's been interesting for me over the last few years as I've returned to study. I soon found out that the study was all-encompassing and that work wasn't going to happen as it normally did, and there are only so many times you can decline before you fall off the radar - it took about 4 months until the calls stopped. Except, out of the blue, 2 years in I got a cold call from a producer unknown to me offering 9 months' work in a HoD position. Obviously it was quite gratifying (or more likely an indication that they were scratching around in the leftovers!), but once again I had to decline. What was interesting was following the latent and passive trail of breadcrumbs I must have left behind - from what I can figure out it was a combination of working alongside another producer in a much more limited capacity (just some local grunt work) and some overheard conversation, or a discussion I wasn't a party to, that maybe led him to be aware of my other skills ... and so on. Who knows ... I would have asked, but the guy seemed under the gun. Anyway, it just goes to show your presence among your peers has an echo. But to answer your question directly, I prefer to make the call - I'm not afraid to be known to be in the hunt. You can try to frame it like 'I really enjoy working for you, and would prefer you over other employers' - leaving it open that there are indeed other employers actually ringing :) To keep your intentions above board, make sure your first calls are to the employers that you actually do enjoy working for more.
  6. File your gate and recenter the lens. Done! :)
  7. That is if you use the start/stop single-frame option - and even then, they are pretty solid units... If you freewheel the drive ('0' I think on the dial), then the two shafts on the outside are yours to operate as you see fit - if I recall, one is 1:1 and the other 8:1. Design a stepper/servo/Arduino/whatever based intervalometer that mates to these and you have a sweet variable shutter angle intervalometer, with almost 0deg to almost 360deg depending on the period. You could program it with a computer and, assuming some kind of smartphone-to-Arduino communication exists (I'd bet money on it), go down that route also :) A rough sketch of the idea follows.
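To make the idea concrete, here's a minimal Arduino-style sketch (C++), assuming a step/dir stepper driver is mated to the 1:1 single-frame shaft and that one full shaft revolution advances one frame - the pin numbers, steps per revolution and timing values are all placeholder assumptions, not tested against any real camera:

```cpp
// Hypothetical Arduino intervalometer sketch, assuming a step/dir stepper driver
// wired to the camera's 1:1 single-frame shaft. Pins and constants are placeholders.

const int STEP_PIN = 2;
const int DIR_PIN  = 3;

const long STEPS_PER_FRAME = 200;                // one full shaft revolution = one frame (assumed)
const unsigned long FRAME_INTERVAL_MS = 5000;    // time-lapse interval between frames
const unsigned long EXPOSURE_MS       = 1000;    // time taken to turn the shaft through one frame

void stepOnce(unsigned long pulseUs) {
  digitalWrite(STEP_PIN, HIGH);
  delayMicroseconds(pulseUs);
  digitalWrite(STEP_PIN, LOW);
  delayMicroseconds(pulseUs);
}

void exposeFrame() {
  // Spread STEPS_PER_FRAME evenly over EXPOSURE_MS: turning the shaft more slowly
  // lengthens the time the mechanical shutter spends open on this frame.
  unsigned long usPerStep = (EXPOSURE_MS * 1000UL) / STEPS_PER_FRAME;
  for (long i = 0; i < STEPS_PER_FRAME; i++) {
    stepOnce(usPerStep / 2);
  }
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  digitalWrite(DIR_PIN, HIGH);                   // direction depends on how the motor is mated
}

void loop() {
  exposeFrame();
  delay(FRAME_INTERVAL_MS - EXPOSURE_MS);        // wait out the rest of the interval
}
```

Since the frame is exposed while the shaft turns, stretching or shrinking EXPOSURE_MS relative to FRAME_INTERVAL_MS is what gives the near-0deg to near-360deg effective shutter angle range mentioned above.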
  8. You've got a few considerations to define if you want to discuss this succinctly - let alone develop the code that achieves the manipulations. I mentioned this on another thread: software is very opaque about what it is doing under the hood, to the point of internal inconsistency in the user interface, leading to all sorts of ingrained misconceptions in laypeople about how images are composed as data (e.g. Photoshop's use of 'ppi' (pixels per inch)). All I can suggest is that people think more, which it appears the OP is doing :) Anyway, 99%+ of the time it's not an issue - if it looks good, it is good - I guess it might be relevant when developing your own pipelines, or translating to different representations and applications... A toy example of the ppi point follows.
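As a toy illustration of the 'ppi' point (C++, with made-up numbers): the pixel dimensions are the actual image data, while the ppi tag is only metadata implying a print size.

```cpp
// 'ppi' is print metadata, not image data: the pixel array is the image,
// ppi only says how large it would print.
#include <cstdio>

int main() {
    const int widthPx = 4096;                    // the actual image data dimension
    const double ppiTags[] = {72.0, 300.0};      // two different metadata tags

    for (double ppi : ppiTags) {
        // Same 4096 pixels either way; only the implied print width changes.
        std::printf("%d px tagged at %.0f ppi -> %.2f inch print width\n",
                    widthPx, ppi, widthPx / ppi);
    }
    return 0;
}
```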
  9. Well, it could be... At least, in my mind the option should be there. The reason I've often resorted to coding my own image processing applications is because After Effects et al. are a black box when it comes to some simple things - the algorithms are too presumptuous and 'clever' for my purposes, with dumbed-down parameters that can even appear to contradict themselves - although I agree that if someone less interested in this stuff doesn't know what I'm talking about, then maybe it is perfect for them... I think we're in agreement :) ersatz <<=>> "of course it's up to you how 'constructive' you see the end result turning out."
  10. I don't know the ins and outs of this in terms of the actual software workflows, especially with Red footage in Avid. But conceptually it's pretty simple and, as David points out, you pretty much have two temporally interleaved streams of 1/100s exposures. Up to you which one you want to extract, odd or even :) You could also blend every couple of frames into one and achieve a kind of 180deg shutter that way. Keep in mind that the 180deg is composed not of one 'half' exposure, but of two 90deg quarters separated by 90deg of non-exposure and summed - something that might look interesting if you were to do a frame-by-frame analysis. Not sure if Avid would understand that or not, but it could be done easily enough in your own code/console app (a rough sketch follows), once you had spent the time learning how to access, manipulate and save the values of the frames non-destructively - that is, outside of the effect you apply, which by nature will be 'destructive' in terms of pure information - of course it's up to you how 'constructive' you see the end result turning out. I'd hesitate to call it time remapping when the divisors are not only nice integers but factors of 2. Time remapping, at least for me, implies the more complex process of frame-to-frame interpolation.
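For what it's worth, here's a rough openCV (C++) sketch of both manipulations, assuming the clip has been exported to something openCV can read - it isn't tied to any real Red/Avid workflow. Keep one frame of each pair to extract a single interleaved stream, or average the pair to get the summed "two 90deg quarters":

```cpp
// Minimal sketch: read frame pairs, either keep one of them (stream extraction)
// or average them (blended 180deg-style shutter). File names are placeholders.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture in("input.mov");
    if (!in.isOpened()) return 1;

    double fps = in.get(cv::CAP_PROP_FPS);
    cv::Size size((int)in.get(cv::CAP_PROP_FRAME_WIDTH),
                  (int)in.get(cv::CAP_PROP_FRAME_HEIGHT));

    // Output runs at half the input frame rate: one frame per input pair.
    cv::VideoWriter out("blended.avi",
                        cv::VideoWriter::fourcc('M','J','P','G'),
                        fps / 2.0, size);

    cv::Mat a, b, blended;
    while (in.read(a) && in.read(b)) {
        // Option 1: write only 'a' (or only 'b') -> extract one interleaved stream.
        // Option 2 (below): average the pair -> the summed pair of 90deg exposures.
        cv::addWeighted(a, 0.5, b, 0.5, 0.0, blended);
        out.write(blended);
    }
    return 0;
}
```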
  11. You can say there could have been improvements just as much as you can say there was no more room for them. Maybe Tyler has something up his sleeve - maybe he is reinventing a wheel... Point is that it's an open question :)
  12. I mean that the gender-specific terminology has been solved with 'DoP' and 'cinematographer'. I'm not saying it's all that is required to solve the gender imbalance in the industry - but it's something. Your links are examples of the gender-specific terminology problem in its base state, but I figured we had already established that as a given. It's part of what we're talking about, huh?
  13. Well, keep in mind - it's not a beachball - it is what it is, and what it is we don't really know yet. Best to just wait and consider yourself 'teased' in the meantime :) I can certainly relate to this! I'm back at university and the people I spend a lot of my time with share many common interests (that's why we are studying what we are studying), yet I still find that major gulf you mention with lots of things in film and music. Some of these people haven't even seen the new trilogy - that's just an age thing I need to get my head around - but what I find interesting is that they don't share the realization that some parts of film are purely there to tick boxes for a target demographic's expectations. Maybe film always was like this, montages of fulfillment, and maybe I was just like them in the 80's lapping it up - I'd like to think it wasn't, but I find it hard to be objective.
  14. Why would that change the arguments already presented? Unless you're pointing out the implied gender in the term 'cameraman'? In which case it's OK for a male to apply for the position of camerawoman etc. (?) The terms DoP and cinematographer solved that already... Edit... and by saying 'guys' - is that another point made? :)
  15. Well you obviously have real experience with this stuff compared to me so of course I'll defer but jeez, way to shoot my dream down... :D At least I can still grasp on to a fantasy/alternate universe, where that technology wasn't superseded by digital - but instead grew and evolved into something we just don't have today, but was still at heart, analog, composite, scale model, film based etc. Of course digital could still be thrown in, but at the death star 'many Bothans died to bring us this information' projection level - Tron, Last Star Fighter kind of stuff
  16. Would be interesting to see opticals done with today's technology - and by that I mean film stock/grain improvements, glass/filters, positioning device precision etc. And done by people with a good budget and a legacy to upkeep. Reading about Return of the Jedi in Cinefex was really interesting; a lot of the challenge there was the management and timing of all the footage. Maybe try a 'lean' approach ... I'd love to be involved.
  17. I've always taken an interest in the personality types that get attracted to certain positions... Of the two location scouts I've gotten to know a bit better, they were both good listeners, down to earth/personable, and both very 'ok, let's get it done' types. I think the key in your case is getting out and talking about your ideas - talk, talk, talk - the more exposure you have to people's ideas, the easier it's going to be, huh?
  18. Although it seems I'm talking to myself ... I have the camera here now - it can run 120fps out of the box, but the algorithmic load of the markerless tracking that I've managed to optimise so far means the system is realistically running at around 76fps. I just drop frames and use the latest available whenever a new frame arrives before I've finished with the previous one (roughly as sketched below). I get the feeling there is maybe one thing I'm not noticing that might sneak the process up to 100fps - but that's just for personal satisfaction, it's running quite well otherwise. All good :)
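For anyone curious, the "drop frames, use the latest" part looks roughly like this (openCV C++; the camera index and fps request are assumptions): a capture thread keeps overwriting a single shared frame, and the slower tracking loop copies whatever is newest when it's ready for more work.

```cpp
// Latest-frame-wins capture: older frames are silently dropped when the
// processing loop can't keep up with the camera.
#include <opencv2/opencv.hpp>
#include <atomic>
#include <mutex>
#include <thread>

int main() {
    cv::VideoCapture cap(0);
    cap.set(cv::CAP_PROP_FPS, 120);              // request 120fps; the driver may ignore it

    cv::Mat latest;
    std::mutex m;
    std::atomic<bool> running{true};

    std::thread grabber([&] {
        cv::Mat frame;
        while (running && cap.read(frame)) {
            std::lock_guard<std::mutex> lock(m);
            frame.copyTo(latest);                // overwrite: unconsumed frames are dropped
        }
    });

    cv::Mat work;
    while (running) {
        {
            std::lock_guard<std::mutex> lock(m);
            if (latest.empty()) continue;
            latest.copyTo(work);
        }
        // ... run the slower (~76fps) markerless tracking pass on 'work' here ...
        cv::imshow("latest", work);
        if (cv::waitKey(1) == 27) running = false;   // Esc to quit
    }
    grabber.join();
    return 0;
}
```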
  19. I'm finishing off study in a related field - related in my mind at least - and one thing I learned, and feel better for, is that academics aren't 'all that'. But unfortunately the environment they're in can perpetuate an air of superiority. Maybe it's not your personality, but you could also try 'critically reviewing what the teachers have done', especially since your fees have been paying their wages all this time. Once you approach it like a business decision and then go about the business of learning, with expectations of reciprocation - i.e. money in, work/help/learning out - then you're on a level playing field and all the better for it. I'm not saying disrespect your teachers, but relate to them with professional courtesy as you would anyone you're working with on set. It's true some teachers, at least in my case, don't like it at all (2 of them), but if you're playing level then it's them who need to adjust :) Maybe this doesn't relate to you, but in case it does - try it out :)
  20. I remember working for a discovery channel documentary many moons ago in the UK organizing permits for the UK DoP and crew to shoot/work temporarily in USA, Canada and Zanzibar. At the time and by far, Canada was the hardest nut to crack bureaucracy wise... Not saying much I guess - things change :)
  21. Oops! Duh, it's kind of obvious - you can't track a face without markers if the images you are projecting onto it make it look like anything but a face. I guess since you know what you're projecting, you could do something with differences - but the bit depth of the resulting 'image' would be hard to work with (a toy sketch of the idea follows).
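Purely to illustrate the "differences" idea (openCV C++, with invented file names, and with the geometric/photometric alignment problem waved away): subtract the known projected content from the camera frame and you're left with a faint residual to track against.

```cpp
// Toy difference-image sketch: camera frame minus known projected content.
#include <opencv2/opencv.hpp>

int main() {
    // Hypothetical inputs, assumed already warped/aligned into the same space
    // (which is the genuinely hard part in practice).
    cv::Mat cameraFrame      = cv::imread("camera_frame.png");
    cv::Mat projectedContent = cv::imread("projected_content.png");
    if (cameraFrame.empty() || projectedContent.empty()) return 1;

    cv::Mat residual;
    cv::absdiff(cameraFrame, projectedContent, residual);

    // The residual is mostly near-black with faint face detail - the low usable
    // bit depth mentioned above - but it's what a difference-based tracker would see.
    cv::imwrite("residual.png", residual);
    return 0;
}
```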
  22. Yes, that's unclear; with undefined DoF, general 'circular motion' could mean a multitude of things. Assuming the simplest cases still leaves questions: by rotation do you mean a simple pan, or maybe the camera rotating around the subject while pointing at it? The latter could in turn be done with an extending arm and a pan for small subjects (and a large crane), or a full circular track - very different setups and costs. Also the word 'tracking': do you mean the operator tracking talent on the fly, or the crane tracking its own movement repeatedly, i.e. motion control?
  23. There are hundreds :) One that I found took a fair bit of effort and is better for it:
  24. You can see the retroreflective dots (Scotchlite paint?) in action throughout the video - you'll note that they are bright in the frontal view but dull in the side shot. Not sure why they are using them yet, as markerless tracking has been a viable option for a while now; markers make the system much more robust, but then again, you've got to both apply them and then remove them - or leave them be, as in this case, since they don't detract from the aesthetic. I think the green lines are added, if not even a red herring for the geeks out there? (Keen to learn, however, if I'm incorrect in that assumption.) From a bit of web hunting and looking at the first few shots from the video I've found this: https://www.naturalpoint.com/optitrack/products/v120-trio/ As it says, plug-and-play ... 6DoF and IR or not as you choose. A bit out of my price range for the intents of my project; however, it is likely I'll opt for two of the sensors that are used in the product: https://www.naturalpoint.com/optitrack/products/v120-slim/ It's my bag that I have to develop the backend and GUI for the application :D Lots of jargon - 'Haar features/Eigenfaces/AAM/POSIT/etc.' - but all doable, especially with libraries like openCV available (a minimal face-detection sketch follows). Reading the original papers from the 90's is interesting, with them talking about 1fps rates; it looks like it's more around 100fps now, although I imagine careful consideration of the algorithmic complexity would be required to avoid a brute-force approach. I have control over the setting, so it can arguably be less general and be optimised for specific cases - i.e. certain faces - which might afford a kind of offline pre-processing and might speed things up... Developing on my webcam at the moment is crippling once I extend the features beyond face and eyes, e.g. add 'pose' (pitch, roll and yaw) - let alone eye and mouth movement... Work in progress, however! Although the video is great, it was just one example of many and I meant the question in a more general sense - so OK, Avatar was just standard video, but was it real time at any stage, maybe for visualization in video village?
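A minimal sketch of the Haar-cascade face detection step (openCV C++), using the stock frontal-face cascade that ships with openCV; the cascade path and camera index are placeholders, and pose estimation (POSIT/AAM etc.) would be further steps on top of this.

```cpp
// Basic Haar-cascade face detection loop on a webcam feed.
#include <opencv2/opencv.hpp>

int main() {
    cv::CascadeClassifier faces;
    if (!faces.load("haarcascade_frontalface_default.xml")) return 1;

    cv::VideoCapture cam(0);                     // the development webcam
    cv::Mat frame, gray;
    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::equalizeHist(gray, gray);

        std::vector<cv::Rect> detections;
        // Restricting minSize (and, once a face is locked, the search region) is the
        // sort of optimisation that moves this from brute force toward real time.
        faces.detectMultiScale(gray, detections, 1.1, 3, 0, cv::Size(80, 80));

        for (const auto &r : detections)
            cv::rectangle(frame, r, cv::Scalar(0, 255, 0), 2);

        cv::imshow("faces", frame);
        if (cv::waitKey(1) == 27) break;         // Esc to quit
    }
    return 0;
}
```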