
Posts posted by wolfgang haak

  1. Adrian, Alex,

     

    I like your idea of going the CGI route, Adrian. Well, your geek alert sirens should go off now:

    Yes, of course it's possible to CG the stuff, but to have it look convincing, bear a few things in mind:

     

    Motion: A person is not a rigid body. Matchmoving is more complicated because skin stretches and deforms, and so does cloth. Shooting the clothes separately and then compositing them in AE will probably look like they were stuck on in post.

    Transparency: If there are any transparent parts in the clothes, this approach will not work, as the skin needs to be seen underneath, together with the appropriate changes to lighting.

    Lighting: Most fabrics show interesting anisotropic effects. Brrr... With the above method, the replaced part may not match the rest of the shot in terms of lighting.

     

    Add to this the complications of fabric creases and folds, and the distortion of clothes due to walking bounce and air drag. Brrrr... IMHO, it's not as simple as sticking a few track markers on a lady and loading the trajectories into a post package.

     

    If you go the CG route, I would think you need some consulting from pros. Approach a company with experience and get their opinion on the matter.

    Cloth simulation is quite advanced these days, and it may be easier to shoot the "talent" naked and add her clothes entirely in CG. The final result may not hold up to the most critical of eyes (unless you are prepared to part with substantial amounts of cash), but this approach would give you a high degree of consistency across the shots and thus be more believable.

     

    kind regards,

    Wolfgang

  2. Karel,

     

    For long-distance shots with no foreground, mount three cameras in parallel on a rig and sync your footage. Bracket your exposure across the three cameras. With not too much parallax you might get away with it.

    Crystal sync for film or some computerised affair for digital cams.

     

     

    Have fun!

    Wolfgang

  3. What should I do to get the max possible latitude out of color neg film? Would overexposing by a stop and then push processing by minus 1 help?

     

    I want to shoot some tests to see if I can extract enough image information in the shadows and highlights to do HDR processing.

     

    HDR is this kind of stuff

     

    Cheers!

     

    Karel,

    I know as much about film stocks as I do about open heart surgery, so I'll leave it to the pros to answer that one. But this is what I know about HDR(I):

    The footage you linked to was done by overcooking a process called tone mapping. It tends to oversaturate colours. I don't like the look most of the time myself, but that's down to personal taste. Check http://www.hdrsoft.com/ for examples.

     

    - From neg: scan the same neg multiple times at different exposures (i.e. adjusting the scanner back light) and process the passes in an HDRI package.

    This works with still images; LaserSoft do a package called SilverFast Ai that will let you operate most scanners this way.

     

    - DI: adjust the back light/laser/etc., make multiple passes, and process in your favourite package.

     

    - From one DNG/RAW file: process multiple TIFFs (or similar) from your DNGs in 16-bit mode, pushing and pulling the exposure for each pass by 1-2 stops. Process in your favourite package.

     

    - Time lapse: you could shoot with an off-the-shelf DSLR using an exposure bracketing program. In quick succession no-one will notice the motion artefacts.

    Hacked Canon cameras are good fun for this: http://chdk.wikia.com/wiki/CHDK (and see the little merge sketch below).
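
    Since most of the options above boil down to merging differently exposed versions of the same frame, here is a minimal numpy/Pillow sketch of that merge step. The file names, exposure times and hat-shaped weighting are purely illustrative assumptions; a proper HDRI package does the same job with real response-curve calibration.

```python
import numpy as np
from PIL import Image

# Hypothetical bracketed frames and their exposure times in seconds (illustrative values).
files = ["bracket_under.tif", "bracket_normal.tif", "bracket_over.tif"]
times = [1.0 / 200, 1.0 / 50, 1.0 / 12]

def weight(v):
    """Hat-shaped weight: trust mid-tones, distrust near-black and near-white pixels."""
    return 1.0 - np.abs(v - 0.5) * 2.0

acc, wsum = 0.0, 0.0
for fname, t in zip(files, times):
    img = np.asarray(Image.open(fname).convert("RGB"), dtype=np.float64) / 255.0
    w = weight(img)
    acc += w * (img / t)        # crude linear radiance estimate for this exposure
    wsum += w

hdr = acc / np.maximum(wsum, 1e-6)   # weighted average = merged high dynamic range image
np.save("merged_hdr.npy", hdr)       # tone-map / grade it in your favourite package afterwards
```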

     

    regards,

    Wolfgang

  4. Hunter et.al. ,

     

    do you mind if I play bad cop and ask a few questions?

    Flickr... You are aware that Flickr processes all uploaded files? Dropped are all your embedded profiles; what's left is a pile of mystery meat - numbers without meaning.

    Secondly (I have to ask), is your computer/monitor profiled and calibrated, and is your imaging software set up correctly?

     

    I may make myself a touch unpopular, but these are the sort of crucial points that give digital imaging its bad reputation. The sensors of today's DSLRs go far beyond the hue range that can be displayed on any CRT/TFT screen. Mapping the (calibrated) sensor readout to the Adobe RGB colour space means tonal compression/clipping. Mapping it to ProPhoto RGB captures the hue range (but requires at least 16 bits of information), yet there's not a screen out there that can show that range. With ProPhoto you clip the colours not in the file, but on output.

     

    Most of today's sensors capture 12 bits of linear data, which is then down-converted by an embedded chip using factory presets (these assume they're looking at an average grey subject in average lighting conditions).
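
    To put a number on that down-conversion, here is a toy numpy sketch (my own illustration, not any manufacturer's actual pipeline): 12-bit linear values get a simple 1/2.2 viewing gamma and are quantised to 8 bits, and you can watch where the tonal information gets squeezed.

```python
import numpy as np

# A synthetic ramp of every possible 12-bit linear sensor code value (0..4095).
linear12 = np.arange(0, 4096, dtype=np.float64)

# Normalise, apply a simple 1/2.2 viewing gamma (a stand-in for the factory "look"),
# then quantise to the 8-bit range a typical file/screen expects.
encoded = (linear12 / 4095.0) ** (1.0 / 2.2)
eight_bit = np.round(encoded * 255).astype(np.uint8)

# Gamma encoding spends its 8-bit codes on the shadows, so dozens of distinct
# 12-bit highlight values collapse onto a single 8-bit value.
print("12-bit codes landing on 8-bit value 254:", np.sum(eight_bit == 254))
print("12-bit codes landing on 8-bit value  16:", np.sum(eight_bit == 16))
```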

    Say a photographer complained about a lack of latitude and colour range in his film, but had left the development of his roll to a corner lab that doesn't know what the numbers and abbreviations on the tin mean - you would not be surprised if the result was disappointing.

    But it would be illogical to conclude that the film was at fault; more likely it was used incorrectly. There's a host of things to know about films and processing. And there is an equally confusing and sparsely documented mountain of technology to learn when it comes to digital images.

     

    Just bear this in mind: if you checked this post within two hours of turning your computer on, then the internal working temperature, white point, SCD and brightness of your screen will have shifted in the time it took you to read this post. Never mind the cast from your blue/red/whatever t-shirt since you took your mid-grey jumper off!

     

    I do not wish to proclaim that one way of doing things is superior/more advanced/advantageous over another, but I would say that with an equal amount of effort invested, the balance leans towards film. That means there's a pile of things to get right before digital can start living up to its benefits.

     

    Food for thought.

    Wolfgang

  5. Hi Joshua,

     

    You will have heard the term "bit depth". This refers to the range of values available to encode each colour channel.

    An 8-bit image contains 8 bits per channel, so 2^8 = 256 values. This allows for 256^3, roughly 16.7 million colours.

     

    To store and organise this data on a computer, you would choose an unsigned integer format - nice and simple.

    For larger bit depths, e.g. 32 bit = 2^32 values per channel, you need a better container to store that amount of data, as an integer runs out pretty quickly (in programming, variables cannot take infinitely large values), so you need something better suited to the job.

    The variable type used here is a float (float32) or a half float (float16), with 32 bits and 16 bits respectively.

     

    The terms used in the post packages here refer to the data types used to store the information, and in a float you can store greater numbers.

    Unless a specific package makes ambiguous use of the terms, the word float has no bearing on colour space, linear/encoded gamma, etc. Think of float as "massive", half float as "big" and integer as "standard".
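
    If you want to poke at this yourself, here is a small numpy sketch (my own illustration of the containers mentioned above):

```python
import numpy as np

# Integer containers: fixed range, evenly spaced code values.
print(np.iinfo(np.uint8))     # 0 .. 255   ("standard" 8-bit channel)
print(np.iinfo(np.uint16))    # 0 .. 65535 (16-bit integer channel)

# Float containers: a huge range, with precision concentrated where it is needed.
print(np.finfo(np.float16))   # half float ("big")     -- max around 6.5e4
print(np.finfo(np.float32))   # float      ("massive") -- max around 3.4e38

# The practical upshot: a float pixel can hold values above 1.0 (over-bright
# highlights, HDR data); an 8-bit integer pixel simply clips.
highlight = np.float32(4.7)                     # linear-light highlight value
clipped = np.uint8(np.clip(highlight * 255, 0, 255))
print(highlight, clipped)                       # 4.7 survives in float, 255 in uint8
```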

     

    Hope it helps,

    regards,

    Wolfgang

  6. Not sure why my link disappeared.

     

    http://www.flickr.com/photos/33094657@N08/...57610860482304/

     

    All of my first attempts had a really strong cyan cast in the negative, making the positive really red. I don't really understand why but maybe Karl does?

     

    I think the problem, at least the problem I have had doing this with Super 8, is super thin negatives. It is really difficult to get anything off them. A bit more experimenting to get them a little thicker is probably worth trying. Probably best to experiment with 16mm reversal.

     

     

    Grant,

    Your flickr link includes a pic of a tree, rather beautiful stuff, but you say you experimented with developing in "coffee"?

    I suppose you are only referring to the effect you got, not that you actually used coffee as a dye in the process.

    lovely stuff anyway.

     

    wolfgang

  7. Yashad,

     

    check out the tubes; there is usually a three-digit number on there: 835, 550, etc. This has to do with the colour. The first digit is the CRI (Colour Rendering Index), an indication of how good or bad the SPD (Spectral Power Distribution) is. Anything with an 8 or 9 is quite good, but chances are you're dealing with a 5 or 6 for cheap tubes.

     

    The last two figures are the white point, so in the above examples it's

     

    835 : CRI = 8, whitepoint = 3500K and

    550 : CRI = 5, whitepoint = 5500K.
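
    Purely as an illustration of the decoding (my own toy function, not anything official from the lamp makers):

```python
def decode_tube_code(code: str):
    """Decode a fluorescent tube colour code such as '835' or '550'.
    First digit -> CRI band (8 means roughly the 80s), last two digits -> white point x 100 K."""
    cri_band = int(code[0]) * 10          # '8' -> CRI somewhere in the 80s
    white_point_k = int(code[1:]) * 100   # '35' -> 3500 K
    return cri_band, white_point_k

for c in ("835", "550"):
    print(c, decode_tube_code(c))   # 835 (80, 3500) / 550 (50, 5500)
```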

     

    Decent tubes are not expensive (~£7/$10 for a 4-foot T8), so you may want to consider re-fitting the shop for your shoot. Check out these bad boys: Full Spectrum Fluorescents

     

    They are good enough that you'll find colour labs, retouching facilities etc. fitted with them!

     

    regards,

    Wolfgang

  8. Yashad,

     

    check out the tubes; there is usually a three-digit number on there: 835, 550, etc. This has to do with the colour. The first digit is the CRI (Colour Rendering Index), an indication of how good or bad the SPD (Spectral Power Distribution) is. Anything with an 8 or 9 is quite good, but chances are you're dealing with a 5 or 6 for cheap tubes.

     

    The last two figures are the white point, so in the above examples it's CRI 8, 3500K and CRI 5, 5500K.

     

    Decent tubes are not expensive (~£7/$10 for a 4-foot T8), so you may want to consider re-fitting the shop for your shoot. Check out these bad boys: Full Spectrum Fluorescents

     

    They are good enough that you'll find colour labs, retouching facilities etc. fitted with them!

     

    regards,

    Wolfgang

  9. I changed the mode from RGB to CMYK: same thing. Note that both PCs' monitors are calibrated.

    What can I do to preserve the best quality from my pictures in that case?

    The whole project will be exported to a DVD.

     

    thanks anyways ..

     

    Shady,

    I'm gonna delve right in:

     

    Calibration.

    • Are the monitors calibrated or profiled? Note that most "calibration" devices available for PC/Mac are actually only profilers: they tell you how your monitor responds to a signal, they do not change the way it displays a signal.

    • Packages like Photoshop will use profiles to alter the signal they send to the monitor, as they know how the monitor responds. The colour info for each pixel is adjusted via a LUT (Look-Up Table) or ICC matrix to find the new value that has to be sent to the monitor: the "error" of the display is added as an offset to the signal in order to achieve the colour that you expect to see (a toy sketch of the idea follows after this list). This technique is error-prone, as there's no limit to the degree of modification in the LUT/matrix. With monitors whose hardware white point and gamut range (amongst other things) are miles off, expect to see colour banding, colour clipping and clamped areas (which look like solarisation in strong hues), etc.

    • What is the calibration target of your displays? If the calibration target is miles away from the display's hardware settings and different from the image colour space, it's not gonna look pretty no matter what you do. See the dog-to-cat conversion below under "Colour Space".

    • Is your software set up right?
      Check your colour settings. In the CS versions of Photoshop: Edit -> Color Settings. Select "Preserve Embedded Profiles" for all three colour management policies (and tick all the warning boxes as well).
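
    Here is the toy sketch promised above of how such a correction LUT works in principle (entirely my own illustration with made-up gamma numbers, not how any particular CMM is implemented):

```python
import numpy as np

# Pretend profiling showed the display behaves like gamma 2.4 when we actually
# want a 2.2 response overall (illustrative numbers only).
measured_gamma, target_gamma = 2.4, 2.2

# Build a 256-entry 1D LUT: for each 8-bit input, the value to *send* so that what
# the display *shows* matches the target response.
x = np.arange(256) / 255.0
lut = np.round(255.0 * x ** (target_gamma / measured_gamma)).astype(np.uint8)

def display_correct(image_8bit: np.ndarray) -> np.ndarray:
    """Run an 8-bit image through the correction LUT before it goes to the monitor."""
    return lut[image_8bit]

# The downside mentioned above: 256 inputs collapse onto fewer distinct outputs,
# which is exactly where banding and clamping creep in.
print("distinct output levels:", len(np.unique(lut)))
```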

     

    Colour Space:

    • Sorry mate, what? CMYK is a four-colour process intended for offset printing presses, splitting the image into its four ink plates. A no-no for what you are doing. Keep a consistent colour workflow and avoid every unnecessary conversion.

    • How did you "change" the mode? Edit -> Assign Profile, or Edit -> Convert to Profile? If you're unsure about ICC profiles, the rule of thumb is: never "assign" a profile. It's like telling a dog to behave like a cat. You can convert a dog into a cat, but as you can imagine it's a non-reversible action that inflicts a loss of quality.

    Compression.

    • Each open and save of a JPEG will result in a degradation of quality. Don't believe it? Take any picture and open and save it a few times on the highest settings. Watch the file size decrease each time (see the little sketch after this list).

    • Choose TIFF format: tick "embed profile" and use LZW compression (which is lossless).
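
    And the little sketch for the JPEG point, using Pillow and numpy (the file names are hypothetical, and the exact numbers will depend on your image and quality setting):

```python
import numpy as np
from PIL import Image

src = "test_frame.jpg"                      # any starting image (hypothetical name)
original = np.asarray(Image.open(src).convert("RGB"), dtype=np.float64)

current = src
for generation in range(1, 11):
    out = f"generation_{generation:02d}.jpg"
    Image.open(current).convert("RGB").save(out, quality=95)   # "highest settings"
    current = out

final = np.asarray(Image.open(current).convert("RGB"), dtype=np.float64)
rms_error = np.sqrt(np.mean((original - final) ** 2))
print(f"RMS difference after 10 open/save cycles: {rms_error:.2f} (out of 255)")
```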

    hope this helps a bit!

    regards,

    Wolfgang

  10. I don't think the conversion from 32-bit linear to 10-bit LOG is as simple as throwing away 22 bits. Most film scans are 10-bit LOG, whereas a lot of computer effects work is done in linear and converted later to LOG. LOG is one method of compressing dynamic range into a curve suitable for film-out work.

     

     

    That's right, David. The digital imaging workflow is linear: a tonal correction curve (aka 'gamma') is applied by the software for viewing on screen, but the files remain linear.

     

    Some idea of the numbers:

    Data amounts: a 10-bit file can hold up to 2^10 = 1024 values per channel (binary, 10 bits!).

    A 32-bit file: 2^32 = 4,294,967,296 values per channel.

     

     

    To perform that tonal compression by simply truncating would mean throwing away 4,294,966,272 values per channel. That would be somewhat insane, and it's certainly not what they are doing: the log format is more subtle than that. Here's a snippet from cineon.com:

     

     

    "2.1 Full Latitude

    The Cineon scanner is calibrated for a 2.048 density range: this allows it to capture the full latitude (density range) of the negative film with some margin at top and bottom. The scanner light source is balanced on film dmin so that the resulting digital image will have a neutral color balance if the film were exposed at the correct color temperature. The Digital Negative includes significant headroom above the nominal white point to handle over-exposed negative films and scenes with wide contrast range.

    2.2 10 bits

    With 10 bits per color over a 2.048 density range, the resulting quantization step size is 0.002 D per code value. This is below the threshold for contour visibility, which insures that no contour artifacts (also known as mach banding) will be visible in images. Furthermore, having 10 bits rather than 8 bits allows the Cineon scanner to capture the extended headroom of the negative film. "
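
    Just to put numbers on the quoted paragraph, a tiny sketch using only the figures from the excerpt (10 bits over a 2.048 density range); this is not the full Cineon transfer function, just the quantisation arithmetic:

```python
# Figures from the Cineon excerpt above: 10 bits over a 2.048 printing-density range.
CODE_VALUES = 2 ** 10                     # 1024 code values per channel
DENSITY_RANGE = 2.048
STEP = DENSITY_RANGE / CODE_VALUES        # 0.002 D per code value, as quoted
print(f"quantisation step: {STEP:.3f} D per code value")

def density_to_code(density: float) -> int:
    """Map a printing density within the scanner's range to a 10-bit code value."""
    code = round(density / STEP)
    return max(0, min(CODE_VALUES - 1, code))

print(density_to_code(0.0), density_to_code(1.024), density_to_code(2.046))   # 0 512 1023
```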

     

     

    hope this sheds some light on the issue...

    Wolfgang

  11. Karl,

     

    I know from a friend of mine who works for a big digital VFX house (DNeg) that they scan everything to 32-bit EXR. I've noticed that EXRs aren't popular around here, but they do have their merits, especially as nearly all VFX programs can output them. The workflow starts at 4K 32-bit from the DI scans and stays there until the whole shebang gets printed back to film for release. For the biggest productions (The Dark Knight) they filmed on 70mm/35mm, scanned to 8K and worked at 8K 32-bit all the way through. As their facility manager said:

    "The real challenge is how much electricity you can get into a facility to run all that storage..."

    With off-the-shelf software, 4K at 32 bit remains the technical threshold for the time being where the failure (crash) rate in post stays manageable.

     

    Hellboy II / The Dark Knight are heavily funded projects, but you can scan at higher bit depths, and for the big boys it's routine. For my day-to-day work I render to 32-bit EXR as well when producing 100% CGI work.

     

    scary, memory eating stuff ;)

    Wolfgang

  12. Changes in technology and workflow will always influence the outcome.

     

    So do films that go through a DI have the same look as film in the 90s? Inevitably not. Film stocks have changed since the 90s as well, so the result changes on that front too.

     

    Isn't this more a question of preference, though? Do you like what you see on screen? After all, only the people on set know what the "real thing" looked like. Just a thought.

    wolfgang

  13. Tom,

     

     

    do some practical testing. I've got a Nizo 481 knocking around here; chassis-wise it is very similar to your 801.

     

    Aperture testing:

    Without film in, open the film loading bay and turn the camera on. If you have a VarioShutter lever (where the grip meets the camera, left hand side, for manual fade-outs), pull it back and press the little button beside it. This opens the shutter permanently. Set the interval timer to single shot and hold the trigger.

     

    Look through the loading bay, directly through the lens, as you move the camera from dark to light, and watch whether you can see the aperture open and close. It should be really quick and responsive, and close virtually completely if you point the camera at a light bulb.

     

    This will tell you if the aperture is working correctly.

    I may be able to mail you some pictures of mine later on today if you like.

     

    With the vario lever and interval timer back in their normal positions, squeeze the trigger (and look through the bay). At different running speeds you should notice that the shutter has different speeds. This is best done on a tripod against an even target: set the aperture to a fixed setting (dial, top left) and observe whether the "average" brightness through the lens changes when filming at 54fps vs 18fps.

    Not a scientific approach, but a good starting point.

     

     

    Light meter testing (in the absence of a separate light meter):

    Pop a cartridge in, exposed or otherwise, so the camera takes a correct reading of the notches. Point it at an even surface (a wall) and take a reading of the aperture, with the frame rate set to 18 fps. The shutter speed is now 1/36s.

     

    Take a (digital) SLR, take a reading from the same wall with the camera set to the same ISO as your film and the speed to 1/35s or the nearest setting, and compare the aperture reading.

     

    Not scientific, but you should be able to work out whether the camera's light meter does stupid things.

     

    hope it helps,

    Wolfgang

  14. Hi, interesting thread.

     

    I'm not gonna spoil the party, but if you choose to use some DIY fog, it's a good idea to read up on how these things work and to be on the cautious side using them.

     

    The cheapest and most effective smoke comes from zinc-chloride based chemicals, and using them requires taking out indemnity insurance that covers you for negligent homicide. :(

     

    Most plumbing smoke is rated non-toxic, which relates to the typical concentration a worker may be exposed to while doing plumbing. That's not the same as "safe to inhale for hours" on set! The smoke may be non-toxic, but the displacement of O2 in the breathing air is not!

     

    Never mind that actors could scald themselves on steam...

    Have you considered taking apart cheap irons? Or even hiring steam machines? There are tons of them: ironing, steam cleaning...

     

    regards,

    Wolfgang

  15. I always go for a mattebox so that you can control your light a bit more (if you're using flags and doors). Any kind would work fine as long as it has at least two filter holders. If it has two (or more) filter holders one of them will be rotating for your grads and such.

     

    As for filters, go to a rental house and ask if they have any that they're about to throw away, if they do they'll give them to you for free.

     

    Hope that helps :) .

     

     

     

    Steve,

     

    Good tip with the rental houses... I hope you live nowhere near me so there are some left by the time I make my rounds... ;)

     

    Since I don't have the camera here yet, what are the typical connector/fitting issues?
    (i.e. what, other than lens diameter, do I need to take into account to make sure the box is going to fit? The manual I found says the lens thread diameter is 67mm.)

    Would you recommend anything in particular? This is really not my area of expertise...

     

    kind regards,

    Wolfgang

  16. Hi everyone,

     

    I'm starting a new little project in a couple of months, my "serious" project on film. I'm planning to shoot mainly on a Nizo 4080, which I will get in about a month's time. The script is still in the muddled middle of progress, but it looks like a few shots will need a couple of filters. To my knowledge the lens has an SLR-style thread for screw-in filters, but I was wondering if I should splash out on a matte box instead.

    What is your experience?

    And if a matte box, which ones can you recommend? Nizo/EWA/...

    What's a good source for filters?

     

    All help and tips much appreciated!

     

    regards,

    Wolfgang

  17. I happen to own a Nizo S480, and I wanted to do some cheesy-but-cool pull-focus shots. I usually do some dry runs for a shoot, meaning I operate all the camera controls/movements as if it was the real shot, but without squeezing the trigger.

    As you are probably just as skint as me, and want to save on precious film, you'll be doing the same... My dry runs of pull-focus shots resulted in so much camera shake (hand held) that I quickly changed my mind and took the shots differently, without the fancy effect.

     

    But while out DIY shopping for some home improvements, my eyes rested on a large selection of jubilee clips, and my mind began to work.

     

    Just have a look:

    Super8_PullFocus04.PNG

     

     

    Part list (UK)

    1x 35-50 mm (size 2A) Jubilee Clip £0.55

    1x 75 x 4mm Machine screw £0.15

    1x 3.2mm Steel Drill £2.75

    1x 4mm Tap (thread cutter) £1.95

    1x Ikea Drawer Knob (pack of two) £2.00

     

    Instructions: 1) Secure the jubilee clip in a vice; it's best to clamp it between scrap pieces of soft wood, which give better grip.

    2) Drill a 3.2mm hole down the end of the jubilee clip adjustment screw, and cut the 4mm thread into it.

    3) Cut the head off the 4x75mm machine screw.

    4) Put some self adhesive foam or electrical tape on the inside of the jubilee clip to protect the lens cover from scratching.

    5) Tighten the jubilee clip so it just slides onto the lens, then give it only a 1/2 turn or so to stop it from slipping. Done.

     

    Let me know how you get on if you do your own!

  18. Thanks guys, I'll look into your suggestions; let's see how the shoot turns out in a couple of days or weeks...

     

     

    I just found this:

    GK-Film

     

    The text is all in German, but the manufacturer claims that their "high precision contact plate" allows newer film stock to run in old cameras, reducing drag and stuck film. In addition they claim it'll improve focus at the film edges, as the plate reduces film movement.

     

    I haven't tried it yet, but if things are getting desperate out there with Kodak cartridges, why not give these guys a call?

     

    regards,

    Wolfgang

  19. Peter,

     

    you've probably noticed that forcing a TFT to run at anything other than its native res makes your eyes hurt. Expect a jagged image, tired eyes, and a visit to your trusted optometrist.

     

    But on a serious note, bear this in mind: because calibration software uses solid patches on the screen for measuring, it will still work. The interpolated pixels of a screen running outside its native res, however, mean that fine detail will get lost and colour perception can change!

     

    By all means get a screen that fits your budget (I would), but I would make sure it has a res higher than HD, and watch your footage "letterboxed". There is a case to be made for striving for technical perfection in a system setup, but IMHO in every project there comes a point where the artistic skill of those involved outweighs the merits of academic number-crunching.

     

    You can grade your film till the cows come home; you know the majority of people who watch it will do so with a random combination of a) a burnt-out screen, b) coloured specs, c) glare and reflections on the screen, d) before their morning coffee, hung over and in a bad mood. Oh stop it Wolfgang, you grumpy old man! Not all clients are from hell...

     

    Wolfgang

  20. How does the meter know what is 'correct'??

     

    Jim,

     

    a lot of light meters, like the Sekonic ones, have a shutter speed mode for SLR cameras as well as an fps (cine) setting. With an open (180 degree) shutter on the camera, the exposure time per frame is

     

    t = shutter angle / (360 x fps), so with a 180 degree shutter, t = 1 / (2 x fps).

     

    So the light meter assumes that this is the correct shutter speed.
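
    As a sanity check, the relationship fits in a few lines (my own helper function, assuming the usual 180 degree shutter):

```python
def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame in seconds: t = shutter_angle / (360 * fps)."""
    return shutter_angle / (360.0 * fps)

for fps in (18, 24, 54):
    t = exposure_time(fps)
    print(f"{fps} fps, 180 deg shutter -> 1/{round(1 / t)} s")
# 18 fps -> 1/36 s, 24 fps -> 1/48 s, 54 fps -> 1/108 s
```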

     

    It's all nicely explained here: http://en.wikipedia.org/wiki/Shutter_angle

    But in principle your setup is correct. Bear in mind the exposure meter can't guess what lighting conditions you are filming in; a lot of them assume an 18% grey (mid grey) subject.

    Check out Sekonic's Website for more info.

     

    regards,

    Wolfgang

  21. Peter,

     

    a quick one re: which colour space. PC monitor calibration is based on ICC V4 profiles. The probe/software converts targets via a profile connection space (PCS), which is CIE L*a*b* (1976).

    The calibration works in two parts: 1) send greyscale and RGB component signals via the graphics card, measure them, and use the measurements to hardware-adjust the TFT's colorants to the desired primaries/white point of your calibration target.

    2) Measure and create an ICC profile of the output device for ICC-aware applications to use. Here is the trick: the ICC profile is actually empty (well, it's linear and does not show any deviation). This causes ICC-aware applications to send the full range of RGB signals to the graphics card, unmodified. The calibration software "uploads" the ICC matrix to the screen via USB, where the necessary colour transformation according to the profile takes place.

     

    This is better than software calibration, where the ICC profile causes the application to send less than the full 8-bit range to the screen in order to hit the white point/target primaries. A screen with bright red phosphors, for example, will cause the ICC profile to set the maximum RGB values for that screen to, say, 225, 232, 253, resulting in possible banding when attempting to replicate large-gamut colour spaces, as the remaining values have to "stretch" across the 8 bits of colour data.
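
    A quick way to see the banding argument in numbers, a toy numpy sketch using the illustrative 225 maximum from above:

```python
import numpy as np

# A smooth 0..255 ramp, as an ICC-aware application would like to display it.
ramp = np.arange(256, dtype=np.float64)

# Software calibration: the profile rescales the signal so the red channel never
# exceeds 225 (the illustrative figure above), then it is quantised back to 8 bit.
red_out = np.round(ramp * (225.0 / 255.0)).astype(np.uint8)

print("distinct red levels before:", len(np.unique(ramp.astype(np.uint8))))   # 256
print("distinct red levels after: ", len(np.unique(red_out)))                 # 226
# Thirty levels are gone: neighbouring input values now share an output value,
# which shows up as banding in smooth gradients.
```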

     

    Anyway, just a bit of geeky info. :rolleyes:

    Wolfgang
