Carl Looper

Basic Member
  • Posts

    1,462
  • Joined

  • Last visited

Everything posted by Carl Looper

  1. Who says you can't make a feature film on Super8? Looks like a lotta fun.
  2. Hi Tom, I don't want to raise the heat again since we're in a conciliatory spirit now, but I do need to address this particular point. I think it is quite legitimate to shoot a feature film on Super8. But, yes, one needs to know the obstacles involved. However, the traditional way in which filmmakers have learned their craft is not to jump head first into a feature film, but to work on short films first. That still remains the case today, and if not, then I recommend it to everyone. In film especially, one learns the obstacles very quickly, and learns to overcome them. Or one learns it's too hard - in which case digital might be the answer. Or doing something other than filmmaking. Now most of those working in and discussing Super8 well understand the nature of the medium. They already know the obstacles involved. The question we should be asking of those working in film should not be the rhetorical "why". A far more productive question would be "how". Carl
  3. Hi Tom, Reasons for working in film vary from one filmmaker to another. In my case, a film on which I'm working involves bringing some rolls of 8mm film, shot in the 1930s, to the big screen. Now I'm actually using digital cameras and custom digital signal processing algorithms to do that. The purpose is to restore a moment in time. It's a bit like CSI. A forensic activity. To recover something that was otherwise lost. Now it doesn't involve any actors, so presumably you'll find it uninteresting. Fair enough. Another film on which I'm working does involve actors and shooting Super8. The Super8 will undergo the same process being done for the 1930s film. The purpose of the process is to obtain a result that is much better looking than the way Super8 might otherwise look when projected on a big screen. But it's a good question. Why shoot Super8? The first reason is easy. I just really really like the results of what I'm getting out of Super8. The signal is just so mesmerising. Now I've worked in video, 16mm, 35mm, CGI and digital - so it's not as if I don't know what I'm looking at. I do. What's so crazy for me is that it's Super8. But therein emerges the other reason - that it is Super8. If I shot it on digital, instead of Super8, then it wouldn't be Super8. There wouldn't be that crazy amazement. I guess it's an artistic-technical-historical thing. Carl
  4. My apologies. I didn't realize alloy rims, steel-belted, carbon-reinforced sidewall tires were not wheels. Carl
  5. I should also add that I'm a great advocate of the digital domain - for thirty years now. The history of digital goes back thousands of years to the abacus, and before that, stones and bones arranged in the sand. So instead of poo-pooing "archaic" technology, embrace it for what it is: a form of knowledge handed down to us. A gift. Carl
  6. The wheel is archaic technology. When I find a smooth stone, by the side of the river, I skip it across the river. It doesn't matter to me that the stone age ended thousands of years ago or that there might be objects that can skip across the river better than the stone. Art is not just in the image. Amongst many things it is also in the materials and the concepts. Consider the concept work done in the 70s, where stuff was buried under ground where nobody could see it. The art was in the concept, not in the image. Or painters whose work is expressed as much by the texture of the paint as the entire image. Actors? Who says a film has to have actors in it anyway? What happens when a digital camera breaks on location? In any case who is poo-pooing digital?
  7. Hi Roberto - thanks. will talk to you soon. Carl
  8. Hi Roberto - if you are interested I'd like to talk to you about the transfer of some film. Are you also able to do standard/regular 8mm as well? I have some film shot in the 1930s that needs a 2K+ transfer. But I'm also shooting new material, Super8, later this year, for which I had originally been building a rig (while scouting for those who might have already established a 2K+ setup). I've been developing software specifically for digitally processing scans, which I've been testing on a handful of frames that were scanned by hand (manually sliding film through a gate). The software computes the optical flow between frames (and not just immediately adjacent frames) in order to establish the flow field in which pixels can interact with each other towards extraction of the latent signal otherwise encrypted in the grain structure of the film - to put it in a techno-poetic way. :) Carl
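The flow-field idea described above is a variant of multi-frame super-resolution. As a minimal sketch (not the actual software described in the post), assuming the per-frame motion has already been estimated elsewhere, e.g. by an optical flow routine, a naive shift-and-add accumulation onto a finer grid might look like this. `shift_and_add` and its arguments are hypothetical names:

```python
import numpy as np

def shift_and_add(frames, shifts, scale=2):
    """Naive shift-and-add super-resolution.
    frames: list of HxW arrays; shifts: per-frame (dy, dx) offsets in
    low-res pixels (assumed known here; in practice estimated via
    optical flow). Returns an (H*scale, W*scale) estimate."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, shifts):
        # Map each low-res sample to its sub-pixel position on the
        # high-res grid, rounded to the nearest high-res cell.
        ys = (np.arange(h)[:, None] - dy) * scale
        xs = (np.arange(w)[None, :] - dx) * scale
        yi = np.clip(np.round(ys).astype(int), 0, h * scale - 1)
        xi = np.clip(np.round(xs).astype(int), 0, w * scale - 1)
        yi = np.broadcast_to(yi, (h, w))
        xi = np.broadcast_to(xi, (h, w))
        np.add.at(acc, (yi, xi), frame)   # accumulate samples
        np.add.at(cnt, (yi, xi), 1.0)     # count contributions
    filled = cnt > 0
    acc[filled] /= cnt[filled]            # average where data landed
    return acc
```

Because the sub-pixel shifts differ from frame to frame, different frames fill different high-res cells, which is where the extra detail comes from; real pipelines would also interpolate the unfilled cells and deconvolve.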
  9. Hi Roberto, It's good to see I'm not the only one pursuing the idea that there is something to be gained from scanning at much higher definitions than is normally available. Can you tell me what the source filmstock was? With my own rig I'm now multi-scanning frames to work around the limitations of a one-chip camera (and its single Bayer filter, which limits rez) - effectively making it an n-chip sensor. Do you scan films as a business? If so, send me a PM about it. The only reason I've been building a scanning rig is that I haven't found anyone to do it. My area is software rather than hardware. cheers Carl
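One simple way to read the multi-scanning idea above: if each scan of the same frame adds uncorrelated sensor noise, averaging N scans shrinks the noise roughly as 1/sqrt(N) while the image itself stays put. A toy sketch with synthetic data (not an actual scan):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 1.0, size=(64, 64))       # stand-in for the frame
scans = [truth + rng.normal(0, 0.05, truth.shape)  # each scan adds noise
         for _ in range(16)]

stack = np.mean(scans, axis=0)  # noise falls roughly as 1/sqrt(N)

single_err = np.std(scans[0] - truth)
stack_err = np.std(stack - truth)
assert stack_err < single_err / 2   # 16 scans -> about 4x less noise
```

This only addresses noise; working around the Bayer mosaic additionally requires the scans to be offset so each site is eventually sampled by every colour filter.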
  10. Here's a flash frame from the evolving viral for Super8 the movie. Why does this frame not show Super8 sprocket holes? Probably because the filmmakers don't care. And if so, why should we do otherwise? Carl
  11. Discussing something that doesn't yet exist is somewhat difficult. I imagine discussion will eventually occur once the film is released. It will be interesting to see how Spielberg/Abrams frame Super8. What role does Super8 play in the film, given the title and its focus in the teasers? But apart from speculation, which just feeds into the film's promotional strategy, what else can one discuss? The teasers? The snippets of information? The manufacturing of an audience? The construction of anticipation? Spielbergian messianism - the audience, as moths, awestruck by the light of that which has yet to come, and in the context of everyday life no less. Carl
  12. That's great. I need one of those. Is there any more info available? Eg. the cost of all the required Lego components? Carl
  13. Can always try: Super8 Kodachrome 40, circa 1984, scanned @ 2.5K (Canon EOS 1000D), with 2.2X SR processing: Super8 K40 Super Resolution Carl
  14. The motion of objects in the scene, such as the hand, affect resolution. As you can see, the hand is blurred. The head was also moving, so is slightly blurred. However I've left such motion blurs in there as they play an important role in the completed motion picture. They contribute to the sense of the objects as moving. If you wave your hands about in sunlight (like a crazy man) you'll see they become blurred. Film transport can also affect resolution. Frame jitter, etc. Carl
  15. Hi Nicholas. Here is what Kodachrome 40 Super8 looks like after SR processing (2X), level correction and some sharpening. 2546 x 1845 pixels. Carl
  16. Yeah - a friend of mine has a Leicina Special as well - and said the same thing. It's the cheapest one I found at the time. Standard version, but it only had the Optivaron zoom lens - no Cinegon. The eyepiece was second hand ($50). I was expecting I'd need to make an adapter for it, but the shoe clip of the finder mated exactly with the Leicina groove in the eyepiece.
  17. Hi Phil, thanks for the info. But you are again misreading my post. I did not say your anecdote was false - I have no reason to disbelieve you. But you say "Redcode is JPEG2000". For the current version of Redcode that is not the case. Now you add that the current version of Redcode is encrypted JP2K. Well, that could very well be the case, but there is no evidence of that. Certainly, if there was a JP2K chip then it would be a plausible proposition. But there may be no such chip - the JP2K encoding could have been done via programmable hardware in the camera, or the wavelet transform done in fixed hardware with the JP2K-specific coding done in software. There are all sorts of alternative propositions. Now I'm not against speculation - I'm speculating as well. But then I'm not arguing as if what I'm saying is the case. Carl
  18. Here is a subsection from a full frame 3192 x 2250 scan. The subsection is 1280 x 720. There was a little bit of level correction done but otherwise it is as scanned. Keep in mind that this is only a subsection - less than half the width (and height) of the full frame - and yet it still looks great. It represents 2.2 mm x 1.2 mm of the Super8 frame. Or what K40 in a 4mm format might look like :) Carl
  19. Yep - it sat at the airport for a month while I saved the money to pay the import tax. I didn't realise that was required. It was the first OS item I'd bought that was over a thousand dollars. And the Leicina remote controls eventually arrived - must have been shipped by sea. I got hold of a second hand Canon right-angle viewfinder that fits the Leicina eyepiece perfectly. Am going to attach a small Sumix USB 1024x768 camera to the eyepiece, feeding the signal to an attached tablet, to use as a video display which I'll program to do anamorphic corrections, crops, even light readings!
The 3.5K scanning rig is coming along slowly. Have decided to try a different strategy for the film transport mechanics - a second hand Elmo K100SM, modified. Building the transport mechanics from scratch has not been my cup of tea. I managed to get an enlarger lens (from a garage sale down the road) that works so much better than the microscope objective. Requires a longer throw.
The super-res software is coming along extremely well. Even without super-rezzing, the full frame 3.5K scans are just so much better looking than anything less. I scanned some K40 last night and it's just unbelievable how good it looks. Unbelievable. Don't listen to anyone who says scanning Super8 beyond 1K is overkill. It's not. It's exquisite.
The more I've studied why the higher scan looks so beautiful, the more I understand the real difference between film and digital. The thing about digital is that it has a fixed cutoff frequency beyond which details in the potential signal simply vanish. Woosh. Gone. Nothing there. Zero information. The end. But with film, the variation in signal (and/or noise) is still there no matter how high your sampling frequency goes. I mean technically it's infinite: every particle, or corresponding dye cloud, while having limits in terms of size and position, has an infinite number of sizes and positions (between those limits) that it could be. To reproduce this variation in digital terms is impossible. It would require an infinite number of pixels. Literally. That all said, I am so excited with the super-rezzing algorithms I've been writing. Lots of ideas in play. Carl
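The "fixed cutoff frequency" point above is the Shannon-Nyquist sampling limit: a signal sampled at rate fs can represent no detail above fs/2, and anything above that folds back down as an alias rather than surviving. A small numerical illustration of the folding:

```python
import numpy as np

fs = 100.0                # sampling rate (samples per unit)
n = 1000
t = np.arange(n) / fs

# A detail oscillating at 70 cycles/unit - above the Nyquist limit
# of fs/2 = 50 - cannot be represented at this rate: it aliases
# down to |70 - 100| = 30 cycles/unit.
x = np.sin(2 * np.pi * 70 * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak = freqs[np.argmax(spectrum)]
assert abs(peak - 30.0) < 0.2   # the 70-cycle detail shows up at 30
```

Once sampled, no processing can tell the aliased 30-cycle tone from a real one; scanning at a higher rate (a larger fs) is the only way to push that cutoff out, which is consistent with the post's argument for scanning well beyond 1K.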
  20. Hi Phil, further to my previous point, I can't find anything confirming that Redcode was (or is) the result of a JP2K ASIC other than your anecdote: that once upon a time JP2K decoders could read Redcode. The JP2K output could very well have been the result of an additional transcoding step in which the data was rearranged to be compatible with JP2K decoders. That makes sense to me as well - I understand some 4K projectors were hardwired to read JP2K. Also, it doesn't necessarily follow, from JP2K decoders no longer being able to read Redcode, that Redcode is now an encrypted version of JP2K data. It could very well be that Redcode now represents data at an earlier step in the output pipeline - before it is transcoded to JP2K. Just presenting alternative speculation. Carl
  21. Hi Phil, I'm not sure what you mean by "No Carl". I didn't say Redcode wasn't using a JP2K encoder. In fact I explicitly said "I don't know". Rather, what I was saying (and this remains correct) is that just because Redcode uses a wavelet transform (like Cineform), it doesn't follow that it uses JP2K. I mean, if that logic held, it would equally follow that it uses Cineform! However, if it does indeed use JP2K (as you say), then of course it will be using a wavelet transform, since JP2K uses a wavelet transform. Carl
  22. JP2K is built on top of a wavelet transform, as are Cineform and Redcode. Now it could be that Redcode uses the same compression algorithm as JP2K (I don't know), but that does not follow from the fact that Redcode, like JP2K and Cineform, uses a wavelet transform. The wavelet transform is in the public domain and is well defined. One of the first applications of wavelet transforms was image compression, but they are not limited to that application. The wavelet transform itself is not lossy: it rearranges information in a way that is completely reversible. When used in the context of image compression, the wavelet transform puts information in a form that is easier (and more effective) to compress. However, the actual compression of a wavelet-transformed image is open to different algorithms; the compression algorithm is not a component of the wavelet transform. And since the transform itself is not lossy, raw data can be saved in a wavelet-transformed format without any loss. Carl
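The reversibility claim above is easy to demonstrate with the simplest wavelet, the Haar transform: split a signal into averages (low-pass) and differences (high-pass), and the original can be rebuilt exactly. A minimal one-level sketch with orthonormal scaling (not any codec's actual filter bank):

```python
import numpy as np

def haar_1d(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: pairwise averages
    diff = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass: pairwise details
    return avg, diff

def haar_1d_inverse(avg, diff):
    """Exact inverse of haar_1d."""
    x = np.empty(2 * len(avg))
    x[0::2] = (avg + diff) / np.sqrt(2)
    x[1::2] = (avg - diff) / np.sqrt(2)
    return x

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0])
avg, diff = haar_1d(signal)
restored = haar_1d_inverse(avg, diff)
assert np.allclose(restored, signal)  # the transform itself loses nothing
```

The lossy step in a codec comes afterwards, when the detail coefficients are quantised and entropy-coded; skip that step and, as the post says, the wavelet-transformed data is a lossless representation.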
  23. I saw a machine vision camera with 25 lenses the other day. Arranged in a 5 x 5 grid. Robots must be hyper-predators. Carl
  24. Playing the film back at a slow rate is a good idea. The following is based on a misreading of your post - I thought you were talking about negative film rather than K40 - but since I've written it already I'll leave it here; just reinterpret for K40. Digital processing of negative is (of course) a little tricky if you don't have software already set up for it. One approach is just trial and error. What you want to do, after inverting the signal, is control the RGB channels (or the CMYK channels) independently of each other rather than together, because the film has a different bias in each channel. The main throttle is moving the midpoint of each channel (altering the gamma). For example, if the signal is too pink, start by moving the midpoint of the red channel in the appropriate direction (to the right), or the blue/green gamma in the opposite direction (to the left). But of course pink is a mixture (in RGB space) of red and some blue, so you'll want to take that into account. A trial and error approach can be frustrating if you are not familiar with how colours mix. Many people use CMYK colour space, as it accords with how mixing paint works, but I find RGB space a lot easier. Note that in RGB space, red + green = yellow. Say this to a painter and they'll think you're crazy, but after working in RGB colour space you eventually get the hang of it. I find I have much more control, and a better sense of where colours are going, in RGB space.
Another approach (but it costs a roll of film) is to shoot a colour chart on the filmstock you are using and take a digital photo of the same. Take both results to the computer and alter one to match the other - altering each channel independently of the others, and in particular the midpoint (gamma) as much as the global levels. And since this could take hours if not days to get right, make sure you write down everything you've done to get the final result (the numbers), as you'll no doubt want to use those numbers again. Carl
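The per-channel midpoint (gamma) adjustment described above can be sketched as follows. This is a generic levels-style curve, not the poster's pipeline; `adjust_channel` is a hypothetical helper and the particular gamma value is illustrative, not a recipe:

```python
import numpy as np

def adjust_channel(c, black=0.0, white=1.0, gamma=1.0):
    """Levels-style adjustment of one channel with values in [0, 1]:
    remap [black, white] onto [0, 1], then apply the midpoint (gamma)
    curve. gamma > 1 brightens the mids, gamma < 1 darkens them."""
    c = np.clip((c - black) / (white - black), 0.0, 1.0)
    return c ** (1.0 / gamma)

# A too-pink inverted negative: pull the red midpoint down a little
# and leave green/blue alone, exactly the independent-channel move
# described in the post (values are synthetic).
rgb = np.random.default_rng(1).uniform(0, 1, size=(4, 4, 3))
corrected = np.stack([
    adjust_channel(rgb[..., 0], gamma=0.8),  # red: darken the mids
    adjust_channel(rgb[..., 1]),             # green: unchanged
    adjust_channel(rgb[..., 2]),             # blue: unchanged
], axis=-1)
```

Recording the `(black, white, gamma)` triples per channel is the "write down the numbers" step: the same triples can then be replayed on every frame of the roll.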