Keith Walters

Everything posted by Keith Walters

  1. I don't understand why memory is such an issue either. All you're basically doing is converting each input frame to a 1920 x 1080 (or whatever) bitmap, re-compressing to the ProRes format, and saving it a frame at a time to your output file. With Acrok, if you press the cancel button during the transcode, it leaves whatever it's done up to that point as a truncated but otherwise operational file, so it doesn't sound like there's too much "housekeeping" going on. It's not like you need to hold several frames in memory at once, which would be the case if you were actually editing.
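The frame-at-a-time arithmetic above can be sketched in a few lines. This is illustrative only, assuming an uncompressed 8-bit-per-channel RGB intermediate buffer; real converters may use YUV or 10-bit buffers, and this says nothing about Acrok's actual internals.

```python
# Back-of-the-envelope memory cost of transcoding one frame at a time.
# Assumes an uncompressed 8-bit RGB intermediate buffer (an assumption,
# not a known detail of any particular converter).

def frame_buffer_bytes(width: int, height: int, bytes_per_pixel: int = 3) -> int:
    """Size of a single uncompressed frame buffer in bytes."""
    return width * height * bytes_per_pixel

hd = frame_buffer_bytes(1920, 1080)            # one 1080p RGB frame
print(f"One 1080p frame: {hd / 1e6:.1f} MB")   # ~6.2 MB

# Even double-buffering (decoding one frame while encoding the previous)
# only needs two of these, which is why memory shouldn't be an issue:
print(f"Double-buffered: {2 * hd / 1e6:.1f} MB")
```

Holding two frames costs well under 15 MB, versus the many frames an editor must keep in RAM for scrubbing and effects.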
  2. OK, just as an update: I tried my trial version of Acrok out on a 3-minute 10MB MP4 file and it turned it into a 1.3GB 1920 x 1080 ProRes file! Apart from having "Acrok" emblazoned across the middle, it looks to be the real deal. The only thing I have that will play ProRes is iTunes, and it played it fine. Interestingly, the original had somewhat erratic sound sync, but the ProRes playback was absolutely perfect. The only possible show-stopper is that the highest resolution it does is 1920 x 1080, but they say they're open to suggestions! QuickTime runs it as well, which is hardly surprising, as they're virtually the same program!
  3. Just a cautionary note here: one thing that most "non-propellor-heads" aren't aware of is that FFMPEG is open-source software, and what the FFMPEG people distribute is the source code (that is, the text file the programmer writes). That has to be compiled in a further step to make an executable file that can run on a PC (or Linux or Mac or whatever). Now, lots of people do that and offer free FFMPEG.exe "builds", but the thing is, their versions don't always have the full "suite" of video formats available, I guess in the interests of trimming down the file size by not including rarely-used formats. (This was probably more advantageous when computers had smaller hard drives and everyone was on dial-up.) One of the reasons people sometimes find FFMPEG so baffling is that they appear to be following the instructions to the letter, but it often refuses to run their operation, coming up with an incomprehensible command-line screen of gobbledegook. Usually, that's because they've downloaded a build where the particular codec they wanted has been left out, because the build was designed by a programmer, not a video editor :rolleyes:. Ironically, people usually turn to FFMPEG as a last resort, when they can't find any other program that handles their particular oddball video format, only to find that the person offering the build deemed that format unlikely to ever be used... By far the easiest way to drive FFMPEG is to use a Windows program that writes a custom batch (.bat) file that launches FFMPEG.exe with the metre-long string of command-line switches it needs, but many anti-virus programs now scream blue murder if any software tries to write a batch file, so you have to write a ".ba_" file and hand-edit it to ".bat"... Painful, but still less so than running FFMPEG directly!
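The ".ba_" workaround described above can be sketched like this. The file names and codec choices are hypothetical examples (though `-c:v prores` and `-c:a pcm_s16le` are real FFmpeg options); whether the transcode succeeds still depends on the build including the encoder.

```python
# Sketch of the ".ba_" workaround: write the metre-long FFmpeg command
# to a file with a harmless extension, then hand-rename it to ".bat".
# Paths here are illustrative, not a recommendation.

from pathlib import Path

def build_ffmpeg_command(src: str, dst: str) -> str:
    """Assemble an FFmpeg ProRes transcode command as one string."""
    return (
        f'ffmpeg -i "{src}" '
        "-c:v prores "          # fails if your build omits the ProRes encoder
        "-c:a pcm_s16le "       # uncompressed audio
        f'"{dst}"'
    )

cmd = build_ffmpeg_command("input.mp4", "output.mov")
Path("transcode.ba_").write_text(cmd + "\n")  # rename to .bat by hand
print(cmd)
```

Running `ffmpeg -codecs` first will tell you whether a given build actually includes the codec you're after, before you waste time on the batch file.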
  4. There's a converter program called Acrok that I downloaded a trial copy of, which is supposed to be able to output ProRes, although I can't actually find that in the output options. Acrok does seem to work OK. The trial version is not crippled (at least they don't say it is), but the output files have a watermark. I've never had any reason to look for the ProRes option, but it might be in there somewhere. The full version is only US$39, so maybe you could see if you can get the trial version to work (with watermark) on ProRes, and if that looks OK, you could buy the registered version. It also works in reverse, so if you don't have any other way of playing ProRes files, you could revert them to MP4 or similar and see what happens. Acrok doesn't do anything I need to do that I can't already do with the software I already have, but it's nice to have it "waiting in the wings", as it were. Edited to add: ProRes is in the "Editing Software" tab.
  5. I did see a motoring program once where they were showing the advantages and disadvantages of laminated glass and safety glass. The presenter did actually punch through a safety glass side window with his fist, but he was wearing a thick leather welding glove. The shattering glass is still the safest option as far as getting cut up goes, but having your windshield suddenly turn opaque white while you're driving sort of cancels that out! But that's why most cars only have laminated glass on the front window. Passengers are more likely to come into contact with the side glass, and it's not such a big issue if you suddenly can't see through it.
  6. The problem is, while all the solutions described above would be theoretically plausible, they all involve some rather specialized preparation, which would require anticipation of the said extraordinary event. A common plot hole in both story and script writing is the "too many extraordinaries" problem: extraordinary things happening to extraordinary people is one too many extraordinaries. You know, like the TV cop who goes on vacation and there's a murder in the motel he's staying at. I mean, face it: how many motels have you stayed in, and how many times has there been a murder while you were staying there? Why would it be any more likely to happen to a cop? It's a very common plot device much used by lazy scriptwriters, but statistically, it's absurd. So while it's entirely possible that some well-heeled enthusiast could be one of the, say, 300 people in all the USA owning such a setup (or with access to it at any rate), you'd be stretching credibility somewhat if he just happened to have a situation crop up where the camera was exactly what he needed. It would be more plausible if he suspected something was going down and hired or otherwise obtained the camera from someone who specialized in that field. The thing is, there were bound to be people in the surveillance industry who had equipment like that custom made. The beauty is that all you have to say is: "he hired a modified Super-8 camera capable of taking 800-foot rolls of film". For the vanishingly few people who would even know or care what sort of cameras were available in 1971, that would be sufficient explanation :rolleyes:
  7. It's not easy to damage the silicon sensor itself with heat, since it's already heated to hundreds of degrees in the manufacturing process to "anneal" the silicon after the multiple doping processes, and also to create a protective layer of silicon dioxide. However, what you can easily damage are the coloured polymers that make up the Bayer mask on single-sensor cameras. That can produce a localized discoloration that would ruin the sensor for commercial use. Three-chip CCD and CMOS cameras are virtually indestructible in this respect, because the dichroic filters they use are made of multiple thin layers of glass.
  8. I've had exactly the same problem. The images could be on any of four computers, and I don't know what the filenames are. I've certainly found some interesting photos while looking, though :rolleyes:
  9. Web hosting has moved on somewhat. I have a website hosted by Gridhoster.com which cost me just US$120 for 3 years (including registering the domain name), with unlimited storage and download. The catch is that they appear to limit the access speed to stop people hosting their entire movie collection on there, but it's damned useful for storing large numbers of relatively small files. The great thing is you can have no-questions-asked hotlinks to your own images and video files, and there are no ads or other crap that gets downloaded with them; plus (as long as you pay your renewal) they're yours for good. I had visions of putting all sorts of interesting stuff I'd written on there, but I never got past setting up the front page. (Jim Jannard hasn't done any better :-) ) Also, if you want to send files to someone, most email accounts are very fussy about what they allow as attachments, whereas with your own private website you can upload whatever you want and people can download whatever they want.
  10. Don't forget there was also the diabolical "Mars Needs Moms" in the mix. :rolleyes: The irony is that John Carter (of Mars) used an incredibly risky production approach, in an industry notorious and despised for taking "safe" approaches: they made the movie more or less the way you would imagine it would have been made if modern movie production technology (but nothing else) had been available in 1911-1912, when "John Carter of Mars" was actually written. Consider: in 1912 there had not been any world wars, radio and other communications technologies as we know them did not exist, and in particular, airplanes as we know them did not appear until halfway through WW1. The only aircraft at the time were basically box kites with lawnmower engines attached to them. That's what you saw in the earliest illustrations in the books, and that's what the Martian aircraft in the movie were clearly based on. The end result was a refreshingly different movie, completely destroyed by a sad mish-mash marketing approach by people who clearly did not "get it". As others have pointed out, many of the popular sci-fi movies of the late 20th century have ideas pinched directly from the John Carter books. What's ironic is that Burroughs clearly pinched many of his ideas for "Barsoom" from an almost-forgotten novel called "Lt. Gullivar Jones - His Vacation", written in 1905 by Edwin Lester Arnold. https://en.wikipedia.org/wiki/Edwin_Lester_Arnold
  11. Actually the first Sony SP betacams that came out in 1989 had Peltier coolers on the blue CCD. They were also used in high-end video surveillance cameras.
  12. You appear to be commenting well outside your experience. Panavision made several abortive attempts at doing just that in the 80s and 90s, to allow 35mm film lenses to work with TV cameras. The results were universally horrible. Panavision make some of the best lenses in the world; if they couldn't make it work, I seriously doubt anybody else could. If you have the resources and technical skill to make it happen, by all means, please produce the goods; but please don't just come on here stating: "It must be possible." Or at least describe the technical problems, and then suggest how they might be overcome. Otherwise, you're just writing science fiction.
  13. You mean like the Panavision Genesis and Sony F35 have been doing for over 10 years? :rolleyes: There are a number of issues involved. Dichroic prisms with three separate sensors are far and away the best way to produce colour images. Period, regardless of what you've read on the Internet. Apart from the better colour filtering, they're much less affected by IR contamination. People keep talking as though single-sensor cameras using filter arrays are some fantastic new idea; actually, they've been around since the 1950s. It's the availability of massive portable computing power that made them competitive with 3-sensor designs. The main impetus behind producing single-sensor cameras for cinematography work had a lot more to do with maintaining compatibility with existing lenses designed for 35mm film cameras than image quality. You certainly could make 3-chip "35mm"-size sensor cameras, but you would then have to make new lenses to go with them, because, as Phil mentioned, in most cases the rear element of existing lenses would want to sit in the middle of the prism block. This is exactly the problem faced by Panavision's "Panavized" Cine Alta cameras; your range of rentable lenses dropped from thousands to about six. The other major advantage of 3-chip cameras is the lack of "latency"; that is, the "live" pictures coming out of the camera are delayed no more than about 30 milliseconds. Because of the amount of processing required, the latency of most HD Bayer cameras varies between about a second for an Alexa Studio camera and a couple of days for early Red cameras :P The main reason the PV Genesis/Sony F35 use RGB colour striping (like an LCD TV screen) is simply that they were principally designed as TV studio cameras (hence 1920 x 1080 sensors), and straight RGB requires minimal processing time for full-resolution HD.
  14. MR RHODES!!! What an awful, AWFUL concept! What has whether you think the picture looks any good or not got to do with anything?! At least I've learned that much in my years of posting here. I fully understand that there is a secret organization that hacks into digital projectors to purposely maladjust them whenever you or I go to look at movies shot on certain brands of camera. How else do you explain the conundrum of how pictures from certain brands of camera generally look like pus to you or me, but other people find them to be effulgent icons of cinematic verisimilitude, less than 0.2% removed from actual reality? NO! BETTER than reality! I must admit I am rather shocked to hear that someone actually managed to shoot a poor image using an Alexa, but I suppose if you shoot enough footage, statistically it must happen eventually. It's rather like that old adage that if you had an infinite number of monkeys typing on an infinite number of keyboards, one of them would produce something significant. Now, thanks to the internet, we know that's not true....
  15. Er, an awful lot of them were Arri cameras. Panavision were Arri's biggest customer. Probably still are..... Panavision were never proud; if enough people asked for something, they'd get it.
  16. I'd like to know exactly what they mean by "15 stops". If it had a true 15-stop range, you'd think you'd be able to set the iris so that the highlights in your scene were just at the point of clipping, convert this down to an 8-bit HDMI signal or similar to give a nice picture on a standard LCD monitor, and THEN, if you either closed the iris down by 7 stops or put in 7 stops' worth of ND, you should be able to crank the gain up by a factor of 128 times (42dB) and get pretty much the same image. Is that going to happen? I seriously doubt it! It certainly would be easy enough to prove me wrong, but nobody ever does.... Anyway, there's no lens ever made that can give you a 15-stop contrast range. The darkest "black" material known only has a reflective attenuation of about 10 stops. I did enjoy the comments over on Reduser. Like suddenly Panavision have somehow "legitimized" Red cameras. Er, guys ... Panavision have been renting out Red cameras for some years now. Also Alexas; they'll even rent you a Canon 5D package if you ask them nicely! "Those who called RED vaporware back in 2006 should feel ashamed now." Erm ... but it er ... was, wasn't it? Back then, anyway...
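The stops-to-gain arithmetic in that test is straightforward: each stop doubles the light, so 7 stops of ND means recovering a factor of 2^7 in gain. A quick check of the figures quoted above:

```python
# Stops are powers of two: closing down N stops cuts the light by 2**N,
# so matching the original exposure needs a gain of 2**N.
import math

def stops_to_ratio(stops: float) -> float:
    """Linear light ratio covered by a given number of stops."""
    return 2.0 ** stops

def ratio_to_db(ratio: float) -> float:
    """Voltage-style gain in dB for a linear ratio."""
    return 20.0 * math.log10(ratio)

print(stops_to_ratio(7))              # 128x, as stated in the post
print(round(ratio_to_db(128), 1))     # ~42.1 dB
print(stops_to_ratio(15))             # 32768:1 for a true 15-stop range
```

That last number is the point: a genuine 15-stop scene spans a 32768:1 contrast ratio, which is why lens flare and the reflectance of real "black" materials become the limiting factors.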
  17. You're absolutely right, but airline counter staff or courier drivers are not equipped to recognize the excellence (or otherwise) of professional (or "professional") customers' arguments. In reality, as with most safety-critical items, showing a certificate basically means squat. You still have to be able to demonstrate that the product you are carrying actually matches what it says on the certificate. Fortunately, the authorities don't seem to have woken up to that yet :D A large part of my current job entails: A. Getting Chinese manufacturers to understand what a test report and certification are actually for. B. Getting them to understand that once they have a product certified, it only remains certified if they keep making an identical product in the same factory.
  18. Yes, but the problem is, consumer Li-Ion batteries regularly do that spontaneously. The demos you see on YouTube etc. are basically illustrations of what happens when they do. The biggest difference between lithium-ion batteries and just about all other types is that they use extremely flammable organic solvents for the electrolyte. All other commercial types of battery use water-based electrolytes which, while corrosive, will not burst into flame. "professional camera batteries are not known to do this ever really.." I've seen the remains of a Betacam charger where one did :-) It's true that professional equipment normally uses cylindrical cells with a metal jacket, which are much less likely to explode, but they are still capable of it. When they do, it looks like a sky-rocket gone off course....
  19. Not lithium. It just rapidly turns a lead-grey colour. All the alkali metals above sodium (potassium etc.) will, however, do this. The only other metal that does that, oddly enough, is plutonium.
  20. Here's one I prepared earlier. Note: 1. Sequence has been slightly edited for dramatic effect :rolleyes: 2. The battery contains two cells, so the two separate fireballs are authentic. 3. This was shot in broad daylight, so that gives you some idea of the ferocity of the flames. 4. The same thing happened spontaneously in a person's lounge room ... http://www.930thursday.com/Fire.mp4
  21. Not sure what that's got to do with anything. The fires are caused by the extremely flammable organic solvents used in lithium batteries. The electrodes in a lithium battery are separated by a porous polymer layer that's about as thick as tissue paper. If that gets ruptured, the electrodes can touch, producing a white-hot arc that boils the solvent. When the pressure gets high enough, the cell ruptures and the vaporized solvent bursts into flame. None of the combustion products produced is particularly toxic; the main problem is an uncontrollable fire starting in a confined space. According to UL, a sure-fire (so to speak) way of getting lithium batteries to explode is to put them in a decompression chamber, which is why aviation authorities are so concerned about them.
  22. One of the major reasons that digital movie production was so slow to get off the ground was simply that the uptake of digital cinema projection didn't proceed at anywhere near the rate that the pundits had been predicting. Even the recent wholesale conversion was basically shoved down the industry's throat; not too many cinema owners volunteered for it. Most cinemas had just one or two digital projectors so they could show 3-D movies; the rest was still 100% film. The real problem with early HD-to-film prints was that you were basically starting the four-generation duplication process with about the same resolution as normally came out of a four-generation process that started with much higher-resolution 35mm negative! With digital projection, all that was bypassed, but that still only gave "OK" quality with HD origination. Of course, if the video was scanned off 35mm negative, there was simply no comparison....
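The generation-loss argument above can be made concrete by treating each duplication stage as multiplying the system MTF (detail contrast). The per-stage figure of 0.7 below is purely an assumption for the sketch, not a measured value for any real print stock:

```python
# Illustrative model of generation loss: each duplication stage is
# assumed to pass a fixed fraction of the detail contrast (MTF), so
# chained stages multiply. The 0.7 per-stage figure is hypothetical.

def mtf_after_generations(per_stage_mtf: float, generations: int) -> float:
    """Combined MTF after chaining several duplication stages."""
    return per_stage_mtf ** generations

print(round(mtf_after_generations(0.7, 4), 2))  # ~0.24 of original contrast
print(round(mtf_after_generations(0.7, 1), 2))  # single stage for comparison
```

Under that assumption, four generations retain only about a quarter of the original detail contrast, which is why starting the chain from HD-resolution material (instead of a sharp 35mm negative) produced such soft release prints.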
  23. So where was this? I have seen quite a few posts that can best be described as "egregious" but I don't think anybody can be bothered replying anymore.
  24. OK, that's a totally different sort of "plasma". As you say, the light source is a tiny bulb (about the size of a flashlight bulb) with no electrical connections. The energy is supplied by a microwave beam from a magnetron, similar to those found in domestic microwave ovens. The bulb is filled with sulphur dioxide which, when electrically energized, produces an almost magically perfect spectrum: virtually all visible light and negligible UV and IR. This technology was actually first demonstrated well over a century ago, in the form of a conventional gas discharge lamp with metal electrodes. The problem was that the sulphur dioxide plasma is extremely corrosive, and even with tungsten electrodes, the lamp life was measured in minutes. The idea of using electrode-less microwave excitation as a way of solving that problem also dates back a few decades, but I think the problem was that people were concerned about having a continuously operating microwave source in the room. There's no scientific basis for this common concern, but at least one lighting company thought the problem could be solved, at least for office use, by using a "light-pipe" distribution system. The other problem is that it's hard to make cheap magnetrons designed for continuous operation. Whatever the reason, nothing ever seems to have come of the idea. Another major problem with this technology is that there is no easy way of dimming the light, since the plasma temperature is quite critical. Also, Hive's website only claims about a 50% power reduction over an equivalent HMI. DID YOU KNOW: The term "plasma" was coined in 1928 by General Electric engineer Irving Langmuir, who was also the true inventor of the vacuum tube (and not a certain charlatan, patent troll and all-round investment shyster by the name of Lee de Forest :rolleyes: ). Or that the "tron" ending of so many techie words (at least originally) was actually ancient Greek for "device" or "instrument"?
The term "electron" was derived from the Greek word for amber, "elektron", because the ancient Greeks left records of their studies of static electricity, mentioning the effects of amber rubbed with fur. So although the endings are spelled the same in modern English, they have completely different meanings. Langmuir's first vacuum tubes were called "Kenotrons", basically "instrument containing nothing", i.e. "vacuum tube". Other manufacturers used soundalike names (e.g. RCA's "Radiotrons"), and eventually the ending came to be associated with "electronic".
  25. I thought they worked on the same principle as a plasma display, but using a white phosphor. In other words, a flat fluorescent tube. Ironically, they were originally developed as a backlight for LCD panels :-)