Everything posted by Michael Most

  1. This is not uncommon for longer sequences, as it makes it much more manageable by most common filesystems. 20 minutes of 8MB DPX files would be over 28,000 files in one directory. With most desktop operating systems, that takes a while to evaluate and manage. By splitting it up, you get much better response when looking at the directories and evaluating the files. In most cases, the split points are determined by the material itself and how it is done in telecine. Did the 20 minutes consist of one shot (doubtful), or multiple shots that required different color corrections? Were you at the transfer, and/or did you tell the facility how you wanted the material organized? If you really want it to be one long sequence, you can copy everything into one folder and rename the files to be sequential. You can assign a time code for the first frame within the Gluetools interface, or just rename them so that the frame number represents the time code. In Gluetools (at least in version 3.0), you can have Gluetools use the frame number as the time code instead of the DPX header TC. I have absolutely no idea what a "CP" file is, but my guess is that it is written by whatever device or software was used to do the DPX export and only has meaning there. Are you sure that there actually is DPX timecode? Did you specifically request it, or ask the facility if the files would contain it? Even if there isn't, you can use the method I described above to have Gluetools interpret a timecode that is whatever you need it to be.
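
     A minimal sketch of the rename approach described above (this is not a Gluetools feature, just one way to prepare the files; folder names are hypothetical and it assumes 24fps material):

         import shutil
         from pathlib import Path

         SRC_DIRS = [Path("reel1_shot01"), Path("reel1_shot02")]  # in story order
         DST = Path("reel1_joined")
         DST.mkdir(exist_ok=True)

         frame = 86400  # 01:00:00:00 expressed as a 24fps frame count
         for src in SRC_DIRS:
             for f in sorted(src.glob("*.dpx")):
                 shutil.copy2(f, DST / f"reel1_{frame:07d}.dpx")
                 frame += 1

     With the frame number standing in for time code, Gluetools (or anything else that reads frame numbers) can then treat the result as one continuous sequence.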
  2. I'm not trying to be crass or unhelpful here, but the information you're asking for is on every one of those manufacturer's web sites. Do some homework and you'll get all the information you need and more.
  3. No. The Varicam includes pulldown flags in the vertical interval to indicate the "actual" 24 frame sequence (they're actually in the time code user bits). There is no way to do that from other devices. Since you can't field edit on these machines, there is no way to record with cadence control, so the cadence is going to be random at each edit point. The more sensible way of getting DVCProHD footage from telecine is to receive it as files, most likely Quicktime files, that you would use directly in your editing software. If you do it that way, the files can be 720 or 1080 and be 24p already, having been digitized from a "true" 24p source.
  4. I have to agree with Tyler on this. You do get what you pay for, but color decisions need to be made in the proper context. If you're color correcting for video, you do it in a video color correction environment. If you're doing it for theatrical projection, you do it in a darkened theater with a reasonably large screen. It's not just the colors themselves that dictate creative color decisions, it's the environment, which can lead to different decisions based on the impact of the image on the viewer in that environment. Lookup tables alone can't turn a 40" plasma display into a 20 foot theater screen regardless of how accurate the emulation is. And unless you work in a darkened room, you can't really judge brightness properly. You might get acceptable results with a "pre-correction" approach. But you'll likely never know where you would have gone had you been in a proper theatrical environment because you won't have the opportunity to start from scratch in the theater.
  5. Since we were both around at the time (only a couple of buildings away at MGM as I recall) and because you know I can't resist, a few minor corrections (based on my own memory, of course). It was more like the mid-80's. The first single camera show to be cut on an electronic nonlinear system (Editdroid at the time) and the first show to go through Pacific Video (now Laser Pacific) as an "Electronic Laboratory" show was Knots Landing, a show I remember quite well because I worked on it. That was the 1984-85 season. Dallas and Falcon Crest were added a year later (although not at Pacific) and I believe a couple of shows called Matlock, Beauty and the Beast, and Father Dowling Mysteries - shows which you are very familiar with - began the following season. Or was it the second half of that same season? Maybe my memory isn't that good after all.... As I recall, they didn't quite "build" it. They bought United Color Labs, a primarily 16mm umm, "adult" lab, and cleaned it up - literally. And I'm not sure they were "Laser Pacific" at that point. Were they? Ahh, memories...
  6. It is very common to do feature dailies to HDCam SR these days. These recordings are not normally used for a DI, but they are used for previews, which are typically put together "video style" from the dailies by doing an online assembly and tape to tape color correction. In some cases, editorial is done in an HD format, usually by using DNxHD files on an Avid (the vast bulk of studio features are cut on Avid equipment) or DVCPro or ProRes on a Final Cut system, and the preview is generated directly from that. However, this is not as common as you might think.
  7. A few rather minor corrections: Not on a PC (the proxies can't currently be read on a PC). On both Macs and PC's, Redcine has some problems with certain codecs. It seems happiest with ProRes and DVCPro on a Mac, but I don't know what it seems happy with on a PC. I do know that on both platforms it is quite touchy with certain codecs and hardware configurations, although the hardware incompatibilities seem to be mostly on PCs. Output to what? I guess that depends on who you're delivering to, but if you need to deliver videotape - and you will likely have to do that (among other deliverables) if you're delivering to a television network or a motion picture studio - you'll probably be enlisting the services of a post house at some point. The general gist of what you've said is true, but at this point in time, Red is far friendlier to an Apple/Final Cut editing path than anything on a PC. And for truly fluid editing, you likely won't want to be using those proxies directly. For viewing, sure, but not for editing, particularly on longform.
  8. The Century Plaza (it's now the Hyatt Regency Century Plaza) has hosted many awards banquets (the ASC Awards were hosted there until this year). The room is large, but I believe they can subdivide it on request. And it's basically next door to Fox.
  9. That was only just included in the most recent firmware update which is still very much a "beta" release. It only works in the camera, and there is no current way to use software - even Red's software - to create those LUTs. There is also no current way to retain them into the post process. Like everything with Red, this will likely change, but that is the current state. It should, in the interest of fairness, also be pointed out that the Silicon Imaging camera has had this feature for at least 3 years. In fact, it has a complete workflow through post that is based around it, and the ability to create look files either on a computer or in the camera software itself. I like Red as much as anyone, but a lot of things they're doing are not nearly as original or radical (or working, production ready, and complete) as their fanboys think they are. In a world where anything posted on an Internet forum is often taken as established fact, and used for even further hype, I think it's important to be accurate and tell the whole story, at least whenever possible.
  10. If you use a Viper, you also need something to record it with, most likely an HDCam SRW1 or a file based recorder, such as an S.two. The Varicam and F900 are camcorders, so their recording devices are part of the camera.
  11. I'm not a "tape oriented" person. However, a telecine is, by definition, a linear transfer device. And time code and other metadata are what post production is based on. Telecines are not going to change - hell, we're probably seeing the last generation of them right now. So the ones that are used will continue to be used, and the installations they are in will continue to be operated the same way, because it's established, convenient, and efficient. So if you want to create a device for that market, you need to look at what's being done, how it's being done, and accommodate that rather than tell people to change what they already do quite well. Even the MTI Control Dailies system, which takes a rather nonlinear approach to dailies creation (at least to the point of separating ingest and combining of picture and sound) still has to read, track, and correlate time code and keycode in order to allow a relationship between the resulting tapes or files and the original material.
  12. That sounds like the view of a user, not a supplier. In a telecine operation, efficiency is key and accuracy is required, and that's why virtually all telecine facilities use TLCs to control the coordination of telecine and recording transports. It's the only real way to ensure that the transfer is controlled, and therefore metadata that is supplied is accurate - a must in the professional end of the business. That is the approach taken by Drastic Technologies, which can spin off a transcode operation when recording of a clip is complete. And that can work reasonably well if it's implemented correctly. But that is exactly what is done in the US, for essentially every television program and for most features as well, at least insofar as creating an element for dailies that is later used for previews (not for the DI, in most cases). It's not scans, of course, it's HD telecine transfers, but you get the idea.
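
     For what it's worth, a minimal sketch of that "spin off a transcode when a clip completes" pattern (this is not Drastic's actual implementation - the watch folder, the completion test, and the ffmpeg invocation are all assumptions for illustration):

         import subprocess, time
         from pathlib import Path

         WATCH = Path("/ingest")  # hypothetical: clips land here as DPX folders
         done = set()

         while True:
             for clip in (p for p in WATCH.iterdir() if p.is_dir() and p not in done):
                 n1 = len(list(clip.glob("*.dpx")))
                 time.sleep(5)
                 n2 = len(list(clip.glob("*.dpx")))
                 if n1 == n2 and n1 > 0:  # folder stopped growing: recording is done
                     subprocess.run(["ffmpeg", "-start_number", "0",
                                     "-i", str(clip / "%07d.dpx"),
                                     "-c:v", "prores", f"{clip.name}.mov"])
                     done.add(clip)
             time.sleep(10)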
  13. It needs to fully emulate a VTR (and work in NTSC, PAL, and various HD formats) so that time code can be added under editor control. Sometimes the time code is sent with the video in the form of pre-inserted VITC (for SD) or VANC (for HD), but it's often the case that the code is added in the VTR itself by striping the starting time code (say, hour 1) and assemble editing from there. That allows for new edits each time there is a new shot, necessary when you're doing daily transfers with sync sound. In the case of a Quicktime type recorder, it has to be smart enough to remember the last outpoint, but also to be able to change it. In the case of file based recorders (i.e., Rave HD, which records DPX sequences) this is not an issue because each frame (i.e., each file) has a time code attached via both the header and the file name itself, so if you need to overrecord previous material, it writes new files with the original names that replace the old ones (see how much easier everything is when you don't use wrapped movie files?). It also needs to play well with other devices, like other VTR's. This is necessary in part because backup tapes are often requested, but also because it is fairly common practice to create multiple formats simultaneously, i.e., HD and SD for viewing copies. This is usually done by dragging a DVCam as a second recorder. If you're using an editing controller (a DaVinci TLC is probably the most common in a telecine situation) it expects each transport to park, roll, sync, and report its position over RS422. You seem to be assuming that time code is always originating in the telecine suite and traveling with the material. That is not the case. The VTR's themselves are often the source of the time code, and this is necessary for multiformat transfers. That's why devices such as RaveHD and Clipster fill this need so well. The only device I know of that ever did this while writing wrapped movie files was the Avid Media Station (and Symphony, although very few facilities used that capability). As I recall (it's been quite a while) what they did was create a 24 hour long virtual tape device for every edit, so that any time code you enter on the edit controller would be a valid in point. It would then simulate a preroll based on that in point. This worked reasonably well, although the hardware at the time was pretty primitive by today's standards.
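
     The time-code-as-filename relationship described above is simple enough to sketch (assuming 24fps non-drop; the naming scheme is hypothetical but representative):

         FPS = 24

         def tc_to_frame(tc: str) -> int:
             h, m, s, f = (int(x) for x in tc.split(":"))
             return ((h * 60 + m) * 60 + s) * FPS + f

         def frame_to_name(frame: int) -> str:
             return f"{frame:07d}.dpx"

         print(frame_to_name(tc_to_frame("01:00:00:00")))  # 0086400.dpx
         print(frame_to_name(tc_to_frame("01:00:00:01")))  # 0086401.dpx

     Over-recording a span of time code just means writing files with those same names again, which is why frame-per-file recorders sidestep the in/out point bookkeeping that wrapped movie files require.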
  14. They can't if the requested file format is a Quicktime file with an Apple specific codec, like DVCProHD or ProRes - the two that seem to be the most oft requested. Other than that, many US facilities - including the one in which I work - can and do perform direct to DPX or uncompressed Quicktime file transfers all the time, using RaveHD, Clipster, or other devices. And yes, these have to be copied to a client drive because I don't know of any Firewire drive that can sustain the 150MB/sec or so that's required to reliably record 10 bit uncompressed HD (more like 200+ if it's recorded as single frame DPX files). Write software that can reliably virtualize a VTR on a Mac, complete with RS422 control that works with a standard edit controller and simulates a tape preroll and I'll buy one. But to this point, nobody's done that.
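
     The arithmetic behind those numbers, assuming 1080-line material (10-bit 4:2:2 for the video stream, 10-bit RGB packed into 32-bit words for DPX):

         W, H = 1920, 1080

         # 10-bit 4:2:2 uncompressed: 2 samples per pixel, 10 bits each
         video_frame = W * H * 2 * 10 / 8        # ~5.2MB per frame
         print(video_frame * 29.97 / 1e6)        # ~155 MB/sec at 29.97fps

         # 10-bit RGB DPX: 3 samples packed into one 32-bit word per pixel
         dpx_frame = W * H * 4                   # ~8.3MB per frame
         print(dpx_frame * 23.98 / 1e6)          # ~199 MB/sec at 23.98fps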
  15. Name some. Because my impression is that you consider just about everyone who's achieved any serious degree of success to be a talentless hack unless their first name is Phil and their last name is Rhodes. Sorry, but bitterness is not an emotion I'm particularly familiar with and it's not one I consider particularly helpful.
  16. All telecines in North America are set up to run at 23.98, not 24, for "24p" transfers. This is because sync is derived from NTSC based sync equipment: in order to do simultaneous dual system transfers (i.e., SD and HD - a very common situation, especially for dailies), the two must be correlated and in lock step. So if you ask for a 24 frame transfer, you're going to get 23.98 unless you very specifically order 24p HD only and the transfer house is willing to go along with that (it requires the sync generator to be set up differently, which is not commonly done). In practice, however, it is never necessary to do "hard 24," as 23.98 is still a 1:1 relationship between video and film frames. 23.98 also makes a lot of other post production steps much more straightforward, as many of them are based on standard definition video already (i.e., screening copies, outputs of cuts, post sound work, you name it). The only time you would be likely to get a "hard 24" element would be to have it generated by a scanner, and in that case, you would likely be delivered an image sequence - say, DPX files - which has no inherent frame rate.
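
     The relationship, for anyone who wants to see it spelled out: NTSC-derived sync slows everything by a factor of 1000/1001, and the film-to-video frame mapping stays 1:1:

         print(24 * 1000 / 1001)   # 23.976..., rounded in conversation to 23.98
         print(30 * 1000 / 1001)   # 29.97, the NTSC rate it must stay locked to
         # One film frame still equals one video frame; the whole transfer
         # simply runs 0.1% slow, which is why "hard 24" is rarely needed.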
  17. Geez, Phil. One really has to wonder why you haven't made it big in the business with such a healthy, positive attitude, and such a deep understanding of how the business works. And to think that some of us actually believe that talent has something to do with success. What idiots we are!
  18. In other words - and I've said this to people who haven't worked in post production of network television programs - it's really not that big a deal. I wish others could understand that.
  19. Having been in that chair for many years, I wouldn't put it that way. I would say that in television, the colorist works with the DP, but for the producer - a very different dynamic than that of the production crew. In some cases, the DP is given a lot of access to the colorist with the support of the producer - but that's not always the case. What is always the case is that the producer has the final say, not the DP. BTW, when I say "producer" I'm actually referring to multiple entities. The producer designated as the "post" producer usually has authority in this area. The associate producer or post supervisor often sits in on the sessions and passes on notes, but does not usually have final say. In some extremely rare cases, an executive producer might give notes, but as I said, that's rare.
  20. As the cost of the camera equipment goes down (for example: Red), the pay scale of those who use it also tends to go down. Eventually, the person running the equipment is seen as just as much of a commodity as the equipment itself. This has been the case with still photographers, ENG shooters, wedding photographers, CG animators, and especially editors. Ultimately it gets to the point where the ability to make a living with this stuff might become questionable, except at the very highest levels where quality is still in demand. THAT's what I meant.
  21. I'd be far more concerned about the tie between the cost of technology and the perceived value of those that use it. If you don't believe this, take a look at the average editor's pay today vs. 10 years ago.
  22. I saw that picture on film and it looked quite good as well. In fact, I had no idea it was a Genesis origination until the end credits. Maybe it's not as weak a link as you think...
  23. Probably because they don't need one. That non-standard cadence was basically developed in league with Apple, whose software (and hardware) was, at the time, not fast enough to remove "standard" 3:2 cadence on the fly, particularly with Firewire transfers. To this day, Apple has never included standard pulldown removal on ingest in any version of Final Cut - although Avid seems to have figured it out at least 3 years ago in Xpress Pro. Since Panasonic primarily supports 720p/60 as its chosen HD format, pulldown techniques are necessary to record a 24fps video stream. Sony, on the other hand, supports true 23.98 recording in both XDCam and HDCam, so recordings made on those formats don't have any cadence to begin with. Bottom line: This is really Apple's problem, not Panasonic's, and not Sony's. If you don't want to deal with it, you don't have to. Either shoot true 24p on a Sony device, or use an Avid.
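
     For illustration, here's what standard 2:3 pulldown does to four film frames (A through D), and why removing it means finding the cadence and re-pairing fields. The non-standard 2:3:3:2 "advanced" cadence rearranges this pattern so that the single mixed frame can simply be dropped whole - which is exactly what made it tractable for the hardware of the day:

         film = ["A", "B", "C", "D"]
         counts = [2, 3, 2, 3]  # fields contributed by each film frame

         fields = [f for f, n in zip(film, counts) for _ in range(n)]
         video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
         print(video_frames)  # ['AA', 'BB', 'BC', 'CD', 'DD']
         # 'BC' and 'CD' are the mixed frames that must be split apart
         # on ingest to recover the original 24p stream.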
  24. I think you're confusing the physical details of the Red with the actual post production requirements. The instantaneous access you mention is available with products like the Panasonic P2 devices and the Sony XDCam devices because the files these devices record are already in standard video formats at the best quality they are capable of producing, given the compression schemes used. The Red, on the other hand, **must** have its files go through a processing pass in order to produce images of proper quality for delivery and in standard video formats. You can cut using the original files piped through a Quicktime wrapper to produce a live proxy, but for actual delivery, with clean debayering and sizing to standard video formats, you must render - a potentially time consuming step - and subsequently conform the resulting files. The camera is also not suited for live video applications for basically the same reason - the only live output is a "quick debayered" 720p/60 feed. If anything, ENG is one application that the Red is most definitely NOT suited for. It's great for a lot of other things, though...
  25. Emerson does have a good reputation and a number of alumni in the industry - primarily producers, in my experience. Having said that, no college or degree program gives you "an edge." The only thing that gives you that is demonstrated talent. This is just as true for USC or NYU graduates as it is for Emerson. If you are interested in an internship, though, those producer alumni are worth contacting, as they do tend to keep in touch with the school and like to offer internships to its students.