Showing results for tags 'workflow'.



Found 23 results

  1. Saban Capital Acquisition Corp. (NASDAQ:SCAC) (“Saban Capital Acquisition Corp.”), a publicly traded special purpose acquisition company, Panavision Inc. ("Panavision"), and Sim Video International Inc. ("Sim") announced today that the companies have entered into a definitive business combination agreement to create a premier global provider of end-to-end production and post-production services to the entertainment industry. The combined company will be well positioned to capitalize on the continued growth of content production spending and to enhance the scope of service offerings to its customers.

Under the terms of the business combination agreement, Panavision and Sim will become wholly owned subsidiaries of Saban Capital Acquisition Corp. Immediately following the proposed transactions, Saban Capital Acquisition Corp. intends to change its name to Panavision Holdings Inc. (the “Company”) and is expected to continue to trade on the Nasdaq stock exchange.

Headquartered in Woodland Hills, California, Panavision is an iconic designer, manufacturer and provider of high-precision optics and camera technology for the entertainment industry and a leading global provider of production-critical equipment and services. Headquartered in Toronto, Canada, Sim is a leading provider of production and post-production solutions with facilities in Los Angeles, Vancouver, Atlanta, New York and Toronto.

The transaction reflects a valuation for the combined entity of $622 million (inclusive of debt), or approximately 5.9x fiscal year 2018 estimated Adjusted EBITDA. The cash component of the purchase price to be paid to the equity holders of Panavision and Sim will be funded by Saban Capital Acquisition Corp.’s cash in trust (approximately $250 million), a $55 million private placement of common stock at $10.00 per share secured from a mix of premier institutional investors as well as an affiliate of Saban Sponsor LLC, and newly raised debt financing.

Upon the closing of the proposed transaction, Kim Snyder, President and Chief Executive Officer of Panavision, will serve as Chairman and Chief Executive Officer, and Bill Roberts, Chief Financial Officer of Panavision, will serve in that role for the combined company.

“We are excited to partner with Kim along with the Panavision and Sim teams to capitalize on the explosive growth in content spending,” commented Haim Saban, Chairman of Saban Capital Acquisition Corp. He continued, “Advancements in technology and the emergence of streaming have fundamentally changed how consumers watch and discover content. This is driving significant growth in the market for production and post-production services. This secular trend creates a tremendous opportunity for Panavision to leverage its leading technology and pursue opportunistic acquisitions to grow in a manner that is agnostic to the content creator and distribution channel.”

Adam Chesnoff, President and Chief Executive Officer of Saban Capital Acquisition Corp., commented, “This transaction creates a leading global platform ideally positioned to capitalize on the rapid growth in content production. The combination of these two companies will create the foremost provider of end-to-end production and post-production services. Combining this platform with Saban’s wide-ranging global media relationships, experience in production, and successful track record of creating value for its partners will position the Company to accelerate growth and pursue complementary acquisitions. We are excited about the potential.”

“For nearly 65 years, Panavision has proudly served the entertainment industry, providing cutting-edge equipment and exemplary service to support the creative vision of our customers,” said Kim Snyder, Chairman and CEO of the combined company. “This acquisition will leverage the best of Panavision’s and Sim’s resources by providing comprehensive products and services to best address the ever-adapting needs of content creators globally. These complementary companies subscribe to the same strategic vision: to support our customers as the category-defining provider of end-to-end production and post-production services.”

“Combining the talent and integrated services of Sim with two of the biggest names in the business, Panavision and Saban, will accelerate our strategic plan,” added James Haggarty, President and CEO of Sim. “The resulting scale of the new combined enterprise will better serve our clients and help shape the content-creation landscape.”

The respective boards of directors of Saban Capital Acquisition Corp., Panavision and Sim have unanimously approved the proposed transactions. Completion of the proposed transactions is subject to Saban Capital Acquisition Corp. stockholder approval, certain regulatory approvals and other customary closing conditions. The parties expect that the proposed transactions will be completed in the first quarter of 2019. For additional information on the proposed transaction, see Saban Capital Acquisition Corp.’s Current Report on Form 8-K, which will be filed promptly and can be obtained at the website of the U.S. Securities and Exchange Commission (“SEC”) at www.sec.gov.

Deutsche Bank Securities Inc. and Goldman Sachs & Co. LLC are serving as financial advisors, capital markets advisors and private placement agents, and Skadden, Arps, Slate, Meagher & Flom LLP and Dentons Canada LLP are serving as legal advisors to Saban Capital Acquisition Corp. Houlihan Lokey, Inc. is serving as financial advisor, Citi is serving as capital markets advisor, and Kirkland & Ellis LLP and Osler, Hoskin & Harcourt LLP are serving as legal advisors to Panavision. Marckenz Group Capital Advisors is serving as financial advisor and Stikeman Elliott LLP is serving as legal advisor to Sim.
  2. Hi, I've recently entered pre-production on a low-budget feature I recently finished writing; now storyboarding awaits. I'd like to shoot the entire storyboard on set, and I've been looking into the Artemis app. It would be great to use the app to find the right focal length for each shot and add that information to each storyboard picture. Does anyone know how the metadata is stored in Artemis if you sync it to Dropbox? It would be great to get the pictures and metadata into something like StudioBinder or Shot Lister, so that you know exactly which lens to use for each specific shot. It would be a huge timesaver, which would suit us perfectly considering this'll be a run-and-gun type production. $30 is a cheap investment if you use the app a lot, but it's expensive (for me at least) just to try it out, so I thought I'd check in with you guys before I decide.

How would you go about storyboarding a low-budget feature as a DP if you had access to the set and actors in pre-production? Drawing the entire storyboard would be nice, but I don't think it'd be the most efficient approach, and I also feel that being able to go through the entire script on set with the actors while doing the storyboard would be great for both me (as DP/director) and the actors. Any thoughts or ideas would be much appreciated! Thanks, Patrik
  3. RED to Host Panels Featuring Henry Braham, BSC, Dean Devlin and Red Bull Media House

Cine Gear attendees can find RED Digital Cinema at booth S301 in Stage 2 of Paramount Studios at Cine Gear Expo 2017 from June 2-3. RED’s visitors can interact with a range of DSMC2™ cameras, including SCARLET-W™, RED EPIC-W™, WEAPON® 8K S35 and WEAPON 8K VV. Additionally, RED will be demonstrating the latest workflow options, covering their new image processing pipeline (IPP2), HDR, and 8K. RED will be joined in their booth by a wide variety of manufacturers of lenses, camera modules and accessories, including RT Motion, Kippertie, foolcontrol, and Offhollywood.

RED will also take attendees behind the scenes with two panels in Paramount’s Sherry Lansing Theater:

June 2 at 12:45pm: Director Nicholas Schrunk and key members of Red Bull Media House’s production team will discuss the making of the feature documentary Blood Road. Blood Road follows the journey of ultra-endurance mountain bike athlete Rebecca Rusch and her Vietnamese riding partner, Huyen Nguyen, as they pedal 1,200 miles along the infamous Ho Chi Minh Trail. Their purpose was to reach the crash site and final resting place of Rusch’s father, a U.S. Air Force pilot shot down over Laos. Panelists will discuss the creative and technical process, and how this incredibly demanding story came together visually. Moderated by Andrew Fish of American Cinematographer.

June 3 at 11:45am: Guardians of the Galaxy Vol. 2 director of photography Henry Braham, BSC will talk about filmmakers’ escalating preference for shooting with larger format digital cameras. Braham shot Guardians of the Galaxy Vol. 2 entirely on the RED WEAPON 8K VV. He will be joined by well-known producer, actor, and writer Dean Devlin. Devlin will discuss how new formats are driving an evolution in filmmaking for the big screen and touch on his experience using RED, including WEAPON 8K S35. Moderated by Carolyn Giardina of The Hollywood Reporter.

All of RED’s cameras offer superior image quality, incredible dynamic range, and simultaneous recording of REDCODE® RAW and Apple ProRes or Avid DNxHR/HD, combined with cutting-edge performance in a compact, lightweight, and modular design. For more information, visit http://www.red.com/cinegear-2017.
  4. INTERNATIONAL CINEMATOGRAPHERS GUILD TO HOST THREE PANELS AT NAB SHOW (April 24-25)

GAME OF THRONES: Behind the Scenes with the Filmmakers
GHOST IN THE SHELL: Creating a Cyber Future
NEXT GENERATION IMAGE MAKING – Taking Content Creation to New Places

The panels, hosted by the International Cinematographers Guild (IATSE, ICG Local 600) at the NAB Show (April 22-27), will examine advanced and future imaging techniques, including the use of machine intelligence in pre- and post-production. The discussions will be set within the context of high-profile consumer content, including Game of Thrones and Ghost in the Shell. All sessions will take place at the Las Vegas Convention Center.

Monday, April 24, 1:00 PM: Ghost in the Shell: Creating a Parallel Cyber Future, produced in collaboration with the ICG. Location: South Hall Upper Room S220. Ghost in the Shell is a sci-fi thriller about cyber-enhanced soldier Major (Scarlett Johansson), her battles against hacker terrorists and her hunt for her stolen identity and its perpetrators. Combining artistry with cutting-edge innovation, the filmmakers designed unique tools and systems, including an original, anime-inspired 28-color palette; custom LED lights; a video-photogrammetry rig for 3D video inserts; and a DCI P3 workflow from camera through color finishing and deliverables. See clips and hear stories about the making of this visually groundbreaking motion picture based on the original Japanese manga. Panelists: Jess Hall, BSC, Director of Photography (Transcendence, Brideshead Revisited, Hot Fuzz); John Dykstra, Visual Effects Designer (Oscar winner: Spider-Man 2, Star Wars; Emmy winner: Battlestar Galactica); Michael Hatzer, DI Colorist, Technicolor; Justin Wagman, Post Production Supervisor. Moderator: David Geffner, Executive Editor of ICG Magazine.

Monday, April 24, 2:30 PM: Game of Thrones: Behind the Scenes with the Filmmakers. South Hall Upper Room S220. With its spectacular battles, dragons, reverberating hugs, and death-winnowing storylines, season 6 of Game of Thrones ratcheted up the tension and raised anticipation for an eventual, epic series conclusion. The season was shot simultaneously by director/cinematographer pairs rotating between stage and location (as is the custom), spotlighted a cast of hundreds and featured nearly 1,900 visual effects shots. Learn how the filmmakers applied the Game of Thrones naturalistic lighting style to new locations and storylines, how they chose to block and edit the emotional turning points, and how they harnessed postproduction to visually unify the footage. See some great footage and understand why, in 2016, Game of Thrones drew an amazing 25.1 million viewers across all major platforms. Panelists: Bernadette Caulfield, Executive Producer; Greg Spence, Producer; Anette Haellmigk, Director of Photography (Emmy nominee: Game of Thrones); Jonathan Freeman, Director of Photography (Emmy winner: Boardwalk Empire; Emmy nominee: Game of Thrones). Moderator: David Geffner, Executive Editor of ICG Magazine.

Tuesday, April 25, 10:30-11:30 a.m.: Next Generation Image Making – Taking Content Creation to New Places. North Hall, Room 251. Light field, volumetric capture, computational photography, real-time rendering, and generative imaging have the potential to transform image making for all forms of content. Already, they are contributing to the blurring lines between live action and computer-generated imaging, and between what takes place in preproduction, production and post. What is the science behind these new technologies and how do they work? What are their current limitations and promise? Glimpse what NAB’s Central and South Hall could look like in five to ten years… Panelists: Andrew Shulkind, Director of Photography (clients include Paramount, DreamWorks, Sony Pictures, Apple, Adidas, AT&T, Budweiser, Google, Old Spice and Samsung); Jon Karafin, CEO, Light Field Lab, Inc.; Gavin Miller, Head of Adobe Research; Steve Sullivan, General Manager, Holographic Imaging, Microsoft. Moderator: Michael Chambliss, technologist and business representative for the ICG, IATSE Local 600.
  5. Irvine, CA - January 20, 2017. RED, Atomos, G-Technology and Teradek will host the Digital Workflow House from Jan. 22-23 during the Sundance and Slamdance film festivals in Park City, Utah. The Digital Workflow House is a two-day event featuring exclusive discussions with creative and technical industry leaders, hands-on interaction with innovative technology, and networking opportunities for attendees. The Digital Workflow House will be open 10 a.m. to 4 p.m. on Jan. 22, and 10 a.m. to 7 p.m. on Jan. 23, at the Treasure Mountain Inn (255 Main Street). Registration is suggested, as the panels are expected to be popular and space is limited. Filmmakers interested in coming by the Digital Workflow House can RSVP here.

A schedule of events includes:

JANUARY 22
10-11 AM: FIELD OF VIEW (panel)
11:30 AM - 12:30 PM: WANDERING THE SPHERE: TELLING STORIES WITH 6 DEGREES OF FREEDOM (panel)
11:30 AM - 1 PM: DEMO OPEN HOUSE
1:30-2:30 PM: WOMEN IN FILM (panel)
3-4 PM: DEMO OPEN HOUSE

JANUARY 23
10-11 AM: THE AIMLESS GAZE: THE ROLE OF CINEMATOGRAPHY IN SPHERICAL VIDEO (panel)
11:30 AM - 1 PM: DEMO OPEN HOUSE
1:30-2:30 PM: THE ECONOMICS OF WORKFLOW (panel)
3-4 PM: DEMO OPEN HOUSE
5-7 PM: LOUNGE AT THE DIGITAL WORKFLOW HOUSE (networking event)
  6. First post on this forum. I must say, having never actually worked with film before, I'm glad I found this website. Very informative. DISCLAIMER: Never having worked with film, my knowledge comes entirely from the internet, no hands-on experience. So if I say anything that sounds absurd, just let me know.

Having always used digital cameras, I want to shoot a movie on Super 16mm film and have it finished on 35mm film. I've been trying to work out the workflow to get from the exposed negatives to that final print. I want to have it photochemically color timed, preferably without ever having a digital intermediate. Now, if I were shooting on 35mm, I would simply color time, make the internegative, and make copies from there, all analog. But with Super 16 there is the sticky problem of having to blow it up to 35mm, and from what I've read, there are many ways to go about this. There are several discussions already on this forum, but most of them are over 10 years old, and the technology seems to have changed rather significantly since then.

I have an idea for some possible workflows, but I don't know if they would actually work the way I want them to. One of them is to edit the 16mm film together, have it color timed, then optically blown up to 35mm, but I'm not sure if the colors would translate well (I've read conflicting statements, but some say that an optical printer can't reliably transmit the colors, meaning it might have to be retimed). If that were the case, I could have it edited, optically blown up to 35mm, then color timed, but that adds the cost of working with more 35mm in the process. For another option, and I wouldn't really mind this as long as I didn't have to digitally alter the colors, I could edit the 16mm film, color time it, then data scan it at 4K (not that much more expensive than 2K), then downscale it to 2K (or not, if printing 4K weren't much more expensive, but I don't know) and have it printed to 35mm film. The problem with that, however, is that after it was scanned, I don't know if you would have to digitally alter the colors, or if the direct scan can be printed back without any processing.

If it were the case that I would have to mess with the colors digitally anyway, then another thought was that I could edit the 16mm film, scan it without color timing, print it back to 35mm film, then photochemically color time that copy. But that might be absurd. I don't really know, but I feel like after it was scanned and then printed, there might be some information taken from the film that makes photochemical color timing less effective, since you're just working with what a digital printer put on it, not the original analog goodness.

That's a bit of a book, so I'll summarize my specific questions:
1. Assuming both processes were done properly, which would be less expensive: optical blow-up, or scanning then printing back to film? (I have no reference for the cost of digital printing or optical blow-up. As far as I know, in this day and age, one could be far cheaper than the other.)
2. Assuming both processes were done properly, which do you think would give the best results? (Knowing that I want a photochemical timing done.)
3. Will an optical printer transfer the timed colors properly, or would it have to be timed again?
4. Will a scan of an already color-timed print properly transfer the colors when printed back to film? Or would the colors still have to be digitally altered before printing?
5. Can an untimed scan that has been printed back to film still be photochemically color timed, or is that absurd?
6. Sort of related to the first question, but any reference as to how much printing 2K and 4K digital to film would cost? I can't find any information on the cost like you can with scanning. Also, specific costs of the optical blow-up.

I feel like some people are going to ask, "Why not just use a digital intermediate instead of photochemically color timing? It would give you much better results, and be cheaper," and they're probably right, but it's just a hands-on artistic thing. I'm relatively young, and grew up in a world that is entirely digital. Watching actual 35mm films at a theater is like a distant childhood memory, as most theaters have long been digital. And making movies with digital cameras is all I have ever done, but quite frankly, I'm at a point where I would like to create movies the same way my favorite movies from decades past were created, even if it slightly compromises visual clarity.
  7. Hi, I stopped shooting 4-5 years ago due to a sports injury, which led me to working in another field, but now I want to start shooting again and I'm feeling a bit behind on the tech and workflow, and rusty in my skills. I'm lucky enough to be visiting LA early next year and I'm looking at the ASC Masterclass. I'm concerned it's not really hands-on. Is it? Isn't it? I see they're using the Alexa and going through the most current workflows, etc., but is it mostly learning through demonstration, or are we able to get our hands on the tech and lights and set up some shots? Anyone who's done it got any reviews? What did you get out of it? Can anyone recommend an intermediate-to-advanced cinematography workshop that's very hands-on in or around LA? I don't often get over to the US, so I'd really like to grab the opportunity. Thanks.
  8. I'm shooting a music video mostly in 60p with lip sync, and want to be able to timecode-slate it so, for example, I can shoot a 15-second segment midway through the song and know exactly where to slot it into my edit. The tricky part, of course, is that my camera's framerate is 60p while the project framerate will be 24p. I won't have access to a proper professional slate with timecode (not that I'm sure it would help?), so I was planning on holding up a laptop running timecode from the song to use as a slate. The issue is, if I'm playing back the song on set at 250% with timecode so that it matches the 60p being recorded on my A7S, that timecode obviously won't match the A7S footage when everything is slowed down to 24p.

I'm wondering if my solution (though definitely not frame-accurate) is as follows:
- Export a video file of the song at 24p with a timecode overlay.
- Re-import that video into my editor and increase the speed by 250% so the timecode is sped up as well.
- Play back that sped-up video with sped-up audio/timecode on set.
- Slow everything down to 40% for editing and have a rough idea of where footage should fall in the timeline.

Any advice, or better methods of doing this? Note: I am using outdated editing software (FCP7) if that makes a difference to this workflow.
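For what it's worth, the arithmetic behind this scheme checks out. A rough Python sketch (the 60/24 numbers are taken from the post above; this ignores rounding to whole frames): the timecode photographed on the sped-up slate lands at exactly the same position once the 60p footage is conformed to 24p, so the slate reading can be used directly as the timeline position.

```python
import math

CAMERA_FPS = 60.0   # shooting framerate (A7S in 60p)
PROJECT_FPS = 24.0  # edit/conform framerate
SPEEDUP = CAMERA_FPS / PROJECT_FPS  # 2.5, i.e. the 250% on-set playback

def slate_reading(real_seconds):
    """Timecode shown on the sped-up slate video after `real_seconds` of
    on-set playback (the burned-in counter runs 2.5x faster than real time,
    so it always displays the song's true 24p position)."""
    return real_seconds * SPEEDUP

def conformed_position(real_seconds):
    """Timeline position of the frame shot at `real_seconds`, once the 60p
    footage is slowed to 40% (24/60), which stretches time by 2.5x."""
    return real_seconds / (PROJECT_FPS / CAMERA_FPS)

# The two agree for any moment of the take, so a slate reading can be
# dropped straight onto the 24p timeline at the timecode it shows.
for t in (0.0, 4.0, 15.5):
    assert math.isclose(slate_reading(t), conformed_position(t))
```

In other words, the method is sound in principle; the only error comes from reading the slate off a screen and from frame rounding, not from the speed math.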
  9. Award-Winning Technology Built by DigitalFilm Tree Experts. NAB BOOTH: SL9016 (AWS)

LOS ANGELES - Critique 4, developed by the people who brought you DigitalFilm Tree, will be unveiled at the 2016 NAB Show in Las Vegas, April 18-21. This new version of the popular cloud-collaboration software will be showcased at the Amazon Web Services (AWS) booth (SL9016). Critique 4 is already utilized by award-winning productions such as Modern Family, The Simpsons and NCIS: Los Angeles, among others. "Critique has undergone a significant upgrade and offers many new features and security controls to take clients to the next level," says Critique CTO Chris Chen. "It is also the first time we have deployed the platform on AWS."

Technologically advanced, Critique 4 is a secure digital media asset management (MAM) platform with extensive features to support creative processes and production workflow for the media and entertainment space as well as enterprise. Built to be extremely easy to use, Critique facilitates collaboration through real-time chat, live annotations, and secure sharing over the internet to deliver productions on time and on budget. Real-time chat and drawing annotations are viewable across the Web and iOS - they even work with the new Apple Pencil for iPad Pro. Designed to improve workflow, the software facilitates every step from protected dailies screening to VFX workflows to post-production to distribution, while capitalizing on enterprise-level security to protect valuable assets.

Critique 4 was born of the minds of its executive team: Guillaume Aubuchon, a veteran in the production space having worked on such projects as Her, NCIS: LA and Angie Tribeca, and Chris Chen, a heavyweight in the production streaming space and the former CTO of DAX. With its ability to leverage DigitalFilm Tree's post-production facility, Critique is built to ensure it works in real-world media environments.

Critique's new relationship with AWS is key to version 4. "AWS is not only the largest cloud provider, but they are the cloud provider of choice in the M&E space," says Aubuchon, CIO of Critique. "Our infrastructure shift to AWS afforded us the ability to architect the software to leverage the range of services in the AWS cloud platform. It allowed us to build Critique 4 from scratch in a matter of mere months." One of the exciting new features of Critique 4 is its ability to index Amazon Simple Storage Service (Amazon S3), allowing companies to manage their own content inside Critique's award-winning interface. It also offers high-performance cloud MAM for simultaneous video and document management: users can collaborate with Critique's review, approval and annotation workflows not only for video but also for production documents including scripts, graphics and still images.

"Digital Rights Management (DRM) protection is rarely used, if at all, for unreleased content, which is arguably where it is needed the most," notes Chen. "Critique was designed to leverage DRM invisibly throughout its video distribution system on desktop, web, and mobile environments. This allows Critique to break through the legacy walled-garden approach, allowing a new level of flexibility in collaboration while maintaining security. But we do it in such a way where the users don't even know it's there."

The ability to share assets in this way expands its mobility, and Critique is available via web, phones, tablets and Apple TV. The video service is backed by a true CDN running multi-bit-rate video to prevent glitches on any platform. "Users can take advantage of Critique anywhere - in their office, living room, the subway, or even a plane," explains Chen. "And it will be true to the original media."

"One of the challenges of working offline is that people don't want to wait for content to download on their device before they leave," notes Chen. "They need to be able to grab and go." Critique's new Sync feature, a push-based system, downloads selected content to your device while you are in the office or at home, so it is ready to go when you are. For executives, there is an admin mode which permits assistants to specify content to push to an executive's device.

Other highlights of Critique 4 include:
- Storage, archiving and management of raw material
- Automatic transcoding of raw material into a proxy format for viewing
- Granular permissions on files, folders, and projects
- Easy-to-manage sharing functions for users outside the system, with the ability to time-limit and revoke/extend individual permissions
- Customizable watermarking on videos

While Critique was born in the creative and operations side of the media and entertainment market, it is extending to enterprise, small to medium-size businesses, publishing, education and government/military sectors. "The diligence and demands of working with studios, and our ability to succeed in that high-demand, high-output environment, has road-tested our video and document management and our enterprise-level security, making Critique a perfect fit for new tiers of users," says Chen.

This latest version of Critique is available now for a free 30-day trial (AWS usage fees apply). Pricing is extremely competitive, with 10, 20, 50 and 100 user levels starting as low as $39 per user. Enterprise-level contracts are available for larger projects and companies with multiple projects. The fee includes unlimited streaming of current content and 24/7 white-glove tech support. Apple TV, Apple iPad and iPhone apps are also included. For a nominal fee, users can add DRM, high-resolution cloud transcode and storage for camera raw and mezzanine files. For more information on Critique, visit www.critiquecloud.com, or follow us on Twitter.
  10. I was wondering if anyone had any advice on ARRIRAW workflows. I am getting ready to shoot a feature with the D-21, recording ARRIRAW externally to an S.Two OB-1. I am most likely looking at using DPX sequences for editing, but I was wondering if anyone had some advice in this area. Thanks in advance.
  11. Hello all, I am in my final year of studying film production and I am currently doing research about cinematography and color grading. I am trying to find out more about the collaboration/workflow between the DP and the colorist, and the issues regarding authority over the look. I am also looking into whether new, aspiring cinematographers should be their own colorists. I was wondering if you could please share your opinions about this from your experiences so far. As a cinematographer, how were your collaborations with colorists? (And vice versa.) What can DPs/colorists do to ensure they can work together successfully? If you are working as both a cinematographer and a colorist, why did you choose to have both roles, and how difficult was the transition from cinematography to color grading? Your help would be much appreciated!
  12. Hello all, For the current film I'm working on, I had to go through an odd process to do what I needed to do. First, I imported the footage into Premiere to trim it down and piece it all together. Then, I opened up After Effects and imported the project from Premiere into AE to do some edits that I could only do in that program. Then, after finishing these edits, I opened Premiere back up and imported this AE timeline into a timeline in Premiere. Basically, I did all this without ever having to export from a program.

Just some specs: I use Windows 7 with 8GB of RAM. Also, the video files I used were ProRes HQ. Everything turned out fine, but there is one inconvenience: when I play the final video in Premiere, it has extremely choppy playback (even at 1/4 quality). I did do some pretty heavy edits to it in After Effects, but my question is: is this the best way to go about doing something like this? Or should I have exported an H.264 file from AE, then imported it into Premiere? Thanks a lot! John
  13. Hello everyone, I have researched online and can't come to any solid conclusions about how sound design and/or mixing is done when you're working optically. For instance: I'm making a movie and seriously considering taking it the full photochemical route, i.e., shooting on 35mm scope, processing the film at Fotokem, getting a work print made (no DI), cutting the film on a flatbed, conforming the negative, timing the answer print, and striking a release print. The idea is to keep my movie completely off a computer.

But the thing I can't seem to wrap my brain around is the audio part of the process. How do I sync the separately and digitally recorded dialog to my work print? How do I mix in the music I want? Most importantly, the sound effects? I tend to have substantial sound design in my films, sometimes 150 tracks or more, and spend around 80% of my post-production process on sound. Is there a way to do this optically? Should I just go with a DI? I hear a lot of terms like sepmag and 35mm sound mag, but I'm not really sure what they are or how you edit with them. Thanks in advance for your help. Colin
  14. So I'm on a project that is based around multirotor helicopters. Needless to say, I have a lot of GoPro footage coming my way. If anyone has any advice or leads to good resources on matching GoPro footage and cinema log footage in the grade, it would be much appreciated! Here is what I've been capable of so far (ignore the last 18 minutes; it's black video from an encoding error): http://youtu.be/Isge70aoLtE
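Not an authority on GoPro's exact transfer curve, but the usual first step when matching flat GoPro footage to a cinema log source is to linearize both before grading. A minimal Python sketch, assuming the commonly quoted approximation of the Protune log curve (the 113/112 constants are an assumption, not an official spec; verify against your own grey-card footage before trusting them):

```python
import math

def protune_to_linear(encoded):
    """Invert the (approximate) Protune log curve: encoded in [0, 1] -> linear.

    Commonly quoted approximation: linear = (113**encoded - 1) / 112.
    Treat the constants as a starting point, not ground truth.
    """
    return (113.0 ** encoded - 1.0) / 112.0

def linear_to_protune(linear):
    """Forward curve, useful for round-tripping or building a 1D LUT."""
    return math.log(1.0 + 112.0 * linear, 113.0)

# Endpoints map to themselves, and the two curves round-trip cleanly:
assert protune_to_linear(0.0) == 0.0
assert abs(protune_to_linear(1.0) - 1.0) < 1e-9
assert abs(linear_to_protune(protune_to_linear(0.5)) - 0.5) < 1e-9
```

With both sources linearized (the cinema camera via its manufacturer's published log-to-linear formula), a simple gain/white-balance match gets you most of the way before any creative grade.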
  15. It looks absolutely awesome... but it's a bit pricey. I'm wanting to add a few extra ports to how I'm working as a data wrangler (I'm primarily an editor) and this Echo 15 looks like it may be twisting my arm. However, the price tag had me searching for other solutions. I found one other dock that had USB 3.0, eSATA, Thunderbolt & FireWire: the Akitio Thunder Dock (with a couple fewer USB 3.0 ports), and there's not much difference in price. If I were to choose between these two, the winner would be the Sonnet for sure. I thought about it and tried searching a little more... keeping in mind that I still own an Echo ExpressCard Pro, which connects via Thunderbolt. I use this mainly with an eSATA adapter. I tried to find a cheaper solution (we're all about saving money and we know it!) and I found this: the Elgato Thunderbolt Dock. It has 2 Thunderbolt ports, 3 USB 3.0, HDMI, Ethernet, and it's less than half the price. I know it doesn't have eSATA but... I have a question (at last, I got to the point!): if I connect my Echo ExpressCard into the spare Thunderbolt port of the Elgato (thus giving me eSATA connectivity), will transfer speeds be massively different compared to the Echo 15? Or does adding that ExpressCard screw things up a little? Overall, my suggested solution won't be a tidy little box like the Echo 15 is, but it should be a cheaper alternative, right? Am I being naive, or should I just shut up, bite the bullet and get an Echo 15? Any help or insight would be great. Help a newbie out... Thanks guys.
Links: SONNET http://www.sonnettech.com/product/echo15prothunderboltdock.html AKITIO http://www.amazon.co.uk/Akitio-Thunderbolt-Computer-Docking-Multilink/dp/B00INMUTUS/ref=sr_1_2?ie=UTF8&qid=1409246154&sr=8-2&keywords=Akitio+Thunder+Dock ELGATO http://www.amazon.co.uk/Elgato-Thunderbolt-Dock-Apple-MacBook/dp/B00HWR3XGW ECHO EXPRESSCARD http://www.amazon.co.uk/Sonnet-Echo-ExpressCard-Thunderbolt-Adapter/dp/B0080MQJJ6/ref=sr_1_1?s=computers&ie=UTF8&qid=1409249810&sr=1-1&keywords=echo+expresscard
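For what it's worth, the nominal link rates suggest the daisy-chain shouldn't hurt much for single-drive eSATA work. These are spec-sheet numbers assumed here, not benchmarks; real-world throughput is lower:

```python
# Nominal link rates in Mbit/s (spec-sheet values, not real throughput).
THUNDERBOLT_1 = 10_000  # first-gen Thunderbolt, per channel
ESATA_3G = 3_000        # 3 Gbit/s eSATA via the Echo ExpressCard
USB3 = 5_000            # per USB 3.0 port, shared controller in practice

share = ESATA_3G / THUNDERBOLT_1
print(f"eSATA needs at most {share:.0%} of the Thunderbolt link")
print(f"{THUNDERBOLT_1 - ESATA_3G} Mbit/s nominally left for the dock's other ports")
```

So in theory a single eSATA drive behind the Elgato leaves plenty of headroom; the chain only gets tight when several ports are moving data at once.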
  16. International Cinematographers Guild to Field Workflow Panel at Cine Gear Los Angeles, June 3, 2014…The International Cinematographers Guild (ICG, IATSE Local 600) is producing a panel discussion on “New Workflow Choices and the Director of Photography” at the 2014 Cine Gear Expo in the Sherry Lansing Theatre on the Paramount Pictures lot this Saturday, June 7, from 10:15 to 11:30. The panelists are Paul Cameron, ASC, Director of Photography (Total Recall, Dead Man Down); Mark Doering-Powell, Director of Photography (ABC’s Super Fun Night; CW’s Everybody Hates Chris); Steven Poster, ASC, Director of Photography, National President, ICG (Donnie Darko; Someone to Watch Over Me); Andrew Turman, Director of Photography (Target, Lexus); Michele deLorimier, Digital Imaging Technician (DIT) & Phantom Tech (Cover Girl’s Pink; Cream Reunion at the Royal Albert Hall); Bobby Maruvada, Digital Imaging Technician (DI colorist, Project X, After Earth); and E. Gunnar Mortensen, Focus Puller (Amityville Horror: Locked In, Cooties). Digital Imaging Technician Joshua Gollish (Skyfall, Prisoners) will moderate. The panel will discuss how Directors of Photography and their crews encounter an ever-expanding range of approaches to color management and dailies. These new paths profoundly impact the DP’s artistic style and every aspect of their job, including on-set operations, the creative environment, and communication with the director, talent, editorial and postproduction. This group of leading DPs and other camera crew members will examine how their workflow choices on recent, high-profile motion picture and TV projects impacted the filmmaking process, the craft and the product. http://www.cinegearexpo.com/83/ Press Contact: Rick Markovitz Weissman/Markovitz Communications rick@publicity4all.com 818-760-8995
  17. Here is a tutorial video I posted on YouTube on how to get timecode onto DSLR clips, plus other post-production workflow tips. Mac-centric! Or you can read my blog post at http://remoteaccesspost.com/adding-timecode-to-dslr-footage/ Hope you find it useful!
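Under the hood, tools that stamp timecode onto clips are doing simple frame arithmetic. A minimal sketch of that conversion (non-drop-frame only; drop-frame 29.97 needs extra handling this deliberately omits):

```python
# Minimal frame-number <-> timecode conversion, non-drop-frame only.
def frames_to_tc(frame, fps=24):
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def tc_to_frames(tc, fps=24):
    hh, mm, ss, ff = map(int, tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

print(frames_to_tc(86400))  # one hour of 24 fps material -> 01:00:00:00
```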
  18. Hi guys, After watching the excellent THR cinematographers roundtable, I feel compelled to draw more attention to one of the key issues they discuss: the lack of control over the final image in today's modern world of different screens and projector standards. I've been lucky enough to have a few films I've shot recently coloured by Rob Pizzey and Adam Glasman at Co3 in London, and we've achieved results I've been delighted with in their colouring suite, both in the P3 space on their projector and in Rec709 on their Dolby monitors. The problem, however, is when we output to home deliverables in Rec709 H.264: the image is so drastically different on my home MacBook, iPad and TV that when sharing it I'm constantly having to explain and apologise for the final quality. Personally I've found this much less of a problem with DCPs in theatres, but modern displays at home are surely capable of better. It strikes me that Rec709 (created in 1990 for TV) surely can't still be the answer for grading to a standard that will look good across all devices. Of course the issue is in two parts here, but is there a push anywhere to standardise a modern colour space and standard across hardware, one that can reap the benefits of more dynamic range, contrast and better colour representation? I've heard about Rec2020, but as far as I can see this seems to be just for 4K TVs? Would love to hear your thoughts, and hopefully there's an answer out there in the works to ensure a more modern and standardised presentation of the work we do in people's homes and on their devices. Eben
  19. Hello there, I am a freshman undergrad in film school and recently got the position of DIT and editor on an upcoming short. The short is a collaborative effort of one of our school clubs, where we managed to get together the funds to shoot with an Epic. During the shoot, which is coming up this spring, I am to apprentice under a professional DIT and learn the ropes, eventually taking over once I am comfortable with the procedures and so forth. I am also going to be grading and editing the RAW footage. So before my apprenticeship begins, I would like to have a firm knowledge of what I am getting myself into. I was curious if anyone had any articles regarding the DIT workflow and the responsibilities of the job. I am also looking for any articles and info on grading RAW footage (which I have experience with in terms of stills) and the RAW editing workflow. Despite going into this without any experience as a DIT, I am a very adequate editor in Final Cut and Premiere; just to let you know I’m not entirely unqualified for the position. Summary: I am looking for all information, advice, and articles regarding being a DIT on a RED Epic, and grading and editing RAW footage. Thanks!
  20. Hi there, I am new to the CinemaDNG workflow and I just started working with the Ikonoskop A-Cam dII. Although I have been doing a lot of research on the subject, I would like some advice from someone with experience in the field. My question is the following: for projects that do not need massive color correction, would it be acceptable to throw away the DNG files after transcoding them to ProRes 4444 or ProRes 422 HQ, and use these ProRes files as the master files? What would the disadvantages be in this case, if any? Thanks
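The storage savings that motivate the question are easy to ballpark. All figures here are rough assumptions (uncompressed 12-bit frames for the DNG side, commonly quoted ProRes target bitrates), not measurements from the A-Cam dII:

```python
# Ballpark storage per minute of 1080p24 footage.
fps = 24
dng_frame_mb = 1920 * 1080 * 12 / 8 / 1e6      # ~3.1 MB per 12-bit raw frame
dng_gb_min = dng_frame_mb * fps * 60 / 1000    # CinemaDNG frame sequence
prores_4444_gb_min = 330 / 8 * 60 / 1000       # assumed ~330 Mbit/s target
prores_hq_gb_min = 176 / 8 * 60 / 1000         # assumed ~176 Mbit/s at 24 fps
print(f"CinemaDNG   ~{dng_gb_min:.2f} GB/min")
print(f"ProRes 4444 ~{prores_4444_gb_min:.2f} GB/min")
print(f"ProRes HQ   ~{prores_hq_gb_min:.2f} GB/min")
```

The trade-off behind those savings: a ProRes master bakes in the debayer and white balance, so the raw latitude is gone for good once the DNGs are deleted.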
  21. Gentlemen (and ladies), as some of you know, I have shot on both film and digital. My first short was shot on Super 16 way back in 2001, and I had a 35mm blowup made for festivals. A little trailer I cut in December from an HD scan of our blowup: http://www.youtube.com/watch?feature=player_embedded&v=tMFRPMHTiZ0#! Over the last three years I have shot a number of shorts, some using DSLRs and some 8mm (a format I love). Right now I am in talks with some people about a feature film project, and while everyone wants to shoot on the RED EPIC, just because the EPIC is cool and easy and cheap (uh, yeah, ok), I want to shoot one of the three acts on film. The story lends itself to this, as we need a different look for each act. I also want to be able to say "we shot on film," not only because it's an interesting talking point, but also because before long I think it won't be possible. What I need to know is this: what would you advise as the cheapest solution and workflow for shooting on Super 16? 35mm is out, naturally, due to cost. I like the look of Super 16, and it's easy to find lenses in the NYC area. Back in 2001 when I shot my first short, the negative was processed and given a one-light transfer with timecode, and we edited in Avid. Then the negative was cut A/B style, and we bypassed creating a positive (because I was out of money) and made a 35mm print from the A/B rolls. (For a feature we clearly would not have been able to skip the positive step.) It is now my understanding that traditional negative cutting is almost never used today. ??? The people involved with this planned film are almost 100% RED and 5D people and, as you can imagine, mostly low-low-budgeters. Those that have shot on film were not in any way involved with post. So I come to you for ideas! I hear things like, "It's too expensive to scan all that negative and color correct it." So how about photochemically, like it used to be done? "No, that's too expensive too." Etc., etc. Thanks for your help.
  22. Hi. Most of my narrative films are shot on RED cameras. I have a job where we might shoot on an Arri Alexa. I heard that post-production and color correction are easier with the Alexa. Anyone here with solid experience of the Alexa workflow? :-) Best, Henrik A. Meyer. www.henrikameyer.com
  23. FotoKem and SPY (a FotoKem company) combined in-house talent and key post-production services for the Sundance-winning film Fruitvale, written and directed by Ryan Coogler with cinematography by Rachel Morrison. The film won the festival’s coveted Grand Jury Prize and Audience Award for a drama, and was acquired by The Weinstein Company soon after its premiere screening. Providing end-to-end services for the Super 16 mm project, FotoKem and SPY worked closely with the filmmakers from start to finish. Fruitvale was processed and transferred at FotoKem in Burbank. The files were delivered to SPY’s headquarters in San Francisco, where colorist Chris Martin color graded the film with Coogler and Morrison. The two facilities are securely connected by a high-speed network offering real-time interface capabilities between the locations to provide the creative community with easy access to the full breadth of post-production services that FotoKem and its companies offer. Fruitvale follows the final day of 22-year-old Oscar Grant, who was gunned down by a police officer at the Fruitvale stop of an Oakland transit line. The tragedy was caught on video, and the incident made national headlines. The movie was shot on location in San Francisco. The Super 16 footage was then shipped to Burbank, scanned to HDCAM SR, and conformed at SPY. “When I met with Ryan and Rachel prior to production, they were committed to shooting film. They felt strongly that film would convey the emotion they wanted to draw from Oscar's character,” recalls Martin. “Our color grade supported the quality of grain and tonal palette that the Super 16 format brought to the story. Specifically, we approached the process as if we were timing in a film laboratory, avoiding the feeling of a digital grade. Building in contrast and adding weight to the mid-tones, rather than overcooking the shadows and highlights, brought a very specific emotional element to the film. 
The result is a feeling of intimacy that holds the personality of the film and supports Rachel’s amazing camera work. Obviously the audience at Sundance agreed!” Martin notes that Fruitvale can be divided into two worlds - the quiet intimate world of Oscar and his family, and the larger institutional world where Oscar encounters conflict. “Oscar's world tends to be defined by warmer scenes with more intimate contrast while the institutional scenes are seen with a wider lens and more mixed lighting, embracing an observational feel,” says Martin. “So while we maintained the same approach for all the scenes, there's a wonderful dichotomy between the two worlds.” “We are honored to have collaborated with the talented team behind Fruitvale. At FotoKem, we continually strive to offer independent filmmakers the workflow expertise that they need, and invest in the talent and technology to support their creative visions,” says FotoKem’s Mike Brodersen.