Showing results for tags 'workflow'.

  1. Internet Archive search: The Quantel Guide to Digital Intermediate, on film scanning and production in 2003. How have things changed over the last two decades compared with what is presented in the publication, other than going from 2K to 4K as a standard? (Or is 4K even the standard nowadays?)
  2. As a long-time GH4 user, I’ll be looking to upgrade eventually to the BMPCC 4K. In post production, I’ve always loved working with ProRes for both editing and exporting. Since the BMPCC 4K can acquire all forms of ProRes internally, it seems to be an ideal choice for an end-to-end ProRes workflow (acquire > edit > finish in the same format). Outside of increased file size, are there any downsides to this approach that I should be aware of? Extra info: I primarily edit in both FCPX and Premiere on my MacBook Pro.
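A minimal sketch of one sanity check for an all-ProRes pipeline like the one described above: confirming that the camera originals really carry a ProRes video stream before edit and finish. It assumes ffprobe (part of FFmpeg) is installed and that the clips sit in a folder named footage/; both are placeholders, not details from the post.

```python
# Sketch: verify that every clip in a folder is ProRes before ingest.
# Assumes ffprobe (from FFmpeg) is on the PATH; FOOTAGE_DIR is a placeholder.
import pathlib
import subprocess

FOOTAGE_DIR = pathlib.Path("footage")  # hypothetical folder of camera originals

def video_codec(clip: pathlib.Path) -> str:
    """Return the codec name of the first video stream, e.g. 'prores'."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", str(clip)],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

for clip in sorted(FOOTAGE_DIR.glob("*.mov")):
    codec = video_codec(clip)
    flag = "OK" if codec == "prores" else "NOT ProRes"
    print(f"{clip.name}: {codec} ({flag})")
```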
  3. Panavision and its family of companies return to the EnergaCamerimage International Film Festival from November 9-16 in Torun, Poland, with an immersive, end-to-end experience for attendees. Torun’s newly renovated Karczma Damroki facility, across the street from the Jordanki Festival Center, will be transformed into an interactive festival space called PanaVillage and will showcase the integrated technologies, products, and services from Panavision, Panalux, Light Iron, LEE Filters, and Direct Digital. “We are giving filmmakers the opportunity to experience our broad array of tools in a simulated production environment,” says Kim Snyder, Panavision president and CEO. “Our global team and product experts look forward to engaging with attendees in a hands-on experience with our end-to-end offerings.” At the PanaVillage, guests will interact with Panavision’s complete ecosystem of cameras and lenses, lighting, filters, gels, grip, and remote systems. Visitors will be able to fully control and monitor an 8K Millennium DXL2 large-format camera mounted on a SuperTechno 30 crane, using an innovative new wireless fiber technology with a range of more than a kilometer (0.6 miles). Mounted in front of a Primo 70 lens, Panavision’s LCND filter offers six stops of variable density. Guests can remotely adjust the camera, iris, and LCND filter while monitoring 4K video inside the PanaVillage on a LINK HDR Cart. Panavision and Light Iron’s LINK HDR system, which debuted at Cine Gear Expo Los Angeles earlier this year, is now being utilized on feature and episodic projects. PanaVillage guests will experience how the Panavision LINK HDR cart and Light Iron LINK HDR dailies and finishing services put the power of creating HDR images into the hands of all creatives – including cinematographers, editors, and colorists – throughout the entire imaging chain. While inside the PanaVillage, attendees will have the opportunity to experiment and craft unique looks using the DXL2 camera and the more compact DXL-M system. A selection of large format lenses, including the T1.4 Panaspeeds, will be available to frame the scene. Guests can illuminate the scene and control the lighting with a variety of Panalux and third-party options. LEE Filters’ full collection of lighting gels, including the expanded range of Zircon LED gels, will be available to further modify the scene. PanaVillage visitors will also encounter LEE Filters’ ProGlass CINE IRND range of neutral-density filters and the LEE100, a lightweight, high-performing 100mm photographic filter mount system. Experts from Direct Digital will be on hand to discuss stills and motion rental services. Guests looking to purchase consumables and merchandise can find the Panastore in the main Jordanki Festival Center. Additional micro-workshops hosted by the Panavision group will be programmed throughout the week and real-time details will be shared on Panavision’s social media accounts. This year, Panavision is proud to sponsor a cinematic retrospective from EnergaCamerimage Lifetime Achievement Award recipient John Bailey, ASC. On November 12 at 15:00h at CinemaCity, Bailey will present insights from his impressive career, spanning more than 40 years with credits including Ordinary People, The Big Chill, In the Line of Fire, American Gigolo, A Walk in the Woods, Mishima, and The Accidental Tourist. 
Panavision’s long-standing relationship with Bailey resulted in the 2004 creation of the AWZ2, or the “Bailey Zoom,” the first modern zoom to use anamorphic elements in front of the lens. For more information about Panavision’s end-to-end offerings, visit www.panavision.com. To learn more about EnergaCamerimage, go to https://camerimage.pl/en/.
  4. BURBANK, CA (May 1, 2019) – There’s nothing small about the bicoastal post-production workflow provided by FotoKem for Universal Pictures’ and Legendary Pictures’ Little. FotoKem’s Atlanta and Burbank facilities supported the production from digital dailies through finishing with a full ACES finish for this fantasy comedy. From blockbuster producer Will Packer (Girls Trip, Night School, the Ride Along franchise) and director and co-writer Tina Gordon (Peeples, Drumline), Little tells the story of a tech mogul (Girls Trip’s Regina Hall) who is transformed into a 13-year-old version of herself (Marsai Martin), and must rely on her long-suffering assistant (Insecure’s Issa Rae), just as the future of her company is on the line. Martin, who stars in the TV series Black-ish, had the idea for the film when she was 10 and acts as an executive producer on the film. She is the youngest person to hold that title on a major Hollywood production. Principal photography for Little took place last summer in the Atlanta area. FotoKem’s Atlanta location provided digital dailies with looks developed by FotoKem colorist Alastor Arnold alongside cinematographer Greg Gardiner (Girls Trip, Night School) who shot with Sony F55 cameras. “Greg likes a super-clean look, which we based on Sony color science, with a warm and cool variant and a standard hero LUT,” says Arnold. “He creates the style of every scene with his lighting and photography. We wanted to maximize his out-of-the-camera look and pass it through to the grading process.” FotoKem responded to the sharp growth of production in Georgia, and entered the Atlanta market five years ago to offer on-the-ground support for creatives. “FotoKem Atlanta is an extension of our Burbank team with colorists and operations staff to provide the upfront workflow required for file-based dailies,” says Senior Vice President Tom Vice of FotoKem’s Creative Services Division. “Atlanta is an exciting place to be, and we’re thrilled to be part of that community.” When editor David Moritz and the editorial team moved to Los Angeles, FotoKem sent EDLs to its nextLAB dailies platform, the facility’s proprietary digital file management system, where shots for VFX vendors were transcoded as ACES EXR files with full color metadata. Non-VFX shots were also automatically pulled from nextLAB for conform. The online was completed in Resolve. The DI and the film conform happened concurrently, with Arnold and Gardiner working together daily. “We had a full ACES pipeline, with high dynamic range and high bit rate, which both Greg and I liked,” Arnold says. “The film has a punchy, crisp chromatic look, but it’s not too contemporary in style or hyper-pushed. It’s clean and naturalistic with an extra chroma punch.” Gordon was also a key part of the collaboration, playing an active role in the DI, working closely with Gardiner to craft the images. “She really got into the color aspect of the workflow,” notes Arnold. “Of course, she had a vision for the movie and fully embraced the way that color impacts the story during the DI process.” Arnold’s first pass was for the theatrical grade and the second for the HDR10 grade. “What I like about ACES is the simplicity of transforming to different color spaces and working environments. And the HDR grade was a quicker process,” he says. 
“HDR is increasingly part of our deliverables, and we’re seeing a lot more ACES workflows lately, including work on trailers.” FotoKem’s deliverables included a DCP, DCDM and DSM for the theatrical release; separations and .j2k files for HDR10 archiving; and ProRes QuickTimes for QC.
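The ACES pipeline mentioned above is a color-management framework rather than published code; as a loose, hedged illustration of the kind of transform it implies, here is a minimal PyOpenColorIO (v2) sketch that renders scene-linear working-space values through an output transform. The config path and the color-space names ("ACES - ACEScg", "Output - Rec.709") are assumptions based on common ACES OCIO configs, not details from the article.

```python
# Minimal sketch of an ACES-style output transform with OpenColorIO (v2 Python bindings).
# The config path and color-space names are placeholders for whatever ACES config
# a show actually uses; nothing here comes from the article itself.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")  # hypothetical ACES OCIO config

# Processor from the scene-linear working space to a viewing/output space.
processor = config.getProcessor("ACES - ACEScg", "Output - Rec.709")
cpu = processor.getDefaultCPUProcessor()

# Transform a few scene-linear RGB samples to display-referred values.
for label, rgb in [("mid grey", [0.18, 0.18, 0.18]),
                   ("warm highlight", [1.00, 0.50, 0.25]),
                   ("cool shadow", [0.02, 0.03, 0.05])]:
    print(label, cpu.applyRGB(rgb))
```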
  5. RED Digital Cinema® released its RED R3D® SDK and accompanying REDCINE-X PRO® software with accelerated decode and debayering on NVIDIA CUDA® platforms. By offloading the compute-intensive decoding and debayering of RED R3D files onto one or more NVIDIA GPUs, real-time playback, editing, and color grading of 8K footage are now possible. Benefits and efficiencies of this new software-hardware combination during the post-production process include:
     • 8K real-time playback performance at 30 fps or greater
     • Up to 10x faster transcoding, depending on the format and content
     • Improved efficiency and quality control within the content review process
     • Creative freedom using flexible R3D files instead of proxy files
     8K performance is available with NVIDIA Quadro® RTX™ 6000 and 8000, GeForce® RTX™ 2080 Ti, and TITAN RTX™ GPUs when coupled with a moderately configured PC. Creators can achieve additional performance improvements with multi-GPU configurations and may see noticeable gains even with older NVIDIA GPUs. Also, new NVIDIA RTX laptops from the world’s leading computer manufacturers, including Razer, Acer, Alienware, ASUS, Dell, Gigabyte, HP, Lenovo, MSI and Samsung, provide real-time playback at up to 8K and offer flexibility in choosing the right tools to fit a variety of budgets. Support from major NLEs and other SDK integrators is expected soon.
  6. Saban Capital Acquisition Corp. (NASDAQ:SCAC) (“Saban Capital Acquisition Corp.”), a publicly traded special purpose acquisition company, Panavision Inc. ("Panavision"), and Sim Video International Inc. ("Sim") announced today that the companies have entered into a definitive business combination agreement to create a premier global provider of end-to-end production and post-production services to the entertainment industry. The combined company will be well positioned to capitalize on the continued growth of content production spending and enhance the scope of service offerings to its customers. Under the terms of the business combination agreement, Panavision and Sim will become wholly-owned subsidiaries of Saban Capital Acquisition Corp. Immediately following the proposed transactions, Saban Capital Acquisition Corp. intends to change its name to Panavision Holdings Inc. (the “Company”) and is expected to continue to trade on the Nasdaq stock exchange. Headquartered in Woodland Hills, California, Panavision is an iconic designer, manufacturer and provider of high precision optics and camera technology for the entertainment industry and a leading global provider of production-critical equipment and services. Headquartered in Toronto, Canada, Sim is a leading provider of production and post-production solutions with facilities in Los Angeles, Vancouver, Atlanta, New York and Toronto. The transaction reflects a valuation for the combined entity of $622 million (inclusive of debt) or approximately 5.9x fiscal year 2018 estimated Adjusted EBITDA. The cash component of the purchase price to be paid to the equity holders of Panavision and Sim will be funded by Saban Capital Acquisition Corp.’s cash in trust, which is approximately $250 million, a $55 million private placement of common stock at $10.00 per share secured from a mix of premier institutional investors as well as an affiliate of Saban Sponsor LLC and newly raised debt financing. Upon the closing of the proposed transaction, Kim Snyder, President and Chief Executive Officer of Panavision will serve as Chairman and Chief Executive Officer, and Bill Roberts, Chief Financial Officer of Panavision, will serve in that role for the combined company. “We are excited to partner with Kim along with the Panavision and Sim teams to capitalize on the explosive growth in content spending,” commented Haim Saban, Chairman of Saban Capital Acquisition Corp. He continued, “Advancements in technology and the emergence of streaming have fundamentally changed how consumers watch and discover content. This is driving significant growth in the market for production and post-production services. This secular trend creates a tremendous opportunity for Panavision to leverage its leading technology and pursue opportunistic acquisitions to grow in a manner that is agnostic to the content creator and distribution channel.” Adam Chesnoff, President and Chief Executive Officer of Saban Capital Acquisition Corp., commented, “This transaction creates a leading global platform ideally positioned to capitalize on the rapid growth in content production. The combination of these two companies will create the foremost provider of end-to-end production and post-production services. Combining this platform with Saban’s wide-ranging global media relationships, experience in production, and successful track record of creating value for its partners, will position the Company to accelerate growth and pursue complementary acquisitions. 
We are excited about the potential.” “For nearly 65 years, Panavision has proudly served the entertainment industry providing cutting-edge equipment and exemplary service to support the creative vision of our customers,” says Kim Snyder, Chairman and CEO of the combined company. “This acquisition will leverage the best of Panavision’s and Sim’s resources by providing comprehensive products and services to best address the ever-adapting needs of content creators globally. These complementary companies subscribe to the same strategic vision: to support our customers as the category-defining provider of end-to-end production and post-production services.” “Combining the talent and integrated services of Sim with two of the biggest names in the business, Panavision and Saban, will accelerate our strategic plan,” added James Haggarty, President and CEO of Sim. “The resulting scale of the new combined enterprise will better serve our clients and help shape the content-creation landscape,” continued Haggarty. The respective boards of directors of Saban Capital Acquisition Corp., Panavision and Sim have unanimously approved the proposed transactions. Completion of the proposed transactions is subject to Saban Capital Acquisition Corp. stockholder approval, certain regulatory approvals and other customary closing conditions. The parties expect that the proposed transactions will be completed in the first quarter of 2019. For additional information on the proposed transaction, see Saban Capital Acquisition Corp.’s Current Report on Form 8-K, which will be filed promptly and can be obtained at the website of the U.S. Securities and Exchange Commission (“SEC”) at www.sec.gov. Deutsche Bank Securities Inc. and Goldman Sachs & Co. LLC are serving as financial advisors, capital markets advisors and private placement agents, and Skadden, Arps, Slate, Meagher & Flom LLP and Dentons Canada LLP are serving as legal advisors to Saban Capital Acquisition Corp. Houlihan Lokey, Inc. is serving as financial advisor, Citi is serving as capital markets advisor, and Kirkland & Ellis LLP and Osler, Hoskin & Harcourt LLP are serving as legal advisors to Panavision. Marckenz Group Capital Advisors is serving as financial advisor and Stikeman Elliott LLP is serving as legal advisor to Sim.
  7. Hi, I've recently entered pre-production on a low-budget feature I recently finished writing; now storyboarding awaits. I'd like to shoot the entire storyboard on set. I've been looking into the Artemis app. It would be great to use the app to find the right focal length for each shot and add that information to each storyboard picture. Does anyone know how the metadata is stored in Artemis if you sync it to Dropbox? It would be great to add the pictures and metadata into something like StudioBinder or Shot Lister, so that you know exactly what lens to use for each specific shot. It would be a huge timesaver, which would suit us perfectly considering this'll be a run-and-gun type production. $30 is a cheap investment if you use the app a lot, but it's expensive (for me at least) to just try it out, so I thought I'd check in with you guys before I decide. How would you go about storyboarding a low-budget feature as a DP if you had access to the set and actors in pre-prod? Drawing the entire storyboard would be nice, but I don't think it'd be the most efficient approach, and I also feel that being able to go through the entire script on set with the actors while doing the storyboard would be great for both me as DP/director and the actors. Any thoughts or ideas would be much appreciated! Thanks, Patrik
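No claim here about how Artemis actually stores its metadata (that is the open question); purely as a hypothetical sketch, if the exported stills carried the focal length in standard EXIF fields, a short script like this could collect it into a CSV for import into a shot-listing tool. Every path and field name below is an assumption.

```python
# Hypothetical sketch: collect focal-length EXIF from exported storyboard stills
# into a CSV shot list. Assumes the app writes standard EXIF (unverified) and
# that Pillow is installed (pip install Pillow).
import csv
import pathlib
from PIL import Image, ExifTags

STILLS_DIR = pathlib.Path("Dropbox/Artemis")            # placeholder path
TAG_IDS = {name: tag for tag, name in ExifTags.TAGS.items()}  # name -> tag id

with open("shot_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["file", "focal_length_mm"])
    for still in sorted(STILLS_DIR.glob("*.jpg")):
        exif = Image.open(still).getexif()
        focal = exif.get(TAG_IDS.get("FocalLength"))    # None if the tag is absent
        writer.writerow([still.name, focal])
```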
  8. RED to Host Panels Featuring Henry Braham, BSC, Dean Devlin and Red Bull Media House Cine Gear attendees can find RED Digital Cinema at booth S301 in Stage 2 of Paramount Studios at Cine Gear Expo 2017 from June 2-3. RED’s visitors can interact with a range of DSMC2™ cameras, including SCARLET-W™, RED EPIC-W™, WEAPON® 8K S35 and WEAPON 8K VV. Additionally, RED will be demonstrating the latest workflow options - covering their new image processing pipeline (IPP2), HDR, and 8K. RED will be joined in their booth by a wide variety of manufacturers of lenses, camera modules and accessories including RT Motion, Kippertie, foolcontrol, and Offhollywood. RED will also take attendees behind the scenes with two panels in Paramount’s Sherry Lansing Theater: June 2 at 12:45pm: Director Nicholas Schrunk and key members of Red Bull Media House’s production team will discuss the making of the feature documentary Blood Road. Blood Road follows the journey of ultra-endurance mountain bike athlete Rebecca Rusch and her Vietnamese riding partner, Huyen Nguyen, as they pedal 1,200 miles along the infamous Ho Chi Minh Trail. Their purpose was to reach the crash site and final resting place of Rusch’s father, a U.S. Air Force pilot shot down over Laos. Panelists will discuss the creative and technical process, and how this incredibly demanding story came together visually. Moderated by Andrew Fish of American Cinematographer. June 3 at 11:45am: Guardians of the Galaxy Vol. 2 director of photography Henry Braham, BSC will talk about filmmakers’ escalating preference for shooting with larger format digital cameras. Braham shot Guardians of the Galaxy Vol. 2 entirely on the RED WEAPON 8K VV. He will be joined by well-known producer, actor, and writer Dean Devlin. Devlin will discuss how new formats are driving an evolution in filmmaking for the big screen and touch on his experience using RED including WEAPON 8K S35. Moderated by Carolyn Giardina of The Hollywood Reporter. All of RED’s cameras offer superior image quality, incredible dynamic range, and simultaneous recording of REDCODE® RAW and Apple ProRes or Avid DNxHR/HD, combined with cutting-edge performance in a compact, lightweight, and modular design. For more information, visit http://www.red.com/cinegear-2017.
  9. INTERNATIONAL CINEMATOGRAPHERS GUILD TO HOST THREE PANELS AT NAB SHOW (April 24-25)
     GAME OF THRONES: Behind the Scenes with the Filmmakers
     GHOST IN THE SHELL: Creating a Cyber Future
     NEXT GENERATION IMAGE MAKING – Taking Content Creation to New Places
     The panels, hosted by the International Cinematographers Guild (IATSE, ICG Local 600) at the NAB Show (April 22-27), will examine advanced and future imaging techniques, including the use of machine intelligence in pre and post production. The discussions will be set within the context of high-profile consumer content including Game of Thrones and Ghost in the Shell. All sessions will take place at the Las Vegas Convention Center.
     Monday, April 24, 1:00 PM, South Hall Upper Room S220
     Ghost in the Shell: Creating a Parallel Cyber Future (produced in collaboration with the ICG). Ghost in the Shell is a sci-fi thriller about cyber-enhanced soldier Major (Scarlett Johansson), her battles against hacker terrorists and her hunt for her stolen identity and its perpetrators. Combining artistry with cutting-edge innovation, the filmmakers designed unique tools and systems, including an original, anime-inspired 28-color palette; custom LED lights; a video-photogrammetry rig for 3D video inserts; and a DCI P3 workflow from camera through color finishing and deliverables. See clips and hear stories about the making of this visually groundbreaking motion picture based on the original Japanese manga. Panelists: Jess Hall, BSC, Director of Photography (Transcendence, Brideshead Revisited, Hot Fuzz); John Dykstra, Visual Effects Designer (Oscar winner: Spider-Man 2, Star Wars; Emmy winner: Battlestar Galactica); Michael Hatzer, DI Colorist, Technicolor; Justin Wagman, Post Production Supervisor. Moderator: David Geffner, Executive Editor of ICG Magazine.
     Monday, April 24, 2:30 PM, South Hall Upper Room S220
     Game of Thrones: Behind the Scenes with the Filmmakers. With its spectacular battles, dragons, reverberating hugs, and death-winnowing storylines, season 6 of Game of Thrones ratcheted up the tension and raised anticipation for an eventual, epic series conclusion. The season was shot simultaneously by director/cinematographer pairs rotating between stage and location (as is the custom), spotlighted a cast of hundreds and featured nearly 1,900 visual effects shots. Learn how the filmmakers applied the Game of Thrones naturalistic lighting style to new locations and storylines, how they chose to block and edit the emotional turning points, and how they harnessed postproduction to visually unify the footage. See some great footage and understand why, in 2016, Game of Thrones drew an amazing 25.1 million viewers across all major platforms. Panelists: Bernadette Caulfield, Executive Producer; Greg Spence, Producer; Anette Haellmigk, Director of Photography (Emmy nominee: Game of Thrones); Jonathan Freeman, Director of Photography (Emmy winner: Boardwalk Empire; Emmy nominee: Game of Thrones). Moderator: David Geffner, Executive Editor of ICG Magazine.
     Tuesday, April 25, 10:30-11:30 a.m., North Hall, Room 251
     Next Generation Image Making – Taking Content Creation to New Places. Light field, volumetric capture, computational photography, real-time rendering, and generative imaging have the potential to transform image making for all forms of content. Already, they are contributing to the blurring lines between live action and computer-generated imaging, and between what takes place in preproduction, production and post. What is the science behind these new technologies and how do they work? What are their current limitations and promise? Glimpse what NAB's Central and South Hall could look like in five to ten years. Panelists: Andrew Shulkind, Director of Photography (clients include Paramount, DreamWorks, Sony Pictures, Apple, Adidas, AT&T, Budweiser, Google, Old Spice and Samsung); Jon Karafin, CEO, Light Field Lab, Inc.; Gavin Miller, Head of Adobe Research; Steve Sullivan, General Manager, Holographic Imaging, Microsoft. Moderator: Michael Chambliss, technologist and business representative for the ICG, IATSE Local 600.
  10. Irvine, CA - January 20, 2017. RED, Atomos, G-Technology and Teradek will host the Digital Workflow House from Jan. 22-23 during the Sundance and Slamdance film festivals in Park City, Utah. The Digital Workflow House is a two-day event featuring exclusive discussions with creative and technical industry leaders, hands-on interaction with innovative technology, and networking opportunities for attendees. The Digital Workflow House will be open 10 a.m. to 4 p.m. on Jan. 22, and 10 a.m. to 7 p.m. on Jan. 23 at the Treasure Mountain Inn (255 Main Street). Registration is suggested, as the panels are expected to be popular and space is limited. Filmmakers interested in coming by the Digital Workflow House can RSVP here. A schedule of events includes:
      JANUARY 22
      10-11 AM: FIELD OF VIEW (panel)
      11:30 AM - 12:30 PM: WANDERING THE SPHERE: TELLING STORIES WITH 6 DEGREES OF FREEDOM (panel)
      11:30 AM - 1 PM: DEMO OPEN HOUSE
      1:30-2:30 PM: WOMEN IN FILM (panel)
      3-4 PM: DEMO OPEN HOUSE
      JANUARY 23
      10-11 AM: THE AIMLESS GAZE: THE ROLE OF CINEMATOGRAPHY IN SPHERICAL VIDEO (panel)
      11:30 AM - 1 PM: DEMO OPEN HOUSE
      1:30-2:30 PM: THE ECONOMICS OF WORKFLOW (panel)
      3-4 PM: DEMO OPEN HOUSE
      5-7 PM: LOUNGE AT THE DIGITAL WORKFLOW HOUSE (networking event)
  11. First post on this forum. I must say, having never actually worked with film before, I'm glad I found this website. Very informative. DISCLAIMER: Never having worked with film, my knowledge comes entirely from the internet, no hands-on experience. So if I say anything that sounds absurd, just let me know. Having always used digital cameras, I want to shoot a movie on Super 16mm film and have it finished on 35mm film. I've been trying to work out the workflow to get from the exposed negatives to that final print. I want to have it photochemically color timed, preferably without ever having a digital intermediate. Now, if I were shooting on 35mm, I would simply color time, make the internegative, and make copies from there, all analog. But with Super 16, there is the sticky problem of having to blow it up to 35mm, and from what I've read, there are many ways to go about this. There are several discussions already on this forum, but most of them are over 10 years old, and the technology seems to have changed rather significantly since then. I have an idea for some possible workflows, but I don't know if they would actually work the way I want them to or not. One of them is to edit the 16mm film together, have it color timed, then optically blown up to 35mm, but I'm not sure if the colors would translate well (I've read conflicting statements, but some say that an optical printer can't reliably transmit the colors, meaning it might have to be retimed). If that were the case, I could have it edited, optically blown up to 35mm, then color timed, but that adds the cost of working with more 35mm in the process. For another option, and I wouldn't really mind this as long as I didn't have to digitally alter the colors, I could edit the 16mm film, color time it, then data scan it at 4K (not that much more expensive than 2K), then downscale it to 2K (or not, if printing 4K weren't much more expensive, but I don't know) and have it printed to 35mm film. The problem with that, however, is that after it was scanned, I don't know if you would have to digitally alter the colors, or if the direct scan can be printed back without any processing. If it were the case that I would have to mess with the colors digitally anyway, then another thought was that I could edit the 16mm film, scan it without color timing, print it back to 35mm film, then photochemically color time that copy. But that might be absurd. I don't really know, but I feel like after it was scanned, then printed, there might be some information taken from the film that makes photochemical color timing less effective, since you're just working with what a digital printer put on it, not the original analog goodness. That's a bit of a book, so I'll summarize my specific questions:
      1. Assuming both processes were done properly, which would be less expensive: an optical blow-up, or scanning and then printing back to film? (I have no reference for the cost of digital printing or optical blow-up. As far as I know, in this day and age, one could be far cheaper than the other.)
      2. Assuming both processes were done properly, which do you think would give the best results? (Knowing that I want a photochemical timing done.)
      3. Will an optical printer transfer the timed colors properly, or would it have to be timed again?
      4. Will a scan of an already color-timed print properly transfer the colors when printed back to film? Or would the colors still have to be digitally altered before printing?
      5. Can an untimed scan that has been printed back to film still be photochemically color timed, or is that absurd?
      6. Sort of related to the first question, but is there any reference as to how much printing 2K and 4K digital to film would cost? I can't find any information on the cost like you can with scanning. Also, specific costs of the optical blow-up.
      I feel like some people are going to ask, "Why not just use a digital intermediate, instead of photochemically color timing? It would give you much better results, and be cheaper," and they're probably right, but it's just a hands-on artistic thing. I'm relatively young, and grew up in a world that is entirely digital. Watching actual 35mm films at a theater is like a distant childhood memory, as most theaters have long been digital. And making movies with digital cameras is all I have ever done, but quite frankly, I'm at a point where I would like to create movies the same way my favorite movies from decades past were created, even if it slightly compromises visual clarity.
  12. Hi, I stopped shooting 4-5 years ago due to a sports injury, which led me to working in another field, but now I want to start shooting again and I'm feeling a bit behind on the tech and workflow, and rusty in my skills. I'm lucky enough to be visiting LA early next year and I'm looking at the ASC Masterclass. I'm concerned it's not really hands-on? Is it? Isn't it? I see they're using the Alexa and going through the most current workflows etc., but is it mostly learning through demonstration, or are we able to get our hands on the tech and lights and set up some shots? Anyone who's done it got any reviews? What did you get out of it? Can anyone recommend an intermediate-to-advanced cinematography workshop that's very hands-on in or around LA? I don't often get over to the US, so I'd really like to grab the opportunity. Thanks.
  13. I'm shooting a music video mostly in 60p with lip sync, and want to be able to timecode-slate it so, for example, I can shoot a 15-second segment midway through the song and know exactly where to slot it into my edit. The tricky part, of course, is that my camera's frame rate is 60p while the project frame rate will be 24p. I won't have access to a proper professional slate with timecode (not that I'm sure it would help?), so I was planning on holding up a laptop running timecode from the song to use as a slate. The issue is, if I'm playing back the song on set at 250% with timecode so that it matches the 60p being recorded on my A7S, that timecode obviously won't match the A7S footage when everything is slowed down to 24p. I'm wondering if my solution (though definitely not frame-accurate) is as follows:
      - Export a video file of the song at 24p with a timecode overlay.
      - Re-import that video into my editor and increase the speed to 250% so the timecode is sped up as well.
      - Play back that sped-up video with sped-up audio/timecode on set.
      - Slow everything down to 40% speed for editing and have a rough idea of where footage should fall in the timeline.
      Any advice, or better methods of doing this? Note: I am using outdated editing software (FCP7) if that makes a difference to this workflow.
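A minimal worked example of the speed arithmetic described above, using only the frame rates from the post (60p recording, 24p timeline); the example slate time is illustrative, not from the post.

```python
# Sketch of the speed math behind slating a 60p music video conformed to 24p.
# Assumptions for illustration only: 60 fps recording, 24 fps timeline, and the
# plan of playing a 250%-speed song video (with its original timecode overlay)
# as the slate.
RECORD_FPS = 60.0
PROJECT_FPS = 24.0

playback_speed_on_set = RECORD_FPS / PROJECT_FPS   # 60/24 = 2.5 -> 250% playback
conform_speed_in_edit = PROJECT_FPS / RECORD_FPS   # 24/60 = 0.4 -> 40% speed

def slate_reading_after(real_seconds_on_set: float) -> float:
    """Song timecode (in seconds) shown on the sped-up slate after a given
    amount of real time on set."""
    return real_seconds_on_set * playback_speed_on_set

# 90 real seconds into the on-set playback, the slate shows song time 225 s (3m45s).
# Because the conformed 24p footage plays lip sync at normal song speed again,
# that clip belongs at 3m45s on the 24p timeline: slate timecode == timeline position.
real_elapsed = 90.0
song_tc = slate_reading_after(real_elapsed)
print(f"{real_elapsed:.0f}s of on-set playback -> slate/timeline position "
      f"{song_tc:.0f}s ({int(song_tc // 60)}m{int(song_tc % 60):02d}s)")
```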
  14. Award-Winning Technology Built by DigitalFilm Tree Experts. NAB Booth: SL9016 (AWS). LOS ANGELES - Critique 4, developed by the people who brought you DigitalFilm Tree, will be unveiled at the 2016 NAB Show in Las Vegas, April 18-21. This new version of the popular cloud-collaboration software will be showcased at the Amazon Web Services (AWS) booth (SL9016). Critique 4 is already utilized by award-winning productions such as Modern Family, The Simpsons and NCIS: Los Angeles, among others. "Critique has undergone a significant upgrade and offers many new features and security controls to take clients to the next level," says Critique CTO Chris Chen. "It is also the first time we have deployed the platform on AWS." Technologically advanced, Critique 4 is a secure digital media asset management (MAM) platform with extensive features to support creative processes and production workflow for the media and entertainment space as well as the enterprise. Built to be extremely easy to use, Critique facilitates collaboration through real-time chat, live annotations, and secure sharing over the internet to deliver productions on time and on budget. Real-time chat and drawing annotations are viewable across the Web and iOS - they even work with the new Apple Pencil for iPad Pro. Designed to improve workflow, the software facilitates every step from protected dailies screening to VFX workflows to post-production to distribution while capitalizing on enterprise-level security to protect valuable assets. Critique 4 was born of the minds of its executive team: Guillaume Aubuchon, a veteran in the production space having worked on such projects as Her, NCIS: LA and Angie Tribeca, and Chris Chen, a heavyweight in the production streaming space and the former CTO of DAX. With its ability to leverage DigitalFilm Tree's post-production facility, Critique is built to ensure it works in real-world media environments. Critique's new relationship with AWS is key to version 4. "AWS is not only the largest cloud provider, but they are the cloud provider of choice in the M&E space," says Aubuchon, CIO of Critique. "Our infrastructure shift to AWS afforded us the ability to architect the software to leverage the range of services in the AWS cloud platform. It allowed us to build Critique 4 from scratch in a matter of mere months." One of the exciting new features of Critique 4 is its ability to index Amazon Simple Storage Service (Amazon S3) to allow companies to manage their own content inside of Critique's award-winning interface. It also offers high-performance cloud MAM for simultaneous video and document management: Users can collaborate with Critique's review, approval and annotation workflows not only for video but also for production documents including scripts, graphics and still images. "Digital Rights Management (DRM) protection is rarely used, if at all, for unreleased content, which is arguably where it is needed the most," notes Chen. "Critique was designed to leverage DRM invisibly throughout its video distribution system on desktop, web, and mobile environments. This allows Critique to break through the legacy walled-garden approach, allowing a new level of flexibility in collaboration while maintaining security. But we do it in such a way where the users don't even know it's there." The ability to share assets in this way expands its mobility, and Critique is available via web, phones, tablets and Apple TV. The video service is backed by a true CDN running multi-bit-rate video to prevent glitches on any platform. "Users can take advantage of Critique anywhere - in their office, living room, the subway, or even a plane," explains Chen. "And it will be true to the original media." "One of the challenges of working offline is that people don't want to wait for content to download on their device before they leave," notes Chen. "They need to be able to grab and go. Critique's new Sync feature is a push-based system that downloads selected content to your device while you are in the office or at home, so it is ready to go when you are. For executives, there is an admin mode which permits assistants to specify content to push to an executive's device." Other highlights of Critique 4 include:
      • Storage, archiving and management of Raw material
      • Automatic transcoding of Raw material into a proxy format for viewing
      • Granular permissions on files, folders, and projects
      • Easy-to-manage sharing functions for users outside the system, with the ability to time-limit and revoke/extend individual permissions
      • Customizable watermarking on videos
      While Critique was born in the creative and operations side of the media and entertainment market, it is extending to enterprise, small to medium-size businesses, publishing, education and government/military sectors. "The diligence and demands of working with studios, and our ability to succeed in that high-demand, high-output environment, have road-tested our video and document management and our enterprise-level security, making Critique a perfect fit for new tiers of users," says Chen. This latest version of Critique is available now for a free 30-day trial (AWS usage fees apply). Pricing is extremely competitive, with 10, 20, 50 and 100 user levels starting as low as $39 per user. Enterprise-level contracts are available for larger projects and companies with multiple projects. The fee includes unlimited streaming of current content and 24/7 white-glove tech support. Apple TV, Apple iPad and iPhone apps are also included. For a nominal fee, users can add DRM, high-resolution cloud transcode and storage for camera raw and mezzanine files. For more information on Critique, visit www.critiquecloud.com, or follow us on Twitter.
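The S3 indexing mentioned above is a product feature, not published code; as a loose illustration of what indexing a bucket can mean in practice, here is a minimal boto3 sketch that walks a bucket and records key, size and timestamp. The bucket name is a placeholder and nothing here is tied to Critique itself.

```python
# Loose illustration of indexing an S3 bucket's contents into a local catalog.
# Assumes AWS credentials are configured and boto3 is installed; the bucket
# name is a placeholder.
import boto3

BUCKET = "example-dailies-bucket"   # hypothetical

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

catalog = []
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        catalog.append({
            "key": obj["Key"],
            "size_bytes": obj["Size"],
            "last_modified": obj["LastModified"].isoformat(),
        })

print(f"indexed {len(catalog)} objects from s3://{BUCKET}")
```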
  15. I was wondering if anyone had any advice on ARRIRAW workflows. I am getting ready to shoot a feature with the D-21, recording ARRIRAW externally to an S.Two OB-1. I am most likely looking at using DPX sequences for editing, but I was wondering if anyone had some advice in this area. Thanks in advance.
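One generic option for making DPX sequences friendlier to cut with (not something from the post, and not specific to the OB-1) is to wrap them into ProRes proxies with FFmpeg. A minimal sketch, assuming FFmpeg is installed and the frames follow a zero-padded naming pattern; the pattern, frame rate and profile are placeholders.

```python
# Hedged sketch: wrap a DPX frame sequence into a ProRes proxy for editorial.
# Assumes ffmpeg is on the PATH and frames are named like shot_000001.dpx;
# the frame rate, pattern and ProRes profile are illustrative placeholders.
import pathlib
import subprocess

SEQUENCE_PATTERN = "scans/shot_%06d.dpx"   # hypothetical frame naming
OUTPUT_DIR = pathlib.Path("proxies")
OUTPUT = OUTPUT_DIR / "shot_proxy.mov"
FRAME_RATE = "24"

OUTPUT_DIR.mkdir(exist_ok=True)
subprocess.run([
    "ffmpeg",
    "-framerate", FRAME_RATE,
    "-start_number", "1",
    "-i", SEQUENCE_PATTERN,
    "-c:v", "prores_ks",          # ProRes encoder in FFmpeg
    "-profile:v", "2",            # 0..4 = Proxy/LT/Standard/HQ/4444
    "-pix_fmt", "yuv422p10le",
    str(OUTPUT),
], check=True)
print("wrote", OUTPUT)
```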
  16. Hello all, I am in my final year of studying film production and I am currently doing research on cinematography and color grading. I am trying to find out more about the collaboration/workflow between the DP and the colorist and the issues regarding authority over the look. I am also looking into whether new, aspiring cinematographers should be their own colorists. I was wondering if you could please share your opinions about this from your experiences so far. As a cinematographer, how were your collaborations with colorists? (And vice versa.) What can DPs/colorists do to ensure they can work together successfully? If you are working both as a cinematographer and a colorist, why did you choose to have both roles, and how difficult was the transition from cinematography to color grading? Your help would be much appreciated!
  17. Hello all, For the current film I'm working on, I had to go through an odd process to do what I needed to it. First, I imported the footage into Premiere to trim it down and piece it all together. Then I opened After Effects and imported the Premiere project into AE to do some edits that I could only do in that program. Then, after finishing those edits, I opened Premiere back up and imported the AE timeline with my edits into a Premiere timeline. Basically, I did all this without ever having to export from either program. Just some specs: I use Windows 7 with 8 GB of RAM, and the video files I used were ProRes HQ. Everything turned out fine, but there is one inconvenience: when I play the final video in Premiere, it has extremely choppy playback (even at 1/4 quality). I did do some pretty heavy edits to it in After Effects, but my question is: is this the best way to go about doing something like this? Or should I have exported an H.264 file from AE and then imported it into Premiere? Thanks a lot! John
  18. Hello everyone, I have researched online and can't come to any solid conclusions about how sound design and/or mixing is done when you're working optically. For instance: I'm making a movie and seriously considering taking it the full photochemical route, ie. shooting on 35mm scope, processing the film at Fotokem, getting a work print made (no DI), cutting the film on a flatbed, conforming the negative, timing the answer print, and striking a release print. The idea is to keep my movie completely off a computer. But the thing I can't seem to wrap my brain around is the audio part of the process. How do I sync the separately and digitally recorded dialog to my work print? How do I mix in the music I want? Most importantly, the sound effects? I tend to have substantial sound design in my films, sometimes 150 tracks or more, and spend around 80% of my post production process on sound. Is there a way to do this optically? Should I just go with a DI? I hear a lot of terms like sepmag, and 35 sound mag, but I'm not really sure what they are or how you edit with them. Thanks in advance for your help. Colin
  19. So I'm on a project that is based around multirotor helicopters. Needless to say, I have a lot of GoPro footage coming my way. I was wondering if anyone had any advice or leads to good resources on matching GoPro footage and cinema log footage in the grade; it would be much appreciated! Here is what I've been capable of so far. Ignore the last 18 minutes; it's black video from an encoding error. http://youtu.be/Isge70aoLtE
  20. It looks absolutely awesome... but it's a bit pricey. I'm wanting to add a few extra ports to how I'm working as a data wrangler (I'm primarily an editor) and this Echo 15 looks like it may be twisting my arm. However, the price tag had me searching for other solutions. I found one other dock that had USB 3.0, eSATA, Thunderbolt & FireWire. That was the Akitio Thunder Dock (with a couple fewer USB 3.0 ports), and there's not much in the price difference. If I were to choose from these two, the winner would be the Sonnet for sure. I thought about it and tried searching a little more... keeping in mind that I still own an Echo ExpressCard Pro, which connects via Thunderbolt. I use this mainly with an eSATA adapter. I tried to find a cheaper solution (we're all about saving money and we know it!) and I found this... the Elgato Thunderbolt Dock. It has 2 Thunderbolt ports, 3 USB 3.0, HDMI, Ethernet, and it's less than half of the price. I know it doesn't have eSATA but... I have a question (at last, I got to the point!): If I connect my Echo ExpressCard into the spare Thunderbolt port of the Elgato (thus giving me eSATA connectivity), will transfer speeds be massively different compared to the Echo 15? Or does adding that ExpressCard screw things up a little? Overall, my suggested solution won't be a tidy little box like the Echo 15 is, but it should be a cheaper alternative, right? Am I being naive, or should I just shut up, bite the bullet and get an Echo 15? Any help or insight would be great. Help a newbie out... Thanks guys. Links:
      SONNET http://www.sonnettech.com/product/echo15prothunderboltdock.html
      AKITIO http://www.amazon.co.uk/Akitio-Thunderbolt-Computer-Docking-Multilink/dp/B00INMUTUS/ref=sr_1_2?ie=UTF8&qid=1409246154&sr=8-2&keywords=Akitio+Thunder+Dock
      ELGATO http://www.amazon.co.uk/Elgato-Thunderbolt-Dock-Apple-MacBook/dp/B00HWR3XGW
      ECHO EXPRESSCARD http://www.amazon.co.uk/Sonnet-Echo-ExpressCard-Thunderbolt-Adapter/dp/B0080MQJJ6/ref=sr_1_1?s=computers&ie=UTF8&qid=1409249810&sr=1-1&keywords=echo+expresscard
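A back-of-the-envelope comparison of the link speeds involved, under the assumption that both docks are first-generation Thunderbolt (10 Gbit/s per channel) and that the ExpressCard eSATA adapter runs at SATA II (3 Gbit/s); check the actual product specs, as these figures are assumptions. The point it illustrates is that a single eSATA drive, not the extra Thunderbolt hop, is usually the bottleneck.

```python
# Back-of-the-envelope bandwidth comparison for the dock question above.
# All figures are nominal link rates and assumptions, not measured results.
GBIT = 1_000_000_000 / 8          # bytes per gigabit

links = {
    "Thunderbolt (gen 1) channel (assumed)":        10.0,  # Gbit/s
    "USB 3.0":                                       5.0,  # Gbit/s
    "eSATA via ExpressCard adapter (SATA II, assumed)": 3.0,  # Gbit/s
    "FireWire 800":                                  0.8,  # Gbit/s
}

for name, gbps in links.items():
    print(f"{name:50s} ~{gbps:4.1f} Gbit/s  (~{gbps * GBIT / 1e6:5.0f} MB/s raw)")

# A single spinning eSATA drive usually delivers well under 3 Gbit/s, so routing
# the ExpressCard adapter through a dock's spare Thunderbolt port should not be
# the limiting factor; the drive and the eSATA link are.
```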
  21. International Cinematographers Guild to Field Workflow Panel at Cine Gear. Los Angeles, June 3, 2014… The International Cinematographers Guild (ICG, IATSE Local 600) is producing a panel discussion on “New Workflow Choices and the Director of Photography” at the 2014 Cine Gear Expo in the Sherry Lansing Theatre on the Paramount Pictures lot this Saturday, June 7, from 10:15 to 11:30. The panelists are Paul Cameron, ASC, Director of Photography (Total Recall, Dead Man Down); Mark Doering-Powell, Director of Photography (ABC’s Super Fun Night; CW’s Everybody Hates Chris); Steven Poster, ASC, Director of Photography and National President, ICG (Donnie Darko; Someone to Watch Over Me); Andrew Turman, Director of Photography (Target, Lexus); Michele deLorimier, Digital Imaging Technician (DIT) & Phantom Tech (Cover Girl’s Pink; Cream Reunion at the Royal Albert Hall); Bobby Maruvada, Digital Imaging Technician (DI colorist, Project X, After Earth); and E. Gunnar Mortensen, Focus Puller (Amityville Horror: Locked In, Cooties). Digital Imaging Technician Joshua Gollish (Skyfall, Prisoners) will moderate. The panel will discuss how Directors of Photography and their crews encounter an ever-expanding range of approaches to color management and dailies. These new paths profoundly impact the DP’s artistic style and every aspect of their job, including on-set operations, the creative environment, and communication with the director, talent, editorial and postproduction. This group of leading DPs and other camera crew members will examine how their workflow choices on recent, high-profile motion picture and TV projects impacted the filmmaking process, the craft and the product. http://www.cinegearexpo.com/83/ Press Contact: Rick Markovitz, Weissman/Markovitz Communications, rick@publicity4all.com, 818-760-8995
  22. Here is a tutorial video I posted on YouTube on how to get timecode on DSLR clips, plus other post-production workflow tips. Mac-centric! Or you can read my blog post at http://remoteaccesspost.com/adding-timecode-to-dslr-footage/ Hope you find it useful!
  23. Hi guys, After watching the excellent THR cinematographers roundtable, I feel compelled to draw more attention to one of the key issues they discuss: the lack of control of the final image in today's modern world of different screens and projector standards. I've been lucky enough to have a few films I've shot recently coloured by Rob Pizzey and Adam Glasman at Co3 in London, and we've achieved results I've been delighted with in their colouring suite, both in the P3 space on their projector and in Rec709 on their Dolby monitors. The problem, however, is when we output to home deliverables in Rec709 H.264: the image is so drastically different on my home MacBook, iPad and TV that I find when sharing it I'm constantly having to explain and apologise for the final quality. Personally I've found this much less of a problem with DCPs in theatres, but modern displays at home are surely capable of better. It strikes me that Rec709 (created in 1990 for TV) surely can't still be the answer for grading to a standard that will look good across all devices. Of course the issue is in two parts here, but is there a push anywhere to standardise a modern colour space and standard across hardware, one that can reap the benefits of more dynamic range, contrast and better colour representation? I've heard about Rec2020, but as far as I can see this seems to be just for 4K TVs? Would love to hear your thoughts, and hopefully there's an answer out there in the works to ensure a more modern and standardised presentation of the work we do in people's homes and on their devices. Eben
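As a small worked example of how Rec709 relates to Rec2020 on the gamut side (it says nothing about the display-consistency problem raised above): linear-light BT.709 RGB maps into the wider BT.2020 container with a 3x3 matrix, using coefficients as commonly quoted from ITU-R BT.2087 (rounded here).

```python
# Worked example: converting linear-light BT.709 RGB into the BT.2020 container.
# Coefficients are the commonly quoted (rounded) BT.709 -> BT.2020 matrix; this
# only changes primaries, not transfer functions or display behavior.
import numpy as np

M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear):
    """Map a linear-light BT.709 RGB triplet into BT.2020 primaries."""
    return M_709_TO_2020 @ np.asarray(rgb_linear, dtype=float)

# Pure Rec.709 red sits well inside the 2020 gamut rather than at its edge.
print(rec709_to_rec2020([1.0, 0.0, 0.0]))   # ~[0.627, 0.069, 0.016]
```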
  24. Hello there, I am a freshman undergrad in film school and recently got the position to work as a DIT and editor on an upcoming short. The short is a collaborative effort of one of our school clubs, where we managed to get together the funds to shoot with an Epic. During the shoot, which is coming up this spring, I am to apprentice with a professional DIT and learn the ropes, eventually taking over once I am comfortable with the procedures and so forth. I am also going to be grading and editing the RAW footage. So prior to when my apprenticeship begins, I would like to have a firm knowledge of what I am getting myself into. So I was curious if anyone had any articles regarding the DIT workflow and the responsibilities of the job. I am also looking for any articles and info on grading RAW footage (which I have experience with in terms of stills) and on the RAW editing workflow. Despite going into this without any experience as a DIT, I am a very adequate editor in Final Cut and Premiere, just to let you know I'm not entirely unqualified for the position. Summary: I am looking for all information, advice, and articles regarding being a DIT on a RED Epic, and grading and editing RAW footage. Thanks!
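The post above doesn't include any procedure, but since checksum-verified offloads come up in virtually every DIT-workflow article, here is a minimal sketch of the verify-after-copy idea in Python. Paths are placeholders; real productions normally use dedicated offload tools rather than a script like this.

```python
# Minimal sketch of a checksum-verified media offload (a core DIT task).
# Paths are placeholders; this only shows the copy-then-verify idea.
import hashlib
import pathlib
import shutil

SOURCE = pathlib.Path("/Volumes/RED_MAG_001")     # hypothetical camera mag
DEST = pathlib.Path("/Volumes/SHUTTLE_A/DAY_01")  # hypothetical backup drive

def sha256_of(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for src in sorted(p for p in SOURCE.rglob("*") if p.is_file()):
    rel = src.relative_to(SOURCE)
    dst = DEST / rel
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)                      # copy with timestamps preserved
    ok = sha256_of(src) == sha256_of(dst)       # verify byte-for-byte via hash
    print(f"{rel}: {'verified' if ok else 'CHECKSUM MISMATCH'}")
```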
  25. Hi there, I am new to the CinemaDNG workflow and I just started working with the Ikonoskop A-Cam dII. Although I have been doing a lot of research on the subject, I would like some advice from someone with experience in the field. My question is the following: for projects that do not need massive color correction, would it be acceptable to throw away the DNG files after transcoding them to ProRes 4444 or ProRes 422 HQ, and use these ProRes files as the master files? What would the disadvantages be in this case, if any? Thanks