
Some guy sold his Ursa Mini Pro to buy an FS7


Samuel Berger


Sure, I wouldn't say they have a good foothold in the drama world. Arri have had that for ages now. The Venice might get a look in, but it could be too little, too late.

 

Actually, I think Sony have a pretty good foothold in the drama world. There are more and more productions where you can't use an Alexa because they want 4K, and Sony have filled that hole very well. Canon cameras are not really regarded as professional equipment, regardless of how good they may look, and there are many people who won't touch RED with a barge pole, despite their improvements. I've shot four movies this year alone on Sony cameras, and they are my first choice after Arri.

 

I think we could probably end this discussion right here (and prevent it from rearing its ugly head again) if we could just convince Tyler that instead of making misleading and erroneous statements about Sony cameras and workflow, he should instead simply state that they are not right for him and the kind of work that he does.


  • Premium Member

But you were referring to REC709-type shooting.

I edit almost everything from the camera originals, which are LOG. It's easy to apply a base correction to an entire sequence, and it's easy to automate LUTs on import. I never work in REC709.
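To give a concrete (if simplified) picture of what "automate LUTs on import" means, here's a rough sketch of the same idea done outside the NLE with ffmpeg, using a hypothetical folder of LOG clips and a hypothetical .cube LUT; inside Premiere or Resolve the equivalent is assigning an input LUT to the clips on import.

```python
# Rough sketch: batch-apply a viewing LUT to LOG camera originals with ffmpeg.
# Folder names and the .cube file are hypothetical; this only illustrates
# automating a base correction rather than grading clip by clip.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("camera_originals")   # hypothetical folder of LOG clips
LUT_FILE = "slog3_to_rec709.cube"       # hypothetical 3D LUT
OUT_DIR = Path("dailies_rec709")
OUT_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mp4")):
    # lut3d applies the .cube LUT to every frame; audio is copied through untouched.
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-vf", f"lut3d={LUT_FILE}",
        "-c:a", "copy",
        str(OUT_DIR / clip.name),
    ], check=True)
```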

 

By the way, this is the very same workflow dozens of top editors are using today, especially on Premiere and FCPX.

 

Once the cut is done, it's one button to get your project into DaVinci with all the camera originals linked. Then you go do your "fine" grading. The great thing is, it's all done on the same bay. This is a super powerful workflow and it's becoming pretty common in the low budget world.
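For what it's worth, that "one button" hop can even be scripted. A minimal sketch, assuming DaVinci Resolve's Python scripting module is installed and an XML has already been exported from the NLE; the project and file names here are made up:

```python
# Sketch of sending a locked cut to Resolve via its scripting API.
# Assumes Resolve is running and its scripting module is on the Python path.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project_manager = resolve.GetProjectManager()
project = project_manager.CreateProject("MyShow_Conform")  # hypothetical project name
media_pool = project.GetMediaPool()

# Import the edited timeline; Resolve relinks to whatever camera originals it can find.
timeline = media_pool.ImportTimelineFromFile("/exports/MyShow_locked_cut.xml")
print("Imported:", timeline.GetName() if timeline else "import failed")
```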

 

The best thing is, no DIT necessary. No transcoding necessary. No "delay" on editing. You can literally start to edit the second the first card is downloaded. Sometimes I'll have a cut of the show done before production has even ended!

 

All these tens of thousands of shoots each year on the FS5/FS7/F5/F55, from the Arctic to the equator, are being done without any problems and with very good results, yet you still struggle with them. Preference is one thing, but to say Sony and XAVC are "crap" because you may lack the experience, and thus the skill, needed to use them as designed is misleading to inexperienced visitors to the site. Sorry to bang the same drum, but pure logic and the weight of numbers are against you.

What you're missing here is that the editors have NO CHOICE. If they want a job, they have to go through the rigmarole of making XAVC work. Cinematographers dictate the codec post-production uses these days, and that's why I'm posting about it here: I personally think that's not the right way to work.

 

The right way to work is shooting with a system that's designed specifically for post production. Nobody gets this... and it drags down post production tremendously. You don't see it because you're already onto the next show, but I see it because I have to deal with the bullshit mess.

 

The worst part is, we have a "triple" system in 2017: camera original, proxy/edit files and separate audio. This harks back to the film days, when we had OCN, video transfers and separate audio. When will people wake up and realize that digital is NOT film? Editors want to work faster, but current workflows prevent it.


"Sometimes I'll have a cut of the show done before production has even ended!".. but is this the normal way.. I don't think Ive ever worked on anything that has final cut edited before production ended !!.. sometimes an editor is doing a rough cut along the way..but its only rough and then the "real" edit of course will be done later..

 

I've always been told by editors about the advantages of separate sound, ISO tracks etc., and of proxy files. Would editors really prefer to work with native RAW/4K files? Surely this is not usual practice.

 

Each to their own, but I'm really not convinced your workflow is typical, or ever will be. Proxies and separate audio files are very common. If it's working for your type of production, all power to you, but to then say XAVC and Sony cameras are "crap" and fundamentally incapable of originating decent images is nonsense, when there is so much evidence to the contrary.


  • Premium Member

"Sometimes I'll have a cut of the show done before production has even ended!".. but is this the normal way..

"A CUT" refers to something that's watchable, not a "final".

 

This is very common practice. In fact, David Fincher and James Cameron pioneered these techniques in the '90s, at the start of non-linear editing. They literally edit as the movie is being shot, and the moment the last dailies are in, there's a cut. I've worked on shows where we had a watchable cut just a few days after production wrapped.

 

I do have three edit bays, but I don't have any assistant editors; the jobs I work on can't afford them. So I generally have to do all the prep work. On $500K+ shows they can usually afford an AE for a week or two; anything under $500K, they usually can't. I've actually done quite a bit of AE work when I needed the hours and they needed someone to go through things and prep shows properly, so I've been on the ground floor of a bunch of pretty big shows. Unfortunately, I like working from home (making up my own hours) too much, so I've never tried to get an online chair somewhere; I couldn't deal with it.

 

I've always been told by editors about the advantages of separate sound, ISO tracks etc., and of proxy files.

Yes, but nothing stops you from running AES directly into the camera and capturing 12 channels right onto the XAVC file. This is the workflow Blackmagic supports on the URSA Mini Pro, and it's perfect for people who actually want to get work done instead of syncing audio for a month.

 

If you knew how horrible the audio syncing process is, even with automated solutions like PluralEyes, you'd understand.

 

This current show I'm working on has 4,831 clips across A, B and C cameras. That means I first have to log every single clip. Then I have to put the A, B and C camera clips into a bin with the audio. Then, if the timecode is perfect, and I mean down-to-the-millisecond perfect, I can push one button and they will sync. After that, I have to create subclips of each A, B and C camera clip and put them in their respective bins. It takes between 2 and 5 minutes per A/B/C clip group.
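For anyone wondering what that timecode matching boils down to, here's a rough sketch of the idea: read each clip's start timecode and pair it with the closest-starting audio file. It assumes the recorder writes a timecode tag that ffprobe can read (broadcast WAVs often expose a time_reference instead, which would need converting), and the paths and frame rate are hypothetical. Real syncing also has to cope with drift, missing timecode and overlapping takes, which is exactly why it eats so much time.

```python
# Rough sketch of timecode-based matching between camera clips and sound files.
import subprocess
from pathlib import Path

FPS = 24  # hypothetical project frame rate

def start_timecode(path):
    """Ask ffprobe for a 'timecode' tag on the container or any stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format_tags=timecode:stream_tags=timecode",
         "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
        capture_output=True, text=True).stdout.strip().splitlines()
    return out[0] if out else None

def tc_to_frames(tc):
    h, m, s, f = (int(x) for x in tc.replace(";", ":").split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

video = {p: start_timecode(p) for p in Path("a_cam").glob("*.mov")}   # hypothetical dirs
audio = {p: start_timecode(p) for p in Path("sound").glob("*.wav")}

for clip, tc in sorted(video.items()):
    if not tc:
        continue
    # The audio file whose start timecode is nearest is the sync candidate.
    best = min((a for a, atc in audio.items() if atc),
               key=lambda a: abs(tc_to_frames(audio[a]) - tc_to_frames(tc)),
               default=None)
    print(clip.name, "->", best.name if best else "no audio match")
```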

 

The other way to work is to attach audio directly in camera and NOT WORK WITH PROXY FILES! The 1st AC labels the clips in camera to match the script notes, so there is NO labeling necessary in post. The 12 channels of audio are recorded directly onto the camera files as well. So when the editor gets the files from the cards, the clips have everything the editor needs to start editing. The only thing left is to create bins for the individual scenes, and even that can be automated if you change cards for every scene... which is something I've done before.
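A sketch of what that "automated bins" step could look like when each card holds one scene. The card-folder naming (e.g. "CARD_012_SC45") and the paths are hypothetical; the point is just that if the scene number rides along with the media, the sorting comes for free.

```python
# Sketch: copy card offloads into per-scene bins based on a scene number
# embedded in the card folder name (hypothetical convention "..._SC45").
import re
import shutil
from pathlib import Path

CARDS = Path("/media/offload")        # hypothetical offload location
EDIT = Path("/edit/MyShow/scenes")    # hypothetical edit drive

for card in sorted(CARDS.iterdir()):
    match = re.search(r"SC(\d+)", card.name)
    if not (card.is_dir() and match):
        continue
    bin_dir = EDIT / f"Scene_{int(match.group(1)):03d}"
    bin_dir.mkdir(parents=True, exist_ok=True)
    for clip in card.rglob("*.mov"):
        shutil.copy2(clip, bin_dir / clip.name)   # copy2 keeps timestamps for later checks
```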

 

So it took me 4 days to make the proxy files, 3 days to sync the audio and 2 days to relabel everything into bins. That's 9 days I could have been editing. Also... I don't get paid for prep work. Nobody does in the low budget world. We get paid to edit, so if I'm not turning in scenes the producers can watch, I'm not getting paid. Outside of union shows, this is pretty typical. Prep is a nightmare and most people get paid little to nothing.

 

Would editors really prefer to work with native RAW/4K files? Surely this is not usual practice.

It's not the usual practice because most editors are old-timey and don't know better. There have been several high-profile features done using the methods I'm describing, and the results have been really good. There's a whole documentary about it on YouTube from the guys who made "Focus".

 

Each to their own, but I'm really not convinced your workflow is typical, or ever will be. Proxies and separate audio files are very common. If it's working for your type of production, all power to you, but to then say XAVC and Sony cameras are "crap" and fundamentally incapable of originating decent images is nonsense, when there is so much evidence to the contrary.

I never said it was typical... I just said that's the direction the industry is moving, and it's moving there fast. I predict that in the next five years, proxy files for 4K shoots will be a thing of the past. We will be editing with the raw material more and more as storage becomes cheaper and the software becomes more capable.

 

My "dislike" for Sony is only exacerbated by their proprietary antique codec's, which were great in 2005, but in 2017 are long in the tooth. At least Pro Res was designed properly... DNXHR maybe the ultimae/perfect futureproof codec as it's based on JPEG2000 which means it's very robust and easy to playback on GPU's.


This current show I'm working on has 4,831 clips across A, B and C cameras. That means I first have to log every single clip. Then I have to put the A, B and C camera clips into a bin with the audio. Then, if the timecode is perfect, and I mean down-to-the-millisecond perfect, I can push one button and they will sync. After that, I have to create subclips of each A, B and C camera clip and put them in their respective bins. It takes between 2 and 5 minutes per A/B/C clip group.

It's a little weird to be moaning about logging every clip. That's something that has to be done regardless of the camera or codec used. And timecode is not designed to be millisecond perfect, only frame accurate.

 

 

 

The other way to work is to attach audio directly in camera and NOT WORK WITH PROXY FILES! The 1st AC labels the clips in camera to match the script notes, so there is NO labeling necessary in post. The 12 channels of audio are recorded directly onto the camera files as well. So when the editor gets the files from the cards, the clips have everything the editor needs to start editing. The only thing left is to create bins for the individual scenes, and even that can be automated if you change cards for every scene... which is something I've done before.

It's not the 1st AC's job to label clips in camera, even if there were a way to match the script notes, which as far as I know there isn't. That's what the clapperboard is for. And the reason we shoot dual-system sound is so that the camera doesn't have to be tethered to the sound mixer, because that's a huge pain in the ass for both the camera and sound departments.

 

 

 

So it took me 4 days to make the proxy files, 3 days to sync the audio and 2 days to relabel everything into bins.

Which is why you have a DIT who does it on the day, and provides editorial with everything they need to get going right away.


It may well be that 4K can be dealt with directly, but RAW has to be transcoded regardless. As Stuart says, I think it's penny wise and pound foolish not to have a DIT on set in your case; surely this would save a lot of time in post. And how is the AC labeling clips in camera, in the camera menu?

 

Never heard of anyone having problems syncing up separate audio. You want it all on the card as you shoot?

 

How has it ever happened that a week's prep work for post is not considered work and you don't get paid? Frankly, I wouldn't work on those projects. Whatever about post workflow, if not being paid means you want all the audio on the cards and everything to be "easy", I would suggest not working on those projects in the first place. Not being paid is definitely a post workflow problem... :)

 

 

But regardless of any of the above, I still don't see how Sony cameras or the XAVC codec are somehow a fundamental barrier to shooting or editing, when so many others are doing just that without throwing themselves off tall buildings.


  • Premium Member

It's a little weird to be moaning about logging every clip. That's something that has to be done regardless of the camera or codec used. And timecode is not designed to be millisecond perfect, only frame accurate.

It's a gripe of mine... there is zero reason why clips have to be labeled in post. Cameras are fully capable of labeling properly in-camera; it's just that nobody has made an interface that works. I believe someone has a solution, but I've never been able to test it.

 

And the reason we shoot dual system sound is so that the camera doesn't have to be tethered to the sound mixer, because that's a huge pain in ass for both camera and sound departments.

Just for the record, AES audio is carried on 3G HD-SDI, so it can be sent wirelessly to the camera with no problems.

 

Which is why you have a DIT who does it on the day, and provides editorial with everything they need to get going right away.

Again, when you've got the money, you can afford two or three DITs to deal with these things. As I've been learning, one DIT, even if they're really good, struggles to deal with the amount of footage we throw at them. It seems to me you need one DIT per camera. Even then, they aren't syncing the audio on set; that never happens.


  • Premium Member

I think it's penny wise and pound foolish not to have a DIT on set in your case; surely this would save a lot of time in post. And how is the AC labeling clips in camera, in the camera menu?

DITs are only a requirement because we make them a requirement. Yes, there needs to be a "loader" on set, because someone needs to download cards. But our cameras are fully capable of labeling, and editors are perfectly capable of using the camera originals.

 

Again, there's no reason editors can't have the RAW files right on their bay, because Premiere, DaVinci and even Avid 8 can generate "cache" files on the fly. So when you're playing back you're working with cache files, but when you're paused you can see the full RAW frame. I'm not sure whether this workflow works with Sony cameras, but it does with CinemaDNG and REDCODE, so that's two of the four RAW codecs.

 

Never heard of anyone having problems syncing up separate audio.

I never said "problems", I said IT'S A LOT OF EFFIN' WORK! Again, would you like to sit at a desk for 16 hours a day for 7 days dealing with this poop, or would you like to be editing?

 

There is NO REASON why the audio isn't on the camera files. Separate audio can be married later if necessary, but the editors shouldn't have to deal with it in 2017.

 

FYI, on all of my personal productions the audio is married to the video files. When I do film transfers, I drop off my timecode audio at the transfer house and they sync it for me as well. I HATE syncing audio, mostly because poop happens, clocks drift and people make mistakes. Maybe on a $10M movie things are different, but on our low budget poop-stain productions I've found this to be a real stumbling block, no matter what crews I use.

 

How has it ever happened that a week's prep work for post is not considered work and you don't get paid? Frankly, I wouldn't work on those projects. Whatever about post workflow, if not being paid means you want all the audio on the cards and everything to be "easy", I would suggest not working on those projects in the first place. Not being paid is definitely a post workflow problem... :)

Eh, I've been paid for prep and I've not been paid for prep. It depends on the contracts and what you can negotiate.


it's just that nobody has made an interface that works.

So, as I said, it's not possible. Nor is it something that 1st ACs have time to do. They are generally too busy running the camera department to have spare time to do the editor's job for them.

 

 

 

Just for the record, AES audio is carried on 3G HD-SDI, so it can be sent wirelessly to the camera with no problems.

And wireless connections are totally reliable, and never break down in the middle of a take...

 

 

 

you can afford two or three DITs to deal with these things. As I've been learning, one DIT, even if they're really good, struggles to deal with the amount of footage we throw at them. It seems to me you need one DIT per camera.

I shoot with two or more cameras all the time, and have never had more than one DIT.

 

 

It sounds like you're describing some alternate universe, certainly not any set I've ever been on.


  • Premium Member

So, as I said, it's not possible. Nor is it something that 1st ACs have time to do. They are generally too busy running the camera department to have spare time to do the editor's job for them.

It's possible, I just haven't spent any time testing the interfaces, because crews are so used to working with a 1940s mentality that they generally aren't open to experimenting with new techniques.

 

There is a beautiful iPhone/iPad app for the RED cinema cameras, and there is a similar interface available for the URSA Mini Pro. So you could assign someone in the camera department to that job along with their others.

 

And wireless connections are totally reliable, and never break down in the middle of a take...

So what you're saying is that on a 28-day shoot with hundreds of hours of material, you might get twenty clips that have no audio? I'll take that risk, and because AES audio requires far less bandwidth than 3G video, it doesn't break up. I've used this trick on a few shoots and the audio guys were amazed by how well it worked.

 

It sounds like you're describing some alternate universe, certainly not any set I've ever been on. Perhaps you don't really understand how a professional set works.

I'm describing the problems and the solutions to them.

 

When we shoot on film, we're stuck with antiquated techniques because that's the limitation of the medium. When we shoot digital, there are NO EXCUSES! We the filmmakers have free rein to make up any workflow we choose. The only person stopping you is yourself, because I've been using these "alternate universe" techniques for quite some time, apart from the clip naming one.


" Also... I don't get paid for prep work. Nobody does in the low budget world. We get paid to edit, so if I'm not turning in scenes the producers can watch, I'm not getting paid. Outside of union shows, this is pretty typical."

 

This is what you said, right? Or am I going mad? It's what came up on my computer...

 

Why has Sound Devices become so dominant in just about all markets? Nearly all docs are shot this way now, and anything with more than one radio mic running. There has to be a reason separate audio is so popular, no? If it were so troublesome, why is 99% of the industry doing it? I think your workflow is not the norm, but for sure you should use what works for you; you're very much in the minority, it would seem. And how are you recording 12 ISO audio tracks to a camera card? By wireless??? Really? That's pretty risky if that's your only audio.

 

A 28-day shoot with 20 clips with NO audio!! Yes, that's a big problem. Are you kidding? What if those are your main interviews, let alone drama, where every scene is important? I can just imagine the phone call to the producer: yes, all went well, we had about 20 shots with no audio, but over 28 days that's not bad, only a couple of major scenes, and we saved money by not having a DIT, and the post guys were happy because they had less work to do. Mind you, if they aren't paying you, they get what they are not paying for.


  • Premium Member

On the shows where I've been the DIT, the audio department has either set up a wireless audio send to the camera themselves, or I have requested one on the first shooting day and it has been used continuously for the rest of the shoot. This is only a preview stereo mix from the audio recorder, meant to simplify and speed up the dailies workflow; the audio was normally synced again from the original WAV files before editing. It's just a wireless audio transmitter and receiver, like a Sennheiser or similar. Being able to have dailies without syncing is a huge benefit, but the stereo mix was not used in editing because it would be of no use for making an OMF or the final audio, and wireless has problems every now and then.


  • Premium Member

I have remained quite surprised that, as digital has taken over, NO ONE has built a digital slate into their camera for clip naming and metadata tagging.

It should (visually and ergonomically) be laid out identically to a conventional timecode slate on the assistant's panel, and with the tap of a few buttons, you could match your camera slate to the real thing. Takes would count up automatically, and could be marked as MOS or P/U as needed.

Then the clips coming out of the camera would be labelled "TITLE_SCENE_SHOT_TAKE.filetype", and everyone's life would become much easier.

Hell, these days your 2nd AC should be able to do it all remotely from the camera using their smartphone and wifi if they want to. They've got the real slate in their hand already.

Audio recorders should offer EXACTLY the same thing, so that no one ever has to waste paid hours scrubbing through audio or video clips to find their mate. Software could then pair the audio and video files by filename; even if they still needed to be manually synced, at least they'd be paired up before you got to them.

I'd always thought this was the collective dream behind a drag-and-drop, self-contained codec like ProRes. But nothing's ever been done with it.

Cameras should even be able to file the clips away into folders labelled with their respective scene numbers. It should all be incredibly easy. I think it's a no-brainer with digital cameras. We're capturing FILES on these cameras, so give us some bloody file management.
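And the filename pairing mentioned above really would be trivial if camera and recorder both wrote slate-style names. A throwaway sketch, assuming hypothetical TITLE_SCENE_SHOT_TAKE filenames on both sides:

```python
# Sketch: pair picture and sound files that share a TITLE_SCENE_SHOT_TAKE name.
from pathlib import Path

def slate_key(path):
    # "MYSHOW_045_C_03.mov" -> ("MYSHOW", "045", "C", "03")
    return tuple(path.stem.upper().split("_")[:4])

video = {slate_key(p): p for p in Path("picture").glob("*.mov")}   # hypothetical dirs
audio = {slate_key(p): p for p in Path("sound").glob("*.wav")}

for key, clip in sorted(video.items()):
    wav = audio.get(key)
    print(clip.name, "<->", wav.name if wav else "NO MATCHING AUDIO")
```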


I have remained quite surprised that, as digital has taken over, NO ONE has built a digital slate into their camera for clip naming and metadata tagging.

 

The URSA Mini Pro 4.6K allows you to enter metadata remotely from an iPad app via Bluetooth. And as for the syncing of files, Tentacle Sync plus its software does this perfectly.

 

That being said, I completely agree with you that slating, file management and syncing between audio recorders and our cameras is still a bloody mess.


  • Premium Member

 

The URSA Mini Pro 4.6K allows you to enter metadata remotely from an iPad app via Bluetooth. And as for the syncing of files, Tentacle Sync plus its software does this perfectly.

 

That being said, I completely agree with you that slating, file management and syncing between audio recorders and our cameras is still a bloody mess.

Blackmagic are trying to do something with their metadata, which is good, but their clip-naming conventions are ludicrous and incredibly messy.

 

It does need to be a seamless, simple, integrated process (with the option for people to simply have their clips named CLIP0001-CLIP9999 if they don't want anything fancy).


I'm describing the problems and the solutions to them.

We all agree that this work needs to be done. We just disagree as to whose responsibility it is. Should it be an overworked AC on a high-stress set working 14+ hours a day, or an assistant editor in a nice air-conditioned edit suite?


  • Premium Member

What would be amazing would be a new slate which could read what you wrote on it and send that out digitally to all the cameras and recorders. That way you'd have the digital version plus a visual and audible backup. Of course, such a thing would require a new standard to be made up, lol.


  • Premium Member

It should not be that difficult to make a new version of an iPad slate app which would just send the same data to the metadata fields of the camera files, for example on the URSA Mini.

 

The main problem with metadata is that the camera department generally doesn't have much time to edit it and add descriptions etc., because they need to concentrate on the current and next shots.

The task must also be possible to do fully remotely, without any need to touch the camera even for a second (editing metadata in camera would stop shooting for a moment every time, so it has to be done remotely by someone who has nothing else to do at that moment).

It would thus be easiest to have an additional person fully concentrating on metadata editing only, someone positioned near the director and script supervisor. Maybe an assistant script supervisor hired for just this metadata task? Or an additional assistant editor hired for it?

 

The audio recordist generally names the audio files with shot and take numbers based on the slate data he or she sees on the monitor, so with current workflows the audio is easy to find without altering any camera metadata, provided the slate is correct, of course.

 

Multi-camera shoots with remote metadata editing on all the cameras for every shot?

I don't know how much this is actually done nowadays. Do others have experience with, for example, two- or three-camera movie shoots where very detailed editing metadata was applied in camera?


  • Premium Member

I'd rather it not be an "app" so much as a whole new slate, one you can write on like a normal slate, but which somehow reads what's written and wirelessly sends that data to the camera(s) and audio recorder. It's a pipe dream of course, but who knows; maybe one day, as cameras get more "wireless" out of the box, someone will make something. I don't think it should be an "app" so much as a piece of actual kit, with easily changeable batteries, and relatively rugged.

That way it's less "metadata editing" than regular slating that also smartly tells the camera(s) and audio recorder what the shot is called, so that in post everything has the same names.


  • Premium Member

A 28-day shoot with 20 clips with NO audio!!

I was responding to the comment that wireless transmissions can fail on set. So that would mean that if you do end up with 20 clips with no audio on the original video files, you simply go to the audio recorder's files and re-sync those.

 

You always run double system, but the "double" should be a "backup".


  • Premium Member

What would be amazing would be a new slate which could read what you wrote on it and send that out digitally to all the cameras and recorders. That way you'd have the digital version plus a visual and audible backup. Of course, such a thing would require a new standard to be made up, lol.

That would be pretty awesome, and it's 100% doable today. It's just, good luck pulling the Denecke out of the hands of your crew.

 

Timecode slates took a decade to propagate into bigger productions. Heck, when was the last time you heard of anyone using the Aaton or Arri timecode system on film?

 

Nobody adopted these new features because people are so stuck in their assbackwards way of working... which is the reason I brought up this subject to begin with.


  • Premium Member

We all agree that this work needs to be done. We just disagree as to whose responsibility it is. Should it be an overworked AC on a high-stress set working 14+ hours a day, or an assistant editor in a nice air-conditioned edit suite?

Again, I used "ac" as a generalization. I should have said "a person in the camera department".

