
What hardware affects NLE playback?


Max Field


It really depends on the kind of codec used.

 

ProRes is a multi-threaded codec, so it actually uses the CPU more than the GPU.

 

Heavily compressed codecs like JPEG 2000 (Redcode) and TIFF (CinemaDNG/Targa/DPX) are VERY GPU intensive, and the CPU matters a lot less.

 

MPEG is the least efficient and works on the CPU.

 

The GTX 660 is a pretty decent card, but it really depends on the drivers and whether they have the right support for the different codecs. If you're on a PC, that's the biggest problem you'll encounter: codecs and drivers will kill your speed, and finding the right combo to work with different software packages is really challenging. Macs, on the other hand, don't have those issues, because driver support is built in and software/hardware manufacturers know this, so they make products that are very easy to update.

 

ProRes XQ @ 4K 12-bit is pretty heavy, around 1660Mbps. I've done a lot of testing with ProRes 4444 @ 4K and have found that two drives in RAID 0 work fine. When working with raw Red files, you need a faster graphics card more than anything else. It's a bit heavier than ProRes XQ, but not much. Still, I suggest either an 8Gb Fibre direct-attached eight-drive RAID or, if you have a Mac with 10Gb Thunderbolt, a simple eight-drive RAID will work fine as well. Where things get hairy is multi-track playback and dealing with throughput for FX and audio tracks. So you need a lot more overhead than you'd normally build in.
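As a rough sanity check on those numbers, here's a short Python sketch converting the quoted bitrate into required disk throughput. The 2x multi-track headroom factor is an illustrative assumption, not a measured figure:

```python
# Convert a codec bitrate (megabits/s) into required disk throughput
# (megabytes/s). The 1660 Mb/s figure is the ProRes XQ 4K 12-bit rate
# quoted above; the 2x headroom factor is an assumption for illustration.

def required_MBps(bitrate_mbps: float, streams: int = 1, headroom: float = 1.0) -> float:
    """Disk throughput in MB/s needed for the given stream count and headroom."""
    return bitrate_mbps / 8 * streams * headroom

single = required_MBps(1660)                           # one playback stream
multi = required_MBps(1660, streams=2, headroom=2.0)   # multi-track with overhead
print(f"{single:.1f} MB/s single stream, {multi:.1f} MB/s with headroom")
# → 207.5 MB/s single stream, 830.0 MB/s with headroom
```

Even a single stream at ~207 MB/s is beyond one spinning disk, which is why the RAID recommendations above follow.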


Misinformation abounds on this subject. The resources of a GPU can be used on a case by case basis by any piece of software that thinks it can get something out of them, or a part thereof, right down to individual filters or particular codec implementations using the GPU as their developers see fit. There are still dozens of filters in, say, After Effects, which could use the GPU very effectively as a matter of mathematical technique, but don't because they were written before it was commonly done or because their developers can't be bothered, or for other reasons.

 

 

 

Heavily compressed codecs like JPEG 2000 (Redcode) and TIFF (CinemaDNG/Targa/DPX) are VERY GPU intensive

 

There are JPEG2000 decoders that use the GPU, but most don't, and bear in mind Red would like to keep selling DVS Atomix (er, "red rocket") boards to everyone to do it. Conversely, there's been GPU-based decoding of F55 raw stuff in Resolve for a long time.

 

 

 

MPEG is the least efficient and works on the CPU.

 

Depends which MPEG you're talking about. MPEG-1 and MPEG-2 are barely worth accelerating. MPEG-4 (of which part 10 defines the common H.264 format) is widely handled on GPUs. In particular, Nvidia implemented a specific block of circuitry called NVENC in the hardware revision codenamed "Kepler" (implemented in the GeForce 600 series cards) which does H.264 encoding; this is not GPU computing in the more traditional sense, as it doesn't use the stream processors, it uses a task-specific bit of hardware that happens to be part of the GPU. More recent Nvidia products support better encoding and H.265.

 

So, it's complicated. What does and doesn't use the GPU is very task- and implementation-specific, and small setup problems can prevent the GPU from being used. Unless we've been told very specifically that a particular piece of software uses the GPU, it's safe to assume that it doesn't. ProRes encoding in FCPX is reportedly GPU-accelerated under certain circumstances, for instance, but other instances of it will not be. The Gaussian Blur filter in Premiere is GPU-accelerated. The Fast Blur filter isn't. It's complicated.

 

 

 

If you're on a PC... codecs and drivers will kill your speed and finding the right combo to work with different software packages is really challenging.

 

There is only one current driver for any particular graphics card on either platform. While there are a lot of settings that need to be right, it is highly unlikely that any particular piece of software would gain or lose the ability to offload work to the GPU based on driver versioning considerations. Naturally, if a recently implemented feature was required by some particular piece of software, both platforms would need to be updated. These days, the hardware is essentially identical and the software concerns are therefore broadly equivalent.

 

To Max's original point, we don't have enough information to answer the question. No, you don't need to upgrade your graphics card to handle 4K or 4:4:4 footage. Your graphics card probably won't know or care if you're using 4:4:4 footage on an NLE timeline. You also may not need fast disks to handle 4K material. I have a 4K camera at my right elbow right now which records 100Mbps files, producing a data rate that could easily be handled by a single 2.5" laptop disk. If you want to perform advanced grading with uncompressed 12-bit at 48fps in 3D, then you will need massively more.
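A quick back-of-envelope check of that claim in Python. The ~100 MB/s sustained rate for a 2.5" laptop drive is an assumed ballpark, not a measured spec:

```python
# 100 Mb/s camera files versus an ordinary 2.5" laptop drive.
# The disk's ~100 MB/s sustained throughput is an assumed ballpark.

camera_MBps = 100 / 8        # 100 megabits/s -> 12.5 megabytes/s
laptop_disk_MBps = 100       # assumed sustained throughput of the drive

print(camera_MBps)                        # → 12.5
print(laptop_disk_MBps / camera_MBps)     # → 8.0 (times headroom)
```

So the camera's files use roughly an eighth of what even a modest single disk can sustain, which is the point: resolution alone doesn't dictate storage cost, the bitrate does.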

 

What sort of material are you likely to be working with, and in what software, and what do you foresee wanting to do with it?

 

P


Misinformation abounds on this subject. The resources of a GPU can be used on a case by case basis by any piece of software that thinks it can get something out of them, or a part thereof, right down to individual filters or particular codec implementations using the GPU as their developers see fit. There are still dozens of filters in, say, After Effects, which could use the GPU very effectively as a matter of mathematical technique, but don't because they were written before it was commonly done or because their developers can't be bothered, or for other reasons.

And AE is very OpenGL friendly, not CUDA. So depending on the type of graphics card, you will actually get a different feature set from AE.

 

There are JPEG2000 decoders that use the GPU, but most don't, and bear in mind Red would like to keep selling DVS Atomix (er, "red rocket") boards to everyone to do it. Conversely, there's been GPU-based decoding of F55 raw stuff in Resolve for a long time.

The software decoder for Redcode doesn't work well. For 4K it does require some sort of high-speed CUDA support. This is why I use fast CUDA cards in machines that work with DaVinci and Red files; it actually works! Put the old card back in, and Red won't play back in 4K anymore.

 

Depends which MPEG you're talking about. MPEG-1 and MPEG-2 are barely worth accelerating. MPEG-4 (of which part 10 defines the common H.264 format) is widely handled on GPUs. In particular, Nvidia implemented a specific block of circuitry called NVENC in the hardware revision codenamed "Kepler" (implemented in the GeForce 600 series cards) which does H.264 encoding; this is not GPU computing in the more traditional sense, as it doesn't use the stream processors, it uses a task-specific bit of hardware that happens to be part of the GPU. More recent Nvidia products support better encoding and H.265.

MPEG-1, MPEG-2, and H.264 have no requirements for hardware support. Meaning, they are designed to function without specialized hardware, which is the polar opposite of Redcode (JPEG 2000) and TIFF (CinemaDNG, Targa, DPX), formats which were designed specifically for CUDA hardware support. This is why MPEG is so widely used on camcorders and low-end cameras: it doesn't demand a lot of power. Sure, you can accelerate anything, but it's unnecessary and not even worth discussing when it comes to editing, since H.264 isn't a professional camera codec. Today's 8-bit and 10-bit I-frame XAVC codec is so easy to play back that it works on pretty much anything, but it doesn't have the bandwidth necessary for professional use.

 

Unless we've been told very specifically that a particular piece of software uses the GPU, it's safe to assume that it doesn't. ProRes encoding in FCPX is reportedly GPU-accelerated under certain circumstances, for instance, but other instances of it will not be. The Gaussian Blur filter in Premiere is GPU-accelerated. The Fast Blur filter isn't. It's complicated.

In post production, most software is GPU enhanced (Avid, FCP, Premiere, DaVinci). Any real-time effects will be CUDA supported, so yes, if you're using a lot of real-time effects, it may be smart to invest in a faster graphics card. Software like DaVinci has strict requirements on hardware GPU acceleration; I've tested a myriad of cards and workflows with the other software, and you will see small advantages in real-time effect playback speed. However, that wasn't the original poster's question. He asked about playback, not about editing.

 

 

There is only one current driver for any particular graphics card on either platform.

Actually not true at all. In PC land, there are drivers for specific software packages. AE, for instance, has its own home-brew driver package for OpenGL support. It's installed when you run the Adobe installer, and we've found it to be poorly coded. We've done extensive testing with third-party drivers, disabling the Adobe drivers, and have found that Windows 8 causes a whole bunch of issues with support. So it's not that easy on the PC platform, and we've had to revert to Windows 7 on some machines in order to make the older, better drivers work properly. With Macs, the drivers are built into the operating system and the software installers don't overwrite them. This is a HUGE benefit for quick setup; after spending months trying to get the PCs to play nicely and literally an hour getting the Macs to play nicely, my advice is that customers who use PCs should be aware of these driver issues.

 

Your graphics card probably won't know or care if you're using 4:4:4 footage on an NLE timeline.

It sure does! Full RGB displays entirely differently than compressed RGB because it's greater than 10-bit. Most NLEs only support 10-bit playback, so it has to subsample on the fly to 10-bit for playback.

 

I have a 4K camera at my right elbow right now which records 100mbps files, producing a data rate that could easily be handled by a single 2.5" laptop disk.

Yep, and that's a consumer format. In the professional world, it would be rejected by pretty much every facility I've worked at. To give you an example, the minimal requirement for HD is 10-bit 4:2:2 at DNxHD 115. Most facilities require 220Mbps ProRes HQ as their "base" camera format. For 4K work, everyone requires 12-bit 4:4:4 in true 4096x2160, because that's the same format used for DCP delivery. So if you wish to play in the sandbox with the kiddies, go ahead and use 100Mbps MPEG; it will play off a flash drive no problem. Professionals like Max, who is asking about a 12-bit 4:4:4 format (probably ProRes XQ), are clearly looking to do professional post production, which is my expertise.

 

Thanks for tearing apart my post, but all day long I travel around SoCal installing editing systems for companies that produce products you've seen on television, on Blu-ray and in theaters. :)


What sort of material are you likely to be working with, and in what software, and what do you foresee wanting to do with it?

Just bought a Sony F3; plan on doing either ProRes 422 or ProRes 4444 at 1080p.

Later on, after the Raven's release causes price drops, I plan on upgrading to a RED One MX, recording in the same ProRes formats but trying my hand at 4K footage. I would like to work with all of this footage in Sony Vegas.


It wasn't in any sense my intention to be combative or attack you unfairly. My principal thesis is, as I've said, that it's not necessarily easy to evaluate what effect a particular GPU will have, and I don't think you're disagreeing.

 

That said, there were some misapprehensions in your first response, and there are more in your second, which could cause people to spend money inappropriately and I think it's worth discussing them.

 

 

 

MPEG-1, MPEG-2, and H.264 have no requirements for hardware support... which is the polar opposite of Redcode (JPEG 2000) and TIFF (CinemaDNG, Targa, DPX), formats which were designed specifically for CUDA hardware support.

 

This is not the case. TIFF is a thirty-year-old still image format that found use mainly in publishing. It is completely unconnected with CDNG, Targa, or DPX. TIFF, Targa and DPX were all developed a long time before GPGPU existed and have no connection with it. They are almost invariably used to contain uncompressed material and GPGPU is not very relevant. Cinema DNG has a very slim connection, if you want to go looking for one, because it is generally used to contain raw Bayer data, or compressed data, which can be recovered to uncompressed RGB using GPU resources. However, this is far from specific to DNG and the file format itself was not in any sense "specifically designed for CUDA hardware support".

 

 

 

h264 isn't a professional camera codec. Today's 8bit and 10 bit iFrame XAVC codec, is so easy to playback, it works on pretty much anything, but doesn't have bandwidth necessary for professional use.

 

The 4:2:2 modes of XAVC-I record at about 600Mbps in 4K. ProRes in 4:2:2 4K records 734. Beyond that, F55 raw is only about a gigabit, whereas ProRes 4444 is 1.1, so you may as well take the raw. I'm no great fan of vendor lock-in on codecs, but the capability is very similar.
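Laid out side by side in Python, using the figures quoted above (the raw entry is the rough "about a gigabit" approximation, not a data-sheet number):

```python
# Megabits per second, as quoted above. The F55 raw figure is the
# rough "about a gigabit" approximation, not a data-sheet value.
rates_mbps = {
    "XAVC-I 4:2:2 4K": 600,
    "ProRes 422 4K": 734,
    "F55 raw 4K": 1000,
    "ProRes 4444 4K": 1100,
}
for name, mbps in rates_mbps.items():
    print(f"{name:>16}: {mbps:4d} Mb/s = {mbps / 8:5.1f} MB/s")
```

The spread from "consumer-ish" XAVC-I to ProRes 4444 is less than 2:1, which is the point being made: the capabilities are broadly similar.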

 

 

 

 

There is only one current driver for any particular graphics card on either platform.

Actually not true at all.

 

There is only one current driver for any particular graphics card. There may be more than one OpenGL stack, but that's application space.

 

 

 

Full RGB displays entirely differently than compressed RGB because it's greater than 10-bit. Most NLEs only support 10-bit playback, so it has to subsample on the fly to 10-bit for playback.

 

You're confusing compression and bit depth. It's entirely possible to have 8-bit uncompressed RGB. It's entirely possible to have 12-bit compressed RGB. It's also possible to have non-RGB component video at any bitrate with or without compression.

 

More to the point, none of this has anything to do with the UI display. I repeat: the video you have on the timeline makes very little difference to the GPU that may be providing your user-interface display. Almost all UI displays are 8 bit, and the video will be internally resampled to suit. This work may or may not be done on the GPU but in either case it is trivial enough that any even vaguely modern GPU could handle most common cases, regardless of the pixel format of the timeline video.
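As an illustration of how trivial that resampling is, here is a minimal NumPy sketch. It assumes a simple truncation from 10-bit values to the 8-bit range of a typical UI display; real software may round or dither instead:

```python
import numpy as np

# 10-bit pixel values (0..1023) reduced to an 8-bit UI range (0..255)
# by dropping the two least significant bits. This is the kind of cheap,
# per-pixel operation any vaguely modern GPU or CPU can perform at
# timeline resolution without caring about the source pixel format.

rng = np.random.default_rng(0)
frame_10bit = rng.integers(0, 1024, size=(2160, 3840), dtype=np.uint16)

frame_8bit = (frame_10bit >> 2).astype(np.uint8)

print(frame_10bit.max(), "->", frame_8bit.max())
```

A full UHD frame requantised in one vectorised pass; the work scales with resolution and frame rate, not with whether the source was 4:2:2 or 4:4:4.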

 

GPGPU resources may be stretched further by higher-resolution or higher-frame-rate video, but they are entirely commoditised and don't care what the timeline format is beyond the requirement that there's enough to go around. Running CUDA utilisation monitors while playing with modern postproduction software reveals that most of them don't get anywhere near maxing out the hardware, and huge spends on Nvidia Quadro boards are almost always unnecessary.

 

 

 

Yep and that's a consumer format.

 

 

Arguably, but here's the point: it's all uncompressed float by the time the software actually starts working on it.

 

The problem I'm trying to address here is that it's common to hear about people spending huge amounts of money in order to do things which are really quite trivial. It's not necessary to do that. If you've been running around specifying huge systems to do basic jobs, fine, it'll work, it'll just be unnecessarily expensive.

 

P


Just bought a Sony F3, plan on doing either Prores422 or Prores4444 at 1080p.

The camera is 10-bit, so the max color space would be compressed 4:2:2. If you use an external recorder, it doesn't make any difference, because the actual output from the camera is compressed 4:2:2. You'd have to move to an F5 to get 4:4:4 color space, which is 12-bit.

 

Later on, after the Raven's release causes price drops, I plan on upgrading to a RED One MX, recording in the same Prores formats but trying my hand at 4k footage. Would like to work with all of this footage in Sony Vegas.

ProRes is not compatible with PCs. There are hack plugins which allow you to use the codec, but they're inefficient (slow) and will always be slow to play back no matter what. Sony Vegas has a ProRes plugin, but it only supports HQ. For ProRes XQ or 4444, you will need a different editor.


For pity's sake, Tyler, there's a 4:4:4 RGB upgrade for the F3.

 

 

 

The camera is 10-bit, so max color space would be compressed 4:2:2

 

 

4:4:4 color space, which is 12-bit.

 

...and these concepts are entirely unrelated. You can have 10-bit 4:2:2. You can have 10-bit 4:4:4. You can have it compressed or uncompressed.

 

 

 

 

ProRes is not compatible with PCs. There are hack plugins which allow you to use the codec

 

Quicktime for Windows has supported reading ProRes for years.

 

There are third-party plugins which support writing ProRes. Some of them are approved by Apple, some aren't. Definition of "hack" is at your discretion, although I'd be careful.

 

P


This is not the case. TIFF is a thirty-year-old still image format that found use mainly in publishing. It is completely unconnected with CDNG, Targa, or DPX. TIFF, Targa and DPX were all developed a long time before GPGPU existed and have no connection with it.

Interesting; I was part of the committee that wrote the book on motion TIFF sequences. The patents that Adobe holds for CinemaDNG were developed by the company I worked for! LOL

 

DPX, Targa and CinemaDNG are ALL 24-bit TIFF formats. I like to use the word "TIFF" because it's easy for someone with Photoshop experience to understand what's behind those three formats. Yes, there have been A LOT of modifications over the years, including CUDA support.

 

The 4:2:2 modes of XAVC-I record at about 600mpbs in 4K. ProRes in 4:2:2 4K records 734. Beyond that, F55 raw is only about a gigabit, whereas ProRes 4444 is 1.1, so you may as well take the raw. I'm no great fan of vendor lockin on codecs, but the capability is very similar.

Yes, XAVC-I @ 60fps is around 600Mbps, but most people shoot at 24, 25 or 30fps, which is a more respectable 450Mbps. Remember, that's megabits per second, so not very much data.

 

Yep, and raw is 24-bit and around 250MBps (megabytes per second) for 4K (not UHD).

 

There is only one current driver for any particular graphics card. There may be more than one OpenGL stack, but that's application space.

Sure, but the CUDA and OpenGL drivers are separate.

 

You're confusing compression and bit depth. It's entirely possible to have 8-bit uncompressed RGB. It's entirely possible to have 12-bit compressed RGB. It's also possible to have non-RGB component video at any bitrate with or without compression.

Since there isn't a 4:4:4 10-bit format, I was merely using the phrase "10-bit" as a way to differentiate between formats of 12-bit or higher bit depth and formats of 10-bit and lower. The graphics card will always downsample, and you're right, most displays and editing software will sample to 8-bit. Avid and DaVinci can sample to 10-bit or full color space. The higher the bit depth of your material, the harder the GPU works to subsample. We've done thousands of tests on this, and it's very clear that subsampling via the GPU the display monitors are connected to makes a huge difference. In other words, the faster the CUDA GPU is, the faster the subsampling is. You can test this yourself by downloading sample clips of different codecs from the internet and forcing DaVinci or Avid to display in 10-bit mode. You will see a radical difference in real-time rendering speed based on the graphics card.

 

We've installed K5000s on many machines for DaVinci as well and have seen HUGE performance leaps. It doesn't show up if you just play back footage through the machine, so as a "tech," you wouldn't see anything change in the monitoring system. As a creative, once you start using the real-time functions of software like DaVinci, that's when it becomes apparent. My clients sometimes have 20+ mattes in each shot, rendering in real time at full resolution and full bit depth. That's impossible to do without an accelerated CUDA graphics card.

 

Arguably, but here's the point: it's all uncompressed float by the time the software actually starts working on it.

True, but now you're talking about the imager/hardware in camera converting to a compressed color space and lower bit depth, and then expanding that in your editing software. You lose most of the imager's ability to deliver an image at that point.

 

The problem I'm trying to address here is that it's common to hear about people spending huge amounts of money in order to do things which are really quite trivial. It's not necessary to do that. If you've been running around specifying huge systems to do basic jobs, fine, it'll work, it'll just be unnecessarily expensive.

I'd love for you to come out to Hollywood sometime and sit down for a session with my clients who spent millions on post production solutions. It's not because they wanted to; it's because they had no choice! I work with big clients every day, solving these problems one by one, and I tells ya, there is far more to it than meets the eye. I've been flabbergasted at some of these companies' inefficient workarounds, forced on them because their hardware and storage isn't fast enough. Mind you, working at home on a single computer, you can get away with a lot more. Modern/new computers do come with fast enough graphics cards, but they're VERY fast compared to those of only a few years ago. So when I talk about upgrading cards, I'm really not talking about computers made in 2015... more like computers made in 2010, which is what MOST people in this industry still have.


For pity's sake, Tyler, there's a 4:4:4 RGB upgrade for the F3.

I did forget about that, sorry. 10-bit 4:4:4 isn't even on my radar anyway; it's completely worthless in my book. It's a "faux" raw; it's still not working directly with the imager data like real raw. The difference between 4:2:2 and 4:4:4 is generally unnoticeable to all but the most trained eye. The difference between 10-bit 4:4:4 and real raw is night and day. Sony just wanted to pretend they had something more powerful than they really do.

 

Quicktime for Windows has supported reading ProRes for years.

It's not native; the third-party plugins have issues. I deal with Windows-made ProRes files every day, and we've been forced to install Macs in places that create ProRes due to the extremely slow encoding speed AND issues with playback speed. If you encode the same ProRes on a Mac vs. a PC, you will see huge discrepancies in how they function; it's almost two different codecs.


 

 

I like to use the word "TIFF" because it's easy for someone with Photoshop experience to understand what's behind those three formats

 

If this is the level we're at, I'm going to bow out at this point, because I'm unable to interact meaningfully with someone who insists on using a private, modified version of common terminology.

All I'd ask is that you refrain from giving advice, because you're saying things which are not correct and which will cause people trouble. For instance, you say:

 

 

 

Yep, and raw is 24-bit and around 250MBps (megabytes per second) for 4K (not UHD)

 

You are simply wrong. Raw is not intrinsically 24-bit. Most raw-recording devices record rather more than 8 bits per channel; F55, for instance, can record 16 bit in linear raw. Even the Blackmagic cameras can record 10 or 12. It is also not 250MBps; F55 raw hovers somewhere around a gigabit, that is 125 to perhaps practically 150 megabytes per second, for 4K.
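The unit conversion behind those figures, for anyone following along, in Python:

```python
# Bits to bytes: divide by eight. "About a gigabit" of F55 raw is
# therefore roughly 125 MB/s, as stated above; 1200 Mb/s is 150 MB/s.

def mbps_to_MBps(mbps: float) -> float:
    """Convert megabits per second to megabytes per second."""
    return mbps / 8

print(mbps_to_MBps(1000))   # → 125.0
print(mbps_to_MBps(1200))   # → 150.0
```

Confusing Mb/s with MB/s inflates every storage estimate by a factor of eight, which is exactly the kind of overspecification being warned about here.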

 

If someone followed your advice you would cause them to overspecify their storage 2:1, which is madness, no matter what big companies you've worked for.

 

P


If someone followed your advice you would cause them to overspecify their storage 2:1, which is madness, no matter what big companies you've worked for.

Funny you mention that, the guys who listened to me, have plenty of storage and bandwidth. The guys who don't are always scrambling for backup drives to dump data to.

 

I guess things are entirely different on the other side of the pond. :shrug:

 

BTW, my numbers are generally accurate. I double-check them against manufacturers' data sheets before posting every time.


Here is the required bandwidth for standard ProRes 4444 4K: 1199.17Mbps = 149MBps (image below)

So for ONE STREAM, as if you're editing a single track, you'd be OK with a RAID 0 of two or more drives and a high-speed data connection: USB 3, eSATA, Thunderbolt or Fibre. But at that speed, you're looking at 540GB per hour! OUCH! Which is why I recommended a lot of storage.

 

However, since nobody really edits ONE TRACK at a time, you really need twice the bandwidth. Why? Because you'll have at least two sources: one will be the camera originals, and one will be render files for things like transitions and effects. The moment you add another video layer, you need a lot more bandwidth to deal with it as well. Even though NLEs today are smarter than they've ever been, USB 3 and eSATA aren't very good at multitasking; they get bogged down very quickly by high bandwidth requirements.
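The arithmetic behind those recommendations, worked through in Python using the quoted 1199.17 Mb/s rate (the two-stream case is the camera-originals-plus-renders scenario described above):

```python
# Storage and bandwidth math for the quoted ProRes 4444 4K rate.

bitrate_mbps = 1199.17
MBps_per_stream = bitrate_mbps / 8              # ≈ 149.9 MB/s per stream
GB_per_hour = MBps_per_stream * 3600 / 1000     # ≈ 540 GB per hour of footage
two_stream_MBps = MBps_per_stream * 2           # originals + render files

print(f"{MBps_per_stream:.1f} MB/s per stream")
print(f"{GB_per_hour:.0f} GB per hour of footage")
print(f"{two_stream_MBps:.0f} MB/s for two simultaneous streams")
```

Roughly 300 MB/s sustained for two concurrent streams is already past what a single USB 3 enclosure reliably delivers under load, which is why the multi-drive RAID recommendation follows.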

 

These are the reasons I made my first recommendation.

 

prores4444_4k_test.png


ProRes is not compatible with PCs.

 

Um, we use ProRes all the time in Resolve on Windows. You can't write it out of the box from most software, but it plays back just fine. I'm talking 2K files in 4444; 4K is a bit dicier on our system, but that's largely because of the spec of our machine. When we upgraded from the old hardware, ProRes 4444 playback improved dramatically. It's not a format we use daily (we mostly do DPX), but ProRes definitely is not incompatible with Windows.

 

On the file-creation side, our Lasergraphics scanner can create 2K ProRes 4444 HQ or XQ files faster than real time (30fps), or 4K files at 15fps (and that speed limit is not a function of ProRes; it's the sensor, which is limited to 15fps in 4K mode). The ScanStation's control software runs on Windows 7, and the ProRes file creation is done in software, NOT in the scanner itself. The output file is created after all GPU processing is done (scaling, color correction, etc.), and before the file is written to disk.

 

We have not seen any incompatibilities with the ProRes files made from our scanner. They open up perfectly in Mac-based software and in Windows applications that support Quicktime, and they are completely compliant. No gamma shifts, none of the stuff that you see if you use something like ffmpeg and don't know what you're doing...

 

-perry

Edited by Perry Paolantonio

Here is the required bandwidth for standard ProRes 4444 4K: 1199.17Mbps = 149MBps

 

So for ONE STREAM, as if you're editing a single track, you'd be OK with a RAID 0 of two or more drives and a high-speed data connection: USB 3, eSATA, Thunderbolt or Fibre.

Do you recommend waiting for USB 3.1? I heard USB 3 is still a bit too slow compared to Thunderbolt.

Link to comment
Share on other sites
