
Uploading 25fps to the web..?


Karel Bata


Here in the UK, and in much of the world - http://upload.wikimedia.org/wikipedia/comm...C-SECAM.svg.png - we shoot everything at 25fps. However, most people's computers display at around 60Hz - mine are set to 75 or 100.
 
Maybe I should do some tests on this (yawn...) but it seems to me that if the display is refreshing at 60Hz then embedding a video encoded at 25fps may well cause some loss of quality - after all, 60 isn't an integer multiple of 25.
 
Or do the likes of Vimeo and YouTube re-encode to a different frame rate? That can't be good...
 
Anyone know anything about this...?
 
p.s. for a comparison, there are plenty of videos on this page originally encoded at 25fps: http://kareltests.co.uk/


Premium Member

A lot of modern LCD monitors won't display anything other than 60, anyway, regardless of what your computer is doing - they resample internally. Modern graphics card and monitor combinations will communicate via the I2C bus on DVI to ensure that nothing other than 60 is even offered to you.

 

That said, the theory of what you're proposing is entirely correct. This happens as an intrinsic characteristic of how dual-ported frame buffers - such as the video RAM on a computer graphics card - are operated. Usually, things are arranged so that the video playback software will update the graphics memory whenever it has a new frame to display, and the output and display hardware will read it whenever they want to do a refresh. There is no intentional decision to use a named type of pulldown. Between those factors, things are allowed to fall where they may, and as a result you usually end up with each frame of a 25fps video being displayed for either two or three sixtieths of a second.
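If it helps, here's a rough Python sketch of the arithmetic - purely illustrative, not how any actual player is implemented, just each frame's ideal start time snapped to the next display refresh:

FPS = 25   # video frame rate
HZ = 60    # display refresh rate

def first_refresh(frame):
    # index of the first 60Hz refresh at or after this frame's ideal
    # start time (integer ceiling of frame * HZ / FPS, no float wobble)
    return -(-frame * HZ // FPS)

# how many refreshes each of the first ten frames stays on screen
cadence = [first_refresh(f + 1) - first_refresh(f) for f in range(10)]
print(cadence)   # -> [3, 2, 3, 2, 2, 3, 2, 3, 2, 2]: twos and threes averaging 2.4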

 

However, it's more complicated than that, and merely ensuring that the display refresh rate is an integer multiple of the target video frame rate is not enough to ensure that each frame will be displayed for the same amount of time. There are two reasons for this.

 

First, the video track may be marked for 25fps replay, but that timing is derived from the computer's CPU, not from the graphics card. This timing cannot even be as good as the crystal-based CPU clock, as most computers do not run realtime operating systems and cannot guarantee a process access to the processor at a given time. Also, the CPU clock is not phase-locked to the graphics card's own internal timebase. There is no guarantee that the video player will flip frames at exactly the right time for the graphics hardware to always scan them out with precisely correct timing. Eventually, the gradual phase shift between the two clocks will be made up with either a short or a long frame, differing in duration from its neighbours by one monitor scan - so, 1/75 of a second on a 75Hz display.
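Here's a rough Python sketch of that slip. The 0.1% clock error is an invented, deliberately exaggerated figure (real crystals are off by mere parts per million), chosen so the effect shows up within a few hundred frames:

import math

FPS, HZ = 25, 75       # 25fps on a 75Hz display: nominally 3 scans per frame
PLAYER_FAST = 1.001    # assume the player's clock runs 0.1% fast (illustrative)

def scans_shown(frame):
    # flip times as the display clock sees them, distorted by the error
    start = frame / FPS / PLAYER_FAST
    end = (frame + 1) / FPS / PLAYER_FAST
    # count display scans falling between the two flips
    return math.ceil(end * HZ) - math.ceil(start * HZ)

short = [f for f in range(1000) if scans_shown(f) != 3]
print(short)   # -> [333, 667]: those frames get only two scans instead of three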

 

Second, master timing for video replay on computers is not normally derived from the CPU clock in any case. Most video has sync sound and most computer sound card clocks are not phase locked to either the CPU or graphics clocks, so something has to give. It's comparatively difficult to stretch or compress the duration of audio without introducing pops or clicks, but it's pretty easy to duplicate or omit video frames simply by choosing when you update the buffer. Because of this, desktop computer video playback is generally audio synchronised - the player will wait for the sound card to replay a given number of sound samples before updating video frames. Some consumer sound cards have wildly inaccurate sample clocks - enough to noticeably detune instruments - and this is a significant source of inaccuracy.
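Again as a rough sketch with invented numbers - a 0.5% sample clock error is far worse than any real card, but it makes the effect obvious:

NOMINAL_RATE = 48000   # sample rate the file claims
FPS = 25

def frame_to_show(samples_played):
    # media time according to the audio clock - however fast or slow the
    # sound card's crystal actually runs, it becomes the master timebase
    t = samples_played / NOMINAL_RATE
    # map to a video frame; frames get repeated or dropped as required
    return int(t * FPS)

# A card running 0.5% fast delivers 48240 real samples per nominal second,
# so after 60 seconds of wall-clock time the video is ~0.3s (7 frames) ahead:
print(frame_to_show(60 * 48240))   # 1507, where a perfect clock would give 1500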

 

Because of all this, it generally isn't possible to ensure true frame-for-frame playback on a computer, at least with everyday software. It is possible in theory, but in reality very little software actually does it (some high-end postproduction software designed to run on desktop graphics hardware does). This is also why higher-end nonlinear editing cards tend to include both audio and video output on one device - partly because SDI embeds both on one cable, but mainly because they can then ensure that both the audio and video clocks are derived from the same crystal timebase.

 

As a practical matter, where the display refresh rate is an integer multiple of the video frame rate, you will in general get playback where all the frames are displayed for the same amount of time. Occasionally, though, all these factors will conspire to make it hop and skip a little to keep things matching up, and there's nothing whatsoever you can do about it unless you are prepared to write software to do it from scratch.

 

P


Many thanks Phil for an incredibly detailed reply. That advances my knowledge of the subject substantially. I'll link to this thread as 'further reading' from my site, if you don't mind. ;)

 

Hmm... makes me rather pleased too that I've always equipped my PCs with decent sound cards. :rolleyes:


Premium Member

Great post indeed. For someone like me, grounded entirely in the material world, the abstract functions within computers and other electronic devices will always remain just that - abstract: as quick as thought, but also invisible, untouchable, unreliable. No electricity, no anything. It's our manual dexterity that makes us divine humans, never our abstractly functioning brains alone.

