
Transfer frame rates


Will Montgomery



Why, in this world of progressive-scan LCD TVs and 1080p Blu-ray players, would we work at 23.976 fps when transferring film?

 

I always understood pulldown to be for interlaced broadcast, and especially for transferring film shot at various speeds to another frame rate while keeping the same "perceived" speed.

 

For film shot at 24 fps, does it make more sense to transfer and edit at 24 fps and then transcode as needed? I feel like I'm missing something very basic.


Why, in this world of progressive-scan LCD TVs and 1080p Blu-ray players, would we work at 23.976 fps when transferring film?

 

I always understood pulldown to be for interlaced broadcast, and especially for transferring film shot at various speeds to another frame rate while keeping the same "perceived" speed.

 

For film shot at 24 fps, does it make more sense to transfer and edit at 24 fps and then transcode as needed? I feel like I'm missing something very basic.

 

Because SMPTE standards require 59.94/29.97/23.976 Hz rates in order to be compatible with NTSC. It all comes down to the standards that are in use now. Of course, when you get DPX scans back from a post house you could edit them at 24.00 fps (or any frame rate you want, really), but you'll have to end up going to 23.976 sooner or later unless you're making a film print. I've never done it, but playback would probably be weird if you burned 24-native material to a DVD instead of 23.976. When you transfer film in telecine, it isn't really going straight from 24 to 29.97: the pulldown takes it from 24 to 23.976 and then to 29.97.
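To make that chain concrete, here's a quick numeric sketch (my own illustration in Python, not something from the original post) of the 0.1% slowdown plus the 2:3 pulldown that turns 24 fps film into 29.97 fps interlaced video:

from fractions import Fraction

FILM = Fraction(24)                          # camera speed
SLOWDOWN = Fraction(1000, 1001)              # the 0.1% "pulldown" factor

transfer_rate = FILM * SLOWDOWN              # speed the telecine actually runs the film at
video_rate = transfer_rate * Fraction(5, 4)  # 2:3 pulldown: 4 film frames -> 5 video frames

print(float(transfer_rate))                  # 23.976023976...
print(float(video_rate))                     # 29.97002997... (59.94 fields/s interlaced)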

 

What you really should be asking is: why in the hell did we adopt a fractional frame rate standard?! :-)



What you really should be asking is: why in the hell did we adopt a fractional frame rate standard?! :-)

 

It was one of those last-minute "Aw S--t" moments in the development of NTSC color. The original B&W standard, developed in the 1930's and adopted in 1941, used most of the 6 MHz channel for amplitude-modulated luminance, with a 4.5 MHz FM audio subcarrier. To add color to that system without making the existing TV sets obsolete, they went with another subcarrier for the chroma. They wanted its phase to alternate from line to line, so the subcarrier frequency had to be an odd multiple of half the line rate. They went with 455/2.

 

The arithmetic:

 

30 frames per second × 525 total lines = 15.75 kHz line frequency.

 

15.75 kHz × (455/2) = 3.583125 MHz chroma subcarrier frequency.

 

Then came the "Aw S--t". When you combine two different frequencies, you get "beats" -- a new frequency that's the difference between the two. Nobody guessed until they tried it that the beat frequency between those two subcarriers would produce a visible herringbone pattern on the picture. I mean, the beat is a little under a MHz, who woulda guessed?

 

Anyhow, they discovered that they could fix the problem by shifting either one of the subcarriers up or down by about a tenth of a percent. Alas, they chose to move the new chroma subcarrier down. If they had moved the audio subcarrier instead, none of this would have happened. The multiply-by-1000/1001 business propagates back down through all the other arithmetic, the 455/2 and the line rate, and gets us to 59.94 fields and 29.97 frames per second, and the familiar 3.579545 MHz color subcarrier.
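As a sanity check on the arithmetic, here's a short Python sketch (my addition, not part of the post) that reproduces those numbers, including the roughly 917 kHz beat mentioned later in the thread:

from fractions import Fraction

FRAME_RATE_BW = Fraction(30)              # original B&W frame rate
LINES = 525
AUDIO_SUBCARRIER = Fraction(4_500_000)    # Hz, the fixed FM sound subcarrier

line_rate = FRAME_RATE_BW * LINES         # 15,750 Hz
chroma = line_rate * Fraction(455, 2)     # 3,583,125 Hz, the first-try color subcarrier
beat = AUDIO_SUBCARRIER - chroma          # 916,875 Hz -- the source of the herringbone

SHIFT = Fraction(1000, 1001)              # the 0.1% fix, applied to the scan rates
print(float(chroma * SHIFT))              # 3,579,545.45... Hz color subcarrier
print(float(line_rate * SHIFT))           # 15,734.27 Hz line rate
print(float(FRAME_RATE_BW * SHIFT))       # 29.97 frames/s (59.94 fields/s)
print(float(beat))                        # ~917 kHz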

 

If only we could go back in time and tell the original committee that in 20+ years there'd be time code, and we'd be putting numbers on all the frames, and having to mess with drop and non-drop....

 

So, along comes the transition to digital HDTV. In order to downconvert and simulcast during the transition, the new system had to run at the same "point nine something" rates as NTSC. As long as we're using converter boxes to feed the old NTSC sets, what they stepped in back in 1953 will still be stuck to the shoes of television.

 

Theoretically, some day we could do another transition, and have a day when all the engineers throw a bunch of switches and check different boxes in a bunch of sub sub sub menus, and convert everything to 30.00 and 60.00. But then all the old shows would come up short by 3 seconds and 18 frames per hour. It would be a tough tough sell to convince everybody involved that the change would be worth the effort. It kinda makes you wonder what we're effing up today that'll bite down hard on the tush of the future....

 

 

 

 

-- J.S.


My understanding of this is that when television converted from b&w to color, the frame rate that was standard for b&w introduced audible noise into the signal that was sent to homes. So, they discovered that by changing the frame rate to 29.97 the audible noise went away. I may be oversimplifying it, but my understanding is that this is the gist.

 

So, because the frame rate was not a "whole number" (like 30 fps), they had to "invent" drop-frame timecode to adjust for the fractional frame rate. That's pretty much it in a nutshell. Because of AUDIO problems, they were not able to keep the frame rate at a whole number, and that led to the necessity of drop-frame timecode.



My understanding of this is that when television converted from b&w to color, the frame rate that was standard for b&w introduced audible noise into the signal that was sent to homes. So, they discovered that by changing the frame rate to 29.97 the audible noise went away.

 

It was a herringbone pattern in the video, not a sound issue. Other than that, your summary is correct.

 

 

 

 

-- J.S.


I found this from a while back:

 

http://www.cinematography.net/Pages%20DW/DFvsNDF.htm

 

Way back when B&W TV was invented, TV signals in North America were broadcast at 60 interlaced scans per second to create 30 complete frames per second. This was very convenient because the transmitters and home receivers could use the available 60 cycle AC power to keep everything in sync. Then along came colour (or color) as the Yankees call it. It was then discovered that a 30 fps color picture when received on a B&W set had an annoying hummm.

 

So the politicians of the day, being politicians, and not wanting to offend their aged voters by telling them to chuck their old B&W sets and buy colour, decreed that the new color signals had to be broadcast in such a way that grandpa could see a B&W picture of a young Johnny Carson on his 12 inch TV set with the round picture tube. So came into being the FCC law that color programs would have to be broadcast at 29.97 frames per sec. which eliminated the audio hummm. We have been burdened with the immense consequences of this stupidity ever since.

 

For example, a one hour production recorded at 30 fps would contain 30(frames) x 60(seconds) x 60(minutes) or 108,000 frames. The problem was that if this tape was broadcast at just 29.97 fps, at the end of the hour only 107,892 of the 108,000 frames would have been broadcast - with the last 108 frames or 3.6 seconds never seeing the light of day. While missing the last 3.6 seconds of credits would not have been the end of the world, the points at which commercials and station breaks were to be inserted would be a mess because they would be out of sync by anywhere from 0 to 3.6 seconds depending on where in the hour show they were placed.
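A quick back-of-the-envelope check of those numbers (my own sketch, not part of the quoted article):

from fractions import Fraction

NTSC = Fraction(30000, 1001)          # 29.97... fps broadcast rate
recorded = 30 * 60 * 60               # 108,000 frames on a one-hour tape at 30 fps
aired = NTSC * 3600                   # frames that actually go out in a real-time hour

print(float(aired))                   # ~107,892.1
print(recorded - round(aired))        # 108 frames, i.e. 3.6 seconds, never air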

 

The solution was to skip certain frame numbers (NOT TO DELETE OR DROP THE ACTUAL FRAMES) so at the end of the hour as the 107,892nd frame was being broadcast, the time code would read exactly one hour, zero minutes, zero seconds and zero frames. (01:00:00:00), and if in editing, you fast forwarded the tape to the 15 min. or 30 min. point as shown by the time code, then you would have reached the 15 min. or 30 min. point in the show that the real broadcast would be at as seen by grandpa on his B&W set and by the affiliate stations who were inserting local commercials at these points.

 

The exact formula as to which frame numbers (not actual picture frames) to drop so you would end up with a 107,892 frame tape that would last a full broadcast hour is as follows :

 

Two frame numbers - "00" and "01" - are dropped every minute ( 2 x 60 = 120 frame numbers ) EXCEPT for minute points that have a "0" number - 00, 10, 20, 30, 40 and 50 minutes - ( 2 x 6 = 12 frames). This results in 120-12 or 108 frame numbers (not frames) being skipped.
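For anyone who wants to see that counting scheme in code, here is a small sketch (mine, not from the article) that turns a running frame count into a drop-frame label using exactly the rule described above:

def frames_to_df(frame_count):
    """Drop-frame label for a frame counted from 00:00:00;00 at a nominal 30 fps."""
    FULL_MINUTE = 30 * 60                        # 1800 frames, no labels skipped
    DROP_MINUTE = FULL_MINUTE - 2                # 1798 labels in a minute that skips :00 and :01
    TEN_MINUTES = FULL_MINUTE + 9 * DROP_MINUTE  # 17,982 frames per ten-minute block

    blocks, rem = divmod(frame_count, TEN_MINUTES)
    if rem < FULL_MINUTE:                        # minutes 00, 10, 20, ... keep all 30 labels
        minutes = blocks * 10
        frame_of_minute = rem
    else:                                        # the other nine minutes skip labels :00 and :01
        rem -= FULL_MINUTE
        minutes = blocks * 10 + 1 + rem // DROP_MINUTE
        frame_of_minute = rem % DROP_MINUTE + 2

    ss, ff = divmod(frame_of_minute, 30)
    hh, mm = divmod(minutes, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(frames_to_df(1800))      # 00:01:00;02  (labels :00 and :01 were skipped)
print(frames_to_df(17982))     # 00:10:00;00  (no skip at the ten-minute mark)
print(frames_to_df(107892))    # 01:00:00;00  (a broadcast hour lands back on the clock)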

 

 

 

 

Anything destined for broadcast must be in drop frame. That's both for real time accuracy and the fact that the broadcasters require it. Note that in terms of television series, this does not mean that post production is in drop frame, for a number of reasons. Primary is the simple fact that most series production (whether film or HD tape) is done at 24fps. Editing systems that can cut at 24 frames (Avids, Final Cut systems properly equipped for capture and output) operate under the assumption that all "A" frames fall on time codes ending in 0 or 5 - but this assumption can only be made if the time code is non-drop frame.
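The "A frames fall on time codes ending in 0 or 5" point comes straight from the 2:3 pulldown cadence: four film frames become five video frames, so the A frame repeats every five video frames. A rough sketch of that cadence (my own illustration, assuming the cadence starts on frame :00 of non-drop code):

# 2:3 pulldown: film frames A, B, C, D contribute 2, 3, 2, 3 fields respectively.
CADENCE = [("A", 2), ("B", 3), ("C", 2), ("D", 3)]

fields = []
for film_frame, n in CADENCE * 6:              # 24 film frames -> 60 fields -> 30 video frames
    fields.extend([film_frame] * n)

video_frames = ["".join(fields[i:i + 2]) for i in range(0, len(fields), 2)]
a_frames = [i for i, vf in enumerate(video_frames) if vf == "AA"]
print(a_frames)                                # [0, 5, 10, 15, 20, 25] -- frames ending in 0 or 5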

 

In the case of NTSC finishing, a drop frame recording can be made for network delivery after post production is complete. In the case of HD finishing, there is no drop frame mode for 24p, so once again, network delivery elements are made with drop frame, but the masters are not.

 

 

 

While it is true that DF is appropriate for broadcast (because it stays clock-accurate)... my understanding is that it really does not matter what format you acquire in, DF or NDF. It only matters what TC format you post in.

 

When you create a timeline in DF for a network, it doesn't matter what the tape was shot in...it still creates a time-accurate DF EDL.

 

We used to argue about this when I was in tv news...and the engineers always said that as long as the tape machines were in DF when we edited and went to air, the acquisition TC format did not matter.

 

BTW -

 

I can tell you from very recent personal experience that the NDF/DF issue is HUGE over the course of a 45 min. (read: hour) program. Because of a misunderstanding over which symbol was DF (;) and which was NDF (:), my editor put us more than a minute over. For those who struggle: my college prof always said that you can easily tell the difference because the "comma" in DF appears to "drop" in the semicolon, compared to the colon.

 

……

 

The only time I ever shoot NDF is when I know the video will be intercut with film - I know that editors can now compensate for this, but it does make their lives easier if the source material is at least consistent.

 

…..

 

 

NTSC time code runs at 29.97 as its native rate. Not 30. However, in order to record double-system sound with a film camera running at a crystal 24 fps and stay in sync during a telecine transfer, it is necessary to use time code running at 30 fps. This is because all telecines transferring to NTSC are resolved to 23.98, not 24, so in order to stay in sync, the sound must be "pulled down" by the same amount. This is accomplished by resolving the sound to video sync (29.97).

 

Now, for some reason, many people equate the 29.97 frame rate with drop frame time code. However, the drop frame scheme is a counting scheme - skip two frame numbers every minute except every 10th minute - and has nothing whatsoever to do with the actual frame rate. Thus, you can have either drop frame or non-drop frame counting in either 29.97 or 30 frame code. This has come in handy because when shooting 24p video, you are usually shooting at 23.98. So to stay in sync, the sound must also be recorded at a video rate (29.97) so that it doesn't get pulled down when it's resolved to video sync in post. Hence the need for "29.97 non-drop frame" code.
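A tiny sketch of that distinction (my own, not Mike's), using the same 1000/1001 factor from earlier in the thread:

from fractions import Fraction

PULLDOWN = Fraction(1000, 1001)        # the 0.1% slowdown

print(float(24 * PULLDOWN))            # 23.976... -- the "23.98" acquisition rate
print(float(30 * PULLDOWN))            # 29.970... -- the NTSC frame rate

# Drop frame vs. non-drop frame is only a labelling choice on top of that 29.97 stream:
# roughly one real-time hour in (frame 107,892), NDF reads 00:59:56:12 while
# DF reads 01:00:00;00 -- the frames themselves, and their rate, are identical.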

 

Have I successfully added to the confusion?

 

Mike Most

VFX Supervisor

IATSE Local 600

Los Angeles



I found this from a while back:

 

Yup. That's incorrect. There were also a lot of audio buzz problems in the early days, but that was due to failure to filter out video detail approaching the 4.5 MHz audio subcarrier. I've heard a little of that recently, on a hotel TV where they had some cheap computer generated text that wasn't filtered. The beat frequency between the two subcarriers is like about 917 KHz, way above the audio range.

 

Oh, and there's also the original channel numbering story, why it was 2 - 13 instead of 1 - 12..... ;-)

 

 

 

-- J.S.
