Wednesday, July 21, 2010

Rate difference means jitter

Movies on film are shot at 24 frames per second (fps), and television (in the US) is broadcast at approximately 60 fields per second, or 60i, where the "i" suffix means "interlaced": one field shows only the even horizontal scan lines, and the next shows the odd ones. The antonym of "interlaced" is "progressive," which means the whole picture is shown in every frame. Movies on film are progressive. The process of adapting film for broadcast on television is known as telecine. Film and TV engineers have always known that telecine can introduce jitter. The jitter is not so much due to the difference between progressive and interlaced, but due to the difference in frame rate, since 24 fps does not divide evenly into 60 fps.

[Figure: frames from the 24 fps source mapped onto 60 fps display slots; the green frame spans three slots, the blue frame two.]

Frames are supposed to be shown at a constant rate, that is, at a fixed interval. However, when you adapt one frame rate to another, some frames end up being shown longer than others. In the figure above, the green frame from the 24 fps source spans three 60 fps frames, but the blue frame spans only two.

This difference causes the motion to appear jerky.
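
To make the 3-to-2 pattern concrete, here is a minimal Python sketch (my own illustration, not part of any telecine hardware or standard) that maps each 60 fps display slot back to the 24 fps source frame shown in it and counts how many slots each source frame occupies:

    # Sketch: map each 60 fps display slot back to the 24 fps source frame
    # shown in it, by flooring the time-scaled index.
    # (Illustrative only; real telecine works on interlaced fields, not whole frames.)

    SOURCE_FPS = 24
    DISPLAY_FPS = 60

    def source_frame_for_slot(slot: int) -> int:
        """Return the index of the 24 fps frame shown in the given 60 fps slot."""
        return (slot * SOURCE_FPS) // DISPLAY_FPS

    # Count how many display slots each source frame occupies over one second.
    spans = {}
    for slot in range(DISPLAY_FPS):
        frame = source_frame_for_slot(slot)
        spans[frame] = spans.get(frame, 0) + 1

    print(spans)
    # {0: 3, 1: 2, 2: 3, 3: 2, ...}  -- the uneven 3/2 alternation

Running it shows each source frame held for alternately three and two display slots, which is exactly the uneven spacing described above.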

Of course, this problem arises whenever you sample at a constant rate and then need to adapt the samples to another rate. For a waveform stream such as audio, an interpolation scheme tries to "guess" the curve of the waveform and resamples the guessed curve at the new rate. This introduces interpolation error when the guess is off.
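
As a rough illustration of that "guess and resample" idea, the sketch below resamples a waveform with plain linear interpolation. The function name and rates are mine, and real resamplers use much better filters; the straight-line guess here is precisely where the interpolation error creeps in.

    # Sketch: resample a waveform from one rate to another by linearly
    # interpolating between the two nearest original samples ("guessing"
    # the curve with straight line segments).

    def resample_linear(samples, src_rate, dst_rate):
        out = []
        n_out = int(len(samples) * dst_rate / src_rate)
        for i in range(n_out):
            # Position of this output sample on the original time axis.
            t = i * src_rate / dst_rate
            lo = int(t)
            hi = min(lo + 1, len(samples) - 1)
            frac = t - lo
            out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
        return out

    # Example: a short ramp sampled at 48 kHz, resampled to 44.1 kHz.
    ramp = [i / 48.0 for i in range(48)]
    print(len(resample_linear(ramp, 48_000, 44_100)))  # 44 samples out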

How does this relate to computer science? When you connect two fixed-rate link-layer networks, the rate difference causes jitter in the same way.
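
Here is a toy sketch of that situation (the tick counts and names are invented for illustration, not taken from any real link-layer protocol): frames arrive every 5 ticks on one link but can only depart on transmit slots every 3 ticks on the other, so each frame waits a different amount of time even though both links run at perfectly constant rates.

    # Toy sketch, not a real protocol: frames arrive on one fixed-rate link
    # every 5 ticks and must be forwarded on another link whose transmit
    # slots fall at multiples of 3 ticks. Because 5 is not a multiple of 3,
    # the per-frame wait varies -- that variation is the jitter.

    IN_PERIOD = 5    # incoming link delivers a frame every 5 ticks
    OUT_SLOT = 3     # outgoing link can transmit only at multiples of 3

    def departures(n_frames):
        deps = []
        for i in range(n_frames):
            arrival = i * IN_PERIOD
            # Wait for the next outgoing transmit slot (ceiling division).
            send = ((arrival + OUT_SLOT - 1) // OUT_SLOT) * OUT_SLOT
            deps.append((arrival, send, send - arrival))
        return deps

    for arrival, send, delay in departures(6):
        print(f"arrived {arrival:2d}, sent {send:2d}, waited {delay}")
    # The per-frame wait cycles 0, 1, 2, 0, 1, 2 ... so departures are
    # unevenly spaced, just like the 3/2 frame spans in telecine.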
