Giz Explains: Why Frame Rate Matters

We all know the motion picture is a lie. That movement on screen? It’s just a bunch of still images. Still images that look more like believable, lifelike motion the faster they flicker past. Faster is better, and the 48 frame-per-second version of The Hobbit was just the beginning.

What is frame rate anyway?

If you understand how film projectors work, you’ll know that the individual images that make up a film strip are run through a projector assembly and flash consecutively in front of an illuminated aperture that projects each image up onto the screen, which at high enough speeds gives the illusion of motion. Movie magic! The rate at which these frames are shown is expressed in frames per second (FPS) for traditional celluloid film, and as a “refresh rate” measured in hertz (Hz) for digital films and display monitors. In both cases, that value reflects how fast the still images flicker, and the faster they flicker, the more lifelike and realistic the motion appears.

FPS and refresh rate are related, but they’re not exactly the same. It all ties back to an old projectionist trick. To help minimise the jitteriness of 24 FPS films (the standard speed you’ll see at the movies), projectionists would flash the same frame two or three times before the next frame came up. The frame rate is the number of complete still images shown every second — so that would still be 24 FPS — but the refresh rate is the total number of times any image flashes over the course of a second; in this case, 72 Hz. So a 24 FPS film can still have a refresh rate of 72 Hz if each frame is being shown three times, or a refresh rate of 48 Hz if each is only being flashed twice.
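
To put numbers on that relationship, here’s a minimal Python sketch (the function name is ours, purely for illustration):

```python
# Illustrative only: refresh rate is frame rate times flashes per frame.

def refresh_rate_hz(fps: int, flashes_per_frame: int) -> int:
    """Total image flashes per second, given complete frames per second."""
    return fps * flashes_per_frame

print(refresh_rate_hz(24, 3))  # 72 -- each 24 FPS frame flashed three times
print(refresh_rate_hz(24, 2))  # 48 -- each frame flashed twice
```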

This is slightly different from the refresh rate listed for your TV or monitor, though. Both measure the same basic thing — the number on your monitor is just from the perspective of the hardware instead of the media. That is to say, the 60, 120, or 240 Hz refresh rate on your TV measures the maximum speed at which that gadget can flash new images, independent of the media you’re trying to watch on that screen.

How it hits your eyes

The human eye can differentiate between 10 and 12 still images per second before it starts seeing them as motion. That is, at 12 FPS or less, your brain can tell that it’s just a bunch of still images in rapid succession, not a seamless animation. Once the frame rate gets up to around 18 to 26 FPS, the illusion of motion takes hold and your brain is fooled into thinking that these individual images are actually a moving scene.

An illustrated example of frame rate judder – image: Reddit

So if a frame rate is too slow, motion looks jagged, but if it’s too fast you can have problems too. Live-action movies filmed at 48 FPS tend to have that certain soap-opera effect people hated in The Hobbit.

That’s because one major component of making motion seem real and lifelike is motion blur. In the natural world, motion blur is simply the loss of detail you get when you’re looking at something that’s moving fast, or when your eyes are moving fast as they look at something. Your focal point is really only about as big as a silver dollar held at arm’s length, but when your eyes are fixed on a stationary object — or the object is travelling slowly enough for your eyes to track it — there’s no loss of visual acuity. However, when your eyes move quickly — glancing from the left periphery of your field of vision to the right, say — they don’t have sufficient time to take in the same level of detail and visual information, which causes motion blur.

In film, motion blur occurs because you’re really looking at a series of static images, each displayed for a tiny amount of time. For a film run at 25 FPS, each frame is onscreen for just 40 milliseconds (1/25th of a second), and the rapid cycle of frame, blankness, new frame produces the effect of motion blur. You won’t get that — at least not the same way — in real life, which is part of what makes movies look like movies. Motion blur is also important, especially in modern CGI films, because without it the progression between frames can appear to stutter (an effect called strobing). To get a more visceral handle on what we’re talking about, go check out Frames Per Second and play around with the various frame rates.
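
The per-frame timing is just the reciprocal of the frame rate. A quick sketch, with an illustrative helper of our own:

```python
# Illustrative arithmetic only: how long each frame stays onscreen.

def frame_duration_ms(fps: float) -> float:
    """On-screen time of a single frame, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 25, 48, 60):
    print(f"{fps} FPS -> {frame_duration_ms(fps):.1f} ms per frame")
# 25 FPS -> 40.0 ms, matching the figure in the text.
```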

The current industry standard is 24 FPS — that number was decided on for economic rather than theatrical reasons, but more on that in a second — and that’s what has determined what movies look like for us. But that is hardly the maximum we can perceive. Both current technology and the innate visual prowess of the human eye can handle far higher rates than what we see on TV.

A frame rate for every medium

From their inception, cinematic frame rates have been undercut by the economic interests of the movie-making industry. The earliest silent movies were shot at around 16 to 20 FPS — since that was the bare minimum that actually generated the continuous motion effect — but were also limited by the arm strength of the cameraman, who had to manually crank a reel of film through the camera. Movie houses at the time would often play films back at a slightly faster rate than that at which they were filmed, which caused the on-screen motion to appear jerky.

Thomas Edison was a very early proponent of higher frame rates. He argued for a 46 FPS base rate because “anything less will strain the eye”, but even by the 1920s, the average frame rate had only climbed to between 22 and 26 FPS. When talkies hit in 1926, projectionists could no longer vary the frame rate on the fly like they used to, because it would throw off the pitch of the sound playback, so the film industry had to pick a stable frame rate at which to project. The industry settled on 24 FPS, mostly because that was the slowest (and therefore least expensive to produce) frame rate that could still support audio when played from a 35 mm reel. Early home video cameras shot at equally poor frame rates: Standard 8 cameras shot at 16 FPS, and Super 8 bumped that number up to 18. So just as the current 50 nit brightness standard was chosen because that’s how bright the cheapest usable bulbs movie houses could find were, our modern frame rate standard is based on the cheapest 20th century option the industry could find.

Today, we have three primary frame rate standards — 24p, 25p, and 30p — and a whole slew of competing alternatives that constitute various potential future standards. American (NTSC) broadcasts go out at 24p and provide a very “cinema-like” motion blur effect. European PAL/SECAM-derived broadcasts go out at a perceptively identical but mathematically different 25p, since their TVs work on a base 50 Hz scale rather than North America’s 60 Hz. And 30p is the de facto standard for home movies and personal camcorders, as it accurately mimics 35 mm’s feel without as many visual artefacts.

The alternative frame rates include 48p, which is what Peter Jackson used to film The Hobbit. Going higher, you’ve got 90p and 100p, which are options on the GoPro Hero, and 120p, which is the new standard in UHD televisions (and part of Rec. 2020). The highest commercially available frame rate is currently 300 FPS, which the BBC has been playing around with for some of its sports broadcasts (no, not for cricket) and could prove quite helpful in the future, as it can easily be stepped down to both 50 Hz and 60 Hz.
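
The step-down is clean because 300 divides evenly by both 50 and 60, so conversion is a simple frame drop. A little sketch of that arithmetic (names are illustrative):

```python
# Why 300 FPS steps down cleanly: it's an integer multiple of both
# broadcast rates, so conversion is plain decimation (keep every Nth frame).

SOURCE_FPS = 300

for target_hz in (50, 60):
    assert SOURCE_FPS % target_hz == 0, "not an integer step-down"
    step = SOURCE_FPS // target_hz
    print(f"{SOURCE_FPS} FPS -> {target_hz} Hz: keep every {step}th frame")
# 300 -> 50 Hz: keep every 6th frame; 300 -> 60 Hz: keep every 5th frame.
```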

So which frame rate is best?

That depends on who you ask. Peter Jackson thought he saw the light while filming The Hobbit, posting the following to Facebook in November 2011:

Film purists will criticise the lack of blur and strobing artifacts, but all of our crew — many of whom are film purists — are now converts. You get used to this new look very quickly and it becomes a much more lifelike and comfortable viewing experience. It’s similar to the moment when vinyl records were supplanted by digital CDs. There’s no doubt in my mind that we’re heading towards movies being shot and projected at higher frame rates.

Unfortunately, very few people agreed. It will be interesting to see how James Cameron’s Avatar sequels, both of which are reportedly being shot at 48 FPS, will fare. Maybe Edison was just wrong, and maybe we’re too used to the effects of 24 FPS motion blur, which soften our movies, make them look more dreamlike, and make the props and other semi-realistic stuff a little fuzzier and easier to believe.

According to Simon Cooke of Microsoft’s Advanced Technology Group, faster is indeed better because of how the human eye works on a mechanical level. Cooke’s explanation immediately dives into a bunch of maths and complex biological terminology (you can confound yourself with it here), but basically, his point is that your eye jiggles just a little bit — as a sack of jelly is wont to do — even when you’re focused on a fixed point. These jiggles, known as ocular microtremors, occur at an average rate of around 84 Hz and, he proposes, help your brain better discern edges within your field of vision by giving the cones in your retina two very slightly different angled views of the same object. With twice the amount of information coming into your visual cortex, your brain is able to stitch together a better visual image with more defined edges.

But at the current 24 FPS standard, your eyes’ jiggles aren’t actually doing anything, because the image isn’t changing fast enough for the microtremors’ sampling effect to work. At those rates, “Your eye will sample the same image twice, and won’t be able to pull out any extra spatial information from the oscillation,” writes Cooke. “Everything will appear a little dreamier, and lower resolution.”

Cooke recommends running content above roughly 42 Hz, about half of the eye’s average oscillation rate. For movies specifically, he argues for 48 Hz, though that isn’t without its drawbacks:

At 48 Hz, you’re going to pull out more details from the scene than at 24 Hz, both in terms of motion and spatial detail. It’s going to be more than 2x the information you’d expect just from doubling the spatial frequency, because you’re also going to get motion information integrated into the signal alongside the spatial information. This is why for whip-pans and scenes with lots of motion, you’re going to get much better results with an audience at faster frame rates.

Unfortunately, you’re also going to get the audience extracting much more detail out of that scene than at 24 Hz. Which unfortunately makes it all look fake (because they can see that, well, the set is a set), and it will look video-y instead of dreamy — because of the extra motion extraction which can be done when your signal changes at 40 Hz and above.

In short, higher frame rates look more real, but they make things that are not real look less real.
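
Here’s a back-of-the-envelope sketch of Cooke’s threshold, assuming only the average 84 Hz microtremor rate he cites (the helper function is our own invention):

```python
# Sketch of Cooke's sampling argument. Below half the ~84 Hz tremor rate,
# successive eye "jiggle" samples land on the same frame and add no detail.

TREMOR_HZ = 84.0

def tremor_samples_per_frame(display_hz: float) -> float:
    """Average number of microtremor samples falling on each displayed frame."""
    return TREMOR_HZ / display_hz

for hz in (24, 48, 60):
    s = tremor_samples_per_frame(hz)
    verdict = ("same frame sampled twice: no extra detail"
               if s >= 2 else "a new frame each sample: extra edge detail")
    print(f"{hz:>2} Hz -> {s:.2f} samples per frame ({verdict})")
```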

So either Hollywood will need to invest in better special effects, or movie-going audiences will need to retrain their suspension of disbelief. But the increased visual information that comes with a higher frame rate could still prove valuable for the cinematic experience, especially in large formats like IMAX.

As the Red Camera company points out, since the visual field on an IMAX screen is so large, some onscreen action — when played at the current 24 FPS — judders more visibly and contains more visual artefacts, simply because of the amount of screen real estate the images are being projected onto.

“Moving objects may strobe or have a ‘picket fence’ appearance as they traverse a large screen,” the company’s blog reads. “At 24 FPS, a 50 foot screen shows an object as jumping in 2 foot increments if that object takes one second to traverse the screen.”
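
The arithmetic behind Red’s example is simple division; a quick sketch (illustrative names only):

```python
# An object crossing the screen in one second jumps screen_width / fps
# between consecutive frames.

def jump_increment_ft(screen_ft: float, traverse_s: float, fps: int) -> float:
    """Distance the object appears to jump between frames."""
    return screen_ft / (traverse_s * fps)

for fps in (24, 48, 120):
    print(f"{fps:>3} FPS: {jump_increment_ft(50.0, 1.0, fps):.2f} ft per jump")
# 24 FPS -> ~2 ft jumps on a 50-foot screen, as the quote says.
```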

But with a higher frame rate, that movement increment decreases significantly — as do screen flicker and eye strain. The question is whether movie studios will be as on board with the new method as directors like Cameron and Jackson are. After all, the main reaction viewers tend to have to higher frame rates is that they look “weird” and “wrong.”

Overall, the industry does appear to be gradually accepting the value of higher frame rates. YouTube recently began offering select videos — mostly video game playback — at 60 FPS, to much fanfare. And with action cams like the GoPro Hero 3 now offering 120 FPS at 720p (and 60 FPS at 1080p), the amount of content being generated at those rates is only going to increase. It’s worth noting, though, that video game footage is totally fake and GoPro footage is totally real; neither medium has to blend the fake and the real as carefully as movie makers do.

And it’s these guys who will lead the charge. We’re likely going to see high frame rate content spur adoption as legions of home videographers clamour for an online means of sharing their backyard adventures — not just Hollywood directors. [Extreme Tech, Wiki 1, 2, CMU, Accidental Scientist, Red Camera, Web Archive, BBC, High Def Digest, Frank Schrader, Sony]

