Higher Frame Rates in Movies

Too Much Garlic

Master Member
I've always wondered about higher frame rates, how they're arrived at, and why they look different. Shooting at a higher frame rate and playing it back in real time would look different depending on how many frames per second were shot, right? Similar to strobe light experiments that make moving objects look like they slow down or even reverse. Is anyone actually experimenting with higher frame rates to try to get a look similar to the current 24 frames per second of movies, since 48 or 60 certainly isn't it? Could a test be made where the frame rate is raised by one frame with each pass and played back at that rate, to watch the movements seem to speed up and slow down and find the perfect frame rate for higher frame rate movies, one where the movements don't look sped up? Or did the people who chose the current rates just pick numbers on technical grounds instead of visual ones?

What first made me think of this is a music video shot at 200-something frames per second and played back in real time, where the movements almost looked a bit slowed down. Comparing that to higher frame rate movies like the Hobbit films at 48 frames per second, where the movements look sped up, and TV movies shot at 60 frames per second, it seems the look of movement changes as you go up the frame rate scale, depending on how many frames per second are used. Right?

The reason I ask is that I don't usually like movies filmed at higher frame rates, because the movements look too fast, and I've always thought that instead of 48 frames they should perhaps have used 42 or even 40. What made them choose 48? Simply because it's double 24? Why do they choose 60 frames? Or 120, which Gemini Man was filmed at (I haven't seen it, or anything else in that frame rate)? Do they actually go through all the options to find the best-looking higher frame rate, or are they just picking a number at random? It would be interesting to see someone test every frame rate from 24 to 120 to see how movement speed appears to change when viewed at normal speed, and which rates look similar in movement speed to 24 frames per second.

Does any of this make sense? I've tried talking with my friends about it and they don't understand what I'm talking about. When I bring up a strobe light test a physics teacher did when I was in school, I may not be explaining it correctly, but I feel it illustrates what I mean: finding the best frame rate for movies shot at higher frame rates, one that doesn't make the movements look sped up or slowed down, but rather looks exactly how everyone remembers 24 frames per second - only at a higher frame rate.
 
Higher frame rates aren't about speeding up or slowing down; they're about depicting motion in a more realistic way when displayed at the same high speed. At standard 24fps the camera shutter is closed for about as long as it is open, so some of the action taking place between the frames never gets captured. Higher frame rates allow more of that action to be captured, resulting in smoother motion.
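To put rough numbers on that (just a sketch assuming the common 180-degree shutter, where the shutter is open for half of each frame's duration; the little script is only illustrative):

```python
# How much of each frame interval gets captured vs. missed, assuming a
# 180-degree shutter (open for exactly half of every frame's duration).

for fps in (24, 48, 60, 120):
    frame_interval_ms = 1000.0 / fps
    exposed_ms = frame_interval_ms / 2.0   # 180-degree shutter
    missed_ms = frame_interval_ms - exposed_ms
    print(f"{fps:>3} fps: {exposed_ms:5.2f} ms captured, {missed_ms:5.2f} ms missed per frame")
```

At 24fps roughly 21ms of every frame interval goes unrecorded, while at 120fps the gap shrinks to about 4ms, which is why motion reads as smoother at the higher rates.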

Douglas Trumbull's Showscan process from the early '80s was one of the first serious experiments with the technique, shooting 70mm film at 60fps. They tested different frame rates and found no noticeable improvement beyond 60fps.

Showscan - Wikipedia


I personally like the traditional look of old-fashioned 24fps film, but I got to see some of the Showscan films and they were spectacular. Motion appeared much more lifelike, and the film grain practically disappeared; it was more like watching a live play than a film.
 
What robn1 said, and also it's basically all about the "motion blur" that gives film the "film look", if you will.
Record something at 240fps, then play it back at the regular 24fps, and it will not look like "film", because of the lack of motion blur.

There are a lot of interesting videos on the subject on YouTube. Some go very in-depth.
 
The main reason I began to think about this is a music video shot at 200+ frames per second and played at normal speed, where the movements looked "slowed down", whereas when I watched movies at 32, 48 or 60 frames, movements were "sped up", but 24 looked "normal". Why this difference in perceived motion speed when played back in real time? It made me wonder if the perceived movement speed changes the higher you go, depending on the frames per second, from "too slow" to "normal" to "too fast".

Why have they settled at 48, why 60, why 120? My question is why not 42? Why not 65? Why not 122? What made them settle on the ones they settled on? 48 because it's double 24? 120 because it's double 60? Why?

Or am I totally off my rocker here, talking about things I clearly don't understand again? :(
 
I haven't seen any of the recent high frame rate films, so I can't comment on how they look. But I know they had sequences shot at a high frame rate and projected at the same high rate; the motion should appear normal speed, just smoother than 24fps.
 
TV in the States is basically 30fps (29.97, strictly speaking), and may be a true 30fps under HD.
TV in Europe was 25fps. No idea if it changed for HD or not.

Film has always been 24fps as far as I know, with the high frame rate releases being 48fps.

Another way to think of the difference is the same way you think of photo or screen resolution. At 1920x1080 you see the whole picture. At 3840x2160 you see the same picture, with the same objects and elements, but with a lot more detail.

In film/video, a 48fps shot has 480 frames played back over 10 seconds. A 24fps shot of the same content only has 240 frames over those 10 seconds. The 48fps clip has 2x the motion detail of the 24fps clip.

The stuff you're thinking about is when they use high speed cameras, like for capturing a bullet fired from a gun. That camera is shooting 300fps. You can't see the bullet in normal time, so that 300fps source is played back at 30fps, which slows the motion down by a factor of 10 so the human eye can see what's going on.
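A quick back-of-the-envelope sketch of both points, using the same example numbers as above (nothing here is specific to any real camera):

```python
# Frame counts for a fixed clip length, and the slow-motion factor when
# high-speed footage is played back at a slower rate.

def frame_count(fps, seconds):
    """Total frames captured for a clip of the given length."""
    return round(fps * seconds)

def slowdown_factor(capture_fps, playback_fps):
    """How many times slower the action appears on playback."""
    return capture_fps / playback_fps

print(frame_count(48, 10), frame_count(24, 10))  # 480 vs 240 frames for 10 seconds
print(slowdown_factor(300, 30))                  # 10.0, i.e. 10x slower
```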
 
I think this debate about frame rates is a bit silly and misdirected ...

When you ride as a passenger in a car, bus or train and look straight out of the window, objects appear blurry, but when you follow them with your eyes, they look sharp. The difference is in how you choose to see the real world through the lenses of your eyes.
Similarly, in a movie scene, when something is supposed to be portrayed very subjectively, you might want the things the character does not focus on to move by in a blur. But when the scene is instead supposed to overwhelm the viewer with a large vista of moving visual detail, you might want every motion to be followable by the viewer and crisp.
The difference here is in how the cinematographer chooses to portray the scene. It would be an artistic choice to have one or the other. But the latter is only possible at a high frame rate.

These days, every major movie spends a whole lot of time in post-production. It is not a big deal to add motion blur to footage shot at a higher frame rate. You can even make it selective to particular objects in the scene, and it will look just as blurry as something shot at a lower frame rate. If you want to replicate a stroboscope effect, that is possible too.
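As a very rough illustration of that idea (a sketch only; real post-production tools use optical-flow-based blur rather than the naive frame averaging below, and the function name is just made up for the example):

```python
import numpy as np

def fake_motion_blur(frames, blend=2):
    """Approximate a longer exposure by averaging each frame with the
    next blend-1 frames. frames has shape (num_frames, height, width, channels)."""
    out = np.empty(frames.shape, dtype=np.float32)
    n = len(frames)
    for i in range(n):
        window = frames[i:min(i + blend, n)].astype(np.float32)
        out[i] = window.mean(axis=0)
    return out.astype(frames.dtype)

# e.g. take 120fps footage, average in groups of 5 and keep every 5th result,
# and you get something close to a 24fps clip with heavy motion blur.
```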

This resistance against high frame rates came about because some director or other was pressured by his studio/producer to shoot at a high frame rate. But I think the issue is not really the frame rate as such, but the director being deprived of his artistic choice of how the scenes should be shot.

Why have they settled at 48, why 60, why 120? My question is why not 42? Why not 65? Why not 122? What made them settle on the ones they settled on? 48 because it's double 24? 120 because it's double 60? Why?
24fps was settled on a long time ago, back in the early days of sound film, as roughly the lowest frame rate they could get away with, so that they could get as much run-time as possible out of a length of celluloid.
Even though film was captured at 24fps, each frame was typically projected with a shutter that flashed it two or three times, making the effective flicker rate 48 or 72Hz.
Even digital projectors use shutters of a sort: cheaper projectors use a spinning colour wheel in front of a single set of DLP mirrors.

60Hz and 50Hz were chosen for television to match the mains frequency in the respective countries, so that there wouldn't be visible interference with lamps.
When office workers here in Europe started using 60Hz computer monitors from America, people started getting headaches because of the flicker created by interference with fluorescent office lights running on 50Hz mains. This is less of a problem nowadays because most modern lamps run at higher frequencies.

Frame rates are doubled or tripled because that makes it easier to convert from one to the other while avoiding stuttering or having to blend images.
There are issues when showing 24fps movies on TV. In Europe, movies were usually shown slightly fast, at 25fps. Conversion to 60Hz is usually done with 3:2 pulldown, which causes some stuttering.
BTW, later iPhone and iPad models use 120Hz displays, double the previous 60Hz, which syncs with media playback and makes it easier to keep animations smooth for both newer and older devices. The high refresh rate is there to avoid stuttering when people drag objects around and follow them with their eyes.
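To make the 3:2 pulldown mentioned above concrete, here is a minimal sketch of just the cadence (real telecine works on interlaced fields and at 29.97Hz, which I'm ignoring here):

```python
# Map 24fps film frames onto ~60Hz video by repeating frames in a 3, 2, 3, 2, ... pattern.

def pulldown_3_2(film_frames):
    """Yield each film frame 3 times, then 2 times, alternating (24fps -> ~60Hz)."""
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        for _ in range(repeats):
            yield frame

print(list(pulldown_3_2(range(4))))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
# 4 film frames become 10 video frames (24 * 10/4 = 60), and that uneven
# repetition is exactly the stutter mentioned above.
```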
 
As someone working in video production, I can say that 24 should pretty much be standard across the board. Most of the "film standards" were developed because they approximate how we see reality. 24 frames per second, at a shutter angle of 180 degrees, is how we see motion. The 16x9 aspect ratio is roughly the view a person focuses on; we have a bit more in the periphery, but 16x9 is the space we concentrate on. Any time anyone has tried to convince me that video should be delivered in anything other than 24, I get pretty heated. Of all the things you have control over in film to add your own unique voice or style, motion should not be one of them. There are situations where you would want more or less motion blur: action scenes generally benefit from a smaller shutter angle, leading to slightly crisper motion (think of some of the fight scenes in Captain America 2), while "shell-shock" type effects benefit from a larger shutter angle, resulting in more motion blur.
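For reference, here's roughly what those shutter-angle choices mean in exposure terms at 24fps (a sketch using the standard exposure = (angle/360)/fps relationship; the specific angles are just example values):

```python
# Exposure time per frame at 24fps for a few shutter angles.
# Smaller angle = shorter exposure = crisper motion; larger angle = more blur.

FPS = 24
for angle in (90, 180, 270, 360):
    exposure_ms = (angle / 360) / FPS * 1000
    print(f"{angle:>3} degrees: {exposure_ms:4.1f} ms exposure per frame")
```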

As for 30fps, I really can’t stand it when people defend it, both as a US broadcast standard and from a “stylistic” choice. The 30fps standard in broadcast comes from the advent of television, where they realized that 24fps just didn’t work with the frequency they broadcast at. Thirty happened to work better. But nowadays, most television isn’t delivered through analog broadcast but rather through satellite, so it’s a defunct standard. If you’re shooting sports, or slo-mo, or doing animation, different frame rates are normal/acceptable—but something like Gemini Man is an abomination, and it boggles my mind that some people think it looks more realistic.
 
In film/video, a 48fps shot has 480 frames played back over 10 seconds. A 24fps shot of the same content only has 240 frames over those 10 seconds. The 48fps clip has 2x the motion detail of the 24fps clip.
Yes, 48fps has double, but visually it may not have the same "speed" to the movements as the 24fps clip. Our eyes perceive things within a frequency range. 24fps looks good, or comparable... but 48fps doesn't, to me. What I'm asking is: if higher frame rates are all equivalent, then why are there differences when they're played back at normal speed, and wouldn't it make sense to look into those differences and find one fps that matches the look of 24fps?

The stuff you're thinking about is when they use high speed cameras, like for capturing a bullet fired from a gun. That camera is shooting 300fps. You can't see the bullet in normal time, so that 300fps source is played back at 30fps, which slows the motion down by a factor of 10 so the human eye can see what's going on.
That's not what I'm talking about. I'm talking about shooting in HFR and playing it back in real time. If you shot something at 300fps and played all 300 images back in one second, that's what I'm talking about. Why does that look different from 48, 60 or 200? Can anyone answer this? Why does it look different, when every higher frame rate played back at normal speed is said to look the same?


Sure, people shooting at higher frame rates usually don't use the same lenses as movies shot at 24fps. Maybe they should!? We've seen examples where that was done and it looked so much better than the crappy lenses they put on the HFR cameras. But that's another issue; I want to focus on the HFR part.
 
I'm not going to say I know the definitive answer, but this goes into great detail on how we see substantially more than 24fps...

 
Yes, 48fps has double, but visually it may not have the same "speed" to the movements as the 24fps clip. Our eyes perceive things within a frequency range. 24fps looks good, or comparable... but 48fps doesn't, to me. What I'm asking is: if higher frame rates are all equivalent, then why are there differences when they're played back at normal speed, and wouldn't it make sense to look into those differences and find one fps that matches the look of 24fps?


That's not what I'm talking about. I'm talking about shooting in HFR and playing it back in real time. If you shot something at 300fps and played all 300 images back in one second, that's what I'm talking about. Why does that look different from 48, 60 or 200? Can anyone answer this? Why does it look different, when every higher frame rate played back at normal speed is said to look the same?


Sure, people shooting at higher frame rates usually don't use the same lenses as movies shot at 24fps. Maybe they should!? We've seen examples where that was done and it looked so much better than the crappy lenses they put on the HFR cameras. But that's another issue; I want to focus on the HFR part.

When I replied I wasn't sure which aspect you were referring to, so I answered with both :)

I think we wouldn't be having this conversation if movies had been shot at 48fps from day 1. We've got 100 years' worth of film at 24fps, so anything else looks odd.

I've never looked at any film and thought that looks exactly like what I see in real life. It simply doesn't. There are aspects of TV that do, but not tons.

It would be nice if there was a video online with two versions of the same clip, one at 24 and the other at 48, so you could see the difference, since I've never had the chance to see a 48fps flick.
 
24 frames per second, at a shutter angle of 180 degrees, is how we see motion.
That's the best compromise between strobing and motion blur, but it's not lifelike. Natural vision doesn't have a frame rate; we see motion as continuous, without interruption. We accept the 24fps standard because we are used to it, and the 30fps TV standard isn't any different visually. The early Showscan films I saw were the closest to "reality" I've ever seen. No strobing or motion blur, no shutter angle compromise, just realistic, believable motion.

I prefer the dramatic effect of standard 24fps, but the higher rate works for special applications. Showscan was adopted for theme park rides because it looks more real than standard film/video, creating a more immersive experience.
 
Like I said, it approximates how we interpret motion. We don’t actually see in frames per second—if we did, we wouldn’t be able to tell the difference with higher frame rates. And yes, with different applications, higher frame rates are appropriate. But I stand by my words when it comes to typical video content—be it web, film, broadcast, etc.

I actually do see a difference with 30fps, though. I think it looks terrible.
 
Thanks for the link. That was really informative.

I was actually just woken up by this debate roaming through my head like a drunk werewolf dressed in drag, telling me I should think of the eye as a camera: what happens when you use a higher frame rate camera to film a lower frame rate projection, so it would either seem out of sync or in sync, and that frame rates were like oscillation curves and we had to find a frame rate higher than 24fps that matches the eye the same way 24 does... but that link kinda makes those thoughts sound utterly ridiculous. Though I'm still interested in why people have different reactions to different higher frame rates. Why are they different? It isn't simply because the movements get smoother and smoother. There's something else going on. Why do some higher frame rates feel "slow", while others feel "fast", when played at normal speed? That's what I don't understand. Because, if that happens, we should also be able to find one that plays just right, one that feels "normal".

I don't know much about frame rates or the eye or just about anything. Maybe this is just one of these instances where I'm just having a brain fart and I just need to open the window instead of keeping on smelling it.
 
As someone working in video production, I can say that 24 should pretty much be standard across the board. Most of the "film standards" were developed because they approximate how we see reality. 24 frames per second, at a shutter angle of 180 degrees, is how we see motion. The 16x9 aspect ratio is roughly the view a person focuses on; we have a bit more in the periphery, but 16x9 is the space we concentrate on. Any time anyone has tried to convince me that video should be delivered in anything other than 24, I get pretty heated. Of all the things you have control over in film to add your own unique voice or style, motion should not be one of them. There are situations where you would want more or less motion blur: action scenes generally benefit from a smaller shutter angle, leading to slightly crisper motion (think of some of the fight scenes in Captain America 2), while "shell-shock" type effects benefit from a larger shutter angle, resulting in more motion blur.

As for 30fps, I really can’t stand it when people defend it, both as a US broadcast standard and from a “stylistic” choice. The 30fps standard in broadcast comes from the advent of television, where they realized that 24fps just didn’t work with the frequency they broadcast at. Thirty happened to work better. But nowadays, most television isn’t delivered through analog broadcast but rather through satellite, so it’s a defunct standard. If you’re shooting sports, or slo-mo, or doing animation, different frame rates are normal/acceptable—but something like Gemini Man is an abomination, and it boggles my mind that some people think it looks more realistic.

Is that why older films that have been obviously reformatted, look bizarre on my HD TV now? Being digitally transferred and presented? I'm really irritated that I can't describe what's off about some movies when I watch them.

They're so incredibly crisp and clear, yet so unnatural looking at the same time. And I don't mean unnatural compared to an original grainy movie theatre projection version. I mean just unnatural looking compared to real life. How it can be so vivid and crisp, and move so oddly is just not what I see in real life. It does bug me.

Then there's the opposite, where some older TV series look like digital blobs (a bit of hyperbole there) with contrasting colours.
 
Is that why older films that have been obviously reformatted, look bizarre on my HD TV now? Being digitally transferred and presented? I'm really irritated that I can't describe what's off about some movies when I watch them.

They're so incredibly crisp and clear, yet so unnatural looking at the same time. And I don't mean unnatural compared to an original grainy movie theatre projection version. I mean just unnatural looking compared to real life. How it can be so vivid and crisp, and move so oddly is just not what I see in real life. It does bug me.

Then there's the opposite, where some older TV series look like digital blobs (a bit of hyperbole there) with contrasting colours.
That could be due to some remastering, but it's possible you have motion smoothing turned on. Most modern TVs have a "motion smoothing" feature buried in their settings, mostly meant for sports and the like. I remember when the last Mission Impossible came out, there was a big campaign around the film to make people aware of that feature and get them to turn it off.
 
Is that why older films that have been obviously reformatted, look bizarre on my HD TV now? Being digitally transferred and presented? I'm really irritated that I can't describe what's off about some movies when I watch them.

They're so incredibly crisp and clear, yet so unnatural looking at the same time. And I don't mean unnatural compared to an original grainy movie theatre projection version. I mean just unnatural looking compared to real life. How it can be so vivid and crisp, and move so oddly is just not what I see in real life. It does bug me.

Then there's the opposite, where some older TV series look like digital blobs (a bit of hyperbole there) with contrasting colours.
The old TV show thing is probably due to SD digital channels that transmit at a very low bit rate. As for the movies, there are any number of things that could be going on; can you name some in particular that look odd?
 
In the days of the high speed special effects developed by Howard & Theodore Lydecker, they used formulas to match model size to frame rate. The old Lost in Space Jupiter 2 was filmed at higher speeds, mostly 96-116fps. The pyro effect shots were done at 180fps. When played back at 24fps the model appeared to be much larger. In the early 1970s Fox produced the Irwin Allen movie The Poseidon Adventure. It had what was, at the time, the highest frame rate special effects shot, as the ship is flipped by the tidal wave: 2200fps!!
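For what it's worth, the rule of thumb usually cited for those miniature formulas is that the camera overspeed factor is the square root of the model's scale factor, which falls out of how falling objects and water scale with size. A hedged sketch (the example scales are illustrative, not a claim about the actual Lydecker or Poseidon setups):

```python
import math

def miniature_frame_rate(scale_factor, base_fps=24.0):
    """Frame rate at which to shoot a 1/scale_factor miniature so that, played
    back at base_fps, it moves like the full-size object (t ~ sqrt(length)
    for motion under gravity)."""
    return base_fps * math.sqrt(scale_factor)

print(miniature_frame_rate(16))  # 96.0 fps for a 1/16 scale model
print(miniature_frame_rate(25))  # 120.0 fps for a 1/25 scale model
```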
 
I can't speak to the issues of higher-than-24fps, pro or con, but as far as why they don't use very high frame rates in general cinematography (live action stuff, I mean), I presume the major obstacle is just the amount of light that would be necessary.

SSB
 
There aren't many playback devices that can play back at higher frame rates, and you would lose the motion blur that looks real to the human eye. Well, there are projectors now, as evidenced by the select venues that can show 48fps, but those aren't the projectors found in every theater either.
 