
I noticed that some versions of the newest episodes of WandaVision are tagged as 60 FPS. I don't recall previously seeing 60 FPS explicitly marked on any other TV shows or streaming content.

Apparently the previous standard was:

24fps – This is the standard for movies and TV shows, and it was determined to be the minimum speed needed to capture video while still maintaining realistic motion. Even if a film is shot at a higher frame rate, it’s often produced and displayed at 24 FPS. Most feature films and TV shows are shot and viewed at 24 FPS.

WandaVision is certainly not a show with a lot of busy action scenes, nor does it seem to use many slow-mo effects - which is usually where 60 FPS shines. In fact, a lot of the "modern day" scenes in WandaVision look quite amateurish, cinematography-wise; in parts it almost looks like a high-school film project. Could it be that the higher FPS makes it look worse?

Is 60 FPS becoming the new standard for TV?

Mou某

1 Answer


I see nothing relating to this on either Wikipedia or IMDB. You'd think if they were trying some new format, they'd want to tell people about it.

BTW, 24 fps is traditional cinema; TV is 25 fps (PAL/SECAM, Europe) or 29.97 fps (NTSC, Americas/Japan).

Artificially interpolating to 60 fps [called Sport mode on some TVs] can make regular cinematic imagery look awful. The eye/brain expects motion blur in cinematic productions, so much so that if it's artificially reduced, the image looks odd. The brain expects it so strongly that motion blur is sometimes actually artificially increased, to make action look 'even more cinematic than cinema'. This was one of the things they had to learn in CGI too, to make things look more realistic.
When The Hobbit was first released at 48 fps, this lack of motion blur was what people were complaining about - whether or not they realised why it bothered them.
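The mechanism behind that kind of interpolation is easy to sketch. Here is a minimal illustration of the crudest form, linear frame blending - real TVs estimate per-block motion vectors instead, but the principle of synthesizing in-between frames is the same (the function name and toy frames below are my own, purely for illustration):

```python
def blend_interpolate(frame_a, frame_b, n_mid):
    """Insert n_mid blended frames between two source frames.

    Frames are 2-D lists of grey levels. This is naive frame blending;
    real motion interpolation estimates motion vectors per block, but
    the idea of synthesizing in-between frames is the same.
    """
    def blend(t):
        return [[(1 - t) * pa + t * pb for pa, pb in zip(ra, rb)]
                for ra, rb in zip(frame_a, frame_b)]
    steps = [i / (n_mid + 1) for i in range(1, n_mid + 1)]
    return [frame_a] + [blend(t) for t in steps] + [frame_b]

# Doubling 24 fps to 48 fps needs one synthesized frame per source pair;
# a pixel jumping 0 -> 100 gets an in-between value of 50.
a, b = [[0.0]], [[100.0]]
frames = blend_interpolate(a, b, 1)
print(len(frames), frames[1][0][0])  # 3 50.0
```

Note what the blend cannot do: it has no idea what the camera's shutter recorded, so the synthesized frames carry none of the motion blur the eye expects - which is exactly why the result reads as 'off'.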

Without knowing the source of these 60 fps versions, all I can do is voice my suspicion that they have been done by the same people who are confused by the teen gaming interweb meme voodoo that games are 'better' at higher fps.

Edit
A quick google shows the only source to be clueless interwebz bozos on YouTube, as surmised.

The official trailers are all at approx. 25 fps (my onscreen fps widget measures the actual rate, which varies slightly, not the declared one).
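For what it's worth, the kind of check that widget does can be approximated from frame presentation timestamps. A rough sketch, assuming you can capture timestamps from a player hook (the timestamp list here is synthetic):

```python
def measured_fps(timestamps):
    """Estimate the actual frame rate from presentation timestamps (seconds).

    A container may declare one rate while the stream delivers another,
    so averaging the inter-frame intervals measures what is really shown.
    """
    if len(timestamps) < 2:
        raise ValueError("need at least two timestamps")
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

# A 25 fps trailer presents a frame every 40 ms.
ts = [i * 0.04 for i in range(50)]
print(round(measured_fps(ts), 2))  # 25.0
```

Averaging over the whole window is what makes the reading "vary slightly": any dropped or repeated frame shifts the estimate a little.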

In short; don't mess with things you don't understand. Allow that the people who made the show and spent millions of dollars doing so probably know what they're doing.

Napoleon Wilson
Tetsujin
  • I'm seeing it in Chinese actually: 60帧版 ("60-frame version"). I'm not sure where they're getting their info - but it's intriguing. – Mou某 Feb 13 '21 at 19:34
  • The artificial 60 fps is called the 'soap opera effect' (https://en.wikipedia.org/wiki/Motion_interpolation#Soap_opera_effect), so considering the setting of the show... – magarnicle Feb 15 '21 at 00:01
  • Ah, yes. It's a term I know - but they wouldn't do that to old 'squarevision' black & white, would they? BTW, the "watching a behind the scenes featurette" aspect is precisely what I've always called LCD television's attempt at rendering movies, or to use my exact definition "the making of…" (I skipped LCD myself, I held onto plasma until OLED came along). – Tetsujin Feb 15 '21 at 07:45
  • Games are generally better at higher FPS. If you have tried playing an action game at 24 FPS, you'll notice that it's hardly playable and can actually be very bothersome to look at. Honestly, the 24 FPS movies bother me as well: there are times when you can really see just how jerky they are. – Obie 2.0 Jan 08 '22 at 07:59
  • @Obie2.0 - if a film looks jerky at 24fps, then it's either a poor conversion, low quality stream, or a bad interpolation down from your screen's natural sync rate. Some TVs will change sync rate to match. My old plasma used to, my new OLED doesn't, but it interpolates pretty well considering. – Tetsujin Jan 08 '22 at 08:08
  • @Tetsujin - No, it's just the nature of the medium. The difference between 24 FPS and 60 FPS is usually extremely obvious. It becomes harder to perceive over 60 FPS, but plenty of studies show differences in perception up to 76 or even 120 FPS, and even up to 1000 FPS in extreme cases (flashes of white on black). One can condition oneself to view low-frame-rate content as natural (which is what happens to a lot of people who are used to seeing videos at 24 FPS), but human visual perception is sensitive enough that it's kind of obvious. – Obie 2.0 Jan 08 '22 at 08:11
  • …or it could be you're using an LCD screen. They're terrible for movies. The only LCDs I've ever seen that can handle movie/TV images properly are the ones we use at work… & they cost £30,000 each! Don't believe the hype about what the human eye can perceive. You can see the flicker of 60Hz on a CRT screen, especially using peripheral vision, which is more sensitive to motion [hunter/prey reaction from when we lived in trees]. Once the 'flicker' is gone, as with LCD/OLED, that perception disappears. You'd be more likely to perceive it in the cinema than on a TV. – Tetsujin Jan 08 '22 at 08:15
  • I don't believe the hype. I believe research. Take a look at the thesis Dynamic Frame Rate: A Study on Viewer’s Perception in Changes in Frame Rates within an Animated Movie Sequence by Kai-Lin Chuang. It is relatively recent and summarizes a lot of the research on perception of frame rates (and the perception of frame rate quality). – Obie 2.0 Jan 08 '22 at 08:17
  • Or take a look at Detecting meaning in RSVP at 13 ms per picture (13 ms/picture is about 77 pictures per second) from McCourt et al. The fact is that visual researchers have recognized for decades that human visual perception can easily detect short-duration changes, but a large part of the movie industry persists in claiming otherwise. – Obie 2.0 Jan 08 '22 at 08:21
  • One thing I do suspect is that younger people, having grown up with screens that are almost universally at least 60 Hz, video games that can actually be played at that rate, and frequently even higher frame rates now, may be less conditioned to view lower frame rates as normal. Who knows, maybe they are even better at perceiving them. That would be an interesting study to conduct. – Obie 2.0 Jan 08 '22 at 08:23
  • Oh, I forgot to mention one of my favorite ones, the aptly named Humans perceive flicker artifacts at 500 Hz, from Lee et al. That is really a comprehensive article: it looks at the influence of color, the spatial form of the flicker, and so forth. Also, it talks about how saccades influence perception of high-FPS flicker (my cousin is a neuroscientist who does research on saccades, so that's cool). – Obie 2.0 Jan 08 '22 at 08:29
  • I skimmed the DFR article. I'm not extracting from it what you seem to. It says some people can tell the difference & some of those like it. They were testing simple animations. I already know how flicker is perceived. Frame rate in a movie does not equate to flicker on a modern screen, or even in modern cinema, where there is no 'dark gap' between frames like there used to be. But, sure, dynamic frame rate can be used for emphasis. It was used to superb effect 20 years ago in Band of Brothers [can't remember now whether it was actually first used in Saving Private Ryan]. – Tetsujin Jan 08 '22 at 08:33
  • @Tetsujin - "It says some people can tell the difference & some of those like it." Well, if you believe that, it sounds like you agree with me, because that is precisely what I am saying. There is just sooooo much research showing how easy it is to distinguish between 24 FPS and higher frame rates, even if you cannot tell yourself. – Obie 2.0 Jan 08 '22 at 08:34
  • Try not to put words in my mouth. When asked to concentrate on one single aspect of an animation, some people could see it & some people liked it. That's all the study shows. – Tetsujin Jan 08 '22 at 08:37
  • @Tetsujin - Aaaaand that is what I am arguing. So what exactly do you disagree with? My use of the word "jerky" to describe that perceptible difference? – Obie 2.0 Jan 08 '22 at 08:38
  • We're now going round in ever-decreasing circles. – Tetsujin Jan 08 '22 at 08:40
  • @Tetsujin - Sorry, I have no idea what you are trying to say at this point. It sounds like you agree that frame rate differences between 24 FPS and higher frame rates are perceptible. I agree; I can see them; I call them jerky because to me the jumps between locations are more perceptible than in real life or video with a high enough frame rate for me not to notice. – Obie 2.0 Jan 08 '22 at 08:43