
I've recently started playing video games again after around 10 years, and I've noticed that a lot of attention is paid to resolution.

Now, putting aside that this could be as much of an obsession as megapixels are for cameras, I wonder whether increasing the resolution is more significant in video games than it is in movies.

After all, movies already "look real" at 576p (not even counting the comparative loss due to anamorphism), so why is there this rush toward extreme resolutions?

Are there technical grounds for attaching such importance to a resolution of 1440p rather than 720p?

Edit: originally, the question was about resolution exclusively; then somebody edited it and added framerate. There are already answers covering the latter, and since it adds some value to the question, I'll leave it in.

Marcus
  • I didn't know that resolution could be higher than framerate. – Q20 Oct 31 '15 at 15:18
  • Resolutions and framerates are completely separate things. While they're both applied to video, they aren't the same thing. – Frank Oct 31 '15 at 15:33
  • Do you mean to ask why are video game frame rates higher than movie and tv frame rates? – Tharius Oct 31 '15 at 17:17
  • This is a great question! Consider rewording the title to something like "why do video game resolutions and framerates need to be so much higher than tv and cinema". – Alexandreana Constantinescu Oct 31 '15 at 17:46
  • None of the body mentions framerate; the question was most likely just about resolution. – Aequitas Oct 31 '15 at 21:41
  • @Marcus, play a game at a low resolution and you can immediately see the difference. – Aequitas Oct 31 '15 at 21:42
  • @Aequitas perhaps incidentally, some games look better at lower resolutions than they do at higher resolutions. Also, some games look much better on a CRT than they do on an LCD, though we wade deeper into subjectivity at this point... – Alexandreana Constantinescu Oct 31 '15 at 23:45
  • @AlexandreanaConstantinescu name one – Aequitas Oct 31 '15 at 23:58
  • The question is exclusively about resolution. The concept I'm questioning is quite subtle: it's obvious that increasing the resolution makes a game look better, but the question is whether there are technical grounds on which an increase in resolution has a more significant effect in video games than in movies. In other words, I wonder whether aliasing differs inherently between games and movies. – Marcus Nov 01 '15 at 11:32
  • @Aequitas I play games at 720p, never more, even if my setup can "afford" it, so I'm aware of it :-). I'd rather have a higher framerate than a higher resolution. – Marcus Nov 01 '15 at 11:33
  • Could someone with higher reputation add more relevant tags? I can't create new ones (e.g. resolution). – Marcus Nov 01 '15 at 11:35
  • @Marcus? Are you referring to my first comment? That was related to the edit of your question to include FPS. Also, if your setup can "afford it", you should be getting the same framerate at the higher resolution. Note that in a lot of games there's an advantage to a higher resolution, since you can see more, like in Terraria. – Aequitas Nov 01 '15 at 11:41

1 Answer


Yes, there are good reasons why games need a higher resolution and framerate than movies.

TL;DR

Basically, it all boils down to the fact that a PC has to compute everything it wants to show, while a movie camera simply records what it sees. A recording captures the real world, natural blur included, so a movie can look convincing with less precision than a rendered scene.


Framerate

When you take pictures with a camera, you sometimes end up with a blurry shot. That happens when the camera or the subject moves during the exposure, or when the scene is so dark that a long exposure is needed. Even though a camera is supposed to capture a single moment, it can't: it collects light over the whole exposure interval, and any motion during that interval gets smeared across the frame. For film shot at 24 FPS, a typical exposure (the classic "180-degree shutter") lasts about 1/48 of a second, which produces a small, natural amount of motion blur.

Movies are made of many such pictures every second. Because every frame carries that small amount of natural motion blur, consecutive frames blend into one another and motion appears seamless. Thanks to that, a movie can still look smooth at framerates as low as 24 FPS.

PC games are quite different in that regard. Each frame is rendered for one exact instant in time, so there is no motion blur at all. Without blur, nothing bridges the gap between one frame and the next, and the discrete steps in motion are much more noticeable to the human eye.
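To make that concrete, here is a minimal sketch (in Python, with invented numbers; the 1-D "sensor", the dot and its speed are illustrative, not taken from any real camera or engine) contrasting the two: the camera averages light over its exposure interval, while the game samples one exact instant.

```python
# Minimal sketch: why a camera exposure blurs motion but a game
# render doesn't. A bright dot moves across a 1-D "sensor"; every
# number here is illustrative.

FPS = 24
SHUTTER = 1 / (2 * FPS)   # "180-degree" shutter: exposure = half the frame time
SENSOR_PIXELS = 16
DOT_SPEED = 200.0         # dot speed in pixels per second (a fast pan)

def exposed_frame(t_start, samples=100):
    """Camera: integrate the dot's light over the whole exposure."""
    frame = [0.0] * SENSOR_PIXELS
    for i in range(samples):
        t = t_start + SHUTTER * i / samples
        frame[int(DOT_SPEED * t) % SENSOR_PIXELS] += 1.0 / samples
    return frame

def instant_frame(t):
    """Game: sample one exact instant -- all the light lands in one pixel."""
    frame = [0.0] * SENSOR_PIXELS
    frame[int(DOT_SPEED * t) % SENSOR_PIXELS] = 1.0
    return frame

print("camera:", [round(v, 2) for v in exposed_frame(0.0)])
print("game:  ", [round(v, 2) for v in instant_frame(0.0)])
```

The camera frame spreads the dot across roughly four adjacent pixels (200 px/s times 1/48 s is about 4 px of smear), while the game frame lights exactly one; played back at 24 FPS, only the former reads as continuous motion.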

But isn't there plenty of blur in modern games? Yes, but it is applied to each individual frame after the fact, which is quite different from the natural blur found in movies. Modern rendering techniques estimate how each object (or pixel) has moved since the previous frame and smear the frame along those motion vectors. This makes motion blur in games look more natural and can make a given framerate feel smoother.
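As a rough illustration of that idea, here is a 1-D sketch; the image, the velocity field and the sample count are all made up for demonstration, and real engines do this per pixel on the GPU:

```python
# Rough sketch of per-frame motion blur in 1-D: each pixel averages a
# few samples taken backwards along its motion vector (how far its
# content moved since the previous frame).

def motion_blur_1d(frame, velocity, samples=4):
    """Smear each pixel along its motion vector."""
    n = len(frame)
    out = []
    for x in range(n):
        acc = 0.0
        for s in range(samples):
            src = x - velocity[x] * s / samples  # step back along the vector
            acc += frame[int(src) % n]
        out.append(acc / samples)
    return out

frame = [0.0] * 12
frame[6] = 1.0            # one bright pixel
velocity = [4] * 12       # everything moved 4 pixels since the last frame
print([round(v, 2) for v in motion_blur_1d(frame, velocity)])
# the bright pixel is smeared across ~4 pixels, mimicking camera blur
```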


Resolution

With resolution, a similar phenomenon is at work, but for a different reason.

As a gamer, you're surely familiar with the term "aliasing". Aliasing does not happen in movies, because each pixel of the camera sensor effectively "blurs" the light it receives: when several light sources fall on a single sensor pixel, their light is mixed together. The seam between one object and the next is softened, so we see no aliasing.

The PC operates differently. It doesn't record a scene, it computes one. To determine the color of a pixel, a basic renderer samples the scene at a single point, typically the pixel's center. For the PC, a pixel has no size; it is merely a point in the scene to be evaluated. For us, a pixel has a very real size, and unless you have a screen with a very high pixel density (commonly measured in DPI or PPI), those pixels are pretty big.
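A small sketch shows why sampling a sizeless point per pixel goes wrong. The "scene" below is a stripe pattern deliberately chosen to be finer than the pixel grid (all numbers are invented to provoke the effect): one point sample per pixel picks up a bogus coarse pattern, while averaging over each pixel's footprint, roughly what a camera sensor does, gives the uniform gray you'd expect.

```python
import math

# Sketch of why point sampling aliases: 4.5 stripes per pixel is
# detail finer than the pixel grid can represent.

STRIPES_PER_PIXEL = 4.5

def scene(x):
    """Black/white stripes over a continuous coordinate."""
    return 1.0 if math.floor(x * STRIPES_PER_PIXEL) % 2 == 0 else 0.0

def point_sampled(width):
    # one sample at each pixel center -- how a basic rasterizer samples
    return [scene(px + 0.5) for px in range(width)]

def area_averaged(width, samples=64):
    # integrate the scene across each pixel's footprint, camera-style
    return [sum(scene(px + (s + 0.5) / samples) for s in range(samples)) / samples
            for px in range(width)]

print("point:", point_sampled(8))   # 1, 1, 0, 0, ... -- a slow pattern
                                    # that doesn't exist in the scene
print("area: ", [round(v, 2) for v in area_averaged(8)])  # ~0.5 everywhere
```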

In order to avoid aliasing, the PC therefore needs to treat a single pixel as if it were made of several smaller ones. This is supersampling (SSAA): rendering the frame at a higher resolution than you display it, then averaging back down. Obviously, the smaller those sub-pixels, the more accurate the anti-aliasing, and the more resource-intensive it becomes. There are of course other, less resource-intensive (and less accurate) anti-aliasing methods, like FXAA, a post-process that detects high-contrast edges in the finished frame and selectively blurs them.
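Here is a minimal supersampling sketch along the same lines (the scene, a hard diagonal edge, and the resolution and sample counts are invented for illustration): each pixel is evaluated as an n × n grid of subsamples and averaged, so edge pixels come out as intermediate grays instead of hard stair-steps.

```python
# Minimal supersampling (SSAA) sketch: evaluate each pixel as an n x n
# grid of subsamples and average them down to one color.

def scene(x, y):
    """White above the line y = x, black below."""
    return 1.0 if y > x else 0.0

def render(width, height, n=1):
    """n=1 is plain point sampling; n>1 supersamples each pixel."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            acc = 0.0
            for sy in range(n):
                for sx in range(n):
                    # subsamples spread evenly inside the pixel's footprint
                    acc += scene(px + (sx + 0.5) / n, py + (sy + 0.5) / n)
            row.append(acc / (n * n))
        image.append(row)
    return image

for row in render(6, 6, n=1):
    print(row)                         # hard, stair-stepped edge
print()
for row in render(6, 6, n=4):
    print([round(v, 2) for v in row])  # edge pixels become intermediate grays
```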

Increasing the resolution doesn't exactly get rid of aliasing, but with small enough pixels, it becomes more difficult to spot.

Nolonar