Is the Next Generation Limited to 30 FPS?


Luckily the year 2020 has arrived – a year of next-generation consoles in which we are expecting 60 fps as the norm for console gaming. Get excited, gamers, good times are coming for console fans!

That sentence concluded my optimistic article half a year ago. We are halfway through 2020 now, and we have had a glimpse of some new games along with new hardware in the PlayStation 5 and Xbox Series X. Let’s examine where console gaming stands in terms of expectations and performance, and see how that optimism is holding up. Let’s dive in together.

Allow me to briefly analyze “the problem”. Seven years ago, the Xbox One and PS4 arrived, both built around an 8-core AMD Jaguar CPU clocked close to 2 GHz. Why is that relevant? Different video games have different hardware requirements. If you look at the following two images, you will undoubtedly understand that these two games cannot possibly require the same amount of processing power. That’s why Celeste can run on the Atari VCS while Gears 5 requires at least an Xbox One. Meanwhile, Celeste will run at 60 fps on the Atari (which is going to be less powerful than the Xbox One) while Gears 5 runs at 30 fps on the Xbox One.

Why is “framerate” important?

If you have checked the animation above, you will notice some differences. The 60 fps square undoubtedly moves the most smoothly, while the 15 fps square doesn’t look very pleasing to the eye. So what about the middle one?

When you are playing a video game, the image on your screen usually changes 30 or 60 times every second. Some PC gamers run their games at 144 or even 240 fps, while console games usually fall within the 30 to 60 fps range. This number matters because of how your brain perceives motion. The human brain is highly adaptive and can fill in the gaps: when you watch 60 fps motion, it fills in the blanks between the frames and understands them as continuous movement. The same applies to 30 fps and lower. It has even been shown that humans can benefit from framerates as high as 240 fps, although that benefit mostly shows up in competitive games like CS:GO. The problem with 15 fps, however, is that the screen content changes so rarely that playing competitive multiplayer games like CS:GO becomes almost impossible.
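To put some numbers on this: each frame stays on screen for 1000 ms divided by the framerate. Here is a quick back-of-the-envelope calculation in Python (a minimal sketch, nothing console-specific):

```python
# How long does each frame stay on screen?
# frame time (ms) = 1000 / frames per second
for fps in (15, 30, 60, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")

# Output:
#  15 fps ->  66.67 ms per frame
#  30 fps ->  33.33 ms per frame
#  60 fps ->  16.67 ms per frame
# 144 fps ->   6.94 ms per frame
# 240 fps ->   4.17 ms per frame
```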

Comparing Chess and Celeste

Imagine playing chess at 5 fps. You move the cursor with your controller’s D-pad, select your queen, and move her to another position. At 5 fps, the animation will judder, but it WILL BE an animation, and your brain will perceive it as motion. In this case, the motion is happening on its own and your input is no longer required. Sort of like watching a movie.

Playing Celeste at 5 fps is another story entirely. You’ll still be able to see Madeline move, but the motion will judder, and when it comes time to jump, you’ll likely end up falling, because by the time you press the jump button it’s already too late. How do I know that? The game knows where Madeline is at any given moment, whereas your display only refreshes 5 times per second, meaning that Madeline will, in most cases, be much closer to the hole than what you see on your screen. In other words, your screen isn’t updating Madeline’s position fast enough. Your brain will eventually learn to compensate, yet it will always be much easier for it to make decisions based on up-to-date information.
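To make that concrete, here is a rough model of the “stale picture” problem (simplified on purpose: it ignores the engine’s own processing and display latency, and only counts whole frames):

```python
# At n fps, what you see can be up to one frame old, and your button
# press can take up to another frame to show up on screen.
for fps in (5, 30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: picture up to {frame_ms:.0f} ms stale, "
          f"press-to-screen up to ~{2 * frame_ms:.0f} ms")

# 5 fps: picture up to 200 ms stale, press-to-screen up to ~400 ms
# 30 fps: picture up to 33 ms stale, press-to-screen up to ~67 ms
# 60 fps: picture up to 17 ms stale, press-to-screen up to ~33 ms
```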

Many experts consider 60 fps the ideal compromise between performance and gameplay experience, which is why you will see so many PC players playing at 60 fps. A computer’s resources are limited by its hardware, so they always have to be distributed somehow. This explains the dichotomy between fluidity of motion (framerate) and visual fidelity (screen resolution, foliage, texture quality, number of on-screen NPCs, and so on).

Why are we even talking about this, then? Why do most console games run at 30 fps, and why can’t we change that the way we can on a PC?

These are very good questions! Graphics are the first thing gamers notice when they play a game, so given the hardware limitations, developers strive to make console games visually appealing first. Games like Gears 5 are usually designed around 30 fps in order to maximize visual fidelity. In turn, this design decision compromises the “feel” of the game: reaction times, input lag, and so on. But is it all doom and gloom?

Arguably, no.

Once the brain adjusts to 30 fps gaming, it remains perfectly capable of filling in the missing gaps, and from then on players have no problem playing other games at 30 fps. Admittedly, input latency (the time that passes between a button press and seeing the action happen on-screen) is amplified at lower framerates, but the brain compensates for that too. It’s best for a player not to switch framerates often, though, because the brain needs to re-adjust every time. Ideally, all games would run at the same framerate, but for the most visually impressive games on consoles, that’s just not possible.

The gaming market generally consists of two major consoles and countless PC configurations. Because console hardware is fixed, it usually sets the baseline for the power required to run the games we call “current-gen”. Developers then typically pour all of the available console resources into a game’s visual presentation rather than its performance.

Why can’t we have visually impressive games that run at 60 fps?

We can! But it’s not that simple. Because most games are designed around the power of the consoles (the lowest common denominator), you can run them on an adequately powerful PC and get both visual fidelity and raw performance. Not only that, you can have even higher visual fidelity and even higher performance. However, it all depends on how much money you’re willing to spend on that PC: while console prices usually sit close to $500, gaming PC prices can range between $500 and $3,000.

But there’s another catch! Framerate is heavily dependent on the CPU’s power. Remember: gaming devices have a CPU, GPU, RAM, storage, and other parts. So let me quickly recap each component’s role (a simplified sketch of how they cooperate follows the list):

  • The GPU renders the image on your screen
  • The CPU calculates the changes that happen each frame (game logic, physics, AI)
  • Storage holds the data that’s going to be drawn on your screen
  • RAM loads the data from storage so the CPU and GPU can work on it together
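Here is that division of labor as a deliberately simplified game-loop sketch in Python. The names (`read_input`, `update`, `render`) are hypothetical, not any real engine’s API; the point is the per-frame rhythm of CPU work, GPU work, and a fixed time budget:

```python
import time

FRAME_BUDGET = 1 / 60  # seconds per frame at 60 fps (~16.67 ms)

def run_game_loop(game, frames=3):
    """Simplified fixed-budget loop: the CPU updates the world, the GPU draws it."""
    for _ in range(frames):
        start = time.perf_counter()
        game.read_input()   # poll the controller
        game.update()       # CPU: game logic, physics, AI
        game.render()       # GPU: draw the new frame
        # Sleep off whatever is left of this frame's budget (if we
        # overran it instead, the framerate drops).
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

class DemoGame:  # stand-in so the sketch runs; a real game does far more
    def read_input(self): pass
    def update(self): pass
    def render(self): print("frame drawn")

run_game_loop(DemoGame())
```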

All of these components should be as fast as possible. Yet in the PS4 and Xbox One generation, one of them was severely lacking in power, creating a system-wide bottleneck: the AMD Jaguar CPU. It’s common knowledge by now that many developers blamed the Jaguar for unstable framerates, and in most cases they opted to halve the framerate, resulting in many 30 fps titles. Some incredibly talented developers managed to work around that limitation, but it was certainly not a standard feat this generation. To name a few: Fortnite, Call of Duty, Gears, Halo, Forza Motorsport, Gran Turismo Sport, Warframe, and Apex Legends.
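Why halve the framerate instead of just letting it float? Because console games typically render with vsync: a frame that misses the 16.7 ms refresh window on a 60 Hz display must wait for the next refresh. A toy model (assuming simple double-buffered vsync at 60 Hz) shows how an “almost 60” game becomes a 30 fps game:

```python
import math

REFRESH_MS = 1000 / 60  # a 60 Hz display refreshes every ~16.67 ms

def displayed_fps(render_ms):
    """With vsync, a finished frame waits for the next refresh boundary."""
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)
    return 60 / refreshes_waited

for render_ms in (15, 18, 25, 33):
    print(f"{render_ms} ms per frame -> shown at {displayed_fps(render_ms):.0f} fps")

# 15 ms per frame -> shown at 60 fps
# 18 ms per frame -> shown at 30 fps   (just missed the window!)
# 25 ms per frame -> shown at 30 fps
# 33 ms per frame -> shown at 30 fps
```

That “just missed the window” case is exactly why developers lock at 30: a stable 30 feels better than a framerate bouncing between 30 and 60.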

The good news is that this generation is coming to an end and console manufacturers have pledged not to create any more bottlenecks!

We are finally getting 60 fps as a standard on consoles!

Not so fast.

AMD’s Ryzen CPUs will power the next-generation consoles, and they are roughly four times faster than the Jaguar CPU we have today. So will next-gen games completely ditch 30 fps? Well, the Senua’s Saga: Hellblade II trailer we saw during the Xbox Series X reveal was running at 24 fps. What does that mean? Hopefully, nothing. A recent tweet from Microsoft’s Aaron Greenberg tells us to expect 60 fps as a standard on the next-generation consoles.

What exactly that means remains to be seen and is open to debate, but personally, I would expect all next-gen games to run at 60 fps minimum. Yes, but… there is always a catch! After the May Inside Xbox presentation, Ubisoft stated that “Assassin’s Creed Valhalla will run at a minimum of 30 fps”. What does that mean? An educated guess tells me the game will target 4K resolution at a stable 30 fps by default, with an option to switch to 60 fps at the cost of some visual fidelity.
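One way to sanity-check that guess is raw pixel throughput, a crude proxy for GPU load (it ignores CPU cost and per-pixel work, so treat it strictly as a back-of-the-envelope figure): 4K at 30 fps pushes roughly as many pixels per second as 1440p at 60 fps.

```python
# Pixels per second: a crude proxy for rendering load.
modes = [
    ("Quality:     4K    @ 30 fps", 3840, 2160, 30),
    ("Performance: 1440p @ 60 fps", 2560, 1440, 60),
    ("Performance: 1080p @ 60 fps", 1920, 1080, 60),
]
for name, w, h, fps in modes:
    print(f"{name} -> {w * h * fps / 1e6:.0f} million pixels/s")

# Quality:     4K    @ 30 fps -> 249 million pixels/s
# Performance: 1440p @ 60 fps -> 221 million pixels/s
# Performance: 1080p @ 60 fps -> 124 million pixels/s
```

So a 60 fps mode at a reduced resolution is plausible without any extra GPU power, which is presumably the kind of option such a toggle would offer.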

Are you telling me that next-gen consoles still cannot do 4K60?

Essentially, yes. But no.

Consoles have never been about raw power; PCs do the “raw power” thing while also carrying eye-watering prices. Consoles have always been about optimization and finding the best balance. During the current mid-gen console refresh, developers learned a lot of neat tricks to achieve the highest possible visual fidelity: Sony’s first-party studios pioneered checkerboard rendering, while Microsoft’s first-party games leaned on dynamic resolution. These and many other workarounds lower the screen resolution when there is a lot of action on screen, freeing up console resources to keep the framerate stable. When the action is heavy, the human brain is busy processing it and cannot pay attention to small details, so it won’t notice a temporary drop in resolution. It will, however, notice a framerate drop, because a stable framerate is important for gameplay. That is also why 60 fps is the riskier target: it is much harder to hold 60 consistently than to lock the game at 30 and keep it there.
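In essence, dynamic resolution is a feedback loop: if the last frame took longer than the budget, render the next one a bit smaller; if there is headroom, scale back up. A minimal sketch of the idea (the step size and thresholds here are invented for illustration):

```python
FRAME_BUDGET_MS = 16.7          # targeting 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def adjust_scale(scale, last_frame_ms):
    """Trade resolution for a stable framerate, one small step at a time."""
    if last_frame_ms > FRAME_BUDGET_MS:        # too slow: shrink
        return max(MIN_SCALE, scale - 0.05)
    if last_frame_ms < FRAME_BUDGET_MS * 0.9:  # headroom: grow back
        return min(MAX_SCALE, scale + 0.05)
    return scale

# A busy 20 ms frame pulls native 4K down a notch; a calm 12 ms
# frame would let it climb back toward full resolution.
scale = adjust_scale(1.0, 20.0)
print(f"next frame renders at {round(3840 * scale)}x{round(2160 * scale)}")
# next frame renders at 3648x2052
```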

Next-gen consoles have even more tricks up their sleeve, like variable-rate shading (VRS). This technique lowers the rendering detail only on the parts of the screen that don’t matter: for example, shading mud puddles on the ground, where nobody will notice, at a reduced rate while keeping character models at full detail. It frees up system resources much like the old dynamic resolution trick, while producing a visually much better picture overall.
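Conceptually, VRS splits the screen into small regions and shades the unimportant ones more coarsely, e.g. one shading computation per 2x2 or 4x4 block of pixels instead of one per pixel. A toy illustration of the idea (the importance scores and regions are made up for the example; real GPUs decide this per hardware tile):

```python
# Toy variable-rate shading: pick a shading rate per screen region
# based on how much the player's eye will care about it.
def shading_rate(importance):
    if importance > 0.8:
        return "1x1"  # full rate: faces, character models
    if importance > 0.4:
        return "2x2"  # one shade per 2x2 pixels: walls, terrain
    return "4x4"      # coarse: mud puddles, blurry backgrounds

regions = {"hero's face": 0.95, "stone wall": 0.60, "mud puddle": 0.10}
for name, importance in regions.items():
    print(f"{name}: shaded at {shading_rate(importance)}")

# hero's face: shaded at 1x1
# stone wall: shaded at 2x2
# mud puddle: shaded at 4x4
```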

So we’re good? New CPUs + new tricks = 60 fps as a standard!

Sony unveiled many of its next-gen games a few days ago. The stream was uploaded at 1080p and 30 fps, and Digital Foundry has made a few educated guesses about which games were designed around 30 fps. Both Horizon Forbidden West and Ratchet & Clank: Rift Apart seem to be targeting 30 fps, although Sony has confirmed nothing at this point.

Why? Why would games still aim for 30 fps when next-gen consoles come with powerful hardware like those blazing-fast Ryzen CPUs, and why should you even care? All of that is up for debate once again. Unfortunately, it’s going to be another seven years before there’s a definitive answer.
