Graphics — specifically as they pertain to resolution — have been a hot topic since the Xbox One and PS4 launched, and will continue to be a hot topic when the PS5 and Xbox Series X launch, much to my chagrin. To put it bluntly, I couldn't care less which is capable of higher resolution graphics. Why? Because no matter what, games on both consoles will look absolutely outstanding. Don't kid yourself otherwise.
I should clarify that I'm not referring to graphics in a broad sense when it comes to everything it takes to create a game's visuals. The term "graphics" has unfortunately almost become synonymous with resolution. Anti-aliasing is important. Ray tracing is important. Shadow mapping is important. That's where you'll see real differences. Resolution, though, not so much. And when it comes to the former elements, I have every faith the PS5 and Xbox Series X will be able to deliver if the developers optimize correctly.
As long as I'm not sacrificing major components of a game and developers aren't feeling hindered by one console over another, I'm cool. I'm not talking about "this console can hit 60FPS and this one can't" or "my game runs at checkerboarded 4K on this console but native on this one." I'm talking about drastic aspects of a game that completely change how it functions. Notably, when Middle-earth: Shadow of Mordor came out across both Xbox 360/PS3 and Xbox One/PS4, the Nemesis system had to be stripped down to its bare bones on the older generation hardware.
And before any of you start saying, "the damage control has begun" based on rumors regarding either console, here's my history of owning video game consoles between Xbox and PlayStation.
I received an original Xbox in 2002 for Christmas; got an Xbox 360 for Christmas in 2006; upgraded to an Xbox 360 S shortly after it came out; bought an Xbox One with Kinect in 2014; and upgraded to an Xbox One X in 2018. I didn't get my first PlayStation until I bought a PlayStation 4 Slim in 2018. I had used PlayStation consoles prior, all the way back to the PS2, at friends' houses, but Xbox has been my main platform nearly my entire life. I grew up on it.
This isn't me "being a Pony and shilling for Sony" because the PS5 is rumored to be less powerful.
I'm privileged enough now to own both — and cover both for work — and after spending so much time with them, I can honestly say that one isn't inherently better than the other just because of its power. The type of games that they offer, the services they offer, the ecosystems — those are what matter.
I'd also like to remind you that we don't know the final specs we are dealing with. I don't even consider teraflops as the be-all and end-all of a console. Regardless of what the specs end up as, both will be immensely powerful and showcase phenomenal looking games. There comes a point when you get diminishing returns on resolution, anyway.
For starters, let's take a look at Digital Foundry's comparison of Rise of the Tomb Raider in 2017.
The Xbox One X screenshots look marginally better, but the difference isn't drastic. It definitely doesn't affect my enjoyment of playing the game.
Now yes, I'm cherry-picking an example. There are thousands of games you could compare on Xbox One X and PS4 Pro, and some may look better on one than the other. Part of the job is also on the developer to optimize it. My main point is this: those both look pretty damn good, and even when taking into account YouTube video compression, there really isn't a huge difference between the two. Not one that most people would notice while playing, anyway. It's easy to nitpick when you have a still screenshot to stare at and analyze. And those minor differences in detail will only get smaller and smaller as technology progresses next-gen, whether one console ends up at 12 teraflops or 9.
Good luck telling the difference between a game running at 8K resolution or 4K resolution, because you probably can't. Does 8K technically have four times the number of pixels as 4K? Yes. Does it translate to visual details that will be immediately apparent and perceptible? Probably not.
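For what it's worth, the pixel math checks out. Here's a quick back-of-the-envelope sketch (the resolution figures are the standard UHD dimensions, not anything specific to either console):

```python
# Pixel counts for common display resolutions (width x height).
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

# Total pixels per frame for each resolution.
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"])                 # 8294400
print(pixels["8K"])                 # 33177600
print(pixels["8K"] / pixels["4K"])  # 4.0 -- 8K really is four times 4K
```

Four times the pixels, but because both dimensions only double, the perceived jump in sharpness at normal viewing distances is far smaller than the raw number suggests.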
Let's not forget Spider-Man's "puddlegate" controversy that Insomniac took a lot of flak for. Were the visuals downgraded? No. But a puddle was taken out and lighting was changed, making the scene look "worse" to a lot of people.
Insomniac even made light of the situation by adding puddle stickers to Spider-Man's in-game photo mode. Because when people are ridiculous, you need to call them out. It's a reference I hope the studio somehow carries over into Spider-Man's inevitable sequel.
When I start playing a game, I honestly can't even tell you if I'm playing in 1080p or 4K unless I know what the game runs at off the top of my head. I'd wager most of the general player base for either console are in the same boat. As long as something isn't incredibly blurry, I'll be having fun. 900p, 1080p, 4K, whatever.
To those who can tell the difference, are you really going to argue that playing a game in 1080p instead of 4K completely ruins the experience? Or are you just trying to get into fights over your favorite pieces of plastic?
On the subject of diminishing returns, the picture below has popped up time and time again over the past several years when discussing polygon counts.
After one Reddit user went to "debunk" the image six years ago, they were met with this response from another user who claims to have been an artist at Guerrilla Games at the time:
My biggest takeaway from playing on a PS4 Slim and an original Xbox One before moving to each console's respective premium offering is that resolution wasn't a big factor when it came to my enjoyment. What mattered, and what I immediately noticed, were performance differences. How consistent the frame rate was. What loading times were like.
I only started to notice graphical differences between newer and older tech when I started playing around in Assassin's Creed Odyssey's photo mode. Because I'm a photo mode snob, I needed the perfect shot, which meant I'd always over-analyze any screenshots I'd take.
It's not like we'll suddenly lose anti-aliasing, ambient occlusion, anisotropic filtering and all the other good stuff that begins with the letter A. Shadow mapping, draw distance, and volumetric lighting will only get better and better on next-gen systems. However the PS5 or Xbox Series X implement ray tracing, I guarantee it will look stunning on both. Let's not act like either will deliver potato quality visuals.
And while internet mobs squabble over 8K vs 4K and which console has the most teraflops, the Nintendo Switch will be off in the corner swimming in piles of money, despite having by far the worst specs on paper. Graphics certainly play a part, but they aren't everything, folks.
You know what actually interests me? How the CPUs and SSDs can be utilized to craft bigger and better worlds that load seamlessly between areas. How they can run more complex systems within a game. How game worlds can react to your actions. Just imagine the enemy AI you could be able to encounter.
That's what makes me excited about the PS5 and Xbox Series X. Not resolution. Regardless of the nuance there is to the argument and everything that comprises video game graphics and graphical fidelity, this is a hill I'm willing to die on.
Also, we should all continue playing more indie games next generation. We don't always need hyper-realistic 4K graphics where I can see every hair follicle on someone's face. Stylized graphics are important, too.
It doesn't matter which one has better graphics because I'm just going to play the Switch anyway.
Enjoy that 2009 experience.
I'm sure they are.
But keep acting smug; it's an adorable internet attribute.
And this is exactly what the author was trying to say. You reply with a smug retort to the OP enjoying Switch because they feel they get a better gameplay experience out of it regardless of graphics. And that's what the author was stating as well and you seemed to gloss over that or didn't even read the entire article. Graphics do NOT make a game. It's the experience with the gameplay itself. You could have the greatest looking graphics in a game, but if the gameplay is substandard (crappy AI, crappy story-line, etc), then the graphics are nothing. I've been going back and playing my older PC games and I'm enjoying them. This includes the original Witcher, Witcher 2 and even then, older games I have from GOG (Diablo, the first Gabriel Knight) because the immersive experience with them is what draws me in, not the graphics.
The whole point of the article clearly went right over your head.
I remember when these silly graphics debates started in the days of the SNES vs Sega Genesis. In the end, the graphics really didn't matter much. The most important thing was the games, and how they played. You could craft the most beautiful, true-to-life looking game ever made, but no one will care if the gameplay is complete garbage. On the flipside, a game may have average at best graphics, but people will play it all day long if the gameplay is a complete blast. Case in point, I play my Switch more often than my PS4 or Xbone because the games are a lot more fun, even if they're not quite on the same level graphically.
To our knowledge, Sony and Microsoft have not mandated that games run at 4K or 60 FPS. It will be left to the developer how to leverage the power of those systems. They may very well choose to have the game run at a lower internal resolution while implementing other features that make the game more real. Lighting, physics, AI, etc. Personally I hope they choose to do so, and am concerned next gen will see only slight visual improvements because the industry is pushing for native 4K.
I disagree. I played the same game on 4K max settings and 1080p max settings. There is an extreme difference... I can tell the difference without even knowing if it's 1080p or 4K. Even with all anti-aliasing turned up and all shaders you can still tell the diff... I tested this on my rig as I got an RTX 2080 Ti SLI setup. I can never play 1080p anymore haha
You really wasted $2500 on your graphics cards with SLI alone just to play at 4K??? LoL!!!!!
Ecosystem is huge here too, and was touched on in the article. Until Sony has an online experience that can compete with Xbox Live, and until they release a pro quality controller, I'll be sticking with MS.
I understand YOUR opinion and you have the right to yours, but telling others what we should care about is very arrogant. Exclusive games are dying and therefore the graphics are very important, and the main reason they are important is because PlayStation and Microsoft both brag about them and place price tags on their systems to justify them to the players. So if these same companies brag about graphics and claim they have the best, then you better well deliver it. If you charge me 70k for a fast car that is ugly and another person charges me the same price for a fast car that is also good looking, don't tell me that I should be fine with being charged 70k for an ugly car simply because it's fast and I shouldn't care about looks. If you want that for yourself, feel free and pay for ugly looks, it's your money, but don't tell others we shouldn't care about looks when the makers are telling us that this is why we are being charged the fees.
Actually, resolution DOES matter. And the reason that it matters is something most people generally don't think about: it's our TVs. Because of the way modern televisions operate, different resolutions can provide DRASTICALLY different visual experiences. The problem is that the bulk of us are now using 4K LCD or OLED displays. These are fixed pixel displays. They HAVE to display at their native resolution. Old CRT displays do NOT have to display at their native resolutions. They can sharply display a multitude of different resolutions. On a modern flat panel, if the TV receives a lower than native res signal, it has special processing that can upscale that and still make it look acceptable on the display. The problem, however, is that the process is SLOW, and causes severe input lag. The TV manufacturers thought of that, and implemented game modes which eliminate all that extra processing, but then you lose the clarity benefits from the TV's upscaler. The result of all of this is that if you want to play a game on a modern TV and have it not feel sluggish, the game needs to be rendered at as high a resolution as possible to avoid looking fuzzy. Case in point: the Shadow of the Colossus remaster on PS4. The game on PS4 Pro allows you to switch between a 1440p 30fps mode and a 1080p 60fps mode. The 1080p mode, on a 4K set with game mode enabled, looks NOTICEABLY less detailed than the 1440p option. Flowers, foliage, textures, the whole screen basically looks like it has a filter applied. A microscopic sheet of gauze attached to the screen. Switch to 1440p and everything comes into much clearer focus. So while YOU may not care about, or notice, the difference between graphics settings, be it resolution or anything else, to say that NOBODY should care just isn't within your right. And honestly I feel it's a flawed opinion to perpetuate.
I think the main reason consumers and fans care about "graphics" is because of future proofing. If you're throwing down 400 bucks on a console you want it to be the best of the best and last forever. Period. So it's fine that you don't care about graphics, but that's like telling a muscle car fan not to care about horsepower or 0-60 times because it doesn't translate to real world performance when driving. No. We care because it's our money and we want to make the best decision. So it's fine that you don't care, but just because other people do care doesn't make them wrong, and it really shouldn't bother you enough to write this long article, though I understand your job's purpose is to attract attention with these pieces. Which it did. So good job
I’d tend to agree, graphics on console don’t matter. If you need the best graphics, resolution, and frame rates, play on pc.
Graphics and system power only matter to Sony fans when the PlayStation is weaker. The PlayStation was weaker than the Saturn; it won because it had more games, and Sega made the Saturn very hard to program. A great example is X-Men vs. Street Fighter. You couldn't tag on the PlayStation, and it loaded so slow, who wanted to play it. It sold more, because there were more PSX systems. When the Dreamcast crushed it, folks argued that the PSX had more games, so why buy a Dreamcast. The PS2 came out, and the argument instantly turned into how the PS2 was more powerful. The 360 beat the PS2 and then the PS3. It was more powerful, had better games and was cheaper. The PS4 launched, after the disastrous Xbox One TV TV TV!!!!! reveal, and it was stronger. Sony fans, once again, beat the drum of power. Then the X1X dropped, the PS4 got left in the dust, the PS4 Pro was also a distant second place, and once again, the conversation shifted away from power, to exclusive games. The early news is the Xbox is already better than the PS5. The only hope was the load times, as it was thought the M.2 drive in the Xbox was PCIe 3.0. Turns out, it's PCIe 4.0. That was the last bastion of hope. The PS5 should be a ton weaker. What do we have here, a Sony shill with hack article after hack article, praising Sony, and telling us to ignore graphics. I am 46. I started with Pong. I have owned a ColecoVision, Intellivision, Vectrex, Atari 2600, 5200, NES, SMS, SNES, Genesis, every Game Boy, CD-i, Neo Geo, TG16, Turbo Duo, Saturn, PSX, Dreamcast, Xbox, 360, X1, X1X, plus a few more. DON'T PREACH ABOUT GRAPHICS. Graphics matter. Visuals matter. Gameplay is the most important, but visuals are second. Why should we settle so you Sony shills can help Sony control the game narrative like they did in the early 2000's? It almost cost us the industry. Sony's 3D-games-only rules killed Mega Man and lost us 2D Street Fighter games for 3 years. No. I will not settle for second rate visuals.
I don't have a dog in this fight, but if I was to buy a console I'd buy PS because the exclusives show that Sony believes its customers are emotional creatures with curiosity in their hearts, whereas Microsoft thinks my name is Chad.