Debunking the rumor that most console players only care about graphics.
This means game devs will start focusing more on better performance and optimization, right? …RIGHT?
Why spend time optimizing when you can just slap some dogshit upscaling technology in and call it a day? Can't blame a lot of developers though, with the shitty time constraints imposed by management.
It’s really a shame, because the upscaling tech is nice but it still has a lot of visual glitches and issues that keep me from using it much. It might look nice in still images, but once things start moving there’s a lot of blur and ghosting.
Same goes for raytracing: it can look good, but lights and reflections will often still bug out, which takes me straight out of the immersion.
Maybe because there’s usually not even a perceptible difference in visual quality between the two modes, but a very noticeable difference in performance.
It’s hard to tell if this would have been the case in past generations. I think this only became true once we crested the graphical plateau where all games look “good enough” in HD.
Not like console gamers were ever given a choice, but PC gamers kept wanting PC ports for more frames over the 30 fps standard. Graphics were already good during the PS4 era, and Sony still leaned heavily on PS4 games during the PS5 Pro showcase. Now that console users finally have the option, over a decade later, they’re choosing the same thing, which I think shows they aren’t too different from PC gamers in loving frames.
That sends me back to when people in online discussions regularly claimed anything above 60 fps is pointless because the human eye can't see more than that anyway.
That claim is such a pet peeve of mine. That’s not even how our eyes work, and it’s demonstrably untrue.
It can even be disproven by moving the mouse cursor rapidly across the screen: instead of smooth motion blur, you see a trail of distinct cursor positions, and higher refresh rates visibly close those gaps.
I think console players are catching on to the massive difference between 30 FPS and 60+ FPS in first-person games where the camera can move quickly. As TVs have improved along with the consoles, and some titles can run at 60+ FPS, people are noticing the difference compared to newer titles that target 30 FPS as a trade-off for detailed graphics and motion blur.
Plus performance mode reduces stuttering and those short stretches where the frame rate drops massively below its normal rate.
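A bit of frame-time arithmetic makes the stutter point concrete; this is a generic illustrative sketch, not tied to any particular console or engine:

```python
# Frame-time arithmetic: the perceptual gap between 30 and 60 FPS is
# easier to reason about in milliseconds per frame.

def frame_time_ms(fps: float) -> float:
    """Time budget per frame in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# → 33.3 ms, 16.7 ms, 8.3 ms

# Dropping a single frame means the previous image lingers for two frame
# periods. At 30 FPS that's a ~66.7 ms hitch; at 60 FPS only ~33.3 ms,
# which is why the same hiccup feels much worse in a quality mode.
print(f"one dropped frame at 30 FPS: {2 * frame_time_ms(30):.1f} ms hitch")
print(f"one dropped frame at 60 FPS: {2 * frame_time_ms(60):.1f} ms hitch")
```

The same math also bounds input latency: at 30 FPS your input can wait a full 33 ms just for the next frame to start, before any rendering or display lag is added on top.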
People think that frame rate isn’t very noticeable until you give them access to a toggle that lets them double it.
I’ve selected “performance” in the PS5 settings, but several AAA games ignore it and default their own in-game graphics setting to “fancy graphics” mode.
3/4 of players want performance, but publishers don’t care.