The side-by-sides definitely show diminishing returns compared to earlier gens, where hardware bumps had very noticeable gains.
I am sure the performance is measurably better than the base PS5, but I don’t think it’s $200-plus-separate-disc-drive better.
I also found some of the games they picked for these comparisons to be odd choices. Sure, you have “Made for PS5” exclusives like the new Ratchet and Clank, Returnal, and Spider-Man 2, but they also heavily showcased:
The Last of Us Part 2
God of War: Ragnarok
Ghost of Tsushima
Horizon: Forbidden West
Control
All of those are last-gen games that received PS5 enhancements. On a base PS5, I already feel like I’m getting the “better” experience for those games compared to their last-gen versions, so why upgrade?
Well, as a PC gamer, there’s a bunch of settings you can turn on in “last-gen” games to make them look better. Just because they ran on those machines doesn’t mean you were getting the best version; if you’re playing on console, you’re never getting the best version. This newer console can simply enable more settings, higher resolution, and higher framerates than the previous ones. I wish they’d let players decide which settings they want themselves, but sadly I don’t think that’s happening on consoles anytime soon.
Chasing the “best version” is a fool’s errand, though. Unless you’re buying top-of-the-line hardware every cycle, you’ll never have the best. And even then, there are games that seem to target future hardware by having settings so high not even top-end PCs can max them out comfortably, and other games that are just so badly optimized they’ll randomly decide they hate some feature of your setup and tank the performance, too.
Everyone has their own threshold for what looks good enough, and they upgrade when they reach that point. I used my last PC for 10 years before finally upgrading to a newer build, and I’m hoping to use my current one for just as long.
But just based on the displayed difference in performance between the base PS5 and the PS5 Pro, it doesn’t seem like a good investment for the benefits you get. It’s like paying Apple prices for marginally better hardware, with the overpriced wheels, in this case the disc drive, sold separately.
For sure, trying to max out everything is a bad idea. You can always chase more FPS and higher resolution, for example. My point is just that “last gen” doesn’t mean anything. The previous console versions couldn’t max these games out, if they even had graphics options. A game being older doesn’t mean it can’t take advantage of more advanced settings on better hardware.
I think chasing high graphics settings in general is a dumb idea. My favorite games are low fidelity indie games that do interesting things (right now Ostranauts, but also Factorio, Dwarf Fortress, and so many others). The games that max out my hardware are generally worse games. If you’re selling your game based on graphics then you aren’t selling it based on gameplay. I know console players generally seem to care about “realistic” graphics more, but it’s a fool’s errand.
Man, this is true now, but this conversation makes me very nostalgic for the good old days of the 1080Ti, where PC games were absolutely a “max out and forget” affair.
Sure, that was because monitors were capped at 1080p60, by and large. These days people are trying to run 20-year-old games at 500fps or whatever. But man, the lack of having to think about it was bliss.