cavalcade
From recent Digital Foundry videos it's been clear that we're really at the marginal end of visual differences between a moderately powerful system and a mind-bogglingly powerful one.
As I mentioned elsewhere, I shifted from a Ryzen 3400G + 1080 to a 5600X and an OC'd 3080, mainly because I could (#ceolife), but also out of curiosity to see how much it elevated the gaming experience. I also use an HDR-capable ultra-widescreen monitor. So I thought it would be interesting to see how people rank what's important to them these days. Here's my HOTTAKE
To me, getting the framerate above 60 is ultimately the most important, and after all the tech upgrades that's the bit I've enjoyed the most. Apex is running at 290fps. Cyberpunk at 70fps with everything turned up. Even Sonic Transformed is now silky smooth. I was able to do this before to an extent, but on Apex I literally ran it with everything turned down to 0 to try to maintain framerate. Seems crazy to use a £1k card just to improve the boat sections in an ancient cartoon racer, but that's PC gaming for you.
Another tech Garwoofoo has also sung the praises of is HDR. When done well, I think this is a far more visually arresting tech than RTX or simply chucking polygons at the screen. Destiny 2's HDR implementation is partially broken and often terrible, but when it clicks it's absolutely stunning; switching it off is like coming off a coke high. I also like the fact that I can now supersample Destiny by running it at 200% and having it scaled down. The fidelity and art design at times are stunning. HDR, though, has to be right up there as a key tech for making games look amazing and next-gen (if done well).
OLED would be next on my list. Having seen OLEDs in action, I think a properly tuned one is an absolute thing of beauty. Framerate, HDR and OLED together would be my goal over the next few months. As the saying goes: "How do you know if someone has an LG OLED TV? They'll tell you."
Less important to me is 4K. I think the difference between it and 1440p (and even 1080p) isn't really that incredible, especially on a monitor at short viewing distances. Star Wars Jedi: Fallen Order is one of the few games where I genuinely thought the 4K implementation was good. For many others it's sort of visible, but not really. I'm going to hook the PC up to my 4K TV and revisit this, but even with 4K content on Amazon/Netflix etc. you have to go full Leadbetter to really appreciate a massive difference.
And bottom of the list has to be the difference between Low/Ultra graphics settings, and even RTX on and off. Low in videogames today normally looks good enough that you'd need a DF video to distinguish it from higher settings, and the difference between High and Ultra is often laughable.
Ray tracing is also clearly a bit of a busted flush. Metro Enhanced looks… different? Maybe? Not better. Cyberpunk looks… different too. Again, it's sort of impressive tech, but when deployed it really looks a bit videogamey, like an overt Photoshop touch-up on a model or something. Quake II RTX is fun, but at the same time I can't say I'm totally convinced. One game I have been impressed with is Ghostrunner, which at top fidelity levels really meshes into a visual feast, using RTX and other approaches to create stunning levels (and it's also a great game). I do wonder if we're in a first phase of the tech while people try to work out how to deploy it artistically. I think the Lego building game offers a hint of where it could go: sort of hyper-real, dense, tactile environments. But I'm not sure we're there yet. As Gar says, we're at the shiny gun phase.