
Hot Take: Graphics have completely stagnated since 2019

Trunx81

Member
I fully understand what you mean, OP. When the PS4 was announced, everyone was talking about particle effects. Remember that Unreal demo with the demon? Every game used them; Second Son was built around particle effects.
Funny also how many people bring up Forbidden West. I just finished it on the PS4, and even with some minor issues, it ran great. It's clearly not a next-gen game, even if the expansion on PS5 delivers some action that isn't possible on the older model (or is it? Jedi: Survivor was also once seen as impossible to run on older hardware).

What we will never have again are those leaps that took place between the 16-bit, 32/64-bit, and 128-bit eras. Personally, I consider the leap from the N64 to the GameCube even bigger, since it was broadly the same technology. Like moving from 8-bit Donkey Kong directly to Donkey Kong Country.

Today, even years after the next-gen release, we are still getting cross-platform titles (thank you, Xbox Series S). And we are already talking about Pro consoles, which raises the question: have we already seen the FULL potential of the PS5 and XSX? Remember how the first games on older consoles looked a whole generation behind what we got at the end of a console's lifecycle (GTA IV vs. V, for example, or Super Mario World vs. DKC3).

High development costs have been mentioned already, and (imho) they're one of the main reasons we are still getting cross-platform releases. Are they holding developers back? Why don't they tell us?
 

HeWhoWalks

Gold Member
Also, Sony put all their budget into GaaS games.

"All their budget"...

How do you know this? Where are all the GaaS games I continue to hear about?
 
Last edited:

SeraphJan

Member
If you want a true generational leap, games on PS5 Pro at 30fps are what you're looking for.
 
Last edited:

Clear

CliffyB's Cock Holster
I doubt there's a consensus as to what "best graphics" are. Is it photo-realism? Is it based on sheer artistry? Is it about performance metrics like res and frame-rate?

The problem is that each of the things I've just mentioned has a different pathway to achieving it. Photo-realism, for example, is largely about fooling the eye/brain into accepting an illusion, so it becomes about carefully selecting elements so that nothing breaks that illusion. Take presenting the scene as if it were shot with a bodycam: you manipulate the image so that the distortions present in actual footage shot that way are matched closely in the render.
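That "match the distortions" trick is basically an image-space post-process. Here's a toy Python/numpy sketch of the idea (purely my own illustration, not from any actual engine, and all the parameter values are made up):

import numpy as np

def bodycam_filter(frame, k1=0.15, vignette=0.4, noise_sigma=0.02, seed=0):
    """Make a clean render pick up bodycam-style artifacts: radial lens
    distortion, corner vignetting, and sensor grain. frame is HxWx3 in [0, 1]."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized coordinates centered on the image midpoint.
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    r2 = x ** 2 + y ** 2
    # Radial lens distortion: resample the image along radially scaled coords.
    f = 1 + k1 * r2
    sx = np.clip((x * f * w / 2 + w / 2).astype(int), 0, w - 1)
    sy = np.clip((y * f * h / 2 + h / 2).astype(int), 0, h - 1)
    out = frame[sy, sx]
    # Vignette: darken toward the corners, like a cheap wide-angle lens.
    out = out * (1 - vignette * r2 / r2.max())[..., None]
    # Sensor grain: faint per-pixel noise so the render isn't suspiciously clean.
    out = out + np.random.default_rng(seed).normal(0, noise_sigma, out.shape)
    return np.clip(out, 0, 1)

The point isn't these exact effects; it's that every artifact you add is one more cue telling the brain "this is real footage".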

Artistry, on the other hand, is about creating a certain expression and style. Look at great works of art throughout history and you can't argue that it'd be better if they were photographs and not paintings or whatever. Point being, photo-realism is A goal, not the only goal.

As to performance, obviously it matters when dealing with an interactive medium, but not everything benefits in the same way, and a lot of the time (especially regarding resolution) it all depends on how the content is being viewed. Is it on a 50" TV, or on a handheld or laptop display?
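To put rough numbers on that: what actually reaches the eye is angular resolution (pixels per degree), and it swings wildly between setups. A quick back-of-envelope in Python, with example figures I picked myself, nothing authoritative:

import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Horizontal pixel count divided by the horizontal field of view in degrees.
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# 4K on a 50" TV (~1.11 m wide) from a 2.5 m couch:
print(round(pixels_per_degree(3840, 1.11, 2.5)))  # ~153 ppd
# 1080p on an 8" handheld (~0.18 m wide) at arm's length:
print(round(pixels_per_degree(1920, 0.18, 0.4)))  # ~76 ppd

Same eyes, very different payoff from extra pixels, which is why blanket resolution arguments don't hold.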

Above all else it depends how nit-picky or demanding the viewer is. For me the only "unforgivable" sin is severe screen-tearing, something I personally find extremely distracting regardless of anything else. I *prefer* smoothness and sharpness, but if the gameplay is fun enough, I can still appreciate what else is going on visually.

The bottom line, though, is that, at least for me, I'm past the point of caring about metrics and techniques. If a game looks good, it looks good. I'm more likely to be wowed by what's being shown and what's going on in the frame than by its fidelity or display of advanced tech.
 

phant0m

Member
Every other gen gets a crux it has to overcome; this gen it's ray tracing.
And (incoming hot take): RT sucks. Sorry, not sorry, but in most games I cannot tell the difference unless you put still frames side by side.

Give me 60+ fps with nice looking pre-baked lighting and SSR over “but the light is realistically bouncing and coloring the environment!”

Regarding OP's statement, I would've agreed a year ago. But we've had some really nice-looking games in the last 6-8 months, and there's quite a slate of graphical powerhouse games coming this year.
 
Last edited:

Roni

Gold Member
Ever since 2019/2020 there haven't really been any significant, noticeable improvements in graphical prowess, despite the release of the 9th-gen consoles. There have been improvements in frame rates and loading times, however.

I think the graphical leap next gen will be absolutely insane, with borderline augmented-reality visuals like the Matrix tech demo becoming the norm, but this current gen feels like one massive waiting period before then, where we bridge the gap to 60fps (which is definitely worth it).
Put down some money for GeForce Now and play Cyberpunk 2077 at max settings.
 

nkarafo

Member
I don't think we had any significant jumps after the Xbox 360 / PS3 generation, tbh. After that the improvements were minimal and mostly came down to frame rate and resolution increases. You still have games like GTA V looking pretty great today, and that game was originally made for those consoles.

PS4 to PS5 is such a subtle improvement that even seasoned gamers have to double-check which port runs where. You can hardly even see the differences in still pictures or YouTube videos; testers have to zoom in on the footage for the improvements to be noticeable. The only thing I can think of as a major jump is ray tracing, but that can also be subtle, and sometimes it can even look worse.
 
I don't think we had any significant jumps after the Xbox 360 / PS3 generation, tbh. After that the improvements were minimal and mostly came down to frame rate and resolution increases. You still have games like GTA V looking pretty great today, and that game was originally made for those consoles.

PS4 to PS5 is such a subtle improvement that even seasoned gamers have to double-check which port runs where. You can hardly even see the differences in still pictures or YouTube videos; testers have to zoom in on the footage for the improvements to be noticeable. The only thing I can think of as a major jump is ray tracing, but that can also be subtle, and sometimes it can even look worse.
PS3 to PS4 was still a pretty significant leap, even if not as much as 7th gen to 8th gen.

[comparison screenshots]
 
Last edited:

Bond007

Member
We want it all @ under $400-$500.
If we consumers accepted higher pricing, perhaps these companies could push the envelope more. As it stands, they need to hit a price point with mostly off-the-shelf parts.
But by going up in price you alienate younger folks, and by going up you challenge PC. So it's probably good where it's at, but that means slower graphical growth in this age.
 

phant0m

Member
We want it all @ under $400-$500.
If we consumers accepted higher pricing, perhaps these companies could push the envelope more. As it stands, they need to hit a price point with mostly off-the-shelf parts.
But by going up in price you alienate younger folks, and by going up you challenge PC. So it's probably good where it's at, but that means slower graphical growth in this age.
1000% this. I’d love to see what Sony or MS could do with an $800 MSRP console.

Right now the only place to _really_ see graphical fidelity/advancement is on PC, because we have GPUs that are $800+ alone.
 
Last edited:

HeWhoWalks

Gold Member
Eh, nah. You'll get plenty of graphical advancements on consoles. No matter what folks tell themselves, the best looking games this gen could not have been done on last gen hardware without significant compromise.

And even with $800+ graphics cards, you aren't seeing the kind of advancement, relative to that amount, that you think you are. It can be done on consoles; you'll just pay a performance price for it. It's a resource-allocation issue, not a hardware limitation.

No game is taking advantage of even my 3090 Ti, let alone a 4090 and better. It's not happening. I want people to understand this. Those cards are capable of far, far, far more than you've been getting.

What they do allow for, however, is better framerates and that's the biggest advantage. But, as long as consoles have the presence, share, and mass pricing accessibility that they do, your shiny PC graphics card will never, ever have its parts truly exploited and put to use.

Some examples of what could/would happen if PC devs took advantage of only the latest and greatest cards:

Rendered on a TITAN RTX in under four minutes in UE4 back in 2020:

[rendered scene screenshots]


Fully path-traced, 8K normal maps on nearly all surfaces, high-quality fog, many dust particles (thanks to instancing). This would be the norm if PC's best cards were actually taken advantage of! And that's... 2020.
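For anyone wondering what "thanks to instancing" buys: you store one tiny mote mesh plus N per-instance transforms instead of N copies of the mesh, and the GPU draws it all in a single call. A rough Python illustration of just the memory side (all the counts and sizes here are invented for the example):

import numpy as np

rng = np.random.default_rng(1)
mote = rng.random((24, 3)).astype(np.float32)  # one dust-mote mesh, 24 verts
n = 100_000                                    # instance count

# Per-instance data: a position offset and a scale for every mote.
offsets = rng.uniform(-50, 50, (n, 3)).astype(np.float32)
scales = rng.uniform(0.01, 0.05, (n, 1)).astype(np.float32)

# GPU-side this is one draw call; the vertex shader reads the per-instance
# records. Memory: one mesh + N small records vs. 24 * N duplicated verts.
instanced = mote.nbytes + offsets.nbytes + scales.nbytes
naive = mote.nbytes * n
print(f"{instanced / 1e6:.1f} MB vs {naive / 1e6:.1f} MB")  # ~1.6 MB vs ~28.8 MB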
 
Last edited:
Do we really need more? I know more would be nice, but with games costing 100 million or more, do we really want them to go down this path? It's not sustainable, imo.
I think this is an area where AI can make a difference: fill in all the details to make the visuals pop.
 

Fbh

Member
Diminishing returns.
Graphics have gotten to the point where they can still get better, but each step requires increasingly more hardware power for an increasingly smaller upgrade.
Unless there's some breakthrough in tech, or consumers become willing to spend substantially more on consoles, I could see a new reality where we only get big upgrades every two gens.
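You can sketch why with a toy model. Totally illustrative (the log relationship and the TFLOPS figures are just my stand-ins, not measurements), but say perceived quality grows with the log of compute:

import math

def perceived_quality(relative_compute):
    # Assumed, not measured: each doubling of power adds one "quality step".
    return math.log2(relative_compute)

for gen, tflops in [("PS3-ish", 0.25), ("PS4-ish", 1.8), ("PS5-ish", 10.3)]:
    print(gen, round(perceived_quality(tflops / 0.25), 2))
# PS3-ish 0.0, PS4-ish 2.85, PS5-ish 5.36: each ~6-7x jump in raw power
# buys roughly the same ~2.5-3 steps, so equal-feeling leaps need ever
# more hardware.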

Personally I've been happy with the first half of the gen. IMO the graphics are nice, there are lots of games running at 60fps, super-fast loading, etc.
Only recently have we sadly seen the return of games aiming for 30fps for the sake of marginal graphical upgrades, which aren't worth it IMO.
 

HeWhoWalks

Gold Member
Diminishing returns.
Graphics have gotten to the point where they can still get better, but each step requires increasingly more hardware power for an increasingly smaller upgrade.
Unless there's some breakthrough in tech, or consumers become willing to spend substantially more on consoles, I could see a new reality where we only get big upgrades every two gens.

Personally I've been happy with the first half of the gen. IMO the graphics are nice, there are lots of games running at 60fps, super-fast loading, etc.
Only recently have we sadly seen the return of games aiming for 30fps for the sake of marginal graphical upgrades, which aren't worth it IMO.
It's not really the individual graphics, though. It's the chase for higher framerates at higher resolutions. That stuff gets taxing. This is why devs default back to 30fps on consoles: so they can produce the best visual showpiece possible.
 

rofif

Can’t Git Gud
Eh, nah. You'll get plenty of graphical advancements on consoles. No matter what folks tell themselves, the best looking games this gen could not have been done on last gen hardware without significant compromise.

And even with $800+ graphics cards, you aren't seeing the kind of advancement, relative to that amount, that you think you are. It can be done on consoles; you'll just pay a performance price for it. It's a resource-allocation issue, not a hardware limitation.

No game is taking advantage of even my 3090 Ti, let alone a 4090 and better. It's not happening. I want people to understand this. Those cards are capable of far, far, far more than you've been getting.

What they do allow for, however, is better framerates and that's the biggest advantage. But, as long as consoles have the presence, share, and mass pricing accessibility that they do, your shiny PC graphics card will never, ever have its parts truly exploited and put to use.

Some examples of what could/would happen if PC devs took advantage of only the latest and greatest cards:

Rendered on a TITAN RTX in under four minutes in UE4 back in 2020:

[rendered scene screenshots]

Fully path-traced, 8K normal maps on nearly all surfaces, high-quality fog, many dust particles (thanks to instancing). This would be the norm if PC's best cards were actually taken advantage of! And that's... 2020.
Is this supposed to be impressive? Especially for an offline render?!
RE2 remake looks way better than this and runs at 100fps without RT at 4K... in 2019

[RE2 remake screenshots]


Or, you know, 2016 games looking like this. Dynamic time of day, no RT in sight
[Final Fantasy screenshots]

And in 2016 (or today) we are nowhere near what raster is capable of without any use of RT
 
Battlefront 2015 was peak multiplayer graphics technology. Battlefield 1 followed it up, and most multiplayer games have gone downhill from there ever since, even games made by DICE.
 

HeWhoWalks

Gold Member
Is this supposed to be impressive? Especially for an offline render?!
RE2 remake looks way better than this and runs at 100fps without RT at 4K... in 2019

Or, you know, 2016 games looking like this. Dynamic time of day, no RT in sight

And in 2016 (or today) we are nowhere near what raster is capable of without any use of RT
RE2 does not look "way better than that". I laid out everything happening in that scene, and RE2's environment is several steps back. Not even the "ray-traced" version, and especially not those dark screenshots you posted with their 2048x2048 ground textures. I get that you like the game, I do too, but no.

And those Final Fantasy pics? :pie_roffles:
 
Last edited:

Shin-Ra

Junior Member
Some devs are going backwards. Compare Rayman Legends (even Origins) with Prince of Persia: The Lost Crown.

I guess a decade of mobile Unity games will do that.
 

mdkirby

Member
Things won't take a large step up until AI is used extensively: both as AI chips in the consoles to majorly bump graphics on the fly, and in development to significantly speed up asset generation and world-building.
 

clarky

Gold Member
Ever since 2019/2020 there haven't really been any significant, noticeable improvements in graphical prowess, despite the release of the 9th-gen consoles. There have been improvements in frame rates and loading times, however.

I think the graphical leap next gen will be absolutely insane, with borderline augmented-reality visuals like the Matrix tech demo becoming the norm, but this current gen feels like one massive waiting period before then, where we bridge the gap to 60fps (which is definitely worth it).
You're talking about 4 years ago. Not exactly a hot take, more like a shit take.
 

6502

Member
The Wright brothers could have met Yuri Gagarin. Those living through that generational leap saw a huge change in flight technology. In the decades after, not so much... just small gains.

Those of us who went from Atari to Xbox won't ever see anything like those leaps again, and neither will anyone until at least our grandkids.
 

Fbh

Member
It's not really the individual graphics, though. It's the chase for higher framerates at higher resolutions. That stuff gets taxing. This is why devs default back to 30fps on consoles: so they can produce the best visual showpiece possible.

But that's the thing with diminishing returns. So far, in my opinion, none of the games designed around 30fps, like Alan Wake 2, Plague Tale, or FFXVI, have showcased visuals or gameplay impressive enough to be worth giving up the fluidity and responsiveness of 60fps.

Alan Wake 2 in fidelity mode might look better than TLOU2, but when I can play TLOU2 at native 1440p and 60fps, I just don't think the visual upgrade Alan Wake 2 offers is big enough to be worth lowering the resolution and cutting the performance in half.
 

HeWhoWalks

Gold Member
But that's the thing with diminishing returns. So far, in my opinion, none of the games designed around 30fps, like Alan Wake 2, Plague Tale, or FFXVI, have showcased visuals or gameplay impressive enough to be worth giving up the fluidity and responsiveness of 60fps.

Alan Wake 2 in fidelity mode might look better than TLOU2, but when I can play TLOU2 at native 1440p and 60fps, I just don't think the visual upgrade Alan Wake 2 offers is big enough to be worth lowering the resolution and cutting the performance in half.
Plague and Wake look much better on PC, but then, "impressive visuals" is an individual call. Diminishing returns is about budget versus result.

What's happened has a lot less to do with that and is more on the devs themselves. There's a reason graphics have stagnated, to a small degree, in video games and not overall. Hardware is not the issue, and there's definitely not a graphical ceiling at play.
 
If it is so disappointing, why did Sony remaster The Last of Us 2 from PS4 to PS5?

Seriously though, Cyberpunk is still the go-to game for benchmarking on PC, and that was developed last gen and released on PC in 2020. Sure, it has had some work done on its graphics, but there still hasn't been much that does better. This is the age of diminishing, diminishing returns. The things that get improved will be framerate, and then AI and other parts of the game that are harder to capture in a video.
Because for a minimal amount of effort, they can offer a "new" version for $70.
 
Ever since 2019/2020 there haven't really been any significant, noticeable improvements in graphical prowess, despite the release of the 9th-gen consoles. There have been improvements in frame rates and loading times, however.

I think the graphical leap next gen will be absolutely insane, with borderline augmented-reality visuals like the Matrix tech demo becoming the norm, but this current gen feels like one massive waiting period before then, where we bridge the gap to 60fps (which is definitely worth it).
If you made this post 6 months ago, you would have had a great point.
 

The Stig

Member
I think this could’ve been true in 2022, but now? Nah.

Between Alan Wake II, Cyberpunk 2077 w/ path tracing, Callisto Protocol, Hellblade II, Death Stranding 2, Fable, Grand Theft Auto VI, etc., we're at the beginning of a very significant jump in visual fidelity.
Yeah, this. If you have a good enough rig, the game looks truly amazing.
 

rofif

Can’t Git Gud
This is trolling, right?
The only good thing in Avatar is the vegetation.
Then it looks like this in other scenes... not to mention the flying scenes and night lighting, from what I saw. It's not even an unpopular opinion. These aren't even my own observations...
[Avatar screenshots]


And Alan Wake 2 looks good on PC but very subpar on PS5, which is where I played it. For me that game ran badly and looked bad: such poor image quality, and no reflections... it's all grainy.
And in motion, the break-up is very severe. Disabling motion blur doesn't help (I have it on in this shot).

[Alan Wake 2 screenshots]


Sure, maybe it doesn't have these problems on PC, but I played on PS5, and there are a ton of games with much better image quality and effects on PS5.
Demon's Souls, TLOU Part I or II, FF16, and some more.
What AW2 is doing is straight-up embarrassing, and the way they approached the PS5 port is embarrassing. They brute-forced a PC version that is too heavy for what it is, then slapped on low settings and moved the game to PS5. Shameful... and the industry is celebrating this? Inexcusable from a company that had perfect planar reflections in Max Payne 2. They 100% could've figured out something to replace the lack of RT on consoles.

Avatar at least has a good console port and only does some scenes and scenarios poorly
 
Last edited:

Stuart360

Member
The only good thing in Avatar is the vegetation.
Then it looks like this in other scenes... not to mention the flying scenes and night lighting, from what I saw. It's not even an unpopular opinion. These aren't even my own observations...

[Avatar screenshots]

And Alan Wake 2 looks good on PC but very subpar on PS5, which is where I played it. For me that game ran badly and looked bad: such poor image quality, and no reflections... it's all grainy.
And in motion, the break-up is very severe. Disabling motion blur doesn't help (I have it on in this shot).

[Alan Wake 2 screenshots]

Sure, maybe it doesn't have these problems on PC, but I played on PS5, and there are a ton of games with much better image quality and effects on PS5.
Demon's Souls, TLOU Part I or II, FF16, and some more.
What AW2 is doing is straight-up embarrassing, and the way they approached the PS5 port is embarrassing. They brute-forced a PC version that is too heavy for what it is, then slapped on low settings and moved the game to PS5. Shameful... and the industry is celebrating this? Inexcusable from a company that had perfect planar reflections in Max Payne 2. They 100% could've figured out something to replace the lack of RT on consoles.

Avatar at least has a good console port and only does some scenes and scenarios poorly
Rofif, I agree with you about Alan Wake 2. I myself don't quite get the praise that game gets for its graphics. Avatar, though, is absolutely stunning. I even played it at 30fps just so I could get all the bells and whistles, something I almost never do over 60fps.
Also, that cliff you like to keep showing looks good in-game and has some very high-resolution textures, not viewed off a low-bitrate screen grab. That's how it also looks in the films. For me it's the best-looking game I have played so far.

Having said all that, you think Forspoken looks great, so...
 
Last edited:

rofif

Can’t Git Gud
Rofif, I agree with you about Alan Wake 2. I myself don't quite get the praise that game gets for its graphics. Avatar, though, is absolutely stunning. I even played it at 30fps just so I could get all the bells and whistles, something I almost never do over 60fps.
Also, that cliff you like to keep showing looks good in-game and has some very high-resolution textures, not viewed off a low-bitrate screen grab. That's how it also looks in the films. For me it's the best-looking game I have played so far.

Having said all that, you think Forspoken looks great, so...
In fairness, I've not played Avatar myself. That's why I'm not shitting on it as much as Alan Wake 2.
Forspoken does look good, though, especially after the patches.
 

Wildebeest

Member
The fact is, the real innovation in real-time 3D graphics was done long ago now, and for a long time the art side has just been adding detail in the same fake-looking "cinematic realism" style, while the tech side has been adding weird shit that sometimes looks good, but often people just cope and pretend it's better.
 
Geometry detail has definitely gone up, but it can be difficult to notice.

I think lighting / ray tracing has the most significant impact on graphics and is the last major milestone that needs to be crossed, but the technology isn't there yet (at affordable prices for the majority of people). Unfortunately, we'll have to wait until the next console gen before we see moderate-to-good ray tracing in the majority of titles.
 

AJUMP23

Gold Member
I think the hardware is giving games what it can give. We will never see jumps like we did from 8-bit to 16-bit.
 
Graphics are great now. The next gen should aim for current graphics but at 60fps across all games, with ray tracing and more risks in the types of games we get.
 

StreetsofBeige

Gold Member
I mean, this is from 2015:

[video]
If this were a brand-new video of a just-announced Star Wars game, people on here would be arguing about whether current consoles could hit 60fps, etc.

I never bothered playing SW shooters aside from testing out, I think, an EA Access trial. I'm not into sci-fi shooters, but I thought the game looked incredible, and at 60fps too. BF1 also came out around that time and looked excellent as well. And that was on the archaic Xbox One. If those games were remade for modern consoles, they'd be up there with the best-looking console shooters.
 
Last edited: