
Digital Foundry: PS5 vs PC in Assassin's Creed Valhalla

VFXVeteran

Banned
optimizations and new techniques therefore don’t exist and games can’t possibly look better...

Optimizations do exist. But I've never heard of a game being "so" optimized that you can start adding more features to it because your rendering budget was suddenly BELOW what it was before them. In other words, have you ever had a development company tweet saying they made an incredible optimization that allowed them to use 8K textures instead of 4K textures running at the same FPS? That's the silliness that you guys are proposing.

New techniques do exist. They're always done on PC first, then adopted by the consoles. PBR shaders, HBAO, HDAO, SSR, volumetric light shafts, PhysX smoke, RT, etc. All there for everyone to witness.
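For what it's worth, the "rendering budget" framing both sides keep using is just frame-time arithmetic. A minimal sketch with invented pass costs (none of these numbers come from a real game): at 60 FPS you get roughly 16.7 ms per frame, and an optimization only "buys" a new feature if it pushes the total cost of existing passes below budget.

```python
# Frame-budget arithmetic. All pass costs are made up for illustration.
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.67 ms per frame at a 60 FPS target

# Hypothetical per-frame GPU costs, in milliseconds.
passes = {"geometry": 6.0, "lighting": 5.5, "post": 3.0}

def headroom_ms(pass_costs, budget=FRAME_BUDGET_MS):
    """Milliseconds left in the frame after all passes have run."""
    return budget - sum(pass_costs.values())

before = headroom_ms(passes)   # ~2.17 ms to spare

# Suppose an optimization shaves 2 ms off the lighting pass:
passes["lighting"] -= 2.0
after = headroom_ms(passes)    # ~4.17 ms to spare

# The freed 2 ms is exactly the budget that could pay for an extra
# feature at the same FPS, which is the scenario being argued about.
```

That is all "optimization freed budget for features" means in practice; whether any studio advertises it that way is a separate question.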
 
Last edited:

Rea

Member
Optimizations do exist. But I've never heard of a game being "so" optimized that you can start adding more features to it because your rendering budget was suddenly BELOW what it was before them. In other words, have you ever had a development company tweet saying they made an incredible optimization that allowed them to use 8K textures instead of 4K textures running at the same FPS? That's the silliness that you guys are proposing.

New techniques do exist. They're always done on PC first, then adopted by the consoles. PBR shaders, HBAO, HDAO, SSR, volumetric light shafts, PhysX smoke, RT, etc. All there for everyone to witness.
Even though I love PlayStation and I am a PS fan, I have to agree on this. GPU rendering techniques existed on PC long before they came to consoles, and some PC-exclusive features don't even exist on consoles. But game engines do evolve over time and will get more optimized for consoles; the same goes for PC.
 
Optimizations do exist. But I've never heard of a game being "so" optimized that you can start adding more features to it because your rendering budget was suddenly BELOW what it was before them. In other words, have you ever had a development company tweet saying they made an incredible optimization that allowed them to use 8K textures instead of 4K textures running at the same FPS? That's the silliness that you guys are proposing.

New techniques do exist. They're always done on PC first, then adopted by the consoles. PBR shaders, HBAO, HDAO, SSR, volumetric light shafts, PhysX smoke, RT, etc. All there for everyone to witness.

I'm not even speaking of new techniques. I'm just speaking of ways to reduce cycle time, through better programming/memory management, that allow you to place more on the screen (regardless of whether the technique is "new").

See the tweet an ND dev made about the PS5 hardware, using the PS3 as a perfect example. I'm sure Uncharted 1 was using 99% of the GPU; in fact that game suffered from screen tearing. But compare it to TLOU and it is a night-and-day difference in what they were able to put on the screen in terms of overall quality. They got better with the hardware, which resulted in huge efficiency gains.

This happens every gen, but you seem to be on a crusade to deny it exists.
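One concrete, hedged example of the "better programming/memory management" claim is data layout: restructuring entity data from array-of-structs to struct-of-arrays so that hot loops only touch the fields they need. The field names below are invented purely for illustration:

```python
# Toy array-of-structs (AoS) vs struct-of-arrays (SoA) reshaping.
# On real hardware SoA improves cache behaviour for field-wise loops;
# here we only demonstrate that the transform preserves the data.

# AoS: each entity's fields are interleaved in one record.
entities_aos = [
    {"x": 1.0, "y": 2.0, "health": 100},
    {"x": 3.0, "y": 4.0, "health": 80},
]

# SoA: each field lives in its own contiguous array, so a loop that
# only reads positions never pulls health data through the cache.
entities_soa = {
    field: [e[field] for e in entities_aos]
    for field in ("x", "y", "health")
}

def total_x_aos(ents):
    return sum(e["x"] for e in ents)

def total_x_soa(soa):
    return sum(soa["x"])

# Same answer either way; only the memory traffic pattern differs.
assert total_x_aos(entities_aos) == total_x_soa(entities_soa)
```

This kind of restructuring changes no rendering technique at all, yet on console CPUs it is exactly the sort of work that frees cycles for more on screen.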
 

VFXVeteran

Banned
I'm not even speaking of new techniques. I'm just speaking of ways to reduce cycle time, through better programming/memory management, that allow you to place more on the screen (regardless of whether the technique is "new").

You can't possibly give a real-world example. This just isn't true. The "more on screen" is completely dependent on GPU bandwidth. Optimization is more about hitting your FPS budget because what you already have is slowing it down.

See the tweet an ND dev made about the PS5 hardware, using the PS3 as a perfect example. I'm sure Uncharted 1 was using 99% of the GPU; in fact that game suffered from screen tearing. But compare it to TLOU and it is a night-and-day difference in what they were able to put on the screen in terms of overall quality. They got better with the hardware, which resulted in huge efficiency gains.

This happens every gen, but you seem to be on a crusade to deny it exists.

PS3 was a very very hard platform to develop for. I know they struggled with that hardware for years. That's NOT the same as it is now. PS5 is basically a faster PS4. Same architecture in AMD with a basic x86 CPU instruction set.

The problem with you guys is you assume things stay the same even though life around you is changing. The PS3 was a radical architecture and took time to master. The PS4 was more PC-friendly which developers could master rather quickly. The PS5 is just like the PS4, so not much to master other than RT. Why stick to something you know was a different situation and apply it to every single generation from here on out? Life doesn't work that way so why try to MAKE it so?
 
Last edited:

VFXVeteran

Banned
Sort of like throwing the keys of a Formula 1 car to your grandma and watching her max out the machine at Monaco.

Ignoring talent, experience & technique in driving hardware is a child’s depth of thinking.

Prove me wrong, dude. Otherwise you have no leg to stand on. All of you are declaring ghosts exist, and I'm constantly trying to find one where there is none. Then you throw out the PS3's completely radical architecture as an example that clearly took time to master, but then you apply that same "radical architecture requires mastering" logic to the PS4 and the PS5, when both are pretty much home-grown PCs? The fact that you can't show any sort of "mastery" of the hardware within the PS4 generation is telling. You only get 1 or 2 games to prove it from the same company before the next generation comes. Choose your company wisely and hope they release a 2nd game within the generation.
 
You can't possibly give a real-world example. This just isn't true. The "more on screen" is completely dependent on GPU bandwidth. Optimization is more about hitting your FPS budget because what you already have is slowing it down.



PS3 was a very very hard platform to develop for. I know they struggled with that hardware for years. That's NOT the same as it is now. PS5 is basically a faster PS4. Same architecture in AMD with a basic x86 CPU instruction set.

The problem with you guys is you assume things stay the same even though life around you is changing. The PS3 was a radical architecture and took time to master. The PS4 was more PC-friendly which developers could master rather quickly. The PS5 is just like the PS4, so not much to master other than RT. Why stick to something you know was a different situation and apply it to every single generation from here on out? Life doesn't work that way so why try to MAKE it so?

PS5 has some customizations that have not been fully utilized at launch, and it removes limitations that existed before. Cross-gen games are still being developed with those old limitations in mind.

The revamped Geometry Engine, the speed of the I/O in conjunction with things like cache scrubbers and potentially some unified cache on the CPU isn't something that just suddenly works to all of its potential. There still needs to be specific optimizations for that hardware, none of which is happening day 1 on the console.
 
Last edited:

Shmunter

Member
It just means that the hardware is maxed out relative to the software that's pushing it. There's nothing more the hardware can do, given the way it's being utilized, to make the game run faster or at higher resolutions. If that weren't the case then the engine wouldn't have to constantly lower the resolution to maintain its framerate. That's not to say the AnvilNext is representative of the full potential of the PS5 hardware, but rather the full potential of the current iteration of the AnvilNext engine on the PS5.
This discussion has no merit. Poor software can run away maxing RAM and CPU, leak memory, etc. So what then is the proverbial point of such a useless conversation? It's misleading and has no basis in any coherent conversation.

Bugsnax has framerate drops on PS5, oh no PS5 is maxed out!
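For anyone following the dynamic-resolution tangent: the behaviour described in the quote above (the engine "constantly lowering the resolution to maintain its framerate") is typically a simple feedback loop on measured frame time. A minimal sketch, with invented step sizes and bounds:

```python
# Minimal dynamic-resolution feedback loop. The constants are invented
# for illustration and don't describe any real engine.
FRAME_BUDGET_MS = 1000.0 / 60.0  # 60 FPS target

def update_res_scale(scale, last_frame_ms, lo=0.6667, hi=1.0, step=0.05):
    """Drop the render scale when the last frame missed budget; raise it
    when there's comfortable headroom. Clamped to [lo, hi], roughly the
    1440p-to-2160p window discussed in this thread."""
    if last_frame_ms > FRAME_BUDGET_MS:
        scale -= step
    elif last_frame_ms < FRAME_BUDGET_MS * 0.9:
        scale += step
    return max(lo, min(hi, scale))

scale = 1.0
for frame_ms in (18.0, 18.0, 12.0, 12.0):  # simulated GPU frame times
    scale = update_res_scale(scale, frame_ms)
# Two slow frames push the scale down; it recovers once the load falls.
```

The point of contention in the thread is whether hitting the floor of that loop says anything about the hardware's ceiling for differently written software.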
 

Lethal01

Member
Prove me wrong, dude. Otherwise you have no leg to stand on. All of you are declaring ghosts exist, and I'm constantly trying to find one where there is none. Then you throw out the PS3's completely radical architecture as an example that clearly took time to master, but then you apply that same "radical architecture requires mastering" logic to the PS4 and the PS5, when both are pretty much home-grown PCs? The fact that you can't show any sort of "mastery" of the hardware within the PS4 generation is telling. You only get 1 or 2 games to prove it from the same company before the next generation comes. Choose your company wisely and hope they release a 2nd game within the generation.

The examples were given, accompanied by the developers of those examples saying that the PS5 also has tons more potential when they get to use it more. The one plugging their ears and ignoring reality is you. Instead of pretending to be a professional, try listening to them some more, I guess.
 

VFXVeteran

Banned
PS5 has some customizations that have not been fully utilized on launch day,

Name something in the graphics pipeline that hasn't been utilized that will make AC:Valhalla run at native 4k/60FPS instead of dynamic res?

Let's go to Spiderman MM. Name something in the RT architecture that Insomniac hasn't mastered yet, that should get them even better reflections along with RT shadows, GI, and Area lights for the next and last Spiderman of this generation?


The speed of the I/O in conjunction with things like cache scrubbers and potentially some unified cache on the CPU isn't something that just suddenly works to all of its potential.

How does this translate to better visuals? I'm looking for 8K textures instead of 4K. I'm looking for several shadow-casting light sources instead of 1 like in Demon's Souls. How will that enhance the RT hardware to support more than just 1 RT feature? Will we see more tessellation than has ever been seen before in any game, even on PC? At what FPS will that game run? At what resolution? Where is the RT GI from Crysis Remastered in a game at 4K/30FPS with massive foliage? Will we see that kind of RT in Horizon: Forbidden West, or will we have to wait for their last game of the generation before then?

I'm looking for these enhancements because I know that this is what you guys are trying to imply, despite it never existing last generation at all.
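On the "8K textures instead of 4K" ask specifically, it's worth spelling out the cost, since it is mostly a memory/bandwidth question rather than an optimization question. Back-of-the-envelope arithmetic (uncompressed RGBA8 is assumed just to keep the numbers simple; real games use block compression):

```python
# Texel-count and memory arithmetic for square textures.
def texels(side):
    return side * side

def mib_rgba8(side):
    """One uncompressed RGBA8 mip level in MiB (4 bytes per texel)."""
    return texels(side) * 4 / (1024 ** 2)

# Doubling resolution quadruples texel count:
assert texels(8192) == 4 * texels(4096)

# A single uncompressed 4096x4096 level is 64 MiB; 8192x8192 is 256 MiB.
# Block compression shrinks both, but the 4x ratio stays, which is why
# "just optimize harder" doesn't conjure an 8K texture budget by itself.
```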
 
Last edited:

VFXVeteran

Banned
The examples were given, accompanied by the developers of those examples saying that the PS5 also has tons more potential when they get to use it more. The one plugging their ears and ignoring reality is you. Instead of pretending to be a professional, try listening to them some more, I guess.

Oh boy, here we go with you again. No matter how much you try to make me out to be just a fanboy, it doesn't change the fact that I'm just like one of those game developers in reality. I get paid to do what they are doing. I've had a very long career in what they are involved in. Your stabs won't ever change that, dude, and my "vetted" status will always be true.

Anyway, can you stop talking about what I don't know and SHOW me proof of this from last generation, please? I'm waiting.
 
Last edited:
The 1080 Ti at 1440p with similar settings dips a lot more frequently, and lower, than the PS5.

And in many of these scenes the PS5 is scaling above 1440p and holding closer to 60fps, if not locked most of the time.



The 4K quality mode comparison argument is sort of a bad one. Of course neither could do 4K 60fps, so we're stuck at a locked 30fps on PS5, unable to see how far above that it can go. There's a ton of room for power comparison between 30 and 60.


I find this post to be somewhat disingenuous.

Of all the vids you could post, yours maxes out at 720p. Everything is set to Ultra, which is definitely NOT the case for the PS5 version, and this is a 1080 Ti paired with a CPU from 2013, and his specs say he's still rocking slow-ass DDR3 RAM.

I own a 1080 Ti with a modern CPU (3900X) and proper 3600CL16 DDR4 RAM to go along with it. I can lock a 99% solid 60fps with my 1080 Ti at 1440p no problem, and can manage a bit higher: 4K with resolution scaling set to 0.7 or maybe 0.8, for example, and still hold that 60fps the vast majority of the time.

There is nothing to indicate that the PS5 offers performance above a 2080/1080 Ti (a 4-year-old GPU). And this has been a BEST CASE SCENARIO so far in comparing PS5 to PC performance. Usually the PS5 performs worse than this. It's just that Valhalla is an outlier where RDNA 2 performs better than usual against Nvidia. In the other 90% of cases, Nvidia GPUs perform better.
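To put the resolution figures in this post into raw pixel counts (assuming the resolution-scale slider applies per axis, which is common but engine-dependent):

```python
# Rough pixel-count comparisons for the resolutions discussed above.
# Assumes the render-scale factor applies to each axis independently.
def pixels(width, height, scale=1.0):
    return round(width * scale) * round(height * scale)

native_4k    = pixels(3840, 2160)       # 8,294,400 pixels
native_1440p = pixels(2560, 1440)       # 3,686,400 pixels
scaled_4k    = pixels(3840, 2160, 0.7)  # 2688 x 1512 = 4,064,256 pixels

# Native 4K pushes 2.25x the pixels of 1440p, while 4K at 0.7 scale
# lands only about 10% above native 1440p.
assert native_4k / native_1440p == 2.25
```

So "4K at 0.7 scale, holding 60fps" and "solid 60fps at 1440p" describe GPU loads in the same ballpark.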
 
Last edited:
PS5 is performing better than a 2060S by at least 20% in this title; that was Alex's ballpark before release. He is guessing on the clouds setting, when it's quite clear the PS5 has the best settings there. And he is benchmarking at a flat 1440p on PC, while the PS5 fluctuates from a 1440p minimum up to 2160p depending on GPU load. So, as much as I like the video for what it shows, and he did good work there, the PS5 is still not accurately represented here. It's even more performant than what we see in this video, given the dynamic resolution, one higher setting than PC, and mostly PC's top settings otherwise. Curiously enough, the PS5 was even more performant in the scene where it fell to 51fps prior to the patch.

So overall the PS5's performance is maybe at least on par with a 2080 Ti in rasterization, but this is only a cross-gen game which isn't even using the advantages/strengths of the PS5 per se. So as this gen progresses, more videos like this would be interesting.

My thoughts exactly. In this game, because of the res used, PS5 should be a good match for a 2080 Ti.

In other launch comparisons, the equivalent performance could be a little less. One thing is clear though: we've only touched the surface of next gen console performance.

Hypothetically, if you tried to match Miles Morales' Fidelity mode, 60fps dynamic 4K (take the average res) with its incredible RT implementation, on PC, you'd need an even faster card than a 2080 Ti, because of the RT.
 
My thoughts exactly. In this game, because of the res used, PS5 should be a good match for a 2080 Ti.
Sorry, but no. The PS5 does NOT match a 2080 Ti in this game. What it does is match a 1080 Ti. A bit of a difference there.

And this has been a best-case scenario for the PS5 so far. Usually it performs worse. It's just that in this case Valhalla performs relatively better on RDNA 2, but it's an outlier. The other 90+% of the time, it swings the other way.
 
Last edited:
Name something in the graphics pipeline that hasn't been utilized that will make AC:Valhalla run at native 4k/60FPS instead of dynamic res?

Let's go to Spiderman MM. Name something in the RT architecture that Insomniac hasn't mastered yet, that should get them even better reflections along with RT shadows, GI, and Area lights for the next and last Spiderman of this generation?




How does this translate to better visuals? I'm looking for 8K textures instead of 4K. I'm looking for several shadow-casting light sources instead of 1 like in Demon's Souls. How will that enhance the RT hardware to support more than just 1 RT feature? Will we see more tessellation than has ever been seen before in any game, even on PC? At what FPS will that game run? At what resolution? Where is the RT GI from Crysis Remastered in a game at 4K/30FPS with massive foliage? Will we see that kind of RT in Horizon: Forbidden West, or will we have to wait for their last game of the generation before then?

I'm looking for these enhancements because I know that this is what you guys are trying to imply, despite it never existing last generation at all.

But it did exist last gen even though you claim otherwise. Games later in the gen looked demonstrably better.

It seems you cannot distance yourself from the idea that if a technique is used throughout the generation, then the games ALL LOOK OF SIMILAR quality; this of course is a false notion.

You don't need a difference of 4K vs 8K textures to make a difference. And again, I bring up UE5 which hasn't had any game launch on it yet (because it's not out), and it's already proven to run with visuals that surpass what we have seen for released games.
 

VFXVeteran

Banned
But it did exist last gen even though you claim otherwise. Games later in the gen looked demonstrably better.

Nothing was technically improved as a result of mastering the hardware, dude. Just because a game's art and overall scope change doesn't mean that you can slap a label on that team and say they mastered the hardware. You don't even know anyone there, and they have never made such comments in public.

It seems you cannot distance yourself from the idea that if a technique is used throughout the generation, then the games ALL LOOK OF SIMILAR quality; this of course is a false notion.

I never said that. Go back and reread what I said. A game can look good, but you have nothing to judge it against. You can't compare KZ:SF to Horizon and declare "they mastered the hardware because Horizon looks better than KZ:SF!!" That's completely a gamer's perspective, nowhere near the perspective of a developer who knows what's going on in the graphics engine.

You don't need a difference of 4K vs 8K textures to make a difference. And again, I bring up UE5 which hasn't had any game launch on it yet (because it's not out), and it's already proven to run with visuals that surpass what we have seen for released games.

UE5 is a standard graphics engine for MULTIPLE platforms. You can't use that, because it doesn't help your claim one bit. It's not the case that a company that couldn't master the PS5 suddenly mastered it to make the UE5 demo. They have a demo that can run on ALL platforms. They just used the PS5 to showcase it.
 
Last edited:
Nothing was technically improved as a result of mastering the hardware, dude. Just because a game's art and overall scope change doesn't mean that you can slap a label on that team and say they mastered the hardware. You don't even know anyone there, and they have never made such comments in public.



I never said that. Go back and reread what I said. A game can look good, but you have nothing to judge it against. You can't compare KZ:SF to Horizon and declare "they mastered the hardware because Horizon looks better than KZ:SF!!" That's completely a gamer's perspective, nowhere near the perspective of a developer who knows what's going on in the graphics engine.



UE5 is a standard graphics engine for MULTIPLE platforms. You can't use that, because it doesn't help your claim one bit. It's not the case that a company that couldn't master the PS5 suddenly mastered it to make the UE5 demo. They have a demo that can run on ALL platforms. They just used the PS5 to showcase it.

You would be hard-pressed to find a SINGLE DEV who would suggest that their games do not look better over time as they optimize across the generation.

And just because UE5 runs on multiple platforms DOES NOTHING against my argument. EVEN UE4 games throughout the gen started looking better. Compare UE4 early vs later, there are improvements.
 

VFXVeteran

Banned
You would be hard-pressed to find a SINGLE DEV who would suggest that their games do not look better over time as they optimize across the generation.

Bro. In ONE generation, you MIGHT see 2 games from the same company - at MOST! Typically you get ONE shot for an entire generation to display your talent and release a game.

PS4 - Uncharted 4, TLOU2.. that's it. They are very similar in look and rendering tech. Not the LEAP that you guys are trying to claim.
PS4 - Infamous SS and Spiderman. Nothing too dramatic there. Both using the same engine and you can NOT tell that there was any super optimization code going on that allowed Spiderman to be possible.
PS4 - KZ:SF and Horizon and Death Stranding. Same graphics engine. All of them using the same techniques. HZD had new systems in place like procedural foliage, DTOD, cloth physics, etc.. but NOTHING glaring like the lighting, shading, textures, etc..
PS4 - GoW. Only one shot. One game.

All 3rd-party games will follow the PC architecture and have graphics engines that are portable to multiple platforms, so we'll go with you guys' assumption that they aren't optimized and leave their evolution alone.

There you go: 4 companies, all in one generation. NONE of them have games that prove what you guys are trying to say. Period.

And just because UE5 runs on multiple platforms DOES NOTHING against my argument. EVEN UE4 games throughout the gen started looking better. Compare UE4 early vs later, there are improvements.

UE5 is agnostic of hardware. Their algorithms are PC-centric. I'm not finding any PS5-specific optimized code. Should I post some source code of what I'm working on to prove that too?
 
Last edited:

Gamer79

Predicts the worst decade for Sony starting 2022
I will be honest, I smoke a ton of "natures greens." Can someone put op's post into layman's terms?
 

renx

Member
UE5 is a standard graphics engine for MULTIPLE platforms. You can't use that because that doesn't help your claim one bit. It's not a company that can't master the PS5 and suddenly mastered it to make the UE5 demo. They have a demo that can run on ALL platforms. They just used the PS5 to showcase it.

They cannot port every UE4 game or demo to Nintendo Switch or a cellphone, for a reason. And they cannot port this UE5 demo to just any platform.
The engine is multi-platform. But the software you create with the engine doesn't need to work on every platform.

I don't think they used the PS5 just to showcase it. They used the PS5 because it was the best platform for that showcase.
I have yet to see that flight scene on other systems.


The PS5 is designed to do certain things that are either very hard or impossible to do on other platforms, and people will understand as soon as nextgen exclusives start showing up. And those things have a learning curve for developers. Mastering new techniques is indeed needed.
 
Last edited:

jroc74

Phone reception is more important to me than human rights
Oh, isn't that more capable than the Zen 2 at 3.5 GHz in the PS5?
So why are some ppl only focusing on the 5700 XT when this was done using a CPU way better than what's in the PS5?

Actually, what's the whole PC setup?
I find this post to be somewhat disingenuous.

Of all the vids you could post, yours maxes out at 720p. Everything is set to Ultra, which is definitely NOT the case for the PS5 version, and this is a 1080 Ti paired with a CPU from 2013, and his specs say he's still rocking slow-ass DDR3 RAM.

I own a 1080 Ti with a modern CPU (3900X) and proper 3600CL16 DDR4 RAM to go along with it. I can lock a 99% solid 60fps with my 1080 Ti at 1440p no problem, and can manage a bit higher: 4K with resolution scaling set to 0.7 or maybe 0.8, for example, and still hold that 60fps the vast majority of the time.

There is nothing to indicate that the PS5 offers performance above a 2080/1080 Ti (a 4-year-old GPU). And this has been a BEST CASE SCENARIO so far in comparing PS5 to PC performance. Usually the PS5 performs worse than this. It's just that Valhalla is an outlier where RDNA 2 performs better than usual against Nvidia. In the other 90% of cases, Nvidia GPUs perform better.
Well, this also goes with my question. This article is using a way better CPU, but for some reason ppl are focusing on the 5700 XT results.

How about we see some benchmarks with a 1080 Ti, 2080, and 5700 XT and a CPU close to what's in the PS5 before we start dismissing these results.

Usually with PC benchmarks we can see what the rest of the PC specs were.
 
Last edited:
Bro. In ONE generation, you MIGHT see 2 games from the same company - at MOST! Typically you get ONE shot for an entire generation to display your talent and release a game.

PS4 - Uncharted 4, TLOU2.. that's it. They are very similar in look and rendering tech. Not the LEAP that you guys are trying to claim.
PS4 - Infamous SS and Spiderman. Nothing too dramatic there. Both using the same engine and you can NOT tell that there was any super optimization code going on that allowed Spiderman to be possible.
PS4 - KZ:SF and Horizon and Death Stranding. Same graphics engine. All of them using the same techniques. HZD had new systems in place like procedural foliage, DTOD, cloth physics, etc.. but NOTHING glaring like the lighting, shading, textures, etc..
PS4 - GoW. Only one shot. One game.

All 3rd-party games will follow the PC architecture and have graphics engines that are portable to multiple platforms, so we'll go with you guys' assumption that they aren't optimized and leave their evolution alone.

There you go.. 4 companies all in one generation. NONE of them have games that prove what you guys are trying to say. Period.



UE5 is agnostic of hardware. Their algorithms are PC-centric. I'm not finding any PS5-specific optimized code. Should I post some source code of what I'm working on to prove that too?

I really can't take this seriously. TLOU2 does see noticeable improvements over UC4, but UC4 came out later in the gen.

Spider-Man looks significantly better than Infamous SS. Horizon/Death Stranding look significantly better than KZ:SF.

So I don't agree with any of your points. Are they GENERATIONAL leaps? No. But they are leaps nonetheless.

I also think that the PS4/XB1 era wasn't a revolutionary shift from the gen before it. The major improvements were GPU and RAM, but had a severely limited I/O and CPU.

This gen seems to have big gains in GPU, I/O, and CPU. Devs haven't made next-gen exclusive games yet (mostly). So what we are seeing at launch are games developed largely with last-gen restrictions in mind. That's why things like loading aren't as good as they should be.

So in theory, I think we should be seeing MUCH larger leaps once next-gen exclusive games start releasing. We haven't seen much of that yet.
 

Truespeed

Member
This discussion has no merit. Poor software can run away maxing RAM and CPU, leak memory, etc. So what then is the proverbial point of such a useless conversation? It's misleading and has no basis in any coherent conversation.

Your logic is as flawed as the weak example you've provided. Your inability to comprehend the fact that the AnvilNext engine is maxing out the PS5 hardware is bizarre.

Bugsnax has framerate drops on PS5, oh no PS5 is maxed out!

More proof that you either didn't read what was said or didn't understand it.
 
Last edited:

VFXVeteran

Banned
They cannot port every UE4 game or demo to Nintendo Switch or a cellphone for a reason. And they cannot port this UE5 demo to any platform.

The UE5 demo is using functions that are accessible to any developer. All you have to do is implement it. Since it's written in C++ and is hardware-agnostic, you can port to your heart's desire (this is, of course, dependent on the developer, NOT Epic).

The engine is multi platform. But the software you create with the engine doesn't need to work on every platform.

Well of course. The developer would have to port it. But they are porting *their* algorithm. They are not porting UE Core classes for example:

FString
TArray
TLinkedList
TDoubleLinkedList

etc.

Those data structures are completely independent of platform. They are optimized for any CPU. And guess what? They are used for developing your game.

I don't think they used the PS5 just to showcase it. They used the PS5 because it was the best platform for that showcase.
I have yet to see that flight scene on other systems.

Sure. The situation warranted its use, since they were using the SSD for it. That doesn't mean that, due to the SSD usage, it's impossible for EVERY other platform to run it. You'll see it run on PCs soon enough.


The PS5 is designed to do certain things that are either very hard or impossible to do on other platforms,

Not true. It's a PC with an AMD card and a custom SSD I/O system. It's not hard, and certainly not impossible, to program on. Sorry, but it's not the PS3.
 
Last edited:

VFXVeteran

Banned
I really can't take this seriously. TLOU2 does see noticeable improvements over UC4, but UC4 came out later in the gen.

Spider-Man looks significantly better than Infamous SS. Horizon/Death Stranding look significantly better than KZ:SF.

So I don't agree with any of your points. Are they GENERATIONAL leaps? No. But they are leaps nonetheless.

The simple fact is that you can't pinpoint, technically, what makes those games look better, so your entire premise is based on a subjective opinion that "the game looks better, therefore the developers must have optimized their techniques (without any objective proof) to make it look better".

Sorry, can't carry on the convo anymore bud. Until you can get one of these developers to tell you that they mastered some form of hardware technique that they couldn't in their previous game and they pushed out significantly better performance out of the card lending to better features, your argument is dead in the water.
 

Azurro

Banned
VFXVeteran, man, I don't know why you post so much here. Let me put it in a few single phrases; you can just copy-paste this:

1. Graphics on any platform cannot and will never get better, performance improvements don't exist
2. I'm an EXPERT!
3. PS5 is a mediocre PC shitbox, the Geometry Engine is shit and useless, no developers will use it.
4. PS5's SSD is shit and even if it's not, PC is better anyway at some point in time, for a lot of money.
5. PC is the bestest ever, buy one with the latest card and get on my level you pleb.
6. I'm an ExPeRt!
7. Consoles are shit.
8. There's nothing better to add except ray tracing.
9. Sliders are king, even when they don't affect visual quality all that much but it lets everyone know my PC is the best
10. I'm an ExPeRt!
11. Watchdogs is the greatest graphics benchmark even if it has visuals that are made for 8 year old mid range GPUs and mobile CPUs.
12. The best raytracing is the one that can't reflect anything past 10 meters.
13. I'm an ExPERt!
14. Notice my PC build sempai. Baka!

There you go, you can just put this text in a bash script and set up a cron job to launch this every 20 minutes. Voila! I've just optimised your time spent by hours and hours. You are welcome! ;)
 
Last edited:

Rickyiez

Member
Both graphics engines support the PC (ND engine will soon) so there isn't a fully optimized from scratch graphics engine JUST for a next-gen console. Sorry those days are over.

Days of exclusives are still not over, sadly. I still need a PS5 to play ND games. End of discussion.
 
Last edited:
VFXVeteran, man, I don't know why you post so much here. Let me put it in a few single phrases; you can just copy-paste this:

1. Graphics on any platform cannot and will never get better, performance improvements don't exist
2. I'm an EXPERT!
3. PS5 is a mediocre PC shitbox, the Geometry Engine is shit and useless, no developers will use it.
4. PS5's SSD is shit and even if it's not, PC is better anyway at some point in time, for a lot of money.
5. PC is the bestest ever, buy one with the latest card and get on my level you pleb.
6. I'm an ExPeRt!
7. Consoles are shit.
8. There's nothing better to add except ray tracing.
9. Sliders are king, even when they don't affect visual quality all that much but it lets everyone know my PC is the best
10. I'm an ExPeRt!
11. Watchdogs is the greatest graphics benchmark even if it has visuals that are made for 8 year old mid range GPUs and mobile CPUs.
12. The best raytracing is the one that can't reflect anything past 10 meters.
13. I'm an ExPERt!
14. Notice my PC build sempai. Baka!

There you go, you can just put this text in a bash script and set up a cron job to launch this every 20 minutes. Voila! I've just optimised your time spent by hours and hours. You are welcome! ;)
So since you have absolutely no clue how to counter points, you start to make shit up? Just like the other threads I've seen you in. Do you agree that there is better hardware out there than the PS5? Like, for instance, better processors from AMD and Intel, or better GPUs from AMD or Nvidia? If you agree, then your response shows extreme immaturity and an inability to hold a mature discussion. Just because he knows more about these things doesn't give you the right to downplay his experience, especially as he's schooled you multiple times, in multiple threads already. Keep seething bro.
 
The simple fact is that you can't pinpoint, technically, what makes those games look better, so your entire premise is based on the subjective opinion that "the game looks better, therefore the developers must have optimized their techniques (without any objective proof) to make it look better".

Sorry, can't carry on the convo anymore bud. Until you can get one of these developers to tell you that they mastered some form of hardware technique that they couldn't in their previous game and pushed significantly better performance out of the card, leading to better features, your argument is dead in the water.

You have direct quotes from developers who, in fact, HAVE STATED that their games have gotten better with more optimizations over time. You have a developer AT INSOMNIAC who says they are just scratching the surface of the PS5 and have so much more to go.

Do you know how absurd you sound? Of course it's almost impossible to quantify how things look better across different games. You can clearly see improvements in games like TLOU2 over their predecessors when it comes to foliage density, character models and skin detail, animation blending systems, etc. But you will claim they've ALL been done before. OK, but maybe not at THAT SCALE and with THAT LEVEL of increased detail. But no, I cannot objectively go through and individually count 5,296 leaves on plants in TLOU2 vs. 3,476 leaves on plants in Uncharted 4.

At a certain point you have to: 1) trust your eyes, and 2) assume that REAL developers know what they are talking about. Why you choose to ignore both is beyond me.
 

Brigandier

Member
Been trying to cancel/not install the PS5 Valhalla update after hearing it makes the game run worse, but it managed to install somehow... The post-patch performance on PS5 is dogshit, stutter and jitter galore, and tearing like the Ubi of old all over again... WTF?
 
Been trying to cancel/not install the PS5 Valhalla update after hearing it makes the game run worse, but it managed to install somehow... The post-patch performance on PS5 is dogshit, stutter and jitter galore, and tearing like the Ubi of old all over again... WTF?

It was running a lot better before the patch? How can something like this even happen? I was considering getting AC:V, but now...maybe not
 

Brigandier

Member
It was running a lot better before the patch? How can something like this even happen? I was considering getting AC:V, but now...maybe not

The problems addressed were issues on XSX with bad tearing and stutter. Since that patch, the PS5's performance runs like ass when before it was fine with only minor mishaps. It's quite horrific now, imo; in a busy town, especially when it rains, it's absolute shit.
 
The problems addressed were issues on XSX with bad tearing and stutter. Since that patch, the PS5's performance runs like ass when before it was fine with only minor mishaps. It's quite horrific now, imo; in a busy town, especially when it rains, it's absolute shit.

Yikes, well...guess I'll be staying away from the game then
 

Mobilemofo

Member
The last gen of cards are what people tend to make comparisons with, so it's good to see that the PS5 stands up rather than getting smoked day 1.

But the "Sony wins and no need for PC gaming" crowd is getting a little too lit in here. In a couple of years the PS5 will be getting demolished in these comparisons.
Yes... in a few years. Catch-up.
 

Brofist

Member
VFXVeteran VFXVeteran, man, I don't know why you post so much here, let me put it in a few single phrases, you can just copy-paste this:

1. Graphics on any platform cannot and will never get better, performance improvements don't exist
2. I'm an EXPERT!
3. PS5 is a mediocre PC shitbox, the Geometry Engine is shit and useless, no developers will use it.
4. PS5's SSD is shit and even if it's not, PC is better anyway at some point in time, for a lot of money.
5. PC is the bestest ever, buy one with the latest card and get on my level you pleb.
6. I'm an ExPeRt!
7. Consoles are shit.
8. There's nothing better to add except ray tracing.
9. Sliders are king, even when they don't affect visual quality all that much but it lets everyone know my PC is the best
10. I'm an ExPeRt!
11. Watch Dogs is the greatest graphics benchmark even if it has visuals that are made for 8 year old mid range GPUs and mobile CPUs.
12. The best raytracing is the one that can't reflect anything past 10 meters.
13. I'm an ExPERt!
14. Notice my PC build sempai. Baka!

There you go, you can just put this text in a bash script and set up a cron job to launch this every 20 minutes. Voila! I've just optimised your time spent by hours and hours. You are welcome! ;)

Basically, flip-flop PS5 and PC in your statements, take out the word expert, and you have your own script! Easily done.

Yes... in a few years. Catch-up.

Catch up to what? GPUs from 2-3 years ago have already caught up. I'm talking about beating it thoroughly.
 
Last edited:

Mentat02

Banned
Digital Foundry is starting to become super hacky lately. Did they seriously pick AC Valhalla to compare the PS5 with the RTX 20 series? The game runs like shit on 30-series cards; as a matter of fact, it gets blown out of the water by AMD hardware by at least 20%.
 

Brigandier

Member
Digital Foundry is starting to become super hacky lately. Did they seriously pick AC Valhalla to compare the PS5 with the RTX 20 series? The game runs like shit on 30-series cards; as a matter of fact, it gets blown out of the water by AMD hardware by at least 20%.

I felt they could have tested busy areas and rainy environments to actually show how bad the tearing/stutter and frame drops can be in performance mode post-patch. The game is bound to run fine in a sparsely populated town or fighting 4 or 5 guards.

I'm so agitated by the performance drop that I might drop the game and move on.
 

Shmunter

Member
I felt they could have tested busy areas and rainy environments to actually show how bad the tearing/stutter and frame drops can be in performance mode post-patch. The game is bound to run fine in a sparsely populated town or fighting 4 or 5 guards.

I'm so agitated by the performance drop that I might drop the game and move on.
Do you have the opportunity to put it on YouTube? It may help generate some blowback
 

Shmunter

Member
Your logic is flawed, just like the weak example you've provided. Your inability to comprehend the fact that the AnvilNext engine is maxing out the PS5 hardware is bizarre.



More proof that you either didn't read what was said or didn't understand it.
I understand you've been twirling VFXVet's balls like some kind of exotic stress reducer
 

Romulus

Member
I find this post to be somewhat disingenuous.

Of all the vids you could post, yours maxes out at 720p. Everything is set to Ultra, which is definitely NOT the case for the PS5 version, and this is a 1080ti paired with a CPU from 2013, and his specs say he's still rocking slow-ass DDR3 RAM.

I own a 1080ti with a modern CPU (3900X) and proper 3600cl16 DDR4 RAM to go along with it. I can lock in a 99% solid 60fps with my 1080ti at 1440p no problem, and can manage a bit higher: 4K with resolution scaling set to 0.7 or maybe 0.8, for example, and still hold that 60fps the vast majority of the time.

There is nothing to indicate that the PS5 offers performance above a 2080/1080ti (a 4-year-old GPU). And this has been a BEST CASE SCENARIO so far for comparing PS5 to PC performance. Usually the PS5 performs worse than this. It's just that Valhalla is an outlier where RDNA 2 performs better than average against Nvidia. In the other 90% of cases, Nvidia GPUs perform better.

Your CPU costs as much as a PS5, or at least well over half with a deal. I would hope that a top-tier GPU from a few years ago, paired with it, could outpace a $399 console. Hell, you could probably pair an even less capable GPU with the right CPU and RAM setup and get very close to the PS5 in some benchmarks.

But we're not in an ideal environment to compare optimized console hardware against PC specs. It's the opposite, actually: Covid-19 and the launch window, not to mention that AC and most of the other games have 5-6 versions. Dropping the old PS4 and XB1, and the Pro versions, will allow devs to max out what they have. Basically, we're seeing up-res'd last-gen games right now. We're seeing the absolute worst-case scenario for console hardware, which thrives on optimization. PC is far less reliant on optimization because of the wide array of specs.
 
Last edited:
  • LOL
Reactions: TJC

TJC

Member
Your CPU costs as much as a PS5, or at least well over half with a deal. I would hope that a top-tier GPU from a few years ago, paired with it, could outpace a $399 console. Hell, you could probably pair an even less capable GPU with the right CPU and RAM setup and get very close to the PS5 in some benchmarks.

But we're not in an ideal environment to compare optimized console hardware against PC specs. It's the opposite, actually: Covid-19 and the launch window, not to mention that AC and most of the other games have 5-6 versions. Dropping the old PS4 and XB1, and the Pro versions, will allow devs to max out what they have. Basically, we're seeing up-res'd last-gen games right now. We're seeing the absolute worst-case scenario for console hardware, which thrives on optimization. PC is far less reliant on optimization because of the wide array of specs.
It makes me laugh when PC dudes get insecure about the hardware they have, involving themselves in console talk because they want to justify the ridiculous price they probably paid to match a 399/499 console.
 

Azurro

Banned
So since you have absolutely no clue how to counter points, you start to make shit up? Just like the other threads I've seen you in. Do you agree that there is better hardware out there than the PS5? Like, for instance, better processors from AMD and Intel, or better GPUs from AMD or Nvidia? If you agree, then your response shows extreme immaturity and an inability to hold a mature discussion. Just because he knows more about these things doesn't give you the right to downplay his experience, especially as he's schooled you multiple times, in multiple threads already. Keep seething bro.

You know how in some boss battles there are annoying mobs coming along for the ride that you have to get rid of before you can attack the main boss? That's basically you in every thread, cheerleading for the guy.

No need to get so triggered, buddy, really, I was just having a bit of fun since he posts the same thing over and over. ;)
 

Md Ray

Member
You know how in some boss battles there are annoying mobs coming along for the ride that you have to get rid of before you can attack the main boss? That's basically you in every thread, cheerleading for the guy.

No need to get so triggered, buddy, really, I was just having a bit of fun since he posts the same thing over and over. ;)
LMAO perfect.
 