
Digital Foundry: PS5 vs PC in Assassin's Creed Valhalla

GHG

Gold Member
Why use previous gen cards? Should have benchmarked it with the 3 series.

Makes little difference. In terms of general performance (outside of ray tracing):

  • 3060ti = 2080 super
  • 3070 = 2080ti
Everyone knows about the 3080/3090's performance, and it's pointless comparing those cards to consoles because they are in an entirely different category, both in terms of price and performance.
 

Brofist

Member
Why use previous gen cards? Should have benchmarked it with the 3 series.

The last gen of cards are what people tend to make comparisons with, so it's good to see that the PS5 stands up rather than getting smoked day 1.

But the "Sony wins, no need for PC gaming" crowd is getting a little too lit in here. In a couple of years the PS5 will be getting demolished in these comparisons.
 

Three

Gold Member
I always knew PS5 loved me by rendering more flowers for me.
Look at the light.
It falls just right.
My shadows they please,
Beneath the trees.
But none of these things happen for free.
Yeah, all that you see, rendered by me.

I synthesise and rasterise immaterial things that I fabricate for you.
For you.
Yeah, I tesselate and animate these dancing sprites and sunlit skies for you.
I do it for you.
I’m your GPU.

GPU.
Tell me what to do, and I’ll do it for you.

You’ve never wondered why,
I catch your eye?
It’s ‘cos you overlook,
All the choices I took.
I spend all of my time,
Deceiving and misleading you.
I like to surprise, with my virtual lies.

GPUUUU
https://youtu.be/QVBjiFPMKMM
 
Last edited:

sncvsrtoip

Member
9.6%




For GPU comparisons between consoles and PC (and AMD vs Nvidia), it's still not 18.6%. Also, they compared against the PS5 after patch 1.04, which, if I remember correctly, dropped performance a little in this scene.
 
So the PS5 is performing exactly to its TF? Interesting, I thought those didn't matter.

You mean the theoretical maximums? Neither are performing to that level. Real world performance isn't the same as theoretical maximums. Theoretical maximums are mostly used for marketing purposes.

Are these the PS5 tools?

Both of them have advanced feature sets that haven't been fully utilized yet. Time will tell how effective they really are.
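For context on the theoretical maximums being argued about: peak FP32 throughput falls straight out of CU count and clock speed. A quick sketch using the publicly stated specs (36 CUs at up to 2.23 GHz for PS5, 52 CUs at 1.825 GHz for Series X):

```python
# Peak FP32 TFLOPS for RDNA2: CUs x 64 shaders x 2 ops/cycle (FMA) x clock (GHz).
# These are marketing-style peaks, not real-world throughput.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)          # ~10.28 TF (variable, peak clock)
xsx = tflops(52, 1.825)         # ~12.15 TF (fixed clock)
gap = (xsx - ps5) / ps5 * 100   # ~18% gap on paper
print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF, paper gap {gap:.1f}%")
```

That paper gap of ~18% is the figure people keep quoting upthread; the smaller deltas showing up in actual benchmarks are exactly why peak numbers and shipped-game performance aren't the same thing.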
 
Last edited:

Zug

Member
Ubisoft PC ports always had shit optimization. They never even tried. Just click on "Build for PC/x64", easy money.
 

Rea

Member
I was laughing when I saw a post where someone said "PC gaming is not held back by consoles, yada yada yada." Maybe they forgot about the stories of Watch Dogs 1 and The Witcher 3, etc. Demo versus final retail version. If those games were PC exclusives, they wouldn't have been downgraded. Simple as that.
 
  • Like
Reactions: GHG
It really is exciting to finally get consoles that are legitimately capable of delivering high-quality graphics. Too bad so much of that power is being wasted on the meme that is 4K. I wish consoles pushed for 1440p/144fps instead and saved 4K for next gen, when even low-tier PCs will have no problem getting to 4K at 60-100fps.
 
Performance at 1440p very high settings on 2060S and Ryzen 3600.

Timestamped at 15:30



It goes from a 60fps average to 50fps in the span of 2 minutes, and it drops from 60 to 30 in certain areas. The saddest part is that there is no visual difference between the very high and ultra settings in this game.

Ubisoft optimization favouring AMD hardware is one thing but the performance being this bad on Nvidia cards is simply unacceptable.
 
Last edited:
Sony fans Cherry picking results to prove falsehoods? What is going on in this thread?

2080 level in this game. 2060 in watchdogs. Average is 2070
/Thread
They’re just getting a little - stupid - revenge for all the “most powerful console” talk before launch.

The fact they’re using a poorly optimized game - and a single one for that matter - to prove ps5’s secret sauce is just laughable.

It will be interesting to see games like Doom Eternal ported to ps5/series x to see how they compare against pc.

PS5 is 10TF and is performing as such. The strange thing is Nvidia doing so badly in this game, and Xbox doing badly too.
 

Venom Snake

Gold Member
Don't see why that's an issue. It'll be interesting to see how close or how far behind the SX and PS5 are compared to the PC's new tech too.


The 3080 + i9 10900k at 1440p/maxed out seems to jump between 70-80fps for most of the duration of this test.



Ryzen 5 5600X + RTX 3070, maxed out at 1440p. It seems to hover around 60fps in this particular test (more often above that value).


But these are tests from a month ago; maybe something has changed since then.
 
Last edited:

Sygma

Member
Why wouldn't they?

A 2080Ti is already outperforming the PS5, and that GPU looks like a mid-tier GPU compared to a 3080, much less a 3090.


It's hardly mid tier lol
 

Darius87

Member


Ryzen 5 5600X + RTX 3070, maxed out at 1440p. It seems to hover around 60fps in this particular test (more often above that value).


But these are tests from a month ago; maybe something has changed since then.

Looking at this video, the PS5 is around a 3070 in raster, without any of the hardware advantages the PS5 has. Though it's unclear if DF's comparison with the 2080 was fair, because it compared 1440p against dynamic res, and the CPUs weren't matched spec for spec.
 

Hendrick's

If only my penis was as big as my GamerScore!
This test confirms that the PS5 is performing in line with its TF count, which is what we expected. So now we just need to understand why the SX is performing well under its count. Show me those tools, MS!
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Why use previous gen cards? Should have benchmarked it with the 3 series.

My guess is that Alex Battaglia simply didn't own all of these 30-series cards, so he used the cards that he did own. Let's not forget that cards in the 30 series aren't exactly easy to get, and you'd need to spend around $3000 to buy all of them (3060 Ti, 3070, 3080 and 3090).
 

Venom Snake

Gold Member
Looking at this video, the PS5 is around a 3070 in raster, without any of the hardware advantages the PS5 has. Though it's unclear if DF's comparison with the 2080 was fair, because it compared 1440p against dynamic res, and the CPUs weren't matched spec for spec.
Of course, we should take into account different testing environments, different locations, circumstances, and the very fact that in this particular game, Nvidia cards are doing - as Alex said - below the typical average.

Still, it's always good to see ps5 doing well, especially since so many people had serious doubts about it.
I personally do not intend to overestimate or underestimate the capabilities of the next-gen consoles, as it has always been quite a bumpy road with this type of hardware. Nevertheless, I am very pleased with the results. (y)
 

Topher

Gold Member
This test confirms that the PS5 is performing in line with its TF count, which is what we expected. So now we just need to understand why the SX is performing well under its count. Show me those tools, MS!

Does that also confirm then that if/when XSX devs get their new tools we will only see a 17% increase at most in performance over PS5?

Edit: You seem to use the "triggered" reaction when you don't want to answer a question. Noted.
 
Last edited:

Hendrick's

If only my penis was as big as my GamerScore!
Does that also confirm then that if/when XSX devs get their new tools we will only see a 17% increase at most in performance over PS5?

Edit: You seem to use the "triggered" reaction when you don't want to answer a question. Noted.
Was that a legitimate question? If it was, then yes, I would expect at most 10-15%.
 

Darius87

Member
Of course, we should take into account different testing environments, different locations, circumstances, and the very fact that in this particular game, Nvidia cards are doing - as Alex said - below the typical average.
That's a typical excuse from him. I wonder if he does the same when AMD cards do worse than Nvidia's?

Still, it's always good to see ps5 doing well, especially since so many people had serious doubts about it.
I personally do not intend to overestimate or underestimate the capabilities of the next-gen consoles, as it has always been quite a bumpy road with this type of hardware. Nevertheless, I am very pleased with the results. (y)
Way back, Alex said on the forums that the PS5 == 2060 performance; now he says with a straight face that it's a 2080. And these people will be comparing PC vs PS5 vs XSX for the next 5-7 years.
 
Last edited:

Topher

Gold Member
Was that a legitimate question? If it was, then yes, I would expect at most 10-15%.

Not sure why that question was controversial. I agree with your answer (assuming this is all about tools). I've read others make claims that the gains would be much higher and was curious to read your opinion.
 
Last edited:
Was that a legitimate question? If it was, then yes, I would expect at most 10-15%.

I'm wondering if that's just across the board.

It could be that depending on the situation either the PS5 will be ahead or the XSX. Both systems do seem to have different strengths and weaknesses and since they are extremely close I can see comparisons going either way.

Definitely not an Xbox vs PS2 situation where the Xbox wins all the multiplatform comparisons hands down.
 

Hendrick's

If only my penis was as big as my GamerScore!
I'm wondering if that's just across the board.

It could be that depending on the situation either the PS5 will be ahead or the XSX. Both systems do seem to have different strengths and weaknesses and since they are extremely close I can see comparisons going either way.

Definitely not an Xbox vs PS2 situation where the Xbox wins all the multiplatform comparisons hands down.
There will always be exceptions of course. They are very close in design though, so barring any actual issue with the Xbox hardware, which I doubt, for resolution and performance, the Series X should almost always have a slight advantage.
 
There will always be exceptions of course. They are very close in design though, so barring any actual issue with the Xbox hardware, which I doubt, for resolution and performance, the Series X should almost always have a slight advantage.

So far the comparisons make me believe that the advantages are situational. Haven't really seen a case yet where one system always has the advantage compared to the other. It could be like this for the rest of the generation since the two are extremely close. No DD (Dealer Difference) will become a thing with either of the two platforms. Certainly not like the past where you can have complete confidence in buying the superior version of a game without looking at comparisons.
 

Hendrick's

If only my penis was as big as my GamerScore!
So far the comparisons make me believe that the advantages are situational. Haven't really seen a case yet where one system always has the advantage compared to the other. It could be like this for the rest of the generation since the two are extremely close. No DD (Dealer Difference) will become a thing with either of the two platforms. Certainly not like the past where you can have complete confidence in buying the superior version of a game without looking at comparisons.
Like I said, if there is an issue or an actual "bottleneck" in the Series X then that is possible. The PS5 does have its advantages, but they really don't pertain to raw resolution and performance. At least with what we know, the Xbox SX should be performing better. Like I said, we will have to wait and see why the SX is "punching below its weight".
 
Last edited:
Like I said, if there is an issue or an actual "bottleneck" in the Series X then that is possible. The PS5 does have its advantages, but they really don't pertain to raw resolution and performance. At least with what we know, the Xbox SX should be performing better. Like I said, we will have to wait and see why the SX is "punching below its weight".

I guess you're right. Time will tell if it's an actual bottleneck that's stopping the XSX from performing better or it's just tools.

The only possible bottlenecks I can think of are the RAM configuration or insufficient cache to feed those CUs.
 
Last edited:

Venom Snake

Gold Member
That's a typical excuse from him. I wonder if he does the same when AMD cards do worse than Nvidia's?


Way back, Alex said on the forums that the PS5 == 2060 performance; now he says with a straight face that it's a 2080. And these people will be comparing PC vs PS5 vs XSX for the next 5-7 years.
My guess is that the performance of the new consoles will fluctuate between the two cards you mentioned, depending on whether ray tracing will be used and how it will be used. We have to take into account that this is a new feature on consoles (and graphics cards from AMD in general) and it looks like it is very expensive on this type of hardware.
These initial comparisons do not give us the full picture yet, we will find out whose speculations were more accurate in the near future. So far we have nothing to complain about, really.

Personally, I prefer to keep my expectations at a reasonable level, it is always better to be pleasantly surprised than unpleasantly disappointed. :messenger_winking:
 

Truespeed

Member
A new engine delivering "more" on the same hardware means results weren't maxed on the previous engine, though?!?

What does "maxed out" hardware even mean in a technological environment where results are dictated by the marriage of hardware and software? It seems like a disingenuous proposal, given that techniques evolve and better use of a platform is achieved over time. Either that or a gross misunderstanding.

It just means that the hardware is maxed out relative to the software that's pushing it. There's nothing more the hardware can do, given the way it's being utilized, to make the game run faster or at higher resolutions. If that weren't the case, the engine wouldn't have to constantly lower the resolution to maintain its framerate. That's not to say that AnvilNext is representative of the full potential of the PS5 hardware, but rather of the full potential of the current iteration of the AnvilNext engine on the PS5.
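That constant resolution-lowering is a feedback loop: measure the frame time, then shrink or grow the render scale to stay inside the frame budget. A toy sketch of the idea (a hypothetical controller, not Ubisoft's actual implementation):

```python
# Toy dynamic-resolution controller: shed render scale when a frame
# blows the budget, creep back up when there is headroom.
def adjust_scale(scale: float, frame_ms: float, budget_ms: float = 16.7,
                 lo: float = 0.5, hi: float = 1.0) -> float:
    if frame_ms > budget_ms:
        scale *= budget_ms / frame_ms   # over budget: proportional drop
    else:
        scale *= 1.02                   # headroom: slow recovery
    return max(lo, min(hi, scale))      # clamp to the engine's scale range

scale = 1.0
for frame_ms in (18.0, 20.0, 15.0, 14.0):   # a rough patch, then recovery
    scale = adjust_scale(scale, frame_ms)
    print(f"{frame_ms:.0f} ms -> render scale {scale:.2f}")
```

"Maxed out" in the sense above just means a controller like this spends most of its time pinned at the low end of its range with nothing left to give.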
 
Last edited:

onQ123

Member
Like I said, if there is an issue or an actual "bottleneck" in the Series X then that is possible. The PS5 does have its advantages, but they really don't pertain to raw resolution and performance. At least with what we know, the Xbox SX should be performing better. Like I said, we will have to wait and see why the SX is "punching below its weight".

You have your answer, you just don't want to accept it.

 

Hendrick's

If only my penis was as big as my GamerScore!
You have your answer, you just don't want to accept it.

Yet here we are in a thread where there is actual evidence that the PS5 is performing exactly like its AMD off-the-shelf equivalent. Sorry, but your desire to take an early victory lap is foolish.
 

thelastword

Banned
The RT performance of RDNA2 is trash. Even the RX 6800 loses to the RTX 2060S in games that have path tracing like Minecraft RT.
I think the consoles may perform like the 2060S in current RT games, but the PS5's rasterization performance is much superior to the 2060S. We are looking at 3070 levels and above. With the PS5's low-level API, its geometry engine and cache scrubbers in play, it may even push to 3080 levels, but we will only see that in third-party games that push the PS5's customized hardware with a new-gen game, from someone like DICE.

Having said that, RT performance is not something we can judge RDNA2 cards on yet; most of the RT titles are Nvidia-focused, so we should see how the new crop of games fares on RDNA2 RT. I think it will be much more competitive than the games built primarily for Nvidia's proprietary RT.
It's poetic justice that the engine made hand-in-hand with Nvidia for their older cards (remember, this was done in cross-gen times, when they put Black Flag out on PS3) now comes back to bite them, because they went the old AMD route with high core count/heavy compute while AMD went classic Nvidia. And of course this time the new engine will be made with AMD from the beginning (it's going to go the same way it went with GTA5 vs RDR2). Expect to see a lot of Ampere owners cry their hearts out when that hits in 2 years, and of course that magic 2-year mark is also when Nvidia stops giving a fuck about you because they have a new arch out, while RDNA2 is going to see games/engines built around it for the next 7+ years.

I've told Nvidia fans from the outset to prepare to lose quite a bit to AMD in game performance when these consoles launched. The consoles are the bread and butter for these devs, and both consoles have RDNA2 GPUs. The majority of games will be developed with that architecture in mind because of the consoles, so the RDNA2 PC GPUs will gain the most from this and will continue to be more performant in rasterization vs Nvidia cards at more than twice the price. RT on RDNA2 will improve too; devs are just getting to grips with it in Vulkan and DXR 1.1. Most games and their RT will be based on the consoles as well, which will directly benefit PC RDNA2 GPUs.
If 2 years from now AMD has good RT performance and a working DLSS alternative, I will jump to RDNA3. I have no brand loyalty; I just buy the product that is the best in my opinion, and right now that (still) is Nvidia.
AI reconstruction failed, no matter how much they tout DLSS. DLSS 2.0 had to go back for reconstruction because the image quality was awful. DLSS 2.0 is simply a guesstimated image from a 16K source, but I don't even trust Nvidia's numbers. It's not perfect, and there are many inconsistencies and missing details in background elements.

I think a hardware-based solution like an improved checkerboard solution will prove much superior, like CB 2.0 or Super Resolution; there will be no need for AI, it will work on any card, and it will be artifact-free. I can easily see AMD building Super Resolution on the ID buffer foundation of the PS4 Pro, since that cleaned the image with varying levels of TAA, which could be tweaked manually by devs. SR will easily outstrip DLSS; it will be easy to implement, and unlike the slow drip of DLSS titles, it will work on all games at the driver level, so having access to a more performant option will be universal. Devs won't have to work with AMD or connect to their servers to implement it. Obviously, the more powerful your RDNA2 GPU, the better, and you will be able to scale up to higher resolutions, like 4096x2160 scaled to 8K, or even 16K CB/SR downscaled to 4K for uber and pristine supersampling. Checkerboard rendering had several quality levels, which is why playing GOW on a more performant PS5 defaults to the highest quality and rez, giving an even more impressive picture than the PS4 Pro.
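For anyone unfamiliar with the basic idea behind checkerboarding: the GPU shades only half the pixels each frame in a checker pattern and reconstructs the rest from their neighbours (real CBR, as on PS4 Pro, also reprojects the previous frame and uses the ID buffer to pick better samples). A deliberately naive, spatial-only sketch of the fill step:

```python
# Naive checkerboard reconstruction: pixels where (x + y) is odd were
# not rendered; fill each from the average of its horizontal neighbours.
# Real CBR also reprojects the previous frame and consults an ID buffer.
def checkerboard_fill(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:  # a "missing" pixel
                left = img[y][x - 1] if x > 0 else img[y][x + 1]
                right = img[y][x + 1] if x < w - 1 else img[y][x - 1]
                out[y][x] = (left + right) / 2
    return out

# 2x4 toy frame of brightness values; zeros mark unrendered checker positions
frame = [[10, 0, 30, 0],
         [0, 20, 0, 40]]
filled = checkerboard_fill(frame)
```

ID-buffer-assisted CBR (and any "Super Resolution" built on top of it) is essentially about choosing smarter sources for those missing pixels than this blind average, which is where the artifact reduction comes from.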
 

LordOfChaos

Member
Even with the 3060 Ti bringing that performance to 400 dollars, a 400-dollar console matching a 400-dollar GPU at launch is a vast improvement over the 8th gen, where the PS4 was only matching ~250-dollar GPUs, and the XBO fared even worse.
 

onQ123

Member
Yet here we are in a thread where there is actual evidence that the PS5 is performing exactly like its AMD off-the-shelf equivalent. Sorry, but your desire to take an early victory lap is foolish.

This post has nothing to do with the PS5. You asked why the Xbox Series X was punching below its weight, and I gave you the answer that Microsoft gave us.
 
I think the consoles may perform like the 2060S in current RT games, but the PS5's rasterization performance is much superior to the 2060S. We are looking at 3070 levels and above. With the PS5's low-level API, its geometry engine and cache scrubbers in play, it may even push to 3080 levels, but we will only see that in third-party games that push the PS5's customized hardware with a new-gen game, from someone like DICE.
Hahahaha
Good one
 

VFXVeteran

Banned
Yes, this is very important. A game fully developed from scratch and optimized exclusively (by a competent developer) for these next-gen consoles will look out of this world. I can't wait for the next Naughty Dog or UE5 game.

Both graphics engines support the PC (the ND engine will soon), so there isn't a graphics engine fully optimized from scratch JUST for a next-gen console. Sorry, those days are over.
 

VFXVeteran

Banned
I was laughing when I saw a post where someone said "PC gaming is not held back by consoles, yada yada yada." Maybe they forgot about the stories of Watch Dogs 1 and The Witcher 3, etc. Demo versus final retail version. If those games were PC exclusives, they wouldn't have been downgraded. Simple as that.

OR - their graphics engines needed work from the beginning...there's that.
 
  • Thoughtful
Reactions: Rea

Romulus

Member
Valhalla is one of the few titles where RDNA 2 performs better than average vs Nvidia. So naturally the PS5 also enjoys that relative performance boost so this is a best case scenario when comparing the PS5 to PC GPUs.

In this game the PS5 performs at basically 1080ti level, a 4 year old GPU.

Also, I'm guessing he started making this video before the PS5 got a "Quality mode" patch. This made the comparisons much more difficult as the PC doesn't have exactly the same kind of resolution scaling.

The Quality mode makes the comparison much easier and more straightforward. The PS5 should be pretty much highest settings, at a full 4K and a locked 30fps. A 1080ti can also do this very well.


The 1080ti at 1440p with similar settings dips a lot more frequently, and lower, than the PS5.

And in many of these scenes the PS5 is scaling above 1440p and holding closer to 60fps, if not locked most of the time.



The 4K quality mode comparison argument is sort of a bad one. Of course neither could do 4K 60fps, so we're stuck at a locked 30fps on PS5, unable to see how far above that it can go. There's a ton of room for power comparison between 30 and 60.
 
Last edited:

bargeparty

Member
I was laughing when I saw a post where someone said "PC gaming is not held back by consoles, yada yada yada." Maybe they forgot about the stories of Watch Dogs 1 and The Witcher 3, etc. Demo versus final retail version. If those games were PC exclusives, they wouldn't have been downgraded. Simple as that.

Bioshock Infinite's scale was reduced because of the consoles.
 
  • Like
Reactions: Rea