Why use previous gen cards? Should have benchmarked it with the 30 series.
Look at the light. I always knew the PS5 loved me by rendering more flowers for me.
> 9.6%

Still not 18.6%. Also, they compared against the PS5 after patch 1.04, which, if I remember correctly, dropped performance a little in this scene.
For GPU comparisons between consoles and PC (and AMD vs Nvidia),
So the PS5 is performing exactly to its TF? Interesting, I thought those didn't matter.
Are these the PS5 tools?
> Do you think DF just underestimated the PS5's grunt? They suggested 2070 level power, 2080 for the Series X.

Yes, they absolutely did, as did 90% of gamers and media!
Because the PC's most recent high-end cards are always a generation ahead, sometimes even more.
> Sony fans cherry-picking results to prove falsehoods? What is going on in this thread?

They're just getting a little - stupid - revenge for all the "most powerful console" talk before launch.
2080 level in this game, 2060 in Watch Dogs. Average is 2070.
/Thread
Don't see why that's an issue. It'll be interesting to see how close or how far behind the SX and PS5 are compared to the PC's new tech too.
Why wouldn't they?
A 2080 Ti is already outperforming the PS5, and that GPU looks like a mid-tier GPU compared to a 3080, much less a 3090.
Ryzen 5 5600X + RTX 3070, maxed out, 1440p. It seems to hover around 60fps in this particular test (more often above this value).
But these are tests from a month ago; maybe something has changed since then.
> Looking at this video, the PS5 is around a 3070 in raster, without any of the hardware advantages the PS5 has. Though it's unclear if DF's comparison with the 2080 was fair, since it compared 1440p with dynamic res, and the CPU was not matched spec for spec?

Of course, we should take into account different testing environments, different locations, circumstances, and the very fact that in this particular game Nvidia cards are doing, as Alex said, below their typical average.
This test confirms that the PS5 is performing in line with its TF count, which is what we expected. So now we just need to understand why the SX is performing well under its count. Show me those tools, MS!
> Does that also confirm, then, that if/when XSX devs get their new tools we will only see a 17% increase at most in performance over PS5?

I doubt it will be that high.
> Does that also confirm, then, that if/when XSX devs get their new tools we will only see a 17% increase at most in performance over PS5?

Was that a legitimate question? If it was, then yes, I would expect at most 10-15%.
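Since the thread keeps circling around the exact size of the TFLOPS gap, here is a quick back-of-the-envelope check using the publicly quoted specs (a rough sketch; the PS5's clock is variable, so its figure is a peak):

```python
# Back-of-envelope FP32 TFLOPS from publicly quoted console specs.
# TFLOPS = CUs * 64 shader lanes * 2 ops/clock (FMA) * clock in GHz / 1000

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # variable clock, quoted peak
xsx = tflops(52, 1.825)   # fixed clock

gap = (xsx / ps5 - 1) * 100
print(f"PS5 ~{ps5:.2f} TF, Series X ~{xsx:.2f} TF, gap ~{gap:.1f}%")
```

That works out to roughly 18.2%, which is why both the "17%" and "18.6%" figures floating around the thread are in the right ballpark.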
Edit: You seem to use the "triggered" reaction when you don't want to answer a question. Noted.
> Of course, we should take into account different testing environments, different locations, circumstances, and the very fact that in this particular game, Nvidia cards are doing - as Alex said - below the typical average.

That's a typical excuse from him. I wonder if he says the same when AMD cards do worse than Nvidia's?
> Still, it's always good to see the PS5 doing well, especially since so many people had serious doubts about it.

Way back, Alex said on the forums that the PS5 is == 2060 performance; now he says with a straight face that it's a 2080. And these people will compare PC vs PS5 vs XSX for the next 5-7 years.
I personally do not intend to overestimate or underestimate the capabilities of the next-gen consoles, as it has always been quite a bumpy road with this type of hardware. Nevertheless, I am very pleased with the results.
> Not sure why that question was controversial. I agree with your answer. I've read others make claims that the gains would be much higher and was curious to read your opinion.

Maybe I misunderstood your intent. Apologies.
> I'm wondering if that's just across the board.

There will always be exceptions, of course. They are very close in design though, so barring any actual issue with the Xbox hardware, which I doubt, for resolution and performance the Series X should almost always have a slight advantage.
It could be that depending on the situation either the PS5 will be ahead or the XSX. Both systems do seem to have different strengths and weaknesses and since they are extremely close I can see comparisons going either way.
Definitely not an Xbox vs PS2 situation where the Xbox wins all the multiplatform comparisons hands down.
> So far the comparisons make me believe that the advantages are situational. I haven't really seen a case yet where one system always has the advantage over the other. It could be like this for the rest of the generation, since the two are extremely close. No DD (Dealer Difference) will become a thing with either of the two platforms. Certainly not like the past, where you could have complete confidence in buying the superior version of a game without looking at comparisons.

Like I said, if there is an issue or actual "bottleneck" in the Series X then that is possible. The PS5 does have its advantages, but they really don't pertain to raw resolution and performance. At least with what we know, the Xbox SX should be performing better. Like I said, we will have to wait to see why the SX is "punching below its weight".
> That's a typical excuse from him. I wonder if he says the same when AMD cards do worse than Nvidia's?

My guess is that the performance of the new consoles will fluctuate between the two cards you mentioned, depending on whether ray tracing is used and how. We have to take into account that this is a new feature on consoles (and on AMD graphics cards in general), and it looks like it is very expensive on this type of hardware.
A new engine delivering 'more' on the same hardware means results weren't maxed on the previous engine though?!?
What does 'maxed out' hardware even mean in a technological environment where results are dictated by the marriage of hardware and software? It seems like a disingenuous proposal when techniques evolve and better use of a platform is achieved over time. Either that or a gross misunderstanding.
> You have your answer, you just don't want to accept it.

Yet here we are in a thread where there is actual evidence that the PS5 is performing exactly like its off-the-shelf AMD equivalent. Sorry, but your desire to take an early victory lap is foolish.
If Microsoft don't use just the TFLOPS to compare the Xbox Series X GPU to the Xbox One & Xbox One X, why is everyone else using it to compare it to the PS5?
They only claim a 4-6X raw GPU advantage over the Xbox One while having a 9X TFLOPS advantage. The triangle rate is the part that brings it down to 4-6X the raw GPU output, and the PS5 actually has the advantage in triangle & pixel fill rate, so why do people keep using the Xbox Series X GPU...www.neogaf.com
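The triangle and pixel-rate advantage claimed above falls out of clock speed rather than CU count. A sketch, assuming the commonly cited RDNA 2 figures of 4 rasterized triangles per clock and 64 ROPs for both GPUs (an assumption here, not an official spec sheet):

```python
# Peak geometry (Gtris/s) and pixel fill rate (Gpixels/s) scale with
# clock frequency, which is where the PS5's advantage would come from.

def peak_rates(clock_ghz: float, tris_per_clock: int = 4, rops: int = 64):
    return tris_per_clock * clock_ghz, rops * clock_ghz

ps5_tri, ps5_fill = peak_rates(2.23)    # higher clock, fewer CUs
xsx_tri, xsx_fill = peak_rates(1.825)   # more CUs, lower clock

print(f"PS5: {ps5_tri:.2f} Gtris/s, {ps5_fill:.1f} Gpix/s")
print(f"XSX: {xsx_tri:.2f} Gtris/s, {xsx_fill:.1f} Gpix/s")
```

Under those assumptions, the PS5 comes out ahead on both peak rates despite the Series X's compute lead, which is the point the post is making.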
> The RT performance of RDNA2 is trash. Even the RX 6800 loses to the RTX 2060S in games that have path tracing, like Minecraft RT.

I think the consoles may perform like the 2060S in current RT games, but the PS5's rasterization performance is much superior to the 2060S. We are looking at 3070 levels and above. With the PS5's low-level API, its geometry engine, and cache scrubbers in play, it may even push to 3080 levels, but we will only see that in third-party games pushing the PS5's customized hardware with a new-gen game, like DICE.
> It's poetic justice that the engine made hand-in-hand with Nvidia for their older cards (remember, this was done in the cross-gen period when they put Black Flag out on PS3) now comes back to bite them in the butt, because they went the old AMD route of high core count/heavy compute while AMD went classic Nvidia. And of course this time the new engine will be made with AMD from the beginning (it's going to go the same way it went with GTA5 vs RDR2). Expect to see a lot of Ampere owners cry their hearts out when that hits in 2 years, and of course that magic 2-year mark is also when Nvidia stops giving a fuck about you because they have a new arch out, meanwhile RDNA2 is going to see games/engines built around it for the next 7+ years.

I've told Nvidia fans from the outset to prepare to lose quite a bit to AMD in game performance when these consoles launch. The consoles are the bread and butter for these devs, and both consoles have RDNA2 GPUs. The majority of games will be developed with that architecture in mind because of the consoles, so RDNA2 PC GPUs will gain the most from this and will continue to be more performant in rasterization versus Nvidia cards more than twice the price. RT on RDNA2 will improve too; devs are just getting to grips with it in Vulkan and DXR 1.1. Most games and their RT will be based on the consoles as well, which will directly benefit PC RDNA2 GPUs.
> If 2 years from now AMD has good RT performance and a working DLSS alternative, I will jump to RDNA3. I have no brand loyalty, I just buy the product that is the best in my opinion, and Nvidia (still) is right now.

AI reconstruction failed, no matter how much they tout DLSS. DLSS 2.0 went back to Thailand to reconstruct because the image quality was awful. DLSS 2.0 is simply a guesstimated image from a 16K source, and I don't even trust Nvidia's numbers. It's not perfect, and there are many inconsistencies and missing details in background tasks.
Yet here we are in a thread where there is actual evidence that the PS5 is performing exactly as it's AMD off the shelf equivalent. Sorry, but your desire to take an early victory lap is foolish.
So the XSX is maxed out day one too? Or is that a special case?
> I think the consoles may perform like the 2060S in current RT games, but the PS5's rasterization performance is much superior to the 2060s. We are looking at 3070 levels and above. With PS5's low level api, it's geometry engine and cache scrubbers in play, it may even push to 3080 levels, but we will only see that on third party games that are pushing the customized hardware of the PS5 with a new gen game, like DICE.

Hahahaha
Yes, this is very important. A game fully developed from scratch and optimized exclusively (by a competent developer) for these next-gen consoles will look out of this world. I can't wait for the next Naughty Dog or UE5 game.
I was laughing when I saw a post where someone said "PC gaming is not held back by consoles, yada yada yada." Maybe they forgot the story of Watch Dogs 1 and The Witcher 3, etc.: demo versus final retail version. If those games were PC exclusives, they wouldn't have been downgraded. Simple as that.
Valhalla is one of the few titles where RDNA 2 performs better than average vs Nvidia. So naturally the PS5 also enjoys that relative performance boost so this is a best case scenario when comparing the PS5 to PC GPUs.
In this game the PS5 performs at basically 1080 Ti level, a 4-year-old GPU.
Also, I'm guessing he started making this video before the PS5 got a "Quality mode" patch. This made the comparisons much more difficult as the PC doesn't have exactly the same kind of resolution scaling.
The Quality mode makes the comparison much easier and more straightforward. The PS5 should be pretty much highest settings at a full 4K and a locked 30fps. A 1080 Ti can also do this very well.