
Digital Foundry: PS5 vs PC in Assassin's Creed Valhalla

Yeah, but it shouldn't be THAT bad. But Ubiport....

Japanese dev ports are even worse.

Yeah, I was never in that game. I held on to my Intel i7 920 for 9 years.

I have a GTX 1070 that I was going to upgrade to a 2080S for Half-Life: Alyx, but the word at the time was that the 3080 was going to launch in early summer. As I waited, it got pushed back more and more, and now it looks like it's all but impossible to get one until April. I would have been better served just getting a 2080S back in January. Even worse, my little brother's PC is stuck on an Intel i5 3570K and is in desperate need of an upgrade. Unfortunately, the new AMD CPUs are also impossible to come by. With GPU availability scarce and prices so ridiculously inflated, it just feels less and less worth it, especially when the bigger PC exclusives are fewer and fewer these days. Not to mention the PS5's insane loading speeds. For multiplat gaming, I'm feeling like the PS5 is going to take over from PC for me, while I keep my PC for CS:GO and the occasional PC games that don't require a lot of power.

The PS5 is great value, no doubt about it. Hopefully the 1440p patch comes soon for folks who like to game on monitors from their gaming chairs. I know once you get used to either the couch or the chair, switching habits takes an adjustment period. You'll be able to land a good card for cheap... just gotta wait somewhat longer (a year or so) and stay alert. The PS5 will be more than enough for the time being and then some.
 

VFXVeteran

Banned
No, it wasn't. I don't think they had the time and resources to use the PS5's new GPU features like primitive shaders. They certainly did not use the console's RT hardware, as there are no RT effects in this game. Those 16 CPU threads likely remain largely unused as well.

Dude, listen to what I'm saying. If the game could only render at 4K/60 FPS with dynamic resolution, then it was bandwidth limited - and therefore, maxed out. Period. It could barely hold 60 FPS consistently no matter what content was thrown at it, as shown in the videos. Primitive shader usage (for what, exactly, in this game?) would bottleneck the GPU further. RT would make the game not even run at a reasonable FPS.

Now I get what you guys are thinking. You think that the PS5 has to use ALL of its features ALL at the same time, like a high-end PC GPU does, in order for it to be fully taken advantage of. AC: Valhalla maxed out the next-gen consoles within the budget of what they wanted to obtain (4K/60 FPS dynamic). There was nothing else they could have added for free.

Nah... that's NOT how a developer thinks, sorry to say. We think like this: how much can I get away with while staying within the FPS budget? Primitive shader usage will come from an algorithm using it in a general-purpose graphics engine, NOT simply because the hardware has the feature (which, btw, Nvidia has had for a while now). A rough sketch of that budget-driven loop is below.
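To make that frame-budget mindset concrete, here's a minimal sketch of the kind of dynamic resolution controller a 60 FPS game might run every frame. All names and thresholds are illustrative assumptions, not Ubisoft's actual code:

```python
# Toy dynamic resolution scaler: keep GPU frame time inside a 60 FPS budget
# by trading render resolution. Names and thresholds are assumptions for
# illustration only.

TARGET_MS = 1000.0 / 60.0        # 16.6 ms budget per frame
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp between 60% and 100% per axis

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Nudge the per-axis resolution scale so GPU time converges on budget."""
    headroom = TARGET_MS / gpu_frame_ms   # >1 means time to spare
    # Pixel count (and roughly GPU cost) scales with area, so each axis
    # moves by the square root of the headroom.
    scale *= headroom ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy 20 ms frame at native res drops the scale for the next frame.
scale = update_render_scale(1.0, gpu_frame_ms=20.0)
print(f"next frame: {scale:.2f}x per axis, {scale * scale:.0%} of native pixels")
```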
 
Good. If it was a huge gap, it wouldn't bode well for PC gamers. Console versions of games frequently dictate the ceiling for the PC version, unfortunately. Developers often don't put much effort into PC multiplats beyond what's expected (resolution, better AA, sometimes extended draw distance, etc.), and understandably so.
 

Hezekiah

Banned
Because they come up with false claims like: the PS5 will get more powerful as devs master the hardware. Or pretending that games have much better tech due to optimizations when they always have fewer features. Or that the PC is a weaker platform because it's unoptimized (when the PC doesn't need to be optimized like a console to outperform it). You hear the same song and dance every day.
I don't think anybody is saying that, just that they expect PS5 to perform better as time goes on - hardly surprising when most of the big-name games still have to account for Jaguar CPUs.

Killzone ShadowFall looks great (I own it), but do you really think it stands up to some of the more recent AAA PS4 games (like Horizon) in terms of graphics and complexity of the game world?

And I love the flexibility and performance of PC gaming. But it is expensive. And it will be months before GPU and PSU prices normalise. People here are talking about the 2080 Ti - I'm seeing second-hand cards selling for £600+. The 3000 series looks excellent but isn't hitting the performance levels Nvidia led us to believe it would, and the launch has been a joke: it's near-impossible to get hold of one while Nvidia sells to miners, and the prices are ludicrous if you do.
 

Alexios

Banned
Going by recent GAF discussions of the consoles' magic sauce, it's so weird that his PC build didn't actually need to wait for next-gen SSDs, RAM and motherboards to bring it all together and match the PS5's SSD magic-sauce gains.

He didn't even use the latest GPU series, but tested 2+ year old models that were at the high end of the spectrum. I guess in a year or two (when most people still won't have a next-gen system) a mid/low-end 4050ti might be at this level.

I wonder now how old his CPU & other components are (based on when they launched, of course, not when he happened to buy them); they're not mentioned anywhere for some reason, which is weird for a hardware comparison.
 

Hezekiah

Banned
There is nothing impressive about a game running at 30fps.
Maybe not to you, but there is a lot to be impressed by in games like The Last of Us Part II, Ghost of Tsushima, God of War, etc. Hence the monumental praise they have received. To say otherwise is nonsense in my opinion. Ocarina of Time, GTA 3 on consoles, Bloodborne... not impressive.

And I already touched on the sheer lack of horsepower in current-gen CPUs.
 

VFXVeteran

Banned
I don't think anybody is saying that,

I beg to differ. They actually ARE saying that. Look at all the replies in this thread. They are absolutely implying that and wishing it.

just that they expect PS5 to perform better as time goes on - hardly surprising when most of the big-name games still have to account for Jaguar CPUs.

I still have a CPU that's older than 7 years (i7-6800K). These games are GPU bound. Yes, the Jaguar CPUs were bottlenecks, but the new CPUs will not be utilized anywhere near as heavily as the GPUs. These games are all GPU bound.

Killzone ShadowFall looks great (I own it), but do you really think it stands up to some of the more recent AAA PS4 games (like Horizon) in terms of graphics and complexity of the game world?

Yes it does. Here, I can list it for you:

Deferred lighting engine
Anisotropic filtering
Volumetric fog/smoke
Large texture sizes
Normal maps
SSR (with dynamic cube maps)
SSS on skin
Water caustics
Light shafts
Dynamic shadows
Physically-based shaders

ALL of those features are in both games. None of those features are improved in Horizon over KZ:SF. Horizon is a completely different type of game: it's open world, and they added a procedural foliage system and dynamic time of day. Those two games have different goals. One should NOT assume that, because one is open world and has these extra systems, the developers didn't know how to make those systems back when making KZ:SF and that 5 years later they suddenly figured them out and optimized the shit out of the PS hardware to get it to run better than it was capable of earlier on. THAT's a fallacy and shows a lack of understanding of both game development and how rendering works on hardware.
 

VFXVeteran

Banned
How so? Aside from giving the option to increase certain settings, devs don't often go out of their way to create a drastically more improved game (visually), even though the sky's the limit on PC.

I'll give you a simple example.

WD: Legion has a high-resolution texture pack. That means they allowed the artists to create textures that are extremely high res. That texture pack is NOT available on consoles but IS available on the PC. Therefore, a console is NOT the PC's ceiling, since they made those assets with the PC in mind.

I wish people would stop repeating this claim. PCs are the primary development machines for 3rd-party devs.
 

Hezekiah

Banned
I beg to differ. They actually ARE saying that. Look at all the replies in this thread. They are absolutely implying that and wishing it.



I still have a CPU that's older than 7 years (i7-6800K). These games are GPU bound. Yes, the Jaguar CPUs were bottlenecks, but the new CPUs will not be utilized anywhere near as heavily as the GPUs. These games are all GPU bound.



Yes it does. Here, I can list it for you:

Deferred lighting engine
Anisotropic filtering
Volumetric fog/smoke
Large texture sizes
Normal maps
SSR (with dynamic cube maps)
SSS on skin
Water caustics
Light shafts
Dynamic shadows
Physically-based shaders

ALL of those features are in both games. Horizon is a completely different type of game: it's open world, and they added a procedural foliage system and dynamic time of day. The two games have different goals. One should NOT assume that, because one is open world and has these extra systems, the developers didn't know how to make those systems back when making KZ:SF, and that 5 years later they know how to make them and they all come for free.
That's exactly the point. Anyone who knows anything knows that hardware doesn't just become more powerful; over time, developers are able to extract more from it. That's all that needs to be said. No need to talk down to people, some of whom are probably mis-articulating the points they're trying to get across anyway.

The fact that Horizon is open world is why I mentioned it. It's a massive game world and still looks like that. And regardless of what you say, people are always going to compare the graphics of different games - which is why you more commonly hear 'What's the best-looking game on X platform?' than 'What's the best-looking game of this type or genre?'.


P.S. I'm pretty sure the 6800k isn't more than seven years old.
 
I'll give you a simple example.

WD: Legion has a high-resolution texture pack. That means they allowed the artists to create textures that are extremely high res. That texture pack is NOT available on consoles but IS available on the PC. Therefore, a console is NOT the PC's ceiling, since they made those assets with the PC in mind.

I wish people would stop repeating this claim. PCs are the primary development machines for 3rd-party devs.
So, what you're saying is that the consoles don't hold back PC gaming?
 
Valhalla is one of the few titles where RDNA 2 performs better than average vs Nvidia, so naturally the PS5 also enjoys that relative performance boost. This is a best-case scenario when comparing the PS5 to PC GPUs.

In this game the PS5 performs at basically 1080 Ti level, a 4-year-old GPU.

Also, I'm guessing he started making this video before the PS5 got a "Quality mode" patch. This made the comparisons much more difficult as the PC doesn't have exactly the same kind of resolution scaling.

The Quality mode makes the comparison much easier and more straightforward. The PS5 should be pretty much highest settings, at a full 4K and a locked 30fps. A 1080 Ti can also do this very well.
 

VFXVeteran

Banned
So, what you're saying is that the consoles don't hold back PC gaming?

No. They don't. PC gaming is always going to have hardware-agnostic APIs and generally standard graphics engines (e.g. UE4). They will not be "catered" to any specific hardware platform like the consoles are, and they don't need to be.

All the R&D that goes into graphics features is done on the PC. Most devs have both PC boxes and consoles (yes, even 1st-party Sony companies).

So in answer to your question, no, the PC isn't held back by consoles from a graphics-features standpoint. Now, the SCOPE of a game is another matter.
 
No. They don't. PC gaming is always going to have hardware-agnostic APIs and generally standard graphics engines (e.g. UE4). They will not be "catered" to any specific hardware platform like the consoles are, and they don't need to be.

All the R&D that goes into graphics features is done on the PC. Most devs have both PC boxes and consoles (yes, even 1st-party Sony companies).

So in answer to your question, no, the PC isn't held back by consoles from a graphics-features standpoint. Now, the SCOPE of a game is another matter.

That is more or less what I meant. Sorry if that wasn't clear.
 

VFXVeteran

Banned
That's exactly the point. Anyone who knows anything knows that hardware doesn't just become more powerful; over time, developers are able to extract more from it. That's all that needs to be said. No need to talk down to people, some of whom are probably mis-articulating the points they're trying to get across anyway.

But they don't extract anything meaningfully more from it at all. You hear, "Look at RT in MM!! Can you imagine what the game will look like in Spiderman 2?" That quote is a fallacy! It's not going to get significantly better than what's already presented graphics-wise. That's the lack of knowledge right there, and fanboyism taking its toll.

The fact that Horizon is open world is why I mentioned it. It's a massive game world and still looks like that. And regardless of what you say, people are always going to compare the graphics of different games - which is why you more commonly hear 'What's the best-looking game on X platform?' than 'What's the best-looking game of this type or genre?'.

An open-world game doesn't make the case that the developers suddenly mastered the hardware. Why can't you admit to my point?

P.S. I'm pretty sure the 6800k isn't more than seven years old.

Maybe not, but I'm not upgrading anytime soon based on a videogame being bottlenecked by it.
 

Ev1L AuRoN

Member
I confess I didn't expect much from AMD after years of broken promises; I thought the next gen would be at the 2060~2070 level of performance. But fortunately I was wrong, and AMD delivered a great jump in performance with RDNA2. The PS5 looks like it sits between a 2080 and a 2080 Ti in traditional rasterization, which is much better than I anticipated.

After seeing the Road to PS5 presentation, the UE5 demo, and the console in action, I'm feeling confident about the future of gaming. I can't wait to see the games that will take full advantage of these new consoles.
 
I think claiming it's close to a 2080 Ti is too generous. Much closer to a 2080. Pretty much exactly 2080/1080 Ti level in this comparison. And this is an outlier example where RDNA 2 performs better than average against Nvidia.

I'd bet that in many other PS5 vs PC GPU comparisons, the PS5 will fare somewhat worse.
 
I think claiming it's close to a 2080 Ti is too generous. Much closer to a 2080. Pretty much exactly 2080/1080 Ti level in this comparison. And this is an outlier example where RDNA 2 performs better than average against Nvidia.

I'd bet that in many other PS5 vs PC GPU comparisons, the PS5 will fare somewhat worse.

Watch Dogs is a good example. 2060S level.
 

Rea

Member
[image: ac1.png]
Don't let VFXVeteran see this; he'll come in to downplay the PS5.
 

Optimus Lime

(L3) + (R3) | Spartan rage activated
Maybe not, but I'm not upgrading anytime soon based on a videogame being bottlenecked by it.

I dropped a 3080 into my 6800k-based system, and I'm still getting incredible performance. Well over 100fps in most games at native 4K.

This thread is really bizarre. The claims being made, essentially that PC gaming is now a financial dead end because an Assassin's Creed game on PS5 roughly matches the performance of last-gen PC hardware, are wishful thinking.

The PS5 is definitely punching above its weight at a great price point. There's no doubt about that. But a stake through the heart of the enthusiast PC market? Give me a break.
 

GHG

Gold Member
How so? Aside from giving the option to increase certain settings, devs don't often go out of their way to create a drastically more improved game (visually), even though the sky's the limit on PC.

Console versions usually dictate the baseline for the PC version of the game.

Look at the general level of the minimum specs for PC ports of console games across the last generation (PS4/Xbox One) and you will start to see a trend.
 
For $400 the PS5 is punching well above its weight, and I'm very impressed with the results so far.

I feel like I got away with a huge steal by paying only $400 for my PS5DE.

It makes me very confident about the future of console gaming, seeing what the PS5 can do for such a modest amount of money. I'm curious to see the wonders Sony's first-party devs will be able to pull off with the PS5 in a couple of years. This generation is going to be very exciting.
 

Rickyiez

Member
That's good to see out of a cross-generational launch game that probably isn't taking advantage of what PS5 and Series X are capable of. Really looking forward to the technical achievements as the generation goes on.

Yes, this is very important. A game fully developed from scratch and optimized exclusively (by a competent developer) for these next-gen consoles will look out of this world. I can't wait for the next Naughty Dog or UE5 game.
 

Rikkori

Member
It's poetic justice that the engine made hand-in-hand with Nvidia for their older cards (remember, this was done in the cross-gen era, when they put Black Flag out on PS3) now comes back to bite them, because Ubisoft went the old AMD route of high core counts/heavy compute while AMD went classic Nvidia. And of course this time the new engine will be made with AMD from the beginning (it's gonna go the same way it went with GTA5 vs RDR2). Expect to see a lot of Ampere owners cry their hearts out when that hits in 2 years, and of course that magic 2-year mark is also when Nvidia stops giving a fuck about you because they have a new arch out, meanwhile RDNA2 is gonna see games/engines built around it for the next 7+ years.

 

Armorian

Banned
It's poetic justice that the engine made hand-in-hand with Nvidia for their older cards (remember, this was done in the cross-gen era, when they put Black Flag out on PS3) now comes back to bite them, because Ubisoft went the old AMD route of high core counts/heavy compute while AMD went classic Nvidia. And of course this time the new engine will be made with AMD from the beginning (it's gonna go the same way it went with GTA5 vs RDR2). Expect to see a lot of Ampere owners cry their hearts out when that hits in 2 years, and of course that magic 2-year mark is also when Nvidia stops giving a fuck about you because they have a new arch out, meanwhile RDNA2 is gonna see games/engines built around it for the next 7+ years.


If 2 years from now AMD has good RT performance and a working DLSS alternative, I will jump to RDNA3. I have no brand loyalty; I just buy whichever product is best in my opinion, and right now that's (still) Nvidia.
 

Rikkori

Member
If 2 years from now AMD has good RT performance and a working DLSS alternative, I will jump to RDNA3. I have no brand loyalty; I just buy whichever product is best in my opinion, and right now that's (still) Nvidia.
I don't know that RDNA 3 will fix RT for AMD; they need to dedicate a lot more specific hardware to it and I'm not sure they will. Maybe RDNA 4. As for DLSS, it's just down to the devs to do something as good as The Division 2's temporal reconstruction, which is better than DLSS. The "AI magic" doesn't exist, because when they tried it with DLSS 1.0 it crashed and burned. All DLSS 2.0 is now is good temporal reconstruction, a very nicely tweaked TAA if you will. Considering you can have TAA & VRS together but not DLSS & VRS, it's IMO dead tech, only here to serve Nvidia's genius marketing. I'm sure that's what AMD is working on with devs, but I don't think we'll see it any sooner than 2022.

It is what it is.
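For what it's worth, the accumulation at the heart of any temporal reconstruction (TAA, checkerboarding, DLSS 2.0 alike) is easy to sketch: blend each new jittered sample into a history buffer. A toy 1D version follows, with made-up numbers; real pipelines add motion-vector reprojection and history clamping on top:

```python
import numpy as np

# Toy temporal accumulation: an exponential blend of noisy per-frame samples
# converges on the underlying signal, which is how jittered low-res frames
# build up extra detail over time. Parameters are illustrative only.

ALPHA = 0.1  # weight of the newest frame; history keeps the other 90%

def accumulate(history: np.ndarray, new_frame: np.ndarray) -> np.ndarray:
    return (1.0 - ALPHA) * history + ALPHA * new_frame

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, np.pi, 64))   # stand-in for the true image
history = np.zeros_like(truth)
for _ in range(60):                            # one second at 60 FPS
    noisy_sample = truth + rng.normal(0.0, 0.2, truth.shape)
    history = accumulate(history, noisy_sample)

print("mean error after 60 frames:", float(np.abs(history - truth).mean()))
```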
 

Shmunter

Member
The level of performance of hardware doesn't "sneak" up over time. Not sure why every single PS owner believes this. The hardware is FIXED! It doesn't perform at 2080 level on Day 1 and then at 2080 Ti level 7 years from now with the same hardware.

The console WAS maxed out for this game. For most released games, the developer maxes out performance and squeezes every bit of power out of the GPU. You don't see a game using only 60% of the GPU because a developer doesn't know how to tap into its power, only for the same game to use 100% of the GPU on the same hardware 2-3 years later.

Where are all these crazy assumptions coming from?

P.S. I'm very surprised to see the PS5 running with mostly MAX settings from the PC. That is indeed impressive to me! It's definitely a good piece of hardware that Sony has made!
So the XSX is maxed out day one too? Or is that a special case?
 

Bo_Hazem

Banned
I don't know that RDNA 3 will fix RT for AMD; they need to dedicate a lot more specific hardware to it and I'm not sure they will. Maybe RDNA 4. As for DLSS, it's just down to the devs to do something as good as The Division 2's temporal reconstruction, which is better than DLSS. The "AI magic" doesn't exist, because when they tried it with DLSS 1.0 it crashed and burned. All DLSS 2.0 is now is good temporal reconstruction, a very nicely tweaked TAA if you will. Considering you can have TAA & VRS together but not DLSS & VRS, it's IMO dead tech, only here to serve Nvidia's genius marketing. I'm sure that's what AMD is working on with devs, but I don't think we'll see it any sooner than 2022.

It is what it is.

With proper hardware you can have reconstruction like Demon's Souls' performance mode, which is reconstructed from 1440p and looks insanely sharp and clean, better than the vast majority of native 4K games. Even the aging checkerboarding in GOW, when played on PS5 unpatched, looks as clean as those photo-mode shots everybody said weren't representative of the actual game's graphics! Seems they were, but the PS4 Pro's hardware wasn't good enough.

So even checkerboarding is still relevant with the proper power behind it, and I expect AMD's solution will benefit a lot from multiplats favoring it over Nvidia, since it can be implemented across PC/XSX|S/PS5. Another benefit of dominating the console field is guaranteed optimization for AMD cards in the future. Nvidia will still be very solid and strong, but AMD will no longer be a side dish for devs the way it was in the PC space.
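The raw pixel counts help explain why those reconstruction modes hold up; a quick back-of-the-envelope comparison (rough numbers only, using the 1440p input figure mentioned above):

```python
# Sample-count comparison for the reconstruction modes mentioned above.
native_4k = 3840 * 2160         # 8,294,400 pixels
from_1440p = 2560 * 1440        # 3,686,400 pixels (the stated perf-mode input)
checkerboard = native_4k // 2   # CBR shades half the 4K samples each frame

print(f"1440p input is {from_1440p / native_4k:.0%} of native 4K")
print(f"checkerboard shades {checkerboard / native_4k:.0%} of 4K per frame")
```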
 
Yes, this is very important. A game fully developed from scratch and optimized exclusively (by a competent developer) for these next-gen consoles will look out of this world. I can't wait for the next Naughty Dog or UE5 game.
Yeah, ND is gonna make something gorgeous. I'm most excited to see the next GTA; Rockstar has that technical wizardry.
 

Shmunter

Member
It's maxed out relative to the current version of the AnvilNext 2 engine.
A new engine delivering 'more' on the same hardware means results weren't maxed on the previous engine, though?!

What does 'maxed out' hardware even mean in a technological environment where results are dictated by the marriage of hardware and software? Techniques evolve and better use of a platform is achieved over time, so 'maxed out' seems like a disingenuous proposal. Either that or a gross misunderstanding.
 
I'm one of the 7 people in the world with a PS5 DE, so I paid less than the RRP of even a 3060 Ti. I think the performance is amazing for the price, especially once you factor in the opportunities the new CPU and SSD will bring.

It's definitely going to vary by game though, as I would think even a 2060S might end up massacring both consoles in Cyberpunk (next-gen version) once you factor in RT and DLSS. How does a 2060S perform with DLSS in Watch Dogs at the PS5's settings?
 

Armorian

Banned
Looks like the PC is done again. All we can hope for is a mid-gen PC Pro refresh

Just wait for PC2



I don't know that RDNA 3 will fix RT for AMD; they need to dedicate a lot more specific hardware to it and I'm not sure they will. Maybe RDNA 4. As for DLSS, it's just down to the devs to do something as good as The Division 2's temporal reconstruction, which is better than DLSS. The "AI magic" doesn't exist, because when they tried it with DLSS 1.0 it crashed and burned. All DLSS 2.0 is now is good temporal reconstruction, a very nicely tweaked TAA if you will. Considering you can have TAA & VRS together but not DLSS & VRS, it's IMO dead tech, only here to serve Nvidia's genius marketing. I'm sure that's what AMD is working on with devs, but I don't think we'll see it any sooner than 2022.

It is what it is.

I don't see the benefits of VRS so far; other than 3DMark, games are not showing results. Reminds me of async compute: it was promoted so much, but in the end it gives just a few %...

Right now DLSS is proving itself; it fixes TAA problems like ghosting and alpha dithering in Control and DS. You can even use it with downsampling:

monitor res: 2560x1080; in-game setting (DSR): 5120x2160; internal rendering resolution before DLSS kicks in: 2560x1080
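Spelled out, that DSR + DLSS chain looks like this (same numbers as the line above):

```python
# The DSR + DLSS resolution chain described above.
monitor = (2560, 1080)       # native display
dsr_target = (5120, 2160)    # DSR super-resolution target (4x the pixels)
dlss_internal = (2560, 1080) # DLSS Performance renders at half res per axis

pixels = lambda r: r[0] * r[1]
print("DSR target vs monitor:  ", pixels(dsr_target) / pixels(monitor), "x")
print("DLSS internal vs target:", pixels(dlss_internal) / pixels(dsr_target))
# Net effect: the GPU shades roughly the monitor's native pixel count, DLSS
# reconstructs up to the DSR target, and the driver downsamples back to the
# display, which is where the extra anti-aliasing quality comes from.
```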

 
The PS5 is indeed amazing and actually didn't launch as an outdated piece of shit like the PS4 did, but Valhalla on PC is literally the best-case scenario for the PS5 and the worst case for Nvidia cards. The $650 AMD card trounced Nvidia's 3090 in Valhalla.
 

Resenge

Member
I don't work in film anymore; I've been working in realtime graphics for nearly 4 years, bud. Your comment is very stupid and smells of trying to discredit me. It's equivalent to assuming that someone who works at Apple or FB in their VR or realtime labs isn't knowledgeable enough to speak with any credence about what goes on in the realtime 3D pipeline. Whether the application is a videogame, a simulation or a learning tutorial, it still has the same challenges and requires the same exact knowledge.

You misunderstood me; it's not what you work in that I am pointing at. I highlighted the word "professional" because you ain't one. Just another random fanboy who happens to do some VFX and loves to lean on that and quote "friends" to add weight to your fanboy arguments.


Your comment is very stupid and smells of trying to discredit me.

You do this all on your own when you say things like..

Believe them if you must. Those words are hyperbole if you ask me.

Those are real Insomniac devs working on real games with real-world experience, yet you expect us to believe you? An anti-Sony, PC-master-race-biased fanboy, over a real Insomniac dev?


 

Rikkori

Member
I don't see the benefits of VRS so far; other than 3DMark, games are not showing results. Reminds me of async compute: it was promoted so much, but in the end it gives just a few %...

Right now DLSS is proving itself; it fixes TAA problems like ghosting and alpha dithering in Control and DS. You can even use it with downsampling:

monitor res: 2560x1080; in-game setting (DSR): 5120x2160; internal rendering resolution before DLSS kicks in: 2560x1080
VRS is getting a lot of work done on it; right now it's all in the behind-the-scenes, knuckle-down-and-work phase. You'll start seeing results a year from now. Remember, the tiered VRS spec wasn't even finalised until recently.
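For anyone wondering what tiered VRS actually buys: tier 2 lets the game hand the GPU a small per-tile rate map, so flat regions get shaded at 2x2 instead of per pixel. Below is a toy sketch of building such a map from local contrast; purely illustrative, with made-up tile size and threshold, not any engine's or API's real code:

```python
import numpy as np

# Toy image-based VRS rate map: mark low-contrast tiles for coarse (2x2)
# shading and estimate the pixel-shading savings. Tile size and threshold
# are illustrative assumptions.

TILE = 16
CONTRAST_THRESHOLD = 0.05

def shading_rate_map(luma: np.ndarray) -> np.ndarray:
    h, w = luma.shape
    tiles = luma[:h - h % TILE, :w - w % TILE].reshape(
        h // TILE, TILE, w // TILE, TILE)
    contrast = tiles.std(axis=(1, 3))          # per-tile detail estimate
    return np.where(contrast > CONTRAST_THRESHOLD, 1, 2)  # 1 = 1x1, 2 = 2x2

rng = np.random.default_rng(1)
# Fake luma: noise that fades to flat toward the left side of the frame.
frame = rng.random((1080, 1920)) * np.linspace(0.0, 1.0, 1920)
rates = shading_rate_map(frame)
coarse = float((rates == 2).mean())
savings = coarse * (1.0 - 1.0 / 4.0)           # 2x2 tiles shade 1/4 the pixels
print(f"{coarse:.0%} of tiles coarse -> ~{savings:.0%} less pixel-shading work")
```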
 

J_Gamer.exe

Member
So again, whilst an interesting test, we have to remember the PS5 was much better in the cutscene he's using to benchmark before the patch.

Also, is he not using a slightly stronger CPU?

The PS5 has higher grass density, but maybe that's balanced out by the fire and cloth physics.

To say a Ubisoft game this early, designed around last gen, is fully using PS5 features is such a ridiculous claim it's barely worth countering. But OK... there is no way in hell this is fully utilising the geometry engine, and the code is not written for the next-gen I/O pipeline; the load times alone show us this.

Nor will it be specifically coded for the caches and other features the PS5 has. Same with Series X; it's not written specifically for that either.

So if anything, I reckon if you ran this test on a mid-gen game, the consoles would leave these cards behind even more once devs code specifically for next-gen systems.

I thought this was pretty obvious....
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Going by recent GAF discussions of the consoles' magic sauce, it's so weird that his PC build didn't actually need to wait for next-gen SSDs, RAM and motherboards to bring it all together and match the PS5's SSD magic-sauce gains.

He didn't even use the latest GPU series, but tested 2+ year old models that were at the high end of the spectrum. I guess in a year or two (when most people still won't have a next-gen system) a mid/low-end 4050ti might be at this level.

I wonder now how old his CPU & other components are (based on when they launched, of course, not when he happened to buy them); they're not mentioned anywhere for some reason, which is weird for a hardware comparison.

Mate, if the mid/low-end 3060 Ti is already matching the PS5, a theoretical RTX 4040 that costs 100 bucks will match it.

We knew what was in these consoles and we had a good idea of how pumped Ampere and RDNA2 were gonna be.

Even last gen started the same way, with consoles pulling high/ultra settings. In a year, that will be a pipe dream anywhere near native 4K.

Hopper's cheapest, basically garbage-tier card will be at PS5 level and still be cheaper than a new console.
 