
The Last of Us Part I PC Features Trailer – Ultra-Wide Support, Left Behind, and More

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah, I do a few other bits and pieces where I’ve seen usage go up and I typically have other apps open in the background when playing games. I can imagine that more games are going to start asking for 32GB RAM in the not too distant future too. No harm in upgrading while it’s cheapish!

Also, we’ve not played the game yet, that requirement could actually be accurate. We’ll need to wait and see.
Ahh.
If you are multitasking while gaming, then yeah, 48GB for 180 dollars makes sense.
But realistically, even high-end games use about 12GB of RAM. I don't see this game suddenly being the one to break the 20GB threshold that would require 32GB to run.
Hell, I'd be shocked if this game even uses 10GB of RAM.
 

Hoddi

Member
It's way more than just softer image, look at the hair in various places. There's way worse examples still.
The whole concept of games running at X resolution kinda doesn't exist anymore. Modern games have countless different buffers and render targets running at resolutions that are different from the output resolution. Depth-of-field effects often run at quarter resolution, for example, so they'll only run at 1080p when your output is 4k. There's no real way to quantify something like that as running at X resolution.

It's the same reason that 1080p games often look blurry as hell nowadays. Older TAA games like Doom 2016 always looked perfectly fine at 1080p while newer games often look godawful at the same resolution.
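The buffer-scaling point above can be sketched numerically. The fractions below are illustrative, not measurements from any real engine; "quarter resolution" means a quarter of the pixel count, i.e. half the width and half the height:

```python
# Sketch: a game's internal render targets run at different fractions
# of the output resolution. Scale values here are hypothetical.
output_w, output_h = 3840, 2160  # 4K output

# pixel-count scale per buffer (illustrative only)
buffer_scales = {
    "main color": 1.0,       # full output resolution
    "depth of field": 0.25,  # quarter resolution = half width x half height
    "volumetrics": 1 / 16,   # some effects go even lower
}

for name, scale in buffer_scales.items():
    f = scale ** 0.5  # per-axis factor from the pixel-count scale
    print(f"{name}: {int(output_w * f)}x{int(output_h * f)}")
```

At 4K output, the quarter-resolution depth-of-field buffer works out to 1920x1080, which is the "DoF only runs at 1080p" case described above.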
 

BigBeauford

Member
I wonder what the general consensus is among PS5 players.

Get Last of Us or Returnal?
Honestly, as a roguelike junkie, I think Returnal is fucking amazing. If the TLOU package had the Factions multiplayer, TLOU would be the no-brainer pick, as Factions is one of the most underrated multiplayer experiences in existence. Without Factions, I prefer Returnal.
 
I got a 4080 but no 4K monitor so I should be good for 1440p ~100fps hopefully.

does it have RTX/DLSS3?

I love that 32GB is becoming more common on PC now. I'm going to be upgrading to 64GB RAM soon. It's getting too close for comfort seeing games run at 20-25GB lol.
 

Senua

Member
I wonder what the general consensus is among PS5 players.

Get Last of Us or Returnal?
Kenan Thompson Snl GIF by Saturday Night Live
 

hinch7

Member
it's not, it often runs into extreme performance bottlenecks at 1440p and above

the box literally says 1080p. it's a gpu targeted at 1080p. it can match the ps5 like for like at 1080p, but once you go beyond that, it starts having massive performance slowdowns.

here's how it looks:


at 4k, the 5700xt is 12% faster
at 1440p, they're matched
at 1080p, the 6600xt is 8% faster

that's a whopping 20% relative performance drop going from 1080p to 4k.

cyberpunk is even more brutal


at 4k, the 5700xt is 23% faster
at 1440p, they're matched
at 1080p, the 6600xt is 10% faster

a whopping 23% relative drop going from 1440p to 4k, and another 10% from 1080p to 1440p

6600xt is a gimped product that only works to its potential at 1080p.
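The scaling arithmetic above can be checked with made-up frame rates. The fps values below are hypothetical, picked only so the ratios mirror the quoted ~12% / ~8% swing; the point is how "X% faster" flips sign between resolutions:

```python
# Sketch: how the relative lead between two cards flips with resolution.
# fps numbers are made up; only the ratios between the two cards matter.
def pct_faster(a, b):
    """How much faster a is than b, in percent."""
    return (a / b - 1) * 100

fps = {
    "4k":    {"5700xt": 28, "6600xt": 25},  # 5700xt ~12% ahead (28/25)
    "1440p": {"5700xt": 50, "6600xt": 50},  # matched
    "1080p": {"5700xt": 74, "6600xt": 80},  # 6600xt ~8% ahead (80/74)
}

for res, r in fps.items():
    delta = pct_faster(r["6600xt"], r["5700xt"])
    print(f"{res}: 6600xt is {delta:+.0f}% vs 5700xt")
```

Summing the two ends of that swing gives the roughly 20-point drop quoted above.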

problems are even more pronounced with ray tracing;



at native 1080p with rt set to high (ps5 ray tracing settings), it gets around 50-70 FPS.

the ps5 gets these framerates at native 4k in its fidelity mode;



as I said, the 6600xt has pathetic bandwidth at 256 GB/s, and infinity cache only works well at lower resolutions where cache hit rates stay high. in 4k/1440p/ray tracing situations, it falls quite a bit below the ps5.
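A back-of-the-envelope way to see the cache point above: effective bandwidth is a blend of cache and VRAM bandwidth, weighted by hit rate. The hit rates and the cache bandwidth figure below are assumptions for illustration, not AMD numbers; only the 256 GB/s VRAM figure comes from the post:

```python
# Sketch: effective bandwidth as a hit-rate-weighted blend.
# Hit rates fall as resolution rises because the working set
# outgrows the on-die cache. All hit rates here are assumed.
VRAM_BW = 256     # GB/s, the 6600 XT figure quoted above
CACHE_BW = 1000   # GB/s, assumed Infinity Cache bandwidth

def effective_bw(hit_rate):
    return hit_rate * CACHE_BW + (1 - hit_rate) * VRAM_BW

for res, hit in [("1080p", 0.55), ("1440p", 0.40), ("4k", 0.25)]:
    print(f"{res}: ~{effective_bw(hit):.0f} GB/s effective")
```

Under these assumed numbers the effective bandwidth shrinks substantially from 1080p to 4K, which is the shape of the argument being made, whatever the true hit rates are.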

there really is no GPU that's a proper match for the ps5. the 6700xt overshoots, and the 6600xt is a situationally gimped card. the best approach is to compare the 6700xt to the ps5 and see how much of a lead it has; otherwise comparisons will be moot.

as I said, just because the 6600xt matches a ps5 in terms of TFLOPS does not mean it will match it in practice. there are other factors, and bandwidth is the most crucial one.

The PS5's GPU is more akin to the 6700 (not XT). Same shader count, CUs (36), ROPs and TMUs, and even similar clock speeds, but with a narrower 192-bit bus and lower bandwidth, 10GB VRAM, and 80MB of L3 cache (Infinity Cache).

Though you can't really compare 1:1 even with the same GPU layout. Each platform uses different APIs and performs differently, and a console, being a closed system, is likely to be more optimised and perform better like for like, especially side by side. A Windows PC running DX will have a lot more overhead from different configurations and drivers. Consoles also have other features like cache scrubbers and native decompression hardware, versus the higher cache on RDNA 2 dGPUs.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I got a 4080 but no 4K monitor so I should be good for 1440p ~100fps hopefully.

does it have RTX/DLSS3?

I love that 32GB is becoming more common on PC now. I'm going to be upgrading to 64GB RAM soon. It's getting too close for comfort seeing games run at 20-25GB lol.
What game runs at 20 - 25GB?



Please don't say endgame Anno or custom Skylines.
 

Kataploom

Gold Member
The PS5's GPU is more akin to the 6700 (not XT). Same shader count, CUs (36), ROPs and TMUs, and even similar clock speeds, but with a narrower 192-bit bus and lower bandwidth, 10GB VRAM, and 80MB of L3 cache (Infinity Cache).

Though you can't really compare 1:1 even with the same GPU layout. Each platform uses different APIs and performs differently, and a console, being a closed system, is likely to be more optimised and perform better like for like, especially side by side. A Windows PC running DX will have a lot more overhead from different configurations and drivers. Consoles also have other features like cache scrubbers and native decompression hardware, versus the higher cache on RDNA 2 dGPUs.
Yet it seems to perform at around 6600 to 6600 XT (best case) level in most cases... I think that's still pretty good performance, I just wouldn't use it for 4K or RT... BTW I'm mostly talking about performance mode, since that's the one I'm interested in.

OT: This game is the trigger for me to get 32 GB of RAM. I know it won't necessarily use it, but I'm on 16 GB of DDR4 2400 MHz. I've never had a single problem since I'm almost always GPU bound, but I wanted to upgrade to 3200 MHz just to not leave performance on the table, and this seems like the perfect excuse. I held off a little since DDR5 is already out, but it's too expensive to migrate to DDR5 if I'm getting almost the same performance in GPU-bound scenarios anyway.
 

Alex11

Member
I think it's time to get a PS5 and be done with all these different requirements for every game that releases. It's becoming a chore.
It is a very pretty game, no doubt, but a 4080 and 32 GB of RAM seems a bit much for a game that is linear, has no time of day, and uses pre-calculated lighting. And yeah, I know you can't compare PS5 specs 1:1 to a PC, but still.
 

Kataploom

Gold Member
These two games are some of the ones I've tested myself on my 6700 XT just this past month:

A Plague Tale: Requiem: Around 50 to 65 fps at native 1440p... On PS5 it doesn't even render at native 1440p for its 30-40 fps modes.

Forspoken: Around 60 to 80 fps at native, non-dynamic 1440p... On PS5 it needs FSR to reach dynamic 1440p, and even then the framerate falls to around 45 fps quite frequently.

Also, DF put Returnal on PS5 against a 2060 Ti and it performs similarly even at 1440p, which on PS5 requires checkerboarding and TAA from 1080p. The 2060 Ti was using DLSS Quality, which is still higher than 1080p internally.

Even if you account for CPU bottlenecks, PC doesn't have them, so it can show the full power of the GPU in scenarios the PS5 can't.

I'll be playing TLOU Part I a little after it releases (gotta wait for performance reviews/patches first and get the 32 GB of RAM) so I'll be comparing.
 

Mr Moose

Member
These two games are some of the ones I've tested myself on my 6700 XT just this past month:

A Plague Tale: Requiem: Around 50 to 65 fps at native 1440p... On PS5 it doesn't even render at native 1440p for its 30-40 fps modes.

Forspoken: Around 60 to 80 fps at native, non-dynamic 1440p... On PS5 it needs FSR to reach dynamic 1440p, and even then the framerate falls to around 45 fps quite frequently.

Also, DF put Returnal on PS5 against a 2060 Ti and it performs similarly even at 1440p, which on PS5 requires checkerboarding and TAA from 1080p. The 2060 Ti was using DLSS Quality, which is still higher than 1080p internally.

Even if you account for CPU bottlenecks, PC doesn't have them, so it can show the full power of the GPU in scenarios the PS5 can't.

I'll be playing TLOU Part I a little after it releases (gotta wait for performance reviews/patches first and get the 32 GB of RAM) so I'll be comparing.
Image quality stats are a close match to the first game. Both PS5 and Series X each stay in place at a native 1440p resolution, reconstructing up to 4K using a temporal solution.
 

hinch7

Member
Yet it seems to perform at around 6600 to 6600 XT (best case) level in most cases... I think that's still pretty good performance, I just wouldn't use it for 4K or RT... BTW I'm mostly talking about performance mode, since that's the one I'm interested in.

OT: This game is the trigger for me to get 32 GB of RAM. I know it won't necessarily use it, but I'm on 16 GB of DDR4 2400 MHz. I've never had a single problem since I'm almost always GPU bound, but I wanted to upgrade to 3200 MHz just to not leave performance on the table, and this seems like the perfect excuse. I held off a little since DDR5 is already out, but it's too expensive to migrate to DDR5 if I'm getting almost the same performance in GPU-bound scenarios anyway.
CPU bottleneck. The Zen 2 CPUs in consoles are fairly slow compared to current CPUs in terms of IPC, and they're clocked at 3.5GHz with a small cache. Hence why, when pushed to higher framerates, they dip a lot compared to more performant hardware, and fare better at 30fps. As I said, you can't compare 1:1.
 

Gaiff

SBI’s Resident Gaslighter
I think it's time to get a PS5 and be done with all these different requirements for every game that releases. It's becoming a chore.
It is a very pretty game, no doubt, but a 4080 and 32 GB of RAM seems a bit much for a game that is linear, has no time of day, and uses pre-calculated lighting. And yeah, I know you can't compare PS5 specs 1:1 to a PC, but still.
Where did you see that a 4080 is required to play this game? You can play it just fine with a GPU 1/10th of the price.
 

Alex11

Member
Where did you see that a 4080 is required to play this game? You can play it just fine with a GPU 1/10th of the price.
I didn't write required; of course I was referring to 4K60 max settings. I have a 1050 Ti now, but low settings at 720p, with all due respect, thanks but no thanks. And where I am, a GPU for 1080p high settings is in the same price range as a PS5, which runs at higher settings and resolution.

Don't take it the wrong way, I'm not looking to start something. I'm not a fan of any console or anything; it's just the conclusion I've reached if I want to keep gaming, and it's just for my specific situation.
 

Kataploom

Gold Member
CPU bottleneck. The Zen 2 CPUs in consoles are fairly slow compared to current CPUs in terms of IPC, and they're clocked at 3.5GHz with a small cache. Hence why, when pushed to higher framerates, they dip a lot compared to more performant hardware, and fare better at 30fps. As I said, you can't compare 1:1.
Not everything is a CPU bottleneck on those consoles. The GPU can punch above its weight at 4K (30 fps) against one with similar specs on PC due to bandwidth, but at 1440p it's not even at 6700 (non-XT) level. I don't care though, 1440p is good enough for me, just to clarify... I got hyped by my new 6700 XT and have been running tests against DF and other sources just for the lols, and it's like 20% above the consoles in most cases.
 

Agent_4Seven

Tears of Nintendo
The whole concept of games running at X resolution kinda doesn't exist anymore. Modern games have countless different buffers and render targets running at resolutions that are different from the output resolution. Depth-of-field effects often run at quarter resolution, for example, so they'll only run at 1080p when your output is 4k. There's no real way to quantify something like that as running at X resolution.
That's the problem. It's like with CRT, which is still perfect for modern games, but because no one decided to improve that screen tech while significantly shrinking the tubes and monitors, we have to deal with BS tech like IPS, VA etc., which have a ton of problems everyone is looking for solutions to, and devs had to implement motion blur (which a lot of people hate), whereas CRT doesn't even need it. Same goes for image reconstruction tech, the sole purpose of which is to help with performance, but at the cost of ruining the image, which in my eyes is not a worthy trade-off. All of the above are solutions in search of more problems that someone has to deal with at some point, creating new problems for others to fix, and so on; the cycle never ends. The only tech which is (arguably) better than native res is DLSS, but even the latest version has problems, and consoles can't use it anyway.

I mean, okay, not a lot of people actually use photo modes in games to take screenshots, but for those of us who do, any kind of image reconstruction tech defeats the whole purpose of photo modes and makes them pointless. Insomniac are the only devs actually allowing their games to run at much higher res, so image reconstruction (if it's active in the game in some form) is not a problem there. Native res is always better than any image reconstruction tech, no matter if it's only used for certain visual aspects of the game like hair rendering, fog, lighting, shadows, foliage etc.

The whole problem with the gaming industry when it comes to tech and hardware is that it's running ahead of the train without actually having the hardware necessary to do it (PC hardware included), so it's cheating and giving the impression of the opposite. No one is using the full potential of UE3 or UE4 (which aren't perfect, but Arkham Knight alone shows what you can do with UE3 to kill any other game on that engine visually and tech-wise) or any other engine, because fuck it, let's use UE5 and invent some BS tech to make games run on underpowered hardware, and make overpriced-af PC hardware spend years catching up to run games at native res.
It's the same reason that 1080p games often look blurry as hell nowadays. Older TAA games like Doom 2016 always looked perfectly fine at 1080p while newer games often look godawful at the same resolution.
It depends on the amount, complexity and quality of assets on screen and the overall artistic vision. Take the latest RE4 demo for example, which looks blurry af at 1080p with TAA; same goes for any other visually complex game. Also, why the fuck is no one using SSAA in modern games? It looks amazing in Metro: Last Light at 1080p while downsampling from 4K, without TAA or other modern BS AA solutions. Modern high-end GPUs have zero problems running SSAA.
 

winjer

Gold Member

Now I’ll be honest here. I’m really worried about the quality of the PC version of The Last of Us Part I Remake. A lot of people believed that Naughty Dog would be developing it. Instead, and similar to the UNCHARTED: Legacy of Thieves Collection, Naughty Dog has outsourced it to Iron Galaxy.

Iron Galaxy has dropped the ball numerous times when it comes to its PC ports. I mean, who can forget Batman: Arkham Knight, right? Moreover, UNCHARTED: Legacy of Thieves Collection had major mouse control issues when it came out. Hilariously, a post-launch patch also introduced some awful camera stutters.

We’ll definitely test and benchmark The Last of Us Part I Remake when it comes out on PC on March 28th. So stay tuned for more!

Season 5 No GIF by The Office
 

Thebonehead

Banned



Season 5 No GIF by The Office
Just saw this on DSOgaming myself :(

Have already pre-ordered via CDKeys for £35, so I may have to wait a bit for a patch before doing yet another run through of it.

On the plus side, I noticed that the latest Returnal patch now fully supports 3840x1600 resolution, so I'll be able to fire it up again and play through.
 

winjer

Gold Member
Just saw this on DSOgaming myself :(

Have already pre-ordered via CDKeys for £35, so I may have to wait a bit for a patch before doing yet another run through of it.

On the plus side, I noticed that the latest Returnal patch now fully supports 3840x1600 resolution, so I'll be able to fire it up again and play through.

You might be able to cancel the pre-order and get your money back.
 

M1987

Member
Hmmm how did you come to this conclusion? It's a new gen after all; maybe pc players need to optimize their hardware accordingly 😂
I know nothing about game development or PC ports, but if a PS5 is hitting 4K/30 and 1440p/60, you shouldn't need a 4080 for 4K/60
 

Gaiff

SBI’s Resident Gaslighter
Hmmm how did you come to this conclusion? It's a new gen after all; maybe pc players need to optimize their hardware accordingly 😂
Dude, you don't even own a gaming PC so why are you here and in the Starfield thread? You're just here as a fucking troll parading your console war nonsense. Your kind doesn't last around here. GTFO if all you're here for is to tout the superiority of your favorite plastic box. No one cares.
 

yamaci17

Member
I know nothing about game development or PC ports, but if a PS5 is hitting 4K/30 and 1440p/60, you shouldn't need a 4080 for 4K/60
you actually do. the ps5 performs close to a 6700xt at 4k due to having higher bandwidth

and if you look at the specs, the 4080 is practically 2.2x faster than a 6700xt at 4k when you leave dlss2, dlss3 and such out of the equation

the 4080 is really not a mythical power GPU when it comes to rasterization. sure, 2x over a 6700xt is still a huge thing, but it's not that huge from a certain perspective

if anything, if the 4080 ends up pushing 4k/70 fps at ps5-equivalent settings, it practically means the game scales 1:1 between pc gpus and the ps5's gpu

problems start when you think of the 4080 as this mythical, super overpowered GPU when it has barely 2x the rasterization performance of a ps5/6700xt/similar spec.

you have to ask why it doesn't run 4k/60 on ps5 to begin with. ragnarok + hfw run at nearly 1800p/60 fps on ps5, while this game pushes 4k/30. if you have to criticise something, criticise the performance target on ps5 before the pc version. the pc version will only perform as well as it does on ps5
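The 1:1 scaling argument above works out arithmetically if you take the quoted figures at face value. Both numbers below are the poster's claims, not benchmarks:

```python
# Sketch of the scaling argument: if the port scales 1:1 with raster
# performance, a GPU ~2.2x a 6700 XT should land near 2.2x the PS5's
# 4K fidelity-mode framerate. Figures are the quoted claims, not data.
ps5_4k_fps = 30             # PS5 fidelity mode target
ratio_4080_vs_6700xt = 2.2  # quoted 4K raster ratio

expected_4080_fps = ps5_4k_fps * ratio_4080_vs_6700xt
print(f"expected 4080 fps at PS5-equivalent 4K settings: ~{expected_4080_fps:.0f}")
```

That lands at roughly 66 fps, which is why a 4080 hitting around 4k/70 would indicate the port scales about 1:1 rather than being poorly optimised.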
 

ChiefDada

Gold Member
I know nothing about game development or PC ports, but if a PS5 is hitting 4K/30 and 1440p/60, you shouldn't need a 4080 for 4K/60

My point is that neither you, nor I, nor anyone else knows what the bottleneck is. Is it memory management/asset decompression difficulties (the game is 80GB with PS5 compression), API overhead, some combination, etc.? Also, it's not just 4K60, but 4K60 at ultra settings. We don't know how far the PC presets go above and beyond the PS5's. It could very well warrant the spec requirements.

Dude, you don't even own a gaming PC so why are you here and in the Starfield thread? You're just here as a fucking troll parading your console war nonsense. Your kind doesn't last around here. GTFO if all you're here for is to tout the superiority of your favorite plastic box. No one cares.

Lol, but I've been here longer than you. Where is this so-called warring you see me committing? If someone claims ABC GPU should perform better than XYZ platform, I have the right to inquire about their reasoning, and surely they should be able to provide it. Maybe you should take a nap and join us when you've settled down from your temper tantrum.
 

ChiefDada

Gold Member
you have to ask why it doesn't run 4k/60 on ps5 to begin with. ragnarok + hfw run at nearly 1800p/60 fps on ps5, while this game pushes 4k/30. if you have to criticise something, criticise the performance target on ps5 before the pc version. the pc version will only perform as well as it does on ps5

Both HFW and Ragnarok use checkerboarding and TAAU, respectively, and they are very well implemented. Sadly, the Naughty Dog engine has no form of reconstruction or DRS, which this remake desperately needs, so it's native 4K or native 1440p at all times. Also, perhaps you disagree, but this is one of the most graphically intensive games, much more demanding than HFW or GoWR in many respects, especially cinematics: probably the best-looking cinematics of the current gen. Again, HFW and Ragnarok don't compare in cinematics, and that is where performance struggles the most in TLOU Pt. 1.
 

Gaiff

SBI’s Resident Gaslighter
Lol but I've been here longer than you. Where is this so called warring you see me committing? If someone claims ABC GPU should perform better than XYZ platform, I have the right to inquire about their reasoning and surely they should be able to provide them. Maybe you should take a nap and come join us when you've settled down from your temper tantrum.
Bullshit. You only do that when it pertains to the PS5. Someone claims the PS5 outperforms a 4090, you'll like the post. Someone claims the opposite, suddenly, you're interested in the reasoning. You got embarrassed in the Spider-Man Miles Morales for PC thread when you got exposed spouting falsehoods and inaccurate information. You aren't actually interested in discussing GPU performance or anything else. Otherwise, we'd regularly see you in PC performance or other similar threads. You only pop up when the PS5 is involved and 100% of the time, it's to white-knight it. You're a textbook warrior. Don't pretend that you wanna have an honest discussion.

You also showed your true colors in the Starfield thread when you went "hur dur, show gameplay", ignoring the footage, the long one they showed last year, and the planned event for June. Then you went full troll mode with your "I don't game on Xbox so I'm not used to waiting". You don't game on Xbox or on PC so why do you care about Starfield exactly? Why are you in almost every PC thread defending the PS5 like your life depended on it? Oh, I know why! You're a warrior. The least you can do is be honest about it.
 

ChiefDada

Gold Member
Bullshit. You only do that when it pertains to the PS5. Someone claims the PS5 outperforms a 4090, you'll like the post. Someone claims the opposite, suddenly, you're interested in the reasoning. You got embarrassed in the Spider-Man Miles Morales for PC thread when you got exposed spouting falsehoods and inaccurate information. You aren't actually interested in discussing GPU performance or anything else. Otherwise, we'd regularly see you in PC performance or other similar threads.

You'll see me more in PS5 threads because... I only play on PS5. It's my platform of choice. Does that mean I'm disbarred from discussing tech and platform comparisons?

You only pop up when the PS5 is involved and 100% of the time, it's to white-knight it. You're a textbook warrior. Don't pretend that you wanna have an honest discussion.

Me discussing my favorite game of all time (a PS5 game) and how PC is the best place to play to enjoy superior visuals:

Jesus Christ I don't think I'll be able to handle Returnal at max PC settings. Just way too much beauty!

Its difficulty is overblown imo; I got the plat and I'm not a hardcore gamer by any stretch. It has become my favorite game of all time and I'm sure the PC experience will take it to the next level.

This is probably the first PS5-only game where the PC version will demonstrate the corresponding power-to-perf differential. It should look amazing on a high-end PC.

My comments on Starfield and other titles not available on my preferred platform:

Oh absolutely! At this point, I can't imagine a scenario where Starfield doesn't do very well both critically and commercially (huge tick up in GP subscribers/console sales upon release). As for Redfall, I've been a fan since the announcement trailer, and it looks better with every subsequent info drop.

As a huge fan of JSRF, I must say it looks drop dead gorgeous. Enjoy, Xbox/PC peeps.

GoW Ragnarok
TLOU Remake
And even though I know next to nothing about it, I feel like I'll end up wanting to play Starfield

PS5 was quicker out of the gate with their next-gen games and related implementations, but it looks like next-gen content for Series consoles will be coming this year, so exclusives such as the anticipated AAA Starfield that's coming in December should provide that next-gen feel you currently have with PS5.


Peoples Choice Awards GIF by NBC
 