
[DF]: The Last of Us Part 1 PC vs PS5 - A Disappointing Port With Big Problems To Address

You'd hope they can get a fix pumped out quickly, but I guess it boils down to how fundamentally flawed the port is. I'd also assume it's all hands on deck to assess this. If the fix is going to take considerable time, Sony could do themselves a favor on the PR front by working with storefronts to extend refund windows and by offering purchasers deep-discount coupons for repurchase once it's fixed.

Hopefully they can do a Horizon Zero Dawn and properly turn it around. They seem very active and are working hard on the updates; just a shame it wasn't done before they sold it for fifty fuckin' squid, innit!
Honestly, that Digital Foundry video was grim. But I hope it's not as hard as it seems to me.
 

Gaiff

SBI’s Resident Gaslighter
Supposedly (and I have no idea where I read this) Nixxes is working on Ratchet & Clank after both Spider-Man games.
Sony probably trusted Naughty Dog way too much with this.
You read this on their hiring page where they state they're looking for someone with experience with a middleware used in R&C which is the only Sony game that uses it.
I'm sure Naughty Dog themselves must feel like crap right now...but still, it's an unfortunate situation and once the hype dies, the game won't do great numbers on PC anymore. It's literally at #43 right now while Resident Evil 4 remake that came out a few days before is still at #2.
Sony dropped the ball on that one. I'm sure Naughty Dog could have gotten this port up to speed but it was rushed. You know they were damn well aware of the issues but had a deadline, and Sony had already allowed them a 3 1/2 week delay. The show had just ended and they wanted to ride the hype. This of course backfired spectacularly, but I trust that Naughty Dog, given more time, would have been able to deliver a competent port. This needed another month of work.
 

winjer

Gold Member

GeForce Hotfix Driver Version 531.58


This hotfix addresses the following issues:
  • [The Last of Us Part 1] Game may randomly crash during gameplay on GeForce RTX 30 series GPUs [4031676]
  • Assassin’s Creed Origins may have stability issues when using 531.18. [4008770]
  • [Resident Evil 4 Remake] Corruption in the game when FXAA enabled [4051903]
 

SlimySnake

Flashless at the Golden Globes
How the hell didn't Sony put Nixxes on this way before? Like...they bought them for this and they didn't think it would be good to have their premium studio for ports taking care of TLOU? It's honestly baffling to me.

I know Naughty Dog wanted to do it...but at what cost?
Look at the latest ND announcement, they are becoming a multiplatform studio now developing PC games simultaneously.

Herman told us as much when he said PS GaaS games will come to PC on day one. Nixxes will be used to port single player games while GG, Insomniac, Bend and ND all have their internal teams learn how to develop on PC. It's no secret that Bend did their own PC port as well seeing as how their next game is GaaS. It will ship on PC on day one.

Nixxes is probably working on porting Spiderman 2 to PC.
 

Turk1993

GAFs #1 source for car graphic comparisons
Exactly what I said. That CP2077 has low res textures. That's why it uses so little vram.
But TLOU Part 1 and 2 also have some low-res textures if you look for them; every game does, especially open-world games like CP2077.
 

SlimySnake

Flashless at the Golden Globes
You read this on their hiring page where they state they're looking for someone with experience with a middleware used in R&C which is the only Sony game that uses it.

Sony dropped the ball on that one. I'm sure Naughty Dog could have gotten this port up to speed but it was rushed. You know they were damn well aware of the issues but had a deadline, and Sony had already allowed them a 3 1/2 week delay. The show had just ended and they wanted to ride the hype. This of course backfired spectacularly, but I trust that Naughty Dog, given more time, would have been able to deliver a competent port. This needed another month of work.
Everyone keeps bringing up the show but the show already ended weeks ago.

The timing of this release is very curious because it came right at the end of the fiscal year. We've seen EA do this to Bioware game after game after game. Sony wanted this game out before March 31st so that its revenue and profit could be counted for this year.

Capcom did the same with RE4. It's just what multiplatform publishers do when they no longer care about brand recognition. Profits are all that matter.
 

Stooky

Member
How the hell didn't Sony put Nixxes on this way before? Like...they bought them for this and they didn't think it would be good to have their premium studio for ports taking care of TLOU? It's honestly baffling to me.

I know Naughty Dog wanted to do it...but at what cost?
For a good PC port, Nixxes should have been involved during the entire production, basically co-developing while the PS5 version is being made. Porting a game from console to PC is not easy. I think at the time when they joined Sony there just wasn't enough time, and they may have other projects on their schedule. Just one of those things.
 

Kataploom

Gold Member
I remember when people argued for over a decade that your GPU is too slow anyway by the time you run out of VRAM.

Although they might have been right for a long time, this doesn't seem that accurate any more this generation.
Well, that's because of Nvidia... their cards are too VRAM-starved. Everyone barely educated on the subject saw that coming the moment the cards were announced; I remember GAF crying over it. But I barely saw any YouTuber call it a problem even then. Now they're all actively shitting on 8GB of VRAM because it's a hot topic lol

I'm seeing complaints all across the spectrum. You don't get a mostly negative score on Steam with problems on one resolution setting alone.
Let alone one that isn't even mainstream; most PC players play on a 1080p screen, and that's far from changing.
 

winjer

Gold Member

The Last of Us Part I Update 1.0.1.7 Release Notes​

  • Fixed an issue which could cause the Xbox controller stick inputs to erroneously read as zeros for brief periods of time
  • Fixed an issue where the ‘Reset to Default’ function in the Graphics menu under Settings could make improper selections
  • Fixed an issue where the HUD performance monitors could impact performance when enabled
  • Fixed an issue where a crash could occur when using [ALT+ENTER] to toggle between Fullscreen and Windowed modes
  • Fixed an issue where a memory crash could occur during the transition from the end of the game into the credits sequence
  • Fixed an issue that could cause a crash while the game launched
  • Added additional crash report logs to provide further insight for developers
  • Added a new feature where users will be prompted to enable additional GPU diagnostic tooling following a GPU-related crash (optional and only enabled for the current gameplay session)
 

GustavoLT

Member

The Last of Us Part I Update 1.0.1.7 Release Notes​


What we want:
- fixed stuttering
- fixed CPU and GPU usage
- overall smoother gameplay on low-, mid-, and high-end PCs

What we got?
- fixed game randomly crashing while you play eating Doritos
 

winjer

Gold Member
What we want:
- fixed stuttering
- fixed CPU and GPU usage
- overall smoother gameplay on low-, mid-, and high-end PCs

What we got?
- fixed game randomly crashing while you play eating Doritos

You do realize I don't work for Naughty Dog.
I just saw the patch on Steam and posted the notes.
You can tell all those things to Naughty Dog in the Steam forums, or use the reporting tool.
 

SlimySnake

Flashless at the Golden Globes
okay so for starters I had to reduce my VRAM usage to 7.3 GB, because Game Bar recording takes around 200-250 MB. So I had to use native 1440p.

I made a lot of erratic turns with the mouse; I couldn't trigger hitches or stalls with 7.3 GB VRAM usage (7.5 GB in-game usage though).

Is it happening when you play a lot, or should I try a more prolonged test? Mind you, I'm on 16 GB RAM too... so my RAM is also stressed out. The 45 fps cap is because my 2700 craps out with anything above; even 45 fps is pushing it, but luckily this map ran all right.



so native 1440p with high textures is possible while recording a video (but I'm sure recording the video will sooner or later break the game)

try 11 GB settings maybe?

I'm starting to believe what I'm seeing is an extreme outlier. Maybe I should just keep it to myself and just enjoy the games from now on. dunno

Could be your frame cap. Definitely an outlier, especially considering your CPU.

I am using an 11700K, which is working overtime in this game. I've never seen it go above 65 degrees; here it sits at 75 degrees.

11GB is basically what I get when I go to native 4K resolution and it starts to stutter. I will try changing everything to ultra at 4K DLSS Quality and see if that causes stutters.

And while the 4K DLSS drop to 40 was definitely after a prolonged play session, the change to high native 4K was a brand new session, and within a couple of minutes it started stuttering really badly.

This guy has the same stutters I do when he switched to 4K. He saw an instant performance drop to 30 fps, which makes sense, but then he started getting massive stutters.

 

GHG

Member


it seems it's going to get worse.


Been obvious for a while.

This subject is the most basic litmus test for determining whether or not a YouTube/media outlet or individual knows what they are talking about when it comes to the PC gaming space. Too many people are in denial because they spent too much money on parts that aren't well set up for what comes next (or they've been giving out poor PC building advice for too long to now change their stance).

People can be stubborn all they like, things will simply move on without them.
 

GustavoLT

Member
You do realize I don't work for Naughty Dog.
I just saw the patch on Steam and posted the notes.
You can tell all those things to Naughty Dog in the Steam forums, or use the reporting tool.
Maybe I did something wrong; I just used the reply button to quote the post... What I mean is that they fixed a lot of things except the most important: performance!
 

yamaci17

Member
Could be your frame cap. Definitely an outlier, especially considering your CPU.

I am using an 11700K, which is working overtime in this game. I've never seen it go above 65 degrees; here it sits at 75 degrees.

11GB is basically what I get when I go to native 4K resolution and it starts to stutter. I will try changing everything to ultra at 4K DLSS Quality and see if that causes stutters.

And while the 4K DLSS drop to 40 was definitely after a prolonged play session, the change to high native 4K was a brand new session, and within a couple of minutes it started stuttering really badly.

This guy has the same stutters I do when he switched to 4K. He saw an instant performance drop to 30 fps, which makes sense, but then he started getting massive stutters.


You cannot run a 9.2 GB game application on 8 GB smoothly though? You have to make sure your game application is <7.5 GB on an 8 GB GPU.

Are you getting a red warning? You should be in the yellow. Ideally, you can push things up to 114-115%. That's what I meant... don't go overboard with the game application using more than what you have in total. Can you share a screenshot of your VRAM bar? Maybe I communicated some stuff wrongly :/
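To make the budgeting rule above concrete, here is a minimal sketch; the function names are made up, and the 0.5 GB overhead figure is an assumption matching the recording-overhead numbers mentioned in this thread:

```python
# Hypothetical helper illustrating the rule of thumb above: keep the
# game's own VRAM allocation below total VRAM minus background overhead
# (OS/compositor, recording software, etc.). Figures are illustrative.

def safe_game_vram_budget_gb(total_vram_gb, overhead_gb=0.5):
    """Return a conservative VRAM budget for the game itself."""
    return round(total_vram_gb - overhead_gb, 2)

def is_within_budget(game_usage_gb, total_vram_gb, overhead_gb=0.5):
    """True if the game's allocation fits inside the safe budget."""
    return game_usage_gb <= safe_game_vram_budget_gb(total_vram_gb, overhead_gb)

# An 8 GB card with ~0.5 GB of background overhead leaves ~7.5 GB,
# matching the "<7.5 GB on an 8 GB GPU" rule above.
print(safe_game_vram_budget_gb(8))   # 7.5
print(is_within_budget(9.2, 8))      # False: a 9.2 GB app on 8 GB stutters
print(is_within_budget(7.3, 8))      # True
```

The 114-115% figure in the post refers to allocation slightly above physical VRAM being tolerable before hard stalls set in; the helper above only models the conservative case.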
 

SlimySnake

Flashless at the Golden Globes
You cannot run a 9.2 GB game application on 8 GB smoothly though? You have to make sure your game application is <7.5 GB on an 8 GB GPU.

Are you getting a red warning? You should be in the yellow. Ideally, you can push things up to 114-115%. That's what I meant... don't go overboard with the game application using more than what you have in total. Can you share a screenshot of your VRAM bar? Maybe I communicated some stuff wrongly :/
i am on a 10 GB 3080.
 

yamaci17

Member
i am on a 10 GB 3080.
I referred to the video where he tries to run a 9.2 GB game application; of course it will stutter on 8 gigs! For you... ideally... the game shouldn't have behaved badly with 8.9-9 GB of game application usage. Native 4K is a tough nut to crack in this game though. In the end you can still get away with high textures comfortably within an 8.5 GB game application budget... and that's where your card should really start to feel any pressure.
 

Gaiff

SBI’s Resident Gaslighter
Everyone keeps bringing up the show but the show already ended weeks ago.

The timing of this release is very curious because it came right at the end of the fiscal year. We've seen EA do this to Bioware game after game after game. Sony wanted this game out before March 31st so that its revenue and profit could be counted for this year.

Capcom did the same with RE4. It's just what multiplatform publishers do when they no longer care about brand recognition. Profits are all that matter.
It was likely both. The show ended just 16 days before the game's release, so it was and still is fresh in people's minds. Then it was also the end of the fiscal year. They didn't give a damn and got it out the door.
 

SlimySnake

Flashless at the Golden Globes
For "current gen looking" games, yes. For game like this that looks worse than TLOU2 on PS4, 8GB should be more than enough.
The Matrix city demo topped out at 7.5 GB on my 3080. That's an open-world demo with 100,000 fully simulated cars and 30,000 pedestrians, pushing visuals a generation ahead of TLOU2. Not to mention ray tracing.

This is just a very poor port. I wouldn't be surprised if VSG studios came up with a really dumb way to port this engine to PS5 and ND got stuck with porting that to PC. Even the PS5 should be doing way better than 1440p 60 fps for a game like this. TLOU2 has way bigger areas and it also runs at 1440p 60 fps via BC.
 

GHG

Member
For "current gen looking" games, yes. For game like this that looks worse than TLOU2 on PS4, 8GB should be more than enough.

The thing is, while the final outcome might not be favorable, the techniques used in this and many other recent games are what we will typically see going forwards.

This is why "unoptimised" isn't a very wise thing to say. All of this is explained by the UE5 dev in the interview linked above. The way that textures and assets get loaded in now is different to how it was before and there are several reasons for that. We might not like the visual outcome as it stands, but if this is the way that things will be done going forwards in instances where games being ported from console to PC (instead of the other way round) then 8GB VRAM cards will continue to be on their knees.
 

yamaci17

Member
The Matrix city demo topped out at 7.5 GB on my 3080. That's an open-world demo with 100,000 fully simulated cars and 30,000 pedestrians, pushing visuals a generation ahead of TLOU2. Not to mention ray tracing.

This is just a very poor port. I wouldn't be surprised if VSG studios came up with a really dumb way to port this engine to PS5 and ND got stuck with porting that to PC. Even the PS5 should be doing way better than 1440p 60 fps for a game like this. TLOU2 has way bigger areas and it also runs at 1440p 60 fps via BC.
I feel like TLOU2 has "more natural" lighting than this one. I can't quite explain it, but... TLOU2 hit different. I'm not necessarily saying TLOU2 looks better, but... there's something special to it.
 

SlimySnake

Flashless at the Golden Globes
I feel like TLOU2 has "more natural" lighting than this one. I can't quite explain it, but... TLOU2 hit different. I'm not necessarily saying TLOU2 looks better, but... there's something special to it.
Nah, that's a fairly accurate statement. I've said this several times now, but the lighting in Part 1 is very gamey and very reminiscent of the PS3 game. Not in terms of tech, but the art direction of the lighting. It's very obviously an artistic choice, because we know this engine can do far more photorealistic lighting, but I think their decision to stick with the original game's lighting really hurts selling this as a next-gen game.

The UE3-quality lighting in Bill's Town and the early parts of Pittsburgh didn't do anything for me. TLOU2 during daytime in Seattle Day 1 and Jackson looked phenomenal. Less so in Santa Barbara, but it's a huge upgrade over Pittsburgh and Bill's Town. Boston looked great and on par with TLOU2.
 

rodrigolfp

Haptic Gamepads 4 Life
Even the PS5 should be doing way better than 1440p 60 fps for a game like this. TLOU2 has way bigger areas and it also runs at 1440p 60 fps via BC.
100%. Both versions are underperforming for how they look.

This is why "unoptimised" isn't a very wise thing to say.
It is 100% unoptimized if it's coded in a way that is worse for the system to run.
 

ChiefDada

Gold Member
The I/O can save some memory on the PS5, mostly by shifting cache data in and out of memory faster.
But let's remember that the SSD is orders of magnitude slower than GDDR6. It cannot keep up with a rendering engine and its buffers.
And to make things worse, the PS5 does not support sampler feedback.

The SSD doesn't need to be as fast as GDDR6. And are you aware of what SFS does and what it's trying to accomplish? If so, you would understand why the PS5 wouldn't need it. Not to be crass, but SFS is a joke compared to PS5 I/O. Same goes for mesh shaders vs the Geometry Engine.
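For scale on the bandwidth gap being debated in this exchange, a back-of-envelope comparison; the figures are approximate public specs, used purely for illustration:

```python
# Rough comparison of PS5 SSD bandwidth vs GDDR6 main memory bandwidth.
# Figures are approximate public specs, for illustration only.
ssd_raw_gbps = 5.5         # PS5 SSD, raw read bandwidth (GB/s)
ssd_compressed_gbps = 9.0  # typical effective rate with Kraken/Oodle
gddr6_gbps = 448.0         # PS5 GDDR6 memory bandwidth (GB/s)

print(f"GDDR6 vs raw SSD:        ~{gddr6_gbps / ssd_raw_gbps:.0f}x")
print(f"GDDR6 vs compressed SSD: ~{gddr6_gbps / ssd_compressed_gbps:.0f}x")
```

The two orders of magnitude (~50-80x) are why the SSD can feed a streaming pool but can't stand in for memory inside the rendering pipeline, which is the distinction both sides here keep circling.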
 

winjer

Gold Member
The SSD doesn't need to be as fast as GDDR6. And are you aware of what SFS does and what it's trying to accomplish? If so, you would understand why the PS5 wouldn't need it. Not to be crass, but SFS is a joke compared to PS5 I/O. Same goes for mesh shaders vs the Geometry Engine.

So why doesn't it need to be as fast as GDDR6? If it's only to fill a streaming pool, then yes, it will do the job. But it can never, ever replace memory in a rendering pipeline. It's just too slow.

Seems to me you do not understand what Sampler Feedback is. Otherwise you would not compare it to I/O.
 
The SSD doesn't need to be as fast as GDDR6. And are you aware of what SFS does and what it's trying to accomplish? If so, you would understand why the PS5 wouldn't need it. Not to be crass, but SFS is a joke compared to PS5 I/O. Same goes for mesh shaders vs the Geometry Engine.

That's a weird comparison.

SFS is a hardware solution by AMD to optimise texture streaming on RDNA 2 and Series X/S. The PS5 can easily run an SFS system on software but there will be a small performance cost, in fact UE5 has a similar solution called "Virtual Texturing" which is what allowed them to push 8K textures on the PS5 demo. The goal is to keep only relevant texture data in RAM since it has such a large footprint in terms of memory usage.

There's no doubt that Sony first parties will be making heavy use of these systems in their upcoming titles if they haven't already.
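To put a number on why "keep only relevant texture data in RAM" matters, here is a rough footprint calculation for a single 8K texture; the formats and figures are illustrative:

```python
# Memory footprint of one 8K texture, uncompressed vs block-compressed.
# (Top mip only; a full mip chain adds roughly another third.)
width = height = 8192
rgba8_bytes = width * height * 4   # RGBA8: 4 bytes per texel
bc7_bytes = width * height         # BC7: 8 bits (1 byte) per texel

print(rgba8_bytes / 2**20)  # 256.0 MiB uncompressed
print(bc7_bytes / 2**20)    # 64.0 MiB block-compressed
```

Even compressed, a handful of resident 8K textures would swamp a VRAM budget, which is why all of these systems, whatever the branding, stream only the tiles actually visible.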
 

kingyala

Banned
That's a weird comparison.

SFS is a hardware solution by AMD to optimise texture streaming on RDNA 2 and Series X/S. The PS5 can easily run an SFS system on software but there will be a small performance cost, in fact UE5 has a similar solution called "Virtual Texturing" which is what allowed them to push 8K textures on the PS5 demo. The goal is to keep only relevant texture data in RAM since it has such a large footprint in terms of memory usage.

There's no doubt that Sony first parties will be making heavy use of these systems in their upcoming titles if they haven't already.
They are all virtual texturing systems... virtual texturing is as old as video games. Xbox 360 called it mega textures, Xbox One called it tiled resources, PS4 called it partially resident textures. They all exist in software and hardware on both Nvidia and AMD GPUs as well. SFS is just a new buzzword on the Series consoles, just a new way of doing virtual textures; it is not going to magically be more performant than the PS5's I/O. You can't defeat physical data flows; it's like downloading RAM. The PS5 can physically send 5-8 GB of data per second, up to 22 GB/s of compressed Oodle data, to system memory. This is physical; there's no virtual texturing voodoo that can compensate for this.
 

ChiefDada

Gold Member
So why doesn't it need to be as fast as GDDR6? If it's only to fill a streaming pool, then yes, it will do the job. But it can never, ever replace memory in a rendering pipeline. It's just too slow.

Jesus, why do people keep repeating this nonsense after all these years? The SSD isn't meant to replace RAM, but rather to make better use of it. The SSD (and I/O) doesn't need to be VRAM to have a significant effect on fidelity and performance (do you remember why we're here, what we're talking about in this thread?). Ok everyone, let's all hold hands and say it together now:



Seems to me you do not understand what Sampler Feedback is. Otherwise you would not compare it to I/O.

Are you joking? It's absolutely I/O related. Hell, even Microsoft says this. It's one of the core tenets of the #VelocityArchitecture. I've never heard someone argue it's not, and I'm surprised it's coming from you.

That's a weird comparison.

SFS is a hardware solution by AMD to optimise texture streaming on RDNA 2 and Series X/S. The PS5 can easily run an SFS system on software but there will be a small performance cost, in fact UE5 has a similar solution called "Virtual Texturing" which is what allowed them to push 8K textures on the PS5 demo. The goal is to keep only relevant texture data in RAM since it has such a large footprint in terms of memory usage.

I'm not talking about SFS-like programs, I'm talking about the supposed bespoke hardware in the Xbox and whichever PC GPUs that supposedly works in tandem with SFS. Partially resident textures have been around for some time now. The Xbox supposedly has hardware in the GPU to better manage the transition from low mip to high in the event that a request from the GPU isn't loaded into memory in time. It's a backstop for any disk-to-VRAM latency causing the higher mip to not load in time.

Simply put, this should never happen on PS5.



Also, Nanite was the key feature of UE5, with its ability to do software rasterization much faster/more efficiently than the hardware rasterizer, which is why I say mesh shaders aren't so amazing now. The ability to push 8K textures had more to do with PS5 decompression. Brian Karis admitted so himself.
 

SlimySnake

Flashless at the Golden Globes
This guy also confirmed smaller VRAM and RAM usage, but no performance boost.

Sadly, I'm not seeing it in my testing. Same VRAM usage, and native 4K is still causing massive stutters.
 

Hoddi

Member
This guy also confirmed smaller VRAM and RAM usage, but no performance boost.

Sadly, I'm not seeing it in my testing. Same VRAM usage, and native 4K is still causing massive stutters.

I hate to be that guy but I genuinely don't think patches are gonna change much. I think the game simply needs this much memory because it's made for a 16GB console and not an 8GB one. We'll have to see about Friday's patch but I wouldn't get my hopes up.

Personally, I think this is simply a repeat of AC Unity back in 2014. PC gamers back then swore up and down that their shitty 2GB GPUs should totally be enough to run the game with 4xMSAA. But the simple truth was that they had no idea what they were talking about because the game needed ~3GB without MSAA. The game's reputation has suffered badly for it even to this day.
 

SlimySnake

Flashless at the Golden Globes
I hate to be that guy but I genuinely don't think patches are gonna change much. I think the game simply needs this much memory because it's made for a 16GB console and not an 8GB one. We'll have to see about Friday's patch but I wouldn't get my hopes up.

Personally, I think this is simply a repeat of AC Unity back in 2014. PC gamers back then swore up and down that their shitty 2GB GPUs should totally be enough to run the game with 4xMSAA. But the simple truth was that they had no idea what they were talking about because the game needed ~3GB without MSAA. The game's reputation has suffered badly for it even to this day.
You are probably right, but this patch did reduce the VRAM and RAM usage by a significant amount. Getting 1 GB back in just a week of debugging is no small feat.

Also, the PS5 only has 12.5 of its 16 GB available for games. I remember the VRAM debacle of last-gen cards, but we were looking at a 2.25-5x increase in VRAM when compared to consoles. My 570 was as powerful as a PS4 in terms of performance, but its 1 GB of VRAM held it back. Same goes for 2 GB cards, but 3 GB cards were fine. The GTX 970 had what, 3 GB of VRAM after Nvidia's fuck-up? Or was it 3.5? That was good enough to run every PS4 game at double the framerate at better settings. Most PS4 games used 3 GB for VRAM and the rest for CPU tasks. I think I posted Killzone Shadow Fall's RAM usage earlier in the thread.

Even if we assume they are using 10 GB of VRAM for the GPU, then the 3080s shouldn't have any of the stuttering problems we are seeing today.
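The post's reasoning reduces to simple arithmetic; the CPU-side share below is an assumption for illustration, not a published figure:

```python
# If the console reserves part of its unified memory for CPU-side work,
# the remainder is the effective "VRAM" a PC GPU has to match.
console_game_pool_gb = 12.5  # PS5 memory available to games (cited above)
cpu_side_gb = 2.5            # assumed CPU/system-side share of that pool

gpu_budget_gb = console_game_pool_gb - cpu_side_gb
print(gpu_budget_gb)  # 10.0 -> a 10 GB card should match the console's GPU share
```

Under that split, a 10 GB 3080 matches the console's GPU-side budget outright, which is why persistent stutter there points at the port rather than the hardware.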
 

yamaci17

Member
I hate to be that guy but I genuinely don't think patches are gonna change much. I think the game simply needs this much memory because it's made for a 16GB console and not an 8GB one. We'll have to see about Friday's patch but I wouldn't get my hopes up.

Personally, I think this is simply a repeat of AC Unity back in 2014. PC gamers back then swore up and down that their shitty 2GB GPUs should totally be enough to run the game with 4xMSAA. But the simple truth was that they had no idea what they were talking about because the game needed ~3GB without MSAA. The game's reputation has suffered badly for it even to this day.
2 GB GPUs vs 8 GB consoles is not similar to 8 GB GPUs vs 16 GB consoles.



Any 16 GB RAM / 8 GB VRAM owner can use high textures at 1440p DLSS Quality with 7.3 GB usage and have smooth operation, or native 1440p with 7.5 GB usage (that needs extreme measures of background VRAM culling, however).

4 GB VRAM is the new 2 GB VRAM, not 8 GB.
 

winjer

Gold Member
Jesus, why do people keep repeating this nonsense after all these years? The SSD isn't meant to replace RAM, but rather to make better use of it. The SSD (and I/O) doesn't need to be VRAM to have a significant effect on fidelity and performance (do you remember why we're here, what we're talking about in this thread?). Ok everyone, let's all hold hands and say it together now:

I'm one of the few that have been saying, from day one, that the SSD can't replace memory. Don't try to invert the conversation.
The SSD can load faster than an HDD. That's it.

Are you joking? It's absolutely I/O related. Hell, even Microsoft says this. It's one of the core tenets of the #VelocityArchitecture. I've never heard someone argue it's not, and I'm surprised it's coming from you.

I'm not talking about SFS-like programs, I'm talking about the supposed bespoke hardware in the Xbox and whichever PC GPUs that supposedly works in tandem with SFS. Partially resident textures have been around for some time now. The Xbox supposedly has hardware in the GPU to better manage the transition from low mip to high in the event that a request from the GPU isn't loaded into memory in time. It's a backstop for any disk-to-VRAM latency causing the higher mip to not load in time.

Simply put, this should never happen on PS5.



Also, Nanite was the key feature of UE5, with its ability to do software rasterization much faster/more efficiently than the hardware rasterizer, which is why I say mesh shaders aren't so amazing now. The ability to push 8K textures had more to do with PS5 decompression. Brian Karis admitted so himself.

Sampler Feedback is only a set of instructions that allows the GPU to better identify the tiles of textures needed for every frame.
This means that instead of loading as much as it can into memory and hoping it has what's necessary, SF will identify and tell the I/O exactly what it needs to load. This saves on memory usage, as there is less waste.
But this is not an I/O instruction set. It's just a better way of telling the I/O what is needed for every frame.
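The mechanism described above can be sketched in a few lines; all names here are illustrative, not a real graphics API:

```python
# Toy sketch of feedback-driven streaming: instead of loading everything
# it might need, the renderer records which texture tiles it actually
# sampled this frame, and only those become I/O requests.

def stream_tiles(sampled, resident):
    """sampled: set of (texture, tile) pairs the GPU touched this frame
    (the feedback). resident: tiles currently in VRAM.
    Returns (requests, evictable)."""
    requests = sampled - resident    # load exactly what was missing
    evictable = resident - sampled   # unsampled tiles can be evicted
    return requests, evictable

requests, evictable = stream_tiles(
    sampled={("wall_albedo", 3), ("wall_albedo", 4)},
    resident={("wall_albedo", 3), ("floor_albedo", 0)},
)
print(sorted(requests))   # [('wall_albedo', 4)]
print(sorted(evictable))  # [('floor_albedo', 0)]
```

The hardware feature only produces the `sampled` set cheaply and precisely; the actual loading is still ordinary I/O, which is the distinction being drawn in the post above.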
 
Matrix city demo topped out at 7.5 GB on my 3080. Thats an open world game with fully simulated 100,000 cars and 30,000 pedestrians pushing visuals a generation ahead of TLOU2. Not to mention ray tracing.

This just a very poor port. I wouldnt be surprised if VSG studios came up with a really dumb way to port this engine to PS5 and ND got stuck with porting that to PC. Even the PS5 should be doing way better than 1440p 60 fps for a game like this. TLOU2 has way bigger areas and it also runs at 1440p 60 fps via BC.
The Matrix demo doesn't have the asset diversity this game has and, like most demos, repeats its assets ad nauseam. It's not about being linear or open world. It's about how many unique objects and (high-resolution) textures the game has in each scene. This has been the hallmark of Naughty Dog's games since the Uncharted games. This "linear" short game, using the very optimized Oodle compression, needs no less than 100GB to store everything.
 

ChiefDada

Gold Member
I'm one of the few that have been saying, from day one, that the SSD can't replace memory. Don't try to invert the conversation.
The SSD can load faster than an HDD. That's it.

Again, I don't know why you feel you have to say this when nobody is saying the SSD is replacing RAM.

Sampler Feedback is only a set of instructions that allows the GPU to better identify the tiles of textures needed for every frame.
This means that instead of loading as much as it can into memory and hoping it has what's necessary, SF will identify and tell the I/O exactly what it needs to load. This saves on memory usage, as there is less waste.
But this is not an I/O instruction set. It's just a better way of telling the I/O what is needed for every frame.

Great. Now what makes you think the PS5 is more prone to loading things that don't need to be in memory? This has literally been Sony's premier marketing beat from day one.
 

winjer

Gold Member
Again, I don't know why you feel you have to say this when nobody is saying the SSD is replacing RAM.

I was explaining to another user how an SSD can't be used to keep up with a rendering engine and its buffers.
And for some reason I don't know, you quoted me and said the SSD doesn't need to be as fast as memory, as if implying it can be used in the rendering pipeline.

Great. Now what makes you think the PS5 is more prone to loading things that don't need to be in memory? This has literally been Sony's premier marketing beat from day one.

Because the PS5 does not have the instruction set for Sampler Feedback.
And Sony never talked about memory reduction usage through this technique. They talked about having a fast SSD.
 

SmokedMeat

Gamer™
Went back last night to see how things were, and the CPU is running all cores at 80%+ just sitting in the title screen doing nothing.

They seriously need to fix this joke of a port.
 