
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

jroc74

Phone reception is more important to me than human rights
Well it's pretty easy to explain why Infinite looked bad in that lighting condition given the PBR materials and set up.

He wants more technical explanation like what you get from a GDC talk other than "haha SSD go brrr". It's much easier to commentate on what's presented visually than what is behind the scenes, literally.
I'm pretty sure reading my post addresses all this.

He thought about it enough to have questions about how ray tracing is being handled... but has no speculation about how the SSD may be used...

Ok.

...A GDC talk ends all speculation...

Let's also not pretend he had previous opinions about the SSDs before that clip.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
It's pretty pathetic seeing the mob crusade against Alex because he dares "question the power of the SSD" and Rift Apart (even though nothing he said in the video is obscene or downplaying Sony's precious system). He just wants more detail on how they did what they did utilizing that SSD, and what's loaded in RAM at any given time. Chill the fuck out.
What does he want exactly? A fucking seminar on what assets are being loaded in and out of ram?

The funny thing is that Cerny painstakingly laid out the tech behind this so if he wants details he should probably watch that again. it's all right there.

If he can't figure out why loading stuff in and out of ram on camera turns is so impressive then perhaps he shouldn't be in this business. You don't need to be a game developer to understand this very simple concept.

His posts on era are proof that he's no better than a console warrior. It's time to stop putting him on a pedestal. Notice how i didn't bring up John and Richard. It's because they didn't spend the last two years downplaying the ps5 io and ssd on forums like Alex did.
 

Rea

Member
From what Alex is saying here, what he's implying in some way is that Cerny was lying during Road to PS5.
LoL.
He brings up latency and stuff, then wants confirmation.
Cerny literally said in Road to PS5 that they removed lots of potential bottlenecks by sacrificing some die space, precisely to eliminate latency (there are still a few milliseconds, but that's minuscule).
 

Darius87

Member
lol someone asked DF about Insomniac loading data in and out of VRAM on player turns. Known SSD hater, Alex, would like more information before commenting and still ended up downplaying it. lol

Timestamped:



He brings up "concerns" about the speed and latency of getting things in and out of memory even though Insomniac literally confirmed Cerny's claim that you can do it on a player camera turn. Clearly, speed and latency are no longer concerns if you can literally do it on a player-controlled camera turn. He then speculates that we don't know what the benefits are, even though the article mentions that the benefit is more detailed worlds.

For those who aren't aware of Alex's crusade against the SSD and I/O of the PS5: this is him getting called out for downplaying it last year, throwing a fit, crying to mods, and getting both posters he quoted banned in the process.
4sQ9dNU.png


We now finally have proof that the SSD and I/O combined to bring a more detailed world in Ratchet, and he wants to wait and see. Umm, how about you pick up the phone and call Insomniac? They contact studios all the time over minor framedrops, yet they have no real access to Insomniac? Even John and Richard just shrugged this off. It looks really unprofessional to punt it like that when they could just say, "hey, we need more detail on this so we are going to contact Insomniac and see if we can get it."

this is a common thing for PCMR: they know the streaming/rendering tech on PS5 is superior to standard streaming/rendering, so they ask for a specific GB/s number and think that if the PS5 SSD isn't saturated then a PC SSD could do the same thing, which isn't true.
the amount of streamed data in R&C depends on the environment; it isn't always a flatline. someone would need to ask Insomniac for the edge-case GB/s if they want a number for streaming.
but this is a great way to recognize any PCMR: they start asking for specifics or saying a PC SSD could do it before there's any proof of it.
 

Darius87

Member
It's pretty pathetic seeing the mob crusade against Alex because he dares "question the power of the SSD" and Rift Apart (even though nothing he said in the video is obscene or downplaying Sony's precious system). He just wants more detail on how they did what they did utilizing that SSD, and what's loaded in RAM at any given time. Chill the fuck out.
so ignoring the statements of the Insomniac dev is alright for Alex? i mean, if a dev says that they can focus (key word) all of the memory on what's in front of you, because the PS5 SSD allows that, but somehow Alex doesn't care, just ignores it, and wants specific GB/s numbers, that just shows what his "concern" really is.
if all the memory in RAM is what's in front of you, that means there are no assets in memory for what's behind you. it's that simple, but yeah, just keep ignoring this.
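To make the idea concrete, here's a toy sketch of that kind of camera-facing streaming (all names invented; this is not Insomniac's actual code, just the concept of evicting what's behind the camera and loading what's in front):

```python
import math

def facing(camera_pos, camera_dir, asset_pos, fov_cos=0.0):
    """True if the asset lies in the hemisphere the camera is facing (2D toy)."""
    dx = asset_pos[0] - camera_pos[0]
    dy = asset_pos[1] - camera_pos[1]
    length = math.hypot(dx, dy) or 1.0
    dot = (dx * camera_dir[0] + dy * camera_dir[1]) / length
    return dot >= fov_cos

def update_streaming(camera_pos, camera_dir, assets, resident):
    """Return (to_load, to_evict) sets for this frame."""
    wanted = {name for name, pos in assets.items()
              if facing(camera_pos, camera_dir, pos)}
    to_load = wanted - resident    # in front but not in memory yet
    to_evict = resident - wanted   # in memory but now behind the camera
    return to_load, to_evict

assets = {"rock": (10, 0), "tree": (-10, 0)}
resident = {"tree"}
# Camera at origin, now facing +x: "rock" should load, "tree" should evict.
load, evict = update_streaming((0, 0), (1, 0), assets, resident)
```

The point isn't the math, it's that the residency decision is driven by camera direction every frame, which is only viable if the storage can refill memory within a turn.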
 

JediMind

Neo Member
so ignoring the statements of the Insomniac dev is alright for Alex? i mean, if a dev says that they can focus (key word) all of the memory on what's in front of you, because the PS5 SSD allows that, but somehow Alex doesn't care, just ignores it, and wants specific GB/s numbers, that just shows what his "concern" really is.
if all the memory in RAM is what's in front of you, that means there are no assets in memory for what's behind you. it's that simple, but yeah, just keep ignoring this.
I think the problem with Alex is that he is way too biased. He loves PC and really thinks that consoles can't compete in any way at all. Of course PC is better for the most part, since it's way more expensive, but there COULD be stuff the console does better due to being a closed box, and he simply ignores that.
I enjoy his videos, but every time he talks about graphics and consoles his bias really shines through. I mean, he thinks Metro Exodus was the best looking game just because of some ray traced lighting that didn't even look good (just compare it with the enhanced version). And let's be honest, there are a lot of prettier games than Metro out there :)

Also, when Sony (and Microsoft as well) decide to put some really fast and expensive tech (SSDs) in their consoles, they do it for a reason. They would never increase the BOM if it weren't beneficial for the games and, in the long term, their value proposition.
 
Last edited:

Dibils2k

Member
I think Returnal using a 1080p image as base is very impressive. I certainly couldnt tell. I thought it was a 1440p 60 fps title like Demon Souls this whole time.
i could

Ratchet will be an interesting one, with the amount of detail, and long distance detail too. i wonder how the performance mode will look with its lower res. More fine detail means higher res is more important
 

sncvsrtoip

Member
lol someone asked DF about Insomniac loading data in and out of VRAM on player turns. Known SSD hater, Alex, would like more information before commenting and still ended up downplaying it. lol

Timestamped:



He brings up "concerns" about the speed and latency of getting things in and out of memory even though Insomniac literally confirmed Cerny's claim that you can do it on a player camera turn. Clearly, speed and latency are no longer concerns if you can literally do it on a player-controlled camera turn. He then speculates that we don't know what the benefits are, even though the article mentions that the benefit is more detailed worlds.

For those who aren't aware of Alex's crusade against the SSD and I/O of the PS5: this is him getting called out for downplaying it last year, throwing a fit, crying to mods, and getting both posters he quoted banned in the process.
4sQ9dNU.png


We now finally have proof that the SSD and I/O combined to bring a more detailed world in Ratchet, and he wants to wait and see. Umm, how about you pick up the phone and call Insomniac? They contact studios all the time over minor framedrops, yet they have no real access to Insomniac? Even John and Richard just shrugged this off. It looks really unprofessional to punt it like that when they could just say, "hey, we need more detail on this so we are going to contact Insomniac and see if we can get it."

though when he talked about sampler feedback in the previous df direct he was very excited ;d typical pcmr
 
He wants more technical explanation like what you get from a GDC talk other than "haha SSD go brrr".

Well, they had their chance with Cerny. They should have asked him the right questions. If they had, they could have applied that knowledge to what's going on with Ratchet and the devs' claims.

The funny thing is that Cerny painstakingly laid out the tech behind this so if he wants details he should probably watch that again. it's all right there.

Unlike other publications, DF had direct access to Cerny. Cerny dedicated a lot of time to the I/O; maybe DF should have asked more questions about it, since it was a big part of Road to PS5.
 
Last edited:

ZywyPL

Banned
Re: LOD, pop-up etc... watching 1080p shit YT IQ...

Ratchet PS5

9:37 - grass pop up /middle centre part of the screen around the rock/
10:25 - abrupt LOD change /middle centre right, branches, bushes round tree/
12:45 - shadow pop up /middle centre, on da road/

Well, all of this is kind of odd considering the community manager recently said out loud that they flush out all the data that's behind the camera and load it back immediately when you turn around, which as we all see works flawlessly. And yet here we see pop-ups of what's in front of you, where the game had way more time to predict/load the data... And there's even an elevator section (starting at 2:14), followed by a walk slow-down and a door before you can enter the next area, which for me is the most bizarre case here, seeing how fast the game can switch between entirely different worlds. For whatever reason they added a section that looks like it was built for an HDD, really odd. Maybe it was created early, before they "cracked the code" of the PS5, and they decided to leave it as is; that's the only explanation I can think of, because it really doesn't make sense to have that kind of mechanic when you can load an entirely new map within seconds.
 
Well, all of this is kind of odd considering the community manager recently said out loud that they flush out all the data that's behind the camera and load it back immediately when you turn around, which as we all see works flawlessly. And yet here we see pop-ups of what's in front of you, where the game had way more time to predict/load the data... And there's even an elevator section (starting at 2:14), followed by a walk slow-down and a door before you can enter the next area, which for me is the most bizarre case here, seeing how fast the game can switch between entirely different worlds. For whatever reason they added a section that looks like it was built for an HDD, really odd. Maybe it was created early, before they "cracked the code" of the PS5, and they decided to leave it as is; that's the only explanation I can think of, because it really doesn't make sense to have that kind of mechanic when you can load an entirely new map within seconds.

Those look like rendering issues to me. Not loading
 

VFXVeteran

Banned
I think the problem with Alex is that he is way too biased. He loves PC and really thinks that consoles can't compete in any way at all. Of course PC is better for the most part, since it's way more expensive, but there COULD be stuff the console does better due to being a closed box, and he simply ignores that.
I enjoy his videos, but every time he talks about graphics and consoles his bias really shines through. I mean, he thinks Metro Exodus was the best looking game just because of some ray traced lighting that didn't even look good (just compare it with the enhanced version). And let's be honest, there are a lot of prettier games than Metro out there :)
I'm not going to jump into this SSD debate and Alex as it's a waste of time, however, I will address why Alex (and many other developers) think that Metro is the best looking game.

Metro is the first and only game to use a full RT lighting pipeline. Lighting/Shading is objectively the most important thing in rendering. All the other graphics features like textures and higher resolution geometry play an important role but it's not as important as getting lighting/shading the best it can be. Metro's art direction is based on an ugly, gritty, and dirty world where things don't look pretty and colorful like many of the games today that most would say looks "beautiful". I would urge people to look at the original vision from the director and judge it based on that target vision. I also thought Metro looked ugly when I first started playing it. Now though, with their complete rewrite of their engine, larger texture sizes and full PBR materials with the most advanced RT lighting pipeline to date, I can see how he would be more impressed with Metro as opposed to the color-centric games.
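As a rough illustration of why the lighting term dominates the final look, here's a toy Lambertian diffuse evaluation (a generic shading formula, nothing from Metro's actual engine): the exact same material reads completely differently depending on the incoming light, which is why improving the lighting model moves the needle more than higher-res textures.

```python
def lambert(albedo, normal, light_dir, light_color):
    """Lambertian diffuse: radiance = albedo * light_color * max(0, N.L).
    All vectors are assumed normalized 3-tuples."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(a * c * n_dot_l for a, c in zip(albedo, light_color))

albedo = (0.8, 0.2, 0.2)  # the same red material in both cases
# Light directly overhead vs. a grazing light (0.96, 0.28, 0) is unit length.
lit = lambert(albedo, (0, 1, 0), (0, 1, 0), (1.0, 1.0, 1.0))
graze = lambert(albedo, (0, 1, 0), (0.96, 0.28, 0.0), (1.0, 1.0, 1.0))
```

RT GI essentially replaces the hand-placed `light_dir`/`light_color` inputs with physically gathered ones, which is the part 4A rewrote their engine around.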
 
Last edited:

SlimySnake

Flashless at the Golden Globes
though when he talked about sampler feedback in the previous df direct he was very excited ;d typical pcmr
lol touche

Unlike other publications, DF had direct access to Cerny. Cerny dedicated a lot of time to the I/O; maybe DF should have asked more questions about it, since it was a big part of Road to PS5.
This is true. Richard got to see the Road to PS5 reveal ahead of everyone and all he wanted to ask was about variable tflops.
 

Dodkrake

Banned
Well, all of this is kind of odd considering the community manager recently said out loud that they flush out all the data that's behind the camera and load it back immediately when you turn around, which as we all see works flawlessly. And yet here we see pop-ups of what's in front of you, where the game had way more time to predict/load the data... And there's even an elevator section (starting at 2:14), followed by a walk slow-down and a door before you can enter the next area, which for me is the most bizarre case here, seeing how fast the game can switch between entirely different worlds. For whatever reason they added a section that looks like it was built for an HDD, really odd. Maybe it was created early, before they "cracked the code" of the PS5, and they decided to leave it as is; that's the only explanation I can think of, because it really doesn't make sense to have that kind of mechanic when you can load an entirely new map within seconds.

Or... Stay with me here... It's a design decision for pacing. I know, controversial.
 

Thirty7ven

Banned
From what Alex is saying here, what he's implying in some way is that Cerny was lying during Road to PS5.
LoL.
He brings up latency and stuff, then wants confirmation.
Cerny literally said in Road to PS5 that they removed lots of potential bottlenecks by sacrificing some die space, precisely to eliminate latency (there are still a few milliseconds, but that's minuscule).

This is basically it. Are we really throwing basic-ass questions about latency around after the UE5 demo?

Dude is straight up doubting both Cerny and the developers at this point. Embarrassing.
 

azertydu91

Hard to Kill
does alex have any credibility in the game development scene?
I mean, the guy couldn't spot ray tracing in a picture where everybody here noticed it.... all that after saying the PS5 had no ray tracing hardware.... after Sony had already confirmed it. How much less credibility could you have?
And that's not even accounting for all the times he got proven wrong or contradicted himself and cried to mods on Era, even getting people banned who showed proof and quoted him.
 

Dodkrake

Banned
I mean, the guy couldn't spot ray tracing in a picture where everybody here noticed it.... all that after saying the PS5 had no ray tracing hardware.... after Sony had already confirmed it. How much less credibility could you have?
And that's not even accounting for all the times he got proven wrong or contradicted himself and cried to mods on Era, even getting people banned who showed proof and quoted him.
Not just that. In the trailer analysis (timestamped), he and John could not spot that the reflection was "garbled" due to the ground not being a smooth surface, so they call it a "lower resolution reflection". Don't get me wrong, it looks like it could be lower res, but there's also a noise map on top of the reflection

 

Dibils2k

Member
Not just that. In the trailer analysis (timestamped), he and John could not spot that the reflection was "garbled" due to the ground not being a smooth surface, so they call it a "lower resolution reflection". Don't get me wrong, it looks like it could be lower res, but there's also a noise map on top of the reflection


You really think they are gonna render reflections at full res/detail only to then distort them to look lower res/detail? :messenger_tears_of_joy: What a waste of resources that would be. It will be lower res/detail reflections because that's the smart thing to do, so why are you whining about DF pointing it out?

you people are genuinely mental
 
Last edited:

elliot5

Member
Not just that. In the trailer analysis (timestamped), he and John could not spot that the reflection was "garbled" due to the ground not being a smooth surface, so they call it a "lower resolution reflection". Don't get me wrong, it looks like it could be lower res, but there's also a noise map on top of the reflection


He literally says and the caption says "lower DETAIL reflection" not resolution. He commented how the reflected object was lower poly, which it is. It's likely the reflections are also lower res to save on performance, but the noise from the puddles helps alleviate that in terms of image quality.
 

SlimySnake

Flashless at the Golden Globes
Did anyone catch NX Gamer's RE8 comparisons? The PS4 and base X1 comparison stood out the most to me. X1 looks like the switch version. Muddy, blurry, ugly. It is running at 900p same as base PS4 but everything else is downgraded to oblivion. I know Cyberpunk on X1 looked blurry but thats a next gen game that was never designed to run on base consoles. X1 shouldve been able to run a last gen 60 fps title without this many downgrades.

Timestamped:


The framerate difference is roughly around 20-40% which makes sense considering the difference in tflops between the two base consoles. But there is clearly a need for far more downgrades which means their RAM setup is likely the bottleneck here. Regardless, Just the fact that this 40% difference in tflops is causing the game to look this bad should raise some eyebrows considering the fact that we have a next gen console where the difference between it and the series x is a stark 200%. What's going to happen with late gen games?

I strongly recommend everyone watch the comparison above because as bad as the screens look, it looks even worse in motion. Simply unacceptable and yet no one is talking about it. A good 80-90% of the 49 million Xbox one owners have the base X1 or the X1S and they are getting this insanely inferior experience. This should be big news.

jh755Qn.jpg
n6KwatZ.jpg
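For what it's worth, those percentages are easy to sanity-check against the commonly cited tflops figures (approximate: PS4 1.84, Xbox One 1.31, Series S 4.0, Series X 12.15):

```python
def pct_gap(stronger, weaker):
    """Percentage advantage of the stronger GPU over the weaker one."""
    return (stronger / weaker - 1.0) * 100.0

ps4_vs_x1 = pct_gap(1.84, 1.31)    # last-gen base consoles: ~40%
xsx_vs_xss = pct_gap(12.15, 4.0)   # Series X vs Series S: ~200%
```

So the last-gen gap that produced the RE8 downgrades is roughly a fifth of the gap between the two current Xbox consoles, which is the comparison being drawn above.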
 
Did anyone catch NX Gamer's RE8 comparisons? The PS4 and base X1 comparison stood out the most to me. X1 looks like the switch version. Muddy, blurry, ugly. It is running at 900p same as base PS4 but everything else is downgraded to oblivion. I know Cyberpunk on X1 looked blurry but thats a next gen game that was never designed to run on base consoles. X1 shouldve been able to run a last gen 60 fps title without this many downgrades.

Timestamped:


The framerate difference is roughly around 20-40% which makes sense considering the difference in tflops between the two base consoles. But there is clearly a need for far more downgrades which means their RAM setup is likely the bottleneck here. Regardless, Just the fact that this 40% difference in tflops is causing the game to look this bad should raise some eyebrows considering the fact that we have a next gen console where the difference between it and the series x is a stark 200%. What's going to happen with late gen games?

I strongly recommend everyone watch the comparison above because as bad as the screens look, it looks even worse in motion. Simply unacceptable and yet no one is talking about it. A good 80-90% of the 49 million Xbox one owners have the base X1 or the X1S and they are getting this insanely inferior experience. This should be big news.

jh755Qn.jpg
n6KwatZ.jpg

Why would it be news that the X1 was an underpowered system when it launched? It was underpowered in 2013, and it is even more so today. Since people here are quick to point out that the XSS is super easy to find, perhaps we could compare how the XSS version of the game stacks up against the X1 version, seeing how the XSS is its direct successor.
 
Why would it be news that the X1 was an underpowered system when it launched? It was underpowered in 2013, and it is even more so today. Since people here are quick to point out that the XSS is super easy to find, perhaps we could compare how the XSS version of the game stacks up against the X1 version, seeing how the XSS is its direct successor.

He was comparing how badly the base One fares compared to the PS4 (because yeah, he's right, the discrepancy is massive relative to the hardware differences)

And you still won't shut up about the Series S lmao
 

SlimySnake

Flashless at the Golden Globes
He was comparing how badly the base One fares compared to the PS4 (because yeah, he's right, the discrepancy is massive relative to the hardware differences)

And you still won't shut up about the Series S lmao
To be fair, I did frame it as a Series S discussion because this is a next gen tech thread after all, and I get warned for straying too far when discussing next gen graphics. Maybe a next gen graphics thread is needed where we can talk about this stuff without fear of getting thread banned? I was put on a perma thread ban warning after posting some speculation about the BF6 graphics so i am kinda baffled tbh. Are graphics not tech? How do we compare the tech in the hardware if we cant talk about the end result of these next gen tech advancements. Frankly, I am ok with a separate thread for graphics discussion if thats what the mods want.

Anyway, I think the RAM implementation is the bottleneck here. The ESRAM is finally showing its age. And I think the 7.5 GB of RAM available for games in the series s is going to become apparent as soon as mid gen games come out. It's able to hold its own with cross gen games but they are last gen games designed to work on a 1.3 tflops GPU which is what DarkMage seems to be missing. The moment the PS5's 10 tflops or the XSX's 12 tflops becomes the baseline, that console is going to start to struggle badly. By the end of the gen, it will be running games that look awful like RE8 does today on the X1s.
 

Riky

$MSFT
To be fair, I did frame it as a Series S discussion because this is a next gen tech thread after all, and I get warned for straying too far when discussing next gen graphics. Maybe a next gen graphics thread is needed where we can talk about this stuff without fear of getting thread banned? I was put on a perma thread ban warning after posting some speculation about the BF6 graphics so i am kinda baffled tbh. Are graphics not tech? How do we compare the tech in the hardware if we cant talk about the end result of these next gen tech advancements. Frankly, I am ok with a separate thread for graphics discussion if thats what the mods want.

Anyway, I think the RAM implementation is the bottleneck here. The ESRAM is finally showing its age. And I think the 7.5 GB of RAM available for games in the series s is going to become apparent as soon as mid gen games come out. It's able to hold its own with cross gen games but they are last gen games designed to work on a 1.3 tflops GPU which is what DarkMage seems to be missing. The moment the PS5's 10 tflops or the XSX's 12 tflops becomes the baseline, that console is going to start to struggle badly. By the end of the gen, it will be running games that look awful like RE8 does today on the X1s.

More Series S concern trolling.

The PS4 version runs fine on a 1.8 tflop machine which sort of disproves your own point.
Series S doesn't have an elaborate Esram setup and has a whole raft of RDNA2 performance saving features that will be in the GDK well before mid gen games come out, such as SFS, Mesh Shaders and VRS.
So it's not a comparable situation at all.
 

jroc74

Phone reception is more important to me than human rights
Did anyone catch NX Gamer's RE8 comparisons? The PS4 and base X1 comparison stood out the most to me. X1 looks like the switch version. Muddy, blurry, ugly. It is running at 900p same as base PS4 but everything else is downgraded to oblivion. I know Cyberpunk on X1 looked blurry but thats a next gen game that was never designed to run on base consoles. X1 shouldve been able to run a last gen 60 fps title without this many downgrades.

Timestamped:


The framerate difference is roughly around 20-40% which makes sense considering the difference in tflops between the two base consoles. But there is clearly a need for far more downgrades which means their RAM setup is likely the bottleneck here. Regardless, Just the fact that this 40% difference in tflops is causing the game to look this bad should raise some eyebrows considering the fact that we have a next gen console where the difference between it and the series x is a stark 200%. What's going to happen with late gen games?

I strongly recommend everyone watch the comparison above because as bad as the screens look, it looks even worse in motion. Simply unacceptable and yet no one is talking about it. A good 80-90% of the 49 million Xbox one owners have the base X1 or the X1S and they are getting this insanely inferior experience. This should be big news.

jh755Qn.jpg
n6KwatZ.jpg

Whatever the case is, a longer than normal cross gen period will not be good.
 

IntentionalPun

Ask me about my wife's perfect butthole
Anyway, I think the RAM implementation is the bottleneck here. The ESRAM is finally showing its age. And I think the 7.5 GB of RAM available for games in the series s is going to become apparent as soon as mid gen games come out. It's able to hold its own with cross gen games but they are last gen games designed to work on a 1.3 tflops GPU which is what DarkMage seems to be missing. The moment the PS5's 10 tflops or the XSX's 12 tflops becomes the baseline, that console is going to start to struggle badly. By the end of the gen, it will be running games that look awful like RE8 does today on the X1s.

The Remedy dev in that IGN interview sort of stated the opposite.. that once they can ditch cross-gen it'll be easier to dev for Series S because they'll be building engines around it vs porting things that aren't meant for a next-gen architecture.

(he still wasn't enthused about it existing, but I keep pointing out that their statements buck this narrative, and nobody seems interested in discussing that)
 
Did anyone catch NX Gamer's RE8 comparisons? The PS4 and base X1 comparison stood out the most to me. X1 looks like the switch version. Muddy, blurry, ugly. It is running at 900p same as base PS4 but everything else is downgraded to oblivion. I know Cyberpunk on X1 looked blurry but thats a next gen game that was never designed to run on base consoles. X1 shouldve been able to run a last gen 60 fps title without this many downgrades.

Timestamped:


The framerate difference is roughly around 20-40% which makes sense considering the difference in tflops between the two base consoles. But there is clearly a need for far more downgrades which means their RAM setup is likely the bottleneck here. Regardless, Just the fact that this 40% difference in tflops is causing the game to look this bad should raise some eyebrows considering the fact that we have a next gen console where the difference between it and the series x is a stark 200%. What's going to happen with late gen games?

I strongly recommend everyone watch the comparison above because as bad as the screens look, it looks even worse in motion. Simply unacceptable and yet no one is talking about it. A good 80-90% of the 49 million Xbox one owners have the base X1 or the X1S and they are getting this insanely inferior experience. This should be big news.

jh755Qn.jpg
n6KwatZ.jpg

Is it expected that we are getting Xbox One support for the entire generation? Because dammmmmmn, if we are, things are going to look a little muddy by years 4 and 5.
 
The Remedy dev in that IGN interview sort of stated the opposite.. that once they can ditch cross-gen it'll be easier to dev for Series S because they'll be building engines around it vs porting things that aren't meant for a next-gen architecture.

(he still wasn't enthused about it existing, but I keep pointing out that their statements buck this narrative, and nobody seems interested in discussing that)
I keep saying that things like SFS and VA aren't being utilized and people keep using cross gen titles to 'prove' the XSS has some sort of 'bottleneck'. If you want 4K gaming and more graphical bells and whistles the XSS isn't the best choice but if you are on a budget it offers fantastic value and capabilities for the price. You can't get more for less.
 

dcmk7

Banned
More Series S concern trolling.

The PS4 version runs fine on a 1.8 tflop machine which sort of disproves your own point.
Series S doesn't have an elaborate Esram setup and has a whole raft of RDNA2 performance saving features that will be in the GDK well before mid gen games come out, such as SFS, Mesh Shaders and VRS.
So it's not a comparable situation at all.
Why is it concern trolling? You're being a bit too sensitive.

The gist of his post is along the same lines as what some developers have already mentioned publicly.

And I highly doubt they were concern trolling.
 

Jacir

Member
I'm not going to jump into this SSD debate and Alex as it's a waste of time, however, I will address why Alex (and many other developers) think that Metro is the best looking game.

Metro is the first and only game to use a full RT lighting pipeline. Lighting/Shading is objectively the most important thing in rendering. All the other graphics features like textures and higher resolution geometry play an important role but it's not as important as getting lighting/shading the best it can be. Metro's art direction is based on an ugly, gritty, and dirty world where things don't look pretty and colorful like many of the games today that most would say looks "beautiful". I would urge people to look at the original vision from the director and judge it based on that target vision. I also thought Metro looked ugly when I first started playing it. Now though, with their complete rewrite of their engine, larger texture sizes and full PBR materials with the most advanced RT lighting pipeline to date, I can see how he would be more impressed with Metro as opposed to the color-centric games.
As a huge fan of Killzone 2, a game a lot of people didn't like because of its lack of color (and, understandably, its controls compared to Call of Duty), and a game that probably has less color than any of the Metro titles, I'm not really that impressed.
 

Jacir

Member
You really think they are gonna render reflections on full res/detail only to then distort it to look lower res/detail? :messenger_tears_of_joy: what a waste of resources that would be, it will be lower res/detail reflections cause thats the smart thing to do, why are you whining about DF pointing it out?

you people are genuinely mental
Theoretically, isn't this what VRS does, but for areas that don't need to be rendered at full resolution?
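For anyone unfamiliar, the idea behind VRS can be sketched in a few lines of Python. This is purely illustrative of the cost saving (run the shader once per coarse tile and reuse the result); it is not how the hardware pipeline actually works:

```python
import numpy as np

def coarse_shade(width, height, shade, rate=2):
    """Shade once per rate x rate tile and replicate the result,
    mimicking the cost saving of 2x2 variable rate shading."""
    img = np.zeros((height, width))
    for y in range(0, height, rate):
        for x in range(0, width, rate):
            color = shade(x, y)              # one shader invocation per tile
            img[y:y + rate, x:x + rate] = color
    return img

# A 4x4 "frame" shaded at a 2x2 rate needs only 4 shader calls instead of 16.
calls = []
frame = coarse_shade(4, 4, lambda x, y: calls.append((x, y)) or (x + y))
print(len(calls))  # 4
```

The trade-off is exactly the one described above: in low-detail regions (water distortion, motion blur, peripheral areas) the replicated coarse result is nearly indistinguishable from full-rate shading, so the saved invocations are almost free.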
 

SlimySnake

Flashless at the Golden Globes
MasterCornholio Posting this here since the Ratchet thread was closed before I could submit my reply.

I'm seeing some talk about how the I/O is bullshit.

Excluding the forums, was there any bullshit said about it by Cerny or developers?

I'm assuming that we can still use the Road to PS5 and official comments from developers on anything I/O related. I can understand that some people here might not interpret the facts correctly or spread misinformation about it.
The I/O isn't bullshit, but PCs can brute-force anything, even Ratchet. Both Cerny and MS talked about how the decompression engines in their I/O blocks were equivalent to something crazy like 12 Zen 2 cores decompressing data. Well, you can get a 24-core Zen 3 CPU right now. Hell, you can get a 48-core Threadripper.

Nvidia is also bringing decompression to its 3000-series GPUs, so it really doesn't matter that PCs don't have the I/O block, because their GPUs are already 2x more powerful than the PS5. Yes, they will take a hit, but there is more than enough power here to do the heavy lifting on the GPU.
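As a rough sanity check on those numbers, here is the back-of-envelope arithmetic, using the approximate figures from the Road to PS5 talk (5.5 GB/s raw reads, around 8 GB/s typical effective throughput with Kraken compression). Treat these as marketing-level estimates, not measurements:

```python
# Back-of-envelope I/O throughput using the figures quoted in the
# Road to PS5 talk (approximate, not measured).
raw_bandwidth_gbs = 5.5           # PS5 SSD raw read speed
typical_ratio = 8.0 / 5.5         # Kraken's "typical" effective rate

effective_gbs = raw_bandwidth_gbs * typical_ratio
print(f"Effective bandwidth: {effective_gbs:.1f} GB/s")

# Refilling the full 16 GB of GDDR6 at that rate:
seconds_to_fill_ram = 16 / effective_gbs
print(f"Time to refill all RAM: {seconds_to_fill_ram:.1f} s")
```

That roughly two-second full-RAM refill is the headline capability behind the Rift Apart portal transitions; whatever hardware does the decompression (custom block, spare CPU cores, or a GPU path), it has to sustain that effective rate to match.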


Then you have extra system RAM on every single PC. The 6800 XT has 16 GB of VRAM, but you could and should have at least 16 GB of system RAM on your PC; I'd imagine 32 GB for next gen. Devs can load half of the game into system RAM and move levels between system RAM and VRAM as they please without ever touching the SSD.

A good example of just how powerful PCs are is the 3080. It only has 10 GB of VRAM compared to the 13.5 GB available to games on the XSX and maybe the PS5. But that doesn't mean it will struggle to run games designed for the PS5, because a lot of the RAM on the PS5 is used for things other than rendering: game logic, AI, what have you. So that 10 GB won't really be a bottleneck, because almost no PS5 game would be using 13.5 GB for VRAM alone.

Here is Killzone Shadow Fall's VRAM usage. Only 3 of the 5 GB available to the game was used for graphics.

[Slide: Killzone Shadow Fall demo postmortem, memory breakdown]
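The arithmetic behind the "10 GB of VRAM is enough" argument, using the rough figures from the posts above (illustrative only; real games split memory very differently from title to title):

```python
# Rough arithmetic behind the "10 GB VRAM is enough" claim above.
# Figures come from the in-thread posts; treat them as approximations.

# Killzone Shadow Fall (PS4): ~3 of ~5 game-usable GB went to graphics.
kz_graphics_share = 3.0 / 5.0

# If a PS5 game split its ~13.5 GB of game-usable memory the same way:
ps5_game_ram = 13.5
vram_side = ps5_game_ram * kz_graphics_share      # graphics-only share
print(f"Estimated VRAM-equivalent usage: {vram_side:.1f} GB")
print(vram_side < 10.0)  # fits within an RTX 3080's 10 GB
```

The remaining ~5.4 GB (game logic, AI, audio, streaming buffers) maps onto a PC's separate system RAM, which is exactly why a unified 13.5 GB pool doesn't translate one-to-one into a VRAM requirement.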
 

This popped into my head while reading your post.


I can understand why they designed the PS5's I/O the way they did. A brute-force approach would have needed some pretty expensive parts, so to deliver extremely high-speed I/O while keeping costs down they went with 12 smaller flash chips and a ton of custom I/O hardware. Replicating that with off-the-shelf PC parts would be extremely expensive.

With that said, I'm wondering what it would cost to build a PC that matches the PS5's I/O performance?
 

SlimySnake

Flashless at the Golden Globes
With the PC parts shortage the way it is and the price hikes for AIB cards (I saw the $500 3070 going for $750 at Micro Center, the $700 3080 was $1,200, and the $1,500 3090 was $2,500), I really couldn't say.

I would wait until BF6 launches to compare, but even a 3060 Ti plus an 8-core/16-thread Zen 3 CPU and a fast SSD isn't going to come in under $1,500-2,000 even at MSRP.
 

I'm just glad I built mine back when we didn't have Covid or a chip shortage. Now I look at prices and it makes my heart sink.
 

intbal

Member
There wasn't really a good place for this and I don't think it deserves its own thread.

Neat video showing the differences between the RE8 trailers and the final game on PS5.



My take: they did a good job keeping what needed to be kept, cutting what was unnecessary, and improving what needed work. It looks a little better in some areas and a little worse in others, but the final result is a great-looking game (except on base XO). It definitely avoids the trailer-to-retail hall of shame that has plagued so many games.
 