
AMD Oberon PlayStation 5 SoC Die Delidded and Pictured

ethomaz

Banned
Have you noticed the tweet where he says there's no unified cache, as you said in the first pages of this thread? Or how he calls the ones who were shouting the RDNA1 bullshit clowns? Or how the PS5 is using the old render backend (unlike the one introduced in PC RDNA2, which is found in the XSX) and has no VRS-specific hardware? Or how he says there's no L3$ on the GPU whatsoever?

I hope you have, so you can stop going into hardware threads claiming it does.
Those were already known things... my first comment in this thread was a question.
And there is indeed something between the memory controller and the Fabric link, but I even said before that I thought it was unused space.

PS5 indeed uses the old ROPs from RDNA1 while Series X uses the new ones found in RDNA 2... there are advantages and disadvantages to the new approach AMD created... one of the disadvantages is the lack of hardware VRS... the advantage is that the old ROPs have more processing power for what they are designed to do.

Stop what? The article said it... not me... go read what I said.
 

ethomaz

Banned
If it has the same functionality at the cost of increased heat density, then it's not using fewer transistors. It's using area-optimized transistor libraries.


The initial assessment about the FPU being cut down to 128-bit was a pretty bad one to be honest, considering Cerny had named 256-bit floating-point ops as something that would cause a decrease in CPU clocks.
Yes, it was a pretty bad call... I believed something was cut from the FPU, like registers and other units not used in games, but that does not seem to be the case.
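For context on what "256-bit floating-point ops" means here, a minimal sketch contrasting a 128-bit (SSE) and a 256-bit (AVX) fused multiply-add. The wider op does twice the work per instruction, which is the kind of sustained load Cerny cited as pulling CPU clocks down (this assumes a compiler with AVX/FMA enabled, e.g. -mfma).

```cpp
#include <immintrin.h>

// 128-bit FMA: 4 floats per instruction (the "cut down" case being discussed).
__m128 fma128(__m128 a, __m128 b, __m128 c) { return _mm_fmadd_ps(a, b, c); }

// 256-bit FMA: 8 floats per instruction, so twice the ALU work (and power) per op.
__m256 fma256(__m256 a, __m256 b, __m256 c) { return _mm256_fmadd_ps(a, b, c); }
```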
 
Those were already known things... my first comment in this thread was a question.
And there is indeed something between the memory controller and the Fabric link, but I even said before that I thought it was unused space.

PS5 indeed uses the old ROPs from RDNA1 while Series X uses the new ones found in RDNA 2... there are advantages and disadvantages to the new approach AMD created... one of the disadvantages is the lack of hardware VRS... the advantage is that the old ROPs have more processing power for what they are designed to do.

Stop what? The article said it... not me... go read what I said.
PS5: 64 color ROPs, 256 depth ROPs (16x8x2)
Series X: 64 color ROPs, 128 depth ROPs (16x4x2)

Series X has the same frontend as RDNA 1:
Shader engine: 2 primitive units, 2 rasterizers, 16 pixels written out

RDNA 2 frontend:
Shader engine: 1 primitive unit, 1 rasterizer, 32 pixels written out

RDNA 1 and Series X compute unit frontend:
Wave unit: 20 waves, 32/64 controller, 640/1280 threads

RDNA 2 compute unit frontend:
Wave unit: 16 waves, 32/64 controller, 512/1024 threads
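To put those ROP counts into raw numbers, a quick back-of-the-envelope sketch: it just multiplies the quoted ROP counts by the publicly stated GPU clocks (PS5 up to 2.23 GHz, Series X at a fixed 1.825 GHz), assumes the counts above are accurate, and ignores bandwidth and cache limits entirely.

```cpp
#include <cstdio>

int main() {
    struct Gpu { const char* name; int colorRops; int depthRops; double clockGHz; };
    const Gpu gpus[] = {
        {"PS5",      64, 256, 2.23},
        {"Series X", 64, 128, 1.825},
    };
    for (const Gpu& g : gpus) {
        // Peak color pixels and depth/stencil samples written per second.
        double colorFill = g.colorRops * g.clockGHz;  // Gpixels/s
        double depthFill = g.depthRops * g.clockGHz;  // Gsamples/s
        printf("%-8s  color fill: %6.1f Gpix/s   depth fill: %6.1f Gsamp/s\n",
               g.name, colorFill, depthFill);
    }
    return 0;
}
```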
 

rnlval

Member
That would certainly widen the bandwidth gap between the two, but then again even when doing the PR rounds I think they knew they had over engineered the ROPS fillrate (maybe there were some edge cases where they could flex it, but rare ones). PS4 Pro had some post Polaris enhancements, but I do not recall a wide render cache.
PS4 Pro has Vega-style RPM (Rapid Packed Math), with more memory-bandwidth-bound behavior.

Baseline Polaris 10 IP has a 2 MB L2 cache connected to the geometry and compute/texture paths, hence AMD's async compute marketing campaign.
 
Last edited:

rnlval

Member
PS5: 64 color ROPs, 256 depth ROPs (16x8x2)
Series X: 64 color ROPs, 128 depth ROPs (16x4x2)

Series X has the same frontend as RDNA 1:
Shader engine: 2 primitive units, 2 rasterizers, 16 pixels written out

RDNA 2 frontend:
Shader engine: 1 primitive unit, 1 rasterizer, 32 pixels written out

RDNA 1 and Series X compute unit frontend:
Wave unit: 20 waves, 32/64 controller, 640/1280 threads

RDNA 2 compute unit frontend:
Wave unit: 16 waves, 32/64 controller, 512/1024 threads
AMD has had a four Z-ROPs (depth ROPs) to one color ROP ratio since RV770.


[image: rop.jpg]



Prove that Xbox Series X has gone backward to a two Z-ROPs (depth ROPs) to one color ROP ratio.


Xbox Series X's render back end is compliant with PC RDNA 2's Variable Rate Shading feature, which is missing from RDNA v1. Variable Rate Shading uses geometry edges with variable pixel shading resolution (via the ROPs being used as the read/write units).
 
Last edited:
AMD has had a four Z-ROPs (depth ROPs) to one color ROP ratio since RV770.


[image: rop.jpg]



Prove that Xbox Series X has gone backward to a two Z-ROPs (depth ROPs) to one color ROP ratio.


Xbox Series X's render back end is compliant with PC RDNA 2's Variable Rate Shading feature, which is missing from RDNA v1. Variable Rate Shading uses geometry edges with variable pixel shading resolution (via the ROPs being used as the read/write units).
I read some guys saying the cut-down RDNA2 ROPs make sense in RDNA2 because, with Infinity Cache's high bandwidth, it wouldn't need them. The problem is, XSX hasn't got any Infinity Cache.

There was an article from an Activision developer showing that software VRS was better than hardware VRS in every aspect: faster, with better results (basically because it was much more customizable). And you know what? After having seen a few games with RDNA2 VRS, I agree. RDNA2 VRS is a failure IMO. The end result is very blurry, it's noticeable when it shouldn't be, and I predict most multiplat developers won't use it in the future because it's too blurry for not much gain.
 
Last edited:

Riky

$MSFT
I read some guys saying the cut-down RDNA2 ROPs make sense in RDNA2 because, with Infinity Cache's high bandwidth, it wouldn't need them. The problem is, XSX hasn't got any Infinity Cache.

There was an article from an Activision developer showing that software VRS was better than hardware VRS in every aspect: faster, with better results (basically because it was much more customizable). And you know what? After having seen a few games with RDNA2 VRS, I agree. RDNA2 VRS is a failure IMO. The end result is very blurry, it's noticeable when it shouldn't be, and I predict most multiplat developers won't use it in the future because it's too blurry for not much gain.

Tier 2 VRS didn't exist then; it's far superior to the software version, as you can quite clearly see in Gears 5 compared to Metro Exodus. Tier 2 has finer-grained control, and analysis of Gears 5 and Doom Eternal shows it's imperceptible during gameplay.
The devs of Doom Eternal said they wish all formats had hardware support for it, as it's superior to having a more wildly fluctuating DRS scale; they should know.
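For anyone wondering what "Tier 2" actually adds on the API side: on PC, Tier 1 only gives you a single per-draw shading rate, while Tier 2 adds per-primitive rates and a screen-space shading-rate image. Below is a minimal sketch of the DirectX 12 calls involved; it assumes a `device`, an `ID3D12GraphicsCommandList5` and an already-built shading-rate texture, and the console APIs obviously differ.

```cpp
#include <windows.h>
#include <d3d12.h>

bool HasTier2Vrs(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts6, sizeof(opts6))))
        return false;
    return opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
}

void BindShadingRate(ID3D12GraphicsCommandList5* cmdList5,
                     ID3D12Resource* rateImage) {
    // Base rate 1x1; let the per-primitive rate and the rate image coarsen it.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_MAX,   // combine base rate with per-primitive rate
        D3D12_SHADING_RATE_COMBINER_MAX    // combine the result with the rate image
    };
    cmdList5->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmdList5->RSSetShadingRateImage(rateImage);  // screen-space rates, Tier 2 only
}
```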
 

martino

Member
I read some guys saying the cut-down RDNA2 ROPs make sense in RDNA2 because, with Infinity Cache's high bandwidth, it wouldn't need them. The problem is, XSX hasn't got any Infinity Cache.

There was an article from an Activision developer showing that software VRS was better than hardware VRS in every aspect: faster, with better results (basically because it was much more customizable). And you know what? After having seen a few games with RDNA2 VRS, I agree. RDNA2 VRS is a failure IMO. The end result is very blurry, it's noticeable when it shouldn't be, and I predict most multiplat developers won't use it in the future because it's too blurry for not much gain.
It was versus Tier 1 capabilities.
I'm prepared for out-of-context nitpicking from the usual suspects focusing a lot on loss of detail in the foreground and corners of games when this happens... forgetting that this also brings more detail where the player's attention actually is.
Also, how can this be a bad solution if the alternative is global IQ degradation of the whole frame?
 

assurdum

Banned
If someone were to question Cerny's words and imply that he is possibly lying, given that Sony, for whatever reason, have never done a real hardware deep dive and have never released real technical specifications... or clarified what's inside this blessed Geometry Engine and how it ranks with respect to the newest mesh shaders... or about the VRS, etc. etc. ... well, if anyone allowed themselves to say that Cerny lied, they would probably get an instant ban.
It's incredible how you push yourself to talk about something you barely know, every single fucking time. It's all about praising MS's PR marketing spin and dumping a mountain of shit on Cerny, because it's clear from reading your posts that you don't understand shit.
He practically built hardware with performance virtually similar to the Series X while saving 100 bucks, look at that stupid idiot. MS practically waited for AMD to finish refining Navi at the last minute, then proclaimed themselves magical engineers because look at these higher specs. It's quite easy to put out higher specs by waiting until the last second when the other company had already finalized its hardware almost a year earlier.
 
Last edited:

Riky

$MSFT
If someone were to question Cerny's words and imply that he is possibly lying, given that Sony, for whatever reason, have never done a real hardware deep dive and have never released real technical specifications... or clarified what's inside this blessed Geometry Engine and how it ranks with respect to the newest mesh shaders... or about the VRS, etc. etc. ... well, if anyone allowed themselves to say that Cerny lied, they would probably get an instant ban.

I don't think Cerny lied; it's just fair to now assume that what isn't stated as fact in the Road to PS5 presentation is not there. When asked, they offer no comment about the "Full RDNA 2" press release.
DF have recently confirmed the lack of hardware support for Tier 2 VRS, they said recently that the Geometry Engine is "cut down" mesh shaders (primitive shaders?), and they say in their latest video that PS5 lacks the machine learning hardware for AI upscaling that MS are working on bringing; that has been confirmed before.
 
It was versus Tier 1 capabilities.
I'm prepared for out-of-context nitpicking from the usual suspects focusing a lot on loss of detail in the foreground and corners of games when this happens... forgetting that this also brings more detail where the player's attention actually is.
Also, how can this be a bad solution if the alternative is global IQ degradation of the whole frame?
No, I am talking about Tier 2.

Why do you think Digital Foundry don't do identical RDNA2 VRS ON/OFF comparisons and benchmarks using PC games? Because once you apply performance mode (the one with the interesting performance gains of ~10% higher fps), the textures look like shit and are very noticeably worse. A bit like PS1 textures, according to one gaffer who did the test in one game with Tier 2 VRS.
 

Riky

$MSFT
There were plenty of comparisons of Doom Eternal where they showed the difference with Tier 2 VRS between platforms; you needed 400% zoom, and they all said it was imperceptible during gameplay.
I'd like to see some screens of those "PS1" textures.
 

martino

Member
No, I am talking about Tier 2.

Why do you think Digital Foundry don't do identical RDNA2 VRS ON/OFF comparisons and benchmarks using PC games? Because once you apply performance mode (the one with the interesting performance gains of ~10% higher fps), the textures look like shit and are very noticeably worse. A bit like PS1 textures, according to one gaffer who did the test in one game with Tier 2 VRS.
You know, you're now at the point of simply posting lies and layering conspiracy FUD and hyperbole (PS1 textures) on top of them...
At least shame is not what's stopping you.
 

Loope

Member
It's incredible how you push yourself to talk about something you barely know, every single fucking time. It's all about praising MS's PR marketing spin and dumping a mountain of shit on Cerny, because it's clear from reading your posts that you don't understand shit.
He practically built hardware with performance virtually similar to the Series X while saving 100 bucks, look at that stupid idiot. MS practically waited for AMD to finish refining Navi at the last minute, then proclaimed themselves magical engineers because look at these higher specs. It's quite easy to put out higher specs by waiting until the last second when the other company had already finalized its hardware almost a year earlier.
WTF, you basically criticized him for shitting on Cerny, and here you are shitting on MS engineers for no reason at all, implying all they had to do was wait for AMD and throw shit in there. Are you for real now?

Btw, did you send your CV to MS yet? After all, you know better than them; I mean, when you shit on someone's work like that, you surely know a lot more than they do.

What the hell is wrong with you people (both sides)? These guys (Cerny, MS engineers, Nintendo engineers) are where they are because they're fucking good at what they do; seeing forum warriors going ham on them and acting like they don't know how to do anything other than throw off-the-shelf parts in there is ridiculous.
 

Arioco

Member
I don't think Cerny lied; it's just fair to now assume that what isn't stated as fact in the Road to PS5 presentation is not there. When asked, they offer no comment about the "Full RDNA 2" press release.
DF have recently confirmed the lack of hardware support for Tier 2 VRS, they said recently that the Geometry Engine is "cut down" mesh shaders (primitive shaders?), and they say in their latest video that PS5 lacks the machine learning hardware for AI upscaling that MS are working on bringing; that has been confirmed before.


Could you please provide a timestamped link to these statements? We have confirmation that PS5 does not support hardware VRS, and it seems to be something related to the ROPs, but as far as I know we know nothing about what the GE is or whether PS5 supports int4 or int8 for ML. I watched DF's latest video but I don't recall the guys mentioning anything like that.

Thank you.
 

Riky

$MSFT
Could you please provide a timestamped link to these statements? We have confirmation that PS5 does not support hardware VRS, and it seems to be something related to the ROPs, but as far as I know we know nothing about what the GE is or whether PS5 supports int4 or int8 for ML. I watched DF's latest video but I don't recall the guys mentioning anything like that.

Thank you.

I've quoted them both. In the last video it's from about 1h19m onwards, where Alex says it about AI upscaling. You can search my posts for the other one; it was in answer to a reader's question.
 

Arioco

Member
It's not just him though; Sony omitted those features in Road to PS5, AMD themselves confirmed it when they revealed RDNA 2, Jason Ronald stated it several times, and now DF are saying it.


Oh, but I'm not saying Alex is wrong. In fact, I don't count on those features being in PS5 unless Sony themselves confirm it, which they haven't done. But that doesn't make Alex a reliable source, because, I insist, he was sure PS5 lacked hardware-accelerated RT AFTER Cerny confirmed it to Wired. 🤷‍♂️ Not sure where he got his information from or whether it was pure speculation, but he stated his view several times, both in DF videos and on ResetEra.
 

azertydu91

Hard to Kill
Oh, but I'm not saying Alex is wrong. In fact, I don't count on those features being in PS5 unless Sony themselves confirm it, which they haven't done. But that doesn't make Alex a reliable source, because, I insist, he was sure PS5 lacked hardware-accelerated RT AFTER Cerny confirmed it to Wired. 🤷‍♂️ Not sure where he got his information from or whether it was pure speculation, but he stated his view several times, both in DF videos and on ResetEra.
Not only after the Wired article, but even after the Road to PS5 presentation where Cerny specifically mentioned hardware RT; it was in the video of their reaction to the spec reveal... So much for an impartial analysis.
 

Lysandros

Member
Could you please provide a timestamped link to these statements? We have confirmation that PS5 does not support hardware VRS, and it seems to be something related to the ROPs, but as far as I know we know nothing about what the GE is or whether PS5 supports int4 or int8 for ML. I watched DF's latest video but I don't recall the guys mentioning anything like that.

Thank you.
From 29:00...
 

Arioco

Member
From 29:00...



Thanks. Yes, I've seen The Road to PS5, but what I meant is that we don't know how the GE differs from mesh shaders. What Cerny is describing sounds pretty much the same, but maybe there are some differences; I honestly don't know. According to Matt Hargett (who should know, since unlike Alex he's actually worked on PS5), the GE is great for saving cycles, but we still don't have any details.

I wish SONY would be more open about the tech inside PS5, to be honest.
 

Lysandros

Member
Thanks. Yes, I've seen The Road to PS5, but what I meant is that we don't know how the GE differs from mesh shaders. What Cerny is describing sounds pretty much the same, but maybe there are some differences; I honestly don't know. According to Matt Hargett (who should know, since unlike Alex he's actually worked on PS5), the GE is great for saving cycles, but we still don't have any details.

I wish SONY would be more open about the tech inside PS5, to be honest.
I see. I think the key phrase is 'full programmatic control'. By the way, 'even if' nothing was changed from RDNA/2's unified GE at the hardware level besides the 'full programmability' part, it should still be more capable than the XSX counterpart because of the simple fact of running at ~20% higher frequency.
 
Last edited:

martino

Member
I see. I think the key phrase is 'full programmatic control'. By the way, 'even if' nothing was changed from RDNA/2's unified GE at the hardware level besides the 'full programmability' part, it should still be more capable than the XSX counterpart because of the simple fact of running at ~20% higher frequency.
Nvidia's experiments and conclusions with programmatic control on their own hardware don't support that.
All data loads are handled via shader instructions instead of the classic fixed function primitive fetch and therefore scales better with more Streaming Multiprocessors. It also allows easier use of custom vertex encodings to further reduce bandwidth.
https://developer.nvidia.com/blog/introduction-turing-mesh-shaders/
The XSS and XSX design choices seem to take this kind of feedback into account by choosing the Nvidia-style implementation for their API.

Also, we have data on AMD's default NGG culling gains using Vulkan with the 6xxx series:
like most of these features, when not isolated it can't do that much in real-world scenarios... that's why adding gains everywhere can't be a bad design decision.
 
Last edited:
Wtf, saying with a straight face that it's a cut-down version of mesh shaders?? What's this guy smoking, bruh. Where's his source?
Both primitive shaders and mesh shaders are compiled by a driver under the API; the newest drivers, using AMD's NGG mode, convert mesh shaders into primitive shaders on their 6000 series GPUs.

The PS5 can easily use mesh shaders through Sony's or AMD's APIs; they are, as already stated, driver-compiled and only limited by the devs' desire to use them.
 
I see. I think the key phrase is 'full programmatic control'. By the way, 'even if' nothing was changed from RDNA/2's unified GE at the hardware level besides the 'full programmability' part, it should still be more capable than the XSX counterpart because of the simple fact of running at ~20% higher frequency.
Mesh shaders and primitive shaders are not the same thing, but they are similar in what they do. The only thing you need to use PS and MS is an API that supports them; they are compiled by a driver under the API (fully programmable), and the PS5 can use PS or MS through Sony's or AMD's API.

AMD's 6000 series GPUs, using NGG mode, convert mesh shader, vertex shader and geometry shader code into primitive shaders.
 
Both primitive shaders and mesh shaders are compiled by a driver under the API; the newest drivers, using AMD's NGG mode, convert mesh shaders into primitive shaders on their 6000 series GPUs.

The PS5 can easily use mesh shaders through Sony's or AMD's APIs; they are, as already stated, driver-compiled and only limited by the devs' desire to use them.

 

rnlval

Member
I read some guys saying the cut-down RDNA2 ROPs make sense in RDNA2 because, with Infinity Cache's high bandwidth, it wouldn't need them. The problem is, XSX hasn't got any Infinity Cache.

There was an article from an Activision developer showing that software VRS was better than hardware VRS in every aspect: faster, with better results (basically because it was much more customizable). And you know what? After having seen a few games with RDNA2 VRS, I agree. RDNA2 VRS is a failure IMO. The end result is very blurry, it's noticeable when it shouldn't be, and I predict most multiplat developers won't use it in the future because it's too blurry for not much gain.
The 128 MB Infinity Cache is a workaround for the mid-range 256-bit GDDR6-16000 (512 GB/s) design.

Remember, XBO's 32 MB ESRAM supports a ~1600x900 frame buffer without Delta Color Compression. A 128 MB Infinity Cache with Delta Color Compression can support a 4K frame buffer.
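The arithmetic behind that ESRAM/Infinity Cache comparison is straightforward; a quick sketch, assuming an uncompressed 32-bit color target plus a 32-bit depth target (real games use more targets and compression):

```cpp
#include <cstdio>

// Size of one render target in MB for a given resolution and bytes per pixel.
static double TargetMB(int w, int h, int bytesPerPixel) {
    return double(w) * h * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    // 1600x900: color + depth is about 11 MB, leaving room in XBO's 32 MB ESRAM
    // for additional G-buffer targets.
    printf("1600x900 color+depth: %.1f MB\n",
           TargetMB(1600, 900, 4) + TargetMB(1600, 900, 4));
    // 3840x2160: the color target alone is ~31.6 MB, so a 4K setup overflows
    // 32 MB but fits comfortably inside a 128 MB cache.
    printf("3840x2160 color+depth: %.1f MB\n",
           TargetMB(3840, 2160, 4) + TargetMB(3840, 2160, 4));
    return 0;
}
```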
 
Last edited:

rnlval

Member
Mesh shaders and primitive shaders are not the same thing, but they are similar in what they do. The only thing you need to use PS and MS is an API that supports them; they are compiled by a driver under the API (fully programmable), and the PS5 can use PS or MS through Sony's or AMD's API.

AMD's 6000 series GPUs, using NGG mode, convert mesh shader, vertex shader and geometry shader code into primitive shaders.
Nope. RX Vega and RX 5700 XT have "primitive shader" NGGP, and it doesn't support NVIDIA-style amplification shader and mesh shader NGGP.

Mesh shader terminology exists for Microsoft's DirectX 12 and Khronos Group's Vulkan APIs. PC and XSX/XSS RDNA 2 was modified to support NVIDIA-style NGGP.
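For reference, this is roughly what "mesh shader support" looks like from the PC DirectX 12 side; a minimal sketch only, which says nothing about how a driver maps it onto primitive-shader/NGG hardware internally. `device` and `cmdList6` are assumed to exist, and pipeline-state setup is omitted.

```cpp
#include <windows.h>
#include <d3d12.h>

bool HasMeshShaders(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &opts7, sizeof(opts7))))
        return false;
    return opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

void DrawMeshlets(ID3D12GraphicsCommandList6* cmdList6, UINT meshletGroups) {
    // One thread group per meshlet; the mesh shader in the bound PSO decides
    // how many vertices/primitives each group emits.
    cmdList6->DispatchMesh(meshletGroups, 1, 1);
}
```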




Before RDNA 2, AMD offered their NGGP future path to Microsoft and it was rejected.

Facts: NVIDIA has about 80% of the discrete GPU market and NVIDIA's revenue is larger than AMD's. AMD is crushed by NVIDIA!
 
Last edited:

azertydu91

Hard to Kill
You claimed "I never mentioned anything about RT" while your screenshot example showed an RT claim. :messenger_unamused:
That is probably one of the worst defenses I have ever seen.
48/52.....
There are no pictures attached, since you seem easily distracted; that may be a way for you not to talk about ambient occlusion or anti-aliasing or any other kind of technology that is present on screen.
Or maybe you wanted him to blur the parts where RT is obvious? I mean, I could talk to him about redhead girls on PS5 since his picture included one, and it would completely miss the point, just like you.
 

rnlval

Member
That is probably one of the worst defenses I have ever seen.
48/52.....
There are no pictures attached, since you seem easily distracted; that may be a way for you not to talk about ambient occlusion or anti-aliasing or any other kind of technology that is present on screen.
Or maybe you wanted him to blur the parts where RT is obvious? I mean, I could talk to him about redhead girls on PS5 since his picture included one, and it would completely miss the point, just like you.

Md Ray posted the following screenshot;

Here's one example where PS5's higher clock speed brings it on par with Series X in a scene where visuals and resolution are identical between both. If higher CU count and BW were everything, then this shouldn't be happening.
[screenshot: JOqj3Wi.png]


There are plenty of examples like this; just go through VGTech's channel and check out the last couple of vids. And re-read these excellent posts again to better understand what we mean:

------

Md Ray: I never mentioned anything about RT. Whether the scene has RT or not, PS5's higher clock speed can close the gap in some scenarios; that was my point.

-----------


I wasn't distracted.
 

rnlval

Member
Nvidia's experiments and conclusions with programmatic control on their own hardware don't support that.

The XSS and XSX design choices seem to take this kind of feedback into account by choosing the Nvidia-style implementation for their API.

Also, we have data on AMD's default NGG culling gains using Vulkan with the 6xxx series:
like most of these features, when not isolated it can't do that much in real-world scenarios... that's why adding gains everywhere can't be a bad design decision.
The Radeon Vulkan NGG Culling extension for NAVI 1.x is not enough for NVIDIA's Amplification Shader and Mesh Shader NGGP which is used for DirectX12U's NGGP compliance.

Prove NAVI 1.x can support DirectX12U's NGGP compliance. Good luck with DXVK when bridging DirectX12U's NGGP for NAVI 1.x.
 
Last edited:

azertydu91

Hard to Kill
Md Ray posted the following screenshot;

Here's one example where PS5's higher clock speed brings it on par with Series X in a scene where visuals and resolution are identical between both. If higher CU count and BW were everything, then this shouldn't be happening.
[screenshot: JOqj3Wi.png]


There are plenty of examples like this; just go through VGTech's channel and check out the last couple of vids. And re-read these excellent posts again to better understand what we mean:

------

Md Ray: I never mentioned anything about RT. Whether the scene has RT or not, PS5's higher clock speed can close the gap in some scenarios; that was my point.

-----------


I wasn't distracted.
There is AA in those pics, so I am going to talk about that. It would have nothing to do with what you were saying, right?
That's exactly how your post felt. Besides, his point was that a 1 fps difference is nothing to write home about when there is a 2 TF difference between the two consoles. Find two graphics cards of the same architecture with a 2 TF difference and I can guarantee you there will be a bigger delta between them. Not counting, of course, all the games where PS5 was the better-performing console.
Just because you posted a picture doesn't mean your point includes every bit of technology present on screen, so RT was not brought up; even more so considering that RT seems to be what devs are trying to incorporate into their games nowadays, so I feel it will be harder and harder to find games without RT (in the AAA space). That's an extrapolation on your part about what he meant, but clearly, considering his answer, it was not at all what he meant.


Edit: wait, I think you guys agree on that part but are just arguing semantics. It happens so much in the IT field it's incredible: two people arguing who in fact agree with each other but don't use the same terms. I remember when I was an IT student, two friends argued about how to plug in a switch for a good half an hour, until one plugged it in the way he wanted and the other told him "Great, so now you agree", and they both realised they had wanted to do it the same way all along... Of course the rest of us laughed about it.
 
Last edited:

Hobbygaming

has been asked to post in 'Grounded' mode.
That would be impossible.

But why did MS give the option to use the Series X CPU at a higher clock speed (3.8 GHz) with fewer threads if having higher clocks with fewer cores doesn't have any real advantages?

You probably don't remember, but when Xbox One launched, MS engineers explained to Digital Foundry that higher clocks provided better performance than more CUs (which they know because Xbox One devkits had 14 CUs instead of the 12 present in the retail console, and according to them 12 CUs clocked higher performed better than the full-fat version of the APU with 14 CUs). The difference in clock speed was not enough to overcome PS4's advantage in raw TFLOPS, but One S did beat PS4 on some occasions.

https://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects




So according to MS engineers, a 6.6% clock upgrade provided more performance than 16.6% more CUs. 🤷‍♂️
Another thing to keep in mind is that it's more difficult to program for a higher number of CUs because you have to keep every one of them fed with useful work
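The paper-spec arithmetic behind that Xbox One anecdote is worth spelling out: GCN FP32 throughput is CUs x 64 lanes x 2 ops (FMA) x clock, so the 14-CU devkit configuration actually has more theoretical FLOPS. A quick sketch, using the publicly reported 800 MHz pre-bump clock and 853 MHz retail clock:

```cpp
#include <cstdio>

// Theoretical GCN FP32 throughput: CUs x 64 lanes x 2 ops per clock x clock.
static double Tflops(int cus, double clockGHz) {
    return cus * 64 * 2 * clockGHz / 1000.0;
}

int main() {
    printf("12 CUs @ 853 MHz: %.2f TFLOPS\n", Tflops(12, 0.853));  // ~1.31
    printf("14 CUs @ 800 MHz: %.2f TFLOPS\n", Tflops(14, 0.800));  // ~1.43
    // The 14-CU config wins on paper; the engineers' point was that the clock
    // bump also speeds up the front end, ROPs and caches, not just the ALUs.
    return 0;
}
```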
 

Rea

Member
I bet Sony took what they needed. PS5 has VRS, just not VRS 2.0 but they most likely have solutions for everything they need for a good leap forward
VRS is one of the features of DX12U.
[image: 0ML55se.jpg]




If you're interested, there is a video link. VRS is also not very useful when the geometry has very small, pixel-sized polygons.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
VRS is one of the features of DX12U.
[image: 0ML55se.jpg]




If you're interested, there is a video link. VRS is also not very useful when the geometry has very small, pixel-sized polygons.

VRS might be a part of the DX12 Ultimate feature set but it's also on the PS5. The GE basically requires VRS, otherwise you're throwing performance away for nothing
 

assurdum

Banned
WTF, you basically criticized him for shitting on Cerny, and here you are shitting on MS engineers for no reason at all, implying all they had to do was wait for AMD and throw shit in there. Are you for real now?

Btw, did you send your CV to MS yet? After all, you know better than them; I mean, when you shit on someone's work like that, you surely know a lot more than they do.

What the hell is wrong with you people (both sides)? These guys (Cerny, MS engineers, Nintendo engineers) are where they are because they're fucking good at what they do; seeing forum warriors going ham on them and acting like they don't know how to do anything other than throw off-the-shelf parts in there is ridiculous.
I'm shitting on MS engineers because I said they waited until the last minute to get higher specs? Wut. I just said it's quite easy to achieve higher specs than your counterpart that way. If Sony had done the same, higher specs on the PS5 would have been guaranteed.
 
Last edited:

Md Ray

Member
You claimed "I never mentioned anything about RT" while your screenshot example showed an RT claim. :messenger_unamused:
Yep, my point was that PS5's higher GPU clocks can bring it on par with SX in some scenarios. Another example would be that infamous corridor-of-doom section in Control, where SX is pumping out 33 fps whereas the PS5 is pushing 32 fps. A 3% difference, not 18% or 25%.
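The percentages in that Control example work out like this; a quick sketch using the fps numbers from the post and the publicly stated 12.15 TF vs 10.28 TF figures:

```cpp
#include <cstdio>

// Percentage by which a exceeds b.
static double PctGap(double a, double b) { return (a / b - 1.0) * 100.0; }

int main() {
    printf("Measured fps gap:  %.1f%%\n", PctGap(33.0, 32.0));    // ~3.1%
    printf("Paper TFLOPS gap:  %.1f%%\n", PctGap(12.15, 10.28));  // ~18.2%
    return 0;
}
```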
 
Last edited:

Soosa

Banned
You claimed "I never mentioned anything about RT" while your screenshot example showed an RT claim. :messenger_unamused:
Why are you talking about RT when his example didn't mention RT at all? It just looks like you missed the point below completely.

Hypothesis: Series X has ~20% more raw power than PS5, so Series X should perform ~20% better in an identical scenario.

Test scenario(s): same game, same resolution, same graphical fidelity level.

Results: Series X and PS5 perform almost identically in some of these scenarios.


That is the point he was making, nothing about RT. It is just weird that you even bring it up when it was not mentioned.


The interesting thing is: when one system has more theoretical performance but they perform similarly in the same kinds of situations, why is that?

Is it because PS5's faster clocks allow higher utilisation of its theoretical performance maximum, or something else?

And if it is that, then does Series X have more future potential for devs to get more out of the system, assuming PS5 as a system is easier to nearly max out?
 

martino

Member
RDNA 2 features have practically not even been used yet... we will see a big change as soon as they switch from primitive shaders to mesh shaders and those take hold
Like everything, not in absolute terms.

People missed the point of my NGG example... it was to show that new things will not always bring much to the table in all scenarios...
 
Last edited: