
[NX Gamer] AC: Valhalla - XsX vs PS5 - The first REAL head to head battle.

geordiemp

Member
At least that's how I think it would work in these types of situations. It makes sense to me but no idea if SmartShift is being used in that moment.

SmartShift moves power around roughly eight times a frame (assuming the same timing as on laptops).

RDNA2 also has pervasive fine-grained clock gating across the GPU, so think of it as power being shifted around the whole GPU die eight or more times a frame, and that is likely what is happening here - it's in AMD's PC RDNA2 presentation.

SmartShift is just AMD's name for the case where the CPU hands power to the GPU. RDNA2 PC GPU parts also do this power shifting at a CU level.
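
Purely as an illustration of that idea (a toy sketch, not AMD's actual firmware, and every number and name below is made up), a shared SoC budget being re-split between CPU and GPU several times within a single frame could look like this:

# Toy sketch of SmartShift-style balancing: a fixed SoC power budget is
# re-split between CPU and GPU several times per frame depending on load.
# Every figure here is invented for illustration.

TOTAL_BUDGET_W = 200.0        # hypothetical shared SoC power budget
SLICES_PER_FRAME = 8          # the "~8 times a frame" laptop timing mentioned above

def split_budget(cpu_load, gpu_load):
    """Give each block a share of the budget proportional to its load,
    with a small floor so neither side is ever fully starved."""
    floor = 0.15 * TOTAL_BUDGET_W
    spare = TOTAL_BUDGET_W - 2 * floor
    total_load = max(cpu_load + gpu_load, 1e-6)
    cpu_w = floor + spare * (cpu_load / total_load)
    gpu_w = floor + spare * (gpu_load / total_load)
    return cpu_w, gpu_w

# One 60 fps frame: CPU-heavy early (sim, draw-call submission), GPU-heavy later.
for slice_idx in range(SLICES_PER_FRAME):
    cpu_load = 0.9 if slice_idx < 3 else 0.3   # made-up activity estimates
    gpu_load = 0.4 if slice_idx < 3 else 1.0
    cpu_w, gpu_w = split_budget(cpu_load, gpu_load)
    print(f"slice {slice_idx}: CPU {cpu_w:5.1f} W, GPU {gpu_w:5.1f} W")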
 
Last edited:

devilNprada

Member
I did go pick up a PS5 at Best Buy yesterday and there was a huge Xbox cardboard cutout that did in fact say "World's Most Powerful Console".

Now I have a question about cables... the PS5 is telling me it is outputting 1080p because of my HDMI device.
I'm not using their short-ass cable; I kept my original HDMI cables.

If I change the cable, does that fix it regardless of my TV?
What will be the effect?
 
Last edited:

ethomaz

Banned
I did go pick up a PS5 at Best Buy yesterday and there was a huge Xbox cardboard cutout that did in fact say "World's Most Powerful Console".

Now I have a question about cables... the PS5 is telling me it is outputting 1080p because of my HDMI device.
I'm not using their short-ass cable; I kept my original HDMI cables.

If I change the cable, does that fix it regardless of my TV?
What will be the effect?
Test with the PS5 cable first... it is HDMI 2.1 compatible... so if that works, you will need to buy (and make sure you are actually buying) an Ultra High Speed HDMI cable.

But your issue is probably due to the cable not supporting the higher bandwidth.
 
Last edited:

ABnormal

Member
I just finished watching the video, but can't comprehend how a 1K-post thread can evolve out of these minor differences. What a waste.

I suspect you are playing dumb here. And I say "suspect" only because everyone deserves the benefit of philosophical doubt.

But I'm sure it's not hard to understand that after months of "XSX much more powerful", "at least 20% advantage on multiplatform titles" and similar statements, parity between the two consoles is a big deal (so many are silently suffering now). If it then turns out that some aspects are even a little better on PS5 (however small), you can understand that it's the icing on the cake, just to add insult to injury.
 

John Wick

Member
Ps5 uses boost clocks, variable. You can't just compare this way
Are you still spreading the MS FUD?
There's no better explanation than what Mark Cerny has already given in his talk, and later clarified in his Digital Foundry interview.
It's tied to power usage, not temperature. It's designed so that for the most part the clocks stay at their highest frequencies, regardless of whether the console is in a TV cabinet or somewhere cold.
It's designed to be deterministic. The purpose is to reduce clocks when they don't need to be so high, to help with power usage and keeping the fans quiet. If a GPU is expected to deliver a frame every 16.6ms (60 FPS) and it has done its work already in 8ms, then there's no point in it sitting there idle at 2.23GHz sucking power and generating heat. If it can intelligently drop the clocks so that it finishes its frame just before the 16.6ms mark, you get the same 60 FPS game and the same graphical detail, but with much less fan noise.
Anyone with a gaming PC will know that GPU utilisation is rarely at 100%
It typically takes burn tests and crazy benchmark software to get that.
Cerny seemed to suggest that you’d need quite a synthetic test to really load up both the CPU and GPU enough to cause them to declock for power reasons, and that it won’t show up in any normal game.
He said that same synthetic test would simply cause a PS4 to overheat and shutdown.
And even then, dropping power consumption by 10% only drops core clocks by a “few” percent. Which makes sense if you’re used to overclocking modern GPUs. You need to crank up the power to get even a minimal amount of extra clock, and cranking up an already jacked up GPU clock by a “few” percent barely makes a difference to performance anyway.
The PS5 can change its power draw multiple times per frame.
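
To put rough numbers on that trade-off (my own back-of-envelope sketch using textbook approximations, not Cerny's or AMD's figures): dynamic power scales roughly with frequency times voltage squared, and voltage tends to rise with frequency, so power grows much faster than clock. Under that assumption, a 10% power cut really does cost only a few percent of frequency, and the 16.6ms frame-budget example works out like this:

# Back-of-envelope sketch of the power/clock trade-off described above.
# Assumes dynamic power ~ f * V^2 with V roughly tracking f, i.e. power ~ f^3.
# Textbook approximations only, not measured PS5 behaviour.

def clock_after_power_cut(power_fraction, exponent=3.0):
    """Relative clock that fits inside a reduced power budget."""
    return power_fraction ** (1.0 / exponent)

print(clock_after_power_cut(0.90))   # ~0.965 -> roughly a 3-4% clock drop for -10% power

# Frame-budget example from the post: 60 FPS leaves 16.6 ms per frame.
# If the GPU finishes its work in 8 ms at full clock, it could instead run at
# roughly 8/16.6 of the clock and still hit the deadline (ignoring the parts of
# a frame whose cost doesn't scale with clock speed).
frame_budget_ms = 16.6
work_at_full_clock_ms = 8.0
min_clock_fraction = work_at_full_clock_ms / frame_budget_ms
print(f"could drop to ~{min_clock_fraction:.0%} of full clock and still make 60 FPS")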
 
Nothing to see here I guess with absolute parity on a cross gen game aside from that ONLY 12% FASTER LOADING! HAHA WHERE YOUR 2X SSD SPEED AT SONY PONIES!!? NO 2 SECOND LOAD HUH!! But yeah it's cross gen so that's why there's parity otherwise the Series X would be eating Asgard for breakfast with those extra TF....

I think I nailed it :messenger_smirking:
 

assurdum

Banned
Nostradamus is here? The bigger number thing was worded the way it was because it's a joke. But you can't blame anyone for expecting x amount of difference in performance in a game when they can see the console is capable of it on paper.
Paper means nothing without factual context.
 
Last edited:

devilNprada

Member
Test with the PS5 cable first... it is HDMI 2.1 compatible... so if that works, you will need to buy (and make sure you are actually buying) an Ultra High Speed HDMI cable.

I have a two-year-old Sony A8F ($3,500) TV. It has some HDMI 2.1 features (eARC) but not all of them; it is a 4K, 120 Hz refresh rate TV.

Will it always output 1080p and upscale, then? Should I even replace the cable?
 

Zadom

Member
Aren't you busy having a screeching meltdown over the XSX having a PCIe 4 ssd?

H787avD.jpg
 

ethomaz

Banned
I have a two-year-old Sony A8F ($3,500) TV. It has some HDMI 2.1 features (eARC) but not all of them; it is a 4K, 120 Hz refresh rate TV.

Will it always output 1080p and upscale, then? Should I even replace the cable?
Your TV should work fine at 4K... 1080p is really weird.
That is why I'm asking you to try the cable that comes with the PS5 first... to make sure it is not the cable.
 

THE:MILKMAN

Member
SmartShift moves power around roughly eight times a frame (assuming the same timing as on laptops).

RDNA2 also has pervasive fine-grained clock gating across the GPU, so think of it as power being shifted around the whole GPU die eight or more times a frame, and that is likely what is happening here - it's in AMD's PC RDNA2 presentation.

SmartShift is just AMD's name for the case where the CPU hands power to the GPU. RDNA2 PC GPU parts also do this power shifting at a CU level.

I really don't get why so many get their knickers in a twist about PS5's variable clocks. dGPUs have been doing it for a good while now (base, game, and boost clocks = variable) and nobody complains.

A graph of the RX 5700 XT over a few minutes shows the variable clocks in a game:

graph_1.png.webp


I think PS5 might go even further/be more fine-grained with the variable clocks, and it adds SmartShift too.
 

assurdum

Banned
Guys I love ya all. This thread has been too much fun, when's the next big multiplat face off aside from trans fest Cyberpunk in December?
Quite curious about Watch Dogs on PS5. About COD, I'm worried for the Series X; Activision ended up handling COD better on the PS4 Pro even compared to the One X. That's not a good sign for Xbox. But who knows, I could be wrong.
 
Last edited:
The tune has certainly changed on here ever since the games came out, what with AC Valhalla claimed to run at native 4K@60fps while PS5 runs dynamic 4K@60fps.

To be fair, if you read the message from Ubisoft correctly, neither version was confirmed to run at native 4K. Usually game makers tend to point out whether it's native or not, since that's a selling point for their game.
 
Last edited:

geordiemp

Member
I really don't get why so many get their knickers in a twist about PS5's variable clocks. dGPUs have been doing it for a good while now (base, game, and boost clocks = variable) and nobody complains.

A graph of the RX 5700 XT over a few minutes shows the variable clocks in a game:

graph_1.png.webp


I think PS5 might go even further/be more fine-grained with the variable clocks, and it adds SmartShift too.

So are RDNA2 PC parts like the 6800. But AMD did not talk about variable clocks per CU much beyond the slide below, because it goes against the MS narrative and MS is an important customer.

When AMD does the white paper for RDNA2 we will find out how far "pervasive" goes... per WGP, likely...


nhgJqf9.png
 
Last edited:

Godfavor

Member
I really don't get why so many get their knickers in a twist about PS5's variable clocks. dGPUs have been doing it for a good while now (base, game, and boost clocks = variable) and nobody complains.

A graph of the RX 5700 XT over a few minutes shows the variable clocks in a game:

graph_1.png.webp


I think PS5 might go even further/be more fine-grained with the variable clocks, and it adds SmartShift too.

Nonsense, variable clocks on PS5 were added because there is a TDP cap.
 
Didn't stop people running with the native 4K narrative.

They basically came to that conclusion because they thought the PS5 would be much weaker, so if it's dynamic on the PS5 it would surely be native on the XSX. It doesn't help that Greenberg shoved "Native 4K" in everyone's faces, so they had high expectations for the system.

Realistically though, native 4K consumes a lot of resources and most developers won't do that on consoles.

Nonsense, variable clocks on PS5 were added because there is a TDP cap.

Cerny also mentioned that it is a vital part of the cooling strategy. Basically, the clocks will be low when there's less work being done. Lower clocks mean that less heat gets produced.
 
Last edited:

geordiemp

Member
Nonsense, variable clocks on PS5 were added because there is a TDP cap.

Nope, pervasive fine-grained clock gating with variable clocks is part of Big Navi and RDNA2 - point 1 below - and it is per-CU based (could be per DCU).

A slide that people missed, or did not get, from AMD about Big Navi:

Qcbe7Rt.png
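
For a concrete (and purely illustrative) picture of what "pervasive fine-grained clock gating" means, here is a toy sketch: only CUs with work queued get clocked, so idle units contribute almost no dynamic power while busy ones keep running at full speed. This is my own simplification, not AMD's implementation, and the numbers are made up.

# Toy illustration of per-CU clock gating: only units with pending wavefronts
# are clocked, so idle CUs burn (almost) no dynamic power.
# CU count and per-CU wattage are invented for the example.

NUM_CUS = 36
DYNAMIC_POWER_PER_ACTIVE_CU_W = 4.0   # made-up figure

def gpu_dynamic_power(queue_depth_per_cu):
    """Sum dynamic power only over CUs that are actually clocked this interval."""
    return sum(DYNAMIC_POWER_PER_ACTIVE_CU_W for depth in queue_depth_per_cu if depth > 0)

# A light scene: half the CUs have wavefronts queued, the rest are gated off.
queues = [3] * 18 + [0] * 18
print(gpu_dynamic_power(queues))                 # 72.0 W
print(NUM_CUS * DYNAMIC_POWER_PER_ACTIVE_CU_W)   # 144.0 W if everything stayed clocked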
 
Last edited:

THE:MILKMAN

Member
So are RDNA2 PC parts like the 6800. But AMD did not talk about it much beyond the slide below, because it goes against the MS narrative and MS is an important customer. When AMD does the white paper for RDNA2 we will find out how far "pervasive" goes... per WGP, likely...


nhgJqf9.png

I'm amazed at how quiet Sony and AMD have been about talking up PS5/RDNA 2 tech and/or custom tech. I think Mark Cerny might do his tech talks next year like he did after the PS4 launched, but at least a basic overview would be nice. We still don't even know the actual die size of PS5's APU, even though Austin Evans and iFixit were directly asked to measure it!
 
Last edited:
Nope, pervasive fine-grained clock gating with variable clocks is part of Big Navi and RDNA2 - point 1 below - and it is per-CU based (could be per DCU).

A slide that people missed, or did not get, from AMD about Big Navi:

Qcbe7Rt.png

Some people are confused by the whole RDNA2 thing, but if there's one thing that Cerny did confirm, it's that the CUs are RDNA2 CUs. Which means that whatever AMD said about them can be applied to the PS5.

20200329152909.jpg


Looking at AMD's presentation, the variable clocks make sense because the CUs are RDNA2 ones.
 

Godfavor

Member
They basically came to that conclusion because they thought the PS5 would be much weaker, so if it's dynamic on the PS5 it would surely be native on the XSX. It doesn't help that Greenberg shoved "Native 4K" in everyone's faces, so they had high expectations for the system.

Realistically though, native 4K consumes a lot of resources and most developers won't do that on consoles.



Cerny also mentioned that it is a vital part of the cooling strategy. Basically, the clocks will be low when there's less work being done. Lower clocks mean that less heat gets produced.


Nope, pervasive fine-grained clock gating with variable clocks is part of Big Navi and RDNA2 - point 1 below - and it is per-CU based (could be per DCU).

A slide that people missed, or did not get, from AMD about Big Navi:

Qcbe7Rt.png

That's all part of the TDP though; it doesn't change what I wrote.
 

Godfavor

Member
And is it a bad thing in your opinion? Just based on what we saw so far.

Not a bad thing at all.
Some people don't understand what it does.

GPUs (like Nvidia's) that do not have this feature aren't worse for it, just less efficient. It is great for heat/watt efficiency at a per-CU level and I wish the XSX had it, but that's that and nothing more.

It is great for consoles because, in an ideal scenario, the PS5 could run a constant 210-watt load (or whatever the cap is) by rapidly switching frequencies between the CPU and GPU. That's a way to maximize an APU.

The XSX does not have it because its max APU speeds won't exceed the TDP, as the GPU clock is much lower (1,823 MHz) and thus more efficient in its power draw (watts) to clock speed ratio.

If the XSX had something like that, it could lead to a higher TF number as well.
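
As a hedged sketch of the trade-off being described (all wattages and the power-to-clock model below are guesses for illustration, not real console figures): under a fixed cap, whatever the CPU doesn't draw can be spent holding the GPU at a higher clock, whereas a fixed-clock design has to budget for the worst case of both at once.

# Sketch of the "constant load under a cap" idea: power the CPU isn't using
# can be handed to the GPU to hold higher clocks. All wattages/clocks invented.

CAP_W = 210.0                 # hypothetical total APU budget
GPU_MAX_CLOCK_MHZ = 2230.0
GPU_MAX_POWER_W = 160.0       # made-up GPU power draw at max clock

def gpu_clock_for_budget(gpu_budget_w):
    """Map a power budget to a clock, assuming power ~ clock^3 (see earlier sketch)."""
    fraction = min(gpu_budget_w / GPU_MAX_POWER_W, 1.0)
    return GPU_MAX_CLOCK_MHZ * fraction ** (1.0 / 3.0)

for cpu_draw_w in (40.0, 60.0, 80.0):      # light vs heavy CPU scenes
    gpu_budget = CAP_W - cpu_draw_w
    print(f"CPU {cpu_draw_w:.0f} W -> GPU clock ~{gpu_clock_for_budget(gpu_budget):.0f} MHz")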
 

Old Empire.

Member
They basically came to that conclusion because they thought the PS5 would be much weaker, so if it's dynamic on the PS5 it would surely be native on the XSX. It doesn't help that Greenberg shoved "Native 4K" in everyone's faces, so they had high expectations for the system.

Realistically though, native 4K consumes a lot of resources and most developers won't do that on consoles.



Cerny also mentioned that it is a vital part of the cooling strategy. Basically, the clocks will be low when there's less work being done. Lower clocks mean that less heat gets produced.

Apparently Xbox switched to a new API called "GameCore" and the dev tools are not that mature yet. The Series X will likely have some teething issues here at the beginning. Devs will get a better handle on the Series as time goes by, and there will be advantages over the PS5.

Seriously, why would people bother investing in higher-quality GPUs on PC if the slightly weaker one gave the same resolution and frame rates and had the same texture assets? Next year is when the real benchmarking begins, as the devs will have had the consoles for some time!
 
Not a bad thing at all.
Some people don't understand what it does.

GPUs (like Nvidia's) that do not have this feature aren't worse for it, just less efficient. It is great for heat/watt efficiency at a per-CU level and I wish the XSX had it, but that's that and nothing more.

It is great for consoles because, in an ideal scenario, the PS5 could run a constant 210-watt load (or whatever the cap is) by rapidly switching frequencies between the CPU and GPU. That's a way to maximize an APU.

The XSX does not have it because its max APU speeds won't exceed the TDP, as the GPU clock is much lower (1,823 MHz) and thus more efficient in its power draw (watts) to clock speed ratio.

If the XSX had something like that, it could lead to a higher TF number as well.

OK, that makes sense to me. I've seen some people argue it's a downside and I really can't see the negative side of it. It essentially allows a GPU to be more efficient and gain some performance because of it.
 

M1chl

Currently Gif and Meme Champion
While we are at it, this is a quick capture from the internal PS5 recording. You can keep it at automatic 4K@60fps HDR recording for up to 15 minutes, or capture longer with the built-in Share Factory Studio app on the PS5:




Pretty clean footage, and a very powerful editing app that will make many budget content creators stand out with quality stuff. @Kingthrash no more 1080p when you go to PS5, my bro ;)

Dat framerate in open world:
94f.png

What is XSX being bound by here?
vlcsnap-2020-11-13-18h56m20s419.png



He speaks about 120fps here but it applies to 60fps too.


More sample motion blur it seems, so, *cough*, a G P U.
 
Last edited:
Apparently Xbox switched to a new API called "GameCore" and the dev tools are not that mature yet. The Series X will likely have some teething issues here at the beginning. Devs will get a better handle on the Series as time goes by, and there will be advantages over the PS5.

Seriously, why would people bother investing in higher-quality GPUs on PC if the slightly weaker one gave the same resolution and frame rates and had the same texture assets? Next year is when the real benchmarking begins, as the devs will have had the consoles for some time!

I've also read that developers still have to get used to the inner workings of the PS5, so things will get better for Sony as well. But so far it seems like the PS5 is a very well-designed system.
 

Dodkrake

Banned
Apparently Xbox switched to a new API called "GameCore" and the dev tools are not that mature yet. The Series X will likely have some teething issues here at the beginning. Devs will get a better handle on the Series as time goes by, and there will be advantages over the PS5.

Seriously, why would people bother investing in higher-quality GPUs on PC if the slightly weaker one gave the same resolution and frame rates and had the same texture assets? Next year is when the real benchmarking begins, as the devs will have had the consoles for some time!

Ford invested in the Pinto. Think about that. Just because you invest a lot doesn't mean you invest wisely.

Oh, and by the way, we've heard the "next year" talk since 2013. It materialized in 0 exclusives and a sales failure.
 
Apparently Xbox switched to a new API called "GameCore" and the dev tools are not that mature yet. The Series X will likely have some teething issues here at the beginning. Devs will get a better handle on the Series as time goes by, and there will be advantages over the PS5.

Seriously, why would people bother investing in higher-quality GPUs on PC if the slightly weaker one gave the same resolution and frame rates and had the same texture assets? Next year is when the real benchmarking begins, as the devs will have had the consoles for some time!

I can't comprehend this attitude.
It's always the same with Xbox: "just wait and see!"
In 18 years I haven't seen anything worthwhile in Xbox exclusivity yet. (This is obviously just my subjective opinion; I have always had a gaming PC and a PlayStation, and sometimes an additional Nintendo console.)

Oh wait, I remembered: Fable 1 was interesting! Played it later on PC though.
 
Last edited: