
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

Yoboman

Member
I remember when pixel shaders were the new kid on the block, and suddenly every surface was a shiny mess! Hopefully things will calm down over the next year or so and we will see more subtle use of RT. I agree RT GI can greatly improve a scene's look, and I don't need to see my face in every puddle!

Also, it is harder to tell with the naked eye whether shadows are ray traced or whether RT GI is being used, when you can get a close approximation using traditional methods at less performance cost. RT reflections are much easier to spot.
Yeah, but if it's not pre-baked they can do a lot more with dynamic lighting. I'd like to see them go a bit nuts in that regard
 

Rea

Member
Not really. As far as I know, SFS works with the RAM.

You're saying they described it as "the Xbox Infinity Cache"?

edit: yes, that's what you just wrote. LOL.
No, they are wrong.
SFS is a DX12U feature; both Nvidia and AMD cards support it. It has nothing to do with Nvidia's Ampere architecture or AMD's RDNA2. Infinity Cache is one of the features of RDNA2 and is exclusive to RDNA2 cards.
 

Rea

Member
No shit we know how it works and why. They run at constant power and let the clock vary, instead of running at a constant clock speed but letting power fluctuate. This made the PS5 thermal & cooling design much easier to engineer.
Meaning, PS5 will run every game with the same amount of noise and heat, but Xbox Series X will run hotter and noisier for some games?
 

Rea

Member
Mr socks, the answer is right there. It happens several times per frame, and it will not negatively affect performance because it’ll only happen at times in the pipeline where the gpu or cpu does not need the power. It’s an optimization in the usage of the power budget, whereas with “constant” clocks, power is delivered equally to the cpu and gpu no matter if one of them doesn’t need it, and it’s wasted
Also, when games are hungry for power, they will draw more power and produce more heat and fan noise?
Am I right?
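The power-budget scheme quoted above can be sketched as a toy model: a fixed combined budget, with any headroom one unit isn't using handed to the other instead of being wasted. All numbers here are invented for illustration; real hardware rebalances many times per frame based on workload telemetry.

```python
# Toy model of a fixed SoC power budget shared between a CPU and GPU.
TOTAL_BUDGET_W = 200.0  # hypothetical combined budget; number invented

def split_budget(cpu_demand_w, gpu_demand_w, total=TOTAL_BUDGET_W):
    """Give each unit what it asks for; if the combined demand exceeds
    the budget, scale both down proportionally so the cap always holds."""
    demand = cpu_demand_w + gpu_demand_w
    if demand <= total:
        return cpu_demand_w, gpu_demand_w
    scale = total / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

# A CPU-light, GPU-heavy slice of a frame: the GPU gets the headroom
# the CPU isn't using, instead of that power going to waste.
cpu_w, gpu_w = split_budget(cpu_demand_w=40.0, gpu_demand_w=180.0)
print(f"CPU {cpu_w:.1f} W, GPU {gpu_w:.1f} W")
```

The point of the sketch: the total never exceeds the cap, so cooling can be designed for one known worst case.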
 

Lethal01

Member
I remember when pixel shaders were the new kid on the block, and suddenly every surface was a shiny mess! Hopefully things will calm down over the next year or so and we will see more subtle use of RT. I agree RT GI can greatly improve a scene's look, and I don't need to see my face in every puddle!

Also, it is harder to tell with the naked eye whether shadows are ray traced or whether RT GI is being used, when you can get a close approximation using traditional methods at less performance cost. RT reflections are much easier to spot.

While I agree GI is often more important, people are really downplaying reflections too much. Yes, in some scenes enabling reflections kills performance with little visible difference, but as soon as you need to depict a rainy day or a city full of giant reflective glass buildings, you get tons of benefit from reflections.

For extremely reflective scenes, many of the benefits of good GI come from reflections.
 
Last edited:
I'm sure this is all great audio info, very interesting and much appreciated, but it's beside the point.

If you took a good quality mic, pointed it at the exhaust port of a Series X (and had it in reasonably close proximity), would you be able to hear and record it? Yes, of course.

If you put a typical $300 mic on a desk in the same room, say 10 feet away, and pressed record, would the recording meter show anything or would it sit in its 'zero' position?

It may well sit at zero, like this waveform of a simple test I just did.

8PXrD3T.jpg



This was a continuous recording made in a quiet room with a One X operating. I was sitting on a couch across the room, so fairly typical. The peaks are me moving or clicking my fingers, the bits in-between show nothing on the meter at all.

I cannot hear the console operating.

If I listen to that recording I can hear ambience in the quiet parts but nothing that I could identify reliably as the One X operating.

What Jez said was not controversial. He had an open mic in the same room as an operating Series X console and his recording meter was showing nothing as per the above example. Big deal, right? Totally believable and easily reproducible.
Yeah, well, I can record a waveform like that with ambient noise; that doesn't say much of anything, and it's not like Audacity is particularly brilliant software for audio anyway. And you said it yourself:

"If I listen to that recording I can hear ambience in the quiet parts but nothing that I could identify reliably as the One X operating."

I explained in my previous post why a sound that is there can be harder to identify. And you're saying yourself that in the "quiet parts" the mic is picking up the ambient sound; the console working is part of that sound. And this is most likely without a good mic for this kind of measurement (even within the $300 range). Again, you may think that I'm disputing the fact that the console is quiet, and I'm absolutely not. I'm just saying that, because of how sound works at a physical level, what Jez said is inaccurate. That's all I'm saying; I'm not denying his claim that the Series X is a quiet console.

But I guess we'll just agree to disagree and move on, this is surely tiring for both of us.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Okay, let's be fair here. They took too long to reveal that Miles Morales was coming to PS4
and no one from Sony Santa Monica Twitter has confirmed that gow will be ps5 only.

They are very active but curiously quiet on it being a next gen only game.
 

Lethal01

Member
and no one from Sony Santa Monica Twitter has confirmed that gow will be ps5 only.

They are very active but curiously quiet on it being a next gen only game.

Well, you could have said the same thing about Ratchet and Clank until a moment ago but it ended up being nothing.
While I agree the possibility it's on PS4 still exists I don't think them not screaming it is really evidence of anything or them trying to be sneaky.
 

Tripolygon

Banned
Ok? That doesn't address what I said. People want to know how often the GPU or CPU won't be having their full power, meaning that 10.2TF isn't all available.
Huh?

1, 2, 3, 12 TF are theoretical numbers, assuming perfect utilization of the ALUs; that doesn't exist in practice. You are asking a really weird question. Do you ask how many TF in your smartphone aren't available? Do you ask how many TF in your PC's CPU and GPU aren't available?

If I showed you a single frame from a game, can you tell how many TF was used in the process of rendering that frame?
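For context, the "theoretical" figure is just arithmetic on the shader configuration: a peak where every ALU lane retires a fused multiply-add every single cycle. A quick sketch using the publicly stated CU counts and clocks:

```python
def peak_tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
    """Peak FP32 throughput: every ALU lane retiring one fused
    multiply-add (2 FLOPs) per cycle -- a ceiling no real workload
    sustains frame after frame."""
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000.0

print(f"PS5: {peak_tflops(36, 2.23):.2f} TF")   # ~10.28
print(f"XSX: {peak_tflops(52, 1.825):.2f} TF")  # ~12.15
```

Which is exactly why "how many TF were used for this frame" has no meaningful answer: the number describes the ceiling of the ALUs, not a quantity a frame consumes.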
No matter how you want to spin it, variable clocks are worse than constant clocks. People, myself included, want to see the effects that the variable clock rate has.
Ok, I was ready to give you the benefit of the doubt above, that you just didn't understand what you were talking about, but now I think you're just interested in console wars. You are not going to see the "effects" of variable clocks, the same way you don't see them on your PC. What you are seeing are the effects of software optimization to better utilize a given piece of hardware.
HcDHBda.jpg

ehvCLLJ.jpg
 
Last edited:

buenoblue

Member
Then how come PCs have variable clocks? Nvidia and AMD cards are also variable.
They are variable to get more performance out of the chip, not less.

The trouble is, PC graphics cards list the base clock they guarantee to hit, then boost variably from there. What Sony have done is state the maximum boost clock the GPU will run at, then variably downclock from there. This seems problematic. No matter how much spin they put on it, a constant 10.2 teraflops is better than clocking down at any time.
 

Alex Scott

Member
The trouble is, PC graphics cards list the base clock they guarantee to hit, then boost variably from there. What Sony have done is state the maximum boost clock the GPU will run at, then variably downclock from there. This seems problematic. No matter how much spin they put on it, a constant 10.2 teraflops is better than clocking down at any time.
What you are implying is utterly incorrect.
It doesn't even make sense from the perspective of how the GPU functions.
Dropping power to a GPU by 10 percent means only a 1 or 2 percent drop in frequency most of the time. These adjustments are also going to happen literally multiple times a frame; at 120 FPS it could shift power 3 or 4 times within ONE frame.
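The "small clock drop frees a lot of power" claim follows from dynamic power scaling roughly with the cube of frequency (P ∝ V²·f, with voltage tracking frequency roughly linearly). That exponent is a common textbook approximation, not a measured PS5 curve, but it puts the drop in the same ballpark:

```python
# Why cutting the power budget barely touches the clock.
# Assumes dynamic power ~ f^3 -- an approximation, not PS5 data.

def freq_after_power_cut(power_fraction, exponent=3.0):
    """Relative frequency that fits within a reduced power budget."""
    return power_fraction ** (1.0 / exponent)

f = freq_after_power_cut(0.90)            # keep only 90% of the power
print(f"clock drop: {(1 - f) * 100:.1f}%")  # a few percent at most
```

With the cubic assumption, a 10% power cut costs roughly 3.5% of frequency; a shallower real-world curve would cost even less.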
 
Last edited:

jroc74

Phone reception is more important to me than human rights
No matter how you want to spin it, variable clocks are worse than constant clocks. People, myself included, want to see the effects that the variable clock rate has.
Well, you can see the effects of it right now.

Look at any game running on PS5. There are your effects.


I'm sure when DF and NXGamer do their analysis, some will say must be the clocks for any frame or resolution drop.

So, it doesn't really matter.
 

Yoboman

Member
and no one from Sony Santa Monica Twitter has confirmed that gow will be ps5 only.

They are very active but curiously quiet on it being a next gen only game.
They haven't even confirmed the name of the game, or even the platform it's on. There was no splash screen indicating even PS5. If it weren't in the PS5 showcase we wouldn't even be able to assume it's on PS5. All we have is a logo and a year

I'd say you're jumping to conclusions on how quiet they are being
 

Tripolygon

Banned
The trouble is pc graphics cards list the base clock they will guarantee to hit then boost variably from there. What sony have done is stated the maximum boost clock the gpu will run at, then they variably downclock from there. This seems problematic. No matter how much spin they put on it a constant 10.2 tetaflops is better than clocking down at any time.
First of all, what in the world is a constant 10.2 TF? And why is it better than clocking down at any time?
 
Last edited:

Jemm

Member
I seriously don't understand Ubisoft teams. Water is something they have almost perfected in Assassin's Creed; WTH is this shit?

- Aren't they using the same engine?
- Don't they share their tech across their teams to ship faster?

Ubisoft has many different engines for their popular games, even though they look alike:
  • Dunia Engine (heavily modified CryEngine): Far Cry
  • AnvilNext: Assassin's Creed, For Honor
  • Disrupt: Watch Dogs
  • Snowdrop: The Division
I suspect they share many techniques, but with so many teams across the globe and different release cycles, it'll take time to implement them.

Maybe they had to cut corners in Watch Dogs to keep performance up. Water is not such an important element in that game anyway, unlike in AC: Origins, where water played a much bigger role and the effects looked much better:

uwHuRvM.gif
 

buenoblue

Member
What you are implying is utterly incorrect.
It doesn't even make sense from the perspective of how the GPU functions.
Dropping power to a GPU by 10 percent means only a 1 or 2 percent drop in frequency most of the time. These adjustments are also going to happen literally multiple times a frame; at 120 FPS it could shift power 3 or 4 times within ONE frame.

Exactly, this is how every GPU and CPU works. Series X will up- and downclock constantly, whether needed or not. The fact that Sony have needed to state variable clocks is a worry, because it must be more than how it normally works. Like I said, PC GPUs state the minimum clock, then variably boost up. Sony have stated the top-end boost and not the lower end. My PC's GPU can rely on a certain clock at any given time; the PS5, it seems, cannot guarantee this.
 

bitbydeath

Gold Member
Exactly, this is how every GPU and CPU works. Series X will up- and downclock constantly, whether needed or not. The fact that Sony have needed to state variable clocks is a worry, because it must be more than how it normally works. Like I said, PC GPUs state the minimum clock, then variably boost up. Sony have stated the top-end boost and not the lower end. My PC's GPU can rely on a certain clock at any given time; the PS5, it seems, cannot guarantee this.

Have you noticed how people are constantly surprised that consoles can pull off incredible-looking graphics, such as God of War on a 1.84 TF device, which is practically impossible on a PC with the same spec/TF?

Well, that just goes to show that TF isn't everything. Software (APIs) plays a huge role in taking advantage of the hardware, and Xbox just moved their APIs further away from the console realm and closer to that of PC, so that alone will have its craigs impacts.
 
Exactly, this is how every GPU and CPU works. Series X will up- and downclock constantly, whether needed or not. The fact that Sony have needed to state variable clocks is a worry, because it must be more than how it normally works. Like I said, PC GPUs state the minimum clock, then variably boost up. Sony have stated the top-end boost and not the lower end. My PC's GPU can rely on a certain clock at any given time; the PS5, it seems, cannot guarantee this.

Educate yourself before talking nonsense that has been explained in this very thread about 1000 times already.
PS5 is using SmartShift; the XSX isn't. That's why it was called out specifically. It allows easier control of POWER DRAW, which makes the system easier to cool. This is important for power-limited hardware like laptops and consoles. Basically, it's a technology that improves an APU's real-world performance.

So, about your last point, being able to rely on a certain clock at any time: yes, the PS5 will be able to rely on the max frequency any time it actually needs it, based on workload. Once the GPU's workload is done, that power can shift again (if needed) to the CPU rather than go to waste (which is what happens in the XSX and other, less efficient platforms).
 

Jemm

Member
Well, that just goes to show that TF isn't everything. Software (APIs) plays a huge role in taking advantage of the hardware, and Xbox just moved their APIs further away from the console realm and closer to that of PC, so that alone will have its craigs impacts.
The DX12U API is more unified with PC, but that doesn't mean it is less low-level. Actually, the opposite is true:
DirectX 12 Ultimate, largely unifies Windows PCs with the upcoming Xbox Series X platform, offering the platform's new precision rendering features to Windows gamers with supporting video cards.

Many of the new features have more to do with the software side of development than the hardware. The new DirectX 12 Ultimate API calls aren't just enabling access to new hardware features, they're offering deeper, lower-level, and potentially more efficient access to hardware features and resources that are already present.

Inline ray tracing is an alternate API that allows developers lower-level access to the ray tracing pipeline than DXR1.0's dynamic-shader based ray tracing. Rather than replacing dynamic-shader ray tracing, inline ray tracing is present as an alternative model, which can enable developers to make inexpensive ray tracing calls that don't carry the full weight of a dynamic shader call. Examples include constrained shadow calculation, queries from shaders that don't support dynamic-shader rays, or simple recursive rays.

Source: https://arstechnica.com/gaming/2020...e-brings-xbox-series-x-features-to-pc-gaming/
 
What people want to know is how often the PS5 will drop clock speeds of the CPU or GPU. It will absolutely 100% do it, otherwise they would have just gone with constant instead of variable.

1. If you are a dev with access to a dev kit, you already know, because you need that kind of info.
2. On PS5 the clocks can drop depending on the kind of workload the system is doing at that moment, because power consumption doesn't track clock alone; it's closer to (clock × workload).
3. Games are not stress tests, as many people think.
4. Locked clocks are the norm in current/old consoles like the PS4 and Xbox One X, but they are not invincible.
5. A scene that stresses a locked-clock system too much can shut it down; see the issue that happens with Watch Dogs Legion on the Xbox One X.
6. No decent game dev will use flops as a magical measure; only marketing and fans/users use that metric in such a universal way.

The conclusion: it's okay to be curious, but if you are not someone who really needs that low-level info, that's because you don't need it; as a user you should only care about results.
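Point 2 of the list above can be illustrated with a simplified power model: draw depends on the clock *and* on how busy the silicon actually is, which is exactly why a game frame is not a stress test. The constant `k` and the activity numbers are invented for illustration.

```python
# Power draw depends on clock AND utilization, so two workloads at
# the SAME max clock can draw very different power.
# Simplified model: P = k * activity * f^3 (k invented).

def power_w(activity, clock_ghz, k=18.0):
    """activity: fraction of the chip's ALUs actually toggling (0..1);
    k is an invented fitting constant, not measured hardware data."""
    return k * activity * clock_ghz ** 3

# Same 2.23 GHz clock, very different power:
print(f"game frame:  {power_w(0.45, 2.23):.0f} W")  # typical utilization
print(f"stress test: {power_w(0.95, 2.23):.0f} W")  # near-total ALU occupancy
```

So "can it hold the max clock?" is really a question about worst-case activity, which real games rarely reach.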
 
Last edited:

geordiemp

Member
So you won't be able to see anything... but you will be able to?

What people want to know is how often the PS5 will drop clock speeds of the CPU or GPU. It will absolutely 100% do it, otherwise they would have just gone with constant instead of variable.

Well, if you paid attention to the AMD presentation you would have known that RDNA2 has fine-grained frequency gating, likely down to the WGP level, to achieve frequencies up to 2.5 GHz at a CU level. So each CU can boost.

What people want to know is why XSX is the only RDNA2 part that does not have it. We will find out, and this will finally be put to bed, in the RDNA2 white paper, as it cannot be hidden any longer.


ArT7Pn0.png


The larger 14 CU (7 DCU) shader array of the XSX will also become clear when all other RDNA2 hardware ships with shader arrays of 10 CUs or fewer.

Here are the game clocks for the larger Strix die (80 CU), if anyone is interested; the PS5 die is much smaller and has liquid metal TIM, so it will cool better.

 
Last edited:

Alex Scott

Member
Exactly, this is how every gpu and cpu works. Series x will up and downclock constantly. They up and downclock when needed or not needed. The fact that sony have needed to state variable clocks is a worry because it must be more than how it normally works. Like I said pc gpus state the minimum clock then variably boost up. Sony have stated the boost top end and not stated the lower end. At any given time my pc gpu can rely on a certain clock at any time. Ps5 it seems can not guarantee this.
Again, you are incorrect. The XSX can't downclock, just like the previous consoles. It will stay at the frequencies set by the Xbox team; it is the power draw that goes up or down with the workload. Sony stated the PS5 is variable because it is; the Xbox isn't. Xbox: constant frequency, variable power. PS5: constant power, variable frequency.
Sony did state that if they ever have to downclock in a worst-case scenario, the drop will be minimal.
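The constant-frequency/variable-power vs constant-power/variable-frequency distinction can be made concrete with a toy comparison. The max clocks are the publicly stated ones; everything else (the normalized "demand" figure, the cubic power assumption) is invented for illustration:

```python
# Toy contrast of the two clocking strategies described above.
# "demand" = relative electrical demand of a frame at max clock,
# normalized so 1.0 exactly fills the power budget. Numbers invented.

def xsx_style(demand, f_max=1.825):
    """Constant frequency: the clock never moves; power draw (and thus
    heat and fan noise) follows the workload."""
    return {"clock_ghz": f_max, "power_rel": demand}

def ps5_style(demand, f_max=2.23):
    """Constant power: draw is capped at the budget; if demand would
    exceed it, frequency dips (P ~ f^3 assumption) until it fits."""
    if demand <= 1.0:
        return {"clock_ghz": f_max, "power_rel": demand}
    return {"clock_ghz": f_max * (1.0 / demand) ** (1.0 / 3.0),
            "power_rel": 1.0}

print(xsx_style(1.10))  # over-budget frame: clock held, draw spikes
print(ps5_style(1.10))  # over-budget frame: draw capped, clock dips ~3%
```

Same pathological frame, two different pressure valves: one vents it as extra heat, the other as a small frequency dip.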
 
Last edited:

Rea

Member


Watch Dogs: Legion on PC on an RTX 3090.
Apparently the game uses RT reflections mixed with SSR. The game looks like shit even at ultra settings with ultra ray tracing running on an RTX 3090. Compared to Demon's Souls, which has only unconfirmed RT in-game yet looks generations ahead: Demon's Souls has dynamic lighting, self-shadowing, and a high dynamic range of colors. Watch Dogs Legion looks awesome with RT reflections, but the texture quality and lighting look like shit.
 

Dibils2k

Member
They haven't even confirmed the name of the game, or even the platform it's on. There was no splash screen indicating even PS5. If it weren't in the PS5 showcase we wouldn't even be able to assume it's on PS5. All we have is a logo and a year

I'd say you're jumping to conclusions on how quiet they are being
I mean, it's called God of War: Ragnarok.

The whole point of that video was to reveal the name.
 

buenoblue

Member
Educate yourself before talking nonsense that was explained in this very thread about 1000 times already.
PS5 is using smartshift, the XSX isn't. That's why it was called out particularly. It allows an easier control of POWER DRAW which makes it easier to cool. This is important for hardware that is power limited like laptops and consoles. Basically a technology that improves APUs real world performance.

So about your last point, being able to rely on a certain clock at any time, yes, the PS5 will be able to rely on the max frequency any time it actually needs, based on workload. Once the workload is done by the GPU, that power can shift again (if needed) to the cpu rather than go to waste (which is what happens in the xsx and other more inefficient platforms).

But Xbox was able to cool a more powerful GPU in a smaller form factor with near-silent acoustics? If the PS5 were more powerful and smaller, I could understand your argument that this was the only way to keep it quiet, but clearly that is not the case.

Again, you are incorrect. The XSX can't downclock, just like the previous consoles. It will stay at the frequencies set by the Xbox team; it is the power draw that goes up or down with the workload. Sony stated the PS5 is variable because it is; the Xbox isn't. Xbox: constant frequency, variable power. PS5: constant power, variable frequency.
Sony did state that if they ever have to downclock in a worst-case scenario, the drop will be minimal.

Do you have proof of this? I highly doubt the Series X will never adjust clocks. What about just sitting on the home screen, or watching Netflix; is that going to be 12 TF? Digital Foundry have already measured different power draw for different games, indicating different clocks for CPU and GPU.

Look guys, I have a PS5 on pre-order. I'm not getting a Series X because I have a PC. I just think we should be wary of what Sony is saying about variable clocks, as to me it doesn't seem great.
 

PaintTinJr

Member
Exactly, this is how every GPU and CPU works. Series X will up- and downclock constantly, whether needed or not. The fact that Sony have needed to state variable clocks is a worry, because it must be more than how it normally works. Like I said, PC GPUs state the minimum clock, then variably boost up. Sony have stated the top-end boost and not the lower end. My PC's GPU can rely on a certain clock at any given time; the PS5, it seems, cannot guarantee this.
It seems you are either unfamiliar with the DF interview Cerny gave after the Road to PS5, or are wantonly peddling the 'not 10.23 TF' line by a different MO.

The downclocking you are describing happens when the silicon is idle. That is not the same as constantly boost-clocking at sky-high clocks with active silicon while staying within a constant power budget, varying the boost clocks deterministically based on CU utilisation.

Cerny's comments on how to optimise for this paradigm shift will only benefit setups that can boost the way the PS5 GPU does. On older designs it will merely save power draw under the fixed clock, whereas on the new design power remains constant and those workloads get boost-clocked the same way lighter workloads do.
 