
Dusk Golem reiterates that the Xbox will be more powerful than the PS5 (Admitted to starting console wars, demodded)

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
With planar reflections you have to render the scene twice, and you are still not rendering what isn't even visible on screen. Same limitation as SSR, except more costly.

Cube maps are your only argument, but as VFX pointed out, the reflections are too high quality to be that.

Either way, Alex wasn't referring to cubemaps, so your argument is irrelevant.

So you agree that it IS possible to reflect something static that's offscreen using planar reflections, cubemaps, or, to put it more bluntly, non-RT techniques... for instance... a building?
I'm glad we agree.


This was his main point.
There is no argument:
 


Sure, for cubemaps. Not SSR or planar, you're still wrong there. But unless you're going to try and argue that cubemaps can look as good as the reflections in that image, then you'll have to forgive me for laughing at you.

Basically just another poor analysis from Alex.
 
Last edited:

Ar¢tos

Member
Considering RE8 has problems even at 1080p, I think dynamic 4K will be a great result.

PS5 GPU has variable clock, deal with it.
So edgy!
There is absolutely no data showing that the reason for that, if it's even true, is SmartShift, but still, very edgy!

If a variable clock gives me this:
[Image: Spider-Man: Miles Morales PS5 screenshot]


When a fixed clock gives me this:
[Image: Halo Infinite Brute screenshot]

(I think... it's not even a devkit capture; XSX devkits seem to be harder to find than pink unicorns.)

I'll take the variable clock any day.
 
A drop of 10% on either CPU or GPU saves 27% power according to the lead architect Mark Cerny. So how much would a 5% drop save? Maybe you're not listening? No game in existence taxes the GPU 100% all the time.
I am pretty sure that Cerny's very vague example was not what you are saying.


So edgy!
There is absolutely no data showing that the reason for that, if it's even true, is SmartShift, but still, very edgy!

If a variable clock gives me this:
[Image: Spider-Man: Miles Morales PS5 screenshot]
Is that gameplay, or is that photomode?
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Sure, for cubemaps. Not SSR or planar, you're still wrong there. But unless you're going to try and argue that cubemaps can look as good as the reflections in that image, then you'll have to forgive me for laughing at you.

Basically just another poor analysis from Alex.

Why NOT planar reflections?
This is their job; they are taxing, sure, but the jump to PS5 could cover the cost.
But anyway, that isn't/wasn't my point. My point was that the screenshot at a glance, by itself, if we didn't know anything about the game and/or hadn't seen the trailer, doesn't seem to be showing off anything that screams RT. I think everyone agrees it would be hard to point to a specific feature of the screenshot that couldn't be done with rasterization techniques.



And you are gonna laugh at me if I tell you it's possible to have super high-res cubemap/reflection probe reflections that look as good as what we are seeing in this screenshot?

Are you sure you really want to do this... are you sure sure?
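
To put that on slightly more technical footing, here's a minimal sketch (plain Python, all names and values hypothetical, not taken from any actual engine) of why a baked cubemap/reflection probe can reflect static offscreen geometry like a building, while SSR cannot: the probe is sampled purely by direction from data captured ahead of time, whereas SSR can only reuse pixels that are already in the current frame.

Code:
import numpy as np

def reflect(view_dir, normal):
    # Standard reflection vector R = V - 2*(V.N)*N (inputs assumed normalised).
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def sample_cubemap(faces, direction):
    # A probe is indexed purely by direction: the dominant axis picks the face.
    # Whatever was baked into that face comes back, on screen or not.
    axis = int(np.argmax(np.abs(direction)))
    sign = "+" if direction[axis] > 0 else "-"
    return faces[("x", "y", "z")[axis] + sign]

# Hypothetical baked probe: the +y face happens to contain the offscreen skyline.
faces = {"x+": "east wall", "x-": "west wall",
         "y+": "skyline with building", "y-": "street",
         "z+": "north wall", "z-": "south wall"}

view = np.array([0.0, -1.0, 0.0])  # camera looking down at a mirror-like surface
n = np.array([0.0, 1.0, 0.0])      # surface normal pointing up
print(sample_cubemap(faces, reflect(view, n)))  # -> "skyline with building"
# SSR instead marches the reflected ray across the CURRENT frame's pixels, so a
# building that was never drawn on screen simply isn't there to sample.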
 

pawel86ck

Banned
So edgy!
There is absolutely no data showing that the reason for that, if it's even true, is SmartShift, but still, very edgy!

If a variable clock gives me this:
[Image: Spider-Man: Miles Morales PS5 screenshot]


When a fixed clock gives me this:
[Image: Halo Infinite Brute screenshot]

(I think... it's not even a devkit capture; XSX devkits seem to be harder to find than pink unicorns.)

I'll take the variable clock any day.
Halo Infinite looks disappointing, but that's not because of hardware. Even Gears 5 looks more impressive.
 

Zathalus

Member
I am pretty sure that Cerny's very vague example was not what you are saying.
It's not wrong though. Pushing frequency to the limit has a massive effect on power draw. With the Radeon 5700 XT, for example, pushing the clocks up by 10% increases power draw by almost 30%.

It's not a linear relationship; even a 50 MHz drop in frequency can net big drops in power draw.
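
For what it's worth, a rough back-of-the-envelope sketch lines up with those numbers. Assuming dynamic power goes roughly with C·V²·f and that near the top of the curve voltage has to scale roughly with frequency, power ends up scaling close to the cube of frequency (a simplification; real chips have leakage and discrete voltage steps):

Code:
# Rough approximation only: assume P ~ C * V^2 * f and V ~ f near the top of the
# curve, so P ~ f^3. Treat the results as ballpark figures, not measurements.
def power_saving(old_ghz, new_ghz):
    # Fraction of dynamic power saved when dropping from old_ghz to new_ghz.
    return 1.0 - (new_ghz / old_ghz) ** 3

print(f"10% clock drop:       ~{power_saving(2.23, 2.23 * 0.9):.0%} less power")  # ~27%
print(f"2.23 GHz -> 2.16 GHz: ~{power_saving(2.23, 2.16):.0%} less power")        # ~9%

So under that assumption a 10% drop saves roughly the 27-30% being quoted here, and even a ~70 MHz drop from 2.23 GHz claws back around 9% of the power budget.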
 

Hairsplash

Member
Any game that has a PS4 Pro patch SHOULD run faster on a PS5. So IMO the PS5 has at least a 2-year "lead" due to the PS4 Pro being "essentially" the same. What I mean by essentially is that the PS5 has the same number of shader units as the PS4 Pro, but twice the CPU speed, twice the GPU speed, and a fancy SSD... thus it should run PS4 Pro games faster, mooting any advantage the Xbox Series X will have in the first two years. Not to mention that the Lockhart Xbox is going to be similar to the Xbox One X... so in my opinion almost all games will be made "especially" for the Lockhart version of the Xbox, and IMO the Xbox Series X will be a Halo Xbox (pun intended).
 
It's not wrong though. Pushing frequency to the limit has a massive effect on power draw. With the Radeon 5700 XT, for example, pushing the clocks up by 10% increases power draw by almost 30%.

It's not a linear relationship; even a 50 MHz drop in frequency can net big drops in power draw.
I've already written many times that it is not a linear relationship, and especially after a threshold the curve for both thermals and consumption becomes very, very steep.
That's one of the reasons I find it so funny that many people believe the "worst case game" will have the GPU throttling back only from 2.23 GHz to 2.16 GHz, which is exactly what Cerny's (very misinforming, IMO) example was.
Keep in mind these consoles are not the poor netbook tech the current gen was.




This is gameplay :

Fake edit: I posted a screenshot of a ps4 game by mistake... My bad!
Cool story bro.
What about that Spider-Man image though?
 
Last edited:

Ar¢tos

Member
I've already written many times that it is not a linear relationship, and especially after a threshold the curve for both thermals and consumption becomes very, very steep.
That's one of the reasons I find it so funny that many people believe the "worst case game" will have the GPU throttling back only from 2.23 GHz to 2.16 GHz, which is exactly what Cerny's (very misinforming, IMO) example was.
Keep in mind these consoles are not the poor netbook tech the current gen was.





Cool story bro.
What about that Spider-Man image though?
Ask Insomniac. The game is not out, so obviously I can't tell you. But nothing in that image looks too high quality for it to not be gameplay. Outside of the reflections, it looks just OK, IMHO.
 

Zathalus

Member
I've already written many times that it is not a linear relationship, and especially after a threshold the curve for both thermals and consumption becomes very, very steep.
That's one of the reasons I find it so funny that many people believe the "worst case game" will have the GPU throttling back only from 2.23 GHz to 2.16 GHz, which is exactly what Cerny's (very misinforming, IMO) example was.
Keep in mind these consoles are not the poor netbook tech the current gen was.
Why should we believe that the GPU downclocking would exceed 50-70 MHz under demanding use (obviously it can clock way lower if the GPU is not needed as much)? We already know such a small downclock can net back a large portion of the power budget, and we have confirmation of that from both the lead architect and multiple developers. What more confirmation do we need?
 
Last edited:
Why should we believe that the GPU downclocking would exceed 50-70 MHz under demanding use (obviously it can clock way lower if the GPU is not needed as much)? We already know such a small downclock can net back a large portion of the power budget, and we have confirmation of that from both the lead architect and multiple developers. What more confirmation do we need?
"We" as in me personally, would like a whole lot more information and hands-on testing.
Also I have written many times before that if the required downclocking would be something as trivial as a few percentage points, sony would have locked the frequency there and call it a day.
Don't you think so?

IMO it can go only two ways: either my (admittedly incomplete because of missing parameters) calculations are way off, or a huge amount of people is going to get a surprise sometime in the near future
 

Zathalus

Member
"We" as in me personally, would like a whole lot more information and hands-on testing.
Also I have written many times before that if the required downclocking would be something as trivial as a few percentage points, sony would have locked the frequency there and call it a day.
Don't you think so?

IMO it can go only two ways: either my (admittedly incomplete because of missing parameters) calculations are way off, or a huge amount of people is going to get a surprise sometime in the near future
Why lock the frequency when you can use dedicated silicon to change the GPU/CPU power allocation and drive clocks as high as possible on a near-millisecond basis? Remember, this is not just for the GPU; the system manages the CPU and other parts of the SoC as well. A GPU does not need to be stressed to 100% usage all the time, even in the most demanding titles, so having dedicated processing power to move the power budget around on the fly opens up a lot of flexibility compared to a system without it.

If Microsoft had included SmartShift and additional power-monitoring silicon like Sony has done, then the Xbox Series X would likely have been able to boost its clocks even further. Of course, they went with a different approach for purely stable clocks. I cannot say which approach works out better for SoC size and cost.

Sony likely took this variable approach to squeeze as much performance as possible out of a 36 CU GPU. It's also why the console is as large as it is: to fit the patented cooling system they have implemented (which may include liquid metal on the die).
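
To illustrate what I mean by moving the budget around, here's a toy sketch (all wattage numbers are made up, and this is not AMD's or Sony's actual algorithm): the SoC has one fixed power budget, and each tick whatever the CPU doesn't need is handed to the GPU, and vice versa.

Code:
TOTAL_BUDGET_W = 200.0                  # hypothetical total SoC power budget
CPU_FLOOR_W, GPU_FLOOR_W = 30.0, 80.0   # hypothetical minimum allocations

def allocate(cpu_demand_w, gpu_demand_w):
    # Split the fixed budget between CPU and GPU based on current demand,
    # never starving either block below its floor.
    cpu = max(CPU_FLOOR_W, min(cpu_demand_w, TOTAL_BUDGET_W - GPU_FLOOR_W))
    gpu = min(gpu_demand_w, TOTAL_BUDGET_W - cpu)
    return cpu, gpu

# A CPU-light frame leaves the GPU with more of the budget (clocks stay pinned);
# a CPU-heavy frame pulls power back and the GPU sheds a few tens of MHz instead.
for cpu_demand, gpu_demand in [(40.0, 180.0), (90.0, 180.0)]:
    cpu_w, gpu_w = allocate(cpu_demand, gpu_demand)
    print(f"CPU wants {cpu_demand:.0f} W -> CPU gets {cpu_w:.0f} W, GPU gets {gpu_w:.0f} W")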
 
And that tiny, zoomed in piece is sufficient to definitively claim that this must be RT and not a cube map reflection? Definitely possible, but not that obvious, IMO.
So he is certainly stupid enough to work for DF, but not to say something like "this picture is highly compressed and I can't reach a conclusion on it"? Instead he will deny it's ray tracing because it's his Xbox Discord confirmation bias kicking in. Like a dog in heat, his first reaction is to dismiss it with the same vigor.
 
Last edited:

Rentahamster

Rodent Whores
So he is certainly stupid enough to work for DF, but not to say something like "this picture is highly compressed and I can't reach a conclusion on it"? Instead he will deny it's ray tracing because it's his Xbox Discord confirmation bias kicking in. Like a dog in heat, his first reaction is to dismiss it with the same vigor.
He literally said he wants a better picture to confirm. He didn't make a proclamation one way or the other in his tweet.
 
Why lock the frequency when you can use dedicated silicon to change the GPU/CPU power allocation and drive clocks as high as possible on a near-millisecond basis? Remember, this is not just for the GPU; the system manages the CPU and other parts of the SoC as well. A GPU does not need to be stressed to 100% usage all the time, even in the most demanding titles, so having dedicated processing power to move the power budget around on the fly opens up a lot of flexibility compared to a system without it.

If Microsoft had included SmartShift and additional power-monitoring silicon like Sony has done, then the Xbox Series X would likely have been able to boost its clocks even further. Of course, they went with a different approach for purely stable clocks. I cannot say which approach works out better for SoC size and cost.

Sony likely took this variable approach to squeeze as much performance as possible out of a 36 CU GPU. It's also why the console is as large as it is: to fit the patented cooling system they have implemented (which may include liquid metal on the die).
I think there have been enough discussions about why devs prefer fixed targets to work with; no need to repeat them.
About cost, I think everyone will tell you that the one with fewer CUs is cheaper.
About heat, and everything that comes with it: the fact that a smaller board needs what must be the largest enclosure yet in a console is a reasonable first answer.
As I said many times, for me, a PS5 teardown and some hands-on testing are way more needed than for the Xbox.
Not that I won't watch Hot Chips tonight, of course... :]
 
Last edited:

Polygonal_Sprite

Gold Member
Technically you can pull off marketing bullshots with photo mode. The final product should look very similar to that.

Plus, it's just a better-looking PS4 game with next-gen features. I don't see why it's so hard to believe.

Considering most people will be playing it at 1080p to get 60fps, there's no way that image quality is representative of the game. I imagine RT will also be stripped in the 60fps mode.
 
Considering most people will be playing it at 1080p to get 60fps, there's no way that image quality is representative of the game. I imagine RT will also be stripped in the 60fps mode.



I don't care about what others game at. I have a 4K monitor, so I'll be able to play the game at 30fps with that image quality if I want.

There's nothing dishonest about letting people know what the visuals can look like at their best.
 
Last edited:

Anal Wake

Unconfirmed Member
That Silent Hill exclusivity deal for PS5 is a load of shit, if you ask me. Just wishful thinking. I seriously doubt that Sony would be willing to pay as much as Konami would like for a - let’s be real here - classic but dead franchise.
 
Last edited by a moderator:

Ar¢tos

Member
That Silent Hill exclusivity deal for PS5 is a load of shit, if you ask me. Just wishful thinking. I seriously doubt that Sony would be willing to pay as much as Konami would like for a - let’s be real here - classic but dead franchise.
Konami has no interest in gaming anymore. If they can make money licensing their IPs without putting in any effort, why wouldn't they do it?
And licensing doesn't necessarily involve a fixed price; it can be a profit percentage, so the risk is lower for Sony and Konami will always get some money.
 

FunkMiller

Gold Member
That Silent Hill exclusivity deal for PS5 is a load of shit, if you ask me. Just wishful thinking. I seriously doubt that Sony would be willing to pay as much as Konami would like for a - let’s be real here - classic but dead franchise.

If the deal did exist (but you're probably right), it would most likely have been a percentage-of-profits arrangement rather than loads of cash up front from Sony.

Edit: Arctos beat me to it 😋
 
Last edited:

JonnyMP3

Member
I'm not sure it's a dead franchise when P.T. has caused more hype in 5 years than the main SH series did in the previous 10.
 

martino

Member
I'm not sure it's a dead franchise when P.T. has caused more hype in 5 years than the main SH series did in the previous 10.

We will never know, but what would the hype for a P.T. demo have been without Kojima/Del Toro associated with it?
 
Last edited: