
[NX Gamer] AC: Valhalla - XsX vs PS5 - The first REAL head to head battle.

geordiemp

Member
Are you stupid or are you deliberately being dense?
Do you think you know better than Mark Cerny and developers?
A 10% drop in power only yields a few percent drop in frequency. So what, do you think the PS5 GPU will drop down to 1.8 GHz???
It won't come to that because it's already been explained to you, but you just love spreading FUD.
Yet you easily make excuses as to why Series X isn't performing in Valhalla as its theoretical spec sheet suggests.

I have him on ignore as it's not worth trying to explain to people who don't have a clue what they are talking about. To do high clocks and variable frequency means fine-gated clocks at the CU level; it's a different silicon design.

And you can't have 14 CU shader arrays as the silicon path is long; all fast designs have <10 CUs per array for a reason.

For PC parts that aren't binned, frequencies are pervasively spread (they vary across DCUs) for performance. XSX has none of this, and it's a staple of the RDNA2 white paper coming out soon. Below is the Strix 6800 XT.



The daft only understand simple numbers like TF and it's embarrassing.

The RDNA2 PC 40 CU part is even higher frequency lol
 

Sejanus

Member
Great video even if it's in German



This shows the Xbox Series X as the best-looking version


 

Md Ray

Member
Different products but the same DNA base from TSMC 7nm. It's a pity AMD cannot get the 6800 to run continuous boost clocks at 2.1 GHz; that would give it a nicer 18.5 TF instead of 16 TF.

I wonder if Phil can make the SX a bit taller and wider, say 35x35x35, slap in a bigger vapor chamber and a bigger 380 W PSU, and push it to variable boost clocks, say 2.1 GHz; that would give it a nicer round 14 TF, and of course a higher tide raises all SX boats too.

But it's early days to call out humiliation imo. :messenger_moon: :messenger_ok:
I'm gonna get an RDNA 2 GPU with big copper heatsink, apply liquid metal onto the chip, increase the power limit, voltage, increase the frequency as much as possible. Basically, I'm gonna overclock the shit out of it. From now on, I also like running the GPU at a higher frequency since Cerny also likes running his GPU at higher frequency. :pie_ssmiling:
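For anyone wondering where the TF figures in the quote come from, it's just the usual paper math (CUs × 64 lanes × 2 FLOPs per clock × frequency). Rough back-of-envelope sketch, nothing official, and it says nothing about actual game performance:

```python
# Paper FP32 TFLOPS: CUs x 64 lanes x 2 FLOPs/clock x clock (GHz).
# Back-of-envelope only; real frame rates depend on far more than this number.

def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000.0

print(f"XSX 52 CU @ 1.825 GHz: {tflops(52, 1.825):.2f} TF")      # ~12.15 TF
print(f"XSX 52 CU @ 2.100 GHz: {tflops(52, 2.100):.2f} TF")      # ~13.98 TF, the 'nicer round 14 TF'
print(f"PS5 36 CU @ 2.230 GHz: {tflops(36, 2.230):.2f} TF")      # ~10.28 TF at max boost clock
print(f"RX 6800 60 CU @ 2.105 GHz: {tflops(60, 2.105):.2f} TF")  # ~16.2 TF at rated boost
print(f"Paper gap, XSX vs PS5: {tflops(52, 1.825) / tflops(36, 2.230) - 1:.0%}")  # ~18%
```

That ~18% is the spec-sheet gap everyone keeps arguing about; whether it shows up on screen is basically what this thread is fighting over.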
 

Bo_Hazem

Banned
I'm gonna get an RDNA 2 GPU, apply liquid metal onto the chip, increase the power limit, voltage, increase the frequency as much as possible. Basically, I'm gonna overclock the shit out of it. From now on, I also like running the GPU at a higher frequency since Cerny also likes running his GPU at higher frequency. :pie_ssmiling:

The elephant in the room...

 

longdi

Banned
I'm gonna get an RDNA 2 GPU with big copper heatsink, apply liquid metal onto the chip, increase the power limit, voltage, increase the frequency as much as possible. Basically, I'm gonna overclock the shit out of it. From now on, I also like running the GPU at a higher frequency since Cerny also likes running his GPU at higher frequency. :pie_ssmiling:

This may sound troublesome, but most of PCMR are already doing so with their GPUs. My 1080 Ti is overclocked and liquid cooled to 2 GHz 'sustained' most of the time; newer engines do tend to drop it down to 1.9 GHz.

I haven't done liquid metal because I don't need the extra 4-5°C temp drop for the extra effort.

PCMR :messenger_moon: :messenger_ok:
 

Sejanus

Member
I believe it (for both of them) is poorly optimized.
Last-gen engine, without RT, and both of them can't hold native res for more than two seconds (in crowd areas).
 

longdi

Banned
Are you stupid or are you deliberately being dense?
Do you think you know better than Mark Cerny and developers?
A 10% drop in power only yields a few percent drop in frequency. So what, do you think the PS5 GPU will drop down to 1.8 GHz???
It won't come to that because it's already been explained to you, but you just love spreading FUD.
Yet you easily make excuses as to why Series X isn't performing in Valhalla as its theoretical spec sheet suggests.

I have shown you the Big Navi designs from AMD's website.
Go read up on boost clocks vs game clocks. 🤷‍♀️

How much is a few percent? I don't know. Mark left all of us hanging when it comes to the good stuff.
Let's see the Big Navi results. Let's see AMD's unannounced 6700 series.

Until then, we have to refer to AMD's website on boost vs game clocks, which I screenshotted earlier.
There were crazy fanboi rumors that the 6800 and 6900 run >2 GHz game clocks, but those rumors were destroyed on 28 Oct 2020. :lollipop_bomb:
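Purely as a back-of-envelope on the "few percent" question above: if you take the textbook approximation that dynamic power goes roughly as V²·f and that voltage has to track frequency, power scales roughly with f³. Those are my assumptions, not Cerny's actual power model, but they show why "a few percent" isn't a crazy claim:

```python
# Toy cube-law model: assume P ~ f^3 (C*V^2*f with V tracking f).
# Textbook approximation for illustration only, not Sony's real power model.

def clock_after_power_cut(base_ghz: float, power_fraction: float, exponent: float = 3.0) -> float:
    return base_ghz * power_fraction ** (1.0 / exponent)

base = 2.23  # PS5 GPU clock cap in GHz
for cut in (0.10, 0.20):
    new = clock_after_power_cut(base, 1.0 - cut)
    print(f"{cut:.0%} less power -> {new:.2f} GHz ({1 - new / base:.1%} lower clock)")
# 10% less power -> ~2.15 GHz (about 3.5% lower); under this toy model the GPU
# would have to shed roughly half its power budget before it ever fell to 1.8 GHz.
```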
 

Schmick

Member
The circle jerking around these parts is bloody embarrassing to read... how old are some of you lot? I'm guessing late teens... early twenties?
Please don't tell me you're older; that would just about finish off my last ounce of faith in the gaming community.

Yours sincerely, grumpy old fucker.
I agree, I'm kinda embarrassed to even have a NeoGAF login at the moment.
 
:messenger_tears_of_joy:
No games I want to play to justify tearing down my PC.
Maybe waiting for the 3080 Ti and FF16. :messenger_waving::messenger_moon:
I'm waiting for mid-gen to upgrade mine. Currently the next-gen consoles are the best value, but in around 2-3 years there should be a new breakthrough in GPU architecture. I'm looking to upgrade by then.
 

rnlval

Member
I have him on ignore as it's not worth trying to explain to people who don't have a clue what they are talking about. To do high clocks and variable frequency means fine-gated clocks at the CU level; it's a different silicon design.

And you can't have 14 CU shader arrays as the silicon path is long; all fast designs have <10 CUs per array for a reason.

For PC parts that aren't binned, frequencies are pervasively spread (they vary across DCUs) for performance. XSX has none of this, and it's a staple of the RDNA2 white paper coming out soon. Below is the Strix 6800 XT.



The daft only understand simple numbers like TF and it's embarrassing.

The RDNA2 PC 40 CU part is even higher frequency lol

The XSX GPU's clock speed is a base clock with boost mode disabled. MS has already stated its reason for disabling boost mode: reducing programming complexity.

The elephant in the room...
BSOD is associated with kernel-level drivers. Kernel-level drivers are not memory-protected like userland apps and games. Programming at the Windows NT kernel-driver level is almost like programming on AmigaOS, i.e. you need to know what you are doing or the whole OS can crash.

Linux has its "kernel panic".
 
Why on earth are they showing PS5 and SX at different resolutions, even stating the PS5 is lower than the Xbox One X?
Disingenuous comparison trying to pander to the few hard-core fanboys who preached for 2 years that XSX will destroy PS5 in comparisons, and yet once the results were in they started their 5 stages of grief. They're in the denial phase. Still a few more stages before acceptance. 👀😅😅
 

rnlval

Member
Different products but the same DNA base from TSMC 7nm. It's a pity AMD cannot get the 6800 to run continuous boost clocks at 2.1 GHz; that would give it a nicer 18.5 TF instead of 16 TF.

I wonder if Phil can make the SX a bit taller and wider, say 35x35x35, slap in a bigger vapor chamber and a bigger 380 W PSU, and push it to variable boost clocks, say 2.1 GHz; that would give it a nicer round 14 TF, and of course a higher tide raises all SX boats too.

But it's early days to call out humiliation imo. :messenger_moon: :messenger_ok:
AIBs have their factory-OC RX 6800 models, or it's another "RX 5700 re-flashed with the RX 5700 XT BIOS" type of scenario.

Along with an RTX 3080 Ti 20 GB AIB OC (doubles as a Blender 3D HW RT rig), I plan to purchase an RX 6800 XT AIB OC for my second gaming PC.

Current RTX 2080 Ti 11 GB OC --> RTX 3080 Ti 20 GB OC
Current RTX 2080 8 GB OC --> RX 6800 XT 16 GB OC


The RX 6800 has the potential since it's the same "Navi 21" ASIC as its higher SKUs.
 
Disingenuous comparison trying to pander to the few hard-core fanboys who preached for 2 years that XSX will destroy PS5 in comparisons, and yet once the results were in they started their 5 stages of grief. They're in the denial phase. Still a few more stages before acceptance. 👀😅😅

Who are GamersGlobal? Are they an Xbox fan channel? I am just asking because I do not know. But it is very fishy that they would make such a disingenuous comparison at a time like this. Again, I am just asking because it is very weird!
 
Who are GamersGlobal, are they an Xbox fan channel? I am just asking because I do not know. But it is very fishy that they would make such a disingenuous comparison at a time like this. Again, I am just asking because it is very weird!
Tim Dog seemed a bit preoccupied the past few days. Hmmmm🤔🤔😅
No one knows them, hence they do shit like this to gain followers from that camp.
 

MCplayer

Member
This:

Performance basically identical

Dynamic resolution counted as low as 1440p on XSX and 1620p on PS5, but he thinks it drops as low as 1440p too. Basically identical

- Both are quite good at keeping 60 FPS (in the area he tested)
- PS5 is vsynced, XSX is not
- Dynamic res on both with ~1440p as the lowest limit (seen on XSX)
- PS5 loads 12% faster LOL Power of the SSD
uhhh ok, thanks but I've seen it already xd
 

geordiemp

Member
The XSX GPU's clock speed is a base clock with boost mode disabled. MS has already stated its reason for disabling boost mode: reducing programming complexity.


BSOD is associated with kernel-level drivers. Kernel-level drivers are not memory-protected like userland apps and games. Programming at the Windows NT kernel-driver level is almost like programming on AmigaOS, i.e. you need to know what you are doing or the whole OS can crash.

Linux has its "kernel panic".

No, wrong again, what are you talking about - boost mode for the CPU lol

SMT is disabled for older games, hence the 3.6 GHz (SMT) and 3.8 GHz (no SMT)

You remind me of misterxmedia.
 

rnlval

Member
No, wrong again, what are you talking about - boost mode for the CPU lol

SMT is disabled for older games, hence the 3.6 GHz (SMT) and 3.8 GHz (no SMT)

You remind me of misterxmedia.
No, you're wrong again. Both PS5 and XSX have their respective fixed-frequency modes. Sony elected to use the boost mode for their PS5.

I'm just paraphrasing MS's argument for fixed frequency mode.

I'm not misterxmedia; your speculation is wrong.
 

geordiemp

Member
No, you're wrong again. Both PS5 and XSX have their respective fixed-frequency modes. Sony elected to use the boost mode for their PS5.

I'm just paraphrasing MS's argument for fixed frequency mode.

What on earth are you on about? PS5 is 3.5 GHz with variable clocks and SmartShift.

RDNA2 also uses fine-gated frequency control across the chip on PC parts; Sony just extended the concept here to include the CPU and made it deterministic based on workload.

Are you trying to say there is a hidden boost in XSX or something else? As I really have no idea what you're on about lol

The results are out, your theories are all wrong, it's too late for made-up stuff.
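To put a concrete sketch on "deterministic based on workload" (my own toy illustration of the idea, not Sony's implementation): the clock is picked from a power model of the current activity mix, not from a temperature sensor, so every console lands on the same clock for the same work. All the numbers below are made up:

```python
# Toy illustration of workload-deterministic clocking: pick the highest clock whose
# *modeled* power for the current activity mix fits the budget. No thermal sensor,
# so every unit makes the same choice for the same workload.
# Activity costs, clock steps and the budget are invented numbers for illustration.

POWER_BUDGET_W = 200.0
CLOCK_STEPS_GHZ = [2.23, 2.10, 2.00, 1.90]  # highest first
COST_AT_MAX_W = {"alu": 120.0, "texture": 50.0, "raster": 40.0, "idle": 20.0}

def modeled_power(activity: dict, ghz: float) -> float:
    # Same toy cube-law as before: power scales ~ (f / f_max)^3.
    scale = (ghz / CLOCK_STEPS_GHZ[0]) ** 3
    return sum(COST_AT_MAX_W[unit] * level for unit, level in activity.items()) * scale

def pick_clock(activity: dict) -> float:
    for ghz in CLOCK_STEPS_GHZ:
        if modeled_power(activity, ghz) <= POWER_BUDGET_W:
            return ghz
    return CLOCK_STEPS_GHZ[-1]

light_scene = {"alu": 0.6, "texture": 0.5, "raster": 0.5, "idle": 1.0}
power_virus = {"alu": 1.0, "texture": 1.0, "raster": 1.0, "idle": 1.0}
print(pick_clock(light_scene))  # 2.23 - stays at the cap
print(pick_clock(power_virus))  # 2.1  - drops a notch to stay inside the budget
```

That's the whole point of the scheme versus a thermally reactive boost: no silicon lottery, no hot-room variance.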
 

longdi

Banned
:messenger_beaming:
AIBs have their factory-OC RX 6800 models, or it's another "RX 5700 re-flashed with the RX 5700 XT BIOS" type of scenario.

Along with an RTX 3080 Ti 20 GB AIB OC (doubles as a Blender 3D HW RT rig), I plan to purchase an RX 6800 XT AIB OC for my second gaming PC.

Current RTX 2080 Ti 11 GB OC --> RTX 3080 Ti 20 GB OC
Current RTX 2080 8 GB OC --> RX 6800 XT 16 GB OC


The RX 6800 has the potential since it's the same "Navi 21" ASIC as its higher SKUs.
I wonder if you can BIOS-unlock the 6800 like in the old days... I got lucky 2 times before :messenger_beaming:

But seeing as AMD these days needs to maintain its bloated share price, I don't think so.
 

rnlval

Member
What on earth are you on about? PS5 is 3.5 GHz with variable clocks and SmartShift.

RDNA2 also uses fine-gated frequency control across the chip on PC parts; Sony just extended the concept here to include the CPU and made it deterministic based on workload.

Are you trying to say there is a hidden boost in XSX or something else? As I really have no idea what you're on about lol

The results are out, your theories are all wrong, it's too late for made-up stuff.
You have forgotten that PS5 has a fixed-frequency mode along with the boost mode configuration.

I recall:
1. MS was able to perform late clock-speed changes on the XBO with a software update.
2. AMD was able to perform late clock-speed changes on the 7950 via the 7950 BE BIOS update, which enabled boost mode.

I'm just paraphrasing MS's argument for fixed frequency mode.

Most AMD APUs' clock speeds can be changed with software.
 

geordiemp

Member
You have forgotten that PS5 has a fixed-frequency mode along with the boost mode configuration.

I recall:
1. MS was able to perform late clock-speed changes on the XBO with a software update.
2. AMD was able to perform late clock-speed changes on the 7950 via the 7950 BE BIOS update, which enabled boost mode.

I'm just paraphrasing MS's argument for fixed frequency mode.

Most AMD APUs' clock speeds can be changed with software.

What a load of rubbish, clock speeds or architecture won't be changed on consoles now.
 

rnlval

Member
What are you on about?

All PC parts in RDNA2 and PS5 use 10 CUs per shader array (this has nothing to do with binning and disabling).

There is a reason for that - go figure it out.
Again, a disabled CU is useless for software.

At full ASIC extent, PS5's SE to CU ratio is the same as the fully enabled NAVI 10 and NAVI 21.
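For anyone keeping score on the shader-array point, here are the CU-per-array counts both sides are arguing from (taken from the public die layouts; totals include the CUs disabled for yield):

```python
# CUs per shader array = physical CUs on the die / number of shader arrays.
parts = {
    # name:                (physical CUs, active CUs, shader arrays)
    "Navi 10 (5700 XT)":   (40, 40, 4),
    "Navi 21 (6900 XT)":   (80, 80, 8),
    "PS5":                 (40, 36, 4),
    "XSX":                 (56, 52, 4),
}

for name, (physical, active, arrays) in parts.items():
    print(f"{name}: {physical // arrays} CUs per array ({active} active of {physical})")
# Navi 10, Navi 21 and PS5 all land on 10 CUs per array; XSX is the outlier at 14.
```

Whether 14 per array actually costs you clock speed is the part being argued; the counts themselves aren't in dispute.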
 

John Wick

Member
I have shown you the Big Navi designs from AMD's website.
Go read up on boost clocks vs game clocks. 🤷‍♀️

How much is a few percent? I don't know. Mark left all of us hanging when it comes to the good stuff.
Let's see the Big Navi results. Let's see AMD's unannounced 6700 series.

Until then, we have to refer to AMD's website on boost vs game clocks, which I screenshotted earlier.
There were crazy fanboi rumors that the 6800 and 6900 run >2 GHz game clocks, but those rumors were destroyed on 28 Oct 2020. :lollipop_bomb:
Exactly, you don't know. Stop trying to be an expert and leave that to actual system designers and game developers. This isn't the PC part; it's been custom designed.
 

rnlval

Member
:messenger_beaming:
I wonder if you can BIOS-unlock the 6800 like in the old days... I got lucky 2 times before :messenger_beaming:

But seeing as AMD these days needs to maintain its bloated share price, I don't think so.
Past examples:

HD 6950 with HD 6970 BIOS re-flash: access to higher clock frequencies, plus a re-activated-CU bonus. I have done this BIOS update.

7950 with 7970 BIOS re-flash: access to higher clock frequencies. I re-flashed my Sapphire HD 7950 810 MHz with the Sapphire HD 7950 900 MHz model's BIOS. I didn't attempt the 7970 BIOS re-flash.

R9 290X AIB OC into R9 390X re-flash: access to slightly higher clock frequencies. I have done this BIOS update.

RX 470 4 GB with RX 480 8 GB BIOS re-flash: access to additional power-curve profiles and higher frequencies, plus a re-activated extra 4 GB VRAM bonus. I skipped the RX 400/RX 500 series.

5700 with 5700 XT BIOS re-flash: access to additional power-curve profiles and higher frequencies. I skipped the RX 5700 series.

The major problem with the GPU BIOS OC method is the VRAM profile, but the RX 6800 has the same 256-bit GDDR6-16000 configuration as the higher RX 6800 XT and 6900 XT SKUs. OC also benefits the 128 MB L3 cache.
 

rnlval

Member
Exactly, you don't know. Stop trying to be an expert and leave that to actual system designers and game developers. This isn't the PC part; it's been custom designed.
The PS4 Linux x86 jailbreak shows its x86 PC underpinnings. The PS4 doesn't have MS ACPI tables, so Linux x86 source code that uses MS's ACPI tables needs to be modified; Sony has its own ACPI config.

Under Linux x86 (aka a Lintel-type distro), the PS4 GPU is treated like any other GCN 1.1 GPU (a slower Hawaii-class GCN at 18 CU scale) and Steam games work on it, but the CPU is slow, i.e. nearly useless for modern desktop apps.

A modern Windows NT type OS wouldn't work without a compliant MS ACPI config embedded in the BIOS (refer to the Designed for Windows logos).

PS5's 8-core Zen 2 CPU + 16 GB memory + RDNA 2 based GPU at 36 CU scale + UHD Blu-ray drive... would be useful as a cheap full Linux desktop with LibreOffice + Steam Linux games, etc.

Jailbreaking the DRM is the major barrier to installing a Lintel distro on PS5.
 

Thirty7ven

Banned
Why on earth are they showing PS5 and SX at different resolutions, even stating the PS5 is lower than the Xbox One X?

Because they are either using wrong information that was speculated before, or mixing up Watch Dogs with Valhalla on PS5.

They aren't even trying to do an analysis in the video; it's very low effort and unfortunately it's spreading false information.
 

geordiemp

Member
You guys are setting yourselves up for major embarrassment when the table inevitably flips. I will enjoy watching your meltdowns soon.

I won't have any meltdown. I said 3-4 months ago they would be similar, with a few percent either way based on workload and engine, after seeing the XSX Hot Chips presentation and getting an idea of what PS5 is from the patents and Cerny.

So nothing has changed IMO.

Will XSX run some stuff or modes better? Yes, of course, as slow and wide will benefit some workloads - maybe 4K instead of 1440p, or whatever workload feels more comfortable on XSX...

But it will be minimal either way; there is no 17% difference, is my point (for the simplistic TF believers), which is a win for Cerny doing what he did with 36 CUs and a smaller chip.

No fan of PS5 was ranting about power, we're just blowing off steam after having to listen to the TF power rants for 6 months.
 