
Intel Core i9-12900K Alder Lake CPU "Destroys" AMD Ryzen 9 5950X In Single-Core & Multi-Threaded Benchmark Leak

Biggest change in W11 is "Intel Thread Director", as if the old Windows schedulers weren't bad enough.
Works wonders for Alder Lake, see:





But it fucks with other CPUs, ESPECIALLY AMD CPUs, crippling one of Ryzen's biggest strengths: the L3 cache. One example:

Read from 898 GB/s to 136 GB/s (-85%)
Write from 565 GB/s to 51 GB/s (-91%)
Copy from 701 GB/s to 66 GB/s (-91%)
Latency from 11.2 ns to 32.1 ns (+187%)

This happens even on Ryzens with just one CCD. This is totally Microsoft's fault, with Intel's blessing (it arrives just in time for both Alder Lake and Ryzen 3D!).
What can AMD do? Ask Intel for the source code "to fix" it?
The solution here will be Intel releasing an update that forces W11 not to use this Thread Disruptor with other CPUs.
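For anyone curious how latency figures like the ones above are produced: below is a minimal pointer-chasing sketch in C, in the spirit of (though not identical to) what tools like AIDA64 measure. The buffer size, hop count, and the assumption that a ~4 MB working set sits in L3 are illustration values, not anyone's actual test parameters.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Rough cache-latency sketch: chase pointers through a randomly permuted
 * buffer so every load depends on the previous one, then divide total time
 * by the number of hops. Not AIDA64's actual method. */
#define N (4u * 1024u * 1024u / sizeof(size_t))   /* ~4 MB working set, assumed to fit in L3 */

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    for (size_t i = 0; i < N; i++) next[i] = i;
    /* Sattolo's algorithm: builds a single random cycle, which defeats the
     * hardware prefetcher and guarantees the chase visits every element. */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }
    const size_t hops = 100u * 1000u * 1000u;
    size_t idx = 0;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < hops; i++) idx = next[idx];   /* dependent loads */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("~%.1f ns per access (idx=%zu)\n", ns / hops, idx);  /* printing idx keeps the loop alive */
    free(next);
    return 0;
}
```

Run it pinned to one core and with the buffer sized to land in the cache level you care about; a jump from ~11 ns to ~32 ns in a tool like this is the kind of regression the scheduler bug produces.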
 
Last edited:

Chiggs

Gold Member

Intel Core i9-12900K (Alder Lake) Versus Ryzen 9 5900X (Zen 3)

Several of the leaked benchmarks we've seen so far indicate that Alder Lake will give even AMD's top Zen 3 CPU (Ryzen 9 5950X) a run for its money, depending on the workload. SANDRA tells a different story. Looking at the same graph above, here is how the Core i9-12900K compares to AMD's second-best Zen 3 chip, the Ryzen 9 5900X...

  • Quad Float: 53.07 (Alder Lake) / 49.47 (Zen 3) = Zen 3 by 7.3 percent
  • Double Float: 1,097.06 (Alder Lake) / 1,190 (Zen 3) = Zen 3 by 8.5 percent
  • Single Float: 1,958.74 (Alder Lake) / 2,000 (Zen 3) = Zen 3 by 2.1 percent
  • Quad Integer: 131.13 (Alder Lake) / 157 (Zen 3) = Zen 3 by 19.7 percent
  • Long: 691.90 (Alder Lake) / 805 (Zen 3) = Zen 3 by 16.3 percent
  • Integer: 1,691.13 (Alder Lake) / 2,000 (Zen 3) = Zen 3 by 18.3 percent
These are clear victories for Zen 3 over Alder Lake, half of them by double-digit margins. The biggest win is by nearly 20 percent (Quad Integer).
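As a quick sanity check on the percentages quoted above, they follow a simple (higher - lower) / lower calculation; for example, for Double Float and Quad Integer:

$$\frac{1190 - 1097.06}{1097.06} \approx 8.5\%, \qquad \frac{157 - 131.13}{131.13} \approx 19.7\%$$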

By the way, the Ryzen 9 5900X is a 12-core/24-thread CPU with a 3.7GHz base clock, 4.8GHz max boost clock, and 64MB of L3 cache. It is roughly comparable to the Core i9-12900K in core and thread counts, though obviously they are two very different architectures.

This is an interesting comparison for sure. Like every other leak, we'll have to wait for Alder Lake to arrive before knowing whether these results stand up to post-launch testing by reviewers and the public at large. Fun times are ahead.

Time to pump the brakes on the hype train?
 
Last edited:

SantaC

Member
Biggest change in W11 is "Intel Thread Director", as if the old Windows schedulers weren't bad enough.
Works wonders for Alder Lake, see:





But it fucks with other CPUs, ESPECIALLY AMD CPUs, crippling one of Ryzen's biggest strengths: the L3 cache. One example:

Read from 898 GB/s to 136 GB/s (-85%)
Write from 565 GB/s to 51 GB/s (-91%)
Copy from 701 GB/s to 66 GB/s (-91%)
Latency from 11.2 ns to 32.1 ns (+187%)

This happens even on Ryzens with just one CCD. This is totally Microsoft's fault, with Intel's blessing (it arrives just in time for both Alder Lake and Ryzen 3D!).
What can AMD do? Ask Intel for the source code "to fix" it?
The solution here will be Intel releasing an update that forces W11 not to use this Thread Disruptor with other CPUs.

The issue has already been fixed and is rolling out
 

STARSBarry

Gold Member
OK... so here's the thing: with that power draw, what are we looking at to cool it? I don't mean in an "oh it's fine, just stick a block on it and we're done" sense. I'm talking about how fast the fans will have to spin to move that heat off the CPU, because unless you're using a custom loop, all I can imagine at that draw is effectively a jet engine.

More power = more heat, a lot more in the more recent gens. Even triple-fan 360mm AIOs had to ramp up last gen at higher frequencies due to that generation's draw, so what are we going to be looking at now?
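To put a rough number on it (the figures here are illustrative assumptions, not measurements): treating the cooler as a simple thermal resistance of roughly 0.1 °C/W liquid-to-ambient for a decent 360mm AIO at high fan speed, with a 25 °C ambient and a 330 W package draw,

$$T \approx T_{\text{ambient}} + P \cdot R_{th} \approx 25\,^{\circ}\mathrm{C} + 330\,\mathrm{W} \times 0.1\,^{\circ}\mathrm{C/W} \approx 58\,^{\circ}\mathrm{C}$$

and that's only the liquid side; the die-to-coldplate path adds a comparable resistance at these wattages, which is why sustained loads in that range push even 360mm AIOs toward full fan speed.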
 
Last edited:


Time to pump the brakes on the hype train?
I can't wait to know the truth. I think Intel will kick ass with these new CPUs. If they are barely able to beat Zen 3 Ryzens, or are even outclassed by them, it will be a huuuuge disappointment.

Edit: someone in the comments said the E-cores are not used in this test at all; don't know if that's true. If it is, AMD is in trouble.
 
Last edited:

Papacheeks

Banned
I can't wait to know the truth. I think Intel will kick ass with these new CPUs. If they are barely able to beat Zen 3 Ryzens, or are even outclassed by them, it will be a huuuuge disappointment.

Edit: someone in the comments said the E-cores are not used in this test at all; don't know if that's true. If it is, AMD is in trouble.
They will be fine.
 

GreatnessRD

Member
OK... so here's the thing: with that power draw, what are we looking at to cool it? I don't mean in an "oh it's fine, just stick a block on it and we're done" sense. I'm talking about how fast the fans will have to spin to move that heat off the CPU, because unless you're using a custom loop, all I can imagine at that draw is effectively a jet engine.

More power = more heat, a lot more in the more recent gens. Even triple-fan 360mm AIOs had to ramp up last gen at higher frequencies due to that generation's draw, so what are we going to be looking at now?
At this point we'll have to look at LN2, with these power draws and the talk of the next set of GPUs pulling up to 600W. :messenger_tears_of_joy:
 

PhoenixTank

Member
Just Google "AMD Win 11 fix" and you'll find tons of articles showing that both MS and AMD know of the issue and have promised a fix in October
Appreciated, and you're right, but none of the links I've seen say it is already fixed or that the solution is already rolling out.
 

CuNi

Member
Just Google "AMD Win 11 fix" and you'll find tons of articles showing that both MS and AMD know of the issue and have promised a fix in October

As PhoenixTank said, acknowledging the issue and "rolling out a fix" are two vastly different things.

They plan on releasing a fix in October, but that doesn't mean it will come in October. Could be November, could be December, could be any time in 2022, or never.

So far it has only been officially acknowledged that it is indeed a bug and that the involved parties will look into a way to fix it. Nothing more, nothing less.
 

ToTTenTranz

Banned
Welp...Intel definitely too greedy

I don't think Alder Lake consuming a lot of power is greediness.

They don't have a product that can compete in power efficiency, so they're clocking their big cores well past their ideal power/performance curve in order to compete on performance. AMD did exactly that with Vega and Polaris (RX 4xx/5xx) against Nvidia's Pascal.

That's not a huge problem when dealing with tower desktops IMO, but it doesn't look like Alder Lake will be able to compete very well against AMD's Cezanne (Zen3 + Vega), and most likely Rembrandt (Zen3 + RDNA2), in the laptop market.
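For a rough sense of why running past the sweet spot is so expensive (illustrative numbers, not either vendor's actual V/F curve): dynamic power scales roughly with frequency and the square of voltage,

$$P_{\text{dyn}} \approx C \cdot V^{2} \cdot f$$

so a 15% higher clock that needs about 10% more voltage costs roughly $1.15 \times 1.10^{2} \approx 1.39$, i.e. close to 40% more power for 15% more performance, which is the trade being described above.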
 

FireFly

Member
I don't think Alder Lake consuming a lot of power is greediness.

They don't have a product that can compete in power efficiency, so they're clocking their big cores well past their ideal power/performance curve in order to compete on performance. AMD did exactly that with Vega and Polaris (RX 4xx/5xx) against Nvidia's Pascal.

That's not a huge problem when dealing with tower desktops IMO, but it doesn't look like Alder Lake will be able to compete very well against AMD's Cezanne (Zen3 + Vega), and most likely Rembrandt (Zen3 + RDNA2), in the laptop market.
According to Intel, the E-cores consume 80% less power than Skylake at the same performance level, comparing 4C/4T to 2C/4T.


So they may be able to compete on power efficiency, just not on power efficiency and absolute performance at the same time.
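Taking the quoted claim at face value, the same score at 20% of the power works out to roughly a 5x perf/W advantage for that configuration; with S as a hypothetical benchmark score and P as the Skylake 2C/4T power draw:

$$\frac{S / (0.2P)}{S / P} = 5$$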
 

ToTTenTranz

Banned
According to Intel, the E-cores consume 80% less power than Skylake at the same performance level, comparing 4C/4T to 2C/4T.


So they may be able to compete on power efficiency, just not on power efficiency and absolute performance at the same time.
The LITTLE/E-core is an Atom core. It's fine for light tasks (e.g. office & e-mail, light web browsing, watching YouTube, etc.), but once the system loads up a game it'll need to use the big P-cores. That's when the power constraints of a laptop come into play, and where Alder Lake's P-cores will need to run at much lower clocks than their desktop counterparts.
Zen 3's APUs (with less L3) seem to use around 3W per core when clocked at 3-3.5GHz, and it seems the Golden Cove P-cores can't get anywhere near that level of power efficiency.
 

FireFly

Member
The LITTLE/E-core is an Atom core. It's fine for light tasks (e.g. office & e-mail, light web browsing, watching YouTube, etc.), but once the system loads up a game it'll need to use the big P-cores. That's when the power constraints of a laptop come into play, and where Alder Lake's P-cores will need to run at much lower clocks than their desktop counterparts.
Zen 3's APUs (with less L3) seem to use around 3W per core when clocked at 3-3.5GHz, and it seems the Golden Cove P-cores can't get anywhere near that level of power efficiency.
Well, it's an "Atom" core with 8% better single-threaded performance than Skylake (!) that can deliver up to 80% better MT performance comparing 4C/4T with 2C/4T. And Skylake-level performance is already adequate for gaming, so I would be curious to see how the versions of Alder Lake with fewer P-cores do.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


I'm just hoping there's gonna be a K version of the leaked i5 12400 (6+0 configuration), as it'll easily be the best bang for the buck, by a huge margin.

What do K chips bring to the table these days?

Back in the Sandy Bridge days, when you went from 3.3GHz to 5.0GHz, that was worth the price of admission alone.

But today K chips barely boost over the stock chips, and even with the added power/overclock potential the gains are near negligible.

I highly doubt there will be an xx400K.
We haven't had K 400 chips before and haven't needed them, because they do nearly everything the 5/600K chips can do at a bargain price.
 

Xyphie

Member
The most important aspect of "overclocking" for gaming today is basically being able to adjust RAM timings/frequency to decrease RAM latency, and being able to increase CPU power limits so the CPU can boost longer/indefinitely. As long as you can do that, the difference between running your CPU at 4.5 or 5GHz is pretty trivial.
 

ZywyPL

Banned
What do K chips bring to the table these days?

Back in the Sandy Bridge days, when you went from 3.3GHz to 5.0GHz, that was worth the price of admission alone.

But today K chips barely boost over the stock chips, and even with the added power/overclock potential the gains are near negligible.

I highly doubt there will be an xx400K.
We haven't had K 400 chips before and haven't needed them, because they do nearly everything the 5/600K chips can do at a bargain price.

Yeah, in high-end models you're able to add maybe 100-200MHz over stock and that's it, but in lower-tier models like the leaked one with just a 4GHz clock, that extra 1GHz would make a nice difference. There's gonna be a 12600K as per the leaks, but that's a 6+4 configuration and I'm not really interested in those Atom cores at all; I'd prefer an 8+0 i7. But who knows, maybe there will be enough faulty leftovers from production that Intel will offer such a CPU with the Atom cores disabled later on.
 

Xyphie

Member
I suspect 8+0, 6+0, 4+0 will exist, but only as Xeon E SKUs, maybe with AVX-512 enabled, as they wouldn't need instruction parity anymore.
 

ZywyPL

Banned


Reportedly 330W with OC, so just in time for the winter season.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


Reportedly 330W with OC, so just in time for the winter season.

Higher than the 10900K when only overclocking the big cores?

Noctua stock about to skyrocket once everyone realizes they're the bare minimum to keep Alder Lake within reason.
Hope NZXT and Arctic have new baseplates ready for their AIOs.

If Intel can price these things correctly and NOT make the motherboard useful for only one generation,
I'll be back on team Intel come November.
 

KAL2006

Banned
How does the power consumption of this compare to the 5800X? I'm just about to build a 3070 Ti PC, but the 3070 Ti is already power hungry, so I'm not sure about adding even more power-hungry parts, especially with the cost of electricity going up in the UK.
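For a rough sense of scale on the electricity angle (every number below is an assumption for illustration, not a measurement): if the CPU choice added, say, 50 W of average draw while gaming, at 3 hours a day and around £0.20 per kWh,

$$0.05\,\mathrm{kW} \times 3\,\mathrm{h/day} \times 365 \approx 55\,\mathrm{kWh/year} \approx £11\ \mathrm{per\ year}$$

so the running-cost difference is real but small next to the price gap between the chips themselves.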
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
How does the power consumption of this compare to the 5800X? I'm just about to build a 3070 Ti PC, but the 3070 Ti is already power hungry, so I'm not sure about adding even more power-hungry parts, especially with the cost of electricity going up in the UK.
You want to get a 12900K... it fights with the 5950X and 5900X, and the price will show that.
The competish for the 5800X is the i7s.

Intel generally eats more power than AMD at full tilt.
But for gaming they will be pretty close to each other.
The CPU is never truly at full tilt while gaming (or at least for now there aren't really games that stress a CPU the way AIDA, Prime95 or the other "stress" tests benchmarkers use to get absolute peaks do).
[chart: power-gaming.png]


My advice: wait for the 12600, or better yet the 12400. That still leaves you with an upgrade path if that CPU is somehow a bottleneck for you.

On/Off Topic
The 12400 looks like it's gonna make the 400 series king of the hill for gamers again.
Overclocking is dead, memory gains are useless.
Long live budget Intel CPUs:
[image: wzTH4jrNJKIo5eIe.jpg]

^The gen-on-gen gains Intel seems to have made are actually pretty insane.
Here's to hoping the reviews are good... the leaked prices already have me scratching my head.
Intel, you have this shit in the bag... just price your chips at Ryzen level or a few dollars above. Don't bump the price up an entire tier, else I'll be sticking with 10th gen till it's a legit bottleneck... in 2030.
 

KAL2006

Banned
Yeah, I'm definitely going to go Intel 12th gen, as the chip will come out in the next month, along with a motherboard that supports PCIe 5.0. I can't seem to find any news of these 5.0 motherboards coming out soon, but that's my ideal purchase so my PC is future-proof in terms of CPU and motherboard. In 5 years' time hopefully I'd just need a GPU and RAM upgrade.
 

lils

Member
Yeah, I'm definitely going to go Intel 12th gen, as the chip will come out in the next month, along with a motherboard that supports PCIe 5.0. I can't seem to find any news of these 5.0 motherboards coming out soon, but that's my ideal purchase so my PC is future-proof in terms of CPU and motherboard. In 5 years' time hopefully I'd just need a GPU and RAM upgrade.

Wait for independent benchmarks before making a decision. Current info is based on "leaks", which I suspect Intel has put out deliberately to hype up their product.

Intel has a horrible reputation for manipulating benchmarks to make themselves look good:

 
Last edited:

KAL2006

Banned
Wait for independent benchmarks before making a decision. Current info is based on "leaks", which I suspect Intel has put out deliberately to hype up their product.

Intel has a horrible reputation for manipulating benchmarks to make themselves look good:



Yeah, I'm in no rush; I'm making my build around/after Black Friday, so plenty of time for reviews and previews.
 
The other day the CEO said that AMD was done, that Intel was back.
But yesterday on the financial results call he said that Intel expects to achieve "performance per watt parity in 2024 and leadership in 2025".
Maybe he was talking about Apple?
 

winjer

Gold Member
The other day the CEO said that AMD was done, that Intel was back.
But yesterday on the financial results call he said that Intel expects to achieve "performance per watt parity in 2024 and leadership in 2025".
Maybe he was talking about Apple?

Can you post the source?
I would like to read up on it.
 

Chiggs

Gold Member
Last edited:

Chiggs

Gold Member
On/Off Topic
The 12400 looks like it's gonna make the 400 series king of the hill for gamers again.
Overclocking is dead, memory gains are useless.
Long live budget Intel CPUs:
[image: wzTH4jrNJKIo5eIe.jpg]

^The gen-on-gen gains Intel seems to have made are actually pretty insane.
Here's to hoping the reviews are good... the leaked prices already have me scratching my head.
Intel, you have this shit in the bag... just price your chips at Ryzen level or a few dollars above. Don't bump the price up an entire tier, else I'll be sticking with 10th gen till it's a legit bottleneck... in 2030.

That's gonna get creamed by the v-cache 5000 series in early 2022.
 

Kenpachii

Member
You want to get a 12900K... it fights with the 5950X and 5900X, and the price will show that.
The competish for the 5800X is the i7s.

Intel generally eats more power than AMD at full tilt.
But for gaming they will be pretty close to each other.
The CPU is never truly at full tilt while gaming (or at least for now there aren't really games that stress a CPU the way AIDA, Prime95 or the other "stress" tests benchmarkers use to get absolute peaks do).
[chart: power-gaming.png]


My advice: wait for the 12600, or better yet the 12400. That still leaves you with an upgrade path if that CPU is somehow a bottleneck for you.

On/Off Topic
The 12400 looks like it's gonna make the 400 series king of the hill for gamers again.
Overclocking is dead, memory gains are useless.
Long live budget Intel CPUs:
[image: wzTH4jrNJKIo5eIe.jpg]

^The gen-on-gen gains Intel seems to have made are actually pretty insane.
Here's to hoping the reviews are good... the leaked prices already have me scratching my head.
Intel, you have this shit in the bag... just price your chips at Ryzen level or a few dollars above. Don't bump the price up an entire tier, else I'll be sticking with 10th gen till it's a legit bottleneck... in 2030.

That's a mighty big jump forward, 768. That's about 50% faster than the 9900K's single-core performance. This is getting tempting, especially for RT solutions.
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
You want to get a 12900K... it fights with the 5950X and 5900X, and the price will show that.
The competish for the 5800X is the i7s.

Intel generally eats more power than AMD at full tilt.
But for gaming they will be pretty close to each other.
The CPU is never truly at full tilt while gaming (or at least for now there aren't really games that stress a CPU the way AIDA, Prime95 or the other "stress" tests benchmarkers use to get absolute peaks do).
[chart: power-gaming.png]
Thanks, this is the kind of comparison I’m really interested in. Most sites just measure the “all cores full load” peak power consumption + idle consumption but that doesn’t give you a realistic idea of the power/heat you will be dealing with while gaming.

I’m especially interested in what this will look like with efficiency cores in the mix. Wondering if we might even see gaming scenarios where Alder Lake average wattage is lower than Zen 3 even as peak wattage is higher.
 

Kenpachii

Member
Thanks, this is the kind of comparison I’m really interested in. Most sites just measure the “all cores full load” peak power consumption + idle consumption but that doesn’t give you a realistic idea of the power/heat you will be dealing with while gaming.

I’m especially interested in what this will look like with efficiency cores in the mix. Wondering if we might even see gaming scenarios where Alder Lake average wattage is lower than Zen 3 even as peak wattage is higher.

Most of the cores aren't used, and the ones that are sit at only around 50% load most of the time in your average game, so that's the most relevant data. However, overclockers will most likely want to run those 8 big cores at top speed for gaming.

I would not be shocked if people disable the smaller cores just to push the bigger cores' performance.

So to see the actual gaming performance, we need to see it OC'd with those small cores disabled.
 
Last edited:
Looks like Alder Lake is now going to be power efficient :lollipop_smiling_face_eyes:



But today it was confirmed: Intel made the PL1 state obsolete and set it equal to PL2 at 210W.
Intel isn't trusting its own "efficiency improvement" claims; they're giving it as much power as the boards can deliver to get on top.
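For context on what PL1 = PL2 means in practice, here's a minimal sketch of how the two limits interact. It's a simplified model, not Intel's actual turbo algorithm, and the 125 W / 56 s figures are just illustrative defaults borrowed from earlier generations:

```c
#include <stdio.h>

/* Simplified model of Intel package power limits: the CPU may draw up to PL2
 * for roughly "tau" seconds after a load starts, then settles back to PL1.
 * If PL1 is set equal to PL2, that settle step never lowers the power, so the
 * chip can hold its peak draw indefinitely. */
static double allowed_watts(double seconds_under_load,
                            double pl1, double pl2, double tau) {
    return (seconds_under_load < tau) ? pl2 : pl1;
}

int main(void) {
    /* Classic behavior: 125 W sustained, 210 W burst for ~56 s. */
    printf("classic:  %gW after 10s, %gW after 120s\n",
           allowed_watts(10, 125, 210, 56), allowed_watts(120, 125, 210, 56));
    /* PL1 raised to PL2: sustained == burst == 210 W. */
    printf("PL1=PL2:  %gW after 10s, %gW after 120s\n",
           allowed_watts(10, 210, 210, 56), allowed_watts(120, 210, 210, 56));
    return 0;
}
```

In other words, "making PL1 obsolete" means the sustained limit and the burst limit are the same number, so there's no fall-back to a lower long-term power draw.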
 