
AMD-supplied review guide shows the 7950X3D being barely faster than the 13900K; the 13900K is ~10% faster than the 7950X (RTX 4090, 1080p).

Rentahamster

Rodent Whores
It was true for the 10900K generation, unless you believe everyone was buying 2933 RAM or lower, in 2020...
Just because someone is arguing against your unrealistic depiction of the extreme high end, doesn't mean they are arguing for your unrealistic depiction of the extreme low end. Be reasonable.
 
You're wasting your time. He's butthurt because they didn't use 7200 MT/s memory for the 13900K and makes stupid claims that no one pairs a 13900K with 6000 MT/s memory because it "gimps it", when the difference at 1080p according to HU is 3%. At 4K it would probably be less than 1 or 2%, which makes buying 7200 MT/s memory a waste of money.
I mean, if 3% is nothing... then 6% is also nothing.

That said, 300-watt CPUs should be laughed at. Intel fails again.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Does Techspot not count? They tested more games and found scenarios where RAM does make a difference...

Why is "best" in quotes? Best value? Or actual best?

7200C34 is beating 6400C32 in the review you chose (you left out the Metro and Far Cry results).
I must have missed the article. Where did you post it??
Link it again and let's see the massive gains 7000+ is making over low-latency 6000.

Best in quotes because it's the best value, and in gaming the lower-latency kit will perform equal to or better than the higher-latency, higher-speed kits... which in effect makes them the best outright.

I literally mentioned Far Cry in my post... did you want me to link every single image?
 
I watch MLID on YouTube (I'm sure Leonidas is a big fan of the channel and subs too), and when he did a video about 7000-series X3D performance, the comments were overwhelmingly about it being a total slam dunk over the 13900K and KS.

I was actually one to say that wasn't gonna be the case, and that they would be close to each other in gaming performance (7950X vs 13900KS). Where it would be a massacre would be in efficiency, where Intel's Frankenstein 13900K will get absolutely embarrassed.

After all, one CPU drawing ~100 W less while offering ~5% better gaming performance is the deal-sealer for most people, and far more important than a hypothetical where one CPU is 10% (instead of 5%) faster but draws the same power.

The inconvenience (heat, cooling requirements, fan noise, energy cost) of the 13900K, let alone that of the 13900KS, makes it a total dud in comparison. Like, you couldn't even pay me to put up with the above issues, caused by Intel basically pushing their archaic design way, way past its optimal power curve.
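To put a rough number on the energy-cost part (the gaming hours and the electricity price below are purely illustrative assumptions, not measurements):

```python
# Rough yearly cost of one CPU drawing ~100 W more while gaming.
# HOURS_PER_WEEK and PRICE_PER_KWH are illustrative assumptions.
EXTRA_WATTS = 100
HOURS_PER_WEEK = 20      # assumed gaming time
PRICE_PER_KWH = 0.30     # assumed electricity price, $/kWh

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_WEEK * 52
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, "
      f"about ${extra_kwh_per_year * PRICE_PER_KWH:.0f}/year at the assumed rate")
# ~104 kWh/year, ~$31/year with these numbers; heat, cooling, and fan noise come on top.
```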
 

LiquidMetal14

hide your water-based mammals
AMD is throwing in free RAM, though, with at least their 7000X series right now, and probably the new ones as well. That changes the cost structure.

If you have a Microcenter near you, you could get quite a good price on the overall bundle once these are released.
Link to this promo, son. I saw the one for Jedi Survivor and I changed my mind; I'm going for the 7950X3D. I'm just gonna spend the extra $100 and feel less guilty later. I have DDR5-6000 memory coming tomorrow, but if I can get the same or better for free, I'm in like a fart in church.
 

Leonidas

Member
I must have missed the article you posted it??
Link it again and lets see the massive gains 7000+ is making over low latency 6000.

The second place graph is DDR5 6400C32


Then there is this. Basically the same margins, 6000 low latency helps AMD the most, but it is not the best choice for Intel...




Best in quotes because its the best value and in gaming the lower latency kit will perform equal or better than the higher latency higher speed kits....which in effect makes them the best outright.

I literally mentioned Far Cry in my post....did you want me to link every single image?
But your review shows 7200C34 faster than 6400C32 overall, though some results seemed bound in other ways... a small mention isn't enough. I'd rather you linked the article so I wouldn't have to take 5 seconds to search it.
 

StereoVsn

Member
Link to this promo, son. I saw the one for Jedi Survivor and I changed my mind; I'm going for the 7950X3D. I'm just gonna spend the extra $100 and feel less guilty later. I have DDR5-6000 memory coming tomorrow, but if I can get the same or better for free, I'm in like a fart in church.
Currently the X3D chips are not for sale, but if you go to Microcenter, you can see the free RAM offer for 8- to 16-core 7000X CPUs.
 

Celcius

°Temp. member
I know the 13900K runs hot and would likely need a 360mm AIO, but with the 7950X3D using so much less wattage, would it be fine with a Noctua NH-D14? Or would that not be so great for that CPU either if pushing it with all cores running?
 

LiquidMetal14

hide your water-based mammals
I know the 13900K runs hot and would likely need a 360mm AIO, but with the 7950X3D using so much less wattage, would it be fine with a Noctua NH-D14? Or would that not be so great for that CPU either if pushing it with all cores running?
I would imagine so, but I'm not taking any chances. I've got the Arctic Liquid Freezer II 360mm and am also doing push/pull.
 

//DEVIL//

Member
I know the 13900K runs hot and would likely need a 360mm AIO, but with the 7950X3D using so much less wattage, would it be fine with a Noctua NH-D14? Or would that not be so great for that CPU either if pushing it with all cores running?
No, it won't, and it will still run hot, btw. Just wait for the reviews and see.

The i9 runs hot when you're running benchmarks that use all cores. You don't use all cores when gaming, so it stays cooler by default.
 

64bitmodels

Reverse groomer.
Without the 3D V-Cache, AMD themselves were saying they were ~10% slower than the 13900K. Unless AMD starts putting 3D V-Cache in from the start, they will have to do a lot of work to outperform them next time (unless Intel stagnates again...).


They've got a literally revolutionary technology on their hands, yet they're still so careful with it and shit.
If 3D V-Cache caused an improvement that great with the 5800X3D, they should have immediately started developing AM5 and Ryzen 7000 with 3D V-Cache in mind.
 

DaGwaphics

Member
^ Maybe they figure they would have a hard time competing on price with it, so they lead with the standard models. Or maybe they'd have a harder time with volume production, so they need a percentage of the chips to not have it.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not

The second place graph is DDR5 6400C32

Mate, are you fucking slow or something?
That's a 3% difference.
Something everyone in this thread has been saying.
That's not a gotcha moment or anything; it's literally everyone's point.
You are acting like higher-clocked memory transforms the 13900K into a new beast... and that fools NOT using 7200 kits are wasting their CPU. None of that is true... the difference is margin of error, literally not noticeable to anyone.

7000+ high-latency vs 6000-6400 lower-latency won't yield you much better results... if better at all.
And the low-latency ~6000 kit will be cheaper than even a CL34 7200 kit, let alone a CL32 7200.


Then there is this. Basically the same margins, 6000 low latency helps AMD the most, but it is not the best choice for Intel...


And where exactly am I gonna get a CL32 7200 kit for anywhere near the price of a CL30 6000... to gain that glorious 3% that is "gimping" the 13900K?

But your review shows 7200C34 faster than 6400C32 overall, though some results seemed bound in other ways.

Of course it's bound in other ways.


So as of right now, there is no test showing us this mega advantage you are talking about.
And all our knowledge about RAM is clearly wrong, amirite?
Speed + latency yields gaming performance. (This is known; yes, you can trade between the two: up the speed to make up for high latency, or tighten timings and lower the latency to make up for a lower speed.)
Of course a low-latency and higher-speed kit will beat the next kit with higher latency and a lower speed... but right now we are looking at what's generally available and actually compatible.
Currently, 7000+ kits are generally higher latency for a lot more money than kits that outperform them with lower speeds and much lower latencies.
CL38 is the most common 7200 RAM kit you can get... it's basically twice the price of the CL30 6000 kit.

When you do the calculations:
A 7200CL36 kit and a 6000CL30 kit will have the same gaming performance.
And to make sure you don't think I'm just shitting on high-speed kits for the fuck of it:
A 6000CL30 kit and a 5600CL28 kit will have the same gaming performance.

However, a fundamental difference between these two comparisons is that the 6000CL30 and 5600CL28 are very close in price.
The 7200CL36 kit is near double the price.
Why spend more to NOT really gain anything?
If your motherboard and chips are up to the task of doing 7200CL30, then jump on that, no doubt.
But in the current real world there is no massive gaming advantage to that higher-latency, higher-speed kit.
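For anyone who wants the arithmetic behind those pairings: first-word (CAS) latency in nanoseconds is CL × 2000 / (data rate in MT/s). A minimal sketch, just running that formula over the kits named above:

```python
# First-word (CAS) latency of a DDR kit in nanoseconds.
# DDR does two transfers per clock, so one clock takes 2000 / (MT/s) ns,
# and latency_ns = CL * 2000 / data_rate.
def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    return cl * 2000 / data_rate_mts

for rate, cl in [(7200, 36), (6000, 30), (5600, 28)]:
    print(f"{rate} CL{cl}: {cas_latency_ns(rate, cl):.1f} ns")
# All three come out to 10.0 ns, which is the point being made: the pricier
# 7200CL36 kit buys no first-word latency advantage over 6000CL30 or 5600CL28.
```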
 

Leonidas

Member
Mate, are you fucking slow or something?
That's a 3% difference.
Something everyone in this thread has been saying.
That's not a gotcha moment or anything; it's literally everyone's point.
You are acting like higher-clocked memory transforms the 13900K into a new beast... and that fools NOT using 7200 kits are wasting their CPU. None of that is true... the difference is margin of error, literally not noticeable to anyone.
I've been saying it too, in this thread. I'm not moving the goalposts. It's 3% faster on average, cutting AMD's guided claims in half.
How am I acting like it transforms the 13900K into a beast when even I said it's 3% multiple times now in this thread?
When you do the calculations:
A 7200CL36 kit and a 6000CL30 kit will have the same gaming performance.
And to make sure you don't think I'm just shitting on high-speed kits for the fuck of it:
A 6000CL30 kit and a 5600CL28 kit will have the same gaming performance.
Using your logic, 4800C24 will have the same performance as 7200C36.
Using your logic, 3200C16 will have the same performance as 7200C36.

That's simply not the case. Some games benefit from the extra frequency.
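The flip side of the same arithmetic: 4800C24, 3200C16, and 7200C36 do all land on 10 ns first-word latency, but peak bandwidth scales with the data rate, which is where the faster kits can still pull ahead. A rough sketch, assuming an ordinary dual-channel setup (8 bytes per transfer per channel):

```python
# Peak theoretical memory bandwidth: 8 bytes per transfer per 64-bit channel.
def peak_bandwidth_gbs(data_rate_mts: int, channels: int = 2) -> float:
    return data_rate_mts * 8 * channels / 1000  # GB/s

for rate in (3200, 4800, 6000, 7200):
    print(f"DDR-{rate}: {peak_bandwidth_gbs(rate):.1f} GB/s dual-channel peak")
# 3200 -> 51.2, 4800 -> 76.8, 6000 -> 96.0, 7200 -> 115.2 GB/s.
# Same CAS nanoseconds, very different throughput; that's why some games
# still pick up frames from the higher data rate.
```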

However, a fundamental difference between these two comparisons is that the 6000CL30 and 5600CL28 are very close in price.
The 7200CL36 kit is near double the price.
Why spend more to NOT really gain anything?
If your motherboard and chips are up to the task of doing 7200CL30, then jump on that, no doubt.
But in the current real world there is no massive gaming advantage to that higher-latency, higher-speed kit.
7200 is expensive, but 6800C34 was $200 when I checked yesterday, $50 more than 6000C30. And it will perform better in games that can use the extra frequency.
 

Gaiff

SBI’s Resident Gaslighter
I mean, if 3% is nothing... then 6% is also nothing.

That said, 300-watt CPUs should be laughed at. Intel fails again.
6% isn't much. I usually draw the line at 5%. I don't mind his claim that they should test the 13900K under its best possible configuration. I take issue with his claim that "no one pairs a 13900K with 6000 MT/s memory" because 7200 MT/s with higher latency is 1% faster at 4K and 3% at 1080p.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Intel needing 50% more power to be slightly slower is kinda catastrophic if you think about it:

https://videocardz.com/newz/amd-ryz...-core-i9-13900k-a-summary-of-16-reviews-shows
Honestly, anyone spending over $350 on a CPU for a pure gaming rig deserves their electricity bill to be high.

  • 7700X
  • 5800X3D
  • 13600K

Are the most expensive CPUs anyone should even consider for a pure gaming rig. Going beyond these, that's all on you.
The 7800X3D gets an honorable mention cuz it's probably gonna get a price cut soonish and fall right into that sweet spot of price/performance, where spending any more makes no sense cuz you gain so little for your dollars.


If you are buying a workstation machine that just so happens to game, we can all safely assume that machine is actually an investment and makes more than it costs to run, so who gives a shit about power draw?
 