
AMD's Ryzen 3000 Review Thread: More Cores, More Threads.

Absolutely true. Honestly I can't make up my mind on what to get; all of those Ryzen CPUs are god-tier products at the moment. The 3700 sits at 9900K core counts, the 3600 at 8700K core counts, and they are cheap as hell on top of it.

Intel will have to drop a new series of CPUs sooner rather than later, or drop prices drastically, to stay relevant. At the moment the only edge Intel has is in clock speeds and emulation, but that's it.

Intel's 'counter' to this release is yet another 14nm+++++ refresh in 2020, and then another refresh of the old Skylake core after that. Let that sink in.
 
Will the next Zen be on a new socket, or will we get one more generation on AM4 after Ryzen 3000?
AMD promised the AM4 socket will be supported through 2020, so Zen 2+ (Ryzen 4000) should be on AM4. After that, probably Socket AM5 for the Zen 3 core.
Buying into X570 on AM4 now might be a good idea if you're planning a video card upgrade in 2020; I expect Ampere will be PCIe 4.0.
 

Xyphie

Member
Will the next Zen be on a new socket, or will we get one more generation on AM4 after Ryzen 3000?

Should still be AM4, as Milan (Zen 3 for servers) will still use DDR4 and stay on SP3. Specific boards could very well drop support for the 4000 series, though.
 

llien

Member
For those who planned to go with the Ryzen 3600: at 1440p there is a very slim gap between the 5700 XT ($399) and the 2070 Super ($499), one that AIB cards will likely cover.

 

Kenpachii

Member
So how are the clocks doing for Ryzen at the moment? It seems like nobody hits the advertised boost clocks with these CPUs. Any progress on this?

Currently deciding whether I should go with a 3600 or a 3600X, but even on the 3600X I see nobody hitting the advertised 4.4 GHz. Can anybody chip in on this?
 

llien

Member

Conclusion
We were curious to see how well a cheap B350 motherboard could run the latest and greatest 12-core, 24-thread Ryzen 9 3900X and were pleasantly surprised that there was no smoke or sparks flying — it just works. Actually, I'm impressed with how well this two-generation-old platform can run the newest processor—thank you, AMD. Intel would have definitely charged us for a new chipset—twice.

Using the latest BIOS update, which adds Ryzen 3000 processor compatibility, setup was a breeze, no major issues to report. All AMD BIOSes use a software component called "AGESA", which is provided by AMD to the motherboard vendors. All the vendors have to do now is add their motherboard specific features and branding—the core software remains unchanged and is controlled by AMD. This not only takes tremendous work off the motherboard vendors, it also enables AMD to centrally design, develop, test, and release updates. Such an approach not only improves software quality, it also pushes new capability to all motherboards without the motherboard vendor having to get involved, which is almost magical.

We were surprised to see the full range of overclocking, tuning, and tweaking options appear in this old motherboard's BIOS. This also goes for memory support. With the memory controller located inside the processor and AMD's AGESA providing the framework for memory settings, initialization, training, and compatibility, many memory modules that previously had issues will run infinitely better now. Can I use the word "magical" again?

Not everything is peachy, of course, as high-end motherboards are expensive for a reason. They come with seriously overbuilt VRMs, which work more efficiently and spread the heat generated during voltage conversion over a wider area by using more components. These motherboards also include more premium heatsinks to carry that heat away. We took a closer look at VRM temperatures and were shocked that they can reach up to 140°C when B350 is paired with a powerful processor. This sounds bad at first, but the mitigation is simple: don't use watercooling, since a tower air cooler's fan also pushes some airflow over the VRM area, whereas an AIO leaves it with almost none.
 
There are people who understand how numbers work.
Then there are those who don't.
Then there are straws.
Then there are people who won't even listen to what AMD themselves said about this comparison...

Nobody's arguing the numbers. People are arguing that you can't compare them to each other, because you have NO way of knowing how the tests were performed: whether there were any hardware differences that might explain worse performance in one setup compared to another, whether they tested the same area in the game, whether they all used the same API, or how they gathered their data...
 

llien

Member
Nobody's arguing the numbers. People are arguing that you can't compare them to each other...
That is exactly what you are not following: WHICH numbers you can't compare to each other.
You can't compare a 5700 on one CPU to a 2070 on another, and nobody is doing that either.
 

sol740

Member
I've read there's little OC headroom on the 3600, but I ordered one anyway. Looking forward to seeing how far I can push it, but even the out-of-the-box benchmarks are impressive at 200 beans. I have a Noctua NH-U12S, so I won't be using the Wraith; curious where it'll be stable.
 
That is exactly what you are not following: WHICH numbers you can't compare to each other.
You can't compare a 5700 on one CPU to a 2070 on another, and nobody is doing that either.
God you're fucking stupid.

Each source is testing a single CPU with 3 different GPUs. They aren't testing 3 different CPUs with the same GPU to see if one performs better than the other.

[attached chart: one coloured bar per CPU for each GPU, compiled from the three sources]


If you look at that chart and compare the same-coloured bars with each other, THE ONLY THING it confirms is that AMD cards perform better in this title no matter the CPU. If you try to compare the three coloured bars over each GPU, YOU CAN'T, because they were all produced by different sources...

You're not comparing the same GPU's performance under different CPUs... Each CPU is from a different source... meaning the specs are different, the testing method is different, the BIOSes could be different... hell, I even checked, and the versions of Windows they are using are different...

I mean... if you can't understand that, then there's no helping you... You can't compare anything in that chart other than to say that the AMD GPUs perform better in Tomb Raider than their Nvidia counterparts, no matter the CPU.
 

llien

Member
Each source is testing a single CPU with 3 different GPUs.

Ok. So this:
Vega VII
3900x - 119,0
9900k - 111,3
8700k - 108,3

5700XT
3900x - 115,0
9900k - 109,8
8700k - 110,3

2070
3900x - 93,0
9900k - 102,8
8700k - 104,1



Becomes this:

3900X

Vega VII - 119,0 +28%
5700XT - 115,0 +24%
2070 - 93,0

9900k
Vega VII - 111,3 +8%
5700XT - 109,8 +6%
2070 - 102,8


Skipped the 8700K, although it's an even worse picture there; the Vega VII actually falls behind the 5700 XT on it.
Now, if you only make "strictly true" statements, you can't even conclude that the Vega VII is indeed faster than the 2070; perhaps there is another test, you know, in another scene with other settings (HairWorks, anyone?) in which the 2070 is faster.
And if you apply common sense but refuse to accept that AMD's drivers perhaps run better on an AMD CPU while Nvidia's run better on Intel, well, then you should have a better explanation for why the GPU gap is bigger on the AMD CPU than on Intel's. Whatever.
Flame away.
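For anyone who wants to check the arithmetic, here is a minimal Python sketch (not from the original post) that reproduces the per-CPU gap percentages from the fps values listed above, to within a point of rounding:

```python
# Average fps per CPU/GPU pairing, copied from the lists above
# (remember: each CPU's numbers come from a different source).
results = {
    "3900X": {"Vega VII": 119.0, "5700 XT": 115.0, "2070": 93.0},
    "9900K": {"Vega VII": 111.3, "5700 XT": 109.8, "2070": 102.8},
    "8700K": {"Vega VII": 108.3, "5700 XT": 110.3, "2070": 104.1},
}

# Gap of each AMD card over the 2070 on the same CPU, in percent.
for cpu, fps in results.items():
    for gpu in ("Vega VII", "5700 XT"):
        gap = (fps[gpu] / fps["2070"] - 1) * 100
        print(f"{cpu}: {gpu} vs 2070: {gap:+.1f}%")
```

Run as-is it prints roughly +28/+24 for the 3900X, +8/+7 for the 9900K and +4/+6 for the 8700K, which is the point being argued: the same GPU gap shrinks on the Intel CPUs.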
 
Becomes this:

3900X

Vega VII - 119,0 +28%
5700XT - 115,0 +24%
2070 - 93,0

9900k
Vega VII - 111,3 +8%
5700XT - 109,8 +6%
2070 - 102,8

You can't do that... because they are different sources, man... LMAO. How fucking dense can you be? How do you know they benchmarked exactly the same section of the game? How do you know they have the same updates and the latest drivers? The same testing methodology?

The ONLY thing you can say from what you just did there is that AMD GPUs perform better in Shadow of the Tomb Raider on both configurations in each of the tests.

You're trying to compare different sources with each other... you can't do that. Which is why even Robert Hallock was quick to shut that shit down. Nothing about what you've posted says ANYTHING about AMD performing better with AMD. It says that one source tested an area where AMD cards performed much better than their Nvidia counterpart, and another source tested an area where AMD cards performed only a bit better than their Nvidia counterpart, using completely different methods.

Now, I'm not saying it's impossible that AMD cards perform better with AMD CPUs... but that chart certainly doesn't prove it, and it should be completely dismissed from any attempt to. C'mon man... can you imagine if AnandTech made graphs like that, mixing all the different sources out there, and claimed that AMD performed better with AMD? They'd be crucified. Just use your head a little bit.
 

llien

Member
You can't do that... because they are different sources man...
The GPU gap is larger on the AMD CPU.
This is a HARD FACT, with no "muh, bah doh ough".
There is no sensible explanation for it besides the one I've mentioned in my previous post.

"Bah mah specael port of da scenee" defense can be applied to anything, including 2070 losing altogether. Maybe there is some other area of the game, where 2070 beats VII, who knows, eh?
The only any benchmark make sense at all is exactly that it happens next to never.
 
The GPU gap is larger on the AMD CPU.
This is a HARD FACT, with no "muh, bah doh ough".
There is no sensible explanation for it besides the one I've mentioned in my previous post.
There IS a sensible explanation for it... it's called DIFFERENT FUCKING SOURCES.

You're so stupid it hurts. :pie_tears_joy:
 

xPikYx

Member
I was waiting for these new AMD processors. They are good for everything but gaming, and that's a real issue even if the gap is not that huge, because the i9 9900K can be overclocked on all cores to 5 GHz and then the gap grows quite a bit. Honestly, right now the i9 9900K + motherboard combo is cheaper than the 3900X + motherboard combo. But I do understand that everyone has different needs and different stories, so AMD processors are definitely great processors and the most recommendable ones. The question is always: what do you do, and what do you plan to do, with your PC in the future? The answer is then easy.
 

Kenpachii

Member
Why Ryzen is the best Gaming CPU....?





Also, Streaming + Gaming......

Gaming whilst having other workloads running....No contest..


I don't follow this guy's logic.

Benchmarks are there to show you what your CPU can or can't do. If your CPU is bottlenecked with a 1080, that means the CPU is already sitting at its peak the moment you buy it; when a budget GPU that does 2x the performance of that 1080 is released next year, you will struggle with your CPU as a result, while another CPU that would have had no issues with that GPU could have been the better option for 5 bucks more or the same price. (Basically, think of 2080 Ti-level performance as the performance you can expect from future budget GPUs.)

But that doesn't show up if you test with "the GPUs people actually have". So you have no clue how good that CPU really is as a result.
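To make the point concrete, here is a toy sketch (all numbers are hypothetical, not from any benchmark): the fps you actually see is capped by whichever of the CPU and GPU is slower, so testing with a mid-range GPU can make two very different CPUs look identical.

```python
# Toy model: delivered fps is limited by the slower of the CPU and GPU.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpus = {"weaker CPU": 110, "stronger CPU": 160}        # max fps each CPU can feed
gpus = {"GTX 1080-class": 90, "future budget GPU": 180}

for gpu_name, gpu_fps in gpus.items():
    for cpu_name, cpu_fps in cpus.items():
        print(f"{gpu_name:>17} + {cpu_name}: {delivered_fps(cpu_fps, gpu_fps)} fps")
```

With the 1080-class card both CPUs show 90 fps, so the CPU ceiling is invisible; swap in the faster future GPU and the weaker CPU suddenly caps you at 110 fps while the stronger one reaches 160.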

Then there are other metrics, like core counts and price points.

His logic only holds up if you buy an entire PC at once and then replace everything again in a few years, which mostly isn't how PCs get upgraded.

So I find his statements a little simplistic and inaccurate, and no, it doesn't make Ryzen the best gaming CPU. It just makes Ryzen the best bang-for-buck CPU.

The best gaming CPU is still the 9900K, and some would argue the 9600K (if no more than 6 threads are used) or the 9700K (if no more than 8 threads are used).

Sadly, the 9600K is limited enough that even current games already seem to have issues with its 6 cores/6 threads, and the 9700K is so overpriced that the only Intel CPUs that, in my book, have legs beyond this current generation are the 8700K and the 9900K.

To get the same longevity on Ryzen, with somewhat worse performance, the prices are far cheaper.

A 3600 already has the same core count as an 8700K; it performs worse but will stay relevant just as long, for half the price. That's massive, and in my book it makes Intel look extremely overpriced, or simply not offering anything this generation of CPUs that justifies buying it.
 
I was waiting for these new AMD processors. They are good for everything but gaming, and that's a real issue even if the gap is not that huge, because the i9 9900K can be overclocked on all cores to 5 GHz and then the gap grows quite a bit. Honestly, right now the i9 9900K + motherboard combo is cheaper than the 3900X + motherboard combo. But I do understand that everyone has different needs and different stories, so AMD processors are definitely great processors and the most recommendable ones. The question is always: what do you do, and what do you plan to do, with your PC in the future? The answer is then easy.

Lol, nonsense post: a 5% gap at 1080p and 2% at 1440p, using a 2080 Ti to boot. That's it; the 3900X is more or less the same as the 9900K in gaming.

Secondly, now that they're tied in gaming for all intents and purposes, the goalposts have moved. Suddenly everyone can clock their 9900K to 5 GHz+ to enjoy some small gaming perf gap over the competition 😆 Since when has OC'ing a 9900K to 5 GHz on all cores been a sensible thing to do 24/7? Should insecure 9900K owners do it just to say they've got more e-peen than 3900X owners?! Nice one.

Overclocking all 8 cores on a 9900K furnace will result in massive power draw and heat; it's already running at its limit. Even then, the average gap will still be under 10% across a larger suite of games like TPU's bench suite, something like 8%, as the 9900K already boosts to 4.7 GHz all-core and 5 GHz single-core out of the box.
 

llien

Member
TPU's benches have shown there is next to no difference in performance between X470 and X570 boards.
Even B350 works just fine (they tried it with a 3600).

Chipset power consumption went from around 5 watts on X470 to about 10 watts on X570. The figure is still rather low, but it warrants a cooler, which should be idling most of the time.

X570 brings, versus X470:
more USB 3 ports
PCIe 4.0 - no impact on GPUs (TPU tested the 5700/XT), but NVMe SSDs could use the extra bandwidth.
 

Kenpachii

Member
TPU's benches have shown there is next to no difference in performance between X470 and X570 boards.
Even B350 works just fine (they tried it with a 3600).

Chipset power consumption went from around 5 watts on X470 to about 10 watts on X570. The figure is still rather low, but it warrants a cooler, which should be idling most of the time.

X570 brings, versus X470:
more USB 3 ports
PCIe 4.0 - no impact on GPUs (TPU tested the 5700/XT), but NVMe SSDs could use the extra bandwidth.

Seems like the main reason to buy a better motherboard is future upgrades. I'd bet my ass that the 4000 series is going to grind some VRMs to dust, so buying a cheap motherboard could bite you in the arse later on.

Is this now a good time to upgrade from an i7 4790K?

Unless you need the cores or the features, or you have a top-end GPU and want to push all the performance into 144 Hz 1080p gaming, or you do emulation, there is no need. That CPU is holding up pretty well at the moment. You're better off sitting it out until your CPU drops below 60 fps in games. From what I saw, even AC Origins sits at 60+ fps.
 

Armorian

Banned
Seems like the main reason to buy a better motherboard is future upgrades. I'd bet my ass that the 4000 series is going to grind some VRMs to dust, so buying a cheap motherboard could bite you in the arse later on.

Unless you need the cores or the features, or you have a top-end GPU and want to push all the performance into 144 Hz 1080p gaming, or you do emulation, there is no need. That CPU is holding up pretty well at the moment. You're better off sitting it out until your CPU drops below 60 fps in games. From what I saw, even AC Origins sits at 60+ fps.

Even a ~5 GHz 7700K drops frames in Alexandria.
 

Kenpachii

Member
Even a ~5 GHz 7700K drops frames in Alexandria.


Ah, that area seems to kill any CPU besides the absolute top end. Not sure if it's worth upgrading, though, instead of just dropping a few settings.

The game looks good though. Love the Egypt time period; maybe I should revisit it when my new hardware comes in.
 

LordOfChaos

Member
Does someone have a link to a chiplet configuration breakdown by product SKU? I'm curious how many of the SKUs are split across two core chiplets, where the cutoff is before the cores become monolithic (apart from the I/O die), and whether any of them have unequal core counts between the two chiplets.

The 8+8 is obvious, but was anything below it split?


I haven't paid enough attention to this aspect, just got curious
 
Does someone have a link to a chiplet configuration breakdown by product SKU? I'm curious how many of the SKUs are split across two core chiplets, where the cutoff is before the cores become monolithic (apart from the I/O die), and whether any of them have unequal core counts between the two chiplets.

The 8+8 is obvious, but was anything below it split?

I haven't paid enough attention to this aspect, just got curious.

3950X is 2x8
3900X is 2x6
3800X and 3700X are 1x8
3600X and 3600 are 1x6

This creates some interesting situations for people on the 6-core-per-chiplet variants (3900X, 3600X, 3600), where each CCX has only 3 of its 4 cores enabled, with the PS3 emulator RPCS3 as an example:


This issue is significant (and essentially unsolvable) enough that people who really want to do PS3 emulation should consider skipping the 3900X and 3600X/3600 and instead getting the 3950X or 3800X/3700X. Or... just getting the 9900K or 9700K. It's a really fascinating example of how a CPU's design can be fundamentally incompatible with the design of a piece of software, because of how RPCS3 handles emulation of the Cell processor in the PS3.
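Purely as an illustration (not something from the post above): the usual workaround people experiment with is restricting the emulator to the cores of a single CCX/CCD so its threads don't hop across the chiplet boundary. A minimal Linux sketch, assuming Python and hypothetical core IDs that you would need to replace with your system's real topology:

```python
import os

# Illustrative core IDs only. On Linux, the cores sharing one L3 slice (i.e. one CCX)
# can be read from /sys/devices/system/cpu/cpu*/cache/index3/shared_cpu_list
# or inspected with `lstopo`; logical-CPU numbering differs between systems.
ONE_CHIPLET = {0, 1, 2, 3, 4, 5}

# Pin the calling process (pid 0 = self); child processes started afterwards
# inherit this affinity mask.
os.sched_setaffinity(0, ONE_CHIPLET)
print("Now restricted to CPUs:", sorted(os.sched_getaffinity(0)))
```

Whether that actually helps RPCS3 is a separate question; the point is only that cross-CCX hops are what affinity masks and scheduler tweaks like this try to avoid.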
 

Kenpachii

Member
3950X is 2x8
3900X is 2x6
3800X and 3700X are 1x8
3600X and 3600 are 1x6

This creates some interesting situations for people on the 6-core-per-chiplet variants (3900X, 3600X, 3600), where each CCX has only 3 of its 4 cores enabled, with the PS3 emulator RPCS3 as an example:


Yeah, that's what I mentioned a few pages back. If the cores are not used, it's better to disable them for better performance when it comes to emulation, or anything else that doesn't use them. I've always noticed a big leap in performance. Overclocking also seems to be easier thanks to lower heat generation.

This also goes for Intel CPUs.

I never understood why people with an 8-core/16-thread Ryzen 1700 wouldn't disable SMT and drop to 8 threads for gaming.
 
That 3000-series integrated GPU can actually handle GTA 5 at 1080p/60 on medium/high settings. That's above baseline PS4 and Xbox One performance, without any dedicated GPU, for $130. :messenger_tears_of_joy:

That's pretty good is all I'm saying. Even casuals can play real games now.
 
Does someone have a link to a chiplet configuration breakdown by product SKU? I'm curious how many of the SKUs are split across two core chiplets, where the cutoff is before the cores become monolithic (apart from the I/O die), and whether any of them have unequal core counts between the two chiplets.

The 8+8 is obvious, but was anything below it split?

I haven't paid enough attention to this aspect, just got curious.

I don't think the line between chiplets is accurate.
AMD's Robert Hallock said that even communication between CCXs on the same chiplet goes through the I/O die, so CCX-to-CCX latency is virtually the same within a chiplet as between chiplets.
 

llien

Member
I heard AnandTech claims the XT hovers around 1750 MHz on average; would an AIB card increase that?

The Germans tested the 5700 XT with an alternative cooler and a +50% power limit in Wattman:

[attached chart: 5700 XT results with the alternative cooler and raised power limit]



As the temperature rises, so does power consumption; in their test it got to around 260-280 W (as reported by the card itself, which by default limits itself to 180 W).
 

Ivellios

Member
Very good review... quite interesting results... The 3600 is one hell of a CPU...



Many people in the comments are saying that something is wrong with his benchmarks, because in all the other reviews the 3600 did not trade blows with the i9 9900K like that.

I kind of agree with them, because in another 3600 review I saw, the 3600 traded blows with the i7 8700K, not with the i9 9900K.

Regardless, the 3600 is an amazing price/performance CPU for sure.
 

thelastword

Banned
Yes, Techspot/Hardware Unboxed gave it a 100/100 score and says it makes the 9600K (and 9600) redundant: same gaming performance, more or less, while delivering some crazy multi-threaded performance for its price.
No doubt, and regardless of how awesome the entry-level Ryzen 3000 processor is, it beats high-end Intel processors that cost loads more... Pretty much every Ryzen processor further up the stack is also better in price-to-performance than its Intel counterpart, across the entire Intel stack... Now, if someone wants more cores and more threads, Ryzen has them covered with the 3700, 3800 and 3900... More cores to throw at encrypting, encoding and emulating... More cores to stream without breaking a sweat, or to keep multiple CPU-intensive programs loaded in memory... Whatever you need, Ryzen's got you...

Now, by next year, when we get more games using multiple cores the way Crysis 3 and Shadow of the Tomb Raider do, Intel will be in a world of hurt...

Which reminds me... (I love Crytek, man; wish they would give us a Crysis 4, a new action game, a new Ryse, anything)... These guys really know how to use technology, huh! Whilst the Far Cry devs try to push for a 6 GHz Pentium, Crytek decided to make CryEngine heavily multicore-centric. From the time they ported Crysis 1 to consoles, they knew what was coming... No wonder Far Cry 4 lost frames so often on consoles when you moved fast, especially on Xbox in large grassy expanses and when you drove around at pace... Palm on head, AC Unity and all those dips, even today, my goodness... Ubisoft had better get the memo; their games are so single-core focused they surely made the Jaguars cry... Well, I know Rockstar will be on top of things come GTA 6. It's where all the devs are going, including DICE; hell, they need good multicore support for Bad Company 3 anyway...
 