
AMD's Ryzen 3000 Review Thread: More Cores, More Threads.

Feb 15, 2013
8,953
7,695
855
London
The video encoder block is always the same (when present) within a generation of GPUs. So all Pascals have the same encoder block, all Turings, and so on.

I know nothing about the AMD GPUs, but on the Nvidia side you've been able to use Shadowplay for low-overhead streaming and recording via the NVENC hardware encoder for many years now, going back to the Kepler generation. The quality of the hardware encoding used to be much worse, but it has steadily improved every generation to the point where you can now use Shadowplay for both streaming and recording with very good quality, comparable to x264 run on the CPU.

Oh, I see. There don't seem to be many benches out there.
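The NVENC path isn't limited to Shadowplay either; ffmpeg can target the same hardware encoder. A minimal sketch, assuming an ffmpeg build with NVENC support; file names and the bitrate are placeholder values:

```python
# Builds (but does not run) an ffmpeg command that uses the NVENC hardware
# encoder instead of x264 on the CPU. Filenames here are placeholders.
import subprocess

def nvenc_command(infile, outfile, bitrate="6M"):
    return [
        "ffmpeg", "-i", infile,
        "-c:v", "h264_nvenc",   # NVENC H.264 hardware encoder
        "-b:v", bitrate,        # streaming-style target bitrate
        "-c:a", "copy",         # leave the audio track untouched
        outfile,
    ]

cmd = nvenc_command("gameplay.mkv", "stream.mp4")
# subprocess.run(cmd, check=True)  # uncomment on a box with ffmpeg + NVENC
```

Since the encode runs on the dedicated NVENC block rather than the shader cores, game performance overhead stays low, which is the whole appeal versus CPU x264.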
 

PhoenixTank

Member
Jul 13, 2017
1,385
1,535
705
The video encoder block is always the same (when present) within a generation of GPUs. So all Pascals have the same encoder block, all Turings, and so on.
Largely accurate, but the 1650 is a notable exception to that rule.
There are more differences when it comes to decode, though. The 960 had better HEVC support than the 980 Ti, for example.
 
Dec 14, 2008
33,844
2,317
1,360
Oh, he's doing this on the Gigabyte X570 Aorus Master; that's as good a recommendation for a board as any. I'm probably going to get the step-down Ultra though, since I have no use for 2.5G Ethernet and the BIOS POST code display means little to me. I'm not going to OC with LN2 on a live stream.

Watching him crash Ryzen Master over and over is amusing, though. I see that AMD's software is still as shit as ever. Good thing I'm not planning on any real OCing, especially with a Noctua NH-D14.
 
Last edited:
  • Like
Reactions: CrustyBritches
Jul 29, 2013
3,255
6,161
985
US
Oh, he's doing this on the Gigabyte X570 Aorus Master; that's as good a recommendation for a board as any. I'm probably going to get the step-down Ultra though, since I have no use for 2.5G Ethernet and the BIOS POST code display means little to me. I'm not going to OC with LN2 on a live stream.
Yeah, GN had Buildzoid do their X570 rundown with some older X470 boards, too. He was praising Gigabyte a lot.

Steve just mentioned the 5GHz rumors that were floating around, from AdoredTV I think it was. "This is all it took," he chuckled. Just LN2, a blowtorch, and about an hour. Lol.

GN got to 4.9GHz and completed the Cinebench R15 benchmark. They got into Windows @ 5GHz, but crashed out during Cinebench. I think by the time he called for help and learned he needed to adjust the Infinity Fabric speeds, he was already dealing with condensation or even cracked thermal paste.

They uploaded their new 3600X review during the stream. Can't watch it now; I just spent too much time on the live OC demo...
 
Last edited:
Feb 15, 2013
8,953
7,695
855
London
Steve just mentioned the 5GHz rumors that were floating around, from AdoredTV I think it was. "This is all it took," he chuckled. Just LN2, a blowtorch, and about an hour. Lol.

That lying b*stard. I still can't get over how wrong his leaks were across both Ryzen 3000 and Navi, yet he still has thousands of people hanging off his every word!

This is my favourite story though: I tweeted at him back in Feb/March and said that some of his clock speed figures were totally unrealistic (a 3950X with a 5.2GHz boost and 4.3GHz base clock), and that his chart was therefore very likely fake. I got an abusive reply from him and his drones, and the same again even after the official reveal exposed everything as a lie.

He did some real damage to this release with those lies, spread from Dec 2018.
 

Ascend

Member
Jul 23, 2018
3,532
5,139
585
That lying b*stard. I still can't get over how wrong his leaks were across both Ryzen 3000 and Navi, yet he still has thousands of people hanging off his every word!

This is my favourite story though: I tweeted at him back in Feb/March and said that some of his clock speed figures were totally unrealistic (a 3950X with a 5.2GHz boost and 4.3GHz base clock), and that his chart was therefore very likely fake. I got an abusive reply from him and his drones, and the same again even after the official reveal exposed everything as a lie.

He did some real damage to this release with those lies, spread from Dec 2018.
I agree... I didn't really mind his speculative videos, and, well, leaks can change. But regarding Navi, he seemed different. He basically made a video trashing it after the Super cards were released. My comment was that I would reserve judgment until Navi was released, and that I found it odd he was praising Nvidia so much after complaining that they raised prices. I also mentioned that the Vega 56 was the best value, and that he used to be a lot more critical of these things. Additionally, he left out Steve's Hardware Unboxed review, the only real review critical of the Super cards, despite having praised Steve in the past.

He replied with a bunch of Nvidia fanboy-ish mumbo jumbo about the Vega 56 at $300 no longer being good value if you take its power consumption into account because of electricity costs (WTF, he was never like that). And despite trashing ray tracing in the past, he suddenly said he supports innovation in the space regardless of which company does it, completely dismissing the price hike. It really seemed as if he'd received an Nvidia paycheck or something.

In any case, after a bunch of back and forth about the above, Navi was released. I didn't give my perspective on anything regarding Navi itself; I simply asked him if he was going to release a video on it. To this day, no word, no video. Dead silence.

I think it's the first time it's been really obvious that he was dead wrong. Especially because he no longer presented anything on Navi as leaks, but as facts, and, well, things turned out differently, and now he's nowhere to be seen. That's how he killed the last shred of credibility and integrity he had left.
 
Last edited:
  • Like
Reactions: Coulomb_Barrier

DJ Shalad

Member
Dec 10, 2018
2,950
7,330
700
CPU load is constantly higher on the $500 chip; that makes no sense... and at 4K the load is even higher than at 2K. What?
 
Feb 15, 2013
8,953
7,695
855
London
I agree... I didn't really mind his speculative videos, and, well, leaks can change. But regarding Navi, he seemed different. He basically made a video trashing it after the Super cards were released. My comment was that I would reserve judgment until Navi was released, and that I found it odd he was praising Nvidia so much after complaining that they raised prices. I also mentioned that the Vega 56 was the best value, and that he used to be a lot more critical of these things. Additionally, he left out Steve's Hardware Unboxed review, the only real review critical of the Super cards, despite having praised Steve in the past.

He replied with a bunch of Nvidia fanboy-ish mumbo jumbo about the Vega 56 at $300 no longer being good value if you take its power consumption into account because of electricity costs (WTF, he was never like that). And despite trashing ray tracing in the past, he suddenly said he supports innovation in the space regardless of which company does it, completely dismissing the price hike. It really seemed as if he'd received an Nvidia paycheck or something.

In any case, after a bunch of back and forth about the above, Navi was released. I didn't give my perspective on anything regarding Navi itself; I simply asked him if he was going to release a video on it. To this day, no word, no video. Dead silence.

I think it's the first time it's been really obvious that he was dead wrong. Especially because he no longer presented anything on Navi as leaks, but as facts, and, well, things turned out differently, and now he's nowhere to be seen. That's how he killed the last shred of credibility and integrity he had left.

I really do question whether Nvidia or Intel started paying him or, more likely, supplied him with fake info, as he did say his Ryzen 3000 info came from the Nvidia source that leaked the RTX naming before that release.

Lol, just remembered he claimed AMD were bringing back the 'Black Edition' naming scheme for what we now know are the 3900X and 3950X. That was a crock of shite too.
 
Last edited:

Ascend

Member
Jul 23, 2018
3,532
5,139
585
I really do question whether Nvidia or Intel started paying him or, more likely, supplied him with fake info, as he did say his Ryzen 3000 info came from the Nvidia source that leaked the RTX naming before that release.

Lol, just remembered he claimed AMD were bringing back the 'Black Edition' naming scheme for what we now know are the 3900X and 3950X. That was a crock of shite too.
Agreed... Most likely he's being used, and he doesn't seem aware of it. Why would anyone want to leak anything to him? The main reason I can think of is that they want to gain his trust, then feed him subtle disinformation to either tarnish the competition or destroy his own credibility, since he hasn't exactly been friendly to these companies. It could be both as well.
 

thelastword

Banned
Apr 7, 2006
11,732
11,938
2,000
CS:GO was always the game where Intel won; now that AMD's IPC is better, AMD is better. However, just look at how close AMD is in so many titles: in some games Intel has a slightly better average but AMD has much better 1% lows, and AMD outright wins in a few. There are so many games you could call a tie. And remember, this is against a processor that you can't cool with Intel's box coolers; does the 9900K even come with a cooler?

Intel is getting slightly better averages because of two things: 1) "5GHz", and 2) games not being optimized for AMD. Guess what: CS:GO was recently optimized for AMD CPUs and, BAM! The huge day-1 uptick World War Z showed for Intel vanished with one quick, small patch. You think Ubisoft will go back and patch Far Cry for AMD? Likely not, and most of the games there are not optimized for AMD. It's just funny that AMD doesn't need 5GHz to beat Intel when the games are optimized for their CPUs. I can only imagine the lead AMD would have if these CPUs hit the same clocks as Intel, but I guess as more games are patched for Ryzen 3000, the more you will see AMD pull ahead of Intel in gaming. The next batch of games will all be optimized for AMD as well.

The real backbreaker for Intel is when all these multicore games start appearing later this year and early next year en route to the consoles' release. Speaking of which, this will be a good thing for console launch titles: those devs will have had almost two years working on the high-IPC, high-core-count CPUs in Ryzen 3000. PC titles in 2020 and around the console launch should be pushing some serious physics and AI. When all is said and done, I think the 3900X will make the 9900K look like an i5 in comparison...
 
Feb 15, 2013
8,953
7,695
855
London
CS:GO was always the game where Intel won; now that AMD's IPC is better, AMD is better. However, just look at how close AMD is in so many titles: in some games Intel has a slightly better average but AMD has much better 1% lows, and AMD outright wins in a few. There are so many games you could call a tie. And remember, this is against a processor that you can't cool with Intel's box coolers; does the 9900K even come with a cooler?

Intel is getting slightly better averages because of two things: 1) "5GHz", and 2) games not being optimized for AMD. Guess what: CS:GO was recently optimized for AMD CPUs and, BAM! The huge day-1 uptick World War Z showed for Intel vanished with one quick, small patch. You think Ubisoft will go back and patch Far Cry for AMD? Likely not, and most of the games there are not optimized for AMD. It's just funny that AMD doesn't need 5GHz to beat Intel when the games are optimized for their CPUs. I can only imagine the lead AMD would have if these CPUs hit the same clocks as Intel, but I guess as more games are patched for Ryzen 3000, the more you will see AMD pull ahead of Intel in gaming. The next batch of games will all be optimized for AMD as well.

The real backbreaker for Intel is when all these multicore games start appearing later this year and early next year en route to the consoles' release. Speaking of which, this will be a good thing for console launch titles: those devs will have had almost two years working on the high-IPC, high-core-count CPUs in Ryzen 3000. PC titles in 2020 and around the console launch should be pushing some serious physics and AI. When all is said and done, I think the 3900X will make the 9900K look like an i5 in comparison...

Yes, the games getting optimized already is fantastic. In the space of a couple of weeks, CS:GO and World War Z have been updated, resulting in big performance uplifts. This will only continue.

EDIT: And look at that. 36 games, and as I've been saying, the gaming gap when benching under an unrealistic 2080 Ti + 1080p setup is just SIX PERCENT. The 3900X and 9900K are so close in gaming, and this small difference will continue to shrink.
 
Last edited:

OSC

Member
Jun 16, 2018
3,625
3,446
515
...When all is said and done, I think the 3900X will make the 9900K look like an i5 in comparison...
With the next-gen consoles all being Ryzen, it's likely most future titles will be optimized for Ryzen, and as seen here and as you've mentioned, when a game is optimized for Ryzen it beats the 9900K in gaming.

With future games optimized for Ryzen, future Ryzen hardware will only widen the gap.
EDIT: And look at that. 36 games, and as I've been saying, the gaming gap when benching under an unrealistic 2080 Ti + 1080p setup is just SIX PERCENT. The 3900X and 9900K are so close in gaming, and this small difference will continue to shrink.
Some titles might flip with a Radeon VII or future AMD cards. Future Ryzen-optimized titles might also flip the performance advantage even on Nvidia hardware.

From a thread on the AMD subreddit, it seems Nvidia might have, accidentally or on purpose, reduced Ryzen performance:
In this case, I mean that the Ryzen chip was faster at 1080p in SotTR with Radeon VII than it was with 2080ti, which is a galactic red flag. -chapstickbomber
 
Last edited:
  • Thoughtful
Reactions: wordslaughter

Kenpachii

Member
Mar 23, 2018
7,436
8,939
765
I'm starting to think all these reviews running games at medium/low settings @1080p on a 2080 Ti of all cards are useless. Sure, you can see how much faster Intel is in that scenario, but who will play games at that res with a 2080 Ti? I just saw a review where the 3700X is really competitive with the 9900K, and it looks like the only thing the reviewer did was run the games on High/Ultra settings @1080p. So in theory there is already a GPU bottleneck just from enabling High settings. And let's say he ran the test with a GTX 1080/5700; there wouldn't even be a gap worth worrying about.


With the Ryzen 3700X, gaming at high res/settings will be a smoother experience because you have idle threads, compared to a 9700K, for instance, where games like Assassin's Creed can load all 8 cores to 100%. And with games becoming more multithreaded by the day, those 8 cores are not going to be enough if you want to future-proof.

If you are not going to go all out on a 9900K, Ryzen is a no-brainer.

You make no sense.

First you say future-proofing isn't interesting and it's all about now, when that doesn't favor AMD.
Then you say future-proofing is heavily needed, because that favors AMD.

This kind of logic is used a lot in this thread.

CPU load is constantly higher on the $500 chip; that makes no sense... and at 4K the load is even higher than at 2K. What?

Probably disabled some cores/SMT.
 
Last edited:

Screamer-RSA

Member
Nov 19, 2018
1,161
1,990
540
You make no sense.

First you say future-proofing isn't interesting and it's all about now, when that doesn't favor AMD.
Then you say future-proofing is heavily needed, because that favors AMD.

This kind of logic is used a lot in this thread.

What are you on about, brah? I said your 9900K is still the best option for gaming, didn't I? What are you trying to justify?

I said exactly what I said. The fact that some people (and I think you included?) think "future proofing" means running games at 720p and then recommending an 8-core chip over an 8c/16t chip at the same price is them kidding themselves. Especially with Ryzen 3000. And obviously the 9900K, with lots of fast cores/threads, is the exception.
 
Last edited:

billyxci

13 year old console warrior. Put me on ignore.
Aug 3, 2014
13,814
8,718
970
I regret upgrading to Ryzen now. I've had nothing but issues with it. It was running great at first, but it's so unstable and running really hot. I'm considering just returning the CPU/motherboard.
 

Xyphie

Member
Oct 4, 2007
2,865
476
1,250
Somewhere
I regret upgrading to Ryzen now. I've had nothing but issues with it. It was running great at first, but it's so unstable and running really hot. I'm considering just returning the CPU/motherboard.


It's a known issue. There are massive issues with idle voltages; my 3700X is constantly running at ~1.4V because some apps like Steam, Battle.net, RTSS, etc. just force the core voltage to full load.
 
Last edited:

REDRZA MWS

Member
Jan 7, 2018
828
1,100
440




Finally took the plunge and had my PC custom built. Can any PC guys take a glance at the build and let me know what you think, or anything I should add or am missing? Thanks!
 

Screamer-RSA

Member
Nov 19, 2018
1,161
1,990
540




Finally took the plunge and had my PC custom built. Can any PC guys take a glance at the build and let me know what you think, or anything I should add or am missing? Thanks!

I am not familiar with iBuyPower, but I would not recommend the ASUS ROG Maximus Formula motherboard. It has the worst VRMs of all Z390 boards: a 4-phase design compared to 12 phases on the Gigabyte equivalent, for instance. Your 9900K will hit power limitations way sooner than on other boards.

Otherwise, congrats!

Also, there is a NeoGAF PC gaming thread now where you can post for more suggestions.
 
Last edited:
  • Like
Reactions: llien

REDRZA MWS

Member
Jan 7, 2018
828
1,100
440
I am not familiar with iBuyPower, but I would not recommend the ASUS ROG Maximus Formula motherboard. It has the worst VRMs of all Z390 boards: a 4-phase design compared to 12 phases on the Gigabyte equivalent, for instance. Your 9900K will hit power limitations way sooner than on other boards.

Otherwise, congrats!

Also, there is a NeoGAF PC gaming thread now where you can post for more suggestions.

Thanks! I’m excited for it.
 

thelastword

Banned
Apr 7, 2006
11,732
11,938
2,000
From a thread on the AMD subreddit, it seems Nvidia might have, accidentally or on purpose, reduced Ryzen performance
Interesting; lots is going on. It's early days yet, so we can expect further game optimizations for Ryzen, and perhaps a huge Adrenalin update when the Navi AIB cards launch in mid-August. The performance is already on par, and ahead of Intel when games are optimized, so future patches should provide a huge boost everywhere...

If the video below shows anything, it's that Ryzen is one hell of a CPU with huge potential for gaming. 1% lows, optimized games, and multicore games are where it will shine. Intel fans should know that even the next Far Cry and AC will be optimized for Ryzen. Funny that no one shows Crysis 3 CPU benches, or Ryse...

 

Kenpachii

Member
Mar 23, 2018
7,436
8,939
765
Interesting; lots is going on. It's early days yet, so we can expect further game optimizations for Ryzen, and perhaps a huge Adrenalin update when the Navi AIB cards launch in mid-August. The performance is already on par, and ahead of Intel when games are optimized, so future patches should provide a huge boost everywhere...

If the video below shows anything, it's that Ryzen is one hell of a CPU with huge potential for gaming. 1% lows, optimized games, and multicore games are where it will shine. Intel fans should know that even the next Far Cry and AC will be optimized for Ryzen. Funny that no one shows Crysis 3 CPU benches, or Ryse...


Stop linking useless benchmarks, dude.

The reason it's 1% is because it's a GPU bottleneck.
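The bottleneck point can be illustrated with a toy model: the delivered frame rate is capped by whichever component is slower, so a CPU gap only shows up when the GPU cap sits well above both CPUs. Every number below is made up for illustration, not a measurement:

```python
def fps(cpu_fps_cap, gpu_fps_cap):
    # The delivered frame rate is bounded by the slower component.
    return min(cpu_fps_cap, gpu_fps_cap)

# 1080p low on a 2080 Ti: the GPU cap is high, so the CPU gap is visible.
intel_1080p = fps(180, 300)   # CPU-bound
ryzen_1080p = fps(165, 300)   # CPU-bound, a visible gap

# The same CPUs at 4K ultra: the GPU cap drops below both, the gap vanishes.
intel_4k = fps(180, 90)       # GPU-bound
ryzen_4k = fps(165, 90)       # GPU-bound, identical result
```

This is exactly why reviewers bench CPUs at low resolution: not because anyone plays that way, but to push the GPU cap high enough that the CPU difference can be seen at all.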
 
  • Like
Reactions: Evilms

Mithos

Member
Apr 26, 2006
6,158
496
1,520
Sweden
Anyone know of a review/benchmark that lives in the past...?
Testing a Ryzen 3600/X with an RTX 2060/Super on a 60Hz monitor with V-Sync ON @1080p.
 
Last edited:
Jul 29, 2013
3,255
6,161
985
US
I'm honestly more confused after watching this than I was before. I guess it doesn't matter much for me with a 3900X, which is already binned very close to its limit in terms of clock speeds.
Congrats on the new CPU. The 3900X is awesome; at some point I want to upgrade to it from my R5 1600. Their written article explains it pretty well, and it's easy to digest since it's not coming at you too fast.

With the launch of the Ryzen 3000 series processors, we’ve noticed a distinct confusion among readers and viewers when it comes to the phrases “Precision Boost 2,” “XFR,” “Precision Boost Overdrive,” which is different from Precision Boost, and “AutoOC.” There is also a lot of confusion about what’s considered stock, what PBO even does or if it works at all, and how thermals impact frequency of Ryzen CPUs.
Precision Boost Overdrive is a technology new to Ryzen desktop processors, having first been introduced in Threadripper chips; technically, Ryzen 3000 uses Precision Boost 2. PBO is explicitly different from Precision Boost and Precision Boost 2, which is where a lot of people get confused. “Precision Boost” is not an abbreviation for “Precision Boost Overdrive,” it’s actually a different thing: Precision Boost is like XFR, AMD’s Extended Frequency Range boosting table for boosting a limited number of cores when possible. XFR was introduced with the first Ryzen series CPUs. Precision Boost takes into account three numbers in deciding how many cores can boost and when, and those numbers are PPT, TDC, and EDC, as well as temperature and the chip’s max boost clock. Precision Boost is enabled on a stock CPU, Precision Boost Overdrive is not. What PBO does not ever do is boost the frequency beyond the advertised CPU clocks, which is a major point that people have confused. We’ll quote directly from AMD’s review documentation so that there is no room for confusion:
Follow link for PPT, TDC, and EDC definitions.
What is important to note is that PBO only affects these three power limits. The effect on CPU clock speed is indirect, and PBO will never boost the CPU past the advertised clocks. At best it will allow the CPU to maintain boost clocks longer and more often, and therefore PBO will have the strongest effect on scenarios where the CPU is already able to boost. Because one of the constraints is thermal, PBO will also have less effect on CPUs that are already well-cooled and not bumping up against that limit. Remember that in addition to these three power limits PB is constrained by temperature and max boost clocks, and these limits are not affected by PBO.
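The GN explanation above boils down to a set of simultaneous limits that gate boost. A hedged sketch of that logic; the function names and the PPT/TDC/EDC defaults here are hypothetical illustrations, not AMD's actual firmware algorithm or real per-SKU limits:

```python
def boost_allowed(power_w, sustained_current_a, peak_current_a, temp_c,
                  ppt_w=142.0, tdc_a=95.0, edc_a=140.0, temp_limit_c=95.0):
    """The CPU may keep boosting only while EVERY limit holds at once.
    PBO raises the three power limits (PPT/TDC/EDC); the temperature
    limit and the max boost clock are untouched by it."""
    return (power_w < ppt_w
            and sustained_current_a < tdc_a
            and peak_current_a < edc_a
            and temp_c < temp_limit_c)

def effective_clock_mhz(base_mhz, max_boost_mhz, earned_headroom_mhz):
    # Boost never exceeds the advertised max boost clock, with or without
    # PBO -- PBO only helps the CPU hold boost longer and more often.
    return min(base_mhz + earned_headroom_mhz, max_boost_mhz)
```

So raising PPT/TDC/EDC via PBO widens the envelope in which `boost_allowed` stays true, but `effective_clock_mhz` remains clamped at the advertised boost, which matches the article's point that PBO never pushes past stock clocks.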
 

wordslaughter

Banned
Apr 17, 2019
1,377
3,953
445
My 3900X just arrived!! Looking forward to assembling my build today. Also very interested in figuring out how to get the most out of it.

der8auer just put up an interesting video about CCX overclocking and is getting excellent results.

If you're interested you can read about it in a Reddit thread here...


Other people have been able to easily recreate these results :)

This is significant because it's bringing 3900X single-core scores in line with 9900K single-core scores.
 
Last edited:

nkarafo

Member
Nov 30, 2012
16,255
7,702
1,070
Which CPU would you choose if you wanted to build the ultimate emulation PC?
 

thelastword

Banned
Apr 7, 2006
11,732
11,938
2,000
If you want detailed deep dives with lots of evidence-based data, Tech Jesus is unparalleled. If you want somebody to read you lullabies and tuck you in, maybe Anthony from Linus Tech Tips is your guy...
