
Baldur's Gate 3 runs faster on 5800X3D than 13900K


vM2TRbb.jpg


Pretty impressive showing for AM4's swan song, the Zen 3-based 5800X3D, against Intel's latest and greatest. If you haven't been willing to overhaul your entire PC to move to AM5 or whatever socket Intel is on these days, which requires a new motherboard and also DDR5 memory, then you should have bought a 5800X3D ages ago now.
 
Wasn't expecting this game to like the 3D V-Cache so much.
Games that use a lot of data at once, and therefore hammer memory accesses, tend to love the 3D chips. Look at how MMOs like WoW and FFXIV perform on them, for example. Simulations like MS Flight Simulator and Civilization VI love them too, for the same reason.
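For anyone curious why a bigger L3 matters this much: once a game's hot working set fits in cache, almost every access stops going out to RAM. Here's a toy sketch of that idea, a fully-associative LRU model with made-up sizes (a real L3 is set-associative and nothing this simple), just to show the hit-rate cliff when the working set fits:

```python
import random
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Replay an access trace through a fully-associative LRU cache
    and return the fraction of accesses that hit."""
    cache = OrderedDict()
    hits = 0
    for line in trace:
        if line in cache:
            hits += 1
            cache.move_to_end(line)  # mark as most recently used
        else:
            cache[line] = None
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(trace)

# A working set of 1024 "cache lines", touched uniformly at random,
# stands in for a game streaming lots of world/entity data.
random.seed(0)
trace = [random.randrange(1024) for _ in range(100_000)]

small_l3 = lru_hit_rate(trace, 512)   # cache holds half the working set
big_l3   = lru_hit_rate(trace, 1024)  # cache holds the whole working set

print(f"hit rate, small cache: {small_l3:.2f}")
print(f"hit rate, big cache:   {big_l3:.2f}")
```

With the smaller cache you hit roughly half the time; once the whole working set fits, only the first touch of each line misses, which is why doubling L3 can be worth far more than a clock-speed bump for these workloads.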
 

Hoddi

Member
Jesus. That's not exactly a small margin.

I never thought I'd be upgrading my 9900k anytime soon but it looks positively ancient by comparison.
 

rnlval

Member



Pretty impressive showing for AM4's swan song, the Zen 3-based 5800X3D, against Intel's latest and greatest. If you haven't been willing to overhaul your entire PC to move to AM5 or whatever socket Intel is on these days, which requires a new motherboard and also DDR5 memory, then you should have bought a 5800X3D ages ago now.
Besides the Ryzen 7 5800X3D, AMD has released additional Ryzen 5000 SKUs for the AM4 platform.

The TC's benchmark doesn't use the recommended DDR5 memory speeds, e.g. DDR5-5200 is below what AMD recommends for AM5. The Intel LGA1700 platform also wants faster DDR5 memory.

The Ryzen 7 7700X should be paired with DDR5-6000 per AMD's recommendation. X3D chips are more tolerant of below-par memory configurations.
 

Bojji

Member
Besides the Ryzen 7 5800X3D, AMD has released additional Ryzen 5000 SKUs for the AM4 platform.

The TC's benchmark doesn't use the recommended DDR5 memory speeds, e.g. DDR5-5200 is below what AMD recommends for AM5. The Intel LGA1700 platform also wants faster DDR5 memory.

The Ryzen 7 7700X should be paired with DDR5-6000 per AMD's recommendation. X3D chips are more tolerant of below-par memory configurations.

This won't change much; games seem to love massive L3 cache.
 



Pretty impressive showing for AM4's swan song, the Zen 3-based 5800X3D, against Intel's latest and greatest. If you haven't been willing to overhaul your entire PC to move to AM5 or whatever socket Intel is on these days, which requires a new motherboard and also DDR5 memory, then you should have bought a 5800X3D ages ago now.
Love how you mention needing a new mobo and RAM. How often do you think mother duckers are updating their rigs? I'm still here with my 8700K
 

GreatnessRD

Member
Jesus. That's not exactly a small margin.

I never thought I'd be upgrading my 9900k anytime soon but it looks positively ancient by comparison.
Still getting over 100 FPS. The 9900K is still a beast, even with slow RAM and being 6 years old, haha. What GPU are you rockin' with it?
 

DenchDeckard

Moderated wildly
The 7800X3D is such an amazing gaming chip. I hope the console manufacturers can work out how to get a 3D V-Cache chip into their next systems.

I have a 13900KS and have always had Intel chips. Not out of bias, it's just been the best option for me every time I've upgraded.

I'll be picking up a 7800X3D this year for my daughter's first ever school work and gaming PC as she moves into high school.

Beastly CPUs
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
7800X3D king of the hill again.
I almost feel bad for people who got 7950X3Ds for gaming builds, but then again they had the money to blow on a 7950X3D so I end up being jealous.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Maybe we do more than games on our PCs? I'd like to see you render 3D animations, compile UE5 scripts or encode videos with your 7800X3D compared to the 7950X3D.
So it's not a gaming build, it just so happens to game.


P.S. If you are still CPU rendering in 2023, lord have mercy on your soul.
 

Pedro Motta

Member
So it's not a gaming build, it just so happens to game.


P.S. If you are still CPU rendering in 2023, lord have mercy on your soul.
OFFTOPIC: Please don't try to teach me what rendering is. If you have no idea that CPU rendering is still used for lots of things, I don't know what to tell you. Even if you're GPU rendering you have hybrid rendering now, and even if you don't use hybrid rendering, a faster CPU will speed up your renders by a lot.
 

Hoddi

Member
Still getting over 100 FPS. The 9900K is still a beast, even with slow RAM and being 6 years old, haha. What GPU are you rockin' with it?
Similarly aged 2080 Ti. And you're right, because I still haven't found a good enough reason to upgrade.

I was starting to feel the GPU a bit at 4K but moved to 3440x1440 some weeks ago. That pretty much solved the issue for me.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
OFFTOPIC: Please don't try to teach me what rendering is. If you have no idea that CPU rendering is still used for lots of things, I don't know what to tell you. Even if you're GPU rendering you have hybrid rendering now, and even if you don't use hybrid rendering, a faster CPU will speed up your renders by a lot.
What render engine are you using?
V-Ray and Arnold have reached parity for GPU rendering, and with Corona the 7950X and 13900K are within spitting distance of each other.
Whereas pretty much any render engine that supports OptiX will beat both CPUs with even "budget" GPUs.

blender.png


corona.png
 

rnlval

Member
Tbf the 5800X3D is still also significantly cheaper. Especially if the person(s) upgrading are still on AM4.
Strix Point APU is rumored to have 12 Zen 5 cores per CCX. Two CCXs = 24 Zen 5 cores... mooooore cores.

At least my AM5 motherboard supports UDIMM ECC memory, so it can transition into a workstation.


What render engine are you using?
V-Ray and Arnold have reached parity for GPU rendering, and with Corona the 7950X and 13900K are within spitting distance of each other.
Whereas pretty much any render engine that supports OptiX will beat both CPUs with even "budget" GPUs.





iiMfkIP.jpg




y644Rdb.jpg
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


iiMfkIP.jpg

And look at the scores when you use OptiX.
Why anyone would use CPU rendering at a hobbyist level is beyond me.

4060ti_4070_Rendering_Blender.png
 
Any idea of CPU temps between AMD and Intel?
I don't know about the 13900K or the 5800X3D, but my 7950X3D runs Baldur's Gate 3 at around 60-65C at most with a be quiet! Dark Rock Pro 4 cooler.

In non-gaming stress tests Intel CPUs can hit 100C. Stress testing my 7950X3D, it doesn't go beyond 89C no matter what I do to it. Previous AMD CPUs were capped at 95C, but because the X3D CPUs are more sensitive to heat they're capped lower and run cooler.

My 7950X3D in idle/light usage (non-gaming) sits at about 38-45C. My old 9900K would idle at 27-33C and could hit 55-65C when gaming, and 90-95C under heavy loads (with the same cooler).

A big reason I went with AMD was how hot Intel CPUs can get. Yeah, the idle/normal usage temps are higher, but when gaming or doing heavy work it actually stays reasonably cool.
 

JCK75

Member
I'm really glad I went AMD this time.
My build started with a Ryzen 1700X; I needed more oomph for VR so I upgraded to a 3950X,
thinking I was really lucky I could do that but guessing it was time for a whole new build now..
then I looked up my mobo support.. what? I can get a 5000 series now?

I think I'm going to get a 5800X3D at some point, and hopefully one day I'll be able to afford a GPU that actually feels like an upgrade over my GTX 1080 Ti
 

twilo99

Member
It's rare that a company does something as pro-consumer as what AMD did for AM4 with the 5800X3D.

At this point I would otherwise be spending $1000+ on a full upgrade, but instead I spent $290 and I'll probably get another 2 years out of my AM4 board.

Also, can confirm the game runs great on this chip with a 6800 XT.
 

phant0m

Member
It's rare that a company does something as pro-consumer as what AMD did for AM4 with the 5800X3D.

At this point I would otherwise be spending $1000+ on a full upgrade, but instead I spent $290 and I'll probably get another 2 years out of my AM4 board.

Also, can confirm the game runs great on this chip with a 6800 XT.
Yup. And can confirm the game runs great on my 5800X3D + 3080

But goddamn it runs hot! Most games keep it around 70-75C but dense areas in BG3 easily hit 85C
 

twilo99

Member
I hope PS5 performance is good, the wait is freakin' torture.

Locked 60 for sure; they can tone down fidelity enough to get good performance if needed.

Next-gen consoles better have 3D V-Cache.. the current CPUs feel a decade behind.
 

twilo99

Member
Yup. And can confirm the game runs great on my 5800X3D + 3080

But goddamn it runs hot! Most games keep it around 70-75C but dense areas in BG3 easily hit 85C

I don’t think it minds those temps, I wouldn’t worry about it at all
 

winjer

Gold Member
That 3D V-Cache is doing miracles for games.
The difference between the 7800X3D and the 13900K is a generational leap in performance.
 

winjer

Gold Member
It'll be interesting to see how big the difference is between the 8800X3D and the 14th series, given the spoilers of a >=30% IPC gain from Zen 4 to Zen 5.

The 14th series is about to launch, and it's just the 13th series with a slight overclock and more E-cores. No IPC gains, but a big increase in power usage.
 

Bojji

Member


I will have to watch this.

The 14th series is about to launch, and it's just the 13th series with a slight overclock and more E-cores. No IPC gains, but a big increase in power usage.

They should have learned and increased cache on some of their CPUs (for "gamers"). The next Intel series isn't much of an improvement based on their own slides.

Power draw difference between the 13900K and the 7800X3D is quite big.
 

winjer

Gold Member
They should have learned and increased cache on some of their CPUs (for "gamers"). The next Intel series isn't much of an improvement based on their own slides.

Power draw difference between the 13900K and the 7800X3D is quite big.

Yeah, power efficiency on Intel's 13th gen is a disaster. And with 14th gen it's going to get even worse.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It'll be interesting to see how big the difference is between the 8800X3D and the 14th series, given the spoilers of a >=30% IPC gain from Zen 4 to Zen 5.

14th gen is just a refresh of the 13th gen.

Even Meteor Lake/Arrow Lake isn't looking to be a sizeable upgrade except on the GPU side of things.
The true upgrades are gonna drop with Lunar Lake.

images-3.jpg
 
It runs well on everything, but I've seen it plummet in YouTubers' videos using 13900Ks in places where my framerate doesn't move by a single frame on a 7800X3D, so it definitely loves 3D cache.
 