
CPU gaming performance has come a long way in the past 3 years.

Zug

Member
A 4090 is not made for 1080p/1440p, so it's logical that the CPU becomes the bottleneck in this context, which reflects raw CPU differences.
But who runs games at 200+ FPS? In real-world scenarios, most people will be GPU bound because they'll want their game to look as nice as possible on their hardware, and the CPU will not account for that much (exceptions exist, of course).
 

Rubicaant

Member
I also don't like this kind of talk. A GPU was never "made for" any resolution outside of marketing. We're simply at the point where 4K is playable, nothing more. There's a reason Nvidia came up with "RTX": we now have more than enough power for rasterization, so how do you get people to buy new stuff? Push ray tracing, even if it's undercooked af and not really real ray tracing, so you can cripple performance for the next few decades.

A fully ray-traced game with, let's say, Red Dead Redemption 2 graphics will take decades to run at 4K.

Look at Quake 2: we can barely run that. My 3080 shits itself in a 20-year-old game. Now look at Portal. Good luck with "hybrid fake ray tracing".
And yes, I get lost while typing on the phone 😆.
All I really meant is that 1080p and 1440p with current cards are the only way to benchmark CPUs, because they don't fully load a graphics card. If you're at full load on a current graphics card, the CPU means very little as long as it's somewhat up to date.
 

Freeza93

Banned
All I really meant is that 1080p and 1440p with current cards are the only way to benchmark CPUs, because they don't fully load a graphics card. If you're at full load on a current graphics card, the CPU means very little as long as it's somewhat up to date.
No, CPU benchmarks are done at 720p (16:9), the lowest resolution scale possible, lowest settings possible, to ensure you are always 100% CPU limited. Valhalla is still GPU limited at 720p, as an example.
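A rough way to picture why that isolates the CPU (completely made-up numbers, just the min-of-two-caps idea, not anyone's real benchmark data):

```python
# Toy bottleneck model: the frame rate you see is capped by whichever of the
# CPU or GPU finishes its per-frame work last. The CPU cap barely moves with
# resolution; the GPU cap roughly scales with the inverse of the pixel count.
# Every number here is invented purely for illustration.

def delivered_fps(cpu_cap, gpu_cap_at_1080p, pixels):
    gpu_cap = gpu_cap_at_1080p * (1920 * 1080) / pixels
    return min(cpu_cap, gpu_cap)

resolutions = {"720p": 1280 * 720, "1080p": 1920 * 1080,
               "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, px in resolutions.items():
    old = delivered_fps(cpu_cap=120, gpu_cap_at_1080p=300, pixels=px)  # older CPU
    new = delivered_fps(cpu_cap=240, gpu_cap_at_1080p=300, pixels=px)  # newer CPU
    print(f"{name:>5}: old CPU {old:.0f} fps, new CPU {new:.0f} fps")
```

At 720p the GPU ceiling is so high that the gap between the two hypothetical CPUs is all you measure; at 4K the GPU ceiling hides it completely, which is basically both sides of this argument.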
 

Rubicaant

Member
My point is that that's why the CPU upgrades appear to show such a huge improvement: they're testing with a GPU and games that are no longer GPU bound, so of course the CPU shows that kind of gain. I don't feel like we're talking about the same thing.
 

Brock2621

Member
Still rocking my OC'd 8700K and it still seems fine. I'd be curious whether I'd get any more frames out of my 3080 Ti if I just switched my CPU.
 

Celcius

°Temp. member
Sometimes I think about upgrading my 10700K but it still runs all my games great so I may as well just keep it for now.
 

Leonidas

Member
Still rocking my OC'd 8700K and it still seems fine. I'd be curious whether I'd get any more frames out of my 3080 Ti if I just switched my CPU.
If you game at 1440p (or 1080p) on a high-refresh display, you would probably notice a difference in some games.

HUB tested with a 4090 and found the 13900K ~2x faster than the 8700K in some cases.

 

//DEVIL//

Member
No it didn't. Not even close!

Who the hell games at FHD with an i9-13900K?

If someone is buying a 13700K or higher, they're aiming for 4K gaming in most cases, with a high-end GPU to go with it.

These CPUs are potatoes when it comes to 4K gaming with a high-end GPU. The performance difference is a joke.

A 13600K or even a 12600K at 4K is as good as a 13900K minus what, 10%? Not worth double the price or the premium.

I'm only talking about the gaming side, which is what the OP is mentioning.

As a workload for things other than gaming, then sure. But for gaming? Lol, what a joke.

Here's hoping the 7800X3D does something, ffs.
 

Leonidas

Member
No it didn't. Not even close!

Who the hell games at FHD with an i9-13900K?
Seems you might learn something from watching this.



And again, it's not just 1080p. 1440p is the sweet spot; margins at 1440p with a high-end GPU like the 4090 aren't far off from the 1080p numbers (as seen in the OP, where 1440p numbers were included).

If I had kept my old 8700K (or even a 5800X), I would have been bottlenecked with my current and next GPU upgrades at 1440p in some cases.

Here's hoping the 7800X3D does something, ffs.
Not exactly sure what you're hoping to get out of the 7800X3D. It will still be GPU bottlenecked at 4K to a similar degree as other modern high-end CPUs, but it might be 10% faster at 1080p at launch compared to currently available hardware...
 

//DEVIL//

Member
Seems you might learn something from watching this.



And again, it's not just 1080p. 1440p is the sweet spot; margins at 1440p with a high-end GPU like the 4090 aren't far off from the 1080p numbers (as seen in the OP, where 1440p numbers were included).

If I had kept my old 8700K (or even a 5800X), I would have been bottlenecked with my current and next GPU upgrades at 1440p in some cases.


Not exactly sure what you're hoping to get out of the 7800X3D. It will still be GPU bottlenecked at 4K to a similar degree as other modern high-end CPUs, but it might be 10% faster at 1080p at launch compared to currently available hardware...

And what's the performance difference between a 12600K and a 12900K at 4K with a 4090? -_-

Going from an i3 to an i9... I never understood the *point* of that video lol.
 
There's a point though that if you "only" have an £800 graphics card like a 4070 Ti and you're gaming at 1440p or higher, then you might as well pair it with a £130 Ryzen 5600. You will not notice a difference between it and a much higher-end CPU, except in some fringe situations.

I agree with the logic of running CPU benchmarks in the most CPU-limited way, but it's a problem that the data itself is barely applicable to real-world use. When you're new to PC gaming and building, a lot of stuff is going to go over your head, and you're going to be more inclined to just rely on charts, I think. You might miss that they're running the game on a graphics card priced for Saudi oil barons, or that the resolution is lower than you'd realistically run on such a card.
 
Well, I'm coming back to Intel from a Ryzen 5800 I got back in 2020 to an i9-13900K on a Gigabyte Z790 mobo that supports DDR4, so I can still make use of my investment in 96GB. I'll move on to the next DDR gen in three years, but for now I can save the money on RAM.
 

CuNi

Member
No it didn't. Not even close!

Who the hell games at FHD with an i9-13900K?

If someone is buying a 13700K or higher, they're aiming for 4K gaming in most cases, with a high-end GPU to go with it.

These CPUs are potatoes when it comes to 4K gaming with a high-end GPU. The performance difference is a joke.

A 13600K or even a 12600K at 4K is as good as a 13900K minus what, 10%? Not worth double the price or the premium.

I'm only talking about the gaming side, which is what the OP is mentioning.

As a workload for things other than gaming, then sure. But for gaming? Lol, what a joke.

Here's hoping the 7800X3D does something, ffs.

I game at 1080p with a 5900X and a 3080 on a 240 Hz monitor.
To me, fluidity in games is more important than higher res.

Native high-fps 4K is still way out of reach, even more so since RTX got introduced. I might upgrade to 1440p in 2 or 3 generations, when 240 Hz becomes available at that res.

4K is a resolution I might switch to in 10 years, if displays reach higher fps.
 

//DEVIL//

Member
I game at 1080p with a 5900X and a 3080 on a 240 Hz monitor.
To me, fluidity in games is more important than higher res.

Native high-fps 4K is still way out of reach, even more so since RTX got introduced. I might upgrade to 1440p in 2 or 3 generations, when 240 Hz becomes available at that res.

4K is a resolution I might switch to in 10 years, if displays reach higher fps.
You're just bottlenecking your GPU, nothing more. Your setup can push 2K at 240 easily.

I have a 4K (almost) 240 Hz monitor, and with DLSS enabled I get 220 frames in CoD on a 4090 and an i5-12600K.

You should be getting the same thing on a 2K monitor, seeing how the 4090 is about 80% more powerful than a 3080. Plus your CPU will make a little difference at 2K.
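Rough back-of-envelope on that claim, taking the ~80% figure and the 220 fps number above as given and pretending performance scales purely with pixel count (it doesn't exactly, and DLSS muddies it further):

```python
# Crude pixel-count scaling; real games don't scale this cleanly.
fps_4090_at_4k = 220        # the CoD number quoted above (with DLSS)
ratio_4090_vs_3080 = 1.8    # "about 80% more powerful"
pixels_4k = 3840 * 2160
pixels_1440p = 2560 * 1440

fps_3080_at_4k = fps_4090_at_4k / ratio_4090_vs_3080            # ~122 fps
fps_3080_at_1440p = fps_3080_at_4k * pixels_4k / pixels_1440p   # ~275 fps if only the GPU limited it
print(round(fps_3080_at_1440p))
```

If that crude estimate holds, the 3080 wouldn't be the thing holding back a 240 Hz target at 1440p; the CPU would be, which is why it matters more at 2K.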


In general though, outside of you and very few exceptions, a high-end CPU gets paired with a high-end GPU.

Your 3080, in master race terms, sits at the top of the mid-range GPUs (seeing how many GPUs are ahead of it).
But let's say you got your hands on a cheap 4080, 4090, or 7900 XTX. You ain't gonna game at full HD. Heh.
 
Your 3080, in master race terms, sits at the top of the mid-range GPUs (seeing how many GPUs are ahead of it).
But let's say you got your hands on a cheap 4080, 4090, or 7900 XTX. You ain't gonna game at full HD. Heh.
Where are these cheap 4080, 4090, and 7900 XTX cards you speak of? $1000+ is cheap to you?
 

//DEVIL//

Member
Where are these cheap 4080, 4090, and 7900 XTX cards you speak of? $1000+ is cheap to you?
The value of money is different from one person to another, so who am I to say. But I was talking in general, if he can get his hands on a cheap or used one, etc. We sometimes have used 4090s on sale here on the marketplace.
 

Ironbunny

Member
Still rocking my OC'd 8700K and it still seems fine. I'd be curious whether I'd get any more frames out of my 3080 Ti if I just switched my CPU.

Heh. Considering I went from an 8700K (5 GHz) to a 5900X and it already felt like a tangible leap, and now from a 5900X to a 13900K and that also felt like a leap in FPS... so I'd say yes.
 
I'm currently picking out parts for a friend. An i9-13900KS is what he wants, along with his MSI liquid-cooled 4090 and an MSI MEG motherboard with PCIe 5.0 M.2 ports. Good lord am I jealous, but I'm so happy I get to build this monster.
 

lukilladog

Member
Playing racing games at 165 fps/165 Hz on an i3-12100F:


moar-power.gif
 
Well, I'm coming back to Intel from a Ryzen 5800 I got back in 2020 to an i9-13900K on a Gigabyte Z790 mobo that supports DDR4, so I can still make use of my investment in 96GB. I'll move on to the next DDR gen in three years, but for now I can save the money on RAM.
It sounds like you're making an objectively bad decision, tbh. You get a marginal performance improvement in games (at best; I hope you're using this computer for professional money-making), but you'll also have to upgrade to a completely different system when your obviously obsessive ass decides that DDR4 memory isn't enough.
 