
How bad is 8GB of VRAM in 2023?

Just turn down settings and game a happy man. Ya don't need raytraced ballsack hairs to enjoy a game, and Ultra is a useless preset 99% of the time in the short term, only there for futureproofing. This isn't 2006 anymore; low/medium settings now look about 80% as good as ultra for a 200% performance gain.
Speak for yourself. I demand only the best in ballsack hair representation at all times.
 

mxbison

Member
I've had no issues with my 3070.

My setup isn't meant for 4K anyway and worked just fine at 1440p so far. If I bought a new card now I'd obviously get more than 8GB though.

If it can run Cyberpunk on ultra + RT then it should run other games too. Dog shit PC ports seem like the main issue here.
 

Roxkis_ii

Member
Man, I've been rocking my laptop with a 3060 and 6 gigs of VRAM, playing Cyberpunk at 1440p with medium textures and getting around 55 fps.

I even get pretty good frame rates in Forza Horizon 5, even though I get a VRAM warning 2 minutes after launching the game lol, still runs smooth for the most part.

I thank God for consoles, or else I wouldn't get to play new games at a decent framerate.
 

SF Kosmo

Al Jazeera Special Reporter
Calling people names because I hit a nerve isn't the proper way to respond.
What name did I call you? I simply pointed out that consoles can't use the full 16GB of memory as VRAM the way you were claiming, and that 12GB of VRAM and 16GB of system RAM was going to be far more overhead than a console has.
 

Bojji

Member
8GB is fine if you play the games PC gamers are actually playing, other than a few ports that struggle on max settings, from games that nobody on PC even knows exist.

If PC gamers didn't care about those games, Sony wouldn't port them. Like someone said, for people playing the games you mentioned, even some old-ass GPU will be enough, so they don't care about new GPUs, just like they don't care about normal games.

I still think it was 100% unacceptable for Nvidia to be releasing 8GB GPUs at these prices in 2020, never mind in 2023.

But does the difference between 8GB and 16GB truly matter in the grander scheme? Not really. I can't think of a single instance in the past 30 years where a mere doubling of memory would be counted as 'a big difference'.

It's a massive difference, and it will be like that in many (most?) games in the future.

I've had no issues with my 3070.

My setup isn't meant for 4K anyway and worked just fine at 1440p so far. If I bought a new card now I'd obviously get more than 8GB though.

If it can run Cyberpunk on ultra + RT then it should run other games too. Dog shit PC ports seem like the main issue here.

Cyberpunk is old (and it will be patched this year with higher demands); the game has low-res textures compared to other games in the test.

I love this talk about a "few bad ports"... The majority of PC ports in history were "bad ports". If you think the number of games requiring more than 8GB to function properly will stop at R&C, you are mistaken; fully current-gen games will be VRAM-limited on 8GB one way or another.

Lowering settings is always an option, but it shouldn't be needed on new cards from Nvidia released in 2023 for $400!
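For a sense of why texture quality is the setting that moves VRAM the most, here's a rough back-of-envelope sketch (my own illustration, not anything from the benchmarks discussed above). The 4/3 mip-chain factor and the BC7 1-byte-per-pixel rate are standard rules of thumb; actual engine usage varies.

```python
# Back-of-envelope VRAM cost of a single texture, uncompressed vs
# block-compressed. Illustrative only; real engines stream and pool memory.

def texture_mib(width, height, bytes_per_pixel, mip_chain=True):
    """Approximate size of one texture in MiB; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 * 1024)

# One 4K RGBA8 texture with mips, uncompressed: ~85 MiB
uncompressed = texture_mib(4096, 4096, 4)

# Same texture as BC7 (1 byte per pixel): ~21 MiB
bc7 = texture_mib(4096, 4096, 1)

print(f"4K RGBA8 + mips: {uncompressed:.1f} MiB")
print(f"4K BC7   + mips: {bc7:.1f} MiB")
```

Even compressed, a few hundred unique 4K textures resident at once adds up to several GB, which is why "medium textures" is usually the first lever people pull on an 8GB card.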
 

raduque

Member
You think it's unacceptable that GPUs ship with 8gb vram in 2023?

I think it's unacceptable that games are being poorly developed enough to require 16gb vram.

90% of gamers have 16gb system memory.

Devs need to fix their goddamned games to be under 8gb vram.
 

winjer

Gold Member
You think it's unacceptable that GPUs ship with 8gb vram in 2023?

I think it's unacceptable that games are being poorly developed enough to require 16gb vram.

90% of gamers have 16gb system memory.

Devs need to fix their goddamned games to be under 8gb vram.

That is a stupid argument. By that logic we would still be using 640K.
Tech evolves. People upgrade their machines. And the cycle repeats.
 

raduque

Member
That is a stupid argument. By that logic we would still be using 640K.
Tech evolves. People upgrade their machines. And the cycle repeats.
And because of your defensive logic, we'll probably have games requiring 32GB of VRAM by mid next year. GPUs will be $3000 MSRP at the high end, and the people with more money than sense will tell people they can't play games on, and shouldn't buy, GPUs with 16GB of VRAM.

It's just ridiculous, and does nobody good except nVidia's shareholders.
 

winjer

Gold Member
And because of your defensive logic, we'll probably have games requiring 32GB of VRAM by mid next year. GPUs will be $3000 MSRP at the high end, and the people with more money than sense will tell people they can't play games on, and shouldn't buy, GPUs with 16GB of VRAM.

It's just ridiculous, and does nobody good except nVidia's shareholders.

Oh no, tech evolving like it did for the past 5 decades. Oh, the tragedy.
 

raduque

Member
NVidia doesn't sell RAM.
And tech has always been evolving, even before nvidia existed.
And NVidia doesn't make CPUs, memory, motherboards, SSDs, monitors, etc.
No, but nVidia has massively overpriced products, and they only care about revenue, not providing a good, affordable product.

Tech is always evolving, but I feel like GPU tech is being pushed further, and priced higher, than it needs to be, and it's allowed developers to get sloppy and lazy.

CPUs, motherboards, SSDs, etc. are fine. My Ryzen 3600 and 64GB of DDR4-3200 on a B350 motherboard are still more than enough for all but the absolute worst mess of a port. But my RTX 2080 8GB is literal trash to people like you.

Don't even get me started on unnecessary shit like 4k and 8k monitors.
 

Hugare

Member
Don’t mobile rtx3060 laptops have 6GB of vram? I used to have one and it had 6GB on it.
https://www.techpowerup.com/gpu-specs/geforce-rtx-3060-mobile.c3757

It’s a shame that mobile gpus are also shipping with lower vram amounts compared to desktops. Mobile 4070 only has 8GB? At least you get 12GB on the desktop version.

Anyway, 8GB was a nice jump for Nvidia when they included it on their 1070/1080 cards. A jump from the 4GB 980/6GB 980ti. Since then (besides the 11GB 1080ti) they’ve really stagnated on vram size.

Definitely should be avoided if possible. I nearly went for a 4070 Ti, but the 12GB of VRAM really put me off the card. (Coming from a 10GB 3080)
Oh yeah, my bad. It has 6GB.

But my point stands: even with 6GB, sticking to 1080p (even 1440p most of the time) and with RT on low/off, you can survive the gen.

I can play Cyberpunk with RT on Psycho at a stable 30 with DLSS on Quality.
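Resolution really does buy you VRAM headroom, because every full-screen render target scales with pixel count. A quick sketch of the render-target cost alone, assuming a hypothetical deferred setup of five 4-byte-per-pixel buffers (the buffer count is my assumption for illustration; real engines use many more intermediates):

```python
# Sketch: VRAM cost of the main full-screen render targets at each
# resolution. Assumes 5 hypothetical buffers at 4 bytes/pixel (e.g. a
# small G-buffer plus depth); real pipelines differ, but all scale the same.

def render_targets_mib(width, height, buffers=5, bytes_per_pixel=4):
    return width * height * buffers * bytes_per_pixel / (1024 * 1024)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {render_targets_mib(w, h):.0f} MiB")
```

4K costs 4x what 1080p does for every such buffer, which is part of why dropping resolution (or rendering internally at a lower resolution via DLSS, as above) stretches a 6GB card.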
 

Bojji

Member
No, but nVidia has massively overpriced products, and they only care about revenue, not providing a good, affordable product.

Tech is always evolving, but I feel like GPU tech is being pushed further, and priced higher, than it needs to be, and it's allowed developers to get sloppy and lazy.

CPUs, motherboards, SSDs, etc. are fine. My Ryzen 3600 and 64GB of DDR4-3200 on a B350 motherboard are still more than enough for all but the absolute worst mess of a port. But my RTX 2080 8GB is literal trash to people like you.

Don't even get me started on unnecessary shit like 4k and 8k monitors.


You can call every game a trash port, but for more demanding games the R5 3600 is absolutely not enough... Same goes for the 2080.

Tech evolves; with every new console generation, games get more demanding. You can't expect developers to support 8GB cards in 2023; the first mainstream (not Titan etc.) GPUs with that amount are from 2015!

I'm sorry, but your GPU is 5 years old.

Same thing happened with dual-core CPUs, 2GB VRAM cards, 4GB VRAM cards, 4-core CPUs, etc.
 