
Uninspiring GeForce RTX 4060 Ti performance and sub-US$500 price targets leak

Kataploom

Gold Member
VRAM amount isn't about performance, it's about being able to fit the graphics and textures. Imagine having a fast core that can run a game at high settings/60fps but being forced to reduce the resolution/texture quality because it can't fit in the VRAM. That's the worst bottleneck you can have.
I don't have to imagine it, I had the same issue with the Vega 8 iGPU: core utilization sat around 60% to 80% most of the time while the VRAM was always at 99%. Then I got a 1060 3GB and it was the same story. It's admittedly frustrating knowing I could be getting more performance, but the cards are way too unnecessarily capped. Literal planned obsolescence.
 

Leonidas

Member
These statements crack me up every time.

RDNA3 is "such a letdown" compared to what? It's not a letdown compared to Ada: AMD is using smaller chips (higher yields) on a cheaper process to achieve similar performance at a lower cost.
RDNA3 was a letdown because AMD overpromised and under-delivered with their cherry-picked first-party benchmarks.
 

Dream-Knife

Banned
These cards were released 2 years ago and 8GB was already on the low side back then.

VRAM amount isn't about performance, it's about being able to fit the graphics and textures. Imagine having a fast core that can run a game at high settings/60fps but being forced to reduce the resolution/texture quality because it can't fit in the VRAM. That's the worst bottleneck you can have.

8GB in 2023 is the new "2GB in 2015". I made the mistake of buying the 960 2GB back then because I was told "2GB is enough", but less than a year later I had problems with RE7 and had to play the game at 720p because it wouldn't fit in those 2GB, even though the game would otherwise run at 60fps (the 960's core was far superior to the PS4's GPU). Nobody needs to repeat the mistake I made.

At least the 960 was a cheap card. Not "cheap by today's standards" $500 bullshit, but a properly cheap card at around $250. And now you're telling me it makes sense to pay $500 for a card with what is currently the absolute minimum amount of VRAM, only 2GB above my 1060, a $300 card I bought 5 years ago. You seriously think 8GB is enough now for 1440p gaming, let alone a year from now?
8gb is absolutely fine for 1440p lmao what are you talking about?
 
8gb is absolutely fine for 1440p lmao what are you talking about?

I think their point was 2GB was fine in 2015 for 1080p and as soon as the new gen games were released, it became a huge bottleneck.

I’m in agreement. An underspecced, overpriced low-to-midrange card is a raw deal to be selling in 2023.
 

DaGwaphics

Member
I think it's safe to say we'll get a 10-15% improvement at the lower end of GPUs. Will you still complain about the prices if you get 10-15% more performance for the same price as last-gen GPUs, or last-gen performance for 10-15% less money?

Of course I would still complain about that, just like we all complained about that with the CPUs when we were getting it. :messenger_tears_of_joy:

If performance of a 4060 ($300 - $350) is only at 2070 Super level, I don't see it drawing a lot of interest. At that point sales would be limited to new builds or those with 1000-series cards or older, as I don't see many with a 2000- or 3000-series card jumping for that performance improvement. It would need to be 3060 Ti-level performance at minimum to be even slightly interesting, and that is more in the range of 125% of the 3060 (though in price terms that would be about 15% or so off the 3060 Ti MSRP).

A 30% price reduction relative to last-gen performance would likely be accepted; that would hold 4070 pricing at $499 for 3080-level performance. Moving up one tier in performance gen over gen is nothing spectacular, but I doubt we get anything close to that (they will either not deliver on the performance uplift or shift prices too far upward, like the 4070 for $100 - $150 more).
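Just to make the arithmetic above concrete, here's a quick sketch. The $329/$399 figures are the launch MSRPs of the 3060 and 3060 Ti; the 4060 line is the hypothetical scenario described above, not a real product spec.

```python
# Illustrative perf-per-dollar comparison for the scenario above.
# Relative-performance indices are rough assumptions, not benchmarks.

def perf_per_dollar(rel_perf: float, price: float) -> float:
    """Relative performance per $100 spent."""
    return rel_perf / price * 100

rtx_3060 = perf_per_dollar(1.00, 329)     # baseline, $329 MSRP
rtx_3060_ti = perf_per_dollar(1.25, 399)  # ~125% of a 3060, $399 MSRP
rtx_4060 = perf_per_dollar(1.25, 329)     # hypothetical: 3060 Ti perf at 3060 price

# Same performance for ~17% less money than the 3060 Ti MSRP,
# close to the "15% or so" discount mentioned above.
print(f"{rtx_4060 / rtx_3060_ti:.2f}x the value of a 3060 Ti")
```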
 

Crayon

Member
The A770 is an unsung hero.
Launching with those atrocious drivers killed a lot of the hype, and you know how first impressions are everything.
But that card is an absolute workhorse.
If I were building a midrange PC right now, I would certainly be taking a very serious look at the A770.
Especially with its amazing ray tracing performance.
With Nvidia midrange cards still asking a premium it's a legit option, and Intel can moneyhat more games to support XeSS, so things should be good going forward.

And before people jump at me that the A770 is trash tier for legacy APIs: not anymore.
[image: legacy-API benchmark chart]



I'm so looking forward to the B770.
I really hope it's at least 4070 or 4070 Ti level.

I would totally give them a shot. When I heard they were going with DXVK for the DX11 translation I had a good feeling. DXVK works.
 

iHaunter

Member
Meh, I don't think prices are going down until unsold inventory piles up. AMD might be for ya; at the low-to-mid end most people don't care about RT and upscalers, so they might stand a chance.
Actually inventory issues don't exist anymore. Scalpers can't even sell their cards.

 

DaGwaphics

Member
I think their point was 2GB was fine in 2015 for 1080p and as soon as the new gen games were released, it became a huge bottleneck.

I’m in agreement. An underspecced, overpriced low-to-midrange card is a raw deal to be selling in 2023.

Yeah, I made the same mistake with the 2GB 960, all the big YT channels said that was enough. But, they failed to take into account the generational shift in game design. 8GB should be fine going forward if XSS type settings are your target, but most PC players probably want more than that (I expect you'll need at least 10-12GB to match PS5/XSX settings in relatively short order).
 

Amiga

Member
At this point I'm certainly more interested in what Intel does than in what AMD does.

RDNA3 was such a letdown that I kinda think it will take a long time until AMD becomes competitive again.

And Intel was able to deliver competitive RT performance, unlike AMD.

AMD are missing their window. Intel have been held back by their commitment to their own fabs, but they are about to catch up on smaller process nodes and will take advantage of the CHIPS Act and the policy limiting exports of fabrication tools.
 
Yeah, I made the same mistake with the 2GB 960, all the big YT channels said that was enough. But, they failed to take into account the generational shift in game design. 8GB should be fine going forward if XSS type settings are your target, but most PC players probably want more than that (I expect you'll need at least 10-12GB to match PS5/XSX settings in relatively short order).

Yup, I always cringe at these internet “rules of thumb” that get parroted over and over. Reddit and YouTube are the worst places for this. Nobody actually thinks about the why, and it leads to some pretty awful advice.


Having said that, my personal rule of thumb is simply to buy value. That takes many different forms, but I've found that with video cards and smartphones, a large RAM bump is a very good time to upgrade if you want the hardware to last. It's a hard limitation that games are usually designed around, and the performance hit when swapping is always very noticeable. Since many games are designed with the PS5 in mind, that's the target I'd aim for: 16GB of unified memory, meaning up to 16GB of high-speed GDDR6 in play for devs to work with. Grabbing half of that would be a huge mistake, and even 12GB might leave you wanting at some point in the future, considering how many ports run worse on PC.


I have never been disappointed when going this route, here are my historical phone and GPU upgrade paths:

iPhone 6s (2GB) to iPhone 11 (4GB). They just bumped to 6GB for the 14, but I'm gonna hold out a little longer with my 11; they tend to bump the Pro/Max models higher than the base due to the camera, so I'm looking for 8GB there.


Likewise, I went from a GTX 460 (1GB) to a 7870 (2GB), then briefly to a GTX 970 (4GB, kinda) until I was able to return it over the 3.5GB fiasco. Spent another year with my 7870 in between and made my biggest leap ever with the GTX 1070 (8GB), which I've been running since.


Anyway, I sure wouldn't be buying another 8GB card now, and in fact I'd probably wait for a higher-end card to hit 16GB unless you can get a great deal on a 12GB RTX card.
 

ToTTenTranz

Banned
RDNA3 was a letdown because AMD overpromised and under-delivered with their cherry picked first party benchmarks.
Their slides promised "up to 1.7x the performance over the RX 6950XT at 4K" which is what the card achieved.

[images: AMD first-party performance slides]



It's nothing short of hilarious seeing Nvidia fans coping with AMD's hardware hitting within the ballpark of their promised performance numbers.
There's up to 15% difference in reviewer results due to different system setups, drivers and game versions.
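The "within the ballpark" argument can be sketched numerically. The fps baseline below is an assumed number purely for illustration; only the 1.7x and 15% figures come from the claims discussed above.

```python
# If a vendor claims "up to 1.7x" and reviewer results can vary by up
# to 15%, what range of measurements is still consistent with the claim?
# The 60 fps baseline is an assumed number, purely for illustration.

baseline_fps = 60.0      # hypothetical RX 6950XT result at 4K
claimed_uplift = 1.7     # "up to 1.7x" from the vendor slide
review_spread = 0.15     # up to 15% difference between reviewers

claimed_fps = baseline_fps * claimed_uplift   # 102 fps
low = claimed_fps * (1 - review_spread)       # ~86.7 fps
high = claimed_fps * (1 + review_spread)      # ~117.3 fps

# Any review landing in this band is within the ballpark of the slide
# once setup, driver, and game-version variance is accounted for.
print(f"consistent range: {low:.1f} to {high:.1f} fps")
```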



You want to complain about blatant lies in GPU performance numbers? Why don't you complain about this instead?

From Nvidia's claims:
https://i0.wp.com/www.hardware.com.br/static/wp/2023/01/03/1-1080.ba240adc.jpg?ssl=1
https://i0.wp.com/www.hardware.com.br/static/wp/2023/01/03/131989-rtx-4070ti-perf-1.png?ssl=1




To reality:
[images: independent review benchmark charts]




At least AMD clearly states in its slides the conditions under which the results were achieved so they can be replicated. Nvidia makes ridiculous statements like "it's faster than the 3090 Ti" which are then proven lies no matter what conditions they made up.
 

MikeM

Member
At least AMD clearly states in its slides the conditions under which the results were achieved so they can be replicated. Nvidia makes ridiculous statements like "it's faster than the 3090 Ti" which are then proven lies no matter what conditions they made up.
“3x faster than the 3090ti”

Lol Nvidia getting away with deceptive marketing.
 