
RTX 4060 and RTX 4060 Ti Announced - Coming May 24 for $299

HeisenbergFX4

Gold Member


Hardware threads on GAF. Why do I even bother? Might as well tape 16GB to a worm and GAF will still love it.
Get that 8GB card, try running higher settings, and see what happens with today's games.
 

FingerBang

Member
Sadly they downgraded the chip for the xx60; it now uses the 107 die instead of the 106.
They did that to all the cards except the 4090, while also raising prices. The joke writes itself.

AMD did the same, btw, although the price increase was smaller for their top end.
 

ToTTenTranz

Banned
Currently the best option seems to be the discounted RX 6800. The 4060 Ti 16GB vs RX 6800 will be an interesting matchup at the $500 mark, I feel. Not to mention the possible RX 7700 XT, which should be revealed in the next few months.
If the RX 7700 XT uses Navi 33 then the RX 6800 will definitely be a lot faster still.
Though if it uses cut-down Navi 32 with 6 channels (12GB VRAM) then it might be a bit better.
 

kiphalfton

Member
An 8GB 4060 is the 3GB 1060 all over again. Just stay away if you don't want to regret it a little later. The 8GB Ti is just puzzling: who is this card really for? It shouldn't exist, as it makes no sense. I imagine it's just there to fill the price gap.

If you ignore the 8GB 4060 Ti variant, there's no fucking way a $200 difference between the RTX 4060 and the RTX 4060 Ti 16GB variant is justified.

But this is Nvidia this generation, running wild with pricing. I mean, ffs, there's a $200 difference between the RTX 4070 and RTX 4070 Ti, and then there's a $400 gap between the RTX 4070 Ti and the RTX 4080. So I guess I shouldn't be surprised that pricing on the lower end sucks as well.

Edit:
For reference:
- There was a $300 difference between the GTX 560 and GTX 580.
- There was a $270 difference between the GTX 660 and GTX 680.
- There was a $400 difference between the GTX 760 and GTX 780.
- There was a $350 difference between the GTX 960 and GTX 980.
- There was a $400 difference between the GTX 1060 and GTX 1080.
- There was a $350 difference between the RTX 2060 and RTX 2080.
- There was a $370 difference between the RTX 3060 and RTX 3080.

And somehow now there is a $900 difference between the RTX 4060 and the RTX 4080. Makes perfect sense.
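For what it's worth, the gaps above fall straight out of the launch MSRPs. A quick sketch; the prices are launch/Founders figures as commonly reported, so treat them as approximate:

```python
# Launch MSRPs in USD (launch/Founders pricing as commonly reported; approximate).
MSRPS = {
    # generation: (x60 card price, x80 card price)
    "GTX 500":  (199, 499),   # GTX 560 vs GTX 580
    "GTX 600":  (229, 499),   # GTX 660 vs GTX 680
    "GTX 700":  (249, 649),   # GTX 760 vs GTX 780
    "GTX 900":  (199, 549),   # GTX 960 vs GTX 980
    "GTX 1000": (299, 699),   # GTX 1060 6GB vs GTX 1080 FE
    "RTX 2000": (349, 699),   # RTX 2060 vs RTX 2080
    "RTX 3000": (329, 699),   # RTX 3060 vs RTX 3080
    "RTX 4000": (299, 1199),  # RTX 4060 vs RTX 4080
}

# Print the x60-to-x80 price gap for each generation.
for gen, (x60, x80) in MSRPS.items():
    print(f"{gen}: ${x80 - x60} gap")
```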
 
Last edited:

XesqueVara

Member
If the RX 7700 XT uses Navi 33 then the RX 6800 will definitely be a lot faster still.
Though if it uses cut-down Navi 32 with 6 channels (12GB VRAM) then it might be a bit better.
Both the 7700 XT and the 7800 XT use Navi 32, but AMD is going to wait for RDNA 2 cards to sell through before launching them.
 
Last edited:

XesqueVara

Member
The 4060 being $300 might force AMD to launch the 7600 at $250; it helps AMD that the 7600 is cheaper to make than the 6600/XT.
 
Yeah, I might get back into the PC shit when these come out. Hey, spend $300 on this or get an Xbox Series X 🤔 Any comparisons out there?
 
Last edited:
This is barely better than a 3060 and you all think this is a great deal. It would be a good deal if it had the usual performance uplift from last gen, but it's what, a 15% uplift? Are you freaking kidding me? You can talk about frame generation all you want, but that doesn't count when you still have the same lag (even more so, actually).

Nvidia suckered you... and here I go, back to gaming on my 4090!
Not everyone has 3 grand to spend on a GPU! Yeah, that's how much it costs here in Canada.
 

Bojji

Member
Jesus Christ. I'm not defending 8GB. VRAM isn't the only thing a GPU needs for performance. How the fuck is everyone so oblivious? If you slap 24GB of VRAM on a 970, it'll still be shit.

Of course it isn't; 8GB of memory didn't help the Radeon 390X that much in 2015.

The problem is that an otherwise still quite fucking good GPU (the 3070) is completely fucked in modern games. This GPU with 16 (or 12) gigs of RAM would still be very competent. The fact that Nvidia is still selling 8GB GPUs in this ($400) price range is fucking ridiculous.
 

lukilladog

Member
Imagine buying a new card in 2023 and getting the same stutters, game warnings, and crappy textures as my 3050 in newer games. 8GB is legacy now, even for the low end at 900p or 720p (which is the actual rendering resolution with DLSS at 1080p output), and forget about activating RT when you're already choking on VRAM. What a clown company Nvidia has become; I wouldn't even consider an 8GB 4050 for $150.
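For context on that 720p figure: DLSS 2's per-axis render-scale ratios make the internal resolution easy to work out. A quick sketch using the standard published preset ratios:

```python
# Standard DLSS 2 per-axis render-scale factors for each quality preset.
DLSS_SCALE = {
    "Quality":           2 / 3,  # ~0.667x per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, preset):
    """Internal render resolution for a given output resolution and preset."""
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

# At a 1080p output, DLSS Quality renders internally at 720p:
print(internal_res(1920, 1080, "Quality"))      # (1280, 720)
print(internal_res(1920, 1080, "Performance"))  # (960, 540)
```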
 
Last edited:

manfestival

Member
Nvidia is really out here bamboozling people out of their money. AMD was positioned to do so much better and they dropped the ball so badly, lol. Intel, well...
 

lukilladog

Member
An 8GB 4060 will be shit down the road in a year or two.

It is now; even on a 3050 you cannot use console-quality textures in newer games... for example, The Last of Us looks a bit like a cartoon, and in Hogwarts Legacy you cannot use RT shadows because it runs out of VRAM. 8GB is not enough for console textures in the Resident Evil remakes or the likes of Dead Space. 8GB is legacy now, even at the low end.
 
Last edited:

CrustyBritches

Gold Member
The xx60 on Pascal and Turing was as powerful as the xx80 card from the previous gen. When we got to Ampere, the 3060 was only as powerful as the 2070. The leaked Geekbench 5 CUDA score for the 4060 Ti is 146170. The 3060 Ti scores 130000-140000. 3070 is ~150000. Obviously we'll have to wait for reviews, but in this test the 4060 Ti isn't even hitting 3070 level, let alone the 4060.

If that ends up being reflected in benchmarks, then we went from xx60 being as powerful as the xx80 from the previous gen, to the xx60 being less powerful than xx70, and possibly even the xx60 Ti from the previous gen. Don't be fooled by the naming convention and price point, this is a 4050 renamed to a 4060.

This isn't even getting into the whole VRAM deal. The 8GB 4060 Ti is a definite no-go. If you're cool with med-high settings and want a new card with warranty, then the 4060 might be a consideration. My take is that this is more trash from Nvidia.
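If the leaked numbers pan out, the relative positioning is simple arithmetic. A sketch using the scores quoted above, with 135000 assumed as the midpoint of the 3060 Ti's range:

```python
# Geekbench 5 CUDA scores as quoted above. The 4060 Ti figure is a leak,
# and 135000 is an assumed midpoint of the 130000-140000 range.
scores = {
    "RTX 4060 Ti": 146170,
    "RTX 3060 Ti": 135000,
    "RTX 3070":    150000,
}

def relative(a, b):
    """Score of card `a` as a fraction of card `b`."""
    return scores[a] / scores[b]

print(f"4060 Ti vs 3060 Ti: {relative('RTX 4060 Ti', 'RTX 3060 Ti') - 1:+.1%}")  # ~ +8.3%
print(f"4060 Ti vs 3070:    {relative('RTX 4060 Ti', 'RTX 3070') - 1:+.1%}")     # ~ -2.6%
```

So on this one (synthetic) data point, the 4060 Ti lands a single-digit percentage above the 3060 Ti and slightly below the 3070.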
 

FireFly

Member
The xx60 on Pascal and Turing was as powerful as the xx80 card from the previous gen. When we got to Ampere, the 3060 was only as powerful as the 2070. The leaked Geekbench 5 CUDA score for the 4060 Ti is 146170. The 3060 Ti scores 130000-140000. 3070 is ~150000. Obviously we'll have to wait for reviews, but in this test the 4060 Ti isn't even hitting 3070 level, let alone the 4060.

If that ends up being reflected in benchmarks, then we went from xx60 being as powerful as the xx80 from the previous gen, to the xx60 being less powerful than xx70, and possibly even the xx60 Ti from the previous gen. Don't be fooled by the naming convention and price point, this is a 4050 renamed to a 4060.

This isn't even getting into the whole VRAM deal. The 8GB 4060 Ti is a definite no-go. If you're cool with med-high settings and want a new card with warranty, then the 4060 might be a consideration. My take is that this is more trash from Nvidia.
Nvidia's official slides say it's 15% faster than the 3060 Ti, which would put it almost exactly on par with the 3070. I assume it will have some performance regressions and also a few titles where it does really well, like the 4070 vs the 3080.
 
Last edited:

LiquidMetal14

hide your water-based mammals
This is the best move they've made with the Ada cards.

I criticize when necessary, but this is a step in the right direction.
 

Reallink

Member
If you ignore the 8GB 4060 Ti variant, there's no fucking way a $200 difference between the RTX 4060 and the RTX 4060 Ti 16GB variant is justified.

But this is Nvidia this generation, running wild with pricing. I mean, ffs, there's a $200 difference between the RTX 4070 and RTX 4070 Ti, and then there's a $400 gap between the RTX 4070 Ti and the RTX 4080. So I guess I shouldn't be surprised that pricing on the lower end sucks as well.

Edit:
For reference:
- There was a $300 difference between the GTX 560 and GTX 580.
- There was a $270 difference between the GTX 660 and GTX 680.
- There was a $400 difference between the GTX 760 and GTX 780.
- There was a $350 difference between the GTX 960 and GTX 980.
- There was a $400 difference between the GTX 1060 and GTX 1080.
- There was a $350 difference between the RTX 2060 and RTX 2080.
- There was a $370 difference between the RTX 3060 and RTX 3080.

And somehow now there is a $900 difference between the RTX 4060 and the RTX 4080. Makes perfect sense.

Even more astonishing than the price is that most of the x60s you listed were only 10-20% slower than the same generation's x80. Some x60s could be overclocked to within spitting distance of the x80, like the GTX 460 vs. the GTX 480, for example. It's just "mUH iNflAyShUn" according to the braindead pundits who masquerade as internet experts.
 
Last edited:

DaGwaphics

Member
Whoever was hoping for RTX 3080 performance for <$500 is in for some disappointment. These cards don't even reach RTX 3070 performance numbers.

I don't see how there were any of those people left. The RTX 3080-equivalent part in the new series is the 4070 at $600. The 4060 line had to be a step down from that.
 

ToTTenTranz

Banned
I don't see how there were any of those people left. The RTX 3080-equivalent part in the new series is the 4070 at $600. The 4060 line had to be a step down from that.

There are no people left... who saw three generations of new 60-series cards getting the same performance as the previous 80-series card, but at a much lower starting price?
Your argument is that this expectation somehow didn't exist?


[Chart: Relative Performance at 2560x1440 (perfrel_2560_1440.png)]
 

DaGwaphics

Member
There are no people left... who saw three generations of new 60-series cards getting the same performance as the previous 80-series card, but at a much lower starting price?
Your argument is that this expectation somehow didn't exist?


[Chart: Relative Performance at 2560x1440 (perfrel_2560_1440.png)]

I'm not criticizing people for wanting that to be the case; however, the day the 4070 embargo lifted and real-world performance was known, that dream died. If the 4070 had been 9 or 10% better than the 3080, maybe you could make the argument that a 3080-ish 4060 Ti might slide in there (sometimes the 60 Ti and the 70 have been relatively close).
 

Silver Wattle

Gold Member
Jesus Christ. I'm not defending 8GB. VRAM isn't the only thing a GPU needs for performance. How the fuck is everyone so oblivious? If you slap 24GB of VRAM on a 970, it'll still be shit.
You're trying to spread the bullshit idea that slower GPUs can't make use of more RAM, which is completely at odds with modern rendering.

If you give modern games more RAM overhead, they will use it.
 

Kataploom

Gold Member
I posted it on another thread, and it was news to someone there.

Personally, I think collusion would explain some of the seeming inability/unwillingness of AMD to undercut NVIDIA and gain marketshare with a banger card for cheapish. Nothing's proven, of course.
You know, I've thought about that for a while now... It's like AMD seems to be steps behind Nvidia even when Nvidia messes up badly; if Nvidia takes some steps back, AMD takes the same number of steps back, never taking full advantage of their position...

When they announced their frame generation tech, I felt like my conspiracy theories were more than mental gymnastics.

AMD tends to come in behind Nvidia most of the time, but not so far behind that it looks like they're just reacting to Nvidia releasing some new tech; it's more as if they were both working on the same thing for some time and arrange their announcements as they see fit.

Once these cards release, I'm curious to see how AMD will act, since it's something I can't unsee anymore 😂
 

tkscz

Member
An 8GB 4060 is the 3GB 1060 all over again. Just stay away if you don't want to regret it a little later. The 8GB Ti is just puzzling: who is this card really for? It shouldn't exist, as it makes no sense. I imagine it's just there to fill the price gap.

I keep seeing this, but it's always in terms of max graphics settings. That 1060 3GB still runs new games at lower settings. My son has my old desktop, which runs a Radeon R9 380X 3GB and still plays newer games, just on low settings.

Cheaper GPUs are usually for people who don't care about running a game at max, ultra, or the highest graphics settings. Something like the 4060 Ti 8GB is for high or medium settings, which should last for a few years. These people aren't trying to hit ultra at 1440p.
 

drotahorror

Member
Might finally upgrade my 1070.

The 6700K worries me though.
I'm still on a 6700K but have a 3060 Ti. The CPU is definitely showing its age. I have all new parts except the mobo+CPU, and at this point I'm waiting till Meteor Lake or even Arrow Lake. I'll probably just grab a 5060 Ti or a 70 at that point as well. Nvidia shit the bed with the 40-series, minus the 80s and 90s.


Also, charging $100 for 8GB of VRAM, lol. FUCK YOU, Nvidia, hah.
 
Last edited:

hlm666

Member
It is now; even on a 3050 you cannot use console-quality textures in newer games... for example, The Last of Us looks a bit like a cartoon, and in Hogwarts Legacy you cannot use RT shadows because it runs out of VRAM. 8GB is not enough for console textures in the Resident Evil remakes or the likes of Dead Space. 8GB is legacy now, even at the low end.
You should maybe go look at what the last patch for TLOU did for VRAM use; 8GB didn't seem to be the problem, otherwise how did a software patch fix it? Pointing at shitty unpatched ports is like saying consoles are gonna be 720p in next-gen games because Jedi Survivor drops that low.
 

lukilladog

Member
You should maybe go look at what the last patch for TLOU did for VRAM use; 8GB didn't seem to be the problem, otherwise how did a software patch fix it? Pointing at shitty unpatched ports is like saying consoles are gonna be 720p in next-gen games because Jedi Survivor drops that low.

The patch fixed allocation and streaming for 8GB cards, but texturing still looks crappy. And even on good ports like RE4, 8GB makes texturing look a bit crappy. 8GB is just EOL.
 

lukilladog

Member
So is this good or bad lol

I see some people at each other's throats over 8GB of RAM.

8GB is way off from even matching the consoles' texturing in newer games; 1080p or 900p, it doesn't matter. And it gets worse: a card that is already choking on VRAM will become a disaster if you want to enable RT effects on top, as those eat VRAM too... just like DLSS 3 frame interpolation does, which makes the card absurd. You won't be able to use its features in next-gen games... it happened with the RTX 2060 6GB and it will happen again.
 
Last edited:
So is this good or bad lol

I see some people at each other's throats over 8GB of RAM.
The key point is that current-gen consoles have about 11GB to work with. So any current-gen-only PC ports will likely require similar (if not more) VRAM to get comparable texture quality and performance. If a port is well optimized (an endangered species at this point), you might see 8GB of VRAM still being adequate, but in most cases you'd see either (1) loads of hitching/stutter, or (2) lower texture quality on 8GB cards vs the console editions.

People buying 8GB cards in 2023 are basically hoping that PC porting gets much better, or they're content playing older titles (including cross-gen PC ports), or they aren't aware of the issues.
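As a back-of-the-envelope illustration: the ~11GB budget is the figure from the post, while the CPU-side split below is an assumed, illustrative number (how much of a console's unified pool a PC port keeps in system RAM instead of VRAM), not measured data:

```python
# Back-of-the-envelope VRAM check for a straight console-to-PC port.
CONSOLE_GAME_BUDGET_GB = 11.0  # unified memory available to the game (per the post)
CPU_SIDE_SHARE_GB = 3.0        # assumption; varies per game and port

def pc_vram_needed(console_budget_gb, cpu_side_gb):
    """Rough VRAM a port needs for console-equivalent assets and settings."""
    return console_budget_gb - cpu_side_gb

need = pc_vram_needed(CONSOLE_GAME_BUDGET_GB, CPU_SIDE_SHARE_GB)
print(f"~{need:.0f} GB of VRAM needed; an 8 GB card has no headroom "
      "left for RT or frame-generation overhead.")
```

Even with that generous split, an 8GB card lands exactly at the estimate with nothing to spare, which is consistent with the hitching and texture downgrades described above.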
 

lukilladog

Member
The key point is that current-gen consoles have about 11GB to work with. So any current-gen-only PC ports will likely require similar (if not more) VRAM to get comparable texture quality and performance. If a port is well optimized (an endangered species at this point), you might see 8GB of VRAM still being adequate, but in most cases you'd see either (1) loads of hitching/stutter, or (2) lower texture quality on 8GB cards vs the console editions.

People buying 8GB cards in 2023 are basically hoping that PC porting gets much better, or they're content playing older titles (including cross-gen PC ports), or they aren't aware of the issues.

Yeah, pretty much 6500 XT buyers with more money.

 
Last edited: