
NVIDIA Allegedly Begins Testing Its Fastest Next-Gen GPU, The AD102, For GeForce RTX 4090 Graphics Card, Features 24 Gbps GDDR6X Memory

OZ9000

Banned
1440p 144+ looks and feels better IMO than 4k60 on a desktop monitor.
My PC is hooked up to a 4K TV so 4K60 is my target. Happy enough at this pleb framerate lol. I find graphics look worse when playing games on a monitor.

I think 144 FPS would require you to game at 1080p and/or have a monster rig.
 
Last edited:

yamaci17

Member
My PC is hooked up to a 4K TV so 4K60 is my target. Happy enough at this pleb framerate lol. I find graphics look worse when playing games on a monitor.

I think 144 FPS would require you to game at 1080p and/or have a monster rig.

If you're not exclusively targeting a perfectly flawless 144 fps, you shouldn't need to drop to 1080p.

Anything between 85 and 130 fps is smooth and enjoyable on a 144 Hz VRR screen.

It's been 4 years since I got my 144 Hz VRR screen, and I've never felt the need for a flawless 144 fps in any game. If I can get something in the 70-100 range I'm usually happy; if it pushes above 100, even better. The latest game I played was God of War: at 4K with DLSS Performance I was getting 80-110 fps with optimized settings (no ultra, but a mix of high and original). It was a very fun experience. (By the way, 4K + DLSS Performance provides higher fidelity than native 1440p, but that's a bit hard to explain.)

Targeting 144 fps in that game would be brutal: 1080p, everything low... and even then it probably wouldn't be possible. Too much sacrifice for not much gain.
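(For anyone wondering what "4K DLSS Performance" actually renders internally, here's a rough pixel-count comparison; the per-axis scale factors are the commonly cited DLSS presets, so treat the exact numbers as approximate rather than official.)

```python
# Rough pixel-count comparison: 4K DLSS presets vs. native 1440p.
# Scale factors are the commonly cited per-axis DLSS ratios (approximate).
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}
out_w, out_h = 3840, 2160  # 4K output resolution

for name, scale in presets.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"4K DLSS {name}: renders {w}x{h} (~{w * h / 1e6:.1f} MP), outputs 4K")

print(f"Native 1440p: renders 2560x1440 (~{2560 * 1440 / 1e6:.1f} MP)")
# 4K DLSS Performance renders fewer pixels than native 1440p but
# reconstructs to a 4K output, which is why the comparison feels odd.
```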
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I am not advocating buying a 1080 Ti! Just pointing out the performance tier where the 3060 sits.

Moreover, the discussion is about memory usage in relation to resolution. Yes, you can use DLSS to play at 1440p on a 3060 with some ray tracing effects. But the same applies just as well to a 3070, where you can expect to see better ray tracing effects. DLSS diminishes the need for more VRAM by pushing down the render resolution, while delivering similar quality to native rendering.
A performance deficit, especially one as large as the gap between the 3060 and 3070, can't be made up by VRAM.
It's such an odd conversation.
This cat is saying he'll turn down the settings but keep ultra textures... as if the 3070 can't turn down settings.
Keeping ultra textures also seems kind of silly on a memory-bandwidth-limited card, especially if you're already pushing higher resolutions.
There's no universe where the 3060 beats the 3070, even with that extra VRAM.

I think people forget how big the gap between the 3060 and 3070 actually is.
The extra VRAM will never factor into that "battle".


On Topic

Nvidia are taking the piss if Ada Lovelace eats even more power.
700 watts in gaming workloads?
Power.png
 
Last edited:

CuNi

Member
A performance deficit, especially one as large as the gap between the 3060 and 3070, can't be made up by VRAM.
It's such an odd conversation.
This cat is saying he'll turn down the settings but keep ultra textures... as if the 3070 can't turn down settings.
Keeping ultra textures also seems kind of silly on a memory-bandwidth-limited card, especially if you're already pushing higher resolutions.
There's no universe where the 3060 beats the 3070, even with that extra VRAM.

I think people forget how big the gap between the 3060 and 3070 actually is.
The extra VRAM will never factor into that "battle".


On Topic

Nvidia are taking the piss if Ada Lovelace eats even more power.
700 watts in gaming workloads?
Power.png

At this point, everyone not undervolting their GPU is either filthy rich, lives in a cold climate and has actual use for the heat, or lives in a country with cheap electricity.
Change my mind.
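A quick back-of-the-envelope sketch of what an undervolt can actually save; every number here (draw, hours, tariff) is an assumption you'd swap for your own:

```python
# Rough yearly savings from undervolting - all inputs are assumptions.
stock_watts = 350        # assumed average gaming draw at stock settings
undervolted_watts = 270  # assumed draw after an undervolt at ~the same fps
hours_per_day = 3        # assumed daily gaming time
price_per_kwh = 0.30     # assumed electricity price in EUR/kWh

saved_kwh = (stock_watts - undervolted_watts) / 1000 * hours_per_day * 365
print(f"Energy saved: ~{saved_kwh:.0f} kWh per year")
print(f"Money saved:  ~{saved_kwh * price_per_kwh:.0f} EUR per year")
```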

No it isn't. The 3090 is already struggling in next-gen stuff.

And the point of this comment is? VFX is right. Just because the 3090 "struggles" with next-gen stuff doesn't mean a GPU release every 2 years isn't fast evolution.
You do realize that games will ALWAYS max out hardware, right? If aliens dropped a GPU that is 1,000x more performant than the 3090 is currently, do you think devs wouldn't create content that would also max out that card? By your logic, that card would also be "struggling" with next-gen stuff lol.
 
Last edited:

Hezekiah

Banned
LOLNO, you're living in Candyland if you believe selling out $1000+ GPUs within 5 seconds for 2 years hasn't altered their 4xxx product stack. Take the "historical norm" expectation and shift it down a tier. The AD103 is the 4080. Full AD102s will be either a Titan or a 4090, while cut-down AD102s will be either a 4090 (under the Titan) or a 4080 Ti (under the 4090). There is no reality where Nvidia are going to sell a card twice as fast as a 3080 for "$699", or even "$799" or "$899". The full AD102 card will be $2000; the cut-down AD102 card will be $1500. The AD103 Founders Edition will likely be unveiled at "$799" with 10 units available, while the remaining 99.9% of inventory are AIB cards priced between $900 and $1000. And they will sell every card they can make while posting record profits.
Agreed. Nvidia will increase the price. History shows they're greedy af, plus the demand is there.
 
Last edited:

LiquidMetal14

hide your water-based mammals
I think the enthusiasts in here will agree that powering these new cards will require the newest CPUs to keep performance at its peak (duh). I have a 5900X and put this rig together 2 years ago (3090), and I don't feel like upgrading just the GPU. Plus, with AMD, I'll hopefully be able to upgrade CPUs for a few years.
 
I think the enthusiasts in here will agree that powering these new cards will require the newest CPUs to keep performance at its peak (duh). I have a 5900X and put this rig together 2 years ago (3090), and I don't feel like upgrading just the GPU. Plus, with AMD, I'll hopefully be able to upgrade CPUs for a few years.

I think a problem some aren't even addressing is how the 30-series cards have some crazy power spikes. I was recommended a 1000 W PSU, but I went with 1200 W instead, in case my next upgrade needs to overcompensate for another increase in power draw. I'm hoping for more efficiency and lower draw with the 50 series.
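A rough way to size a PSU around those transients; the spike multiplier and per-component draws below are assumptions for illustration, not measured figures:

```python
# Rough PSU sizing around transient spikes - all figures are assumptions.
gpu_board_power = 350   # e.g. a 3090-class card's rated board power (W)
spike_multiplier = 2.0  # brief transients can far exceed the rating; assumed 2x
cpu_peak = 150          # assumed peak CPU draw (W)
rest_of_system = 75     # board, RAM, drives, fans - assumed (W)

worst_case = gpu_board_power * spike_multiplier + cpu_peak + rest_of_system
print(f"Estimated worst-case transient load: ~{worst_case:.0f} W")
print("Which is why 1000-1200 W units get recommended for a 350 W-class GPU.")
```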
 

DaGwaphics

Member
The 3060 is slower than a 1080 Ti and is roughly in between a 2070 and a 5700 XT. So we would expect it to be around a PS5 in rasterization performance and slower than an XSX. (Though comparisons with PC aren't always straightforward).

When console games start targeting 1440p, you will have a choice between playing at 1440p with console-like settings, or dropping to 1080p. Right now in Cyberpunk, you won't get a locked 60 at 1440p unless you drop below medium settings.

Yo, I was never advocating the 3060 as a 4K card; I think the quote feature of the forum hit a snag there.

I agree that the 3060 seems to perform similarly to the consoles, making it an option if you don't mind console framerates on PC (I'd just get a console for that). So far, it can even match the consoles at 4K for the most part (30 fps), despite the low memory bandwidth.
 
Last edited:
I only care about the facts that are relevant to the topic we're discussing.

Texture load distance (which is what Doom Eternal's texture quality setting controls above the High preset) is not relevant to actual texture quality, so it holds no value for this discussion.

In an actual texture-pack-quality discussion, you might find 1 out of 10 last-gen games where 8 GB truly has an advantage over 4 GB at 1080p. It's your fault for portraying the situation as if almost all games run with higher textures on 8 GB GPUs, when that situation is rare. If a 4 GB GPU can match the 8 GB one at maximum texture fidelity in 99% of last-gen titles, your point becomes moot.

So here are some corrections for your "facts":

"use significantly higher texture settings and not run into stutter in games as soon as it happened to 970... IN ONLY 1 GAME OUT OF 10 GAMES (best case)"

You can claim that you went and modified Unreal Engine 4 parameters in config files, forced the game to use more VRAM, and got fewer stutters. No one says more VRAM is bad or anything; you can find uses for it, but it's niche. That's about it. Engines are built around the limitations of hardware. Given that the Series S has 4.5 GB of available VRAM, even 6 GB won't be dead at 1080p for a long time. You can, of course, run higher textures at some point and be happy about it. It doesn't change the fact that GTX 970 and 1650 users enjoy the games just the same, because most of the time even High and Ultra textures (if the Ultra textures are aimed at 4K users) look similar. But that's not even the point of our discussion: you can run 4K textures on a 1080p screen with an 8 GB RX 580, but it won't do you much good anyway. That's why the Series S is given a 10 GB budget and the Series X a 16 GB budget; 4K textures are also useless on a system that targets 1080p (hello, RX 580).
1 out of 10 games is a lie, and whatever you're trying to claim as a "last-gen game" is irrelevant, as we are still in cross-gen. So you are downplaying Doom's textures, OK. Now try Far Cry 6. I could just come back at you again and again with a game after every one of these dumb-ass posts, and the goalposts would change again.
I am not advocating buying a 1080 Ti! Just pointing out the performance tier where the 3060 sits.
Well, it's not lower than the 1080 Ti, though.
 
Last edited:

DaGwaphics

Member
I only care about the facts that are relevant to the topic we're discussing.

Texture load distance (which is what Doom Eternal's texture quality setting controls above the High preset) is not relevant to actual texture quality, so it holds no value for this discussion.

In an actual texture-pack-quality discussion, you might find 1 out of 10 last-gen games where 8 GB truly has an advantage over 4 GB at 1080p. It's your fault for portraying the situation as if almost all games run with higher textures on 8 GB GPUs, when that situation is rare. If a 4 GB GPU can match the 8 GB one at maximum texture fidelity in 99% of last-gen titles, your point becomes moot.

So here are some corrections for your "facts":

"use significantly higher texture settings and not run into stutter in games as soon as it happened to 970... IN ONLY 1 GAME OUT OF 10 GAMES (best case)"

You can claim that you went and modified Unreal Engine 4 parameters in config files, forced the game to use more VRAM, and got fewer stutters. No one says more VRAM is bad or anything; you can find uses for it, but it's niche. That's about it. Engines are built around the limitations of hardware. Given that the Series S has 4.5 GB of available VRAM, even 6 GB won't be dead at 1080p for a long time. You can, of course, run higher textures at some point and be happy about it. It doesn't change the fact that GTX 970 and 1650 users enjoy the games just the same, because most of the time even High and Ultra textures (if the Ultra textures are aimed at 4K users) look similar. But that's not even the point of our discussion: you can run 4K textures on a 1080p screen with an 8 GB RX 580, but it won't do you much good anyway. That's why the Series S is given a 10 GB budget and the Series X a 16 GB budget; 4K textures are also useless on a system that targets 1080p (hello, RX 580).

Where did you see that the XSS only has 4.5 GB of available VRAM? I know they've said it has 8 GB of unified memory available to games. How is the 3.5 GB segregated from the rest?
 

OZ9000

Banned
If you're not exclusively targeting a perfectly flawless 144 fps, you shouldn't need to drop to 1080p.

Anything between 85 and 130 fps is smooth and enjoyable on a 144 Hz VRR screen.

It's been 4 years since I got my 144 Hz VRR screen, and I've never felt the need for a flawless 144 fps in any game. If I can get something in the 70-100 range I'm usually happy; if it pushes above 100, even better. The latest game I played was God of War: at 4K with DLSS Performance I was getting 80-110 fps with optimized settings (no ultra, but a mix of high and original). It was a very fun experience. (By the way, 4K + DLSS Performance provides higher fidelity than native 1440p, but that's a bit hard to explain.)

Targeting 144 fps in that game would be brutal: 1080p, everything low... and even then it probably wouldn't be possible. Too much sacrifice for not much gain.
What are the downsides of G-Sync or VRR? Does it introduce more input lag?

I plan on upgrading my TV to either the Samsung Q90B or the LG C2. Both offer 4K120, though I'd probably settle for anything between 60 and 90 fps.
 
Last edited:
This cat is saying he'll turn down the settings but keep ultra textures... as if the 3070 can't turn down settings.
Keeping ultra textures also seems kind of silly on a memory-bandwidth-limited card, especially if you're already pushing higher resolutions.
Higher textures are pretty much performance-free as long as you have the VRAM.

The 3070 doesn't have the VRAM needed for that luxury compared to the 3060, end of story.

Now if you people will excuse me, I'll bail on this dumpster fire of a thread and get back to playing games, at 4K, on the 3060 😘
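Rough texture-memory math behind the "performance-free as long as it fits" idea; the format and sizes are illustrative assumptions, not figures from any specific game:

```python
# Rough texture memory footprint - format/sizes are illustrative assumptions.
def texture_mb(width, height, bytes_per_texel):
    base = width * height * bytes_per_texel
    return base * 4 / 3 / (1024 ** 2)  # a full mip chain adds roughly one third

print(f"2K texture, BC7 (1 B/texel):    ~{texture_mb(2048, 2048, 1):.0f} MB")
print(f"4K texture, BC7 (1 B/texel):    ~{texture_mb(4096, 4096, 1):.0f} MB")
print(f"4K texture, RGBA8 uncompressed: ~{texture_mb(4096, 4096, 4):.0f} MB")
# Sampling a resident texture costs about the same either way; the question
# is whether the whole set fits in VRAM, which is capacity, not GPU speed.
```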
 

FireFly

Member
Well, it's not lower than the 1080 Ti, though.
You're quibbling over a minor point while ignoring the main issue, which is that the 3060 is a console-tier GPU, so it's not designed for 4K60, and even at 1440p you will have to lower settings in the future.

But, since you insist:


If you have another performance summary showing the 1080 Ti equal or behind, feel free to post it.
 
You're quibbling over a minor point while ignoring the main issue, which is that the 3060 is a console-tier GPU, so it's not designed for 4K60, and even at 1440p you will have to lower settings in the future.

But, since you insist:


If you have another performance summary showing the 1080 Ti equal or behind, feel free to post it.
The link you posted showed the 1080 Ti ahead by 4-5%.

Before we even think about going further, how is that not 1080 Ti-tier performance? That's a marginal difference any way you look at it.

And again, "not designed for 4K60... you'll have to lower settings at 1440p later on" just shows that people have ultra settings in mind when they say it's not a 4K60 card, when it has been shown how wasteful ultra can be.

The 1080 Ti was considered a 4K card. The 3060 is right there with it, with more VRAM and with RT and tensor cores, and I'm an actual user who plays at 4K on the card, not just someone looking at benchmarks, and you all want to trash it.

The narrative in here is complete bullshit, and I don't care if 10 more people come in here saying the same crap.
 
Last edited:

FireFly

Member
The link you posted showed the 1080 Ti ahead by 4-5%.

Before we even think about going further, how is that not 1080 Ti-tier performance? That's a marginal difference any way you look at it.
It's a 7% difference against the stock card. If that's enough for you to call it "equal" even though it's closer to a 2070, then fine.

As I said, the main point is that the GPU sits in the performance range where the consoles are.
 
It's not really dead. Ray tracing came from PC, and the latest big deal in gaming, battle royale, also came from PC. Just as it always was. Nearly every genre we play today was invented on PC: shooters, RPGs, stealth, MMOs, adventures, strategy games, sim games. PC continues to be the benchmark for everything.
RT only became mainstream once the PS5 and Xbox had it.
There's no point introducing some magic exclusive tech if consoles don't support it; it will just end up like Nvidia's HairWorks.
 
It's a 7% difference against the stock card. If that's enough for you to call it "equal" even though it's closer to a 2070, then fine.

As I said, the main point is that the GPU sits in the performance range where the consoles are.
It wins in Cyberpunk, though marginally. It wins in games that use async compute heavily, like Wolfenstein or Rainbow Six Siege.

It's a better GPU than the consoles have.

My EVGA model is factory-overclocked as well.
 
Last edited:

FireFly

Member
It wins in Cyberpunk, though marginally. It wins in games that use async compute heavily, like Wolfenstein or Rainbow Six Siege.

It's a better GPU than the consoles have.

My EVGA model is factory-overclocked as well.
Wolfenstein didn't receive a next-gen patch on the consoles AFAIK. And I haven't seen a comparison of Siege's 120 FPS mode to PC. If it's mostly a locked 120 FPS, it won't tell you much.

I can't find any like-for-like benchmarks of Cyberpunk. PC benchmarks show that it performs like a 2060 Super/2070/5700 XT, which is exactly where you would expect it to be, and where you would expect the consoles to be, too.


But this is also getting away from the point, which is not consoles vs. PC, but rather whether, when console games target 1440p, you will have enough performance to avoid lowering settings or resolution. Well, in Cyberpunk with a 3060 you can't get a locked 60 fps at 1440p even at medium settings, so you either drop settings further or drop resolution (natively or via DLSS). I can see the situation repeating or worsening with UE5 games.
 
Last edited:

VFXVeteran

Banned
No it isn't. The 3090 is already struggling in next-gen stuff.
The 3090 wasn't meant to play everything at 4K/60. I keep mentioning over and over that NONE of these GPUs (or consoles) have enough power to run full-on RT without DLSS, so stop assuming that a new GPU will play everything at 4K/60 no matter what the settings are. And how you think 1.5 years is too slow is beyond any rational thinking. Should Nvidia now release one graphics card series per year?!
 
Last edited:

VFXVeteran

Banned
RT only became mainstream once the PS5 and Xbox had it.
There's no point introducing some magic exclusive tech if consoles don't support it; it will just end up like Nvidia's HairWorks.
Are you serious? RT showed up in many games on PC GPUs at the end of last gen. Just because the new consoles have a large fanbase doesn't mean developers waited to implement RT and then decided "let's start making our games RT since the consoles have it".
 

Life

Member
I'm not a geek and I'm not going to pretend I am - but from reading some of the posts, the power demands appear to be pretty ridiculous. Yes, I understand how important graphics/resolution/gaming all are - but a good chunk of the time I won't be gaming on my PC, perhaps just browsing, watching videos, etc.

If I've got one of these new bad boys installed, does that mean the power usage is still going to be high, despite doing the above 70% of the time (and 30% gaming)? Or is the graphics card irrelevant and it's all about the PSU you're forced to buy? In which case, will the PSU adjust power accordingly?

I hope I'm being naive and all this was thought of and resolved decades ago... right?
 

yamaci17

Member
I'm not a geek and I'm not going to pretend I am - but from reading some of the posts, the power demands appear to be pretty ridiculous. Yes, I understand how important graphics/resolution/gaming all are - but a good chunk of the time I won't be gaming on my PC, perhaps just browsing, watching videos, etc.

If I've got one of these new bad boys installed, does that mean the power usage is still going to be high, despite doing the above 70% of the time (and 30% gaming)? Or is the graphics card irrelevant and it's all about the PSU you're forced to buy? In which case, will the PSU adjust power accordingly?

I hope I'm being naive and all this was thought of and resolved decades ago... right?

If your PC is idle and all you're doing is watching videos, reading, or typing, your system will have low overall power consumption REGARDLESS of the PSU you have. So whether you have a 600 W PSU or a 6000 W PSU, the power draw will be the same (assuming the PSU efficiencies match; some PSUs have poor efficiency at low wattages, but we're talking about 4-10% differences).


And yes, GPUs and CPUs draw enormously less power when you're doing normal stuff. Most GPUs idle at 10-25 watts, compared to full loads of 200+ watts. As I'm typing this, my GPU pulls 20 watts and my CPU pulls 25 watts. In gaming scenarios, I'd see around 200 watts from the GPU and 80 watts from the CPU.
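To put the 70% browsing / 30% gaming question above into numbers, here's a rough monthly estimate using the idle and gaming figures from this post plus an assumed 50 W for the rest of the system:

```python
# Rough monthly energy estimate for a 70% idle / 30% gaming usage split.
# GPU/CPU figures are from the post above; the 50 W "rest" is an assumption.
idle_watts = 20 + 25 + 50      # GPU + CPU + rest of system while browsing
gaming_watts = 200 + 80 + 50   # GPU + CPU + rest of system while gaming
hours_per_day = 8              # assumed time the PC is switched on
idle_share, gaming_share = 0.7, 0.3

avg_watts = idle_watts * idle_share + gaming_watts * gaming_share
kwh_per_month = avg_watts / 1000 * hours_per_day * 30
print(f"Average draw: ~{avg_watts:.0f} W, roughly {kwh_per_month:.0f} kWh per month")
```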
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Now if you people will excuse me, I'll bail on this dumpster fire of a thread and get back to playing games, at 4K, on the 3060 😘
At 30 fps.
But this is also getting away from the point, which is not consoles vs. PC, but rather whether, when console games target 1440p, you will have enough performance to avoid lowering settings or resolution. Well, in Cyberpunk with a 3060 you can't get a locked 60 fps at 1440p even at medium settings, so you either drop settings further or drop resolution (natively or via DLSS). I can see the situation repeating or worsening with UE5 games.
He's already playing at potato quality but keeping the textures on ultra.
Hahahaha, all the modern effects and features that actually make a game look good he will abandon, but keep the technology that effectively hasn't improved; very rarely does the difference between high and ultra make a visual difference.
But turning down all the other effects makes games look like Counter-Strike 1.6 with a texture pack... all this to justify buying a woefully underpowered card for 4K just because it has 12 GB of VRAM, and you know, more VRAM means it's better than the almost-double-the-framerate RTX 3070/3070 Ti.
 

LiquidMetal14

hide your water-based mammals
I think a problem some aren't even addressing is how the 30-series cards have some crazy power spikes. I was recommended a 1000 W PSU, but I went with 1200 W instead, in case my next upgrade needs to overcompensate for another increase in power draw. I'm hoping for more efficiency and lower draw with the 50 series.
Absolutely. I did the exact same thing and got a 1200 W SuperNOVA for my 3090 rig to cover myself, and that's likely what I'll need when upgrading to a more power-hungry CPU/GPU.
 
I was planning on posting some before-and-after fps data from my rig once the 5800X3D arrives, for comparison with the 1600 AF. I should post it in the new PC thread.

But yes, games like The Witcher 3 are a smooth 60 fps at a high/ultra mix at native 4K. Metro Exodus runs great with DLSS. Other newer games are native 4K with tweaked settings, but always retaining the highest textures.

On a 3060. 🤔

Enjoy not even being able to say that the cards above the 3060 should have shipped with more VRAM, lest that Nvidia dick you're taking stops feeling as good.
 

HeisenbergFX4

Gold Member
The 3090 wasn't meant to play everything at 4K/60. I keep mentioning over and over that NONE of these GPUs (or consoles) have enough power to run full-on RT without DLSS, so stop assuming that a new GPU will play everything at 4K/60 no matter what the settings are. And how you think 1.5 years is too slow is beyond any rational thinking. Should Nvidia now release one graphics card series per year?!
People don't realize how close the 3080 and 3090 are in gaming performance; too many think the 3090 should be smoking the 3080.

The truth is the 3090 was not intended for the average gamer.
 
Wolfenstein didn't receive a next-gen patch on the consoles AFAIK. And I haven't seen a comparison of Siege's 120 FPS mode to PC. If it's mostly a locked 120 FPS, it won't tell you much.

I can't find any like-for-like benchmarks of Cyberpunk. PC benchmarks show that it performs like a 2060 Super/2070/5700 XT, which is exactly where you would expect it to be, and where you would expect the consoles to be, too.
I am talking about the 1080 Ti vs. the 3060, not the 3060 vs. the consoles, with regard to games like Wolfenstein, Rainbow Six, and Cyberpunk.
 
Last edited:

FireFly

Member
I am talking about the 1080 Ti vs. the 3060, not the 3060 vs. the consoles, with regard to games like Wolfenstein, Rainbow Six, and Cyberpunk.
OK, the "it's a better GPU than the consoles have" bit confused me. Sure, I can grant that the 3060 is better for a number of modern games, though I am not sure what that has to do with the discussion on memory usage or the 3060 potentially being forced back to 1080p for max settings.
 
Last edited:
Ok, the "it's a better gpu than the consoles have" bit confused me. Sure, I can grant that the 1080 Ti is better for a number of modern games. Though I am not sure what that has to do with the discussion on memory usage.
Oh no, I was just saying that the 3060 trades blows with 1080ti, or is very close in the games where the former loses. Those game examples are where 3060 pulls ahead of 1080ti.

My point is considering how close they are, with both having a similar amount of vram it makes more sense to place 3060 at 1080ti level rather than 8gb cards like 2070, which 3060 edges out anyway.

The reason I say it's a better gpu than consoles is because it does perform better in pure raster, I mean you have games that are only running in 1080p on PS5 in their performance modes and still have issues keeping fps up like Guardians of the Galaxy.

Then on top of that you have the good amount of vram, better rt performance and dlss support. I have no doubt that 3060 could last the generation when targeting console like settings, as long as driver support is there. But again I plan on upgrading my gpu every card generation, so it won't come to that.
 
Last edited:

SantaC

Member
The 3090 wasn't meant to play everything at 4K/60. I keep mentioning over and over that NONE of these GPUs (or consoles) have enough power to run full-on RT without DLSS, so stop assuming that a new GPU will play everything at 4K/60 no matter what the settings are. And how you think 1.5 years is too slow is beyond any rational thinking. Should Nvidia now release one graphics card series per year?!
Not sure how you came to the conclusion that it should be on a one-year cycle just because I corrected you when you said it had just come out.

You are an idiot.
 

VFXVeteran

Banned
Not sure how you came to the conclusion that it should be on a one-year cycle just because I corrected you when you said it had just come out.

You are an idiot.
Oh. Well, I didn't know I was being harassed for not stating the obvious. Of course they have been out for over a year. My point is that they just came out relative to a typical GPU's life cycle.

Oh, thanks for the insult.
 
Last edited:

SantaC

Member
Oh. Well, I didn't know I was being harassed for not stating the obvious. Of course they have been out for over a year. My point is that they just came out relative to a typical GPU's life cycle.

Oh, thanks for the insult.
A typical GPU life cycle is 2 years... that has been the case for the past 25 years.
 

FireFly

Member
I have no doubt that the 3060 could last the generation when targeting console-like settings, as long as driver support is there. But again, I plan on upgrading my GPU every generation, so it won't come to that.
That was my point, responding to your claim that "In no way shape or form is the 3060 a 1080p60 card". If you want above-console settings, it makes sense to consider the 3060 a 1080p60 card.

(I looked at Guardians of the Galaxy's performance on PS5 and something seems seriously wrong there, as even a Series S isn't far off in unlocked mode).
 

VFXVeteran

Banned
A typical GPU life cycle is 2 years... that has been the case for the past 25 years.
And I still say that's the production cycle and not its usefulness cycle (which is way longer). To me, 2 years is too soon, and you can't make me think otherwise.
 
Last edited:
That was my point, responding to your claim that "In no way shape or form is the 3060 a 1080p60 card". If you want above-console settings, it makes sense to consider the 3060 a 1080p60 card.

(I looked at Guardians of the Galaxy's performance on PS5 and something seems seriously wrong there, as even a Series S isn't far off in unlocked mode).
Again though, I played one (!) game at 1080p: Quantum Break. Some games I play, like Metro Exodus, use 1080p native plus DLSS, but that's not the same thing. Metro is a very heavy game.

No one with any sense is using the 3060 as a 1080p60 card. You are still using the criterion of *go ultra or go home*!

I could use it as a 1080p or 1440p 120 fps card when my CPU upgrade arrives.
 
Last edited:

CrustyBritches

Gold Member
The 3060 has disappointing performance. For the past 2 generations, the xx60 model was able to perform similarly to the xx80 model from the previous generation. The 3060 is bang-on a 2070 and 5700 XT. I had one, and after an OC it was slower than my 2060S with an OC. The 3060 Ti should have been the 3060.
 

FireFly

Member
Again though, I played one (!) game at 1080p: Quantum Break. Some games I play, like Metro Exodus, use 1080p native plus DLSS, but that's not the same thing. Metro is a very heavy game.

No one with any sense is using the 3060 as a 1080p60 card. You are still using the criterion of *go ultra or go home*!

I could use it as a 1080p or 1440p 120 fps card when my CPU upgrade arrives.
I'm not using the criterion of "go ultra or go home". I am talking about future workloads, when support for current-generation consoles is dropped and games focus on lower resolutions like 1440p (which is already starting to happen) to maximise visual quality. That's the whole point of the "8 GB isn't enough" argument in the first place, since current-generation games still work fine at 4K with 8 GB. (You can increase the texture pool size in Doom Eternal to create a performance hit, but it doesn't provide a discernible visual benefit.)
 
I'm not using the criterion of "go ultra or go home". I am talking about future workloads, when support for current-generation consoles is dropped and games focus on lower resolutions like 1440p (which is already starting to happen) to maximise visual quality. That's the whole point of the "8 GB isn't enough" argument in the first place, since current-generation games still work fine at 4K with 8 GB. (You can increase the texture pool size in Doom Eternal to create a performance hit, but it doesn't provide a discernible visual benefit.)
Far Cry 6 already shows a disproportionate hit to performance on a 10 GB card at 4K.

A cross-gen game. Arguing about what is a worthwhile difference in texture settings is moving the goalposts. Either these cards do or do not have a VRAM capacity bottleneck.

8 GB isn't enough, and it will only become less sufficient going forward.
 
Last edited:

FireFly

Member
Far Cry 6 already shows a disproportionate hit to performance on a 10 GB card at 4K.

A cross-gen game. Arguing about what is a worthwhile difference in texture settings is moving the goalposts. Either these cards do or do not have a VRAM capacity bottleneck.

8 GB isn't enough, and it will only become less sufficient going forward.
I thought Far Cry 6 loaded lower-res mipmaps to avoid stutter when running out of GPU memory. And the 3080 had a bug with the high-res texture pack (since fixed) that caused the textures to look especially blurry, which made it look like the card couldn't support the additional resolution.

But say you're right. It's the same difference between the 3060 being a *go ultra or go home* card at 1080p60 right now, or being perfectly fine in most cases at 1440p. The compelling reason to think 8 GB is not enough at 4K lies in future applications. And equally, the real reason to think the 3060 is a 1080p60 card is the arrival of future games targeting the next-generation consoles exclusively.

Edit: To be clear, my point is that if you're going to be forward-looking with VRAM, you have to be forward-looking with performance too.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Nvidia might have done a node shrink all the way down to TSMC 4N.

Then why the fuck is this card expected to eat so much power?
VpOWpFZ.jpg
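One hedged way to square the node shrink with the power rumours: dynamic power scales roughly with capacitance × voltage² × frequency, so chasing clocks can eat the node gain. The factors below are illustrative assumptions, not leaked Ada figures:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f.
# All scaling factors below are illustrative assumptions, not real specs.
capacitance_factor = 0.7  # assumed switched-capacitance drop from the shrink
voltage_factor = 1.1      # assumed voltage bump to reach higher clocks
frequency_factor = 1.5    # assumed clock increase over the previous gen

relative_power = capacitance_factor * voltage_factor ** 2 * frequency_factor
print(f"Relative dynamic power vs. last gen: ~{relative_power:.2f}x")
# ~1.27x: even on a better node, pushing clocks (and a bigger die)
# can drive total board power well past the previous generation.
```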
 
Last edited:
I thought Far Cry 6 loaded lower-res mipmaps to avoid stutter when running out of GPU memory. And the 3080 had a bug with the high-res texture pack (since fixed) that caused the textures to look especially blurry, which made it look like the card couldn't support the additional resolution.

But say you're right. It's the same difference between the 3060 being a *go ultra or go home* card at 1080p60 right now, or being perfectly fine in most cases at 1440p. The compelling reason to think 8 GB is not enough at 4K lies in future applications. And equally, the real reason to think the 3060 is a 1080p60 card is the arrival of future games targeting the next-generation consoles exclusively.

Edit: To be clear, my point is that if you're going to be forward-looking with VRAM, you have to be forward-looking with performance too.
But again, what 1080p60 games are you talking about? The 3060 can run 1080p ultra settings at 120 fps in a number of games, including Shadow of the Tomb Raider, Death Stranding, etc. The Witcher 3 can hit way over 120 fps.

And to be clear, my point is that Nvidia cheaped out, and the cards we are discussing simply should have had more VRAM.

In this thread, users have used a straw man, claiming I'm suggesting the 3060 doesn't need extra CUDA cores or whatever because it has 12 GB... That's not what I am saying at all, and I look forward to having a card that can run games at 4K120 - a card that has enough VRAM for longevity, unlike the 3070 or even the 10 GB 3080!
 
Last edited: