
Unknown 3000 series NV GPU "leaks" on Time Spy, 30% faster than 2080Ti FE

Moochi

Member
I have a 980ti I bought 3 years ago. It crushes every game at 1440p, and with newer games that have dynamic upscaling it gets pretty close to 4k 60hz. I play on an 85" Samsung from about six feet away. A card that can do 4k60 on ultra would be nice, but I wouldn't spend more than $500.
 

Yoda

Member
People assuming this is anything other than the 3080 Ti need to look back at history.

The most recent gen equivalent was:
2080ti == its own category (for consumer cards)
2080 == 1080ti
2070 >= 1080

If you think Nvidia is going to shoot themselves in the foot by giving more value when there is NO competition in the high-end desktop GPU market, you're fooling yourself.
 
People assuming this is anything other than the 3080 Ti need to look back at history.

The most recent gen equivalent was:
2080ti == its own category (for consumer cards)
2080 == 1080ti
2070 >= 1080

If you think Nvidia is going to shoot themselves in the foot by giving more value when there is NO competition in the high-end desktop GPU market, you're fooling yourself.
AMD plans on giving competition in the high-end segment though.

Not that I disagree with it being the 3080 Ti. The odds of it being a 3080 are almost nil.
 

ZywyPL

Banned
People assuming this is anything other than the 3080 Ti need to look back at history.

The most recent gen equivalent was:
2080ti == its own category (for consumer cards)
2080 == 1080ti
2070 >= 1080

If you think Nvidia is going to shoot themselves in the foot by giving more value when there is NO competition in the high-end desktop GPU market, you're fooling yourself.


But if we go back one more generation:

1080Ti = its own category
1080 = its own category
1070 = 980Ti
1060 = 980

It will all come down to how effectively NV utilizes 7nm. The jump from Maxwell to Pascal was so big because of the process node shrink, whereas Turing was made on more or less the same process as Pascal, hence not much more processing power. I think they will, as always, want to jump as far ahead of AMD's upcoming RDNA2 GPUs as possible, so that if needed they can just adjust the price, like they always do, instead of settling for "good enough" and eventually going into panic mode if something goes wrong, having to redesign their entire lineup or rush the 4000 series.
 

Mister Wolf

Member
It's always best to jump 2 gens, so you should wait for the 4080/90 Ti; people with a 1080 Ti will get exactly what they need with this GPU.

Ultimately you're right. Club 3D is releasing the adapter at the end of this month so I can use 4K 120Hz, but it won't let me use the TV's G-Sync anymore. Will have to see how I feel about giving that up. I can't stand screen tearing, but I know at high framerates it's not as bad.
 

GetemMa

Member
RTX 3080 sounds like my next card. Using a GTX 1080 right now and it's still very capable of running almost all games at native res, high to max settings, well above 60fps even on brand-new titles, though of course there's no real-time ray tracing. I have a secondary PC ready to go for a racing rig I'm building; all it needs is a video card, and the GTX 1080 should give me really high frame rates on a 1080p/144Hz monitor.

A 33% increase over a 2080 Ti and much-improved RT hardware should keep those frame rates up at 1440p/144Hz well into next gen. I just hope Nvidia hasn't lost the plot on pricing, but I'm bracing myself for bad news on that front.
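Back-of-the-envelope math on that claim; a throwaway sketch with made-up baseline numbers, not benchmark results:

```python
# Rough uplift projection. The fps values used below are placeholders,
# not real benchmark results.
def projected_fps(baseline_fps: float, uplift: float = 0.33) -> float:
    """Scale a baseline frame rate by a flat performance uplift."""
    return baseline_fps * (1.0 + uplift)

def needed_baseline(target_fps: float, uplift: float = 0.33) -> float:
    """Baseline fps a 2080 Ti would need for the uplifted card to hit target_fps."""
    return target_fps / (1.0 + uplift)

if __name__ == "__main__":
    # To sustain 144fps at +33%, the 2080 Ti baseline must already be ~108fps:
    print(round(needed_baseline(144), 1))  # 108.3
    # A hypothetical 90fps 1440p title would land around 120fps:
    print(round(projected_fps(90), 1))     # 119.7
```

In other words, a flat 33% uplift only holds 1440p/144Hz in titles where the 2080 Ti is already close to 110fps.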
 

Yoda

Member
But if we go back one more generation:

1080Ti = its own category
1080 = its own category
1070 = 980Ti
1060 = 980

It will all come down to how effectively NV utilizes 7nm. The jump from Maxwell to Pascal was so big because of the process node shrink, whereas Turing was made on more or less the same process as Pascal, hence not much more processing power. I think they will, as always, want to jump as far ahead of AMD's upcoming RDNA2 GPUs as possible, so that if needed they can just adjust the price, like they always do, instead of settling for "good enough" and eventually going into panic mode if something goes wrong, having to redesign their entire lineup or rush the 4000 series.

Fair point, but I'd argue the trend for Nvidia is to deliver less per new gen rather than more. I'm not convinced AMD will be competitive, especially at the very high end (xx80/Ti). Without that pressure, it'd almost be irresponsible from a shareholder-value PoV to give consumers extra value when they know people will pay for less anyway.
 

Pizdetz

Banned
Saw this over on the Hardware subreddit.
People were saying that if it's the RTX 3080, it's good news.
But that could put the new 3070 in 2080 Ti territory. Yikes. The whole pricing structure on old cards would collapse.
If it's the RTX 3080 Ti or the RTX 3090 or whatever high-end card they have, then this node advance is nothing special. Basically you're better off buying a console, because the value proposition for a GPU just goes to shit.
 

Krappadizzle

Gold Member
AMD plans on giving competition in the high-end segment though.

Not that I disagree with it being the 3080 Ti. The odds of it being a 3080 are almost nil.
And this is what we ALL need to be happy about. Nvidia went off their fucking rocker with the 20xx series pricing because AMD had jack shit to compete with. If AMD has something very competitive for us, we'll see prices come down across the board. Competition is HEALTHY, people. It's why I never understood the Sony zealots who say MSFT should get out of the market. Do y'all not understand that a market dominated by ONE company is not a market that gives a shit about consumers, because what other choice do you have?!

Saw this over on the Hardware subreddit.
People were saying that if it's the RTX 3080, it's good news.
But that could put the new 3070 in 2080 Ti territory. Yikes. The whole pricing structure on old cards would collapse.
If it's the RTX 3080 Ti or the RTX 3090 or whatever high-end card they have, then this node advance is nothing special. Basically you're better off buying a console, because the value proposition for a GPU just goes to shit.

It's only been like that for the last 20 years or so... It's also why the 1080 Ti has stayed a relevant card. The performance improvements in the 20xx series just weren't enough to warrant the jump in price, so people saw why the 1080 Ti had SO much value, for much longer than most cards in similar categories. The 1080 Ti has been the card that just keeps on giving. It reminds me of the old days, when the 8800GT had so much longevity.
 

PhoenixTank

Member
I want this to be the 3080. I expect it to be the 3080Ti, though the node shrink adds wiggle room. I'm probably not upgrading yet either way.

Competition is HEALTHY, people. It's why I never understood the Sony zealots who say MSFT should get out of the market. Do y'all not understand that a market dominated by ONE company is not a market that gives a shit about consumers, because what other choice do you have?!

 

ZywyPL

Banned
Fair point, but I'd argue the trend for Nvidia is to deliver less per new gen rather than more. I'm not convinced AMD will be competitive, especially at the very high end (xx80/Ti). Without that pressure, it'd almost be irresponsible from a shareholder-value PoV to give consumers extra value when they know people will pay for less anyway.

Well, it's within their own best interest to offer slight jumps every year or two, rather than one substantial leap and then figuring out how to top it, especially given how hard that is nowadays with Moore's Law no longer working. That's how every business in the world works; no one is selling you stuff that will work for 20-30 years anymore.

But you have to say, Turings are HUGE, up to ~800mm² dies, and that's where the price bump comes from. Half or so of the die space goes to Tensor and RT cores, though; if those had been replaced with ordinary CUDA cores, we could have had 30TF cards two years ago.
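As a sanity check on that 30TF figure, peak FP32 throughput is usually estimated as cores × 2 (one FMA = 2 ops per clock) × clock. A rough sketch, using the published 2080 Ti FE specs (4352 CUDA cores, ~1635MHz boost); the doubled core count is purely hypothetical, not a real product:

```python
def peak_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical peak FP32: cores * 2 ops/clock (FMA) * clock in GHz, in TFLOPS."""
    return cuda_cores * 2 * boost_clock_ghz / 1000.0

# RTX 2080 Ti FE: 4352 CUDA cores at ~1.635 GHz boost
print(round(peak_tflops(4352, 1.635), 1))  # 14.2

# Hypothetical: same die with the RT/Tensor area spent on CUDA cores
# instead, doubling the count (an assumption for illustration only)
print(round(peak_tflops(8704, 1.635), 1))  # 28.5
```

Doubling the shader count at the same clock lands near the 30TF ballpark the post describes.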
 

CrustyBritches

Gold Member
1060 performed around a 980, 2060 was around a 1080, and 3060 should be around a 2080. 2070 was 5-10% under the 1080ti, and 3070 should land in a similar position relative to the 2080ti. 3080 will probably be 5-10% faster than the 2080ti, while 3080ti ends up 30-40% faster.

This isn't taking into account RT gains. Personally, RT is what I'm most looking forward to, so I hope it's substantial.
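The guesses above can be summarized as relative-performance ranges against a 2080 Ti baseline; a throwaway sketch that just restates the post's speculation, with nothing measured:

```python
# Speculated performance relative to a 2080 Ti (= 1.00), restating the
# ranges guessed in the post above; these are not benchmark results.
ranges = {
    "RTX 3070": (0.90, 0.95),     # mirroring "2070 was 5-10% under the 1080 Ti"
    "RTX 3080": (1.05, 1.10),     # "5-10% faster than the 2080 Ti"
    "RTX 3080 Ti": (1.30, 1.40),  # "30-40% faster"
}

for card, (lo, hi) in ranges.items():
    mid = (lo + hi) / 2
    print(f"{card}: {lo:.2f}x-{hi:.2f}x of a 2080 Ti (midpoint {mid:.3f}x)")
```

On those numbers, the leaked +30% Time Spy result sits at the bottom of the speculated 3080 Ti range.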
 

SantaC

Member
Saw this over on the Hardware subreddit.
People were saying that if it's the RTX 3080, it's good news.
But that could put the new 3070 in 2080 Ti territory. Yikes. The whole pricing structure on old cards would collapse.
That's why Nvidia is milking their shitty 20xx series as long as possible.

If AMD wasn't releasing their new cards this year, I bet Ampere would have been delayed to next year.
 

CuNi

Member
With the leaks increasing, I wonder if they'll hold an announcement event in July with benchmarks etc. and an availability announcement, or just go all-in and hold an event on launch day in August/September.
 

The Cockatrice

Gold Member
Honestly, besides Cyberpunk, I see no reason to upgrade from my 2080. By the time we get demanding, worthwhile games on PC that need a high-end GPU, we'll be at the 4080.
 

SantaC

Member
With the leaks increasing, I wonder if they'll hold an announcement event in July with benchmarks etc. and an availability announcement, or just go all-in and hold an event on launch day in August/September.
Doubtful we'll hear anything this summer. I think these cards are October-bound.
 