
Nvidia RTX 30XX |OT|

Why are people talking about the 3080 Ti as if anyone is really waiting for it?

3080 → 3090 = +~10%
3070 → 3080 = +~25%

So people are dreaming about a ~5% faster card for $1,000? Really?
If anything, a 3070 Ti is what people should expect.
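For what it's worth, here's the back-of-the-envelope version of that math as a quick sketch (the launch MSRPs are real, but the "3080 Ti" row and its ~+5% figure are assumptions, not benchmarks):

# Rough relative-performance sketch using the deltas quoted above (Python).
# Prices are launch MSRPs; the "3080 Ti" entry is hypothetical.
cards = {
    "3070":           {"perf": 1.00,        "price": 499},
    "3080":           {"perf": 1.25,        "price": 699},    # ~+25% over 3070
    "3090":           {"perf": 1.25 * 1.10, "price": 1499},   # ~+10% over 3080
    "3080 Ti (hypo)": {"perf": 1.25 * 1.05, "price": 999},    # assumed ~+5% over 3080
}

for name, card in cards.items():
    perf_per_dollar = card["perf"] / card["price"] * 1000
    print(f"{name:>15}: {card['perf']:.2f}x perf, {perf_per_dollar:.2f} perf per $1,000")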
It's probably people excited to buy a more "future proof" card now instead of being forced to go 3080>3080ti later. Talking from a memory perspective of course.
 

nemiroff

Gold Member
It's probably people excited to buy a more "future proof" card now instead of being forced to go 3080>3080ti later. Talking from a memory perspective of course.

Yeah, feelings really do have value, even if they are often unsubstantiated. And I'm not being sarcastic either. I mean, how could we ever get a good night's sleep if we had to worry about the size of our VRAM?
 
Last edited:

DGrayson

Mod Team and Bat Team
Staff Member
It's probably people excited to buy a more "future proof" card now instead of being forced to go 3080>3080ti later. Talking from a memory perspective of course.

This is why I was considering a 3090. As I game at 1440p with no plans to upgrade to 4K for a while, I considered getting the 3090 as the more future-proof option. In the end I just couldn't justify the cost and managed to snag an EVGA 3080 at MSRP.
 
Last edited:
It's probably people excited to buy a more "future proof" card now instead of being forced to go 3080>3080ti later. Talking from a memory perspective of course.


This video card future-proofing is a bunch of nonsense. Video cards start dying the moment you buy them. Each game that comes out pushes just a little bit harder on them. Getting a card "for the future" is some of the biggest nonsense I can think of. Look at the 1080 Ti and how far it has dropped in benchmarks, what a lower-midrange card it has become in the years since it came out. Its 11 GB of VRAM is completely useless because the card is dying due to lack of power.

You get a card to play games right now. If it can play games right now, it's going to play games two years from now just fine, and then you can get what's new.
 

DGrayson

Mod Team and Bat Team
Staff Member
This video card future-proofing is a bunch of nonsense. Video cards start dying the moment you buy them. Each game that comes out pushes just a little bit harder on them. Getting a card "for the future" is some of the biggest nonsense I can think of. Look at the 1080 Ti and how far it has dropped in benchmarks, what a lower-midrange card it has become in the years since it came out. Its 11 GB of VRAM is completely useless because the card is dying due to lack of power.

You get a card to play games right now. If it can play games right now, it's going to play games two years from now just fine, and then you can get what's new.

Agreed, provided you are pushing both resolution and framerate. If not, however, cards can last a long time.
 

Patrick S.

Banned
What exactly would be the point of this?

I just think it's cool to have a higher performance card. It's not about epeen, it's about wanting cooler stuff :) If I'm being 100% honest, it's because I'm unhappy with the performance in Cyberpunk. I know it's dumb, because the GPU isn't at 100% usage or anything. I'm probably much better off saving for a better CPU, because I think my i7 6700k is what's holding the game back. Cyberpunk is becoming my all time favourite game, and I want to play it with buttery smooth performance goddammit :D
 
Last edited:

Kenpachii

Member
This video card future-proofing is a bunch of nonsense. Video cards start dying the moment you buy them. Each game that comes out pushes just a little bit harder on them. Getting a card "for the future" is some of the biggest nonsense I can think of. Look at the 1080 Ti and how far it has dropped in benchmarks, what a lower-midrange card it has become in the years since it came out. Its 11 GB of VRAM is completely useless because the card is dying due to lack of power.

You get a card to play games right now. If it can play games right now, it's going to play games two years from now just fine, and then you can get what's new.

Until that future is tomorrow. Especially when you realize we are experiencing a generation shift right now.

The 1080 Ti example makes no sense because of that.
 
Last edited:

Spukc

always chasing the next thrill
Even with owning a 3080, I prefer the PS5 because of the DualSense XD

I had some fun with that Gears 5 DLC tho. But so far my 3080 experience has been pretty meh.
Should prolly try Anno 1800
 
Last edited:

Kenpachii

Member
Even with owning a 3080, I prefer the PS5 because of the DualSense XD

I had some fun with that Gears 5 DLC tho. But so far my 3080 experience has been pretty meh.
Should prolly try Anno 1800

Hope you've got a beast of a CPU; be ready for sub-20 fps.
 
Nobody with a 3080 will upgrade to a 3080 Ti. By the time 10GB of VRAM is an issue, the 4080 Ti will be out.
That's pure speculation. But following your logic, every 3080 owner should only keep the card for 2 years?

Also reminder: "There is no need for more than 2GB of VRAM" at the start of last gen.
I certainly hope my 3080 will be enough for a long time, and considering I am not at 4K, it most likely will be. But let's not pretend that 10GB of VRAM is great.
 
That's pure speculation. But following your logic, every 3080 owner should only keep the card for 2 years?

Also reminder: "There is no need for more than 2GB of VRAM" at the start of last gen.
I certainly hope my 3080 will be enough for a long time, and considering I am not at 4K, it most likely will be. But let's not pretend that 10GB of VRAM is great.
2 years is a long ass time in PC hardware space. If I can play the newest games with close to maximum graphical fidelity for 2 years, that's a big win.

I'm not saying that 10GB of VRAM will never become insufficient. I'm saying that before that happens, the next generation of GPUs will be out.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
No chance the penultimate GPU of this generation has “not enough” VRAM during this cycle. If it becomes an issue, the 40xx will be out.

Useless worrying. You really think every Nvidia GPU except the 3090 will be obsolete soon? Jeez.
 
No chance the penultimate GPU of this generation has “not enough” VRAM during this cycle. If it becomes an issue, the 40xx will be out.

Useless worrying. You really think every Nvidia GPU except the 3090 will be obsolete soon? Jeez.
Who is worried here outside butt-hurt posters getting extremely defensive over the 3080ti?
 

BluRayHiDef

Banned
No chance the penultimate GPU of this generation has “not enough” VRAM during this cycle. If it becomes an issue, the 40xx will be out.

Useless worrying. You really think every Nvidia GPU except the 3090 will be obsolete soon? Jeez.

You're in denial, broski. You know that you sub-3090 chumps messed up big time!

Just joking.
 

kittoo

Cretinously credulous
Maybe someone can help me out with issues I am having with my 3080. The display is an LG OLED C9.
I keep getting random loss of signal/black screens. It mostly happens in games but sometimes happens otherwise too. Only a hard reboot gets the display back.
I have tried disabling G-Sync, undervolting the card, updating drivers, using maximum performance mode, etc. It didn't help. It happens more in some games (such as Gears 5) and less in others (such as Ass Creed Valhalla).
Any help would be much appreciated.
 

Buggy Loop

Member
That's pure speculation. But following your logic, every 3080 owner should only keep the card for 2 years?

Also reminder: "There is no need for more than 2GB of VRAM" at the start of last gen.
I certainly hope my 3080 will be enough for a long time, and considering I am not at 4K, it most likely will be. But let's not pretend that 10GB of VRAM is great.

By the time 10GB even becomes a problem, you’ll either be rasterization limited (see Cyberpunk), or we’ll have moved to DirectStorage for optimizing these huge IO data streams. Maybe a few dummy devs will brute force their way because they make shit PC ports, but console IO management on PS5 and Xbox series S|X should make this pretty much a standard in upcoming ports.

VRAM becomes almost a buffer, only holding what’s immediately necessary and not idling assets like we do now.
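To make the "VRAM becomes almost a buffer" idea concrete, here's a toy sketch (purely illustrative, not the actual DirectStorage API): keep only what the current scene needs resident, evict the least-recently-used assets once a budget is exceeded, and rely on fast IO to stream things back in on demand.

from collections import OrderedDict

class VramBudget:
    """Toy LRU residency model: VRAM holds only what's immediately needed."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def request(self, asset, size_mb, stream_in):
        if asset in self.resident:
            self.resident.move_to_end(asset)  # already resident; mark as recently used
            return
        # Evict least-recently-used assets until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, _ = self.resident.popitem(last=False)
            print(f"evict {evicted}")
        stream_in(asset)  # stand-in for a fast NVMe-to-VRAM stream
        self.resident[asset] = size_mb

vram = VramBudget(budget_mb=10_000)  # e.g. a 10GB card
vram.request("city_block_07_textures", 1_500, lambda a: print(f"stream {a}"))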
 
Last edited:

regawdless

Banned
Raytraced reflections compared, native 1440p vs DLSS quality.

[Screenshots: photomode_25122020_00c9jsy.png / photomode_25122020_006tj9r.png]


That's such a huge difference, some might say I guess?
 
I only fear there aren't any good games to play soon after Cyberpunk for my 3080 :messenger_tears_of_joy:
That’s cause the industry is pretty shit tbh

Lots of development time is wasted in providing stale “cinematic” games, treadmill / carrot stick systems, micro transactions, or communities that ignore shit heads.

Cinematic games are boring and formulaic (see any Sony AAA)
Carrot stick systems benefit the low-skill no-lifers (see World of Warcraft)
Micro transactions are for cucks
Communities that ignore shitheads can turn a perfect game into a waste of time (see DOTA 2)
 
Last edited:

Rickyiez

Member
That’s cause the industry is pretty shit tbh

Lots of development time is wasted in providing stale “cinematic” games, treadmill / carrot stick systems, micro transactions, or communities that ignore shit heads.

Cinematic games are boring and formulaic (see any Sony AAA)
Carrot stick systems benefit the low-skill no-lifers (see World of Warcraft)
Micro transactions are for cucks
Communities that ignore shitheads can turn a perfect game into a waste of time (see DOTA 2)

You're perfectly right about Dota 2. I've been playing it on and off for years, but sometimes dealing with some of the toxicity is just too much.

I'm just waiting for more awesome games to utilize this card, like the first UE5 game, RE8, FF16, or Elden Ring. But they won't be out anytime soon, sigh.
Maybe someone can help me out with issues I am having with my 3080. The display is an LG OLED C9.
I keep getting random loss of signal/black screens. It mostly happens in games but sometimes happens otherwise too. Only a hard reboot gets the display back.
I have tried disabling G-Sync, undervolting the card, updating drivers, using maximum performance mode, etc. It didn't help. It happens more in some games (such as Gears 5) and less in others (such as Ass Creed Valhalla).
Any help would be much appreciated.

What is your PSU? And are you using two separate PCIe cables instead of daisy-chaining them?
 

kiphalfton

Member
Figured I would be fine with an RTX 3080, since I had a 1440p 144Hz monitor... but I decided to get an Alienware AW3420DW monitor and sell my old one. So it looks like performance is back to sub-100 FPS in all games.
 
D

Deleted member 17706

Unconfirmed Member
It's probably people excited to buy a more "future proof" card now instead of being forced to go 3080>3080ti later. Talking from a memory perspective of course.

After having owned an RTX 3080, it seems like a fool's game to expect wonders from a hypothetical additional 10GB of VRAM. The thing already can't run some current games maxed out at 4K60 and it's almost never because of VRAM limitations. 20GB VRAM or not, I don't think it's reasonable to expect to run new games three or four years from now maxed out at 4K and get 60 fps or higher.

Basically, I don't think the 10GB VRAM is ever going to be a legitimate bottleneck for this card outside of some very edge scenarios.
 
This video card future-proofing is a bunch of nonsense. Video cards start dying the moment you buy them. Each game that comes out pushes just a little bit harder on them. Getting a card "for the future" is some of the biggest nonsense I can think of. Look at the 1080 Ti and how far it has dropped in benchmarks, what a lower-midrange card it has become in the years since it came out. Its 11 GB of VRAM is completely useless because the card is dying due to lack of power.

You get a card to play games right now. If it can play games right now, it's going to play games two years from now just fine, and then you can get what's new.
I've had my 1080 Ti since it originally launched, almost 4 years ago now, and it has served me exceptionally well those many years. Smart people know that buying more video card than you need now will tend to work out later on in terms of longevity. If you only buy what is barely good enough now, in a few years you'll be forced to upgrade whether you like it or not even if you are otherwise fine with what you have. I was able to skip the entire Turing generation because my 1080 Ti was just fine during those years. If my 3090 ends up also lasting as long as my 1080 Ti did, I'll be quite happy indeed with what I got for the money I paid.
 
This is why I was considering a 3090. As I game at 1440p with no plans to upgrade to 4K for a while, I considered getting the 3090 as the more future-proof option. In the end I just couldn't justify the cost and managed to snag an EVGA 3080 at MSRP.

Enjoying The Bat Team, bro. Keep it good.
 

kittoo

Cretinously credulous
What is your PSU? And are you using two separate PCIe cables instead of daisy-chaining them?

The PSU is a brand-new Cooler Master 750W.
I did wonder if it was a PSU issue. But then why would the black screen also happen when I'm not gaming (although that's very rare and often happens shortly after I've closed a game)?
The card requires three 8-pin power connectors though, if that helps.

What's daisy-chaining? :p
 

longdi

Banned
Why are people talking about the 3080 Ti as if anyone is really waiting for it?

3080 → 3090 = +~10%
3070 → 3080 = +~25%

So people are dreaming about a ~5% faster card for $1,000? Really?
If anything, a 3070 Ti is what people should expect.

If the 3080 Ti is $899-999 with 20GB, why not? It will probably run as fast as a 3090 on a more mature Samsung 8nm, with higher, more consistent boost clocks.

Nvidia can get crazy competitive if they want to. 🤷‍♀️
 

longdi

Banned
3DCenter's final compilation, now including the 6900 XT as the normalized baseline (100%).

From Reddit:

[Chart: 3DCenter performance index compilation]

The 3080 is king relative to its price and aggregation of results overall (and this is just rasterization, not VR, not RT, not DLSS, not NVENC..) 👑


Wow, just wow. That's why I'm nibbling Nvidia stock on every dip. Still the performance leader, crypto revival, ARM is coming. Jensung is da man!

I always felt RDNA2 is $50 too expensive, and with AIB cards priced much higher, there's no reason to get a 6800/6900 yet.

Unless AMD's new drivers deliver the 'fine wine' performance uplifts. Idk. 🤷‍♀️
 
Wow, just wow. That's why I'm nibbling Nvidia stock on every dip. Still the performance leader, crypto revival, ARM is coming. Jensung is da man!

I always felt RDNA2 is $50 too expensive, and with AIB cards priced much higher, there's no reason to get a 6800/6900 yet.

Unless AMD's new drivers deliver the 'fine wine' performance uplifts. Idk. 🤷‍♀️
The Radeon 6000 series is indeed $50 too expensive; at $50 less it would be very competitive.

But we've still got AMD-tards believing that AMD cards deserve to be bought because they're the underdogs. 😂 And they try to gaslight people (and themselves) into believing that ray tracing and DLSS are not good.
 
Wow, just wow. That's why I'm nibbling Nvidia stock on every dip. Still the performance leader, crypto revival, ARM is coming. Jensung is da man!

I always felt RDNA2 is $50 too expensive, and with AIB cards priced much higher, there's no reason to get a 6800/6900 yet.

Unless AMD's new drivers deliver the 'fine wine' performance uplifts. Idk. 🤷‍♀️
Let's all keep in mind that Nvidia isn't even prioritizing rasterization anymore; they have dedicated significant precious die space to the RT units, the Tensor cores, and the video encoding unit. Even though it's been obvious since Turing that they don't care that much about rasterization, they can still beat out AMD, which put everything they had into rasterization for Big Navi and doesn't even have dedicated units for RT and Tensor work (needed for DLSS). And they are doing it on Samsung's kind of shitty "8nm" process, which is similar to TSMC 10nm, whereas AMD is using TSMC's top-of-the-line 7nm high-power process. TSMC 5nm is only for low-power chips like ARM; they do not have a high-power 5nm process.

Nvidia dominates this gen from top to bottom and it's not even really a contest, especially since AMD knows they are just as supply-constrained as Nvidia is, so they are pricing Big Navi very close to Ampere (and honestly, with AIB markup there is NO price difference). Neither vendor can remotely supply what the market demands anyway, so why not harvest as much revenue as possible? More power to AMD here; they are making a rational decision to optimize their profits, but the insufferable fanboys who worship every tiny thing AMD does are even more retarded in this context. AMD could conceivably choose to undercut Nvidia, but they won't, because their supply wouldn't be enough either way, so why bother? Might as well charge the same and be constantly sold out like Nvidia is. Which is exactly what they are doing, and the fanboys just lap it up. Pathetic.
 

Kenpachii

Member
Let's all keep in mind that Nvidia isn't even prioritizing rasterization anymore; they have dedicated significant precious die space to the RT units, the Tensor cores, and the video encoding unit. Even though it's been obvious since Turing that they don't care that much about rasterization, they can still beat out AMD, which put everything they had into rasterization for Big Navi and doesn't even have dedicated units for RT and Tensor work (needed for DLSS). And they are doing it on Samsung's kind of shitty "8nm" process, which is similar to TSMC 10nm, whereas AMD is using TSMC's top-of-the-line 7nm high-power process. TSMC 5nm is only for low-power chips like ARM; they do not have a high-power 5nm process.

Nvidia dominates this gen from top to bottom and it's not even really a contest, especially since AMD knows they are just as supply-constrained as Nvidia is, so they are pricing Big Navi very close to Ampere (and honestly, with AIB markup there is NO price difference). Neither vendor can remotely supply what the market demands anyway, so why not harvest as much revenue as possible? More power to AMD here; they are making a rational decision to optimize their profits, but the insufferable fanboys who worship every tiny thing AMD does are even more retarded in this context. AMD could conceivably choose to undercut Nvidia, but they won't, because their supply wouldn't be enough either way, so why bother? Might as well charge the same and be constantly sold out like Nvidia is. Which is exactly what they are doing, and the fanboys just lap it up. Pathetic.

Of course they do. The 3080 would never have come out with the full-blown top-end chip if they didn't care.
 

Starfield

Member
People STILL can't get an RTX 3xxx card? So glad I got mine in November lol

Has it gotten worse or better since November? More frequent drops?
 
Last edited:

Rickyiez

Member
The PSU is a brand-new Cooler Master 750W.
I did wonder if it was a PSU issue. But then why would the black screen also happen when I'm not gaming (although that's very rare and often happens shortly after I've closed a game)?
The card requires three 8-pin power connectors though, if that helps.

What's daisy-chaining? :p
[Image: PCIe power cabling illustration]


This is a perfect illustration. The last one on the right is daisy-chaining.
 
Last edited:
After having owned an RTX 3080, it seems like a fool's game to expect wonders from a hypothetical additional 10GB of VRAM. The thing already can't run some current games maxed out at 4K60 and it's almost never because of VRAM limitations. 20GB VRAM or not, I don't think it's reasonable to expect to run new games three or four years from now maxed out at 4K and get 60 fps or higher.

Basically, I don't think the 10GB VRAM is ever going to be a legitimate bottleneck for this card outside of some very edge scenarios.
We don't have any real next-gen games yet though. Even Cyberpunk still has last-gen constraints. WD Legion is already pushing way above 8GB at WQHD, but I guess it's Ubisoft so it doesn't really count. But shit ports are not a rarity, unfortunately.

I think the argument "in two years we'll have the 40 series" is not very smart, because it's totally reasonable to skip a GPU gen from time to time.

I will step up from the 3080 to the 3080 Ti with EVGA if it comes out within the next 2-3 months.

People STILL can't get an RTX 3xxx card? So glad I got mine in November lol

Has it gotten worse or better since November? More frequent drops?
From what I have seen in my country, things have not improved one bit. Prices also STILL keep going up. Probably Christmas + Cyberpunk release.
 
Last edited:

Max_Po

Banned
Has anyone compared a 3080 from an Alienware PC to another brand like EVGA?

How do their OEM cards compare to other manufacturers? ASUS/EVGA
 

regawdless

Banned
Edit:
Was thinking about selling my 3080 for a 3090. Looking at some benchmarks... yeah, scrap that thought. At 1440p it's literally just a handful of fps.
 
Last edited:

nemiroff

Gold Member
let's not pretend that 10GB of VRAM is great.

Because..?

[insert technical verification here; no notions and no outdated extrapolations, please]
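For anyone who wants actual numbers rather than extrapolations, one simple way is to log real VRAM usage while playing. A minimal sketch, assuming the pynvml bindings are installed (nvidia-smi reports the same figures); note that this measures allocation, which tends to overstate what a game strictly needs:

import time
import pynvml  # assumed installed, e.g. via `pip install nvidia-ml-py3`

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Print total VRAM allocation once per second; run this alongside the game.
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"used {mem.used / 2**20:.0f} MiB / {mem.total / 2**20:.0f} MiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()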


People STILL can't get an RTX 3xxx card? So glad I got mine in November lol

Has it gotten worse or better since November? More frequent drops?


Here's the situation (it's updated daily) from one of the online shops in my country; yes, it's crazier than ever:

NVIDIA GeForce RTX | 3060 Ti | 3070 | 3080 | 3090 » Full overview here! (proshop.no)

[Screenshot: stock status overview]

Many cards like the Asus Strix and MSI Trio even have a "none delivered since launch" status. It's pretty bad.

I'm kinda over it, especially since I found out that I can run both Cyberpunk and even MSFS in VR, looking good, with my old 1080 at 1950 MHz. I'll buy one when they are available.
 
Last edited: