
On paper, the GPU in the Xbox Series X is faster than a GeForce RTX 2080 Super.

diffusionx

Gold Member
Yes, they do. No PC game today is built for a 2080 Ti. They are built for the mainstream cards, and then the additional performance is used to make them look better. It's a brute force approach. A console game built for 12 TF will crush any PC game that can use a 12 TF GPU but is built for 6. The 12 TF are the baseline for Xbox next gen. The 13.5 TF of a 2080 Ti are the maximum of PC gaming (aside from the $2,400 Titan RTX). It's going to be a huge leap forward, and anybody who thinks that what a 2080 Ti can deliver today will be more than what an XSX can deliver in a year will be proven wrong.

Haha, what? Define "crush"? A game running on a PC with a 2080 Ti will perform better than the same game running on an Xbox. Anyone with a 2080 Ti or similar card will benefit from this "huge leap forward" without having to buy anything.

I don't know what the next gen of console games is going to look like because they haven't arrived yet, but my guess is, early on at least, they actually will look and run like current gen games with higher settings. That's how it has gone for the past two console transitions at least. Anyone who thinks that the Xbox will arrive with all these mind blowing games that melt the eyes off everyone and make PC gamers beg for forgiveness is setting themselves up for disappointment. It's not going to happen.
 

S0ULZB0URNE

Member
😂😀😅😆 Can you show me some examples?! Cause I must have missed this somehow.

I have a single liquid-cooled 2080 Ti. I have not OC'd it. But it does auto-OC by itself, and by a huge margin. (GPUs from both Nvidia and AMD do this automatically because of the environment of PCs.) I'm touching 16 TF plus, continuously (even though TF numbers don't count). That's just because of the cooling it can provide as well as the power it can receive. But even if I were to limit it to stock frequency and not auto-overclock, it will still beat out every new game for ALL of next gen, just because of the difference in PC architecture. No TDP or power limitations, cooling limitations, cut-down chip, etc. So.... You backing out from that bet, or not? You have absolutely NOTHING to lose, except your reputation.
Yeah, that's OCed boost clocks, auto or not.
Plus we don't know the CPU, which I am assuming is OCed as well.
What is the bet exactly?
 
Haha, what? Define "crush"? A game running on a PC with a 2080 Ti will perform better than the same game running on an Xbox. Anyone with a 2080 Ti or similar card will benefit from this "huge leap forward" without having to buy anything.

I don't know what the next gen of console games is going to look like because they haven't arrived yet, but my guess is, early on at least, they actually will look and run like current gen games with higher settings. That's how it has gone for the past two console transitions at least. Anyone who thinks that the Xbox will arrive with all these mind blowing games that melt the eyes off everyone and make PC gamers beg for forgiveness is setting themselves up for disappointment. It's not going to happen.
I keep telling folks this. And it may seem like we're being "elitist", but it isn't. It's just the cold hard truth. I've said it many times before, and I'll say it again: I HONESTLY WANT consoles to be as good as they can be, so PC games don't get nerfed because devs have to make a game that is playable on lower-end hardware.
 

diffusionx

Gold Member
Also of note - virtually every third-party game these days comes out on every platform. Microsoft's games come out on PC. They'll be written for "6 TF" too. This is realistically 98% of the market - basically everything but first-party Sony and Nintendo games.

Hell, some of them will be written for the One S. This is the market. The days of Konami writing custom assembly language and dev tools for their PS2 library are long gone.
 

S0ULZB0URNE

Member
I keep telling folks this. And it may seem like we're being "elitist", but it isn't. It's just the cold hard truth. I've said it many times before, and I'll say it again: I HONESTLY WANT consoles to be as good as they can be, so PC games don't get nerfed because devs have to make a game that is playable on lower-end hardware.
No you don't, and you are obviously beyond rattled by the beast that is the XSX.
 
Yeah, that's OCed boost clocks, auto or not.
Plus we don't know the CPU, which I am assuming is OCed as well.
What is the bet exactly?
I don't OC my GPU.

Does it matter what the specs are? You claimed the XSX will beat the 2080 Ti within a year.

I'll bet my GPU that a moderate CPU and a 2080 Ti will decimate every game for "next gen" hardware.
 

pawel86ck

Banned
Now let's say you have a 2080/2080S/2080 Ti, you'll be able to play every game for next
You don't know how much VRAM next-gen ports will require. In 2012 I bought a GTX 680, and back then I thought 2 GB was an insane amount of VRAM, but as soon as the PS4 launched my GPU was VRAM limited. Even current-gen ports can sometimes use up to 8 GB of VRAM, and VRAM requirements will only increase.
 
You don't know how much VRAM next-gen ports will require. In 2012 I bought a GTX 680, and back then I thought 2 GB was an insane amount of VRAM, but as soon as the PS4 launched my GPU was VRAM limited. Even current-gen ports can sometimes use up to 8 GB of VRAM, and VRAM requirements will only increase.
In 2012, there were the PS3 and Xbox 360. The GTX 680 beats out both consoles by a landslide. It still beats the XB1 and PS4, minus VRAM. But those days are long gone. Just about all current-gen GPUs have a minimum of 6 GB of VRAM or more. PCs work completely differently in that their "RAM" isn't pooled into a single lump sum of memory. On PC you still have your dedicated system memory, plus VRAM. 16 GB or 32 GB (or more) of system memory, plus xx GB of VRAM, is what you're dealing with on PC. Insufficient VRAM hasn't been an issue for any GPU launched in the past couple of years, ESPECIALLY the GPUs equivalent or better to previous-, current- and next-gen consoles.

Btw, you bought the low-end 680 with 2 GB of VRAM, and not the 4 GB one. A couple bucks more would have been a better investment.
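To make the two memory layouts in this back-and-forth concrete, here's a minimal sketch; all the budget numbers are illustrative assumptions, not spec sheets. A console shares one unified pool between CPU-side and GPU-side data, while a PC splits memory into system RAM plus VRAM, and the GPU-side working set has to fit in VRAM alone.

```python
# Illustrative memory budgets only; not from any spec sheet.
console_unified_gb = 16               # one shared GDDR6 pool (next-gen console)
cpu_side_gb, gpu_side_gb = 4, 12      # a split a console game might hypothetically choose

pc_system_ram_gb, pc_vram_gb = 16, 8  # e.g. a 2080-class PC: two separate pools
# System RAM can't substitute for VRAM: the GPU working set must fit in VRAM.
print(f"Console: {cpu_side_gb + gpu_side_gb} of {console_unified_gb} GB used from one pool")
print(f"PC: does the {gpu_side_gb} GB GPU-side set fit in {pc_vram_gb} GB of VRAM? {gpu_side_gb <= pc_vram_gb}")
```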
 

darkinstinct

...lacks reading comprehension.
Haha, what? Define "crush"? A game running on a PC with a 2080 Ti will perform better than the same game running on an Xbox. Anyone with a 2080 Ti or similar card will benefit from this "huge leap forward" without having to buy anything.

I don't know what the next gen of console games is going to look like because they haven't arrived yet, but my guess is, early on at least, they actually will look and run like current gen games with higher settings. That's how it has gone for the past two console transitions at least. Anyone who thinks that the Xbox will arrive with all these mind blowing games that melt the eyes off everyone and make PC gamers beg for forgiveness is setting themselves up for disappointment. It's not going to happen.

You didn't read correctly. I said that what a 2080 Ti is showing today will be worse than what an XSX can deliver in 2021. And that is true, because games today are not built for 12 TF. Of course a 2080 Ti will profit from that new baseline and display games even better. That's not the point. The point is that if you think next-gen games won't look better than 2080 Ti games today, you are fooling yourself.
 

S0ULZB0URNE

Member
I don't OC my GPU.

Does it matter what the specs are? You claimed the XSX will beat the 2080 Ti within a year.

I'll bet my GPU that a moderate CPU and a 2080 Ti will decimate every game for "next gen" hardware.
It sure does matter.
An XSX will put out better-looking games than a 2080 Ti.
Your card is OCed, so it's not the same.
You don't own a moderate CPU though.

Decimate, eh?
So a REGULAR-clocked 2080 Ti and a moderately powered non-OCed CPU will decimate an XSX?
I'll take that bet!
 
You didn't read correctly. I said that what a 2080 Ti is showing today will be worse than what an XSX can deliver in 2021. And that is true, because games today are not built for 12 TF. Of course a 2080 Ti will profit from that new baseline and display games even better. That's not the point. The point is that if you think next-gen games won't look better than 2080 Ti games today, you are fooling yourself.
And why wouldn't games look better in the future with a 2080 Ti? Games aren't being tailored to that super high end, because of what? Because it would leave every current-gen console, every next-gen console, and plenty of PC gamers obsolete. Imagine turning current roads into a place where you can only drive the fastest, highest-end cars. It would keep the majority of cars from driving on those streets. It works the same way.
 
It sure does matter.
An XSX will put out better-looking games than a 2080 Ti.
Your card is OCed, so it's not the same.
You don't own a moderate CPU though.

Decimate, eh?
So a REGULAR-clocked 2080 Ti and a moderately powered non-OCed CPU will decimate an XSX?
I'll take that bet!
The bet is on! Please get a mod in here to sanction this!!

Let's see in 2021; I'll even give you till December 31st to prove me wrong. This is gonna be so utterly embarrassing for you, just FYI.
 

S0ULZB0URNE

Member
In 2012, there were the PS3 and Xbox 360. The GTX 680 beats out both consoles by a landslide. It still beats the XB1 and PS4, minus VRAM. But those days are long gone. Just about all current-gen GPUs have a minimum of 6 GB of VRAM or more. PCs work completely differently in that their "RAM" isn't pooled into a single lump sum of memory. On PC you still have your dedicated system memory, plus VRAM. 16 GB or 32 GB (or more) of system memory, plus xx GB of VRAM, is what you're dealing with on PC. Insufficient VRAM hasn't been an issue for any GPU launched in the past couple of years, ESPECIALLY the GPUs equivalent or better to previous-, current- and next-gen consoles.

Btw, you bought the low-end 680 with 2 GB of VRAM, and not the 4 GB one. A couple bucks more would have been a better investment.
The PS4 has put out better-looking "games" than the 680 has on PC, though.
 

diffusionx

Gold Member
You didn't read correctly. I said that what a 2080 Ti is showing today will be worse than what an XSX can deliver in 2021. And that is true, because games today are not built for 12 TF. Of course a 2080 Ti will profit from that new baseline and display games even better. That's not the point. The point is that if you think next-gen games won't look better than 2080 Ti games today, you are fooling yourself.

Eh, maybe. I doubt it. Like I said, I think the first round of next-gen games at least will look a lot like today's games at high-end PC settings. Eventually, maybe, but keep in mind that by then there will be a 3080 Ti and a 4080 Ti and then a 5080 Ti that will take those games well beyond what the XSX can do.
 

GymWolf

Member
Anyone have links to videos of 12 TF PC graphics? I tried to find some on YouTube and I wasn't hugely impressed.
because the 12 tf on pc are just used for bells and whistles in games developed for a 1.8 tf machine, the core of the game\graphics is still the same. just wait for games developed with a 12 tf machine in mind.

watch some Metro Exodus, Control or the RDR2 PC version to see some impressive things (still, the core remains "old" even in these games)
 

Myths

Member
If it can run with this level of quality:

[embedded video]

Then I'd be impressed. And keep in mind, I'm overdoing it with 4X AA at 4K because my eyes still detect the jags on cars. Call it overkill, oh well.

I doubt it. But I’m always up for a challenge to be proven dead wrong.
 

alucard0712_rus

Gold Member
This whole post didn't disprove what he just said. Details on RDR2, like you said, 50-80 fps. What do you have to set to get that? And we all know it isn't maxed out for those settings.

I know this because I have a 9900K, a 2080 Ti, EVO 970s, and a 55" OLED C9 with G-Sync, and game to game, sacrifices have to be made and worked out PER SYSTEM to get the best balance you can achieve.

Making blanket statements that they just run better is BS. You have to lower settings for a game to run better; the option is there to do so because the games are designed to work with multiple configurations.

None of these games are designed for your 2070 or my 2080 Ti specifically. We just have the option to adjust and get better FPS. But none of us is at max settings in every game at 4K or 1440p.

I think your statement didn't disprove what I said either.
He did not mention different PC configurations either. In my configuration it runs better, at higher settings and without problems.
And I'm a "console" guy.
 
PS4 destroys the 750.
It's very true that games like GoW, The Order and Uncharted 4 (to name a few) look better than anything the 680 has put out.
Care to provide some examples or in-depth proof? Not cherry-picked screenshots like in the other thread you were in. If not, a simple Google search can disprove that rather easily. So go ahead, I'm waiting.
 

pawel86ck

Banned
In 2012, there were the PS3 and Xbox 360. The GTX 680 beats out both consoles by a landslide. It still beats the XB1 and PS4, minus VRAM. But those days are long gone. Just about all current-gen GPUs have a minimum of 6 GB of VRAM or more. PCs work completely differently in that their "RAM" isn't pooled into a single lump sum of memory. On PC you still have your dedicated system memory, plus VRAM. 16 GB or 32 GB (or more) of system memory, plus xx GB of VRAM, is what you're dealing with on PC. Insufficient VRAM hasn't been an issue for any GPU launched in the past couple of years, ESPECIALLY the GPUs equivalent or better to previous-, current- and next-gen consoles.

Btw, you bought the low-end 680 with 2 GB of VRAM, and not the 4 GB one. A couple bucks more would have been a better investment.
There was only a 2 GB version at launch, and the GTX 680 wasn't low end (in fact it was the fastest Nvidia GPU back then). Yes, my GTX 680 could run every PS3/Xbox 360 port like a dream (those games used only around 500-1000 MB of VRAM, so there was plenty left), but not ports from next-gen consoles, and the same story will apply to the 8 GB RTX 2070/2080 (and not even an insane amount of system RAM will help with the stuttering and pauses when the GPU runs out of VRAM). It's never a good idea to build a high-end PC before a next-gen console launch; however, Ampere GPUs should already be prepared to run XSX/PS5 ports (I expect around 16-20 GB of VRAM on those cards).
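Rough arithmetic behind the stutter point, using rounded published bandwidth figures: once assets overflow VRAM they have to stream across PCIe, which is an order of magnitude slower than the card's local memory.

```python
# Bandwidth gap that causes stutter when a game overflows VRAM (rounded figures).
gddr6_gbs = 616      # RTX 2080 Ti local VRAM bandwidth, GB/s
pcie3_x16_gbs = 16   # practical PCIe 3.0 x16 throughput, GB/s
print(f"Assets spilled to system RAM stream ~{gddr6_gbs / pcie3_x16_gbs:.1f}x slower")  # ~38.5x
```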
 

HolyTruth

Banned
I don't recall AMD putting out a single game before... Maybe you're confused about something else? How about this: I challenged someone on this before. If a game comes out next gen that can beat my current GPU, you can have it for free. I'll even pay shipping. Now if I win, you owe me NOTHING. Let's see how confident you are in these "claims".


[image: ZIUrQpq.jpg]

anyone can buy the box on eBay for $10
 
There was only a 2 GB version at launch, and the GTX 680 wasn't low end (in fact it was the fastest Nvidia GPU back then). Yes, my GTX 680 could run every PS3/Xbox 360 port like a dream (those games used only around 500-1000 MB of VRAM, so there was plenty left), but not ports from next-gen consoles, and the same story will apply to the 8 GB RTX 2070/2080 (and not even an insane amount of system RAM will help with the stuttering and pauses when the GPU runs out of VRAM). It's never a good idea to build a high-end PC before a next-gen console launch; however, Ampere GPUs should already be prepared to run XSX/PS5 ports (I expect around 16-20 GB of VRAM on those cards).
When I said low end, I meant the low-end version of the 680. Almost 10 years ago, consoles were closer to PCs graphically, with PCs having the bonus of being able to run higher framerates, higher AA, and higher resolutions. That card still performed well for the next few years after the current-gen consoles released. But I agree, it's not the best idea to buy a GPU right before the "next gen". You'll always get better results getting a current GPU before next gen, but you'll be in a much better place getting a GPU around the time of the release of new consoles. Turing was like a mid-cycle refresh between Pascal and Ampere. Ampere will be around 20 TF or more (hate to even use that stupid TF metric, honestly). So around 50% faster or so than "next gen".
 

mitchlol

Member
Can someone fill me in on how clock speeds affect TFs? Microsoft is saying 12 TF, but is that based on the die/core counts only? Or is it possible to gauge what frequency the GPU will be clocked at and boost up to? Surely power/heat need to be a consideration for the XSX, and historically AMD isn't..... quiet or cool?

Either way, it's great that the XSX will be powerful for many years after release, but if Ampere is a generational leap then won't it feel like PCs have already leapfrogged over consoles before they're even out?
 
Can someone fill me in on how clock speeds affect TFs? Microsoft is saying 12 TF, but is that based on the die/core counts only? Or is it possible to gauge what frequency the GPU will be clocked at and boost up to? Surely power/heat need to be a consideration for the XSX, and historically AMD isn't..... quiet or cool?

Either way, it's great that the XSX will be powerful for many years after release, but if Ampere is a generational leap then won't it feel like PCs have already leapfrogged over consoles before they're even out?
TF is the combination of core count and frequency. So the 12 TF MS claims already accounts for the clock speed and the CUs. With a higher clock speed, your TF count will go up. So a stock 2080 Ti is about 14 TF, but it can go up to 16 or more by boosting to higher clocks for a sustained length of time, given ample cooling and the power it requires to reach that count.
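For anyone who wants the arithmetic: FP32 TFLOPS is just shader count × 2 ops per clock (a fused multiply-add counts as two) × clock speed. A quick sketch below uses the publicly stated shader counts and clocks; the ~1900 MHz sustained boost is an assumption for illustration.

```python
# FP32 TFLOPS = shader cores x 2 ops per clock (fused multiply-add) x clock speed
def tflops(shader_cores: int, clock_mhz: float) -> float:
    return shader_cores * 2 * (clock_mhz * 1e6) / 1e12

print(round(tflops(3328, 1825), 2))  # Xbox Series X: 52 CUs x 64 shaders, fixed 1825 MHz -> 12.15
print(round(tflops(4352, 1545), 2))  # RTX 2080 Ti at the 1545 MHz reference boost -> 13.45
print(round(tflops(4352, 1900), 2))  # same card sustaining ~1900 MHz (assumed) -> 16.54
```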
 

GymWolf

Member
If it can run with this level of quality:

[embedded video]

Then I'd be impressed. And keep in mind, I'm overdoing it with 4X AA at 4K because my eyes still detect the jags on cars. Call it overkill, oh well.

I doubt it. But I’m always up for a challenge to be proven dead wrong.

i think this video is gonna get destroyed by the first proper next-gen AAA open-world game from sony or ubisoft or microsoft (or any other big dev), no doubt about it.

maybe not in the microdetails, where rockstar is the queen, i'll give you that.

not talking about half-assed projects or multi-gen titles or AA games of course.
 

Alexios

Cores, shaders and BIOS oh my!
If you sensibly upgrade your CPU every two console generations and your GPU every generation, but time both upgrades mid-console-generation rather than at the start or end of one, you can play most games better than consoles and have a crazy back catalogue to revisit, improved in UHD-remaster style, for free. Plus cheap games, no online fees, VR, a broad selection of control devices, mods. The PC may struggle just before that mid-generation GPU upgrade point, but you can postpone those few games for a while until you can soon destroy them & most consoles aren't bought at launch either...
 
I agree that a 12 TFLOP GPU utilizing RDNA 1 would be more powerful than an Nvidia 2080 GPU. However, I don't think it would be much more powerful. I think a 13 TFLOP GPU would be required to match a 2080 Super.
 
i think this video is gonna get destroyed by the first proper next-gen AAA open-world game from sony or ubisoft or microsoft (or any other big dev), no doubt about it.

maybe not in the microdetails, where rockstar is the queen, i'll give you that.

not talking about half-assed projects or multi-gen titles or AA games of course.
For a 6-plus-year-old game... I honestly would pray to God that is the case. But in a nutshell, that is not a hard feat to accomplish.
 

GymWolf

Member
For a 6-plus-year-old game... I honestly would pray to God that is the case. But in a nutshell, that is not a hard feat to accomplish.
it's a heavily modded version of a 6 year old game tho, if i'm not mistaken.
even modded skyrim is a delight to watch on pc.

still, if you look at some details, it's clearly still an old-gen game at its core.

why are you so diffident? look at rdr2 on a 1.3-1.8 tf machine, it looks fucking glorious\impossible.

imagine the same devs (or people like guerrilla, nd, ubisoft, etc.) on a 9-12 tf machine, why is it so hard to believe?

we can pray together but i'm an atheist so you do the talk and i just move my lips in rhythm with your voice 😆
 
it's a heavily modded version of a 6 year old game tho, if i'm not mistaken.
even modded skyrim is a delight to watch on pc.

still, if you look at some details, it's clearly still an old-gen game at its core.

why are you so diffident? look at rdr2 on a 1.3-1.8 tf machine, it looks fucking glorious\impossible.

imagine the same devs (or people like guerrilla, nd, ubisoft, etc.) on a 9-12 tf machine, why is it so hard to believe?

we can pray together but i'm an atheist so you do the talk and i just move my lips in rhythm with your voice 😆
Games will definitely look better as time progresses, so I would definitely hope next-gen core gameplay looks better than old modded games. With that being said, games will look even better on higher-end hardware. It's just how technology and gaming progress as a whole.
 

GymWolf

Member
Games will definitely look better as time progresses, so I would definitely hope next-gen core gameplay looks better than old modded games. With that being said, games will look even better on higher-end hardware. It's just how technology and gaming progress as a whole.
we are basically saying the same thing 😆
 

S0ULZB0URNE

Member
Care to provide some examples or in-depth proof? Not cherry-picked screenshots like in the other thread you were in. If not, a simple Google search can disprove that rather easily. So go ahead, I'm waiting.
No, that's not how it works.
I made the statement; it's you who has to try and refute it.
 

S0ULZB0URNE

Member
Honestly, with the closed-box environment of consoles, the Series X will punch well above its weight compared to those high-end cards. The real-world outcome will be even greater than what's stated here.
Exactly, plus the potential of games being built from the ground up with much faster storage in mind, unlike on PC.
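Rough load-time arithmetic behind the storage point (drive throughputs are typical or announced figures; the 10 GB scene size is an assumption for illustration):

```python
# Time to stream a hypothetical 10 GB scene from different storage tiers.
scene_gb = 10                               # assumed working-set size
drives_gbs = {"7200rpm HDD": 0.1,           # typical sustained read, GB/s
              "SATA SSD": 0.55,
              "XSX NVMe SSD (raw)": 2.4}    # Microsoft's announced raw figure
for name, gbs in drives_gbs.items():
    print(f"{name}: {scene_gb / gbs:.0f} s")  # HDD ~100 s vs XSX SSD ~4 s
```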
 
No, that's not how it works.
I made the statement; it's you who has to try and refute it.
I can take the best screenshots from those games, and they just don't look anywhere near "better" than even multiplatform games on PC. So I was hoping you could disprove me. I mean, check out the PC screenshot thread on here. FULL of examples to choose from. You don't even have to leave this site.
 

S0ULZB0URNE

Member
I can take the best screenshots from those games, and they just don't look anywhere near "better" than even multiplatform games on PC. So I was hoping you could disprove me. I mean, check out the PC screenshot thread on here. FULL of examples to choose from. You don't even have to leave this site.
Those games got awards for best visuals.
No 3rd-party game that the 680 could run can best them.
 

Romulus

Member
Also of note - virtually every third-party game these days comes out on every platform. Microsoft's games come out on PC. They'll be written for "6 TF" too. This is realistically 98% of the market - basically everything but first-party Sony and Nintendo games.

Hell, some of them will be written for the One S. This is the market. The days of Konami writing custom assembly language and dev tools for their PS2 library are long gone.

The PS5 will be the exception here. Even at 9 TF, custom games built for that hardware will look amazing combined with a gaming CPU. It won't have to run on lower-spec PCs like XSX ports do.
 

pawel86ck

Banned
I agree that a 12 TFLOP GPU utilizing RDNA 1 would be more powerful than an Nvidia 2080 GPU. However, I don't think it would be much more powerful. I think a 13 TFLOP GPU would be required to match a 2080 Super.
We still don't know how much faster RDNA2 GPUs will be. There are leaks that suggest massive performance gains compared to RDNA1.

 

SmokSmog

Member
The Xbox SX is 12 TF RDNA2; the 5700 XT is RDNA1 at 9.75 TF.

Add 30% to the 5700 XT's relative performance and you will get something between an RTX 2080 and a 2080 Ti.

[chart: relative-performance_3840-2160.png]
 
The Xbox SX is 12 TF RDNA2; the 5700 XT is RDNA1 at 9.75 TF.

Add 30% to the 5700 XT's relative performance and you will get something between an RTX 2080 and a 2080 Ti.

[chart: relative-performance_3840-2160.png]
30% would be the best case. Remember, the 25% improvement for RDNA1 was best case; it's not always that big of an improvement in every game.

I hope I'm wrong and RDNA2 turns out to be a HUGE change in architecture.
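The scaling argument in this exchange is easy to put into numbers. A back-of-the-envelope sketch, where the per-TF (IPC) uplifts are assumptions rather than measurements:

```python
# Scale 5700 XT performance to XSX by raw TFLOPS, then layer on an assumed
# RDNA1 -> RDNA2 per-TF improvement to bracket the outcome.
xsx_tf, navi10_tf = 12.15, 9.75     # XSX (RDNA2) vs 5700 XT (RDNA1)
raw_scale = xsx_tf / navi10_tf      # ~1.25x from TFLOPS alone
for ipc_gain in (0.0, 0.15, 0.30):  # assumed per-TF uplifts: none, moderate, best case
    print(f"{ipc_gain:.0%} per-TF gain -> {raw_scale * (1 + ipc_gain):.2f}x a 5700 XT")
```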
 