
Scorn Developer: Xbox Series X Scorn Trailer Was Running on an RTX 2080Ti

IntentionalPun

Ask me about my wife's perfect butthole
Bottom line is:

If the XSX can't output the graphics shown in that middling presentation, they have much bigger problems than bad marketing lol

I would hope the XSX can do far more than that shit.. I mean Scorn itself IIRC barely looked like a game at this point.

But considering the above statements, I doubt the reason we saw PC footage is that the XSX isn't capable of what was shown.
 

CurtBizzy

Member
A 2080 Ti is capable of 60 fps vs Xbox Series X 30 fps

 

Bo_Hazem

Banned
Are you sure? The 2080 Ti is 13.4 TF only on paper; in reality it's 16 TF out of the box thanks to GPU boost, and up to 18 TF when OC'd. I have seen benchmarks where the 2080 Ti literally demolished the 5700 XT, and I don't believe the XSX can do the same.

Of course I'm talking about base clocks. The comparison between the two was pretty close, with the 2080 Ti being a bit sharper on some details; you must watch it on a 4K screen to spot the difference:





If that YouTuber can be trusted, though.
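The "16 TF out of the box, 18 TF overclocked" figures above can be sanity-checked with the standard FP32 formula (2 fused multiply-add ops per shader per clock). A minimal sketch; the real-world boost and OC clocks used here are assumptions, not measured values:

```python
def tflops(shaders: int, clock_mhz: float) -> float:
    """FP32 TFLOPS: 2 fused multiply-add ops per shader per cycle."""
    return 2 * shaders * clock_mhz / 1e6

# The RTX 2080 Ti has 4352 shaders; 1545 MHz is the official boost spec.
# ~1850 MHz typical in-game boost and ~2070 MHz OC are assumptions.
print(round(tflops(4352, 1545), 1))  # 13.4 TF "on paper"
print(round(tflops(4352, 1850), 1))  # 16.1 TF at a typical sustained boost
print(round(tflops(4352, 2070), 1))  # 18.0 TF overclocked
```

So the spread the poster describes comes entirely from how far above the rated boost clock the card actually runs.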
 

IntentionalPun

Ask me about my wife's perfect butthole
A 2080 Ti is capable of 60 fps vs Xbox Series X 30 fps


Well... that could be because the XSX's ray tracing sucks compared to RTX.

I don't think Scorn or any game shown was using ray tracing.

Either way... nothing shown in the MS video would REQUIRE a 2080 Ti... it all looked like shit that would run on a 2060 at high framerates...
 

IntentionalPun

Ask me about my wife's perfect butthole
As per usual. Then when the games actually release on consoles, the 'downgrade' outrage begins. Common sense: you aren't getting the performance of a $1,300 GPU in a $500 box. That people fall for this every single time is just amazing.
Yeah we definitely aren't...

.. but what game in that video appeared to require a $1300 GPU? lol
 

DeepEnigma

Gold Member
Anyone with a brain would think it was running on early dev-kit, with all due respect:

"Game and console in development; in-engine footage representative of expected Xbox Series X visual quality".

Just like the UE5 demo was running on a PS5 dev-kit, not the final console. Does that make me deceived? I can't see it that way.

That is a rather ambiguous statement. The only thing clearly transparent was the font.

Call it 1000, then. My point is the same.

I still think they price them almost 2.5 to 3 times what they can still profit off of, because the market will pay for it.

The crypto tax never dropped, they only added upon it. But I get what you are saying.
 

pawel86ck

Banned
Of course I'm talking about base. The comparison between the two was pretty close, with 2080Ti being a bit sharper on some details:





If that YouTuber can be trusted.

It depends on the game, but a 59% difference in 4K is really big IMO, and I doubt the XSX would be able to close the gap without some downgrades.
 
Let's be honest. It's only $1300 because nVidia says so.
It's so expensive because every generation they test out what they can get away with, and people LOVE it when others are excluded from buying a bigger dick. So the upward trend continues.

A few generations ago you bought the mid-market card for 300 bucks. We're somewhere around 550 now, I think.
But that's a hate topic for another thread :mad:
 

RespawnX

Member
RDNA 1 is nowhere near close to the Turing architecture, let alone RDNA 2. Just compare Minecraft RT on PC and XSX. The PC version is far more complex and also sustains a higher fps count. The XSX version is quite the opposite.
Despite AMD's 7nm vs Nvidia's 12nm, Turing is still more energy-efficient, and the architecture is simply better on Nvidia's side. AMD with RDNA 2 somehow got close, but that's not enough.

Ampere will just butcher RDNA 2 soon.

It took me less than five minutes to figure out that the RX 5700 XT has around 9.75 TFLOPS and the RTX 2080 around 10.1 TFLOPS. As we only have TFLOPS from the XSX to compare, we compare them. The RX 5700 XT has basically the same fps-per-flop ratio as the RTX 2080 (the RTX is a bit better). We know that RDNA 2.0 is going to have >10% IPC gains, and the XSX has 12 TFLOPS across 52 CUs, which scale better than clock increases do (at least for RDNA 1.0). Basic maths and some approximation, and you get around RTX 2080 Ti performance. That's comparing evolving engines like BF V and Metro Exodus, not stone-age games like GTA V. Of course, if you take energy efficiency into account, Turing still has an advantage, but not that big. We will see the real gains of 7nm on the Nvidia side with Ampere, but the leaks point to a far higher Tensor core count and only "slightly" (20-30%) better shader rendering performance.

So why are we discussing things we have known for half a year? Did you miss the whole XSX discussion, or are you pissed off because at the end of the year you can buy a console with today's high-end performance for the price of today's mid-range GPU?
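The estimate above reduces to a few lines of arithmetic. A rough sketch; every input is the post's own claim (TFLOPS figures, the >10% IPC uplift), not a benchmark:

```python
# Back-of-envelope version of the fps-per-flop estimate above.
xsx_tf = 12.1          # claimed XSX FP32 throughput
rx5700xt_tf = 9.75     # RDNA 1 reference point per the post
rdna2_ipc_gain = 1.10  # the claimed >10% IPC uplift for RDNA 2

# If fps-per-flop held constant, the XSX would land here vs a 5700 XT:
ratio = xsx_tf * rdna2_ipc_gain / rx5700xt_tf
print(round(ratio, 2))  # 1.37x, i.e. roughly 2080 Ti territory at 4K
```

The conclusion only holds if fps really does scale linearly with flops across RDNA generations, which is the weakest link in the argument.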
 
D

Deleted member 471617

Unconfirmed Member
No issues, as all third-party games end up being shown on high-end PCs, so it's not really a surprise. As long as the game itself is good and runs great when it releases day one on Xbox Series X, I'll be a happy camper, as that increases the chances of me giving the game a chance.
 

Ascend

Member
Ahh, so you're saying that it's okay to show how games will run on a future AMD GPU by using Nvidia's current best GPU.

That AMD secret sauce is coming... eventually!
or you can just get the Nvidia GPU right now
What's the problem if it's equivalent performance? I'm not interested in GPU brand wars with you.

5700 XT game clock = 1755 MHz
XSX GPU clock = 1825 MHz
Slight advantage for XSX

5700 XT = 40 CUs
XSX GPU = 52 CUs
XSX has 30% advantage here.

The 2080 Ti is about 40% faster than a 5700 XT at 4K. So... the slightly higher clock, the additional CUs, the potential IPC increase, plus console optimization will clearly put it close to a 2080 Ti. If you can't believe that, that's your problem.
 

Seko

Neo Member
RDNA 1 is nowhere near close to the Turing architecture, let alone RDNA 2. Just compare Minecraft RT on PC and XSX. The PC version is far more complex and also sustains a higher fps count. The XSX version is quite the opposite.
Despite AMD's 7nm vs Nvidia's 12nm, Turing is still more energy-efficient, and the architecture is simply better on Nvidia's side. AMD with RDNA 2 somehow got close, but that's not enough.

Ampere will just butcher RDNA 2 soon.
Here is a comparison between Navi (RDNA 1) and Turing. It's in German, but with the chart it should still be understandable.
 

Ascend

Member
A 2080 Ti is capable of 60 fps vs Xbox Series X 30 fps


DLSS effectively renders the game at a lower resolution and uses ML to make it look native. Impressive tech, but not useful for comparing GPU performance fairly. While I agree that it doesn't matter if you can't tell the difference in the end, you can't use this to say that the XSX GPU is not as powerful as a 2080 Ti.

But that's not even what I'm worried about. I didn't watch the whole video, but I don't think the 2080 Ti used DLSS there. What does matter for a direct comparison is which settings were used, the number of light bounces being the most important, in addition to the render distance. Without those details, the comparison is faulty.
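The reason DLSS skews such comparisons is plain pixel arithmetic. Assuming a 1440p internal resolution upscaled to 4K (the video doesn't state the actual render scale, so this figure is an assumption):

```python
native = 3840 * 2160    # 4K output resolution
internal = 2560 * 1440  # assumed DLSS internal render resolution
print(round(internal / native, 2))  # 0.44: the GPU shades under half the pixels
```

A card shading 44% of the pixels has enormous headroom for higher framerates, which is why a DLSS-on vs native comparison says little about raw GPU power.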
 

Ascend

Member
RDNA 1 is nowhere near close to the Turing architecture, let alone RDNA 2. Just compare Minecraft RT on PC and XSX. The PC version is far more complex and also sustains a higher fps count. The XSX version is quite the opposite.
Despite AMD's 7nm vs Nvidia's 12nm, Turing is still more energy-efficient, and the architecture is simply better on Nvidia's side. AMD with RDNA 2 somehow got close, but that's not enough.

Ampere will just butcher RDNA 2 soon.
Oh, we got one of those people... "energy efficient" blah blah. Just a blind Nvidia follower. After this post puts you in your place, you'll go on ignore. Sorry (not sorry) to burst your bubble.

Navi's IPC is comparable to Turing's, and that is a fact. Computerbase.de did a test comparing Navi and Turing; on average, Navi was 1% faster.
 

jimbojim

Banned
It took me less than five minutes to figure out that the RX 5700 XT has around 9.75 TFLOPS and the RTX 2080 around 10.1 TFLOPS. As we only have TFLOPS from the XSX to compare, we compare them. The RX 5700 XT has basically the same fps-per-flop ratio as the RTX 2080 (the RTX is a bit better). We know that RDNA 2.0 is going to have >10% IPC gains, and the XSX has 12 TFLOPS across 52 CUs, which scale better than clock increases do (at least for RDNA 1.0). Basic maths and some approximation, and you get around RTX 2080 Ti performance. That's comparing evolving engines like BF V and Metro Exodus, not stone-age games like GTA V. Of course, if you take energy efficiency into account, Turing still has an advantage, but not that big. We will see the real gains of 7nm on the Nvidia side with Ampere, but the leaks point to a far higher Tensor core count and only "slightly" (20-30%) better shader rendering performance.

So why are we discussing things we have known for half a year? Did you miss the whole XSX discussion, or are you pissed off because at the end of the year you can buy a console with today's high-end performance for the price of today's mid-range GPU?

Pissed? Lol. Even here, Xbox fans are trying to spread FUD that the XSX is on par with an RTX 2080 Ti. Guess what, it's not.
 

jimbojim

Banned
Oh, we got one of those people... "energy efficient" blah blah. Just a blind Nvidia follower. After this post puts you in your place, you'll go on ignore. Sorry (not sorry) to burst your bubble.

Navi's IPC is comparable to Turing's, and that is a fact. Computerbase.de did a test comparing Navi and Turing; on average, Navi was 1% faster.


Putting me on your ignore list, yet you spew this crap:

The XSX GPU is not too far off from a 2080 Ti, so...

Guess what, it's not close in any single way. The difference between the XSX's 12 TF and the RTX 2080 Ti's 13.45 TF (without OC) is far bigger than the difference between the PS5 and the XSX.
But you can put me on your ignore list.
 

Stuart360

Member
Just for clarification: that screenshot makes it look like the text is faded and hard to see, due to the YouTube bar covering it. It's perfectly bright and clear in the actual trailer, though.
 

AGRacing

Member
But the 2080 Ti is 13.4 TF at base clock? Shouldn't it be the 2080 Super that matches it?

After a year of ridiculous assertions about teraflops in the next-generation speculation thread... and GCN, RDNA 1 & 2... I honestly can't believe this is a real question.

The XSX's 12 TF of performance is not measurable against the One X's 6 TF, or Nvidia's 13 TF, or anything else that isn't RDNA 2, or whatever the architecture of the Series X ends up actually being.
 

Rikkori

Member
It's not out of the question that games on the XSX will rival the performance of a desktop 2080 Ti, especially as the years roll by and developers pile up optimisations around RDNA, which they obviously will, since those are the major platforms for the next 7+ years. I mean, let's face it, Nvidia abandons its GPUs in terms of further driver optimisations as soon as new ones are out, and for Turing that's gonna happen before the new consoles even ship, even if only by a few months. No different from what happened with GCN vs Kepler/Maxwell/Pascal. This will be more obvious in some games and engines than others, as is tradition.

Where PC will really flex its muscles will be more advanced ray tracing & overall performance if DLSS is further adopted. But obviously we're still early for all that. Overall though, I don't think XSX Scorn players will experience something meaningfully worse than a 2080 ti PC gamer.
 

RaySoft

Member
I'm quite certain Sony was ahead starting this next gen, but MS came out first, gun-slingin' and confident, so they succeeded in changing the meta.
But now it seems like they jumped the gun a bit? Let's see if MS can even follow Sony after June 4 ;-)

Edit: I'm just questioning MS' PR push. I have no doubt that the XSX will run this game at least as well as a 2080.
 

RaySoft

Member
A 2080 Ti is capable of 60 fps vs Xbox Series X 30 fps


Just because you may be slower at eating your breakfast doesn't mean you can't be faster at everything else...
None of the consoles will be better at RT. This is AMD's first rendition of it. Cerny actually "said it" between the lines in the Wired interview... something along the lines of "Yes, it can do RT, but we're not focusing on that." Can you say it more clearly?
It's MS that has been hyping up RT; let's see if they can deliver...
 
Do you have no shame? A blatant bait post to, at best, get some tiny slight in against MS, from one of the biggest fanboys on this forum no less. Or to draw in equally terrible warring posts for you to flame-react to...
I'm quite certain Sony was ahead starting this next gen, but MS came out first, gun-slingin' and confident, so they succeeded in changing the meta.
But now it seems like they jumped the gun a bit? Let's see if MS can even follow Sony after June 4 ;-)

Edit: I'm just questioning MS' PR push. I have no doubt that the XSX will run this game at least as well as a 2080.
I rest my case. You are genuinely pathetic and I feel sorry for you.
 

RaySoft

Member
Do you have no shame? A blatant bait post to, at best, get some tiny slight in against MS, from one of the biggest fanboys on this forum no less. Or to draw in equally terrible warring posts for you to flame-react to...

I rest my case. You are genuinely pathetic and I feel sorry for you.
Why haven't they shown any games then? All they've shown has been PC games...

Edit: I'll answer it for you: because they weren't ready! But, "as typical Americans," they sold the skin before the bear was shot.
 

CatLady

Selfishly plays on Xbox Purr-ies X
Why haven't they shown any games then? All they've shown has been PC games...

They've shown a hell of lot more than Sony has.

A logo and a controller are all Sony has shown in all this time. You still haven't even seen the damn console, let alone games. Third parties have shown a couple of very meh, last-gen-or-worse-looking games and an engine tech demo. Have you been as concerned about Sony's utter lack of reveals and communication up to this point?
 

RaySoft

Member
They've shown a hell of lot more than Sony has.

A logo and a controller are all Sony has shown in all this time. You still haven't even seen the damn console, let alone games. Third parties have shown a couple of very meh, last-gen-or-worse-looking games and an engine tech demo. Have you been as concerned about Sony's utter lack of reveals and communication up to this point?
We'll talk June 4, when the tables have turned.
They won't show the console, but at least all the games are running on a PS5, and it will show. SSD, yo! :)
 

CatLady

Selfishly plays on Xbox Purr-ies X
We'll talk June 4, when the tables have turned.
They won't show the console, but at least all the games are running on a PS5, and it will show :)

So you're in here complaining that Xbox hasn't shown enough, when you obviously aren't interested in an Xbox in the first place, but you're perfectly fine that Sony hasn't shown shit up to this point. So pretty much you're just here to console-war, right?
 

RaySoft

Member
So you're in here complaining that Xbox hasn't shown enough, when you obviously aren't interested in an Xbox in the first place, but you're perfectly fine that Sony hasn't shown shit up to this point. So pretty much you're just here to console-war, right?
I just stated my opinion that I think MS came out prematurely. Then I got attacked (ofc) and stated my case.
This news (about Scorn running on PC) just made it more obvious.
That would actually be a serious discussion, but I guess it's not relevant?
 

RaySoft

Member
You, uh, realise the only game Sony has shown is also coming to PC?
What game? Godfall?
My point was that Sony will be showing off PS5 games on the 4th of June, and they won't be labeled "IN-ENGINE FOOTAGE REPRESENTATIVE OF EXPECTED PS5 VISUAL QUALITY".
 