
The PlayStation 5 and the Xbox Series X Will Both Be Partially Outclassed by the Time That They're Released And Fully Outclassed One Year Later

Onocromin

Banned
Oh, and for those mentioning the SSD difference - PCs nowadays allow you to install upwards of 512 GB to 1 TB of onboard system memory. PCs already hold the crown on ridiculous performance - you'll just never see it until someone decides to throw more money at developing a game to take advantage of it. It's why PC gamers love this "consoles seeking parity with PC" sh!t; it means they may finally get a game that actually stresses ONE of their GPUs properly. Up until now, most ports have been poorly optimized on PC and have not taken advantage of the full breadth of performance PC hardware offers. But you do see 45%+ performance gains in games when they actually, somehow, properly take advantage of SLI. And honestly, it would be upwards of a 90% performance improvement over single-GPU solutions if the industry decided to standardize and focus on optimizations for PC SLI solutions.
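As a rough back-of-the-envelope sketch of what those scaling percentages would mean in practice (the 60 fps single-GPU baseline here is an assumed example, not a measured figure):

```python
# Back-of-the-envelope sketch of the SLI scaling percentages quoted above.
# The 60 fps single-GPU baseline is an assumed example, not a benchmark.
single_gpu_fps = 60.0

typical_sli_gain = 0.45   # "45%+ performance" when SLI is actually used properly
ideal_sli_gain = 0.90     # "upwards of 90%" if the industry optimized for SLI

print(f"Single GPU:            {single_gpu_fps:.0f} fps")
print(f"Typical SLI (+45%):    {single_gpu_fps * (1 + typical_sli_gain):.0f} fps")
print(f"Idealized SLI (+90%):  {single_gpu_fps * (1 + ideal_sli_gain):.0f} fps")
```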
 

RaySoft

Member
Oh, and for those mentioning the SSD difference - PCs nowadays allow you to install upwards of 512 GB to 1 TB of onboard system memory. PCs already hold the crown on ridiculous performance - you'll just never see it until someone decides to throw more money at developing a game to take advantage of it. It's why PC gamers love this "consoles seeking parity with PC" sh!t; it means they may finally get a game that actually stresses ONE of their GPUs properly. Up until now, most ports have been poorly optimized on PC and have not taken advantage of the full breadth of performance PC hardware offers. But you do see 45%+ performance gains in games when they actually, somehow, properly take advantage of SLI. And honestly, it would be upwards of a 90% performance improvement over single-GPU solutions if the industry decided to standardize and focus on optimizations for PC SLI solutions.
Why spend time and money optimizing PC games, when PC owners are doing it for them for free by constantly upgrading their hardware? ;-)
 
Last edited:

Onocromin

Banned
No no, you have it wrong - lazy console developers put lazy ports on PC, and the modder community optimizes and improves the game from there on out. It is unfortunate that no amount of optimization will ever really show the true disparity in graphics that a game built from scratch for a state-of-the-art, high-end PC would. So PC gamers are stuck with games that are hardly optimized, and the only chance they really have of seeing what a super high-end PC is capable of is when a first-party port from one of these new consoles lands - which even then will only almost take full advantage of one GPU, while ignoring the disparity in system memory and higher core counts (64 cores if you're an AMD fan).

Imagine a game tailored to bring a 64-core CPU and 4, or in some cases 6, SLI'd GPUs down to their knees just due to sheer graphical bliss. You can't? Probably because the devs can't either.

And nobody bother bringing up that shitty Star Wars concept either; the only things it had going for it were reflections and shadows. The poly counts etc. were nothing extraordinary. But maybe one day a game will actually demonstrate (oh yeah!) absurd polycounts, micro-geometry and micro-granular finite detail. We're approaching that now with this next gen of consoles, but I doubt a mound of polygons meant to resemble granular dirt will be anything more than fancy shaders and textures atop topologized polys attempting to look like a granular mound of dirt.
 
Last edited:

killatopak

Gold Member
No no, you have it wrong - lazy console developers put lazy ports on PC, and the modder community optimizes and improves the game from there on out. It is unfortunate that no amount of optimization will ever really show the true disparity in graphics that a game built from scratch for a state-of-the-art, high-end PC would. So PC gamers are stuck with games that are hardly optimized, and the only chance they really have of seeing what a super high-end PC is capable of is when a first-party port from one of these new consoles lands - which even then will only almost take full advantage of one GPU, while ignoring the disparity in system memory and higher core counts (64 cores if you're an AMD fan).

Imagine a game tailored to bring a 64-core CPU and 4, or in some cases 6, SLI'd GPUs down to their knees just due to sheer graphical bliss. You can't? Probably because the devs can't either.

And nobody bother bringing up that shitty Star Wars concept either; the only things it had going for it were reflections and shadows. The poly counts etc. were nothing extraordinary. But maybe one day a game will actually demonstrate (oh yeah!) absurd polycounts, micro-geometry and micro-granular finite detail. We're approaching that now with this next gen of consoles, but I doubt a mound of polygons meant to resemble granular dirt will be anything more than fancy shaders and textures atop topologized polys attempting to look like a granular mound of dirt.
It already happened back in 2007 with Crysis.

The result was that the massive majority couldn't even experience the "best" it had on offer. Like, super high end PCs TODAY struggle to run it at 60fps at max settings.

You really won't see any more of those today. The closest would be Star Citizen, which should be a lot more conservative and optimized to handle stuff.
 
A TV is a standalone device, as opposed to PC displays, which without a PC are completely useless/worthless. No one has to buy a TV because everyone already has at least one in their house.
Couldn't that be said about a computer as well? You don't need a monitor and can use the TV instead. Of course, to enjoy each to the full extent, a high-refresh-rate monitor or an HDMI 2.1 TV is preferred. One is found anywhere, and one is rarer and pretty pricey.
 
Last edited:

Onocromin

Banned
Even that was poorly optimized, as it only took one generation to come to console and in fact look better on less powerful hardware. Also, PCs were generally barely scratching the multi-core CPU itch of the consumer. And as a computer scientist since 2009... ever since the roadmap shrink happened in 2014 (and I plan to author a thread on this topic on gaf/gaming), micro-processing shrinks have meant that high-end PC performance no longer simply climbs 23 times per year as it did back then. Once the CPU makers get their shit in gear, these new manufacturing shrinks will give us CPUs that aren't 23 times faster per year, but 123 times faster per year. And next year, with the next roadmap CPU shrink, 334 times faster every year. That is 15 years of performance gain per year at 123 times improvement on the year, and well over 32 years of improvement/performance per year when the next shrink enables CPUs to improve at a rate of 334x per year.

So the disparity you've mentioned, while an obvious one, would be nowhere near the same if a game by actually talented devs were built to take advantage of high-end PCs today.

That makes the comparison irrelevant, considering PCs now use multi-GPU solutions and are due to start improving at a rate of 30+ years per year with each next-gen CPU release, relative to old standards.

But think about the other point I've made here: when Intel/AMD FINAAAALLLly DEEECIDE to get in gear and start releasing CPU products using these new microprocessing shrinks, we are going to see CPUs that harness upwards of 35+ to 60+ years of performance improvement for each year they release new, improved CPUs.

This is the era that I, as a computer scientist, have been waiting for. I sort of ignored the previous jump that happened with CPU improvements, because 35x per year just isn't nearly as interesting as 123x to 334x per year, depending on the shrink roadmap.


But now it's obvious why console manufacturers are releasing mid-gen and next-gen machines - these jumps in the PC sector will be impossible to keep up with otherwise, and if they don't try, PCs and consoles will never see gaming parity... and at least there's a chance at it if you release mid-gen upgrades. Otherwise PC gamers are just kinda screwed until devs decide en masse to ignore consoles in favor of PC first-party offerings. And with these next-gen CPUs on the radar at 7-6-4 and 3 nanometers, we're kinda screwed anyway and will have to wait for AI to start making games that take advantage of all this outrageous performance.
 
Last edited:

Onocromin

Banned
Within one minute of my posting, you've read my reply, quoted it, and tried to actually reply?

You're as Irrelevant as Crysis.
 

BluRayHiDef

Banned
It already happened back in 2007 with Crysis.

The result was that the massive majority couldn't even experience the "best" it had on offer. Like, super high end PCs TODAY struggle to run it at 60fps at max settings.

You really won't see any more of those today. The closest would be Star Citizen, which should be a lot more conservative and optimized to handle stuff.
My PC can run Crysis in 4K at over 80 frames per second with maximum settings and 2x MSAA.

i7-5820k
Asus X99
4 x 4GB DDR4
GTX 1080 Ti
Samsung EVO 500GB Sata SSD
3 Western Digital HDDs (1TB, 4TB, 5TB).
 

Onocromin

Banned
Crysis 3 should in fact be the real go-to measurement, as it was ported with care and was great on both PC and console. It also holds up better today.

But to say Crysis would somehow stack up to a game tailored to run on PC today, by experienced, actually good devs who know how to optimize... is useless.

Crysis showed what an open-world game could look like before the consoles of that era could even manage it.

But what's the next step going to look like, now that we can stack 10 open-world games on their head in one game?

Or are we just stuck with more shitty, unimaginative open-world offerings and a Crysis remake?
 
Last edited:
You'd be in for a surprise.

This is not an Amiga anymore where you are coding on the bare metal, or the PS3 where you needed to have specific SPU code running to get results.

These consoles are just PCs, and those companies are using standard APIs these days.
It's even the stated design goal Cerny gave out since the inception of the PS4.

You just want to believe there is some magic, but there is none (anymore).

How are the consoles getting these results then, if they are not writing to the metal? These are Jaguar cores we are talking about.

If these consoles were just PCs, how come a PC with the same Jaguar cores can't do anything remotely close?
 

killatopak

Gold Member
Yeah, but you said that even "super high end PCs TODAY struggle to run it at 60fps at max settings."
Yeah, I consider that super high end. A mid-grade one would be on the level of the Pro/X. A lower-end one would be the base PS4/XBO.

Just considering the breakdown of GPUs used on Steam, the 1060 has the highest share, and it's on the level of the X.

Besides, you only used 2x MSAA. Change it to 8x.
 
Last edited:

Onocromin

Banned
None of those are super high-end PCs. A super high-end PC is state of the art, the latest of the latest, best of the best, with a large price tag... otherwise you're just sporting a capable, decent PC with a 5-year-old GPU LOL. If I were to go in and ask to have a high-end, state-of-the-art computer built, you can bet I wouldn't consider anything less than an SLI rig with 16+ cores and 512 gigs of onboard RAM high end... or particularly state of the art. A good PC? Sure. High end? As in state of the art? As the terminology implies? Are you on drugs? I will say that computers that were once high end and are sporting parts 2-5 years old now, while certainly not high end, are far more capable than the 2-5-year-old "high end" PCs of the previous generation. But still not actually high end. People don't want others actually building REAL high-end PCs; that's ABSURD

LOL
 

S0ULZB0URNE

Member
How are the consoles getting these results then, if they are not writing to the metal? These are Jaguar cores we are talking about.

If these consoles were just PCs, how come a PC with the same Jaguar cores can't do anything remotely close?
Bingo!

'Cause they use custom parts, so they can't be compared.
 

S0ULZB0URNE

Member
Crysis 3 should in fact be the real go-to measurement, as it was ported with care and was great on both PC and console. It also holds up better today.

But to say Crysis would somehow stack up to a game tailored to run on PC today, by experienced, actually good devs who know how to optimize... is useless.

Crysis showed what an open-world game could look like before the consoles of that era could even manage it.

But what's the next step going to look like, now that we can stack 10 open-world games on their head in one game?

Or are we just stuck with more shitty, unimaginative open-world offerings and a Crysis remake?
PCs couldn't run Crysis at a respectable resolution and fps when it launched either.
 
Last edited:

BluRayHiDef

Banned
Yeah, I consider that super high end. A mid-grade one would be on the level of the Pro/X. A lower-end one would be the base PS4/XBO.

Just considering the breakdown of GPUs used on Steam, the 1060 has the highest share, and it's on the level of the X.

Besides, you only used 2x MSAA. Change it to 8x.

You're not making any sense.

8x MSAA is unnecessary at 4K, because the pixel density is so high that aliasing is minimal. Why would I intentionally cripple performance for no appreciable increase in image quality, just to prove a ridiculous point? The fact of the matter is that if my four-year-old PC can run the game at 4K at 60 frames per second with max settings and only the necessary level of anti-aliasing, then a PC with TODAY's greatest specs could definitely do so and could perhaps run the game with even 8x AA (even though that's unnecessary). The original Crysis is no longer a challenge for respectable PC hardware; Crysis 3, on the other hand, is a challenge.
 
Last edited:

BluRayHiDef

Banned
None of those are super high-end PCs. A super high-end PC is state of the art, the latest of the latest, best of the best, with a large price tag... otherwise you're just sporting a capable, decent PC with a 5-year-old GPU LOL. If I were to go in and ask to have a high-end, state-of-the-art computer built, you can bet I wouldn't consider anything less than an SLI rig with 16+ cores and 512 gigs of onboard RAM high end... or particularly state of the art. A good PC? Sure. High end? As in state of the art? As the terminology implies? Are you on drugs? I will say that computers that were once high end and are sporting parts 2-5 years old now, while certainly not high end, are far more capable than the 2-5-year-old "high end" PCs of the previous generation. But still not actually high end. People don't want others actually building REAL high-end PCs; that's ABSURD

LOL
If you're referring to the GTX 1080 Ti, then you're wrong about it being five years old; it was released in 2017, so it's only three years old. I upgraded to it from a 980Ti when it was released a year after I initially built my current PC.
 

killatopak

Gold Member
You're not making any sense.

8x MSAA is unnecessary at 4K, because the pixel density is so high that aliasing is minimal. Why would I intentionally cripple performance for no appreciable increase in image quality, just to prove a ridiculous point? The fact of the matter is that if my four-year-old PC can run the game at 4K at 60 frames per second with max settings and only the necessary level of anti-aliasing, then a PC with TODAY's greatest specs could definitely do so and could perhaps run the game with even 8x AA (even though that's unnecessary). The original Crysis is no longer a challenge for respectable PC hardware; Crysis 3, on the other hand, is a challenge.
Did I stutter when I said max settings?

It's not even about image quality anymore; it's about peak performance with current tech on a game made in 2007. The point isn't how much fps or resolution your GPU can push; it's the target specs of the devs. Now imagine using max settings at 720p/1080p with tech from 2007. It is absurd, because a vast majority of your market couldn't respectably run it at that at the time.

Crytek went all in on single-core use, which is why, when the market shifted to multi-core tech, it took a long-ass while for Crysis to be able to run as well as it does right now. That further added to the game's complications.

At the end of the day, this is all a business. The reason you don't target the high end is money, plain and simple. You make the best possible game you can on the largest amount of hardware. It's why games like CS:GO or League of Legends can still run on potato PCs despite their game engines and graphics engines being overhauled throughout the years.

Targeting the high end is simply a gamble, and the only successful game in recent memory that managed to gather an audience is Star Citizen - and it had to be crowdfunded and interest gauged before they got outside funding from other publishers.
 

Onocromin

Banned
PCs couldn't run Crysis at a respectable resolution and fps when it launched either.
But consoles with worse specs than the ultra state-of-the-art PCs available at launch could, which underscores how badly Crysis was coded at PC launch as opposed to its console counterparts. Particularly Crysis 3, as it launched with improved performance and better visuals across both platforms.
 
Last edited:

Onocromin

Banned
Did I stutter when I said max settings?

It's not even about image quality anymore; it's about peak performance with current tech on a game made in 2007. The point isn't how much fps or resolution your GPU can push; it's the target specs of the devs. Now imagine using max settings at 720p/1080p with tech from 2007. It is absurd, because a vast majority of your market couldn't respectably run it at that at the time.

Crytek went all in on single-core use, which is why, when the market shifted to multi-core tech, it took a long-ass while for Crysis to be able to run as well as it does right now. That further added to the game's complications.

At the end of the day, this is all a business. The reason you don't target the high end is money, plain and simple. You make the best possible game you can on the largest amount of hardware. It's why games like CS:GO or League of Legends can still run on potato PCs despite their game engines and graphics engines being overhauled throughout the years.

Targeting the high end is simply a gamble, and the only successful game in recent memory that managed to gather an audience is Star Citizen - and it had to be crowdfunded and interest gauged before they got outside funding from other publishers.
Right, public funding still was not enough, and they were still not targeting state-of-the-art PCs with the most cutting-edge, advanced solutions. Which is why it's a good thing consoles are trying to keep pace.

However, Half-Life 2 did target the high-end, cutting-edge sector, at least on the GPU side, and it paid off.
 

Onocromin

Banned
LOL, it's literally only one generation behind the current generation of Nvidia graphics cards, and I'll be upgrading it when Nvidia and AMD release their next series of cards.
And that's ancient, as it was itself succeeded by higher-clocked variants etc., particularly if you consider that Nvidia has released 3 succeeding generations of its current series, all with higher clocks via partner manufacturers.

That's also excluding the latest 20X-derived Titan RTX variant, not to mention the Titan V, Titan GTX etc. successors released alongside their respective standard cards.

People say these higher-clocked, larger, meaner cards don't offer much more than their predecessors, but I wager that if they had one in hand to test in person, they would react far differently. Particularly for SLI rigs, and more so with simulation/rendering.
 

Onocromin

Banned
Oh, and I'm politely saying yes, they are only 1 generation off from the standard Nvidia numbered derivatives (2 if I kindly decide to exclude the Titan variants), but in some cases 3 and 4 generations behind what is actually on the market currently. A souped-up 2080 Ti with 200MHz higher clock speeds can be a world of difference in an SLI rig, adding a 600-800MHz gap between those respective iterations and standard vanilla 2080 Ti variants. And I've also excluded the Founders Editions, which are arguably another generation ahead of the vanilla 2080 Ti, as they come overclocked but also offer more room to unlock/overclock than standard iterations.
 

S0ULZB0URNE

Member
But consoles with worse specs than the ultra state-of-the-art PCs available at launch could, which underscores how badly Crysis was coded at PC launch as opposed to its console counterparts. Particularly Crysis 3, as it launched with improved performance and better visuals across both platforms.
Crysis 3 is a totally different story and doesn't change what I said.
Crysis launched to incapable PCs.
 

S0ULZB0URNE

Member
If you're referring to the GTX 1080 Ti, then you're wrong about it being five years old; it was released in 2017, so it's only three years old. I upgraded to it from a 980Ti when it was released a year after I initially built my current PC.
I still have my 1080 Ti; upgrading from it to a 2080 Ti would have been LOL-worthy, so I didn't.
It's still a beast for most of the games I play.
A 3080 Ti or greater will be a suitable upgrade.
 

killatopak

Gold Member
Right, public funding still was not enough, and they were still not targeting state-of-the-art PCs with the most cutting-edge, advanced solutions. Which is why it's a good thing consoles are trying to keep pace.

However, Half-Life 2 did target the high-end, cutting-edge sector, at least on the GPU side, and it paid off.
I don't recall if Half-Life 2 did that, but if it did, I think part of their intention was showing off the new Source engine. You can actually say the same for Crysis and CryEngine. It worked wonders for both of those titles, and a lot of developers bought licenses for their engines.

It's actually pretty smart of them, considering the circumstances.
 

Neo_game

Member
The specs of the consoles are actually pretty impressive, and I think most if not all devs are pleased with them. For the first couple of years most games are going to be cross-gen anyway, so even the consoles are not going to use their full potential. I think the most visually impressive games will come some 3-4 years after the launch of the consoles. Probably by then we may not see a big improvement.

I do not see any point in comparing consoles with PC. Devs never really use the PC to its potential and probably cannot, since only a very small percentage of people buy $500+ GPUs. So depending on their specs, people simply need to adjust their graphics settings and resolution to get the fps they like. There are just too many variables on PC.
 
Last edited:

Onocromin

Banned
Doesn't change that the X360 was able to outperform PCs and run Crysis, as Crysis, while made as you imply, was also poorly, poorly optimized upon its PC release; PCs with better hardware than the 360 were still struggling to run Crysis after Crytek finally decided to tailor-fit a version to work on the 360. And if you are implying Crysis was not poorly optimized, you are wrong. Forums were awash with enthusiasts and modders complaining and hoping Crytek would release a patch to rectify some of these issues for 4 years after its release.

Meanwhile, it released on console with most of the glaring, terrible optimization issues fixed and at a lower resolution... meanwhile the community found that custom ini tweaks and other fixes could be applied to the PC variant and alleviate issues in droves. Underscoring my point, which you completely ignored: Crysis was both coded to run on cutting-edge PCs and poorly optimized, making it a horrible example unless you are arguing that poorly optimized titles that also happen to be cutting edge are specifically the only selling point of high-end PCs.

Same case with Far Cry, except the PC community did not notice or care how it performed on 360; it was already far too old, as by the time it was in fact released on 360, over 7 GPU cycles had passed.

And yes, Half-Life 2 was touted for being a shader-intensive, cutting-edge spec that was also highly optimized and was doing things no game at that time had, particularly with physics and of course the advanced math shaders, which had not been seen before. GabeN took to the stage heralding the latest ATI variant, and then 2 years later Nvidia swooped in with an amazing GPU solution that offered 95% more performance over the previous gen. GabeN took to the stage again and stressed that it was still just barely enough to run Half-Life 2 at the highest settings, and also announced they were in fact maintaining a dev cycle that would see HL2 eventually updated with even more advanced shaders - and said the hardware would in fact not be able to run that variant. All before its release.

I mentioned this in the other thread and will say it again: the Series X is a 4,193-bit graphics console - for those unaware, this is the correct graphical bit rate when you apply the math. For those interested yet unaware.
 
Last edited:

martino

Member
Doesn't change that the X360 was able to outperform PCs and run Crysis, as Crysis,

[Screenshot comparison: Crysis on Xbox 360 vs. PC]
 

Onocromin

Banned
Oh yeah, I forgot the most glaring issue PC gamers had with Crysis on release: Crytek as a development team said they were unable to optimize Crysis to their own internal standards due to money issues, but they looked at upcoming hardware cycles and knew that eventually these issues would be alleviated through brute force! LITERALLY

And then later the PC community came out swinging with a catchphrase about the Xbox version's release: "Uglier but Optimized, CryEngine."


Crytek dev admits: Crysis was an unoptimized mess


So an unoptimized game was released that also just happened to be cutting edge. What a great marriage of factors for PC gamers!

"We're making our game very difficult to run on current high end hardware, oh yeah - it's also unoptimized sot it would of ran poorly anyways and yeah we probably could of got it to run better but we need money and ironed out the gameplay glitches enjoy!" lol

Crysis still had horrendous microstuttering on hardware that could in fact run it (the latest cutting-edge hardware), which would make the game just pause for 2 seconds while it tried to reset its orientation - on hardware that could otherwise run it above 80fps.

Star Citizen, while being cutting edge at the beginning of its dev cycle and doing things at the higher end, is a game that - while yes, still cutting edge in some areas - has been in dev hell since 2011. Most PC enthusiasts know it will not release and take complete advantage of current cutting-edge hardware when they finally do ship the product, almost a decade (probably over a decade) later.


Star Citizen
is an upcoming multiplayer space trading and combat simulator developed and published by Cloud Imperium Games for Microsoft Windows. A spiritual successor to 2003's Freelancer, Star Citizen is being led by director Chris Roberts and has become highly controversial during its production. Development of the game began in 2011 and was announced in 2013 through a successful Kickstarter campaign which drew in over US$2 million. While its launch was originally anticipated for 2014, significant expansion of gameplay features and scope have led to repeated delays. In 2013, Cloud Imperium Games began releasing parts of the game, known as "modules", to provide players with the opportunity to experience gameplay features prior to release. The latest of these modules, known as the "Persistent Universe", was released in 2015 and continues to receive updates.
 
Last edited:

Onocromin

Banned
Also, let's not forget: just because one "highly unoptimized mess," as its own developer put it, was released on PC with great speed (development speed - they didn't optimize) and also happened to be cutting edge, does not mean PC gamers do not in fact deserve more cutting-edge, PC-specific releases, released in a time frame that fully exploits and utilizes current hardware on release.

That is once - once in the entire history of PCs - that it has actually happened: a game tailored (and sadly with trash optimization) to run on the best hardware available at launch.

Even Doom ran on my high-end 6800 GTS at over 130 FPS at the highest settings on release.

So to this date, the PC community has gotten one game specifically tailored to its highest-tier hardware.

Meanwhile, Star Citizen is over here like, "Hey, I'll still release eventually - maybe in the next 5 years - and take full advantage of some hardware (not all; they still have not mentioned an RTX update, for instance). Sure, not everything I'm doing is cutting edge, but geesh, guys." LOL.

Star Citizen developer calls ray tracing "a massive headache"



How very sad for gamers everywhere.


Maybe one day, we will see....


GASP

a SECOND GAME IN THE ENTIRE HISTORY OF THE PC SINCE 1943 THAT TAKES FULL ADVANTAGE OF THE LATEST CUTTING EDGE PC HARDWARE.
 
Last edited:

kamal9deuce

Neo Member
Speccing out and maintaining a PC with no problems is a burden that isn't there with a traditional console. PCs are not plug-and-play for most people. They're two different worlds, which defeats the purpose of comparing them.
 

lukilladog

Member
Outclassed, yeah. So what? Nvidia and AMD will still be playing the milking game, selling hardware that for some reason will perform worse with every new wave of ports... while consoles will go in the opposite direction, where every new wave of games will look and run better than the previous one... on the same fricking hardware!
 
Last edited:

Onocromin

Banned
Speccing out a high-end enthusiast PC will always be worth it, if you are a hardware enthusiast.

There are sooo many reasons to do this too.

Particularly if you are interested in advanced collision simulations.

And advanced AI simulation.

Before, on low-end, last-gen hardware, a lot of this was a burden to do.

Not so now on a state-of-the-art PC.

In fact, there is a whole movement that, because of these improvements I am listing here, will eventually begin.

I can see it now - gamers shifting towards programming their own collision simulations in realtime, for reasons known only to them.

This is the era that will finally allow computer scientists to begin simulating vast amounts of data (with examples listed below).

As an example, these are physics/collision simulations that are only possible to create in a timely manner on the high-end PCs of today.

And for those who believe a high-end PC is not worth it, this offers a look into what games will be lacking until the next gen or its possible successors.

Realtime rendering of advanced simulations.

Point being, these are all rendered on the fly and you can see the simulation take place without the need to bake it in, as next-gen Series X devs will be forced to do for a similar effect.

This increase in raw compute marks an epoch gamers haven't considered but should, as it finally allows everyone to create wild simulations extremely accurately with relatively little compromise.

Realtime rendering is a huge boon overlooked by the gaming community, as most believe it means rendering one frame for fidelity purposes, when it in fact means rendering an entire simulation of data in realtime. This could not be accomplished without a considerable amount of time spent rendering frames and baking before this hardware was available. And it is all the reason anyone may need to look into building their own state-of-the-art, high-end PC.

If you can't see why having this capability would be useful for the current line of PC enthusiasts, you do not have the capacity to use a computer and think at this level anyway.

Oh, and eventually games will take advantage of the extra power too. But I don't see consoles being able to render advanced realtime simulations for at least another generation.
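To make the "realtime simulation" point concrete, here is a minimal, purely illustrative sketch of the kind of brute-force collision step being described - a naive O(n²) sphere test where the particle count, radius, and time step are all assumed values, not anything from a real engine:

```python
import random

# Minimal, illustrative sketch of a brute-force realtime collision step.
# Every particle is tested against every other one each frame (O(n^2)),
# which is exactly the kind of workload that scales with raw CPU compute
# instead of being baked offline. All constants are assumed for illustration.
N, RADIUS, DT = 2000, 0.01, 1.0 / 60.0

particles = [
    {"pos": [random.random() for _ in range(3)],
     "vel": [random.uniform(-1.0, 1.0) for _ in range(3)]}
    for _ in range(N)
]

def step(particles):
    # Integrate positions, then run naive pairwise sphere-sphere tests.
    for p in particles:
        for axis in range(3):
            p["pos"][axis] += p["vel"][axis] * DT
    contacts = 0
    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            a, b = particles[i]["pos"], particles[j]["pos"]
            dist_sq = sum((a[k] - b[k]) ** 2 for k in range(3))
            if dist_sq < (2 * RADIUS) ** 2:
                contacts += 1  # a real simulation would resolve the contact here
    return contacts

print("contacts this frame:", step(particles))
```

A real engine would use spatial hashing or a BVH instead of the quadratic loop, but the point stands: the more raw compute you have, the more of this you can do per frame without pre-baking anything.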





 

v1oz

Member
Common sense? Logic? The amount of shaders and cores, the higher bandwidth, etc.? Why would someone think they would get better performance than a GPU which costs double that of a complete console - case, motherboard, storage, CPU/GPU, controller, etc.? I'm pretty sure anyone could guess next-gen consoles can't touch high-end or enthusiast PCs. I figured that would be completely obvious; otherwise I would go out and buy a console instead if I could get better performance than my PC.
That's lacking common sense. You can't directly compare Nvidia shader cores to AMD shader cores like for like. By the time the Xbox Series X ships, the 2080 Ti will be over two years old. Technology has moved on and efficiency gains have been made.

In any case, Nvidia rates the 2080 Ti's ray tracing performance at 10 gigarays/sec (though in reality it's more likely 13.6 gigarays/sec). For the Xbox Series X, 1825MHz * 208 RT cores gets you the 380 billion intersections per second that Microsoft has quoted in their specs; divide that by the 10-deep BVH and you get ~38 (37.96 to be exact) gigarays/sec, or ~4x the performance of the 2080 Ti.

In a nutshell, AMD have basically feature-matched all of Turing's features - VRS, mesh shading, HDR, RT, etc. - in a relatively smaller power envelope. Also, the new chip is fabricated on N7P and will in addition have all the efficiency gains that come purely from a new architecture.

And let's not forget the CPU in the Xbox will be more powerful and have more cores than a current-generation Intel i7. So it will be more powerful than what the majority of gamers have in their PCs.
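For anyone who wants to check that arithmetic, here it is reproduced as a quick sanity check; the 10 gigarays/sec figure for the 2080 Ti and the 10-deep BVH divisor are the post's own assumptions, not official benchmarks:

```python
# Reproducing the post's ray tracing arithmetic as a quick sanity check.
# The 2080 Ti's 10 gigarays/sec figure and the 10-deep BVH divisor are the
# post's own assumptions, not official benchmarks.
clock_hz = 1825e6            # Series X GPU clock (1825 MHz)
rt_units = 208               # "208 RT cores" as stated in the post
bvh_depth = 10               # assumed BVH traversal depth

intersections_per_sec = clock_hz * rt_units          # ~3.8e11 (380 billion)
gigarays_per_sec = intersections_per_sec / bvh_depth / 1e9

print(f"{intersections_per_sec / 1e9:.1f} billion intersections per second")
print(f"~{gigarays_per_sec:.2f} gigarays/sec, vs 10 for the 2080 Ti "
      f"(~{gigarays_per_sec / 10:.1f}x)")
```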
 
Last edited:
That's lacking common sense. You can't directly compare Nvidia shader cores to AMD shader cores like for like. By the time the Xbox Series X ships, the 2080 Ti will be over two years old. Technology has moved on and efficiency gains have been made.

In any case, Nvidia rates the 2080 Ti's ray tracing performance at 10 gigarays/sec (though in reality it's more likely 13.6 gigarays/sec). For the Xbox Series X, 1825MHz * 208 RT cores gets you the 380 billion intersections per second that Microsoft has quoted in their specs; divide that by the 10-deep BVH and you get ~38 (37.96 to be exact) gigarays/sec, or ~4x the performance of the 2080 Ti.

In a nutshell, AMD have basically feature-matched all of Turing's features - VRS, mesh shading, HDR, RT, etc. - in a relatively smaller power envelope. Also, the new chip is fabricated on N7P and will in addition have all the efficiency gains that come purely from a new architecture.
Lacking common sense is believing you'll be getting a console that outperforms a 2080 Ti in any aspect besides power consumption, especially in ray tracing, as everything we've seen thus far points towards a good bit weaker GPU when it comes to exactly that. People can't be THAT dense, can they...
 

v1oz

Member
Lacking common sense is believing you'll be getting a console that outperforms a 2080 Ti in any aspect besides power consumption, especially in ray tracing, as everything we've seen thus far points towards a good bit weaker GPU when it comes to exactly that. People can't be THAT dense, can they...
Let's see the hard evidence to justify your assertions.
 
I used to game on PC during the PS360 era due to the huge difference in textures, resolution and AA. I had a GTX 260, then upgraded to a 470, and then a 970 earlier this gen. This gen the PS4 started out strong, with high-quality graphics that were not that far from PC. I picked up a PS4, and when the PS4 Pro came later I picked that up too, and I've never gone back to PC since. Consoles this gen and most likely next gen are an excellent deal (cheap and high quality).
 
Let's see the hard evidence to justify your assertions.
I've already compared them throughout this thread, and so have others. AMD can't compete with the 2080, 2080S, or 2080 Ti. Their high-end card, the 5700 XT, uses up a shit ton of power and still falls short of those cards. How on earth can AMD suddenly beat enthusiast-level GPUs with a console :messenger_tears_of_joy:, yet can't even come near Nvidia on the PC platform?

Nvidia came out with ray tracing and DLSS in their mainstream GPUs 2 years ago; AMD came out with shit drivers, an inefficient card, and plenty of issues. Compare the hardware of the 2080 Ti vs the XSX GPU, and it's beyond obvious which is the better performer. Microsoft used their in-house games to demonstrate what over 7 years of improvements in hardware design would get us - and they demoed fucking current-gen games, with very unimpressive results, wtf?! The AMD/MS ray tracing demo was underwhelming compared to the worst ray tracing demo from Nvidia, and you could tell the performance wasn't great at all.

I applaud your positivity, but it's wishful thinking at best. I would love to see evidence to support your claim, as there are too many things that disprove even the possibility of a subsidized plastic box beating a $1,300 dedicated GPU that isn't limited by cooling or power constraints. Feel free to change my mind.
 

Daniel Thomas MacInnes

GAF's Resident Saturn Omnibus
Oh, for the love of...these machines aren’t even out yet, and gamers are already complaining about their being “obsolete” and “outclassed.”

Does anybody actually play videogames anymore? Or is this some sort of Coronavirus quarantine thing, and we’re all going crazy ala The Shining?
 