
The PlayStation 5 and the Xbox Series X Will Both Be Partially Outclassed by the Time That They're Released And Fully Outclassed One Year Later

pawel86ck

Banned
Not true at all, input lag is a real thing.
Input lag is not only related to framerate. I have played hundreds of games on consoles and not every game is equally responsive at 30fps. In the PS1-PS3 and Xbox-X360 era many 30fps games were very responsive on a gamepad, and back then I didn't even care if a game ran at 30fps or 60fps. Now, however (since the Xbox One and PS4 launch), 30fps games are laggy as hell.

I have compared GTA 5 on PS3/X360 and then on PS4 specifically. It should be the same engine with similar gamepad settings, yet the X360/PS3 version is way more responsive than the Xbox One/PS4 version despite sometimes running at 21fps. I can no longer aim without the auto-aim feature enabled on current-gen consoles because it feels like I'm aiming underwater or something. The same goes for Tomb Raider and Rise of the Tomb Raider on X360 and Xbox One (in fact, even Digital Foundry has mentioned the X360 version was way more responsive).

It looks like input lag can increase when a game uses more demanding graphics effects and more CPU cores/threads, and maybe that's indeed the case, because certain developers like Turn 10 even mentioned they reduced the number of working CPU cores in Forza 7 because more cores increased input lag. That's why Forza 7 used a limited number of cores at launch, though the game was patched later on because people wanted more performance. In fact, I sometimes test CPU core affinity myself, and many games feel more responsive with only one CPU core enabled (see the sketch below).
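If anyone wants to repeat that affinity experiment, here's a minimal sketch using psutil (my choice of library, not something from this thread); the process ID and core list are hypothetical examples:

```python
# Minimal sketch of the core-affinity experiment described above, using
# psutil (works on Windows and Linux). Needs admin/root for other processes.
import psutil

def pin_to_cores(pid: int, cores: list[int]) -> None:
    """Restrict the given process to the listed CPU cores."""
    proc = psutil.Process(pid)
    print(f"Before: {proc.cpu_affinity()}")  # e.g. [0, 1, 2, 3, 4, 5, 6, 7]
    proc.cpu_affinity(cores)                 # [0] pins the game to one core
    print(f"After:  {proc.cpu_affinity()}")

# Hypothetical usage: find the game's PID in Task Manager / ps, then
# pin_to_cores(game_pid, [0]) and compare how aiming feels.
```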

I have also tested many PC games. Probably the laggiest game I have ever played on PC was Doom 2016; that game absolutely needs 120fps because the controls are so laggy even at 60fps. I think Crysis 1 and 3 are also good examples. Crysis 1 is clearly more responsive at the same 60fps, and in Crysis 3 I had to run at 100+ fps just to aim without that laggy feeling. Crysis 3 has a much improved graphics engine with advanced multicore support, and I believe that's why it feels laggy.

So I'd say 30fps on a gamepad in an ideal scenario (no additional input lag from your TV or the graphics engine) is perfectly fine. Unfortunately most new games have laggy graphics engines, and IMO 60fps is a must these days.
 
It's always the same story: consoles are "outclassed" almost immediately and yet -- outside of an ever-shrinking number of AAA exclusives -- video games are still designed and built with console constraints in mind. At best, PC gets the consolation prize of higher LoD, better framerates, higher resolution, etc. for the exact same console-limited narrow-corridor shooter and copy-pasted open world.


Consoles aren't what holds PC back. PCs do.

The ones close to minimum requirements, that is. While a PC is powerful if you have the hardware, games are designed to run on low settings on GTX 1050-level hardware.

And before you jump in saying the base PS4 is even less powerful: it runs those games on medium-high settings. So the lowest-end hardware is ACTUALLY PC-only.
 

Kenpachii

Member
pawel86ck said:
Input lag is not only related to framerate. [...] Unfortunately most new games have laggy graphics engines, and IMO 60fps is a must these days.

What you're encountering is what's called ms (frame times). It's basically the reaction time from your controller input to the game actually doing something.

This is the most important factor in a game after framerate, and it's why lots and lots of people sit at 1080p on PC and push high FPS even if their screen isn't high-Hz. Upping the resolution lowers the framerate, which adds a ton more ms.

The general rule for PC was that 16ms or lower is great, anything above that is janky, and at 25ms+ it starts to get really sluggish.

These days, however, I would say 2-8ms is the sweet spot, and everything above that feels sluggish.

A typical console will push around 30ms or higher, or swing between heavily fluctuating frame times when it goes lower, which feels absolutely janky.
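To put rough numbers on those thresholds (the cutoffs are this post's subjective ones, not a standard), a minimal sketch:

```python
# Convert framerate to average frame time and label it using the
# subjective thresholds from the post above.
def frame_time_ms(fps: float) -> float:
    """Average time between frames, in milliseconds."""
    return 1000.0 / fps

def feel(ms: float) -> str:
    if ms <= 8:
        return "sweet spot"       # 2-8 ms
    if ms <= 16:
        return "great"            # 16 ms or lower
    if ms <= 25:
        return "janky"
    return "really sluggish"      # 25+ ms

for fps in (30, 60, 120, 240):
    ms = frame_time_ms(fps)
    print(f"{fps:>3} fps -> {ms:5.1f} ms per frame ({feel(ms)})")

# 30fps is ~33.3 ms per frame, which is why 30fps consoles land in the
# "really sluggish" bucket by these numbers. Note this is frame time only;
# end-to-end input lag is usually several frame times through the pipeline,
# which is why two games at the same fps can feel very different.
```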






Next gen won't change this, because games will still be built around 30fps, or 60fps at best, and those GPUs will target 4K, which adds a truckload more ms to the mix.

The only way to get rid of it is to get a gaming PC and focus on high framerates.
 
Consoles catch up quickly and even surpass the highest-end PCs when it comes to games, at a cheaper price too.

Look at this Digital Foundry video, the Xbox One X looks the same if not better than a maxed-out PC:

What are you smoking, bro?! Have you not seen how many settings are turned off, or set lower than low, on the Xbox One X? Or it not even being able to hold a constant 30fps?! You can't be this disingenuous, can you?
 

T-Cake

Member
I'm seriously wondering whether to continue watching Digital Foundry. I spent almost the entirety of this gen watching their PC vs console comparisons, and they got me to buy a PC so I could play the "best" versions. But at the end of the day, aside from the higher framerates and quicker loading, it turned out to be a poor choice, because the graphics look practically the same for all intents and purposes. Without DF pointing out the differences, one wouldn't really be any the wiser. I'm going back to consoles next gen and that's that. :messenger_sunglasses:
 
T-Cake said:
I'm seriously wondering whether to continue watching Digital Foundry. [...] I'm going back to consoles next gen and that's that.
For some it may not be so obvious, but if you check out the RDR2 comparison and focus on the background or foreground textures, shadows, sharpness, LOD, etc., it becomes obvious which is the better version. That's from stills only, but comparing them in person is a night-and-day difference. 21fps vs 5x that, plus all the extra fidelity, makes it an obvious choice. I would still choose PC if there were no visual differences between the two and framerate alone was the difference. I'll take a high framerate all day, every day. Even people with medium-spec PCs can get better visual quality and double the framerate of consoles in that game.
 

Bankai

Member
T-Cake said:
I'm seriously wondering whether to continue watching Digital Foundry. [...] I'm going back to consoles next gen and that's that.

Ignorance is bliss, I guess? I like knowing the differences; I find the tech, and the research that goes into it, interesting. DF does a great job and I really dig their analyses.

I'm still able to enjoy an "inferior" version of a game and I think it's quite petty to "have" to play the best version in order to be able to enjoy a game.
 
Bankai said:
Ignorance is bliss, I guess? [...] I think it's quite petty to "have" to play the best version in order to be able to enjoy a game.
I agree with you up until the very last part. It's a preference to play the best version. It's the same as living in an apartment vs a house, eating steak vs McDonald's, public transportation vs driving a car. Some people prefer the finer things in life, and that's fine. As long as people enjoy their purchase, that's what matters at the end of the day. As a PC gamer, I won't shit on a console player. Just don't make baseless claims that a console will have better graphics than a PC, and we're good to go.
 

fybyfyby

Member
Bankai said:
Ignorance is bliss, I guess? [...] I think it's quite petty to "have" to play the best version in order to be able to enjoy a game.
Maybe he means that sometimes it doesn't matter if some pixels are different in ways you can't even notice in motion; what differs is the gameplay. I love DF videos too, but they often use a magnifying tool to see the differences (when they aren't just comparing fps). I mean, did you see the GT Sport vs Forza 7 comparison video? One has better reflections, the other better details on the cars... but in the end, when you race, none of it matters. Only gameplay and fun.
 

ZywyPL

Banned
pawel86ck said:
Input lag is not only related to framerate. [...] Unfortunately most new games have laggy graphics engines, and IMO 60fps is a must these days.

Depends on the game, or specifically the game mode. In SP there are tons of distractions from the core gameplay (cut-scenes, scripts, the plot itself, NPCs, and so on), but it all fades away immediately when you jump into MP and you're left with nothing but the gameplay. Needless to say, it's not a coincidence that the games which last months or even years are 60FPS titles, while 30FPS games are empty within just a few weeks. You can have a great SP experience at 30FPS, sure, but you can't have fun gameplay without 60.
 

v1oz

Member
How does the OP know that the RTX 2080 Ti is more powerful than the Series X GPU? Has he actually benchmarked games on both and come to that conclusion?
 

Thirty7ven

Banned
Seems obvious that if you want to ride at the cutting edge of gaming tech you need to go PC, but it gets expensive. The bar is raised every year after all.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Let me start off by saying, it'd be best if you stayed away from false equivalency when comparing VFX to dark10x. Dark10x is a humble dude who has an interest in technology and doesn't pretend to be the beginning and ending of the technical discussions he participates in. We all know why Dark10x was "dog-piled", and it's because he had the unfortunate privilege of trying to walk through the valley of console zealots. Look, you're free to defend VFXVeteran if you like; after all, that's your prerogative. As to his attitude, I needn't discuss it further, as others will do so on my behalf. With regards to his approach, I've had the privilege of working with extremely brilliant people in my line of work, and the one thing that stuck out is a certain level of professional humility, because there are checks and balances on your work. Everyone is very educated on the subject matter at hand, and people are cautious with the statements they make. The reckless, egotistical, and frankly narcissistic approach displayed by VFX is not one I've observed in the brilliant individuals I've worked with. Combine this with the frequent inaccuracies from VFX, and it just looks off. A certain juxtaposition, perhaps. There's a reason he gets his flak wherever he goes, and it's not undeserved.

The bolded is 100% true. VFXVeteran actually could be one of the best posters on GAF if he wasn't so biased for PC and against consoles. He's smart, but his bias leads him to say some really dumb stuff. It clouds his judgement and that's disappointing.

And the 2nd bolded part is again 100% on point. It's weird that he feels the need to tell people what "WILL" happen with next-gen consoles. If we'd let him tell it 8 years ago, he would have said something like God of War wouldn't have been possible (graphics-wise).
 
How does the OP know that the RTX 2080 Ti is more powerful than the Series X GPU? Has he actually benchmarked games on both and come to that conclusion?
Common sense? Logic? The number of shaders and cores, the higher bandwidth, etc.? Why would someone think they'd get better performance than a GPU which alone costs double that of a complete console, with its case, motherboard, storage, CPU/GPU, controller, etc.? I'm pretty sure anyone could guess next-gen consoles can't touch high-end or enthusiast PCs. I figured that would be completely obvious; otherwise I'd go out and buy a console instead if I could get better performance than my PC.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I agree with you up until the very last part. It's a preference to play the best version. [...] Just don't make baseless claims that a console will have better graphics than a PC, and we're good to go.

I love your post, but a console vs. a top-of-the-line PC isn't as big a difference as your comparisons :messenger_grinning_smiling:
I wouldn't compare buying a PS5 or XSX to riding public transportation. I'd say it's closer to this...

PS5/XSX is a Mercedes C300 ($41,000 new)
[image: Mercedes C300]




Top of the Line PC is a Mercedes S560 Maybach ($180,000 new)
[image: Mercedes S560 Maybach]




Both are nice cars, yet one is CLEARLY superior to the other. But you'll pay for it too.
 

JimboJones

Member
For some it may not be so obvious, but if you check out the RDR2 comparison it becomes obvious which is the better version. [...] Even people with medium-spec PCs can get better visual quality and double the framerate of consoles in that game.

But with that patented console-smooth 30fps, it makes the entire experience much better. :messenger_weary::messenger_ok:
 

n0razi

Member
Zen 2 is an order of magnitude better than Jaguar (PS4/XB1)... and the completely ass-level Jaguar in the PS4 was still able to deliver titles like Horizon and RDR2. PC will always be better, but this is still exciting considering how gimped the prior gen was.
 

martino

Member
PS4 and XB1 were outclassed before they released as well.
Nobody cared.
Exclusives will still set visual benchmarks.

Visual benchmarks, and how much graphics matter to them, are subjective and variable (marketing, influencers, and too often on forums the logo on the box counts too ;) ).
The best visuals (overall aesthetic) are not the best "computer graphics". Any third-party game will do better on PC using high+ settings.

Graphics are mostly objective (AA and the use of different kinds of blur can become subjective, but the latter is also an aesthetic choice, a choice you can make on PC; in the end the same game there can please more people because they can choose to disable aesthetic graphics features).
As for the rest:
Better shadow resolution -> no debate
Better shadow draw distance -> no debate
Better draw distance -> no debate
etc.
 
mckmas8808 said:
I love your post, but a console vs. a top-of-the-line PC isn't as big a difference as your comparisons. [...] PS5/XSX is a Mercedes C300 ($41,000 new); a top-of-the-line PC is a Mercedes S560 Maybach ($180,000 new). Both are nice cars, yet one is CLEARLY superior to the other. But you'll pay for it too.


The problem is that consoles are stuck in time for 6-7 long years, but in just a year, two years max, we suddenly have a disparity that looks more like this:

PS5/XSX is exactly the same thing and performs exactly the same way in 2021.
[image: Mercedes C300]



Top of the Line PC on the other hand evolves and runs laps around both consoles in 2021.
[image: Mercedes F1 car]
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Zen 2 is an order of magnitude better than Jaguar (PS4/XB1)... and the completely ass-level Jaguar in the PS4 was still able to deliver titles like Horizon and RDR2. The Best PCs will always be better, but this is still exciting considering how gimped the prior gen was.

Fixed.

The problem is that consoles are stuck in time for 6-7 long years, but in just a year, two years max, we suddenly have a disparity that looks more like this: [...]

Yes, but who's buying new PC parts every 2 years like that?
 
martino said:
Visual benchmarks, and how much graphics matter to them, are subjective and variable. [...] Better shadow resolution -> no debate. Better shadow draw distance -> no debate. Better draw distance -> no debate. etc.


People just care about whether or not a game "looks good".
Obviously that's subjective. But I also think it depends more on budget than tech.

All the things you listed seem irrelevant when one game just has much better animation than the other, or more variety and detail in its assets, or better performance capture.

Resolution, draw distance, AA and many other things are easily measurable, so they're more objective, but I honestly don't think small differences there have a big impact on the overall look of a game. Red Dead at 4K or 1440p isn't going to drastically change my perception of the game's visuals.

One of the reasons why exclusives or games like Red Dead 2 stick out from the crowd is because of their big budgets and different production processes.

And I am convinced it will be like that next-gen as well.
The majority of games dropping jaws will be exclusives.
 
mckmas8808 said:
Fixed. [...] Yes, but who's buying new PC parts every 2 years like that?

You only really need to get a new GPU every two years, as you put it, to be top-of-the-line or very close, and a new PC only every 5-6 years, as CPUs don't evolve nearly as fast; so if you have, say, a 3700X, you're good for the next 6 years.

Of course, you're out of luck and will have to get a whole new PC if you're on Intel, as PCIe Gen 4 isn't even available there, if you want to use a super-fast SSD, that is.
 
The truth is, the extra cost of a PC doesn't translate into extra value. A PC costing 2x that of a console won't give you 2x the performance/graphics. (It may do so theoretically, on paper, in raw numbers, but the actual final release of the game won't show it.)
I usually get double console fps, as a minimum, if not much more. Depending on the game, double the resolution as well. There are many things I can do with a PC, outside of gaming, which are impossible to do on a console. It's a multi-purpose device without the limitations.
 
The truth is, the extra cost of a PC doesn't translate into extra value. [...]
That depends on when you buy the PC. At a new console gen's launch, I would totally agree that a $1000 PC will absolutely not give you double the performance. A GPU alone with double the PS5/XSX speed from AMD will probably be ~$700 later this year, so more than a whole console.

Fast forward to 2022 and we'll have a mid-range GPU with double the performance of the PS5/XSX for just $300 [if running at the same "optimised" settings].
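To make the value argument concrete, a quick sketch; the prices and the 2x-console multiplier are the guesses from this post, not benchmarks:

```python
# Performance per dollar for the two hypothetical GPUs mentioned above,
# both assumed to be ~2x console performance.
gpus = {
    "2020 high-end GPU ($700)": (700, 2.0),
    "2022 mid-range GPU ($300)": (300, 2.0),
}

for name, (price, perf) in gpus.items():
    per_dollar = perf / price
    print(f"{name}: {per_dollar * 100:.2f} console-equivalents per $100")

# Same assumed speed, but the 2022 card delivers ~2.3x the performance
# per dollar (2.0/300 vs 2.0/700).
```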
 

CrustyBritches

Gold Member
DoctaThompson

On a side note, the other day I mentioned that the Steam Survey doesn't necessarily encompass the entire PC gaming landscape. Based on 90-100 million active users per month, I calculated about 9 million with RTX GPUs. I was reading articles about DLSS 2.0, and according to Nvidia, they have sold 15 million RTX-capable GPUs.
DLSS is one of several major graphics innovations, including ray tracing and variable rate shading, introduced in 2018 with the launch of our NVIDIA RTX GPUs.

Since then, about 15 million NVIDIA RTX GPUs have been sold. More than 30 major games have been released or announced with ray tracing or NVIDIA DLSS powered by NVIDIA RTX. And ray tracing has been adopted by all major APIs and game engines.
That's a lot of people with RT, VRS, ML/Deep Learning capable GPUs. Imagine what a game like Cyberpunk could do for those numbers.
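For what it's worth, the arithmetic behind that estimate looks roughly like this (all the inputs are the figures quoted in this post, not official numbers):

```python
# Back-of-the-envelope math for the RTX install-base estimate above.
steam_users_m = (90 + 100) / 2      # midpoint of the 90-100M monthly estimate
rtx_on_steam_m = 9                  # implied by the hardware survey share
rtx_sold_m = 15                     # Nvidia's reported RTX GPUs sold

print(f"Implied RTX share on Steam: {rtx_on_steam_m / steam_users_m:.1%}")
print(f"RTX GPUs not captured by the Steam estimate: ~{rtx_sold_m - rtx_on_steam_m}M")
# ~9.5% of Steam users, with ~6M more cards out there than the estimate implies.
```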
 
Have you not seen how many settings are turned off, or set lower than low, on the Xbox One X?

I'd consider this a positive for consoles. Stuff that makes no visual impact should be turned off instead of wasting watts. I wish I could do this level of optimization for each PC game I play. On console, devs do it for me.

I usually get double console fps, as a minimum, if not much more. [...]

Are you already gaming at 8K? As for being able to do other stuff, yes, I also use my PC for other things, and that justifies the cost. But I don't get top-of-the-line hardware. Always mid-range stuff.

That depends on when you buy the PC. [...] Fast forward to 2022 and we'll have a mid-range GPU with double the performance of the PS5/XSX for just $300.

Two years in is also when consoles start producing their best-looking games. That's when devs have learned how to code properly for the hardware.

But there's no number attached to how optimised a game is, so obviously a lot of people don't call it an advantage and brush it off as art style.
 

S0ULZB0URNE

Member
I don't ignore them. I don't like them. They are a band-aid for hardware that doesn't have enough power or throughput. That doesn't make me not know what they are, which is your claim. If you don't agree, then fine, don't agree. But don't act like I don't know how the reconstruction technique works.



I never, ever talk about superior visuals unless we are talking about power features that bring (in the long run) stellar rendering quality. I leave you guys' subjective opinions alone. But you damn sure should leave mine alone too.



That's not an advancement. It's a workaround. You save on low sums of overhead because you have to, not because you want to. If the consoles had enough power to render true 4K, then they would. I know about sacrifices, and reducing pixels is a BIG factor in maintaining FPS since this hardware is bandwidth-limited. I get that. @psorcerer gets it. Why don't you?



No it's not. My goal is to play CG-quality games. My goal is to see offline rendering become a thing of the past. Call up Epic Games and ask their CTO what their goal is. I'll give you a hint: Kim used to work at ILM, you know, the guys that worked on Star Wars, The Matrix, Pirates, etc.? Those guys.



You've made no case and you'll be back, I'm sure, defending your Sony army.
Still waiting on that PS5 Pro you claimed is launching alongside the PS5. What are its specs? Hey, what are the Sony-made PS5 exclusives coming to PC?
 
n0razi said:
Zen 2 is an order of magnitude better than Jaguar (PS4/XB1)... [...]

Keep in mind both of those games benefited massively from offloading tasks to the GPU (GPGPU), particularly because the Jaguar cores were so nasty.
 

S0ULZB0URNE

Member
CrustyBritches said:
On a side note, the other day I mentioned that the Steam Survey doesn't necessarily encompass the entire PC gaming landscape. [...] That's a lot of people with RT, VRS, ML/Deep Learning capable GPUs. Imagine what a game like Cyberpunk could do for those numbers.
And yet most modern 3rd-party games not only sell better on consoles but have more active online players there...
It's going to be a ghost town in upcoming games if you game on PC, well, if they even come :)
 

JimboJones

Member
S0ULZB0URNE said:
And yet most modern 3rd-party games not only sell better on consoles but have more active online players there... [...]
Depends on the game; PC players tend to gravitate towards particular titles. Team Fortress on Xbox was a wasteland online compared to PC, for example, whereas I'm sure COD is very popular on consoles.

I'd consider this a positive for consoles. Stuff that makes no visual impact should be turned off instead of wasting watts. [...]

Yeah, devs do it for you when they have time or are sanctioned to do it by the nanny publisher/console manufacturer.
Then you have instances like RE3 on Xbox launching in a terrible state; they fix it by lowering the resolution, but why not leave options? The mode it launched in would have been able to push 60 on Xbox Series X, and what about people who prefer pixels over frame rate? They couldn't implement a 30fps 4K mode?
All of this is very trivial stuff to change on PC.

And look, I get it, you don't want to deal with it and just accept the games as they're presented, but that stuff does bother people, and that's why a lot of people get PCs: so they can customise and take matters into their own hands to a much greater extent. You get to play the game the way you want with the hardware you want. That matters to a lot of people and is part of the fun.
 

CrustyBritches

Gold Member
S0ULZB0URNE said:
And yet most modern 3rd-party games not only sell better on consoles but have more active online players there... [...]
What does this have to do with the number of RTX GPUs sold? The context of my post is "how many PC gamers have RT, VRS, and Deep Learning capable GPUs". It's relevant to the topic of discussion in this thread. Why do you feel compelled to spin like that?
 

S0ULZB0URNE

Member
CrustyBritches said:
What does this have to do with the number of RTX GPUs sold? [...]
Tech talk, OK...
Update this thread when actual games are released on PC that are built for 4.5-9GB/s SSDs.
But that isn't happening, so consoles have a tech advantage.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
You only really need to get a new GPU every two years to be top-of-the-line or very close. [...] If you have, say, a 3700X, you're good for the next 6 years.
I upgrade every 6 months or so... and lose about $50-100 per upgrade as long as I sell the older part.


Thanks for the answers, guys. Now ask yourselves: are you in the PC minority? I think you guys are doing something that most PC gamers don't do. You're more than likely in the top 5% of PC gamers.
 

JimboJones

Member
"most"MODERN 3rd party games.

I don't doubt PS4 probably has higher populations, but PC is hardly a "ghost town" 🙄

Especially when most games in the next two years will most likely be cross-gen, and a lot of PC users will be able to hop on at reduced settings.
 

S0ULZB0URNE

Member
JimboJones said:
I don't doubt PS4 probably has higher populations, but PC is hardly a "ghost town". [...]
From an actual multiplatform gamer like myself... I double- and sometimes triple-dip.
The PC versions' populations are ghost towns in comparison.
 

CrustyBritches

Gold Member
S0ULZB0URNE said:
Tech talk, OK... Update this thread when actual games are released on PC that are built for 4.5-9GB/s SSDs. [...]
PC already has 15 million users on RT-, VRS-, and ML/deep-learning-capable GPUs, tech that console players won't have until November. Even though most console players hide behind the Xbox One X, the majority have Radeon 7770 and 7850/7870-level GPUs. This is why they are compelled to spin in GPU tech threads.
---
DoctaThompson
On a side note, imagine September 17th comes around and you're stuck playing Cyberpunk on a Jag CPU and 7850/7870 GPU. I just can't bring myself to do that. Which is why I take solace in the fact that I'm one of the 15 million PC players with an RTX-capable GPU that supports RT, VRS, and DLSS 2.0. That Nvidia fine wine.🍷
 
I usually get double console fps, as a minimum, if not much more. [...]
You're getting double console framerates near the console generation's end. Considering the PS4 released 7 years ago with outdated Jaguars, that's part of the reason. Depending on how much VRAM the PS5 uses, you might not even be able to run at comparable settings without an upgrade (assuming you don't have a Titan).

And we still don't know how much of an effect the bottleneck-free PS5 SSD will have.

My biggest issue is that certain people claim we won't see a significant visual leap, despite there being a big jump in performance, about as big as the last one.

To be honest, that would be quite surprising and interesting to see. But in all honesty I expect such people to either backtrack or claim that gameplay footage for certain games is fake when it's revealed (say, Santa Monica's or Naughty Dog's next project).

edit:
CrustyBritches said:
PC already has 15 million users on RT-, VRS-, and ML/deep-learning-capable GPUs, tech that console players won't have until November. [...] That Nvidia fine wine. 🍷
My issue with RTX is that only a few effects are viable in most titles. When new cards can run full path tracing in modern titles at decent framerates, that's when I'll buy.
 
JimboJones said:
Then you have instances like RE3 on Xbox launching in a terrible state; they fix it by lowering the resolution, but why not leave options? [...] All of this is very trivial stuff to change on PC.

How much choice do you actually have? I personally think you're locked in once you get a display. You can't really play at 1080p on a 4K monitor to double the frame rate, can you?

The best you can do is decide on your preference, then get a monitor with the desired refresh rate and resolution. Once you do that, it's locked in, just like a console.

Also, a lot of PC advantages only come into the picture when you assume unlimited power, which is almost never the case.
 

JimboJones

Member
How much choice do you actually have? I personally think you're locked in once you get a display. [...]

Yes, you can render at 4K or 1440p (or lots of arbitrary resolutions) on a 1080p display and benefit from supersampling (see the sketch below). This matters to console owners as well, because they've been crying out for "performance" and "cinematic" modes.

You can push higher refresh rates for lower latency.
Or you can lower refresh rates to save on resources.

You have more freedom upgrading monitors and matching the resolution output to that display.

You can choose a graphics card that supports FreeSync or G-Sync and benefit from variable refresh rates.
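As a rough illustration of the supersampling math (just the common resolutions mentioned above):

```python
# Samples per displayed pixel when rendering above a 1080p panel's native res.
native = 1920 * 1080
for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    factor = (w * h) / native
    print(f"Rendering at {name} on a 1080p display = {factor:.2f}x supersampling")
# 1440p = 1.78x, 4K = exactly 4x (the classic 4x SSAA case).
```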
 
CrustyBritches said:
PC already has 15 million users on RT-, VRS-, and ML/deep-learning-capable GPUs, tech that console players won't have until November. [...] That Nvidia fine wine. 🍷
:messenger_fire: :messenger_fire: :messenger_fire: 🧠 🧠 🧠 !!!
You can include me in that RTX gang gang!

So many console warriors throw out that line: "well, how many PC gamers have a *insert GPU here*?" The average PC player has a better GPU than current consoles. Furthermore, how many people have a PS5 or Series X right now, at this very moment?! You would swear everyone in this forum does, the way they keep mentioning how SSDs are suddenly going to change the world, cure cancer, end poverty, and be the root cause of world peace. Or how "next-gen" games look so spectacular at 4K@120fps. I can only laugh at these time travelers.
 
You're getting double console framerates near the console generation's end. [...] My issue with RTX is that only a few effects are viable in most titles. When new cards can run full path tracing in modern titles at decent framerates, that's when I'll buy.
Wait, do you honestly think consoles will outperform an RTX 2080 Ti?! lmfao, you can't be serious right now?! You realize even the 2xxx series will more than likely handle ray tracing better than AMD's implementation? Which includes AMD console hardware.
 

ZywyPL

Banned
DoctaThompson
That's a lot of people with RT, VRS, ML/Deep Learning capable GPUs. Imagine what a game like Cyberpunk could do for those numbers.


15 million? Out of what, hundreds of millions, billions of PCs? Still, with so many people having RT-capable GPUs, it makes me wonder where the hell the corresponding software sales are. Control, Metro, etc. didn't even hit 1 million copies on PC (what did they even get those cards for?? xD), so hardware sales are one thing, but until people actually start buying games that support the new fancy tech, devs won't widely support it.
 