
If the PS5 GPU is only slightly better than a GTX 1080, that's pretty pathetic.

The 1080 was released 3 years ago, in 2016. Sure, it was $500, but it was still 3 damn years ago. That would make the console just like last gen: outdated on release day. I'm not expecting it to be as good as a current-gen PC GPU at the high end, but giving us hardware that's maxed out on day one doesn't leave a good taste in my mouth.

I'm just going off some of the leaks here, which might not be true. 1080 performance is nothing to write home about. 1080 Ti performance in 2020? Yeah, that would be 4 years after those cards came out. That's pretty reasonable, isn't it?

Hell, even the PS4 wasn't worse than a 2009 GPU.
 
Based on rumors, and if so, who cares? Consoles are far better optimized than PCs because devs only have to develop for that one specific configuration.

They aren't far better optimized anymore, because they use the same parts as PCs. Back in the day you had the PS2's CPU running at 300 MHz, yet it was a better gaming machine than a 1 GHz Pentium 4 box, because it wasn't the same part; it was designed specifically for games.

Nowadays that's not the case.
 

Kenpachii

Member
The PS4's 1.84 TFLOP GPU was fine; the only real issue this generation had was CPU performance, and that will be remedied.

It was not fine. It was horribly outdated the moment it released, and the CPU wasn't the only issue it was facing: a 1.8 TFLOP AMD GPU in a box when 5.6 TFLOP GPUs from AMD were already out at the time. I had two of those.
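For anyone who wants to sanity check those numbers, here's a rough back-of-the-envelope sketch of theoretical FP32 throughput (shaders × 2 ops per clock × clock speed), assuming the commonly quoted specs of 1152 shaders at 800 MHz for the PS4's GPU and 2816 shaders at 1 GHz for the 290X:

```python
# Rough theoretical FP32 peak: shaders * 2 ops per clock (FMA) * clock in GHz, divided by 1000 -> TFLOPS
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

print(f"PS4 GPU (1152 shaders @ 0.8 GHz): {tflops(1152, 0.8):.2f} TFLOPS")  # ~1.84
print(f"R9 290X (2816 shaders @ 1.0 GHz): {tflops(2816, 1.0):.2f} TFLOPS")  # ~5.63
```

That's theoretical peak only, of course; it says nothing about how well either platform actually keeps those shaders fed.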
 
It was not fine. It was horribly outdated the moment it released, and the CPU wasn't the only issue it was facing: a 1.8 TFLOP AMD GPU in a box when 5.6 TFLOP GPUs from AMD were already out at the time. I had two of those.
Those 5.6 TFLOP GPUs (the 290X) were 1440p-target GPUs; I would know, I own two of them. The PS4 is a 1080p console. Its GPU was perfectly suited to its application.

That can't be. Those cards can't play games at native 4K.
They can.
 

Kenpachii

Member
Those 5.6 TFLOP GPUs (the 290X) were 1440p-target GPUs; I would know, I own two of them. The PS4 is a 1080p console. Its GPU was perfectly suited to its application.

They can.

Not really.

The 290X = a 1080p card. At 1440p it didn't have the grunt to run games at a stable 60 fps on ultra settings, even at the time of release. Even AC Unity struggled to keep 60 fps with a single card, which is why people were already opting for two cards to get there; same goes for The Witcher 3.

The fact that consoles were already scaling detail back proves that GPU simply didn't hold up from day one. The box was what it was, and that's it. The CPU wasn't the only issue; the GPU was nothing to write home about either.

That Sony spends gazillions of cash to limit environments and push detail forward to get the most out of their hardware is something most devs won't bother with. AC Unity, Watch Dogs, etc. were all good examples of this. Hell, I think The Witcher 3 even got a downgrade for the sole reason that consoles had to run it to justify its budget.
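To put the resolution argument in plain numbers (just pixel arithmetic, not tied to any particular GPU), here's how the per-frame pixel load scales from 1080p up to 1440p and 4K:

```python
# Per-frame pixel counts at common render resolutions, relative to 1080p
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpixels ({pixels / base:.2f}x the 1080p load)")
```

So 1440p is roughly 1.8x the pixel work of 1080p and native 4K is 4x, which is why a card that's comfortable at one resolution can fall apart at the next one up.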
 

JohnnyFootball

GerAlt-Right. Ciriously.
They aren't far better optimized anymore, because they use the same parts as PCs. Back in the day you had the PS2's CPU running at 300 MHz, yet it was a better gaming machine than a 1 GHz Pentium 4 box, because it wasn't the same part; it was designed specifically for games.

Nowadays that's not the case.
No. Not even close.

My PC with similar specs from back in the day absolutely blew away anything on the PS2.
 

avalonzero

Member
That can't be. Those cards can't play games at native 4K.

It's not the same playing field when comparing consoles and PCs. Consoles have hardware tailor-made strictly for the games being played. PCs are simultaneously running Windows and other applications, along with third-party drivers that aren't as well optimized as a console, which has one specific use.
 
It was not, and AC Unity proved this. Games running not on ultra settings from the get-go but on cut-down settings also proved this.
AC Unity was a CPU-gutting game; they pushed their crowd technology too far.

What console has ever run anything at ultra settings? I mean, bar the Xbox One X in some instances, it's never been a thing...
 

JohnnyFootball

GerAlt-Right. Ciriously.

There are so many levels of dumb in the OP's post.

By now you'd think people would get a clue about consoles vs. PC.
A game coded to the metal that can take full advantage of a GPU is going to give several generations' better performance than its PC counterpart.
 

JordanN

Banned

There are so many levels of dumb in the OP's post.

By now you'd think people would get a clue about consoles vs. PC.
A game coded to the metal that can take full advantage of a GPU is going to give several generations' better performance than its PC counterpart.
Hasn't this been debunked?

PS4 launched with games that were still 1080p/30fps. PC hardware was already way above this.

It's quite the opposite. Consoles were bottlenecked by really bad CPUs this generation; it made low-end PCs feel more powerful.
 

Nethernova

Member

There are so many levels of dumb in the OP's post.

By now you'd think people would get a clue about consoles vs. PC.
A game coded to the metal that can take full advantage of a GPU is going to give several generations' better performance than its PC counterpart.

This too. It's never a 1:1 comparison. I wouldn't say "generations" ahead, though, but yeah, considerably ahead.
 
Not really.

The 290X = a 1080p card. At 1440p it didn't have the grunt to run games at a stable 60 fps on ultra settings, even at the time of release. Even AC Unity struggled to keep 60 fps with a single card, which is why people were already opting for two cards to get there; same goes for The Witcher 3.

The fact that consoles were already scaling detail back proves that GPU simply didn't hold up from day one. The box was what it was, and that's it. The CPU wasn't the only issue; the GPU was nothing to write home about either.

That Sony spends gazillions of cash to limit environments and push detail forward to get the most out of their hardware is something most devs won't bother with. AC Unity, Watch Dogs, etc. were all good examples of this. Hell, I think The Witcher 3 even got a downgrade for the sole reason that consoles had to run it to justify its budget.
You keep running to Unity; that's a horrible example of performance on anything. It was poorly optimized and an egregious CPU hog. The 290X, for all intents and purposes, is a 1440p GPU, just like the GTX 980 was.

The Witcher 3 is a 2015 game, son; the 290X is a 2013 flagship GPU. I opted for two cards because I have a high-refresh-rate monitor, not because one card wasn't doing its job.

I can't be bothered talking to people like you, you're so clueless it's nauseating.
 

Dante83

Banned
Don't forget the base PS4/Xbone were way underpowered at launch. Getting benchmark numbers close to capable PCs is already impressive, and devs don't have to worry about as many variables when optimising their games for a console (close to the metal). It's more impressive than you think, actually.
 

LordOfChaos

Member
Hell, even the PS4 wasn't worse than a 2009 GPU.

The PS4's GPU was close to a mid-range GPU from March of that year vs. its November launch; the $249 HD 7850 is what it was closest to.

The 1080 is still more expensive than that.

This isn't a move downwards, just the pretty normal order of the universe, apart from the freak 7th gen (whose consoles were also massive money losers at the start).

I think your expectations are askew, tbh. Bringing a GPU of this price down into a total system cost of $400-500, including the CPU, fast SSD storage, networking, etc., is not bad. Then there's it being a closed box, plus RT hardware; it'll perform better than the 1080 in the real world.

 

StreetsofBeige

Gold Member
They aren't far better optimized anymore, because they use the same parts as PCs. Back in the day you had the PS2's CPU running at 300 MHz, yet it was a better gaming machine than a 1 GHz Pentium 4 box, because it wasn't the same part; it was designed specifically for games.

Nowadays that's not the case.
That was crazy shit back then. When our family had a PC and a Sega Genesis, the PC could do higher-res games and, thanks to its storage and pure CPU power, could crank out big games with tons of stats... but fuck, the games couldn't even do smooth 2D side-scrolling, and the audio was shit. Plug in a PC joystick and 10% of your CPU power went to running the joystick or mouse driver.
 

Keihart

Member
PS4 was outdated on release, yet DOOM and DMCV came out this generation running at 60 fps and looking great? I think you are overreacting.
Games don't fully take advantage of new hardware until it hits consoles and becomes the target spec for almost everything, so making a GTX 1080 the target for most games sounds like a pretty big jump to me.
 

StreetsofBeige

Gold Member
Your 300 MHz Pentium II was not outperforming the PS2's graphics.
In 2001, I had an 800 MHz Pentium III PC with a GeForce 256MX (I think that was the GPU). I forget how many megs of RAM it had, but when I got my PS2 around 2002, there was no way my PC could do an average PS2 game.

No doubt, for PC games heavy on 3D or on simulating stats it was fine (Unreal/Quake), but games like Ace Combat 4, Twisted Metal, God of War, etc.... no way my PC could do those, and it probably had 10x the RAM.
 