
If the PS5 GPU is only slightly better than a GTX 1080, that's pretty pathetic.

The 1080 was released 3 years ago, in 2016. Sure, it was $500, but it was still 3 damn years ago. That would make the console just like last gen: outdated on release day. I'm not expecting it to be as good as a current high-end PC GPU, but giving us hardware that's maxed out on day one doesn't leave a good taste in my mouth.

I'm just going off some of the leaks here, which might not be true. 1080 performance is nothing to write home about. 1080 Ti performance in 2020? Yeah, that would be 4 years after these cards came out. That's pretty reasonable, isn't it?

Hell, even the PS4 wasn't worse than a 2009 GPU.


It's not slightly better than a 1080; it's closer to a 1080 Ti, which is an 11+ TF card, and on top of that it will have RT support. The PS5 will have RT, and it surely won't be 8.8 TF like the 1080, which doesn't support RT at all.
 

Romulus

Member
You heard it here first, folks: memory bandwidth doesn't matter.

Why aren't we still using GDDR3?

Memory bandwidth does matter, but if I could choose to play Nintendo games built from the ground up for original Xbox vs GC hardware, I'd choose Metroid Prime etc every time on Xbox. Any game for that generation I'd choose on Xbox. The memory advantage just isn't enough to offset a massive GPU and CPU disparity. It's just too much of a gulf.
 
Last edited:

Journey

Banned
What is the argument? The Xbox was faster than any other console out there. Arguing about memory bandwidth as if it changed the comparison makes no sense.


Exactly, the proof is in the pudding. Xbox was able to output a decent list of games in HD as well, something that the GameCube was never able to achieve, even with its exclusive titles.

Take a look at this late-gen title, Enter the Matrix: it ran at 1920 x 1080 on Xbox, which is just ridiculous, and despite the resolution it still had better texture work than the GameCube version... one has to wonder why?

 
Memory bandwidth does matter, but if I could choose to play Nintendo games built from the ground up for original Xbox vs GC hardware, I'd choose Metroid Prime etc every time on Xbox. Any game for that generation I'd choose on Xbox. The memory advantage just isn't enough to offset a massive GPU and CPU disparity. It's just too much of a gulf.
But it is enough, depending on what the game was doing. The Xbox was factually bottlenecked when processing alpha transparencies, for example. Its FSB speed limited poly counts and frame rates.

Obviously I'm not saying GameCube wins in every category, but the overall advantage in poly counts and effects thanks to the fast framebuffers and texture cache means the exclusives just look more modern.

Sam Fisher's pointy arms next to Leon S. Kennedy, and the octagonal modeling in Doom 3, just don't hold up. Other titles like Gaiden and Orta absolutely look great, but the usual Xbox suspects, not so much.
 
Last edited:
This isn't that complicated. The Xbox was a more modern, more powerful machine, yes. It was also not a balanced piece of hardware like the GameCube was.

Imagine if the PS4 had shipped with a 128-bit bus and no eDRAM. Would it still beat the Xbox One every time?
 
Last edited:

Bigfroth

Member
Comparing these next-gen consoles to PCs is pointless; you will always be disappointed. I game on PS4, and I ask myself: will the PS5 be more powerful? Yes. Will it play PS4 games? Yes. I don't care if it's more or less powerful than the next Xbox or what great innovation PCs come up with. I do enjoy reading these console war threads though. 😜
 
Last edited:

Romulus

Member
But it is enough, depending on what the game was doing. The Xbox was factually bottlenecked when processing alpha transparencies, for example. Its FSB speed limited poly counts and frame rates.

Obviously I'm not saying GameCube wins in every category, but the overall advantage in poly counts and effects thanks to the fast framebuffers and texture cache means the exclusives just look more modern.

Sam Fisher's pointy arms next to Leon S. Kennedy, and the octagonal modeling in Doom 3, just don't hold up. Other titles like Gaiden and Orta absolutely look great, but the usual Xbox suspects, not so much.

If RE4 ran on the PS2 to the extent it did, there's no doubt in my mind a specifically built Xbox port would best them both. The poly counts were never confirmed by anyone; nobody knows how many a given game actually pushed compared to the next, just speculation about which game supposedly pushed the most. Nintendo said one thing, devs said another. Xbox said one thing, and their devs said another.
 
Last edited:
If RE4 ran on the PS2 to the extent it did, there's no doubt in my mind a specifically built Xbox port would best them both. The poly counts were never confirmed by anyone; nobody knows how many a given game actually pushed compared to the next, just speculation about which game supposedly pushed the most. Nintendo said one thing, devs said another. Xbox said one thing, and their devs said another.
On the games that use normal maps on Xbox, there absolutely is a polygon deficit in favor of GameCube.

With regards to RE4, I'm sure the alpha effects would probably be reduced on Xbox. At points there is a LOT of fog on screen. Given that a port to Xbox probably wouldn't include normal maps, and there's only per-vertex lighting, it's possible Xbox could match the Cube's poly counts. One thing Xbox could have is higher-res textures.

It's pure speculation though. Mikami made it sound like a port even to Xbox wouldn't be ideal.
 

V4skunk

Banned
It's strange how much people underestimate the power of the original Xbox. Having a GeForce 3 in the Xbox was the equivalent of having a 2080 (maybe a Ti) in consoles right now.

I had to laugh at previous posts suggesting that Metroid Prime looked better than Chronicles of Riddick.
You are overhyping the OG Xbox. It is clear you never owned a GC.
Xbox only had a slight advantage in DirectX and unified GPU tech with normal-mapped shiny-wall FX; these nice effects were part of the DirectX library, quick and simple to add to a game. GC exclusives were pushing out 14-20 million polygons a second with custom effects and nicer textures.
You guys comparing the Xbox Intel CPU to the GC IBM CPU are also delusional. Comparing apples to oranges, FFS.
 
You are overhyping the OG Xbox. It is clear you never owned a GC.
Xbox only had a slight advantage in DirectX and unified GPU tech with normal-mapped shiny-wall FX; these nice effects were part of the DirectX library, quick and simple to add to a game. GC exclusives were pushing out 14-20 million polygons a second with custom effects and nicer textures.
You guys comparing the Xbox Intel CPU to the GC IBM CPU are also delusional. Comparing apples to oranges, FFS.
As for the CPUs, it turns out they deliver the exact same FLOPS per cycle. The GameCube's advantage comes from the faster FSB and possibly the extra L2 cache, although the Xbox's L2 cache is more associative: 8-way vs. the Cube's 2-way.
 

pawel86ck

Banned
The PS2 version of RE4 looks like ass. Greatly reduced polys, stripped lighting, reduced foliage, etc.

GC could do anything Xbox could through its TEV units, sans normal maps, which are not a suitable replacement for geometry anyway.

But, you do you.
According to you the GameCube could do anything the Xbox could, and who knows, maybe it's true, but the question is how efficiently GC hardware could emulate Xbox effects without hardware acceleration for them. The GC had no pixel and vertex shaders or shadow buffers!

These days ray tracing is a big thing, and a GPU like the 1080 Ti can run ray tracing in software, but can it run RT in software as fast as a 2080 Ti with hardware RT support? Hell no! Just run Quake 2 RTX and you will see roughly the same fps on a 2080 Ti at 1440p as on a 1080 Ti at 480p! So theoretically both GPUs support ray tracing, but only the 2080 Ti with HW RT will run it with decent results.
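As a rough sanity check on that comparison (resolutions taken from the claim above as assumptions; actual Quake 2 RTX results vary by scene and settings), equal frame rates at those two resolutions would imply roughly a 12x difference in pixels, and therefore rays, pushed per frame:

```python
# Back-of-the-envelope pixel comparison for the software-RT vs hardware-RT claim.
# Resolutions are the ones assumed in the post above, not measured benchmarks.
def pixels(width: int, height: int) -> int:
    return width * height

hw_rt = pixels(2560, 1440)  # 2080 Ti with hardware RT, assumed 1440p
sw_rt = pixels(640, 480)    # 1080 Ti doing RT in software, assumed 480p

print(f"1440p: {hw_rt:,} pixels per frame")  # 3,686,400
print(f"480p:  {sw_rt:,} pixels per frame")  # 307,200
print(f"Ratio: ~{hw_rt / sw_rt:.0f}x")       # ~12x
```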

GameCube developers very rarely used effects comparable to the Xbox's in their games. Besides Star Wars Rogue Leader and maybe Star Fox, even simple bump mapping was almost non-existent in GC games, not to mention other effects. I believe the problem was that GC developers had to do everything in software with a much higher performance penalty, and besides that they were also RAM-limited (the Xbox had 50% more RAM and an HDD on top of that). On Xbox pretty much all exclusive games (and later on multiplatform games as well) used very impressive texture and lighting effects (thanks to shaders) and high-quality dynamic shadows (thanks to shadow buffers) on a regular basis. The GC was good at pushing polygons, but without Xbox-like effects these GC games ended up looking flat. There's no way games like Splinter Cell 3, Doom 3, Half-Life 2, PGR2, Riddick, Far Cry or even Halo 2 could be ported to the GC with decent results.
 

Romulus

Member
On the games that use normal maps on Xbox, there absolutely is a polygon deficit in favor of GameCube.

With regards to RE4, I'm sure the alpha effects would probably be reduced on Xbox. At points there is a LOT of fog on screen. Given that a port to Xbox probably wouldn't include normal maps, and there's only per-vertex lighting, it's possible Xbox could match the Cube's poly counts. One thing Xbox could have is higher-res textures.

It's pure speculation though. Mikami made it sound like a port even to Xbox wouldn't be ideal.


Nearly every port on Xbox was superior to the GameCube's, especially the more demanding games. The polygon stuff was all paper-tiger talk during that generation; no hard stats were ever proven either way. There's no metric by which the GameCube was better at effects or polygon count. Whatever Mikami said, there was a PS2 version, and again, it holds up to the GameCube in many aspects. Xbox would definitely have furthered that with a ground-up port.
 
Nearly every port on Xbox was superior to the GameCube's, especially the more demanding games. The polygon stuff was all paper-tiger talk during that generation; no hard stats were ever proven either way. There's no metric by which the GameCube was better at effects or polygon count. Whatever Mikami said, there was a PS2 version, and again, it holds up to the GameCube in many aspects. Xbox would definitely have furthered that with a ground-up port.
The PS2 version is pretty damn ugly. You're free to believe that about polygon counts; I'm just offering my unbiased observation. It's quite obvious in the modeling of games like Doom and Riddick that there's a deficit, but I won't die on this hill.
According to you the GameCube could do anything the Xbox could, and who knows, maybe it's true, but the question is how efficiently GC hardware could emulate Xbox effects without hardware acceleration for them. The GC had no pixel and vertex shaders or shadow buffers!

These days ray tracing is a big thing, and a GPU like the 1080 Ti can run ray tracing in software, but can it run RT in software as fast as a 2080 Ti with hardware RT support? Hell no! Just run Quake 2 RTX and you will see roughly the same fps on a 2080 Ti at 1440p as on a 1080 Ti at 480p! So theoretically both GPUs support ray tracing, but only the 2080 Ti with HW RT will run it with decent results.

GameCube developers very rarely used effects comparable to the Xbox's in their games. Besides Star Wars Rogue Leader and maybe Star Fox, even simple bump mapping was almost non-existent in GC games, not to mention other effects. I believe the problem was that GC developers had to do everything in software with a much higher performance penalty, and besides that they were also RAM-limited (the Xbox had 50% more RAM and an HDD on top of that). On Xbox pretty much all exclusive games (and later on multiplatform games as well) used very impressive texture and lighting effects (thanks to shaders) and high-quality dynamic shadows (thanks to shadow buffers) on a regular basis. The GC was good at pushing polygons, but without Xbox-like effects these GC games ended up looking flat. There's no way games like Splinter Cell 3, Doom 3, Half-Life 2, PGR2, Riddick, Far Cry or even Halo 2 could be ported to the GC with decent results.
It doesn't work like that! Both the GameCube and the Xbox had 4 pixel pipelines. The Xbox could not do effects more cheaply than the GameCube just because of its shader feature set. The programmable shaders are simply a toolset; they don't magically make the Xbox output more than the Cube or vice versa. Funny you mention Star Fox: another early game that pretty much tells you where the GameCube stands technically.

Literally anything besides normal maps and stencil shadows can be done on cube, at similar cost.
 
Last edited:

Romulus

Member
The ps2 version is pretty damn ugly. You're free to believe that about polygon counts, I'm just offering my unbiased observation. It's quite obvious in the modeling of games like doom and Riddick there's a deficit, but I won't die on this hill.

I think the GC version is ugly too. Sure, the PS2 version is worse, but most of it is clarity and low-res textures, which the Xbox would clean up to an even further degree than the GC. All those games were light on polygons back then. I'm not surrounding myself with some agenda-filled belief here; there's literally no proof of those polygon metrics. You're claiming these Xbox shooters are polygon-deficient, but what about something built from the ground up like Metroid? That's no polygon stunner by any means. Actually, there's not a single GC title I can think of where it's like, wow, that's pushing some polygons. I'm being objective. Maybe Rogue Squadron, but it's so hard to tell because everything looks so simple.

I mean, this is a first-party title and I loved it, but in no way does it stand next to top-tier Xbox shooters visually, even in terms of poly count IMO. That's just some bizarre way of muddying the waters, because it can't be measured; it's only an opinion where I can't prove you wrong or vice versa. Not saying it's intentional, but you see my point.
 
Last edited:
I think the GC version is ugly too. Sure, the PS2 version is worse, but most of it is clarity and low-res textures, which the Xbox would clean up to an even further degree than the GC. All those games were light on polygons back then. I'm not surrounding myself with some agenda-filled belief here; there's literally no proof of those polygon metrics. You're claiming these Xbox shooters are polygon-deficient, but what about something built from the ground up like Metroid? That's no polygon stunner by any means. Actually, there's not a single GC title I can think of where it's like, wow, that's pushing some polygons. I'm being objective. Maybe Rogue Squadron, but it's so hard to tell because everything looks so simple.

I mean, this is a first-party title and I loved it, but in no way does it stand next to top-tier Xbox shooters visually, even in terms of poly count IMO. That's just some bizarre way of muddying the waters, because it can't be measured; it's only an opinion where I can't prove you wrong or vice versa. Not saying it's intentional, but you see my point.
You sound triggered lol.
 

pawel86ck

Banned
The PS2 version is pretty damn ugly. You're free to believe that about polygon counts; I'm just offering my unbiased observation. It's quite obvious in the modeling of games like Doom and Riddick that there's a deficit, but I won't die on this hill.

It doesn't work like that! Both the GameCube and the Xbox had 4 pixel pipelines. The Xbox could not do effects more cheaply than the GameCube just because of its shader feature set. The programmable shaders are simply a toolset; they don't magically make the Xbox output more than the Cube or vice versa.

Literally anything besides normal maps and stencil shadows can be done on cube, at similar cost.
You want to tell people here the GC could render shadows as fast without shadow buffers? The TEVs aren't quite as flexible as you think, and in terms of features Flipper was more comparable to a GeForce 2 or Radeon 7500 (DX7 cards).

You want to talk about polygons? The Xbox GPU was a polygon beast; it had 2 texture units per pixel pipe, unlike the GC, along with other important features.
Here's a very detailed Xbox vs GC hardware analysis:

As for geometry performance, the XGPU is absolutely loaded for bear. Although its architecture in many ways resembles that of the GeForce 3, the XGPU is actually more closely modeled on a next-generation 3D core from nVidia (GeForce 4). One of the very noteworthy features is the addition of a second vertex pipeline, which rockets its geometry throughput up to 116.5 million triangles/sec.
Other estimates put Flipper’s geometry rate somewhat higher at 20-30 million polygons/second, which is still well below Xbox’s 116.5 million polygons/second
20-30 million triangles/sec on GameCube vs 116.5 million triangles/sec on Xbox.
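For a sense of scale, here is a quick conversion of those quoted rates into per-frame budgets (a sketch using the paper peak and estimate figures above; real games with textures, lighting and AI pushed far fewer):

```python
# Convert the quoted triangles-per-second figures into per-frame budgets.
# These are theoretical/estimated rates, not measured in-game counts.
RATES = {
    "Xbox (paper peak)": 116.5e6,
    "GameCube (low estimate)": 20e6,
    "GameCube (high estimate)": 30e6,
}

for name, tris_per_sec in RATES.items():
    at_60 = tris_per_sec / 60
    at_30 = tris_per_sec / 30
    print(f"{name}: {at_60:,.0f} tris/frame @ 60 fps, {at_30:,.0f} @ 30 fps")
```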
Literally anything besides normal maps and stencil shadows can be done on cube, at similar cost.
And yet the shadows and texture effects in the Splinter Cell games were extremely downgraded. These games looked really ugly on GC. If the GC were as powerful as you say, developers would have used that power in their games and matched what the Xbox was doing, but it wasn't like that.

RE4 on GameCube:

[screenshots: Resident Evil 4 on GameCube]

Xbox games:

[screenshots: Splinter Cell, Doom 3, The Chronicles of Riddick on Xbox]
 
Last edited:

yurinka

Member
It's not slightly better than a 1080; it's closer to a 1080 Ti, which is an 11+ TF card, and on top of that it will have RT support. The PS5 will have RT, and it surely won't be 8.8 TF like the 1080, which doesn't support RT at all.
Yep, rumors say around 11 TF, and RT support is confirmed. So it seems it's going to be more 1080 Ti than 1080. And looking at what the PS4 did with games like Uncharted 4 or Horizon, PS5 exclusives are going to look stunning, way above current PC games.

In addition to this, we should also keep in mind that the PS5 needs to be priced around $400, or $500 max, at launch, so obviously it won't compete against $3000 PCs.
 
Last edited:

LordOfChaos

Member
It's not slightly better than a 1080; it's closer to a 1080 Ti, which is an 11+ TF card, and on top of that it will have RT support. The PS5 will have RT, and it surely won't be 8.8 TF like the 1080, which doesn't support RT at all.

You can't compare flops across architectures like that. Traditionally, 8 TFLOPS on the Nvidia side would perform better than the same on the AMD side, because the flops everyone throws around are just a paper calculation of theoretical performance (ALUs * clock speed * 2 ops per clock per core). Navi with its 1.25x IPC improvement will bring things closer, but there's still no comparing them directly across architectures.
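To illustrate what that paper number actually is, here's a minimal sketch of the calculation (the shader counts and boost clocks below are approximate reference specs, used only for illustration):

```python
# Theoretical FP32 throughput: shader ALUs x clock x 2 FLOPs per clock (FMA).
# A paper figure only; it says nothing about real-game performance across
# different architectures.
def paper_tflops(alus: int, clock_ghz: float) -> float:
    return alus * clock_ghz * 2 / 1000.0

# Approximate reference specs (illustrative):
print(f"GTX 1080   : ~{paper_tflops(2560, 1.733):.1f} TFLOPS")  # ~8.9
print(f"GTX 1080 Ti: ~{paper_tflops(3584, 1.582):.1f} TFLOPS")  # ~11.3
```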
 

V4skunk

Banned
You want to tell people here the GC could render shadows as fast without shadow buffers? The TEVs aren't quite as flexible as you think, and in terms of features Flipper was more comparable to a GeForce 2 or Radeon 7500 (DX7 cards).

You want to talk about polygons? The Xbox GPU was a polygon beast; it had 2 texture units per pixel pipe, unlike the GC, along with other important features.
Here's a very detailed Xbox vs GC hardware analysis:



20-30 million triangles/sec on GameCube vs 116.5 million triangles/sec on Xbox.

And yet the shadows and texture effects in the Splinter Cell games were extremely downgraded. These games looked really ugly on GC. If the GC were as powerful as you say, developers would have used that power in their games and matched what the Xbox was doing, but it wasn't like that.

RE4 on GameCube:

[screenshots: Resident Evil 4 on GameCube]

Xbox games:

[screenshots: Splinter Cell, Doom 3, The Chronicles of Riddick on Xbox]
A thread back from 2002.
The Xbox pushes 8 million polygons a second with textures and effects in the real world; the GC pushes 14-20 million real-world.
Rogue Leader 1+2 were clearly dominating that gen in poly count. RE4 on GC clearly had better geometry and more enemies on screen than Riddick as well, and the PS2 port of RE4 was a joke.
 
Last edited:

Vorg

Banned
It finally devolved into screenshot wars and metrics claiming the Xbox could push 4x the polys of the Cube xD

Dude, not sure what you're getting at. There's literally nothing on GameCube that would not be possible on Xbox. On the other hand, you have titles like Dead or Alive Ultimate, Doom 3, Chaos Theory, and Ninja Gaiden that would have to be butchered pretty heavily to be ported to the GameCube. No one is saying the GameCube wasn't a great console, but this is a pretty dishonest discussion. The Xbox was like the Xbox One X of that generation. It was just capable of a lot more, and some of its games felt almost like a generational leap over the other consoles at the time.
 

Ten_Fold

Member
They are not going to price the PS5 over $499; also, they don't need to go for something over the top, it just needs to be a nice improvement over the PS4, which it will be.
 

Aidah

Member
It's not pathetic, it's what you'd expect from a console. By late 2020 when it comes out, it'll have a GPU equivalent to a low-to-midrange PC GPU; not bad for a console that's supposed to be affordable.
 
Last edited:

iHaunter

Member
The 1080 was released 3 years ago, in 2016. Sure, it was $500, but it was still 3 damn years ago. That would make the console just like last gen: outdated on release day. I'm not expecting it to be as good as a current high-end PC GPU, but giving us hardware that's maxed out on day one doesn't leave a good taste in my mouth.

I'm just going off some of the leaks here, which might not be true. 1080 performance is nothing to write home about. 1080 Ti performance in 2020? Yeah, that would be 4 years after these cards came out. That's pretty reasonable, isn't it?

Hell, even the PS4 wasn't worse than a 2009 GPU.

Want to pay $1000 for a console? A 2080 is like $700 on its own, man.

The only problem I have is the 8K shit-slinging they're throwing out. I have a GTX 1080 Ti and it can barely handle 1440p at high FPS, let alone 8K. If anything, the PS5 MIGHT be able to do real 4K/30. That's it.
 
Last edited:

betrayal

Banned
I think a console for ~$600 up to $700 would be fine, if it has enough power to run games at a constant 60+ FPS and at least 1440p, no exceptions. I'm not sure a GTX 1080 is able to guarantee that for all new game releases over the next 4-5 years. They should ditch the 4K mantra and be honest... 4K at a fixed 60 FPS is not happening with the next generation, or maybe even the generation after that, if they want to sell consoles for ~$400.

I mainly played on consoles for about the last ten years and recently bought a new mid-tier PC (RTX 2070, Ryzen 2700X, 144Hz monitor), and switching from PUBG at 30 FPS to 90-130 FPS on PC is impressive in a way that makes me really not want to go back to console gaming. Even OV or R6 at a constant 144 FPS on PC feels and plays way better, let alone the games you can't play on consoles at all. And I'm only talking about the technical differences, not about controller vs. M/KB.
The point I want to make is what I already said in the first sentence. We need a console which consistently outputs 1440p at 60 FPS (or more) for $600-700. A console like this would dominate the market in a never-before-seen way. $700 may sound like a lot, but we're living in a world where most people spend between $200 and $1200 on a smartphone (which they probably replace after 1-2 years).
 

Eteric Rice

Member
God damn, you guys want Sony to repeat the PS3, don't you? A console over $499 is suicide. I'd say $399 is the sweet spot. And for $399 you're not going to get the latest and greatest PC hardware. It just isn't happening.

Not to mention it could be financial suicide.
 
Last edited:

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
How the fuck is this thread still going?
 

Armorian

Banned
You can't run NFS Underground on a PC with 32MB even if you turn the world upside down.

The PS2 doesn't have any OS running in the background; all available resources are used by games. Windows XP on PC requires 64MB of RAM (with 128MB recommended), and current-gen consoles both eat up to 3GB of RAM for their operating systems.
 

Romulus

Member
God damn, you guys want Sony to repeat the PS3, don't you? A console over $499 is suicide. I'd say $399 is the sweet spot. And for $399 you're not going to get the latest and greatest PC hardware. It just isn't happening.

Not to mention it could be financial suicide.

I think we're honestly ready for $499 in 2020. $399 is just less and less money to work with to break even or make a small profit. I don't see how a 9 TF GPU, a Zen 2 CPU, GDDR6, an SSD, and a 4K Blu-ray drive will make much profit at $499, and I lowballed the GPU. $399 would be an absolute nightmare. If it's $399, expect massive cuts we didn't see coming. If the X1X can sell well at $499 with crap exclusives and an ancient tablet CPU, the PS5 will be just fine IMO.
 
Last edited:
There are so many levels of dumb in the OP's post.

By now you would think people would get the clue about consoles vs. PCs.
A game coded to the metal that can take full advantage of a GPU is going to give several generations' better performance than its PC counterpart.
Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
 

Whitecrow

Banned
Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
APIs let you code to the metal. An API is just a communication layer, yet you can still tell it you want your code to run at the metal level, or at least you should be able to. I assume console APIs can do that.
 
Last edited:
APIs let you code to the metal. An API is just a communication layer, yet you can still tell it you want your code to run at the metal level, or at least you should be able to. I assume console APIs can do that.
At that point you're basically bypassing the API and defeating the point of it existing... now, can you? Sure. Technically you can do the same thing on PC. It's a non sequitur though, because nobody bothers to do that.
 

Whitecrow

Banned
At that point you're basically bypassing the API and defeating the point of it existing... now, can you? Sure. Technically you can do the same thing on PC. It's a non sequitur though, because nobody bothers to do that.
It depends. There are things it's worth doing at the metal and things that aren't.

Do you have a loop in the code with a lot of iterations? Then it's worth making the code inside that loop highly optimized.
Maybe a 0.002 ms gain from coding something to the metal is nothing. But that 0.002 ms over 1000 iterations really is something.

A lot of devs just use UE or Unity or whatever, add their customization, and release the game. But I'm sure first-party games go to the metal to squeeze out the maximum possible performance.
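A toy illustration of that point (plain Python, nothing console-specific): the per-iteration saving from hoisting work out of a hot loop is tiny, but it compounds once the loop runs thousands of times per frame:

```python
import math
import timeit

data = list(range(1000))
angle = 0.785  # stand-in for some per-frame parameter

def naive():
    total = 0.0
    for x in data:
        total += x * math.sin(angle)  # recomputed on every iteration
    return total

def hoisted():
    s = math.sin(angle)               # computed once, outside the loop
    total = 0.0
    for x in data:
        total += x * s
    return total

# The difference per call is microscopic; over thousands of calls it adds up.
print("naive  :", timeit.timeit(naive, number=5_000))
print("hoisted:", timeit.timeit(hoisted, number=5_000))
```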
 

JohnnyFootball

GerAlt-Right. Ciriously.
Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
It's still going to give you far, far better performance than the GPU equivalent on PC, since that API only has to be optimized to communicate with one set of hardware.
 

Armorian

Banned
It's still going to give you far, far better performance than the GPU equivalent on PC, since that API only has to be optimized to communicate with one set of hardware.

Currently you have 4 different GPU architectures out there (Maxwell/Pascal, Turing, GCN and RDNA), just at different power levels. I think that at least in the GPU space things are pretty clear: devs don't optimize for older hardware and don't give a fuck about Intel GPUs for the most part. With CPUs you have Core and Ryzen, and DDR3 and DDR4 for memory (the only difference there is speed).

There are also 5 different consoles on the market: Switch (Nvidia/Maxwell), X1 (GCN 1.0, eSRAM), PS4 (GCN 1.1), PS4 Pro (GCN 1.4+) and X1X (GCN 1.4), so these are not the same good times for developers as when there were 2-3 machines to code for :)

With DX12/Vulkan, developers have low-level access to hardware on PC.
 
Last edited:
It's still going to give you far, far better performance than the GPU equivalent on PC, since that API only has to be optimized to communicate with one set of hardware.
Except... that isn't how APIs work. APIs are designed to be hardware-agnostic. The sequence goes something like this:
Game -> API -> GPU Driver -> GPU
Now it might not be perfect, I'm no expert, but the point is that the API provides a standardized layer the game can use to communicate with the GPU, and the driver is the hardware- and vendor-specific part of the sequence that sits between the API and the GPU. This is how the Xbox One X is able to just run Xbox One games without any per-game patches despite having a different GPU. This is also how PC games from 15 years ago generally work without issue when you try to play them on an RTX 2080 Ti (at least graphically... sometimes they don't like running on newer versions of Windows, but that's a totally different issue).
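A toy sketch of that layering (all names are made up for illustration, not any real graphics API): the game only ever calls the standardized layer, and the vendor-specific driver underneath can be swapped without touching the game:

```python
from abc import ABC, abstractmethod

class GpuDriver(ABC):
    """Vendor/hardware-specific part of the chain (hypothetical)."""
    @abstractmethod
    def submit_triangles(self, count: int) -> None: ...

class VendorADriver(GpuDriver):
    def submit_triangles(self, count: int) -> None:
        print(f"[vendor A] drawing {count} triangles its own way")

class VendorBDriver(GpuDriver):
    def submit_triangles(self, count: int) -> None:
        print(f"[vendor B] drawing {count} triangles its own way")

class GraphicsAPI:
    """Standardized layer the game talks to; it never sees the hardware."""
    def __init__(self, driver: GpuDriver) -> None:
        self.driver = driver
    def draw(self, count: int) -> None:
        self.driver.submit_triangles(count)

def game_frame(api: GraphicsAPI) -> None:
    # The game's code is identical regardless of which GPU is installed.
    api.draw(50_000)

# Swapping hardware only swaps the driver; the game itself is unchanged.
game_frame(GraphicsAPI(VendorADriver()))
game_frame(GraphicsAPI(VendorBDriver()))
```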
 

Psykodad

Banned
I think we're honestly ready for $499 in 2020. $399 is just less and less money to work with to break even or make a small profit. I don't see how a 9 TF GPU, a Zen 2 CPU, GDDR6, an SSD, and a 4K Blu-ray drive will make much profit at $499, and I lowballed the GPU. $399 would be an absolute nightmare. If it's $399, expect massive cuts we didn't see coming. If the X1X can sell well at $499 with crap exclusives and an ancient tablet CPU, the PS5 will be just fine IMO.

Who are "we"? Because it's 100% guaranteed that $499 is really pushing it for the masses.

Anything above that and they've f*cked up, both Sony and MS.
 

Whitecrow

Banned
Except... that isn't how APIs work. APIs are designed to be hardware-agnostic. The sequence goes something like this:
Game -> API -> GPU Driver -> GPU
Now it might not be perfect, I'm no expert, but the point is that the API provides a standardized layer the game can use to communicate with the GPU, and the driver is the hardware- and vendor-specific part of the sequence that sits between the API and the GPU. This is how the Xbox One X is able to just run Xbox One games without any per-game patches despite having a different GPU. This is also how PC games from 15 years ago generally work without issue when you try to play them on an RTX 2080 Ti (at least graphically... sometimes they don't like running on newer versions of Windows, but that's a totally different issue).
That's like saying that a player who barely understands StarCraft and a professional StarCraft player can play at the same level just because the gameplay is skill-agnostic.

Knowing your target hardware is crucial when you have to make choices about how and where to use your resources, or how to allocate them. And on PC, you don't have target hardware, just brute force.
 
Last edited:
That's like saying that a player who barely understands StarCraft and a professional StarCraft player can play at the same level just because the gameplay is skill-agnostic.

Knowing your target hardware is crucial when you have to make choices about how and where to use your resources, or how to allocate them. And on PC, you don't have target hardware, just brute force.
That's how APIs work. That's the entire point of their existence: to ensure compatibility at the cost of maximum performance. Console games can't go bare metal anymore, not unless devs are going to patch games in perpetuity. A game that runs on bare metal on the Xbox One won't work on the Xbox One X, for example. Next gen wouldn't be compatible with any games running on bare metal either.
 

mmorg

Neo Member
They aren't far more optimized anymore, because they use the same parts as PCs. Back in the day you had the PS2 CPU running at 300 MHz, but it was a better gaming machine than a 1 GHz Pentium 4 machine because it wasn't the same part; it was specified for games.

Nowadays that's not the case.

You really sound like a knowledgeable person about CPU and GPU power. You must be right, they are selling basically trash cans to us for over 400 dollars. Consoles are over. No more consoles, everybody go home and play on Switch.

I'll pretend I got goofed and this is a daily joke thread.
 
Last edited:

Whitecrow

Banned
That's how APIs work. That's the entire point of their existence: to ensure compatibility at the cost of maximum performance. Console games can't go bare metal anymore, not unless devs are going to patch games in perpetuity. A game that runs on bare metal on the Xbox One won't work on the Xbox One X, for example. Next gen wouldn't be compatible with any games running on bare metal either.
Why are you talking as if a game couldn't have multiple source codes for different machines?

APIs don't make the same code run on different machines by magic. That's the work of the software suite used to develop the game.

Like you said, the API is a layer between the code and the HW drivers, but it is not a layer between the code and all existing hardware; that's why each machine has its own API.
 
Last edited:
Why are you talking as if a game couldn't have multiple source codes for different machines?

APIs don't make the same code run on different machines by magic. That's the work of the software suite used to develop the game.

Like you said, the API is a layer between the code and the HW drivers, but it is not a layer between the code and all existing hardware.
You can have multiple different builds for the different machines... but if they were doing that, unpatched base Xbox One games would not work on the Xbox One X...
You're right, the way APIs work isn't magic, it's well documented. You provide a standardized layer so that you do not need to patch the game, and the GPU vendor can make any necessary changes in the driver instead. This is why ancient-as-fuck games and ancient-as-fuck versions of DirectX, say DX8, work on a 2080 Ti, but ancient-as-fuck drivers do not. The driver is the hardware-specific part of the chain. The point of the API is not to make hardware a complete non-factor but to ensure the game itself is hardware-agnostic. If you ensure the game is hardware-agnostic, then you do not need the source code to make it run on newer hardware that did not exist at the time; you can modify the driver instead.
 

Jigsaah

Gold Member
Wish I had a 1080ti...

It was the first viable 4K GPU, if I'm not mistaken. Coupled with a much better CPU in an enclosed console environment, it ultimately makes for a huge upgrade regardless. I mean, I don't know what you want. Going above that would raise the cost of the console considerably. And can you even get a 1080 Ti for $500 these days? I doubt it.
 