mcjmetroid
Member
I know PC games are always going to have more options than console games; that's a given, considering the many possible hardware combinations. Console graphics settings are often limited to simple toggles, like "film grain" on or off.
However, why not let them have more? On PC, if you have a 10-year-old card, you can still play most new games at 60fps by taking a massive hit to graphical settings and resolution. Consoles now tend to offer a choice between 30 and 60, but that will probably stop once cross-gen games disappear. Why not let a console user adjust the resolution and at least pick a graphics detail preset so they can hit that 60fps?
Is there a certification test from Sony etc. that prevents developers from putting options like 30fps or 60fps in every game, in case a user breaks the game by accident?
I mean, if there's a "revert to default settings" option clearly placed on the screen, then it should be fine, right?