
Watch Dogs: Legion confirmed 4K/30 with RT on XSX/PS5

Uhm... in the early '90s consoles performed way better than PCs... back then PCs couldn't even really do smooth scrolling in 2D games without loads of programming trickery. And in the 16-bit era most games ran at 60fps.


Sure, back then they were dedicated gaming machines while PCs didn't even have dedicated graphics accelerators. That changed quickly by the mid '90s, though. But that's why I said shitty performance for the "past 3 decades".
 

01011001

Banned
Sure, back then they were dedicated gaming machines while PCs didn't even have dedicated graphics accelerators. That changed quickly by the mid '90s, though. But that's why I said shitty performance for the "past 3 decades".

But even there... even the graphics cards that were promoted for Quake 2, for example, ran the game at around 20-30fps.
I still say no... back then PC games also ran like shit and often worse than on console. The N64's GPU was way ahead of any PC at the time of launch.

It was during the jump to PS2/Xbox/GCN that the PC slowly overtook the consoles in terms of performance.
 
Last edited:
But even there... even the graphics cards that were promoted for Quake 2, for example, ran the game at around 20-30fps.
I still say no... back then PC games also ran like shit and often worse than on console. The N64's GPU was way ahead of any PC at the time of launch.

It was during the jump to PS2/Xbox/GCN that the PC slowly overtook the consoles in terms of performance.

No, they didn't run Quake that badly. You could run Quake 1 at 40 frames with a Voodoo 2, and using SLI already got you modern levels of framerate. The N64 ran GoldenEye at a handful of frames, Turok 2 at like 10 frames, and so on. And even if we discount 3D acceleration, PCs were outdoing consoles in memory and CPU even earlier. That's why you could have complex simulation games on PC even in the late '80s.

It definitely didn't take until the PS2 or Xbox for PCs to overtake consoles. They were overtaken in the mid '90s with the Voodoos. The PS2 was always weak; it never had an advantage over PCs. The Xbox was competitive for a few months, but was quickly overtaken by the GeForce 3 refresh in late 2001.
 

01011001

Banned
No, they didn't run Quake that badly. You could run Quake 1 at 40 frames with a Voodoo 2, and using SLI already got you modern levels of framerate. The N64 ran GoldenEye at a handful of frames, Turok 2 at like 10 frames, and so on. And even if we discount 3D acceleration, PCs were outdoing consoles in memory and CPU even earlier. That's why you could have complex simulation games on PC even in the late '80s.

It definitely didn't take until the PS2 or Xbox for PCs to overtake consoles. They were overtaken in the mid '90s with the Voodoos. The PS2 was always weak; it never had an advantage over PCs. The Xbox was competitive for a few months, but was quickly overtaken by the GeForce 3 refresh in late 2001.

Back then PCs were superior in some aspects and consoles in others.
And sure, with two Voodoo 2 cards you could run Quake at decent framerates. But a single Voodoo 2 card would cost you $250, so it is fair to say that most people would probably not run these in SLI; that was a super-enthusiast thing... it wasn't like nowadays, where even a low-end PC can give you above-console-level performance.

The Voodoo 2 was also released in 1998 and the N64 in '96, so really, it was to be expected that it should be a bit faster.

The truth was still that most people with realistically spec'ed PCs would run 3D games at around 30fps. The era of consistently above-60fps PC gaming only came around the time the PS2 launched.
 
Last edited:

Hawk269

Member
Is this native 4K, and is that 30fps pretty much locked? I don't mind it being 30fps if it is native 4K and that 30fps is locked. But if they are pushing native 4K and the game has drops, it won't be something I would want. 30fps is fine for many games if the 30fps is locked and the frame pacing is not an issue.
 
Ray tracing is not worth 30fps. I would prefer a mode that has the same resolution and visuals as One X, but with double the frames. All the extra power should be able to handle that.
 
It's cute that somebody actually asked about 120fps in a Ubisoft game.

 

base

Banned
Ray tracing is not worth 30fps. I would prefer a mode that has the same resolution and visuals as One X, but with double the frames. All the extra power should be able to handle that.
Precisely. I would vote for 1440p/60fps over 4K/30fps. The difference in picture quality is small, but the smoothness makes it worth it. No to 30fps.
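Quick back-of-the-envelope in Python to show what I mean (just the standard resolutions, nothing from the game itself):

# Pixels per frame and per second for each mode
modes = {
    "4K/30fps":    (3840, 2160, 30),
    "1440p/60fps": (2560, 1440, 60),
}
for name, (w, h, fps) in modes.items():
    print(f"{name}: {w*h/1e6:.1f} MP/frame, {w*h*fps/1e6:.0f} MP/s")
# 4K/30fps:    8.3 MP/frame, 249 MP/s
# 1440p/60fps: 3.7 MP/frame, 221 MP/s

Per second the GPU pushes roughly the same number of pixels either way; you're only trading spatial detail for smoothness.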
 

Tripolygon

Banned
4K/120fps? Haha, yeah right, with graphics from the '90s or only when you look at the ground :D
Checkerboard rendering. The game is 3072x1728cb at 60fps on XOX, so it's not hard to believe that they're doubling the fps with the much better CPU next-gen consoles have.
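For reference, the rough math (a Python sketch, assuming the usual checkerboard approach of shading about half the output pixels each frame):

# Checkerboard rendering shades ~half the output pixels per frame and
# reconstructs the rest from the previous frame.
def shaded_mp(w, h, fraction=1.0):
    # Megapixels actually shaded per frame
    return w * h * fraction / 1e6

print(shaded_mp(3840, 2160))        # native 4K: ~8.3 MP per frame
print(shaded_mp(3072, 1728, 0.5))   # 3072x1728 checkerboard: ~2.65 MP per frame

So on XOX the game shades roughly a third of the pixels of a native 4K frame, which is how it already hits 60.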
 
Last edited:

base

Banned
Checkerboard rendering. The game is 3072x1728cb at 60fps on XOX, so it's not hard to believe that they're doubling the fps with a much better CPU.
Or they are using some RDNA2 technique, an equivalent of DLSS. Some rumors pointed to it, but it still seems too good to be true.

4K/120fps? OK, but with limitations. Rainbow Six isn't a sandbox, it's not that demanding, but we all know Ubisoft and their promises. AC3, Unity, etc.

1440p is a perfect option.
 
Surprised by some of the negative reactions. I expected, and am still expecting, most open-world games to be 30fps.
I'll be happy to be wrong, but assuming that open-world games would be 60fps on these consoles seems a bit naive to me.

There's no reason to expect 30fps this gen since the CPU is powerful enough to push 60.
 

MrFunSocks

Banned
Someone should tell Ubisoft that they are wrong and in fact won't have ray tracing, because the Xbox APIs for ray tracing aren't finished. Would save them a lot of confusion.

There's no reason to expect 30fps this gen since the CPU is powerful enough to push 60.
CPU power has never been why games are 30fps instead of 60fps.

At 30fps devs get twice the time to render the frame, which means the frame can be much more complex.
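The budget math is trivial (tiny Python sketch):

# Time budget per frame at each target framerate
for fps in (30, 60, 120):
    print(f"{fps}fps -> {1000/fps:.1f} ms to render each frame")
# 30fps -> 33.3 ms, 60fps -> 16.7 ms, 120fps -> 8.3 ms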
 
Last edited:

01011001

Banned
CPU power has never been why games are 30fps instead of 60fps.

At 30fps devs get twice the time to render the frame, which means the frame can be much more complex.

CPU power was very much the reason many current-gen games ran at 30fps.
How do we know that? Because even the Xbox One X or PS4 Pro can't run games at a locked 60fps while running the same settings as the base consoles... that's how.
Example A: Hitman 2
Example B: God of War

These systems have either double the GPU power of the base systems (Pro) or even more than that (One X), yet these games can't reach double the performance in framerate mode while running at way lower settings.

In the case of Hitman 2, the PS4 Pro didn't even get a framerate mode... because it wasn't even worth doing on that system.
The One X, which has a slightly better CPU, has one, but still can't hold 60fps at about half the resolution and reduced settings of its 30fps graphics mode.

The only reason it has such a mode on One X is because that CPU is clocked high enough to get somewhat close to 60fps, making the mode kinda worth having for people using FreeSync Ultimate screens.

And God of War on Pro halves the effective resolution in framerate mode but still runs at close to 30fps in a lot of scenes, only going up to 50fps and higher in extremely rare moments, basically when only a very small room/area is loaded in.

That's not the jump in performance you would expect if the game was GPU bound, but very much what you expect from a CPU-bound game.

Another extreme example: Star Wars Jedi: Fallen Order.

Its framerate mode, which is basically base console settings, still drops below 30fps even on One X...
Yeah, totally not the CPU's fault.
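You can see the pattern with a toy model (Python, numbers completely made up for illustration, not measured from any of these games): a frame isn't done until both the CPU and the GPU are done, so frame time is roughly the max of the two.

# Toy model: frame time is bounded by the slower of CPU and GPU work.
def fps(cpu_ms, gpu_ms):
    # A frame ships only when both the CPU and GPU work for it is done
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=28, gpu_ms=32))  # ~31fps: a typical 30fps game
print(fps(cpu_ms=28, gpu_ms=16))  # ~36fps: GPU load halved, barely faster -> CPU bound

That's exactly what those framerate modes show: cut the GPU work in half and the framerate barely moves, because the Jaguar CPU is the wall.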
 
Last edited:

MrFunSocks

Banned
CPU power was very much the reason many current-gen games ran at 30fps.
How do we know that? Because even the Xbox One X or PS4 Pro can't run games at a locked 60fps while running the same settings as the base consoles... that's how.
Example A: Hitman 2
Example B: God of War

These systems have either double the GPU power of the base systems (Pro) or even more than that (One X), yet these games can't reach double the performance in framerate mode while running at way lower settings.

In the case of Hitman 2, the PS4 Pro didn't even get a framerate mode... because it wasn't even worth doing on that system.
The One X, which has a slightly better CPU, has one, but still can't hold 60fps at about half the resolution and reduced settings of its 30fps graphics mode.

The only reason it has such a mode on One X is because that CPU is clocked high enough to get somewhat close to 60fps, making the mode kinda worth having for people using FreeSync Ultimate screens.

And God of War on Pro halves the effective resolution in framerate mode but still runs at close to 30fps in a lot of scenes, only going up to 50fps and higher in extremely rare moments, basically when only a very small room/area is loaded in.

That's not the jump in performance you would expect if the game was GPU bound, but very much what you expect from a CPU-bound game.

Another extreme example: Star Wars Jedi: Fallen Order.

Its framerate mode, which is basically base console settings, still drops below 30fps even on One X...
Yeah, totally not the CPU's fault.
You're overlooking the GPU. With twice the rendering time, the GPU can do twice as much work. It's not a matter of being GPU bound; it's simply that you can do more GPU work in 33ms than you can in 16ms.


Devs choose to render games at 30fps because it lets them make the graphics and other things better. This is 1000000% fact. You cannot disagree with this. 33ms to render vs 16ms to render. Twice as much time.
 
Last edited:

01011001

Banned
You're overlooking the GPU. With twice the rendering time, the GPU can do twice as much work. It's not a matter of being GPU bound; it's simply that you can do more GPU work in 33ms than you can in 16ms.

Yes, but that's completely beside the point.
I just gave you examples of games that literally cannot hit 60fps even if the GPU load is cut in half or more.

Fallen Order can't even hold its framerate above 30 in framerate mode on One X, where the game runs at essentially base PS4 settings ON A 6TF GPU! And with a shitton more memory bandwidth as well.

Hitman 2, same situation: less than half the GPU load and it still struggles to hit 60fps.

God of War, literally half the GPU load + no checkerboarding, which would also take up GPU cycles...
and it still drops down to 30fps and rarely comes close to 60fps.

These games literally can't run at 60fps on current-gen systems because of the CPU.
 

MrFunSocks

Banned
Yes, but that's completely beside the point.
I just gave you examples of games that literally cannot hit 60fps even if the GPU load is cut in half or more.

Fallen Order can't even hold its framerate above 30 in framerate mode on One X, where the game runs at essentially base PS4 settings ON A 6TF GPU! And with a shitton more memory bandwidth as well.

Hitman 2, same situation: less than half the GPU load and it still struggles to hit 60fps.

God of War, literally half the GPU load + no checkerboarding, which would also take up GPU cycles...
and it still drops down to 30fps and rarely comes close to 60fps.

These games literally can't run at 60fps on current-gen systems because of the CPU.
But that's irrelevant.

Developers choose to make a game either 30fps or 60fps based on what they want to do graphically, essentially.

A game not being able to run at 60fps doesn't mean it was ever intended to run at 60fps, because at 60fps they would have had to downgrade the graphics compared to 30fps. Read the link that I posted.
 

regawdless

Banned
There's no reason to expect 30fps this gen since the CPU is powerful enough to push 60.

Only if you're not GPU limited. In the end, it's a design decision by the devs and not necessarily a hardware question. Remember that old console games were mostly 60fps (the 16-bit consoles, and even the PS2 had a lot of 60fps games).
But I believe that, especially for open-world games, devs will stick with throwing in as many effects as possible while targeting 30fps.

In other games, I think there'll be more options, like performance vs. eye-candy modes.
 

regawdless

Banned
1080p / 60fps / RT
4K (or upscaled) / 30fps / RT
4K (or upscaled) / 60fps / no RT

Basically every next-gen title should have these 3 options as standard; everyone's happy.

Except the devs, who have to optimize three settings for Series X, Series S, and PS5, in addition to the wild west that is PC gaming, AND possibly PS4/Xbox One/One X if cross-gen. And then the possible mid-gen updates like a PS5 Pro. :messenger_tears_of_joy:

I'm with you, though; having the choice as a consumer would be the best way. But this can very well lead to not all settings delivering the desired performance, as we have seen this gen when some games offered performance modes that failed to reach a stable 60fps.
Which, on the positive side, can be made more bearable with TVs that support variable refresh rates.
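Sketched out, the menu from that post would look something like this (purely hypothetical Python, the values are illustrative and not from any actual game):

# Hypothetical display-mode table; every row is one more configuration
# to optimize and QA on every console SKU.
DISPLAY_MODES = {
    "fidelity":    {"resolution": (3840, 2160), "target_fps": 30, "ray_tracing": True},
    "balanced":    {"resolution": (1920, 1080), "target_fps": 60, "ray_tracing": True},
    "performance": {"resolution": (3840, 2160), "target_fps": 60, "ray_tracing": False},
}

Multiply those three rows by Series X, Series S, PS5, and PC, and you see why devs might not love it.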
 
1080p / 60fps / RT
4K (or upscaled) / 30fps / RT
4K (or upscaled) / 60fps / no RT

Basically every next-gen title should have these 3 options as standard; everyone's happy.
Yup.

I think developers are feeling forced to put ray tracing in games to keep up with the Joneses right now.

I think they are soon going to feel just as forced to provide 60fps at some resolution when the backlash against 30fps shows up on PS5 and XSX.
 

Blond

Banned
I actually expected lower resolutions.


Anyone who even remotely thought baseline 60fps was going to be a thing in any gen isn't playing with a full deck. Add to that ray tracing and it's even more ridiculous.

Yep. The PS2 era was an anomaly, as even SNES and PS1 games were largely 30fps.
 

Ehtonk

Neo Member
Actually, I don't think this game looks good enough to justify being just 4K/30fps.
If it were a banger in terms of graphics, OK, I'd be fine with it. But it just looks like Watch Dogs 2 in London with RT reflections and a higher resolution. There isn't a big leap.

Most people don't even have a 4K TV. That's why Microsoft is bringing out the Series S.
The only things we get with the new Watch Dogs are higher resolution and better loading times.
Hurray. What's wrong with all these Ubi studios? Does nobody there play their own games anymore?

Are marketing buzzwords like 4K and ray tracing all that matter?
 

//DEVIL//

Member
30fps... if that is going to be the trend, I'll just stay on PC. Fuck this.

Not playing games at 30fps when I have been playing at 60fps+ on PC for 10 years.

This is a next-gen console. Give me a 2K/60fps option. What is the point of the game looking pretty if it runs like shit?
 