
Digital Foundry: Saints Row PC Tech Preview: 2022 Reboot Shines on PC – And Hints Towards PS5/Series X Features

adamsapple

Or is it just one of Phil's balls in my throat?



Saints Row is coming out next month, so how is the PC version looking so far - and what can we learn about the game on PS5 and Series X? Tom Morgan attended a preview event to find out, and came back with word of a promising reboot that uses an evolution of Volition's in-house engine.



-



Text Article:



By the time the game releases in the latter stages of August, we'll be back with a full tech breakdown of not just the final PC release build, but also the situation on console. My hope is that given the settings scalability we're seeing here, there's room for a 4K 30fps mode with RT enabled on PS5 and Series X, as well as a 60fps mode without RT and perhaps a few other graphics tweaks. It'll be fascinating to see how the experience translates to last-gen machines too - will it be 1080p 30fps, or something different? Ultimately, we'll have to wait and see. For now though, this year's Saints Row reboot has huge promise as a sandbox open world title, with impressive options for character customisation, mission types, and underlying tech that should hopefully scale well to consoles old and new.
 

intbal

Member
[image]

[image]
 

M1chl

Currently Gif and Meme Champion
Well, if a 3080 struggles to hit 60fps in an open world, I wonder how it will fare on consoles...
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
This is looking better and better with every preview I see.
So glad they've gone back to the SR2 style.
SR4 and GOOH went a little too far for my liking.

It's gonna be quite heavy at 4K it seems.
The particle effects look pretty damn good for an open-world game; I wouldn't have expected that much in such a game.
I'll see how far I can push my PC at ultrawide 3440x1440 cuz I want all the effects... all of them!

LOL Fuck me 3080 dead already !!
RTX 3080s have been struggling with 4K for a while now.
4K is a shit ton of pixels to push at Ultra settings in pretty much any modern game.
The game also has RTAO in an open world, so that's a lot of coverage; no surprise it's quite taxing.

Well, if a 3080 struggles to hit 60fps in an open world, I wonder how it will fare on consoles...
They won't be running the game at Ultra settings with RTAO enabled... or it'll be 30fps Ultra, 60fps medium/high... like pretty much every other recent title.
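For scale, the pixel-count arithmetic behind that claim is easy to check. A throwaway sketch (resolutions taken from the thread, nothing game-specific):

```python
# Per-frame pixel counts for the resolutions mentioned in the thread,
# and how much more work 4K represents relative to each.

def pixels(width, height):
    return width * height

resolutions = {
    "1080p": (1920, 1080),
    "ultrawide 1440p": (3440, 1440),
    "4K": (3840, 2160),
}

four_k = pixels(3840, 2160)  # 8,294,400 pixels per frame

for name, (w, h) in resolutions.items():
    p = pixels(w, h)
    print(f"{name}: {p:,} pixels; 4K pushes {four_k / p:.2f}x as many")
```

4K is exactly 4x the pixels of 1080p and roughly 1.67x an ultrawide 1440p panel, which is why a card that holds 60fps at 3440x1440 can still fall well short at 2160p.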
 

M1chl

Currently Gif and Meme Champion
This is looking better and better with every preview I see.
So glad they've gone back to the SR2 style.
SR4 and GOOH went a little too far for my liking.

It's gonna be quite heavy at 4K it seems.
The particle effects look pretty damn good for an open-world game; I wouldn't have expected that much in such a game.
I'll see how far I can push my PC at ultrawide 3440x1440 cuz I want all the effects... all of them!


RTX 3080s have been struggling with 4K for a while now.
4K is a shit ton of pixels to push at Ultra settings in pretty much any modern game.
The game also has RTAO in an open world, so that's a lot of coverage; no surprise it's quite taxing.


They won't be running the game at Ultra settings with RTAO enabled... or it'll be 30fps Ultra, 60fps medium/high... like pretty much every other recent title.
It's 4K native, I forgot. Somehow I thought it was running with DLSS. Never mind...
 

MidGenRefresh

*Refreshes biennially
Lol. If this simpleton game is tapping the 10 gigs of the 3080 at 4K, then fuck, I made the right choice to get a 6900 XT instead. Jesus

It has nothing to do with VRAM and it's clearly evident from the video. At max settings this game is not even using 7GB of VRAM.

It's just poorly optimised.
 

//DEVIL//

Member
It has nothing to do with VRAM and it's clearly evident from the video. At max settings this game is not even using 7GB of VRAM.

It's just poorly optimised.
I got bored halfway and stopped watching the video. The game "looks" like shit. I was hoping watching more videos would change my mind, but nope. Still PS3-level graphics, but at 4K 60 frames or whatever. I didn't see where it's 7 gigs; he said tapping the 10 gigs of the 3080 and I was like LoL.

But usually Saints Row games are not about graphics but the pure fun, so I will wait for reviews.

With that being said, of course the game is poorly optimized. I do not expect much from woke developers.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Lol. If this simpleton game is tapping the 10 gigs of the 3080 at 4K, then fuck, I made the right choice to get a 6900 XT instead. Jesus
Err what?
[image]

Not even using 8GB, so an RTX 3070 Ti should be fine.

P.S. It's a game with RT and you're happy you picked an RX over an RTX?
hahaha sure jan
 

//DEVIL//

Member
Err what?
[image]

Not even using 8GB, so an RTX 3070 Ti should be fine.

P.S. It's a game with RT and you're happy you picked an RX over an RTX?
hahaha sure jan
Please read my previous post as I explained what I said.

And yes, for $750 CAD the XFX 6900 XT is the best GPU buy I've made in recent gaming-hardware purchases. But I'm one of those who don't care about ray tracing at all (because 60 frames is not enough for me these days; 90 or above, as I have a 4K 144Hz monitor).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I got bored halfway and stopped watching the video. The game "looks" like shit. I was hoping watching more videos would change my mind, but nope. Still PS3-level graphics, but at 4K 60 frames or whatever. I didn't see where it's 7 gigs; he said tapping the 10 gigs of the 3080 and I was like LoL.

Please read my previous post as I explained what I said.

And yes, for $750 CAD the XFX 6900 XT is the best GPU buy I've made in recent gaming-hardware purchases. But I'm one of those who don't care about ray tracing at all (because 60 frames is not enough for me these days; 90 or above, as I have a 4K 144Hz monitor).

Tapping is another way of saying using.
So he is using a 10GB RTX 3080, just differentiating it from the 12GB RTX 3080.
The game barely works the VRAM of an RTX 3070 Ti.

4K90 and above: are you assuming an RTX 3080 can't do that? Cuz even in pure raster, the RTX 3080 and 6900 XT have the same performance at 4K.
The RTX 3080 just has the benefit of being able to do ray tracing well.

I don't understand what this comment is implying:
Lol. If this simpleton game is tapping the 10 gigs of the 3080 at 4K, then fuck, I made the right choice to get a 6900 XT instead. Jesus
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Using RTAO is a welcome addition. Using ray tracing for reflections is low-hanging fruit.
In an open-world game, no less.
Dream scenario is RTGI for RTX 4000 owners.
The game has some form of GI already; let us push our systems even further.
 

DenchDeckard

Moderated wildly
Please read my previous post as I explained what I said.

and Yes. for 750$ CAD xfx 6900xt is the best GPU buy I ever did in recent gaming hardware related purchases. But I am one of those who do not care about ray tracing at all ( because 60 frames is not enough for me these days. 90 or above as I have a 4k 144 monitor )

But RTAO makes the game look so much better, so you are going to want that turned on. I hope your 6900 XT can handle it.
 

Mister Wolf

Member
In an open-world game, no less.
Dream scenario is RTGI for RTX 4000 owners.
The game has some form of GI already; let us push our systems even further.

I wonder which would produce an image closer to "ground truth": their light-probe GI paired with RTAO, or using ray tracing to make their probes more accurate, like how Returnal and Chernobylite utilize ray tracing.
 
In an open-world game, no less.
Dream scenario is RTGI for RTX 4000 owners.
The game has some form of GI already; let us push our systems even further.

RTAO covers a slice of what full RTGI does, if I'm not mistaken. The performance hit is not as bad either; I'd love to see it adopted more widely across games.
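For intuition on what ray-traced ambient occlusion actually computes, here's a toy sketch of the technique in pure Python (my own illustration, nothing to do with Volition's engine): at a surface point, fire rays over the upper hemisphere and measure how much of the "sky" is blocked by nearby geometry.

```python
import math
import random

def hits_sphere(origin, direction, center, radius, max_dist):
    # Standard ray-sphere intersection for a unit-length direction.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_dist

def ambient_occlusion(point, occluders, samples=2000, max_dist=5.0):
    # Fraction of hemisphere rays (surface normal = +Y) that escape without
    # hitting an occluder: 1.0 = fully open sky, lower = more occluded.
    rng = random.Random(42)  # fixed seed so the estimate is repeatable
    unoccluded = 0
    for _ in range(samples):
        # Uniform hemisphere sampling via rejection; real renderers would use
        # cosine-weighted sampling for less noise.
        while True:
            d = [rng.uniform(-1.0, 1.0) for _ in range(3)]
            norm = math.sqrt(sum(x * x for x in d))
            if 1e-6 < norm <= 1.0 and d[1] > 0.0:
                d = [x / norm for x in d]
                break
        if not any(hits_sphere(point, d, c, r, max_dist) for c, r in occluders):
            unoccluded += 1
    return unoccluded / samples

open_sky = ambient_occlusion([0.0, 0.0, 0.0], occluders=[])
shadowed = ambient_occlusion([0.0, 0.0, 0.0], occluders=[([0.0, 1.5, 0.0], 1.0)])
print(open_sky, shadowed)  # the point under the sphere comes out darker
```

This is only the occlusion term: it darkens creases and contact areas but doesn't transport any bounced light, which is why full RTGI costs so much more. The expensive part in a real open world is the same loop run per pixel against the whole scene's acceleration structure.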
 

ANDS

King of Gaslighting
That real-time graphics update gimmick needs to be something that more developers put into their games.

It has nothing to do with VRAM and it's clearly evident from the video. At max settings this game is not even using 7GB of VRAM.

It's just poorly optimised.

A poorly optimized... preview build. Never change, you crazy diamonds.
 

M1chl

Currently Gif and Meme Champion
It's truly awful and people itt are just talking about how hard this turd is to run lol

I mean its graphics remind me of rage 2, with a similarly pukey art style, calm down it's not the graphics card's fault haha.
Exactly what I was thinking. Rage 2 had godlike gameplay, in the sense of the shooting and shit like that (the game was boring due to the world). This doesn't seem anywhere close to that in terms of gameplay loop.
 

intbal

Member
To be fair, there isn’t much being released that they can really do a deep dive on.

They have never done a video for Elex 1 or 2. Piranha Bytes has their own engine, and it's an open-world action RPG: exactly the kind of thing they usually look at. But they haven't.

9th gen consoles just got an update with new performance and filtering options (TAA and FXAA) in PUBG. Also added a gargantuan new map with pretty impressive verticality.
I figure that's worth a quick video. But then they'd have to be regular players of that game to be able to remain in a match long enough to do any analysis. And I bet none of them have launched PUBG in two years.
 

//DEVIL//

Member
But RTAO makes the game look so much better, so you are going to want that turned on. I hope your 6900 XT can handle it.
Even so, I am still not interested in ray tracing. When a game loses so many frames because of ray tracing, it's something I'm not interested in; I would rather have a locked high framerate than ray tracing, as I do notice frame dips.

I don't care if the 6900 XT can handle it or not (I had a 3090 and sold it for profit when I had the chance, before the GPU crash). I will probably buy the 4090 when it comes out if the price is within an acceptable range (I do not want to pay more than $2000 CAD plus tax for the FE). I don't favor AMD over Nvidia or the other way around; I bought the 6900 XT, as I mentioned, for around $500 or so (an estimated conversion from what I paid in Canadian, and technically I got it almost for free just from the profit I made selling the 3090).
 

//DEVIL//

Member
Tapping is another way of saying using.
So he is using a 10GB RTX 3080, just differentiating it from the 12GB RTX 3080.
The game barely works the VRAM of an RTX 3070 Ti.

4K90 and above: are you assuming an RTX 3080 can't do that? Cuz even in pure raster, the RTX 3080 and 6900 XT have the same performance at 4K.
The RTX 3080 just has the benefit of being able to do ray tracing well.

I don't understand what this comment is implying:
I think there is a misunderstanding here. My comment about thank god I got a 6900 XT was about VRAM, not about the performance of the 6900 XT vs the 3080. In my mind, if this ugly-looking game is tapping the 3080's VRAM (I honestly thought it was close to 9-something when he said that), then what happens when real next-gen games come out? That 10 gigs isn't enough, which is why I said what I said about the 6900 XT.

To me, they're kinda both the same in terms of performance. Maybe the 6900 XT edges it a bit, while the other has better ray tracing. But I believe the 16 gigs is more important to me than ray tracing (even the 4080 is 16 gigs).

To each his own, I guess. But they're all good cards at the end of the day.
 

Sleepwalker

Gold Member
Time to maybe grab a 3090 or 3090 Ti now that they're plummeting, as I'm sure the 4000 series will be aids to get, and I plan to get the QD-OLED monitor when it's available.


Absolutely no interest in this game, though.
 

//DEVIL//

Member
Time to maybe grab a 3090 or 3090 Ti now that they're plummeting, as I'm sure the 4000 series will be aids to get, and I plan to get the QD-OLED monitor when it's available.


Absolutely no interest in this game, though.
For the love of god, unless the 3090 is $800 or less, do NOT go near it. The 4070 will have better performance than the 3090 Ti.

Just... don't.
 

MidGenRefresh

*Refreshes biennially
A poorly optimized... preview build. Never change, you crazy diamonds.

The game went gold and it's releasing in less than a month? Are you suggesting the footage for all these preview videos was recorded on some kind of older build?
 

BadBurger

Is 'That Pure Potato'
I was going to go day one on PS5; now I think I'll wait for patches and eventually go with PC. Crummy news.
 
Not sure what’s so amazing about this graphically. Certainly not seeing anything that should stress out my 3080 either.

Saints Row has always been over the top but the character designs and style in general is just not my thing. Probably give this one a pass.
 
I think the worst part about this is that it's the same game we've been playing for the past 20 years since GTA 3. The industry is boring af.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I think there is a misunderstanding here. My comment about thank god I got a 6900 XT was about VRAM, not about the performance of the 6900 XT vs the 3080. In my mind, if this ugly-looking game is tapping the 3080's VRAM (I honestly thought it was close to 9-something when he said that), then what happens when real next-gen games come out? That 10 gigs isn't enough, which is why I said what I said about the 6900 XT.

To me, they're kinda both the same in terms of performance. Maybe the 6900 XT edges it a bit, while the other has better ray tracing. But I believe the 16 gigs is more important to me than ray tracing (even the 4080 is 16 gigs).

To each his own, I guess. But they're all good cards at the end of the day.
While I do agree 10GB/12GB is looking to become a bottleneck in the future, as is, for a two-year-old card about to be replaced the RTX 3080 did and does a stellar job; games even at 4K rarely, if ever, actually need 10GB or more. They might commit more, but they don't actually use more.
Watch Dogs Legion, for instance, only uses around 8GB of VRAM at 4K Ultra with DXR.

I'm sure if AMD could go back in time and give the 6900 XT a 384-bit bus with less VRAM (12GB vs 16GB), they would jump at the opportunity.
At those speeds it would have been a true powerhouse card.
Heck, the 7900 XT is expected to use a 384-bit bus, cuz I'm sure AMD have figured out that at higher resolutions Infinity Cache is nice, but having an actually fast and wide bus just gets the job done.
The RTX 4090 vs 7900 XT battle is going to be glorious, as they both have large caches and are both on 384-bit.

And I'm guessing you just misunderstood his comment; tapping just means using, not tapping out as in finished.

Further, I don't expect the VRAM of the 6000 XT series to matter before the chips actually start gassing out; both the 6000 XTs and RTX 3000s are already struggling at native 4K.
Time will tell, of course, but I made this bet at the start of the Ampere generation and to date I've been proven right: 10GB has not been a true bottleneck for the RTX 3080, and I don't expect it to become one until the chip is legit just out of gas... when games use 10GB of VRAM at 1080p, that's when we can say the RTX 3080 is tapped out.
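The commit-vs-use point can be given a rough sense of scale: the full-screen render targets a renderer holds at 4K are small next to a 10GB pool. Everything below is my own back-of-envelope assumption for illustration (a generic deferred-style target list), not the game's actual buffer layout:

```python
# Estimate the VRAM cost of a hypothetical set of full-screen render targets
# at 4K. Formats and the target list are illustrative assumptions; most VRAM
# in a real game goes to streamed textures and geometry, which engines often
# reserve generously without actually touching all of it.

WIDTH, HEIGHT = 3840, 2160

def target_mib(bytes_per_pixel):
    return WIDTH * HEIGHT * bytes_per_pixel / (1024 ** 2)

targets = {
    "albedo (RGBA8)": 4,
    "normals (RGB10A2)": 4,
    "material params (RGBA8)": 4,
    "depth/stencil (D32S8)": 5,
    "HDR colour (RGBA16F)": 8,
    "motion vectors (RG16F)": 4,
}

total = sum(target_mib(bpp) for bpp in targets.values())
for name, bpp in targets.items():
    print(f"{name}: {target_mib(bpp):.1f} MiB")
print(f"total: {total:.0f} MiB")
```

Even this generous list lands around a couple hundred MiB, which is why a game can "commit" most of a 10GB card while its working set, as the Watch Dogs Legion figure above suggests, stays far smaller.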
 