
Digital Foundry, Alan Wake 2 PC - Rasterisation Optimised Settings Breakdown - Is It Really THAT Demanding?

Mr Moose

Member
It doesn't really matter, I don't think you're lying. If he said that, he was either wrong or taken out of context. According to the screenshot above, he said the same thing I'm saying now. He would be contradicting himself.

Regardless of what Alex says, anyone can verify that you will not lose a single frame using adaptive or normal vsync.
He's talking about vsync on and off in that pic; I am talking about adaptive vs. triple buffered.
I forgot which video it was but I'll have a look see if I can find it.
 

Mr Moose

Member
HtuUuKc.png


He's saying the exact same thing as Vergil1992
It doesn't really matter, I don't think you're lying. If he said that, he was either wrong or taken out of context. According to the screenshot above, he said the same thing I'm saying now. He would be contradicting himself.

Regardless of what Alex says, anyone can verify that you will not lose a single frame using adaptive or normal vsync.


Found it.
 

Vergil1992

Member
He's talking about vsync on and off in that pic; I am talking about adaptive vs. triple buffered.
I forgot which video it was but I'll have a look see if I can find it.
But the thing is, what he says doesn't make sense. Comparing vsync on and off gives exactly the same result as comparing triple-buffered vs. adaptive vsync (in terms of performance). In the video he seems to be talking about vsync on vs. off.

When you use adaptive vsync and the frame rate drops below the target, vsync is disabled. With triple-buffered vsync, it stays enabled. There is no performance difference; it is exactly like comparing vsync on vs. off when the frame rate is below the target.

In any case, no matter what he said, I don't agree with Alex. Unless there is some bug in the implementation that breaks the game, it doesn't matter if it is adaptive or "standard"; the performance will be the same. I don't know why he says that in the video.
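
A rough conceptual sketch of that logic in Python (just an illustration with made-up mode names and a 60 Hz target, not any real swap-chain API):

import math

REFRESH = 1 / 60  # one refresh interval on a 60 Hz display, in seconds

def effective_frame_time(render_time, mode):
    # How long a frame effectively stays on screen under each sync mode.
    if mode == "vsync_off":
        return render_time  # no synchronization, tearing possible
    if mode == "adaptive":
        # Sync only while the render rate keeps up; below target, behave like vsync off.
        return REFRESH if render_time <= REFRESH else render_time
    if mode == "double_buffered":
        # Classic vsync: a missed vblank costs a whole extra interval (e.g. 20 ms -> 30 fps).
        return math.ceil(render_time / REFRESH) * REFRESH
    if mode == "triple_buffered":
        # Extra back buffer keeps the GPU busy; average throughput tracks the render rate.
        return max(render_time, REFRESH)
    raise ValueError(mode)

for render_time in (0.014, 0.020):  # 14 ms (above target) and 20 ms (below target)
    for mode in ("vsync_off", "adaptive", "double_buffered", "triple_buffered"):
        fps = 1 / effective_frame_time(render_time, mode)
        print(f"{mode:16s} render={render_time * 1000:.0f} ms -> ~{fps:.0f} fps")

In this simplified model, adaptive and triple-buffered vsync give the same throughput once the frame time misses the target; only classic double-buffered vsync collapses to 30fps.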


Example:

Vsync on:

Assassin-s-Creed-Odyssey-04-09-2023-23-26-58.png


Adaptive vsync:

Assassin-s-Creed-Odyssey-04-09-2023-23-27-35.png
 

YCoCg

Member
Does Alan Wake 2 on consoles use nearly the equivalent of LOW/MEDIUM PC settings?

Yes.

Does Alan Wake 2 still look amazing despite that?

Yes.

As for the whole mesh shaders thing and the game not being viable to play on a GTX 10 series or lower, I get it, but those cards are over seven years old now and there always comes a point when it's time to aim for higher hardware. GPU prices do suck at the moment, but we can't expect studios to sit around and stay on older hardware. Even though that's what the majority do, it's nice to have something on PC actually designed for "modern" hardware.
 

Gaiff

SBI’s Resident Gaslighter


Found it.

In that game specifically, which has about 10 modes, including an HFR Vsync mode that is seemingly broken. As I said before, it can happen at times due to issues with the game's renderer, but this isn't expected behavior. As far as I could tell, it doesn't happen in Alan Wake, nor does it in most games.

Alex and John are talking about the Vsync in this game's HFR mode, not in general.
 

TwinB242

Member
I skimmed through the video. Seems unfair to just compare it to PS5's performance mode and not quality....
 

S0ULZB0URNE

Member
Where are the examples of this? I have not seen any signs of DF hating on PS.

But I have seen people take DF comments out of context...


He doesn't just complain about PC games, he praises the good PC versions like Alan Wake 2 :messenger_smiling_with_eyes:
I am not watching through the countless videos where they either take a jab at PlayStation or, when PlayStation has the advantage, say it's minor.

It needs DLSS with a 4090...
Either the 4090 isn't up to par or it's not a good port.
 

Leonidas

Member
I am not watching through the countless videos where they either take a jab at PlayStation or, when PlayStation has the advantage, say it's minor.
You can't even name one example :messenger_tears_of_joy:

It needs DLSS with a 4090...
Either the 4090 isn't up to par or it's not a good port.
You don't need DLSS with a 4090, but you'd be an idiot to not turn it on. DLSS Quality often looks equal to and sometimes even better than native.

This is a great PC version. An RTX 3070 (a mid-range GPU from 3 years ago) is running 43-46% faster than the PS5. That's better than usual.
 

S0ULZB0URNE

Member
You can't even name one example :messenger_tears_of_joy:


You don't need DLSS with a 4090, but you'd be an idiot to not turn it on. DLSS Quality often looks equal to and sometimes even better than native.

This is a great PC version. An RTX 3070 is running 43-46% faster than the PS5. That's better than usual.
I did give an example.
When PS wins in an area they say it's slight, but when Xbox (for example) has the advantage it's bigger.
They've been suspect for years, as many notice.


"With DLSS disabled, Nvidia's benchmarks reveal that frame rates plummet to just 32.8 fps on the RTX 4090, 22.2 fps on the RTX 4080, and 17.4 fps on the 4070 Ti."


Yeah no
 

Leonidas

Member
I did give an example.
When PS wins in an area they say it's slight, but when Xbox (for example) has the advantage it's bigger.
They've been suspect for years, as many notice.
Link me to the vids where this occurs.


"With DLSS disabled, Nvidia's benchmarks reveal that frame rates plummet to just 32.8 fps on the RTX 4090, 22.2 fps on the RTX 4080, and 17.4 fps on the 4070 Ti."
4K isn't the only resolution of PC displays. 4090 runs maxed with path tracing at native 1440p at around 60 FPS average (much faster than PS5 Quality mode, which uses much lower settings).
 

Gaiff

SBI’s Resident Gaslighter
It needs DLSS with a 4090...
Either the 4090 isn't up to par or it's not a good port.
"With DLSS disabled, Nvidia's benchmarks reveal that frame rates plummet to just 32.8 fps on the RTX 4090, 22.2 fps on the RTX 4080, and 17.4 fps on the 4070 Ti."


Yeah no
Are you purposefully glossing over the part where this is with path tracing?

performance-3840-2160.png


Max settings 4K, most cutting-edge game on the market right now. Where does it "need" DLSS? It doesn't, unless you throw ray tracing and path tracing at it. Now, do you "need" those features? No, you don't. If you want them, go right ahead and enable DLSS and Frame Generation. You'd be a retard not to take advantage of those features.

Furthermore, I don't even think this is a PC port. More than likely, it was ported to consoles, assuming they even did a port.
 

S0ULZB0URNE

Member
Are you purposefully glossing over the part where this is with path tracing?

performance-3840-2160.png


Max settings 4K, most cutting-edge game on the market right now. Where does it "need" DLSS? It doesn't, unless you throw ray tracing and path tracing at it. Now, do you "need" those features? No, you don't. If you want them, go right ahead and enable DLSS and Frame Generation. You'd be a retard not to take advantage of those features.

Furthermore, I don't even think this is a PC port. More than likely, it was ported to consoles, assuming they even did a port.
Link me to the vids where this occurs.


4K isn't the only resolution of PC displays. 4090 runs maxed with path tracing at native 1440p at around 60 FPS average (much faster than PS5 Quality mode, which uses much lower settings).
"At 1440p with the same graphical settings, the RTX 4090 was able to pump out 170.1 fps, 4080 133.1 fps, 4070 Ti 108.3 fps, and 4070 91.9 fps with DLSS 3.5 enabled. Without Nvidia's performance-enhancing tech enabled, frame rates plummeted to 62.7 fps on the RTX 4090, 44.6 fps on the 4080, 35.3 fps on the 4070 Ti, and 28 fps on the 4070."

63 fps at 1440p with a 4090 isn't good.
 

Gaiff

SBI’s Resident Gaslighter
"At 1440p with the same graphical settings, the RTX 4090 was able to pump out 170.1 fps, 4080 133.1 fps, 4070 Ti 108.3 fps, and 4070 91.9 fps with DLSS 3.5 enabled. Without Nvidia's performance-enhancing tech enabled, frame rates plummeted to 62.7 fps on the RTX 4090, 44.6 fps on the 4080, 35.3 fps on the 4070 Ti, and 28 fps on the 4070."
Yeah, that's with path tracing you troll.
63 fps at 1440p with a 4090 isn't good.
Says who? What are you comparing it to? How many games using path tracing are out there? What is the benchmark for path traced games?
 

Bojji

Member
"At 1440p with the same graphical settings, the RTX 4090 was able to pump out 170.1 fps, 4080 133.1 fps, 4070 Ti 108.3 fps, and 4070 91.9 fps with DLSS 3.5 enabled. Without Nvidia's performance-enhancing tech enabled, frame rates plummeted to 62.7 fps on the RTX 4090, 44.6 fps on the 4080, 35.3 fps on the 4070 Ti, and 28 fps on the 4070."

63 fps at 1440p with a 4090 isn't good.

This is with path tracing, the most expensive (and realistic) thing we have in graphics right now. What do you expect?
 

S0ULZB0URNE

Member
Yeah, that's with path tracing you troll.

Says who? What are you comparing it to? How many games using path tracing are out there? What is the benchmark for path traced games?
Here comes the PCMR fanboy name-calling smh

It's a 4090!
Lol @ any of y'all buying a 4090 to play in 1440p.
 
I did give an example.
When PS wins in an area they say it's slight, but when Xbox (for example) has the advantage it's bigger.
They've been suspect for years, as many notice.



"With DLSS disabled, Nvidia's benchmarks reveal that frame rates plummet to just 32.8 fps on the RTX 4090, 22.2 fps on the RTX 4080, and 17.4 fps on the 4070 Ti."


Yeah no

"As many notice"

Many = some GAF users and a few Twitter troll warriors maybe lol. I haven't seen the DF guys being called out for hating PlayStation or being Xboxers in any other forum.

But then that Spanish guy who posts game version comparisons is also banned here for being an Xboxer? Or an anti-PS guy, I guess.

People in this forum especially see enemies of PlayStation everywhere rofl
 

Bojji

Member
I hear that but.....
With single player games?

I want 4K and as many bells and whistles as possible.

And you get that; the 4090 is above 30fps with fully maxed-out settings at native 4K.

On PS5 you have 30fps at sub-1440p resolution without any RT and many settings on medium.
 

S0ULZB0URNE

Member
Why don't you have an RTX 4090 then?
Because of the 4090 Super and Ti rumors for early next year, and because of the rumored 16GB 4070 variant.
I am putting together a build for me and my son, with him getting the 4070.
And you get that; the 4090 is above 30fps with fully maxed-out settings at native 4K.

On PS5 you have 30fps at sub-1440p resolution without any RT and many settings on medium.
Right
 

CamHostage

Member
Alan Wake 2 is now running on Steam Deck.



(At launch, it would not even boot on Deck, but users have figured out a solution and it's now playable.)

Some severe downsides in how it performs and looks, but also a few surprises in how it works out on the PC handheld (generally in the <30FPS range with liveable low/mid settings). This video shows a few gameplay areas, then goes into the methods to get it to boot.
 

CrustyBritches

Gold Member
I hear that but.....
With single player games?

I want 4K and as many bells and whistles as possible.
Different strokes for different folks. On PC or PS5 I'm going with 60fps, or 120fps if I can get it. It's not even purely about input latency reduction. Fidelity modes look OK until you move the camera; then they look worse than Performance, imo. There's a certain clarity you get from smoother camera movement with less intrusive motion blur due to the higher frame rate.

I don't have Alan Wake 2 yet. I'd prefer it on PC over PS5, but being EGS exclusive is a bummer. If I played on PS5 I'd go for Performance mode, and on PC I'd be shooting for 60-70fps and then using FG to take it over 100fps. I play on a 32" 1440p/165Hz monitor and the difference between 1080p and 1440p isn't massive.

P.S.- I don't have a 4090. I'm playing on a 4060Ti 16GB. I wanted the most memory I could get for fucking around with hobbyist projects, so I'm using this until I can afford a 4080 or 4090.
 

Gaiff

SBI’s Resident Gaslighter
Lysandros The captures were done with a 3600 with Vsync Off.

kigG7E8.jpg


The 3070 is unusually strong here, comfortably beating the 2080 Ti when they're usually within 1-5% of one another, with the 2080 Ti even beating it sometimes. The 2080 Ti is 29% faster than the PS5, so within the norm. The PS5 is in turn 9% faster than the 2070S, again within normal range. The 3070 is a bit of an outlier, albeit not a massive one.
 

winjer

Gold Member
Lysandros The captures were done with a 3600 with Vsync Off.

kigG7E8.jpg


The 3070 is unusually strong here, comfortably beating the 2080 Ti when they're usually within 1-5% of one another, with the 2080 Ti even beating it sometimes. The 2080 Ti is 29% faster than the PS5, so within the norm. The PS5 is in turn 9% faster than the 2070S, again within normal range. The 3070 is a bit of an outlier, albeit not a massive one.

Could be a matter of pixel fill rate.
The PS5 has a higher pixel fill rate than the 2070S, the 2080 Ti higher than the PS5, and the 3070 higher than the 2080 Ti.
 

Bojji

Member
Could be a matter of pixel fill rate.
The PS5 has a higher pixel fill rate than the 2070S, the 2080 Ti higher than the PS5, and the 3070 higher than the 2080 Ti.

The 2080 Ti is just a more powerful GPU overall (than the PS5 and 2070S). The difference between the 3070 and 2080 Ti is most likely driver or game optimizations for the newer architecture.
 

winjer

Gold Member
The 2080 Ti is just a more powerful GPU overall (than the PS5 and 2070S). The difference between the 3070 and 2080 Ti is most likely driver or game optimizations for the newer architecture.

With the game using mesh shaders, it means a ton of triangles being drawn.
Now consider that rasterization is the process of converting geometric primitives (points, lines, triangles) into pixels.
This means we are rasterizing a lot more pixels than usual, hitting the ROPs a lot harder.
 
He should focus these videos on PC: how they run in different configurations, what the recommended settings are for different GPUs, whether the game is well optimized, and things like that. But that would be asking too much of Alex; what he likes is to compare the PS5 with PC to make a fool of the console, and that's it. It's an absurd comparison.
 

Bojji

Member
With the game using mesh shaders, it means a ton of triangles being drawn.
Now consider that rasterization is the process of converting geometric primitives (points, lines, triangles) into pixels.
This means we are rasterizing a lot more pixels than usual, hitting the ROPs a lot harder.

With mesh shaders, who knows; this is the first game using them. Usually the 3070 and 2080 Ti are on par outside of some RT workloads.

He should focus these videos on PC: how they run in different configurations, what the recommended settings are for different GPUs, whether the game is well optimized, and things like that. But that would be asking too much of Alex; what he likes is to compare the PS5 with PC to make a fool of the console, and that's it. It's an absurd comparison.

He wants to know what settings the console versions use to get the best "bang for the buck" settings that, according to the developers themselves, are the most optimal. For Alan Wake this doesn't work that well because Remedy set some things too low and they look ugly, but for most games it works really well.

Quality mode settings are Medium/High, with the odd setting on Ultra.

He shows that in the video; the reason he used Performance was to benchmark the game on PC with identical settings.

You guys are crazy 🤣

Yeah, hate for DF and Alex here is quite stupid. It's probably thanks to him that shader stuttering was even brought to light and now it's not a problem in most new releases.
 

Gaiff

SBI’s Resident Gaslighter
He should focus these videos on PC: how they run in different configurations, what the recommended settings are for different GPUs, whether the game is well optimized, and things like that. But that would be asking too much of Alex; what he likes is to compare the PS5 with PC to make a fool of the console, and that's it. It's an absurd comparison.
This video was to address the system requirements controversy. "WTF, unoptimized game! Consoles are running this at 60fps and you need a 3070 for 540p!?"

Then Alex set out to demonstrate that the Medium preset isn't Medium at all and why the requirements are so high, such as the game having ray-traced reflections even in non-RT mode. A 3070 was very specifically picked because this is what was in the spec sheet for 1080p Medium/DLSS Performance. It turns out that the game isn't badly optimized at all on PC, and the console performance very much falls in line with expectations. He's working on another deep-dive video due next week.

Could be a matter of pixel fill rate.
The PS5 has a higher pixel fill rate than the 2070S, the 2080 Ti higher than the PS5, and the 3070 higher than the 2080 Ti.
Possibly, but in real-life scenarios, the 3070 would have about a 10-12% pixel fill rate advantage over the 2080 Ti. Their reported boost clocks heavily undersell the 2080 Ti. NVIDIA says 1545MHz for the 2080 Ti but 1725MHz for the 3070. Both are way below real-world results, but the 2080 Ti more so. Typically, a 2080 Ti will boost to around 1900MHz in games and the 3070 to around 1960 or so. 96 vs 88 ROPs in favor of the 3070 would result in a small pixel fill rate advantage, but I don't think this explains the overall 11% advantage at the end of a pipeline that has way more variables than just ROPs/pixel fill rate.
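
A rough back-of-the-envelope check of those numbers (a sketch in Python; the clocks are the approximate in-game boost figures mentioned above, so it's only an estimate):

# Theoretical pixel fill rate = ROPs x clock. Clocks are assumed real-world
# boost figures (~1.90 GHz for the 2080 Ti, ~1.96 GHz for the 3070), not rated boost.
gpus = {
    "RTX 2080 Ti": (88, 1.90),  # (ROPs, approx. in-game boost clock in GHz)
    "RTX 3070": (96, 1.96),
}
fill = {name: rops * ghz for name, (rops, ghz) in gpus.items()}
for name, gpix in fill.items():
    print(f"{name}: ~{gpix:.0f} Gpixels/s")
advantage = fill["RTX 3070"] / fill["RTX 2080 Ti"] - 1
print(f"3070 theoretical fill rate advantage: ~{advantage:.0%}")

With those assumed clocks it comes out to roughly 13%, in the same ballpark as the 10-12% above, which still supports the point that fill rate alone doesn't explain the whole gap at the end of the pipeline.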
 