
Hogwarts Legacy recommended specs: 1080 Ti, i5-8400

cormack12

Gold Member


  • RECOMMENDED:
    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: Intel Core i5-8400 OR AMD Ryzen 5 3600
    • Memory: 16 GB RAM
    • Graphics: NVIDIA GeForce 1080 Ti or AMD RX 5700 XT
    • DirectX: Version 12
    • Storage: 85 GB available space
    • Additional Notes: SSD, 1080p/60 fps, High Quality Settings, Upscale Quality Setting
 

Hugare

Member
"Upscale Quality Setting"

Seems like they are adding their own upscaling solution, just like Spider-Man on PC.

It doesn't look any better than a PS4 game like HZD or God of War, insane specs.
God of War is not open world and HZD is kinda old by now.

Don't think these specs are demanding by any means, imo.

The game will probably scale just fine, since they are also making it for the Switch
 

Fbh

Member
I'm more curious about how this will supposedly run on a Switch when the minimum specs are a 1070 and an Intel Core i5-8400.

I know it will be some custom version way below "low settings" on PC. But still, look at the sacrifices they've had to make on games like The Witcher 3 and Doom Eternal, which require lower specs.
 

Northeastmonk

Gold Member
I wonder what the Switch's file size will be. It'll push some people to buy bigger memory cards. You'd have to have some decent storage on your Steam Deck too. It'll be cool if the Steam Deck rocks this game.
 

Stuart360

Member
God of War is not open world and HZD is kinda old by now.

Don't think these specs are demanding by any means, imo.

The game will probably scale just fine, since they are also making it for the Switch
It's the first time I have seen a 1080 Ti listed in the 'recommended' specs. Even Cyberpunk doesn't.

Still, this is next gen only, right? So it makes sense I suppose, especially as it's for 60 fps.
 

Hugare

Member
It's the first time I have seen a 1080 Ti listed in the 'recommended' specs. Even Cyberpunk doesn't.

Still, this is next gen only, right? So it makes sense I suppose, especially as it's for 60 fps.
These specs are aiming for 60 fps / High settings, though.

Cyberpunk's and other games' minimum specs usually aim for Low settings / 30 fps.
 

Stuart360

Member
These specs are aiming for 60 fps / High settings, though.

Cyberpunk's and other games' minimum specs usually aim for Low settings / 30 fps.
Oh, you were talking about the minimum specs? Well, still, I don't think I have ever seen a 1070, or a similarly powerful GPU, listed as 'minimum' for any game yet.
Both the minimum and recommended specs are very high, but if this is the first proper 'next gen' open world game, then they are pretty fair, I suppose.
 

Stuart360

Member
Should be an RTX 2060 minimum, then we could get some jump.
Er, you realize the 1080 Ti is considerably more powerful than a 2060, right? In fact, even the GTX 1080 beats the 2060 in most games.
If you check YouTube benchmarks, a 1080 Ti is very close to the 2070 Super, and actually beats it in most games, and a 2070 Super can exceed the PS5 and XSX in game benchmarks.

 

yamaci17

Member
Probs an overestimation, or the High settings are unoptimized. The 1080 Ti is "not that" far away from the PS5/Series X, which is alarming for these consoles too (especially if they still say you need to use "upscaling"). Not even native 1080p gets you 60 fps with a 1080 Ti? Then you can pretty much kiss a proper 1440p/60 fps goodbye on the SX/PS5 as well, since 1440p pushes roughly 1.78x the pixels of 1080p.
 

yamaci17

Member
Er, you realize the 1080 Ti is considerably more powerful than a 2060, right? In fact, even the GTX 1080 beats the 2060 in most games.
If you check YouTube benchmarks, a 1080 Ti is very close to the 2070 Super, and a 2070 Super can exceed the PS5 and XSX in game benchmarks.
There are some games that use specific DX12 instructions that make the 2060 somewhat faster than the 1080 Ti.

The 2060 outperforms the 1080 Ti in FC6 at 1080p and ties it at 1440p:
https://www.techpowerup.com/review/far-cry-6-benchmark-test-performance/4.html

The 2060 simply murders the 1080 Ti in Halo Infinite:
https://www.techpowerup.com/review/halo-infinite-benchmark-test-performance/4.html

And they are mostly "tied" in Cyberpunk.

It's not happening with every game (as a matter of fact, in FH5 and Dying Light 2, even more recent DX12 titles, the 1080 Ti still outperforms the 2060), but there are already occurrences where the 1080 Ti falls below the 2060 and can't even hold a candle to the 2070/2070 Super.

This will only get more frequent, with more engines fully embracing the DX12/DX12U features that both Turing and Ampere cards benefit from. Remember that even Pascal had janky, wonky DX12 support, so those cards will fall short in certain games with certain workloads.

I'm not generalizing anything, by the way, nor am I claiming the 2060 is a better GPU than the 1080 Ti. I'm just saying the 2060 can outperform the 1080 Ti in certain cases, and those cases will get more frequent, despite how the two compared in the past. Spider-Man, for example, runs faster on a 1080 Ti than on a 2060. Maybe it depends on how much work developers do for older architectures; at this point, it all comes down to how much work devs put in for your architecture anyway.

Look at Guardians of the Galaxy: a 1050 Ti outperforms a GTX 980 in that game, because the developers simply refused to optimize it for Maxwell. The game runs horribly on Maxwell cards while having acceptable performance on even entry-level Pascal cards, and this is reflected in its requirements too, which list a Pascal card as the minimum. (I gather this game will show similar characteristics on Maxwell. Not saying they should do it, but there will come a time when devs simply refuse to optimize for Pascal as well, and considering Maxwell has already started getting this treatment, that day is even closer for Pascal. But that doesn't matter anyway; those cards are 6+ years old and have proved their worth, honestly.)
 

Larxia

Member
Should be an RTX 2060 minimum, then we could get some jump.
Outside of things like DLSS support, a 1080 Ti is faster than a 2060, so a 2060 would actually be a lower requirement.
It's a mid-upper tier GPU from 6 years ago
It's definitely getting old, but it never was "mid-upper"; the 1080 Ti was the high end back then, there was no 90-class card during that gen.
 

Krathoon

Member
I take it an RTX 3060 would work fine. I assume the game will use the extra features an RTX card has.

I bet you can use DLSS.
 

Stuart360

Member
Look at Guardians of the Galaxy: a 1050 Ti outperforms a GTX 980 in that game, because the developers simply refused to optimize it for Maxwell. The game runs horribly on Maxwell cards while having acceptable performance on even entry-level Pascal cards, and this is reflected in its requirements too, which list a Pascal card as the minimum. (I gather this game will show similar characteristics on Maxwell. Not saying they should do it, but there will come a time when devs simply refuse to optimize for Pascal as well, and considering Maxwell has already started getting this treatment, that day is even closer for Pascal. But that doesn't matter anyway; those cards are 6+ years old and have proved their worth, honestly.)
To be fair, GOTG is a unique case, as it's absolutely botched on Maxwell, BUT if you roll back to an Nvidia driver before a certain version, the game runs well, as it should.

Funnily enough, I have a backup PC with a 980, and yeah, I Googled for fixes and the older-driver fix worked. Went from around 15 fps at 1080p max settings to around 50 fps with the older driver.

I don't know whether that makes it an Nvidia problem or the game devs' problem.
 

lukilladog

Member
LOL, how is that high considering the 1080 Ti is from 6 years ago?

There are those who despise cross-gen-looking games, and then there are those like you 🤷‍♂️

Too much power required for PS4-like graphics, I swear it's not that complicated.



 

Rickyiez

Member
Too much power required for PS4-like graphics, I swear it's not that complicated.
Doesn't mean it's as optimized. Saints Row looks like PS3 graphics but asks for an RTX 2080. This is not new in PC requirements, I swear it's not complicated as well.
 

Andodalf

Banned
Outside of things like DLSS support, a 1080 Ti is faster than a 2060, so a 2060 would actually be a lower requirement.

It's definitely getting old, but it never was "mid-upper"; the 1080 Ti was the high end back then, there was no 90-class card during that gen.
I was replying to the min spec.
 

usp84

Member
I have the exact same specs on my PC (AMD). First time I have seen them in the recommended section for High settings.
 

lh032

I cry about Xbox and hate PlayStation.
If a 1080 Ti is recommended for 1080p/60, consoles will fart or have some medium/low settings. Or the PC version is just bad.
Nah, console performance is higher than that, and console optimization is different from PC.
I don't think the PC version is bad, btw.
 
Maybe the Switch version won't be a cloud version?
No, it'll just be heavily compromised and run like utter trash, and probably look far worse than The Witcher 3 port. Still not sure how they're even gonna pull this off. Best of luck to them! I'll be playing it on my Series X, because even though I love my Switch, it's really not meant to run stuff like this.
 

Guilty_AI

Member
It's a spec for 60 fps. No wonder it's so high despite the game not looking much better than others with lower requirements.
 

Krathoon

Member
Recently, I installed the old Harry Potter games. They are pretty neat. Some of them are kind of like Zelda games.

Some of the transparencies were not working right. I had to use dgVoodoo2 and a fan patch to fix them.
 

JoduanER2

Member
LOL, how is that high considering the 1080 Ti is from 6 years ago?

There are those who despise cross-gen-looking games, and then there are those like you 🤷‍♂️
Did you read what you replied to? GPU prices are crazy high and stock is as low as ever. We are in unprecedented times, with a lot of shortage in chip manufacturing.
 
If they have a Switch version, they can't make this game engine too beefy.
The Switch port was announced a mere 5 months ago, and the game's been in development since 2017. Given that timeline, it would definitely not have anything to do with it. It'll require some extra work from the external studio that's doing the Switch version, and no doubt it will have some heavy downgrades vs. the other machines.

The closest example to this would be the Switch port of The Witcher 3 vs. running it on the machines it originally targeted. It's technically possible to do, sure, but yikes. It definitely won't be a looker.
 