
Assassin's Creed Valhalla will run at 4K 30FPS on Xbox Series X

So you value graphics over gameplay

And this is why many devs will target 30fps

Gamers have no standards
For me, personally, you know, in my opinion, not yours, Assassin's Creed plays fine at 30 fps. So in this case yes - I value graphics. To me 60 fps doesn't really add anything to the gameplay in this case. Graphics do however.

Aren't opinions fun? Not that you seem familiar with the term, but whatever 🥰
 

MMaRsu

Banned
For me, personally, you know, in my opinion, not yours, Assassin's Creed plays fine at 30 fps. So in this case yes - I value graphics. To me 60 fps doesn't really add anything to the gameplay in this case. Graphics do however.

Aren't opinions fun? Not that you seem familiar with the term, but whatever 🥰

Lol how can you say it adds nothing to the gameplay when the game is just smoother at 60fps.

It sounds to me like you have never even played it at 60fps, so how can you judge it?
 
Lol how can you say it adds nothing to the gameplay when the game is just smoother at 60fps.

It sounds to me like you have never even played it at 60fps, so how can you judge it?
Of course I haven't played Valhalla, but I've played several of the other AC games on PC at 60 fps. For this type of game, to me, it doesn't add anything. Or rather, the improved graphics give me more pleasure. I just think they play fine at 30 fps. You really don't need to agree.
 
D

Deleted member 775630

Unconfirmed Member
Lol how can you say it adds nothing to the gameplay when the game is just smoother at 60fps.

It sounds to me like you have never even played it at 60fps, so how can you judge it?
The key words are "to him". He doesn't care that much about framerate and prefers better graphics. It's his opinion; there is no right or wrong.
 

Riven326

Banned
They just don't get that the PS4 is the target hardware and that next gen upscales FROM the PS4, not the opposite. They don't get that you need 8 TFLOPS of RDNA1 to upscale a PS4 game built around a Jaguar CPU to 4K. They don't get that the SX is roughly double that requirement (12 TFLOPS ANNNND RDNA2). They don't get that Zen 2 is about 4X more powerful than last gen's Jaguar. They don't get that it has extra fast RAM + an SSD. We should have watched DF.

But we do get that what Phil said about next gen and 60 FPS isn't real... again. Do we even know if we want FPS over Ray Tracing?

Now, I'm sticking with my One X and Pro.
I'm sure Phil was talking about Microsoft first party titles. But see, that's where shit gets muddy. He was deliberately vague about the details in an attempt to mislead gamers into believing that the Series X was going to deliver all games at 4K/60 without actually saying it. So it gives him a way out when confronted with games like AC that are running at 30fps.

He can say, "But I never said all games were going to run at 60fps." Which is true. But he purposely leaves out the details so that it's implied, and this is what helps create these unrealistic expectations.
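Side note on the TFLOPS figures in the quote above: the arithmetic behind them is easy to reconstruct. A rough Python sketch, assuming the commonly published CU counts and clocks and the standard 64-ALU, 2-FLOPs-per-clock GCN/RDNA math; treat it as a back-of-envelope check, not official spec:

    # Back-of-envelope FP32 throughput: CUs x 64 ALUs x 2 FLOPs per clock x clock speed
    def tflops(compute_units, clock_ghz, alus_per_cu=64, flops_per_clock=2):
        return compute_units * alus_per_cu * flops_per_clock * clock_ghz / 1000.0

    # Commonly quoted GPU configurations: (CU count, clock in GHz)
    consoles = {
        "PS4": (18, 0.800),            # ~1.84 TFLOPS
        "PS5": (36, 2.23),             # ~10.3 TFLOPS (variable clock, up to)
        "Xbox Series X": (52, 1.825),  # ~12.15 TFLOPS
    }

    for name, (cus, clock) in consoles.items():
        print(f"{name}: ~{tflops(cus, clock):.2f} TFLOPS")

That's where the "1.8 TFLOP vs 12 TFLOP" ratios people keep citing come from; on its own it says nothing about bandwidth, CPU, or I/O.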
 

Xenon

Member
Looks like they assassinated people's dreams of a 60fps next gen. I'm fine with it since it's a cinematic game. Some devs are always going to want to put more shit on the screen vs getting that extra fps, especially in a first wave cross gen title.

This is a 3rd party game. Remember their job is to sell games, not justify your early adopter purchase. Please show me a cross gen title where people didn't say it doesn't look like a generational leap.
 
Eye test booked.

The difference between 1080p and 4K is night and day on any TV bigger than 32" or so. 1440p and 4K not so much. 1800p and 4K hardly any.

Lmao I actually don't have the best vision, so I'm way ahead of you (I usually sit about 5 ft away from a 55 in 4k tv because otherwise I can't focus on what I'm doing). I get that there is a difference, but it just isn't that substantial to my eyes. I'd have to pause the game and compare side by side to notice because I'm usually so focused on what I'm doing. The framerate difference is much more noticeable to me.
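For what it's worth, the "depends how far you sit" point can be put into rough numbers with the usual 1-arcminute (20/20 vision) rule of thumb. This is only a sketch of that rule of thumb, not a claim about any individual's eyesight:

    import math

    ARCMIN = math.radians(1 / 60)  # smallest detail a 20/20 eye resolves, by convention

    def pixel_blend_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
        """Viewing distance beyond which individual pixels can no longer be resolved."""
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)
        pixel_pitch_in = width_in / horizontal_pixels
        return (pixel_pitch_in / math.tan(ARCMIN)) / 12  # inches -> feet

    for label, px in [("1080p", 1920), ("4K", 3840)]:
        print(f'55" {label}: pixels blend beyond ~{pixel_blend_distance_ft(55, px):.1f} ft')
    # Roughly ~7.2 ft for 1080p and ~3.6 ft for 4K on a 55" set, so at ~5 ft
    # you are already past the distance where the full 4K pixel grid is resolvable.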
 

Vawn

Banned
How about 1080p 60fps instead? 4K seems a bit overrated, and I have a 4k tv. I haven't really noticed much of a difference in the same way I would going from 30 fps to 60 fps.

There is what makes a better gaming experience and then there is what is easier to market and advertise.

Ubisoft, along with most game publishers, chooses the latter.
 

Gavon West

Spread's Cheeks for Intrusive Ads
Meh, I don't care. As long as it's not janky and the visuals and sound are up to snuff, I'm good with it.

This probably isn't due to the Series X not being able to run it at 60fps. More than likely it's a creative decision. We gotta remember, features like instant travel and RT will add to the overall experience. I look forward to it.
 
There is what makes a better gaming experience and then there is what is easier to market and advertise.

Ubisoft, along with most game publishers, chooses the latter.

Yeah, I remember at the start of the PS360 gen, 1080P FULL HD was everywhere, even if the game looked terrible. They were just trying to sell TVs.
 

Journey

Banned
PS5 ssd is too fast and they wanted parity


Ha!

That's the word I was looking for. Dat parity smh
That didn't take long.




🤣




Lol

But in his defense... *ducks tomato

both Xbox 360 and PS3 were considered to be 720p as standard, but that didn't stop games from running at 600p like CoD. Same for PS4, you can consider 1080p the standard despite a few games running at 900p.
 

Geki-D

Banned
People expected 60/4K to be the new minimum (people who never learn)
Well I think MS literally said to expect that. Though I don't think people should count on 60/4K always; obviously there will be games that push graphics so hard that one or the other will drop. But for a cross gen launch game like Valhalla I'm pretty sure not hitting 60/4K is just bad optimisation. Do we know the fps & res of any of the other games they've shown? I'm pretty sure all of them will be using ray tracing too; if they're all doing 30fps then something might be up, but I've got the feeling only AC will have this problem.
 

DeepEnigma

Gold Member
Ha!

That's the word I was looking for. Dat parity smh




Lol

But in his defense... *ducks tomato

both Xbox 360 and PS3 were considered to be 720p as standard, but that didn't stop games from running at 600p like CoD. Same for PS4, you can consider 1080p the standard despite a few games running at 900p.

Is it really hard for this man to use phrases like, "our goal is" or "we're shooting for"?
 

Jayjayhd34

Member
Ha!

That's the word I was looking for. Dat parity smh




Lol

But in his defense... *ducks tomato

both Xbox 360 and PS3 were considered to be 720p as standard, but that didn't stop games from running at 600p like CoD. Same for PS4, you can consider 1080p the standard despite a few games running at 900p.

That clears it up. I've always considered "standard" to mean something else; don't know if that's a British thing or just me personally.
 

Vawn

Banned
Well I think MS literally said to expect that.

Because Phil said so. The focus was on more FPS over/with high resolution.

I'm sure they worded it more carefully and never used words like "minimum".

If one game runs at 4K/60FPS or better they can say things like, "Enjoy games in stunning 4K graphics with a solid 60 FPS or higher", and not be lying.

Every company does things like this, but one of my biggest issues with Xbox since the start of the Xbox One generation is that they overpromise and underdeliver too often. They would rather get you excited now and deal with your disappointment later than build less hype and let fans be pleasantly surprised when expectations are exceeded.

The way they hyped this last Inside Xbox was the latest example of this. I expect that when more of these same resolution vs framerate issues crop up, we will realize that they may have overhyped just how much of a difference next-gen consoles will actually make.
 

NullZ3r0

Banned
This is just Ubi being non-committal. After the Watch Dogs debacle, they won't commit to 4K/60 months from release.

But don't expect maximum effort from them on this.
 
Yea, I think a dose of reality for the 4K 120fps guys is gonna hit soon. Many will prioritise pretty graphics over fps.
Most devs will definitely prioritize prettier graphics, without a doubt. The reality is it's currently easier to sell prettier graphics than better fps. Most of the general public has no idea what the hell FPS even means. If you asked them what FPS their movies play at, they couldn't tell you; they'd probably ask you what the hell fps means. Does that suck? Yea, it kinda does.
 

Polygonal_Sprite

Gold Member
Here's your Windows overhead delusion.

Memory:

2GB OS usage on Windows
2.5GB OS usage on Xbox

CPU:

Windows uses about 3% on an 8-core/16-thread Ryzen 3700.
Consoles lock 1 core of the 8 away, which equals 12.5% reserved for OS tasks.

GPU:

Windows: 0-1% usage on PC
Consoles: probably the same

That’s good to know. Thanks for sharing the stats. I will happily concede I was wrong on this front and had an outdated view of how much performance Windows actually takes up.

Personally I've had situations where my game's framerate tanked on my 2070, and when I pulled up Task Manager, Windows Defender was eating 40% of my CPU; on several other occasions Windows 10 was downloading updates, which caused massive performance losses in the games I was playing.
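Taking the figures in that overhead quote at face value (the 3% Windows number and the one-reserved-core model are the poster's claims, not verified measurements), the comparison works out roughly like this:

    # Figures as quoted above; treat them as claims, not measurements.
    console_os_cpu = 1 / 8        # console reserves 1 of 8 cores for the OS -> 12.5%
    windows_os_cpu = 0.03         # claimed background CPU use on an 8c/16t Ryzen 3700
    os_memory_gb = {"Windows": 2.0, "Xbox": 2.5}

    print(f"Console OS CPU reservation: {console_os_cpu:.1%}")
    print(f"Claimed Windows CPU overhead: {windows_os_cpu:.1%}")
    for platform, gb in os_memory_gb.items():
        print(f"{platform} OS memory footprint: {gb} GB")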

U do realize consoles are the same these days as PCs, right? Let me help you a bit.

Ubisoft has to optimize for:
Xbox Series X
PS5
Lockhart (if that even releases)
PS4
PS4 Pro
Xbox One X
Xbox One
Xbox One S

What happens when old consoles get phased out after 2-3 years? U will have:
Xbox Series X
Xbox Series X Slim
Xbox Series X Pro
PS5
PS5 Pro

3rd party devs will look at all those boxes, much like how they look at PC. So the PS5 is the weakest? Let's build around that and just boost some graphical settings on the other consoles if we've got time for it; otherwise we just lock it either way.

I get that, but my point was that there are thousands of possible PC setups, from low-end laptops all the way to the highest-end gaming rigs. It's not the same thing at all, even if you include PS4, XB1, PS4 Pro, XB1X, PS5, PS5 Pro, XBSX and XBSX X (let's throw Switch and Stadia in for the fuck of it). That's ten possible setups, with four of them being on the exact same architecture and more than likely only extra GPU power being added (next generation consoles and their upgraded versions). A bit different from the thousands of unique component combinations a gaming PC can have for developers to try and optimise around. This is one of the major reasons you get the likes of AC Unity, Ryse, Arkham Knight etc. that run like absolute dogshit on PC, where it took two generations of GPUs to be able to brute-force 60fps in them five years later.

PC has low-level APIs, mate; this is not the year 2000 anymore. U should google Vulkan, Game Mode and DX12. They are all designed to mirror the console space as much as possible performance-wise, and frankly that's exactly what they do.

And how many big-name AAA console games support those low-level PC APIs? A fraction of a single percent. So no, it's not like a console, where you're working with chips soldered to the main board, all the I/O latency that cuts out, and things like the Velocity engine which gives engineers even more performance from the silicon. PCs do not get the same performance chip for chip. Carmack has even said you get close to double the performance on console vs PC, component for component, because of all of the above. That's why we can have the likes of Quantum Break, Uncharted 4, Forza Horizon 4, God of War, Gears 4, Horizon and Spider-Man etc. running on absolute trash laptop hardware from 2012.

————————————————

I don't know what you're trying to prove with the second part of your post. Neither you nor anyone else can say "next gen consoles can't do RT at 4K lol", because you're basing your knowledge of RT and its performance costs on PC software (patched current gen games like BF, Metro and Minecraft) running on Nvidia hardware using very specific Nvidia-developed APIs for RT on a PC.

You don't have any idea what AMD has in terms of custom RT silicon in the hardware of the Series X / PS5, or what custom APIs they're using to leverage RT in next gen games (games built around the technology, not current gen games patched to support it, which are never going to be as efficient or run as well as games built with the technology in mind from the start).

It then depends on what type of game it is, what you are using RT for (is it for GI like Metro, reflections like Battlefield, shadows like Gears 5 or a combination of many etc).

There are soooo many variables that it's utterly pointless comparing a Series X or PS5, with custom AMD RT hardware, brand new RT APIs, and next gen games built from the ground up around RT, against what we have today.

I get it. You've probably spent a grand upgrading your PC to support RT at 4K/30fps, so the thought of a mere peasant's console outperforming your "rig" for a fraction of the price probably annoys the fuck out of you 😂

Let’s wait and see how built from the ground up next gen games that use RT look and run on Series X and PS5 before you spunk in your pants in celebration of the master race.

Everything else is a pointless comparison with only one side having current data available (an Nvidia RTX-equipped PC).

PCs and consoles are not the same and never will be, despite using similar hardware nowadays.
 

VFXVeteran

Banned
"If they was able to make these kinds of graphics on a 1.8TFLOP console, imagine what they will do with 10TFLOP!!"

LOL

Nothing if they run the game at 4k. Like I said before over and over. 4k is a bandwidth hog and these consoles don't have good bandwidth to sustain 4k@60FPS with any complex graphics scenarios. If the 2080Ti couldn't do it, the consoles definitely can't do it. It doesn't matter whether you program to the "metal" or not. Bandwidth is bandwidth. Period.

Yet another brutal reality I tried to convey many many months ago..
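Whatever you make of the bandwidth claim, the raw pixel-throughput side of it is easy to put numbers on. A crude sketch; it just counts pixels per second and assumes a fixed cost per pixel, ignoring compression, dynamic resolution and everything else a real renderer does:

    def mpix_per_second(width, height, fps):
        """Pixels that must be shaded and written out each second."""
        return width * height * fps / 1e6

    targets = {
        "1080p @ 30": (1920, 1080, 30),
        "1080p @ 60": (1920, 1080, 60),
        "4K @ 30":    (3840, 2160, 30),
        "4K @ 60":    (3840, 2160, 60),
    }

    baseline = mpix_per_second(1920, 1080, 30)
    for label, (w, h, fps) in targets.items():
        rate = mpix_per_second(w, h, fps)
        print(f"{label}: {rate:6.1f} Mpix/s ({rate / baseline:.0f}x the 1080p30 workload)")
    # 4K60 pushes ~8x the pixels of 1080p30 and ~4x the pixels of 1080p60, which is why
    # per-pixel bandwidth and fill-rate costs climb so sharply at native 4K.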
 
I'm sure Phil was talking about Microsoft first party titles. But see, that's where shit gets muddy. He was deliberately vague about the details in an attempt to mislead gamers into believing that the Series X was going to deliver all games at 4K/60 without actually saying it. So it gives him a way out when confronted with games like AC that are running at 30fps.

He can say, "But I never said all games were going to run at 60fps." Which is true. But he purposely leaves out the details so that it's implied, and this is what helps create these unrealistic expectations.
Just like the Gears 5 benchmark on Series X: mentioning 100fps in the benchmark right around the time of the comparison claiming similar performance to an RTX 2080 at 4K res. Disingenuous af.
 

Virex

Banned

Ubisoft told Eurogamer Portugal that Valhalla will run at a minimum of 30FPS in 4K on XSX. Not sure if there will be other modes yet.

Some people won't like it, but imo for the type of game it is, a smooth 30 fps works great.

 

Atomic Odin

Member
They ported the Ezio trilogy to PS4 at 30 fps, and that was without any of the bells and whistles that you're gonna see on next gen. I don't know why people are surprised when it comes to Ubisoft; their focus on consoles was probably never on achieving 60 fps.
 
Most devs will definitely prioritize prettier graphics, without a doubt. The reality is it's currently easier to sell prettier graphics than better fps. Most of the general public has no idea what the hell FPS even means. If you asked them what FPS their movies play at, they couldn't tell you; they'd probably ask you what the hell fps means. Does that suck? Yea, it kinda does.
They might not know what fps means, but they'd likely prefer the experience of higher fps even if they don't understand why.
 

Portugeezer

Member
Here's the deal. They are making 7 versions of this game: XB1/1X/SX, PS4/Pro/PS5, and PC.

They don't give enough of a shit to optimise for 60fps; "optimised for Series X" is bullshit lmao.
 

Vawn

Banned
They might not know what fps means, but they'd likely prefer the experience of higher fps even if they don't understand why.

True, but only AFTER they've already bought the game. It doesn't make them more likely to buy the game from a commercial or screenshots the way prettier graphics do.

These companies obviously care about what makes a game sell, more than what makes a game better.
 

Jaxcellent

Member
I'd rather have no ray tracing and 60 frames a sec... It looks so much better on a projector, but then again 4K makes a huge difference on a projector as well. After I saw that Digital Foundry vid yesterday, I'm even more convinced ray tracing is overrated. It looks great on those old blocky games like Quake and Minecraft, but I couldn't see a difference in that cyber ninja game or whatever... Maybe in the future it will be worth it, but for now I'd rather they hit 4K/60 first...
 