
Assassin's Creed Valhalla will run at 4K 30FPS on Xbox Series X

Journey

Banned
Aug 18, 2014
2,928
2,678
620
That clears it up. I've always considered "standard" to mean something else; I don't know if that's a British thing or just me personally.


It depends on the context. For hardware, when you say a console will have a hard drive as standard, it means all models will have it, no exceptions. Just like a car: saying ABS or airbags come "standard" means every model will contain ABS brakes and airbags. So I can see how people would interpret that statement as all games being guaranteed to be 60fps, but I don't interpret it that way. I see it like the 360/PS3 generation, where 720p was the standard, yet we had some games below 720p and even some games at 1080p. So it stands to reason that most games will run at 60fps (720p), some will go below 60fps (600p), and some will go beyond and hit 120fps (1080p).
 

DoctaThompson

Banned
Jan 5, 2020
1,643
2,351
570
Big Caulk County
If it does spec for spec then yes, the devs are intentionally capping the framerate on the consoles.
I think that is the problem, though. People are thinking the Series X is the equivalent of a 2080 and a 3700X CPU. They are not the same, especially when you take into account the shared system memory. Take a look at the Minecraft demo and you'll see where things start to fall apart.

It's crazy to think Ubisoft would nerf the framerate of next-gen consoles. Nobody batted an eye at Odyssey only hitting 61fps on a 2080 Ti when it released, but look how things are backfiring now for Ubisoft. Next-gen consoles are the biggest thing to happen for casual gamers in the past 7 years. Ubisoft would be crazy not to show the consoles in the best light possible. The same people who buy into the AC franchise would buy Valhalla either way, and seeing a "next gen" version of the game could easily make AC one of the biggest launch games ever. It's not about Ubi gimping consoles; that would never help the company, from either a PC gamer's or a console gamer's perspective.

The real problem is buying into the hype from Microsoft and Sony PR. Take that and all the hypeman warriors on both sides, and you get people thinking next-gen consoles will give you 8K 120fps because of the SSD.
 
Last edited:
  • Like
Reactions: VFXVeteran

DeepEnigma

Gold Member
Dec 3, 2013
49,171
104,506
1,430
I think that is the problem, though. People are thinking the Series X is the equivalent of a 2080 and a 3700X CPU. They are not the same, especially when you take into account the shared system memory. Take a look at the Minecraft demo and you'll see where things start to fall apart.

It's crazy to think Ubisoft would nerf the framerate of next-gen consoles. Nobody batted an eye at Odyssey only hitting 61fps on a 2080 Ti when it released, but look how things are backfiring now for Ubisoft. Next-gen consoles are the biggest thing to happen for casual gamers in the past 7 years. Ubisoft would be crazy not to show the consoles in the best light possible. The same people who buy into the AC franchise would buy Valhalla either way, and seeing a "next gen" version of the game could easily make AC one of the biggest launch games ever. It's not about Ubi gimping consoles; that would never help the company, from either a PC gamer's or a console gamer's perspective.

The real problem is buying into the hype from Microsoft and Sony PR. Take that and all the hypeman warriors on both sides, and you get people thinking next-gen consoles will give you 8K 120fps because of the SSD.

Spec for spec, what does this mean I wonder? :pie_thinking:

I don't need your PCMR story.
 

Rubberwald

Member
Feb 18, 2019
799
1,113
380
How about 1080p 60fps instead? 4K seems a bit overrated, and I have a 4K TV. I haven't really noticed much of a difference in the same way I would going from 30fps to 60fps.

Depends on the game and things like anti-aliasing, but I recently played a game called The Vanishing of Ethan Carter, and on Xbox One X you can switch resolution instantly between 1080p, 1440p and 4K. You might think there is not a lot of difference until you see them one right after another, even between 1440p and 4K.
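The side-by-side difference tracks directly with raw pixel counts. A minimal sketch of the arithmetic (the resolution figures are the standard 16:9 ones, not something from this thread):

```python
# Pixel counts for the three common 16:9 resolutions being compared.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name):
    """Total pixels per frame for a named resolution."""
    width, height = RESOLUTIONS[name]
    return width * height

# 4K pushes 4x the pixels of 1080p and 2.25x the pixels of 1440p,
# which is why the gap is easy to spot when switching back to back.
assert pixel_count("4K") == 4 * pixel_count("1080p")
assert pixel_count("4K") == 2.25 * pixel_count("1440p")
```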
 
Last edited:
  • Like
Reactions: Godzilla Emu

Jayjayhd34

Member
Oct 22, 2018
888
552
320
why not 1440p or 1080p at 60fps like a performance mode?


Only a developer can explain how easy it is to incorporate separate modes. The only problem I see is that if they did that, that's 5 versions before taking into consideration the testing on individual graphics cards and processors etc. on PC.

However, this is purely speculation; the only real answer would come from someone in the dev community.
 
Last edited:
  • Like
Reactions: VCL

Hendrick's

Member
Jan 7, 2014
9,107
15,688
995
For those smarter than me, how hard would it be to incorporate VRS on a cross-gen game like this?
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,413
805
why not 1440p or 1080p at 60fps like a performance mode?

The problem with a lower resolution is that it's too low to show the entire rendering approximation. Upscaling takes away from the true image. Running at a lower res also hurts when you are on a native 4K screen, because the TV's internal scaler has to upscale it to 4K, which blurs pixels.

The only real way to enjoy good clarity is to run at native 4K, or at the native resolution of the display, to get a perfect 1:1 pixel ratio from framebuffer to TV.
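The 1:1 mapping point can be put numerically: 1080p divides evenly into a 2160-line panel while 1440p does not, so the latter forces interpolation. A minimal sketch (the helper names are mine; note that many real TV scalers interpolate even at integer factors):

```python
# How a source framebuffer maps onto a native 4K (2160-line) panel.
def scale_factor(source_lines, panel_lines=2160):
    """Vertical scale factor the TV must apply."""
    return panel_lines / source_lines

def is_pixel_perfect(source_lines, panel_lines=2160):
    """True when each source pixel maps to a whole block of panel pixels."""
    return panel_lines % source_lines == 0

# 1080p -> 2160p is a clean 2x: each pixel becomes a 2x2 block.
assert scale_factor(1080) == 2.0 and is_pixel_perfect(1080)
# 1440p -> 2160p is 1.5x: no whole-pixel mapping, so interpolation (blur).
assert scale_factor(1440) == 1.5 and not is_pixel_perfect(1440)
```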
 

Jayjayhd34

Member
Oct 22, 2018
888
552
320
The problem with a lower resolution is that it's too low to show the entire rendering approximation. Upscaling takes away from the true image. Running at a lower res also hurts when you are on a native 4K screen, because the TV's internal scaler has to upscale it to 4K, which blurs pixels.

The only real way to enjoy good clarity is to run at native 4K, or at the native resolution of the display, to get a perfect 1:1 pixel ratio from framebuffer to TV.


Totally true, upscaling will never be better than native. However, a good 4K TV has really good scaling; while things look worlds better at native 4K, 1080p material still looks pretty good. Considering TV shows outside Amazon and Netflix haven't gone 4K, I would say a good 60-70% of my content is 1080p, and broadcast can be SD, which is where it really gets bad.

There's also an argument to be made that not everyone has gone 4K yet.
 

Harlock

Member
Jul 6, 2011
5,852
1,387
1,135
 
  • Like
Reactions: AngryWhiteMan

Neo_game

Member
Mar 19, 2020
730
733
315
The problem with a lower resolution is that it's too low to show the entire rendering approximation. Upscaling takes away from the true image. Running at a lower res also hurts when you are on a native 4K screen, because the TV's internal scaler has to upscale it to 4K, which blurs pixels.

The only real way to enjoy good clarity is to run at native 4K, or at the native resolution of the display, to get a perfect 1:1 pixel ratio from framebuffer to TV.

Native 4K is just a waste of resources IMO. It is a cross-gen game, so I think it should not have a problem. But otherwise I am sure things like graphics detail and fps will be given more importance than native resolution.
 

Puskas

Member
Sep 14, 2018
218
540
325
Oh, Ubisoft.
“30 was our goal, it feels more cinematic. 60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum.

It's like when people start asking about resolution. Is it the number of the quality of [sic] the pixels that you want? If the game looks gorgeous, who cares about the number?”

 
  • Fire
Reactions: Night.Ninja

diffusionx

Member
Feb 25, 2006
14,563
14,699
1,800
For those smarter than me, how hard would it be to incorporate VRS on a cross-gen game like this?

Chances are it is using it on the platforms that support it.

These tech things are a meme. That's not to say they aren't helpful and beneficial - they're not blast processing - but they're not the magic secret sauce people on the internet make them out to be. And if Ubi decided to make the game 30fps, that's that.
 

scalman

Member
Feb 6, 2019
2,975
2,109
415
It's enough. A stable 30fps is enough for single-player adventure games, and they will all get a 60fps option too. So calm the fc down.
 

Journey

Banned
Aug 18, 2014
2,928
2,678
620
Native 4K is just waste of resources IMO. It is a cross-gen game so I think it should not have a problem. But otherwise I am sure things like gfx detail, fps will be given importance than native resolution.


There's no mention of whether you'll have a "Performance mode" in the options menu for people who don't have a 4K TV or who are fine with 1440p at 60fps. My pitchfork and torch are put away until I find out more lol.
 

scalman

Member
Feb 6, 2019
2,975
2,109
415
I remember playing PS2 games at whatever fps they ran at the time, then playing PS3 games at like 28fps most of the time and thinking, wow, what a game. And now people won't buy a game because of some fake news about fps. I mean, this has gone a long way, and you shouldn't be called a gamer for this at all. I don't care; I played The Witcher 3 back then at 20-15fps in the worst cases and I enjoyed the game on an 840M GPU laptop. Never did I think: I wish I could play this at a higher framerate. It just doesn't matter.
 
Last edited:
Jan 29, 2019
6,252
6,822
520
I think imma pull the trigger and go with AMD this year. But I probably won't get a super fast SSD for a year or so. My Samsungs are doing just fine for now.
At this point in time even a 533MB/s SATA SSD will give you 95% of the benefits a 3.2GB/s NVMe drive will.

Hopefully we will eventually have some credible use case for all that extra speed.
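As a rough illustration of why the gap rarely matters in practice, some back-of-the-envelope arithmetic on sequential read times (the 5 GB asset size is a made-up example; real loads are usually dominated by decompression and CPU work, not raw drive speed):

```python
# Idealized sequential read time for a load of a given size.
def load_seconds(size_gb, speed_mb_s):
    """Seconds to read size_gb gigabytes at speed_mb_s megabytes/second."""
    return size_gb * 1024 / speed_mb_s

# Hypothetical 5 GB load on the two drives quoted above.
sata = load_seconds(5, 533)    # ~9.6 s on a 533 MB/s SATA SSD
nvme = load_seconds(5, 3200)   # ~1.6 s on a 3.2 GB/s NVMe drive

# A few seconds' difference on paper, and far less once the rest of the
# loading pipeline (decompression, shader warmup) becomes the bottleneck.
assert sata > nvme
```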
I remember playing PS2 games at whatever fps they ran at the time, then playing PS3 games at like 28fps most of the time and thinking, wow, what a game. And now people won't buy a game because of some fake news about fps. I mean, this has gone a long way, and you shouldn't be called a gamer for this at all. I don't care; I played The Witcher 3 back then at 20-15fps in the worst cases and I enjoyed the game on an 840M GPU laptop. Never did I think: I wish I could play this at a higher framerate. It just doesn't matter.
It matters, but I think very few people will make their final decision on buying a game based on its resolution or performance.

Still, if you happen to own a PS4 and a PC (or Switch/Xbox, whatever), you are very likely to choose the platform where your desired game runs best. Obviously, few people have more than one gaming platform, and many who have PCs/Macs don't play on them.

Then there is the general preference factor: some people have allegiance to one platform, and even if they bought the other one they would just ignore its benefits.

So it's important in a prestige way. It will help a company establish its machine as the "true next gen experience", a bit like the PS4 did early in the last generation, and then it will drive up sales among the demographic that cares about this. The hope is that they will spread the buzz to their friends and eventually 100 million consoles will be sold just like that... assuming the games show up to back the claims.
 
Last edited:
  • Praise the Sun
Reactions: DoctaThompson

DoctaThompson

Banned
Jan 5, 2020
1,643
2,351
570
Big Caulk County
At this point in time even a 533MB/s SATA SSD will give you 95% of the benefits a 3.2GB/s NVMe drive will.

Hopefully we will eventually have some credible use case for all that extra speed.
And based on what has been said about the Unreal Engine demo, many people won't even need to upgrade. Though going above and beyond that couldn't hurt, and could possibly have added benefit! This will be a great year for everyone!
 
  • LOL
Reactions: Persian buttercup