And Series X isn’t a bloated Windows 10 PC with all its processes and other apps running in the background using up compute.
Here's your Windows overhead delusion.
Memory:
Windows OS usage: ~2GB
Xbox OS reservation: ~2.5GB
CPU:
Windows uses ~3% on an 8-core/16-thread Ryzen 7 3700X.
Consoles lock 1 of their 8 cores away for OS tasks, which equals 12.5%.
GPU:
Windows: 0-1% usage on PC.
Consoles? Probably the same.
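If you want that side by side, here's a trivial back-of-the-envelope sketch (using the rough figures above; they're estimates, not official specs):

```python
# Back-of-the-envelope OS overhead comparison.
# All figures are the rough estimates quoted above, not official specs.

TOTAL_CORES = 8

overheads = {
    "Windows PC": {"os_ram_gb": 2.0, "cpu_pct": 3.0},               # background Windows load
    "Console":    {"os_ram_gb": 2.5, "cpu_pct": 100 / TOTAL_CORES}, # 1 of 8 cores reserved
}

for name, ov in overheads.items():
    print(f"{name}: ~{ov['os_ram_gb']} GB RAM for the OS, ~{ov['cpu_pct']:.1f}% CPU reserved")

# Windows PC: ~2.0 GB RAM for the OS, ~3.0% CPU reserved
# Console: ~2.5 GB RAM for the OS, ~12.5% CPU reserved
```

Point being: the reservations are the same order of magnitude, so "Windows overhead" doesn't explain any meaningful performance gap.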
You tried.
Console - A single set of specs where developers code to the metal, no separate plug and play devices of which there can be a million different configurations like on PC.
You do realize consoles are basically PCs these days, right? Let me help you out a bit.
Ubisoft has to optimize for:
Xbox Series X
PS5
Lockhart (if that even releases)
PS4
PS4 Pro
Xbox One X
Xbox One
Xbox One S
And what happens once the old consoles are phased out after 2-3 years?
You'll have:
Xbox Series X
Xbox Series X Slim
Xbox Series X Pro
PS5
PS5 Pro
Third-party devs will look at all those boxes much like they look at PC. "So the PS5 is the weakest? Let's build around that and boost some graphical settings on the other consoles if we have time for it; otherwise we just lock them anyway."
With PC they look at what GPU most people are actually using right now, focus on that, and let people push settings higher if they have better-performing hardware.
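In pseudocode, that multiplatform targeting looks something like this (a hypothetical sketch; the platform names and presets are made up to illustrate the point, not anything Ubisoft has published):

```python
# Hypothetical sketch of lowest-common-denominator multiplatform targeting.
# Platform names and presets are illustrative, not from any real engine or publisher.

PRESETS = {
    "baseline": {"resolution": "dynamic 4K", "extras": []},                  # built around the weakest box
    "boosted":  {"resolution": "dynamic 4K", "extras": ["higher shadows"]},  # stronger console, if there's time
    "pc":       {"resolution": "user choice", "extras": ["whatever your GPU can push"]},
}

def preset_for(platform: str) -> dict:
    """Consoles get a locked preset built around the weakest target; PC scales freely."""
    if platform == "pc":
        return PRESETS["pc"]
    stronger_boxes = {"series_x", "ps5_pro"}  # hypothetical 'boost if time allows' list
    return PRESETS["boosted"] if platform in stronger_boxes else PRESETS["baseline"]

print(preset_for("ps5"))       # weakest next-gen box defines the baseline
print(preset_for("series_x"))  # same game, a few settings bumped
print(preset_for("pc"))        # user pushes settings to match their hardware
```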
Those whole two CPU architectures (of which consoles basically use one) and two GPU architectures are some mighty hard shit to code for, mate. Especially when every single engine already uses them as the base for everything, even console games.
You didn't think this one through, did you?
They also have low-level APIs, highly optimised middleware solutions which give additional performance on console vs PC, and the Velocity Architecture on Series X (I'm sure there'll be something similar in the PS5).
PC has low-level APIs too, mate; this is not the year 2000 anymore. You should google Vulkan, Game Mode and DX12. They are all designed to mirror the console space as much as possible performance-wise, and frankly that's exactly what they do.
Want to see the amazing 750 Ti run AC Odyssey? Here you go. The PS4 version runs at 900p-1080p at anywhere from 20-30fps.
It's also almost like you can overclock the hardware while you're at it, or just upgrade and get even better performance.
Even ignoring the above, Series X has a roughly 17tflop GPU in comparison to the XB1X's 6tflop GPU, so it has almost 3x the compute performance of a GPU that already runs the current iteration of the Assassin's Creed engine at roughly two-thirds of full 2160p (1800p).
Yet a 2080 Ti, which has several times the performance of the Xbox One X, can't remotely run Odyssey at 4K/60fps on ultra settings, even without ray tracing. That's how visual settings work. Do you want them to run the game at low settings straight out of the gate? Or do you want crisp, high-quality detail with less pop-in? You know, the stuff a CPU and GPU actually have to do real work for.
I guess it's all just magical to you.
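For reference, here's the raw pixel math on that "almost 3x" claim. Resolution and frame rate alone eat nearly all of it, before a single setting gets turned up:

```python
# Pixel throughput needed to go from 1800p/30fps to native 2160p/60fps.
# Pure resolution/frame-rate arithmetic; higher settings and RT come on top.

px_1800p = 3200 * 1800   # 1800p frame
px_2160p = 3840 * 2160   # native 4K frame

ratio = (px_2160p * 60) / (px_1800p * 30)
print(f"Required throughput: {ratio:.2f}x")   # -> 2.88x

# "Almost 3x the compute" buys the resolution and frame-rate bump alone,
# with nothing left over for better settings, draw distance, or ray tracing.
```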
I will bet you $100 right now that AC Valhalla will run at native 4k/30fps on Series X if Ray Tracing is enabled (probably reflections and ambient occlusion although I wouldn’t rule out GI too).
Here's a really simplistic game, nowhere near the detail of open-world games, pushing ray tracing at 4K on a 2080 Ti, which is far faster at RT than RDNA2 will ever be.
The result? Anywhere between 15 and 27fps, with huge lag spikes as the memory can't keep up.
Yet the Xbox Series X, with weaker CUs, less bandwidth and less GPU performance, will have no problem rendering complex environments with RT, because magic. Even shitty approximated ray-tracing modes like the one from that Marty McFly guy, which fake half of it, can't keep a stable 30fps at 4K on even an RTX Titan. This is why RT cores are going to balloon in the next series of GPUs.
If it doesn't have any form of Ray Tracing then it will be dynamic 2160p/60fps. Yes, yes, I know the shitty farmed-out port of Odyssey runs like crap on PC so it needs to be brute-forced to 60.
And what do you base "Odyssey runs like crap on PC" on? A 750 Ti with a potato 2-core CPU runs it at a stable 30fps at 1080p. Yeah, seems badly optimized to me... oh wait.
You should probably stop reading drivel from idiots and basing your facts on it. It's a very, very well-optimized game, even if at ultra settings it kills all hardware on the market, because ultra settings on PC are mostly built for future hardware so the game ages better. And if a game doesn't offer that, people will mod it in.
It's clear, however, that you've got absolutely no clue how fucking hard RT is to render and how demanding 4K is for the GPU in those boxes.
This is why PC gamers started laughing at the mention of 4K and RT: they know how unrealistic it is. There is no hardware in the Xbox Series X that could push those frames unless they sacrifice massively on the quality of the game, and the PS5 sure as hell will capitalize on that with high fps / high resolution and better draw distances while Microsoft sits there with slightly "better lighting". Good luck selling that.
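And since "dynamic 2160p" keeps coming up: this is roughly what a dynamic resolution controller does (a minimal generic sketch, not any particular engine's implementation; the step sizes and clamps are made-up values):

```python
# Minimal dynamic-resolution controller: trade pixels for frame time,
# the way console "dynamic 4K" modes do. All constants are illustrative.

TARGET_MS = 16.7                 # frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.7, 1.0  # clamp how far resolution may drop

def update_scale(scale: float, last_frame_ms: float) -> float:
    """Drop resolution when over budget, creep back up when there's headroom."""
    if last_frame_ms > TARGET_MS:
        scale -= 0.05            # over budget: render fewer pixels next frame
    else:
        scale += 0.01            # under budget: claw resolution back slowly
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (18.0, 19.2, 17.5, 15.1, 14.0):   # fake frame times
    scale = update_scale(scale, frame_ms)
    print(f"{frame_ms:4.1f} ms -> {int(3840 * scale)}x{int(2160 * scale)}")
```

That "dynamic 2160p/60fps" is the console admitting it can't actually hold native 4K, which is exactly my point.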
In short, Series X is in no way, shape or form a Windows 10 PC and it’s absolutely stupidly powerful for a console, especially in comparison to the consoles when they launched in 2013 with their low end laptop CPU/GPU combo. Comparing it to a PC is pointless.
U wut mate.
It's practically a PC. It uses all the same parts, it uses the Windows core, and that's exactly what Microsoft wants: backwards compatibility for the future and for their current boxes, and staying in sync with the PC market, which they're aiming at heavily at this point.
The Xbox Series X would be a high-end-performing box if it released right now and focused on 1080p, but it focuses on 4K, and that's the issue here: it's simply weak sauce.
Let me explain.
I can buy a 2080 Ti right now and slam 8K on it, and no game will run for shit. Is the 2080 Ti a weak GPU? No, not really; it rocks everything at 1080p without issues. But here it doesn't do 1080p, it does 8K. Is it a weak 8K card? Yeah, it's complete dog shit. And there you go with the Xbox Series X: welcome to the 4K club, where performance disappears into thin air. Now you understand why PC gamers stated the GPU is all that matters. Luckily MS understood this and slammed in as fast a GPU as they could, even though they'd have been better off going with Nvidia and slamming a 3080 Ti in that box, or a full-blown 90-CU RDNA2 GPU. PS5, however... yeah, good luck with that.
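The pixel counts are the whole story here: each resolution step roughly quadruples the work per frame. The GPU didn't get weaker, the job got bigger:

```python
# Pixels per frame at each target resolution, relative to 1080p.
# Straight arithmetic; nothing vendor-specific about it.

resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP/frame ({px / base:.0f}x the 1080p load)")

# 1080p: 2.1 MP/frame (1x the 1080p load)
# 4K: 8.3 MP/frame (4x the 1080p load)
# 8K: 33.2 MP/frame (16x the 1080p load)
```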
This is why PC gamers laughed at the 2080 Ti when it came out: RT performance was a complete joke. People would even call it a shit card, even while it's the fastest thing on the market besides the Titan.
Some people need to leave their ego at the door and realise consoles are about to be on par with high end gaming rigs where they will offer either Dynamic 4k/60fps or Native 4k/30fps with RT.
The only one with an ego is you. 4K, RT and 60fps is simply not doable with the hardware in those boxes unless you drop the complexity massively, and that's not what a game like AC Odyssey represents. Maybe fighting games, and that's about it.
About your high-end rigs:
Let me tell you something: those consoles aren't out right now, and by the time they are, they'll be dwarfed by PC hardware. Want a few examples?
A PC 1080 Ti sits at ~3.5k CUDA cores at 2GHz and pushes roughly RTX 2080 performance (Xbox Series X territory, if we can believe DF on that front).
The next Nvidia GPU is rumored to sit at ~8k CUDA cores.
The Xbox has ~52 CUs; the 2080 Ti has 68 RT cores, and the next rumored GPU has 256 RT cores (a 2070 Super has 40 RT cores).
AMD's next flagship GPU is rumored to have 120 CUs.
Nvidia's RT is heavily proven to work; the AMD demo showcased at the Xbox reveal performed laughably badly, and the same goes for Minecraft RTX there.
Not to forget, PC is loaded with far faster and far more memory than those boxes have, actual dedicated memory, and it will have DLSS 3.0, while AMD's versions of all this are heavily unproven.
There is no comparison. And on top of that, people on PC won't all move to 4K, which gives them another level of performance headroom. If Cyberpunk runs at barely 60fps at 4K with RT on a 3080 Ti, consoles will probably struggle to run the same quality at 1080p/30fps with RT on.
Once we have a built-from-the-ground-up next-gen AC game then things might change. Until then we have a 17tflop console running a game built around the constraints of a 1.3tflop XB1S...
AC games are already built from the ground up for multicore CPUs. Maybe you didn't notice, but PS4 and Xbox already use lots of cores; Odyssey uses 6 cores and 12 threads on my 9900K all day long and even goes as far as to address another core at times. It's a perfect fit for next-gen consoles, as they'll be sitting with weaker CPUs and 7 cores to address. You act like PC doesn't exist and lives in a vacuum, yet PC is Ubisoft's biggest market.
I didn’t get into the CPU because both consoles have a CPU which is a 500% increase over current gen. Their storage devices are roughly 40x faster on Series X, 100x faster on PS5 too...
And complexity increases, so RIP CPU performance. That's how this stuff works.
They could have 300x faster storage devices; it doesn't change shit. Guess what storage devices are used for: storage, holy shit.
Anyway, you tried though.