
Inside the target specs of the next Xbox 'Project Scarlett,' 'Anaconda', and 'Lockhart' (WindowsCentral.com)

Lockhart is what you need to run 4K Scarlett games at 1080p. In fact it will be slightly more capable at 1080p than Scarlett will be at 4K, as it's 4TF vs 12TF. Settings won't need to be changed in most cases; it will simply be 'you put the game in Scarlett and it's 4K, you put the game in Lockhart and it's 1080p'.
The only thing holding back next gen is 4K, not Lockhart.
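The arithmetic behind that claim can be sanity-checked in a few lines. This is a back-of-envelope sketch using the rumored (unconfirmed) teraflop figures from the article, nothing more:

```python
# Back-of-envelope check of the 4TF-at-1080p vs 12TF-at-4K claim,
# using the rumored (unconfirmed) teraflop figures from the article.

def flops_per_pixel(tflops, width, height, fps=60):
    """Rough GPU budget in FLOPs per pixel per frame."""
    return (tflops * 1e12) / (width * height * fps)

lockhart = flops_per_pixel(4.0, 1920, 1080)    # rumored Lockhart at 1080p
anaconda = flops_per_pixel(12.0, 3840, 2160)   # rumored Anaconda at 4K

# 4K has 4x the pixels of 1080p, but Anaconda has only 3x the compute,
# so Lockhart ends up with about 4/3 the per-pixel budget.
```

That 4/3 ratio is where "slightly more capable at 1080p" comes from.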

What I mean is, say 75% of next gen Xbox owners buy a Lockhart and 25% buy a Scarlett.

All of those assets need to be made for Scarlett, be it physics or 4K or whatever.

That means all that effort is put in for only 25% of the user base to experience, because you have 75% of the user base going for the cheaper alternative.

It seems like a lot of wasted time/money/resources to make something that hardly anyone (potentially) will experience.
 

Stuart360

Member
What I mean is, say 75% of next gen Xbox owners buy a Lockhart and 25% buy a Scarlett.

All of those assets need to be made for Scarlett, be it physics or 4K or whatever.

That means all that effort is put in for only 25% of the user base to experience, because you have 75% of the user base going for the cheaper alternative.

It seems like a lot of wasted time/money/resources to make something that hardly anyone (potentially) will experience.
Well first off, devs almost always design games on the console that's winning, and that will probably be PS5. And I wouldn't read too much into which of the Xboxes will sell the most. The 4K machine will sell the most with the hardcore crowd (do a poll on here asking which Xbox people would buy; it would probably be 90% for the 4K version). I think Lockhart is for the more casual crowd who haven't made the jump to 4K TVs yet, and probably won't until their TVs fail.
Devs already do One X and Pro versions of games, and I would think it would be easier with Lockhart, just in reverse.
 
Last edited:

Romulus

Member
1080p vs 4K. I honestly don't get how so many people can't get their heads around this.

It's not that I can't understand it, lol.
Way to try and manufacture that as some prevalent misunderstanding just to take a cheap shot. Okay. My point is how much did they limit the RAM; that's more of a complaint than the weak GPU, if we're treating the rumors as true.
 

makaveli60

Member
Even if they get rid of the Xbox One S & Xbox One X you still have low-end PCs. MS has no reason to jump out the window & try making games that can only run on a 12TF console that may or may not sell enough to recoup the work it would take to make a game that only plays on that console.
Most of the time, AAA games are designed with the weakest current-gen console they're being released on in mind, then scaled up and down from there. This has been the case for a very long time now. Games won't be designed around some arbitrary low-end PC configs but around that shitty Lockhart. If there were no Lockhart then they would be designed around the weaker next-gen console (for multiplatforms, of course).
 
Last edited:

Gavin Stevens

Formerly 'o'dium'
This reminds me: my wife may be cheating on me, because the homeless man over the road told me there were fewer birds on the lawn the other day.

I should worry, seems reasonable.
 

Gavin Stevens

Formerly 'o'dium'
For me, cards on the table... I just want one console. Maybe a new “pro” a few years down the line, but for now, I just want ONE Xbox, ONE PS5 and ONE Switch. That’s all. And my pc of course and other bits but you know what point I’m making.

If the Lockhart specs are these, then down ports are easy, just stupidly annoying as it's extra work. It's stupid. People don't want this.

The memory talk may be because PS5 is aiming for more memory than Scarlett, but if so, I'm gonna be honest here and say that's a LOT of memory and I would be worried about devs using it correctly right now.

So for me, two Xboxes is a hard NO and a waste of time. But it's not going to make games look any worse or anything; it's just an extra dev cycle, the same as when you release "game" and "game Pro" right now.

I just think two consoles is a colossal fucking waste of time... just make a single, ultra powerful bit of kit and stop fucking around.
 
Last edited:

MaulerX

Member
This reminds me: my wife may be cheating on me, because the homeless man over the road told me there were fewer birds on the lawn the other day.

I should worry, seems reasonable.


Homeless man is lying. She's doing it with a full flock of birds.
 

Max_Po

Banned
RDR2 has a super low mode that makes it scalable, which means it could come to Switch... maybe. It uses GTA V's engine, so it's totally possible.


The Witcher 3 is on Switch, and the Switch is 192 GFLOPS in handheld mode and 384 GFLOPS docked. The PlayStation 3 was 192 GFLOPS. So... it's possible.

Where is your god now?


Nintendo of America ain't gonna be happy that you leaked the RDR2 Switch version, let alone compared it with MAX PC Masta Race settingz
 

psorcerer

Banned
The people not understanding that game engines are scalable are going to be spouting uneducated garbage until they come out and blow minds.

Game engines are not scalable.
Developers just work hard on making them scalable because of the PC/multiplat mess we are in right now.
It's much easier to target a high-end platform with guaranteed performance than to make a scalable "fits all" game.
Essentially it forces developers to use engines that abstract away some hardware differences = under-utilization.
That's why we see exactly the same games on a 2080 Ti and a vanilla PS4, just with 4x the resolution and some "ultra settings" crap.
 

Gavin Stevens

Formerly 'o'dium'
As somebody with their own game engine that’s VERY scalable... eh...?

You can run a game with ray tracing, 200,000-polygon player models and effects all over the place that bring a 2080 Ti to its knees....

...that can also run on a GTX 970.

The point is that any decent game engine worth its salt is modular, and you can enable/disable or adjust all of it on the fly at any point, with zero need to wait on code updates. That’s a standard feature that all engines have, and have had for 20-odd years.

Even for things that are heavily CPU bound, you can lower the precision or disable them entirely.

Why do you think Witcher works on Switch? The game wasn’t even designed for the hardware it was released on; it was designed for high end PC and then scaled DOWN for Xbox/PS4. Yet here we have a very playable and not half bad version on PORTABLE.

I know the Sony juice flows strong in some of your heads but... have a day off.
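For what it's worth, the "one code path, different dials per target" idea being described reduces to a per-platform settings table. A toy sketch; every name and number below is invented for illustration, not taken from any real engine:

```python
# Toy per-platform scalability presets of the kind described above.
# All setting names and values are invented for illustration; real
# engines expose far more knobs, but the principle is the same:
# one engine, different dial positions per target.

PRESETS = {
    "high_end_pc": {"shadow_samples": 64, "texture_mip_bias": 0,
                    "ray_tracing": True,  "physics_substeps": 8},
    "anaconda":    {"shadow_samples": 16, "texture_mip_bias": 0,
                    "ray_tracing": True,  "physics_substeps": 4},
    "lockhart":    {"shadow_samples": 4,  "texture_mip_bias": 1,
                    "ray_tracing": False, "physics_substeps": 4},
    "switch":      {"shadow_samples": 1,  "texture_mip_bias": 2,
                    "ray_tracing": False, "physics_substeps": 2},
}

def settings_for(platform):
    """Look up the dial positions for a target platform."""
    return PRESETS[platform]
```

Same feature set on every row; only the dials move, which is the whole argument.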
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
As somebody with their own game engine that’s VERY scalable... eh...?

You can run a game with ray tracing, 200,000-polygon player models and effects all over the place that bring a 2080 Ti to its knees....

...that can also run on a GTX 970.

The point is that any decent game engine worth its salt is modular, and you can enable/disable or adjust all of it on the fly at any point, with zero need to wait on code updates. That’s a standard feature that all engines have, and have had for 20-odd years.

Even for things that are heavily CPU bound, you can lower the precision or disable them entirely.

Why do you think Witcher works on Switch? The game wasn’t even designed for the hardware it was released on; it was designed for high end PC and then scaled DOWN for Xbox/PS4. Yet here we have a very playable and not half bad version on PORTABLE.

I know the Sony juice flows strong in some of your heads but... have a day off.

Games have been on PCs for a lot longer than 20-odd years. It is a question of what you can tie to game design and what you cannot, what you have time to work on and charge money for or not (Switch Witcher 3, PS4 Witcher 3, PC Witcher 3... mean 3x $59 purchases), and what you have time to research and take advantage of or not, based on limited time and budget.

The day console OSes/SDKs and HW have nothing different from desktop Windows PCs, you will have a stronger point. Right now, I think the console-generations model still makes sense and I want as big a jump as I can get.

Lockhart RAM-wise, with a suitably fast CPU and SSD I/O solution (down to how the latter is exposed to games), the important part is how compute, target resolution and RAM are sized proportionally to the aforementioned factors. Once you move in a direction where RAM is more of a local cache that you constantly stream data into, the absolute size of RAM becomes less and less of an issue (not that you can go down to 50 MB of main RAM and call it a day).

The reason devs may be complaining is that for launch software you tend not to be able to take advantage of such a solution even if there is good tooling and docs around it, so you’d rather brute-force it: easier on Scarlett and PS5 than on Lockhart if the latter has less RAM.
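The "RAM as a local cache you constantly stream into" idea reduces, in toy form, to an eviction policy over a fixed memory budget. A minimal sketch; the class, asset names and sizes are all made up for illustration:

```python
from collections import OrderedDict

# Minimal sketch of "RAM as a streaming cache": a fixed budget of
# resident assets, with the least-recently-used asset evicted whenever
# a new one streams in from the SSD. Names and sizes are illustrative.

class StreamingCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def used_mb(self):
        return sum(self.resident.values())

    def request(self, asset, size_mb):
        """Touch an asset; stream it in, evicting LRU assets to fit."""
        if asset in self.resident:
            self.resident.move_to_end(asset)  # mark as recently used
            return
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            self.resident.popitem(last=False)  # evict least recently used
        self.resident[asset] = size_mb
```

Under this model the budget (absolute RAM size) sets how much is resident at once, not what the game can contain, which is the point being made about Lockhart's smaller pool.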
 
Last edited:

SleepDoctor

Banned
Rumors are just that... rumors. But it is rather entertaining reading the comments here how a rumor is only true when it fits their agenda lol. Lots of armchair devs in here.

We probably won't know a goddamn thing till E3, or maybe sooner if Sony does an unveiling event in Feb like they did for the PS4, if I'm remembering correctly.

Till then everything is to be taken with a grain of salt.
 

VAL0R

Banned
Xbox's strategy going into this gen is brilliant, especially when compared to their flop launch last gen.

1) Have the most powerful console. This is huge, especially for Xbox. This will ensure that Xbox will be the best place to play 3rd party content which will endear it to the hardcore and the press. It will be seen as a comeback and redemption. The good PR they will get from this isn't small beans and will allow Microsoft to push the "Xbox is the world's strongest console brand" narrative to the ends of the earth.

2) Lockhart is for the tens of millions of people who do not have 4K TVs and who do not need to have the settings cranked to max. They DO NOT CARE if the box is gimped in comparison to anaconda. They will happily exchange that cutting edge tech for a cheaper box that allows them quick and easy entry, especially once Xcloud streaming finds its stride. This box is not for you. It's for filthy casuals and/or poor people. It makes a ton of business sense and if MS prices these right, they will fly off shelves.

3) Xcloud streaming and GamePass will be the twin killer weapons in the coming X-pocalypse.

I think Xbox takes back 'Murica this gen.
 

Gavin Stevens

Formerly 'o'dium'
You can run Quake 2 at 9000x9000 resolution with RTX; it will never make it look like Death Stranding.
More than that, even at 720p DS will look miles better.

I know you’re not saying that upscaling is the same as downscaling, because I know you’re not that stupid. Nobody is that stupid.

You can take Death Stranding, lower the resolution, halve the target frame rate to 30 (if it even was 60), reduce the LOD draw distance, lower the resolution and frequency of effects, use lower mips and LOD bias for textures, reduce the main character's LODs from their primary to secondary and tertiary, and blah blah blah....

The point is, even the almighty best game ever created, all hail Kojima, postman simulator could run on a Switch; it’s not doing anything incredible at all.

If the assets are there, it’s piss easy to reduce fidelity and complexity of instruction. It’s been commonplace for DECADES.

But if the assets are not there, you can’t just magically upscale everything. You can lose detail; you can’t gain it.

That’s why all this talk about how devs are unhappy with Lockhart's memory is bullshit. It has zero bearing on anything unless that’s the primary console you are making something for. It would be like making a game for PC now and being pissed because a GTX 970 hasn’t got enough oomph to run it. No, it does, you just have to make sacrifices. That’s the whole point of the console, and if you’re not making them, then you’ve fucked up.

And this is from somebody (me) who thinks two consoles like this is a waste of time.
 
Last edited:

psorcerer

Banned
I know you’re not saying that upscaling is the same as downscaling, because I know you’re not that stupid. Nobody is that stupid.

You can take Death Stranding, lower the resolution, halve the target frame rate to 30 (if it even was 60), reduce the LOD draw distance, lower the resolution and frequency of effects, use lower mips and LOD bias for textures, reduce the main character's LODs from their primary to secondary and tertiary, and blah blah blah....

The point is, even the almighty best game ever created, all hail Kojima, postman simulator could run on a Switch; it’s not doing anything incredible at all.

If the assets are there, it’s piss easy to reduce fidelity and complexity of instruction. It’s been commonplace for DECADES.

But if the assets are not there, you can’t just magically upscale everything. You can lose detail; you can’t gain it.

In my experience downscaling has never, ever worked.
What works is: make the game for the lowest platform and then just upscale textures/resolution/other crap to make it look "better" on a more capable one.
Do you have any example of a successfully downscaled game?
Witcher won't count, because obviously it was made for PC, and a pretty low-spec one originally.
 

HeresJohnny

Member
This makes a lot of sense and is my biggest "wishlist" item for next-gen after full backwards compatibility.

I picked up Days Gone over Black Friday - that fucker took over an hour to install to my PS4 then had to download a 30GB update which took over 30 minutes to apply. By the time it was done installing / updating, I had already been playing a different game on my PC and had lost interest entirely.
Yeah, they've really gotten egregious about wasting people's time. To allot gaming time you have to account for all this extraneous bullshit, and for those without much time, that's damn near a dealbreaker.
 

Gavin Stevens

Formerly 'o'dium'
In my experience downscaling never ever worked.
What works is: make a game for the lowest platform and then just upscale textures/resolution/other crap to make it look "better" on a more capable one.
Do you have any example of successfully down scaled game?
Witcher won't count, because obviously it was made for PC and pretty low spec one originally.

Any example? Literally any game that was ported to console that had a PC release, mostly during the 360/PS3 era. Every one of them had compressed textures, lower quality assets and stripped detail.

If you have a 2048x2048 uncompressed texture, then you can use it at full res. You can do things like apply detail maps and whatnot for the “look” of a higher fidelity asset, but it will never be one unless you use higher-res data.

If you want it to run on a lower-memory rig with the same data, you call the compressed asset into memory, or you call the compressed asset and then use a lower mip. Very standard stuff.
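The mip trick works because each successive mip level quarters the texel count. A quick sketch of the memory math, assuming an uncompressed RGBA8 texture (block compression would shrink every figure by the same factor):

```python
# Memory for one mip level of a square RGBA8 (4 bytes/texel) texture.
# Each mip level halves width and height, so memory drops 4x per level.

def mip_size_mb(resolution, mip_level=0, bytes_per_texel=4):
    """Size in MB of a single mip level of a square texture."""
    res = resolution >> mip_level  # each mip halves each dimension
    return (res * res * bytes_per_texel) / (1024 * 1024)

full = mip_size_mb(2048)      # mip 0 of a 2048x2048 texture: 16 MB
lower = mip_size_mb(2048, 1)  # mip 1 (1024x1024): 4 MB, a quarter
```

So sampling one mip down on the smaller console frees 75% of that texture's residency for free, with the same shipped asset.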

However, if you have a 512x512 asset, there is nothing you can do to magically make it appear higher fidelity. Not a damn thing; run your game at 16K ultrawide resolution with 64xAA and 128xAF... you will never make it better.

That’s my point. You make the game for the target spec and then DOWN port everything. You don’t make it for the lowest denominator and then UP port.

Now there ARE games out there that are made to a lower target spec; that’s common and usually happens with games made for console that get ported to PC. But the usual way you develop is with your highest performing machine, and then lower until it fits the target spec on others. But you don’t go all in and say “well, this runs on the original Xbox One, guess that’s fine”. That would be insane.

Look at GTAV “now” on PC. The game was made half a decade ago, but it still maxes out modern rigs. That’s because the consoles use lower quality settings and assets.
 
Game engines are not scalable.
Developers just work hard on making them scalable because of the PC/multiplat mess we are in right now.
It's much easier to target a high-end platform with guaranteed performance than to make a scalable "fits all" game.
Essentially it forces developers to use engines that abstract away some hardware differences = under-utilization.
That's why we see exactly the same games on a 2080 Ti and a vanilla PS4, just with 4x the resolution and some "ultra settings" crap.
So game engines aren’t scalable, but they are scalable because they have to be? Do you read what you write? PlayStation is going to PC, streaming, and console next gen. Game engines are scalable; otherwise you wouldn’t be able to choose between performance and resolution on PS4 Pro titles.
 

Gavin Stevens

Formerly 'o'dium'
I’ll give a very, very simple example from our engine - shadow map filtering resolution.

This is something that greatly impacts performance, as every single object in the game world casts its own shadow and every light is a shadow caster (something still extremely uncommon, even now).

Anyway, consoles now would likely use a filtering setting equivalent to 4 samples or so, not a lot. This isn’t resolution, as that’s a totally different ballpark and is based on lots of factors, like cube-mapped shadow rendering and the like (6x the performance cost for a point light). Anyway... a single setting, right? So I can turn the filtering up or down and it will greatly reduce or increase memory consumption and performance. Yet our engine goes all the way up to 64 samples, something that brings modern cards down.
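For readers who haven't met it, the filtering being dialed here is PCF (percentage-closer filtering): average several depth comparisons around each shadow-map tap, so cost scales directly with the sample count. A pure-Python toy over a 2D list, not shader code; a 64-sample kernel would be `samples_per_axis=8`:

```python
# Toy percentage-closer filtering (PCF): average N depth-comparison
# taps around a shadow-map texel. The only point illustrated is that
# cost grows with samples_per_axis**2, making it a natural quality dial.

def pcf_shadow(shadow_map, x, y, depth, samples_per_axis=2):
    """Fraction of nearby taps the fragment passes (0=shadowed, 1=lit)."""
    passed = total = 0
    r = samples_per_axis // 2
    for dy in range(-r, -r + samples_per_axis):
        for dx in range(-r, -r + samples_per_axis):
            # clamp taps to the map edges
            sx = min(max(x + dx, 0), len(shadow_map[0]) - 1)
            sy = min(max(y + dy, 0), len(shadow_map) - 1)
            passed += 1 if depth <= shadow_map[sy][sx] else 0
            total += 1
    return passed / total
```

Going from 4 taps (`samples_per_axis=2`) to 64 taps (`samples_per_axis=8`) is a 16x increase in shadow-map reads per fragment, which is why the top end of that dial brings cards down.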

I mean, it’s just common sense, it’s a setting.

But a recent game engine? They are modular. Everything is this way. Even the precision of physics objects and their interactions can be lowered in real time.

You can ship a game with a 4096x4096 texture on a wall, and have it render its 128x128 mip if you want. The only downside is that this increases load times... but oh look, the new consoles have SSDs.

Do you see where I’m going? This is people crying over the building being on fire, when the fire is a static printed out picture of a fireplace stuck to the freezer.
 

CeeJay

Member
Native 4k is a waste of resources that would have been better spent on making better looking/performing games. Everyone has been saying that since day fucking one.

Anaconda being designed to run Lockhart games at four times the resolution is the anchor everyone has been talking about. MS will drive its studios and third parties to master games at 1080p for the lowest common denominator.

Could you imagine what a capable studio could render on a 12TF system at 1080p, or even 1440p with a little MSAA?

Well, keep on imagining, because if MS gets their way they just kneecapped that possibility for an entire generation (except for Sony exclusives, if the specs are comparable).
Are you really suggesting that any developers (especially Sony first parties) are going to target a next-gen game at 1080p? Come on now with this strawman...

"Hey guys look at our new cutting edge, shiny next gen game that runs at a much, much lower resolution than the previous game we shipped 3 years ago!"
 

pawel86ck

Banned
I think 12TF RDNA 2 is really possible because MS wants to launch 2 different SKUs, a cheaper Lockhart and an expensive Anaconda (maybe even at $599). Yes, enthusiasts will want to pay a premium for Anaconda, but not many casuals will buy the more expensive console. If Sony launches only one console they have to think about casuals and enthusiasts at the same time, so a "weaker" 10TF makes sense, and even then their console should cost $500.

But can you guys imagine how stunning games will look on next-gen consoles? 12TF RDNA 2 performance-wise should be around 16.8TF Vega (and maybe even more, because RDNA 2 should be more efficient than RDNA 1), so even compared to the Xbox One X the difference is big. I think native 4K will be the most popular resolution on Anaconda, and probably even 4K 60fps with checkerboarding.
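The 16.8TF figure is just the rumored 12TF multiplied by an assumed ~1.4x performance-per-FLOP advantage for RDNA over GCN/Vega (roughly in line with AMD's own RDNA 1 marketing claim; the exact factor for RDNA 2 is this poster's speculation, not a spec):

```python
# The post's "12TF RDNA 2 ~ 16.8TF Vega" equivalence, made explicit.
# The 1.4x perf-per-FLOP factor is an assumption, not a measured number.

def vega_equivalent_tflops(rdna_tflops, perf_per_flop=1.4):
    """Scale raw RDNA TFLOPS by an assumed efficiency factor vs Vega."""
    return rdna_tflops * perf_per_flop
```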
 
Last edited:

psorcerer

Banned
Any example? Literally any game that was ported to console that had a PC release, mostly during the 360/PS3 era. Every one of them had compressed textures, lower quality assets and stripped detail.

If you have a 2048x2048 uncompressed texture, then you can use it at full res. You can do things like apply detail maps and whatnot for the “look” of a higher fidelity asset, but it will never be one unless you use higher-res data.

If you want it to run on a lower-memory rig with the same data, you call the compressed asset into memory, or you call the compressed asset and then use a lower mip. Very standard stuff.

However, if you have a 512x512 asset, there is nothing you can do to magically make it appear higher fidelity. Not a damn thing; run your game at 16K ultrawide resolution with 64xAA and 128xAF... you will never make it better.

That’s my point. You make the game for the target spec and then DOWN port everything. You don’t make it for the lowest denominator and then UP port.

Now there ARE games out there that are made to a lower target spec; that’s common and usually happens with games made for console that get ported to PC. But the usual way you develop is with your highest performing machine, and then lower until it fits the target spec on others. But you don’t go all in and say “well, this runs on the original Xbox One, guess that’s fine”. That would be insane.

Look at GTAV “now” on PC. The game was made half a decade ago, but it still maxes out modern rigs. That’s because the consoles use lower quality settings and assets.

Compressed textures are nothing. Just a memory saver.
And you will have high quality assets in the art pipeline anyway, so essentially having high-quality textures costs nothing.
So what PC do you target? A 2080 Ti? And if you have a game that's 3 years in development, what PC card do you target? And what if it's not enough by the time you get there?
GTAV looks no different on PC than on console, except for some quantities. It doesn't have better lighting, shadows, materials, etc., just a little less aliasing.
 

VAL0R

Banned
I think 12TF RDNA 2 (performance-wise it should be around 16.8TF Vega) is really possible because MS wants to launch 2 different SKUs, a cheaper Lockhart and an expensive Anaconda (maybe even at $599). Yes, enthusiasts will want to pay a premium for Anaconda, but not many casuals will buy the more expensive console. If Sony launches only one console they have to think about casuals and enthusiasts at the same time, so "only" 10TF makes sense.
And this is exactly why X has the winning strategy. Give the hardcore, who are willing to pay a premium, a "monster eater" console. And give the casuals who are price conscious a great box that plays the same games with lower settings and is far more affordable. Both hardcore and casuals are happy.

With Sony, the casuals may see the PS5 as too expensive when priced next to Lockhart, and the hardcore may see it as too weak compared to Anaconda.
 

psorcerer

Banned
What is your experience with game development?

No comment.

So game engines aren’t scalable, but they are scalable because they have to be? Do you read what you write? PlayStation is going to PC, streaming, and console next gen. Game engines are scalable; otherwise you wouldn’t be able to choose between performance and resolution on PS4 Pro titles.

Making a scalable game engine has a cost: worse performance, poor hardware utilization.
I don't think anybody wants to invest money in scalability if they could get away without it.


Anyway, consoles now would likely use a filtering setting equivalent to 4 samples or so, not a lot. This isn’t resolution, as that’s a totally different ballpark and is based on lots of factors, like cube-mapped shadow rendering and the like (6x the performance cost for a point light). Anyway... a single setting, right? So I can turn the filtering up or down and it will greatly reduce or increase memory consumption and performance. Yet our engine goes all the way up to 64 samples, something that brings modern cards down.

So you employ a suboptimal shadowing scheme (PCF is what, 15 years old?), just for the sake of scalability.
Thanks for proving my point. :)

Yet our engine goes all the way up to 64 samples, something that brings modern cards down.

And then Nvidia will "fix" it in the next drivers by not calculating all 64 samples, and claim "improved FPS on the GTX 3080 Ti SUPER!!!"
 
Last edited:

VAL0R

Banned
Internet: Breaking: Anaconda has 12 teraflops of powa!
Cerny fanclub: But NO GAMES!
 
Making a scalable game engine has a cost: worse performance, poor hardware utilization.
I don't think anybody wants to invest money in scalability if they could get away without it.

Well, considering that games are developed on PC, playable on PC, scaled down to the console, then optimized, your logic is flawed. This is true for 100% of the games out there, which is why the Decima engine is being brought over to PC so easily. If it were a PS4-only engine, why is it so easy to port Death Stranding and possibly Horizon Zero Dawn? Chances are, they will look 2x better on PC than on the PS4 Pro.

Let's also consider: both consoles use x86 processors, which is what a PC uses, so optimization is that much more simplified for developers.

Thinking about Xbox, the difference between the architecture of the 4TF console and the 12TF console is minimal. They will both use the same APIs and they will both optimize very similarly, the major difference being that some sliders will be lower than others. It's not rocket science.
 
Last edited:

Gavin Stevens

Formerly 'o'dium'
Compressed textures is nothing. Just a memory saver.
And you will have high quality assets in the art pipeline anyway, so essentially having high-quality textures costs nothing.
So what PC do you target? 2080Ti? And if you have a game that's 3 years in development, what PC card will you target? And what if it will be not enough at the time you get there?
GTAV looks no different on PC and on console. Except for some quantities. It doesn't have better lighting, shadows, materials, etc. just little less aliasing.

Just "a memory saver", yet here we are where people are moaning because the lower end console... has less memory? Do you not see the stupidity of your point here? That single line tells me not. It saves memory and allows the same assets to be rendered on lower end hardware with no alteration; ergo, it will work fine on Lockhart.

As for what you target, you target whatever the hell you set out to target. There's no set rule. But I know many PC developers, some retail and some indie, and not a single one of them would target a middle-of-the-road spec. Not one.

As for GTAV looking no different on PC compared to console... Wow... That's about as ignorant as you can be. I mean, for one it's wrong, but in other news... if you increase the resolution of assets, increase draw distance, increase LOD fade, add more peds, and increase pretty much every other setting... I'm sorry, but what else do you expect to happen?

Everything you are saying shows me you have no idea how any of this works.

Making a scalable game engine has a cost: worse performance, poor hardware utilization.
I don't think anybody wants to invest money in scalability if they could get away without it.

So you employ a suboptimal shadowing scheme (PCF is what, 15 years old?), just for the sake of scalability.
Thanks for proving my point. :)

Hey, guess what else is old hat: polygons. Let's just get rid of those, too. Oh wait... let me just check to see if our engine looks 15 years old... https://overdose-game.com/media/screenshots/od1_media_4.jpg Nope, looks at least 30.

You JUST said that a scalable game engine is both more expensive and performs worse. How can an engine specifically designed to be scalable perform worse than an engine with a set spec? EVERY single game on PC right now is scalable. Every one of them. Even the really piss poor ports like Dark Souls 1 and Halo Reach have SOME form of scaling in them or their config files.

At this point I'm not even sure if I'm talking to a troll or what...
 

psorcerer

Banned
Well, considering that games are developed on PC, playable on PC, scaled down to the console, then optimized, your logic is flawed. This is true for 100% of the games out there. Let's also consider: both consoles use x86 processors, which is what a PC uses, so optimization is that much more simplified for developers.

Thinking about Xbox, the difference between the architecture of the 4TF console and the 12TF console is minimal. They will both use the same APIs and they will both optimize very similarly, the major difference being that some sliders will be lower than others. It's not rocket science.

1. PC is not a platform.
2. There are some platforms that could be approximated as "PC": DirectX 12 and Vulkan. They are different from Xbox DirectX and Sony GNM. Some of them are closer, some are vastly different.
3. You cannot program for specific hardware in any of these except Sony's (you can do some low-level stuff on Xbox, though). That's why Sony's hardware utilization is so much better. And that's why Sony's exclusives look so much better.
4. Obviously you can brute-force some of these optimizations by employing less hardware-specific things on better hardware. But that would not magically utilize it better.

My point was that having a poor-performance hardware baseline is worse than having a high-performance one. That's all.
 

psorcerer

Banned
Just "a memory saver", yet here we are where people are moaning because the lower end console... has less memory?

So you're going to cut only in the places where it's easy to cut: textures, render targets. Which will result in over-cutting on low-memory hardware.

As for what you target, you target whatever the hell you set out to target.

What will you target, right now? Just a small thought experiment.

If you increase the resolution of assets, increase draw distance, increase LOD fade, add more peds, and increase pretty much every other setting...

It will not change anything about the game; it will only reduce aliasing.
If your water was rendered meh, it will stay meh.
If you didn't use reflective shadow maps, you will still have no reflective shadow maps.
If you haven't used tessellated progressive meshes, you will stay that way.
Etc. etc.
 

Gavin Stevens

Formerly 'o'dium'
This will be my last post because I'm a little bit taken back by how silly this is, but anyway...

So you're going to cut only the places where it's easy to cut: textures, render targets. Which will result in over-cutting for a low memory hardware.

No. The point is, if you have a memory limitation coupled with a lower spec GPU but NOT a CPU limitation, you cut back on memory intensive things. Again, common sense.


What you will target, right now? Just a small thought experiment.

You're asking a stupid question that will result in a stupid answer. So I'll ask one. What will you be thinking Next Thursday? You haven't added specifics such as what type of game, what camera system or provided any sort of design doc. Lets humour you and say we are making a third person action game that has a design like COD. Well, you would target your base line, right now, at the 2060 level as a max, with a few additional extras for future proofing that will help you down the line. These may be additional memory consumption devices, or higher quality filtering/effects. The point is, if you target what's middle of the road NOW, by the time the game ships you will be old hat. If you target too high as your middle ground, likely nobody will run it.

It will not change anything about the game; it will only reduce aliasing.
If your water was rendered meh, it will stay meh.
If you didn't use reflective shadow maps, you will still have no reflective shadow maps.
If you haven't used tessellated progressive meshes, you will stay that way.
Etc. etc.

You're essentially arguing that the PC port of GTAV only includes basic additions for extra settings, and that it looks no better than the PS3/360 releases. You know that, right? Because the game was made for the PS3/360. So please, tell me you can't see any difference between my 2080 Ti/Ryzen/32 GB rig running the game and those consoles. Please. I'll even make it easier for you and cap the frame rate to 30 and the resolution to the same as those consoles, but everything else maxed. Please.
 
1. PC is not a platform.
2. There are some platforms that could be approximated as "PC": DirectX 12 and Vulkan. They are different from Xbox DirectX and Sony GNM. Some of them are closer, some of them are vastly different.
3. You cannot program for specific hardware in any of these except Sony's (you can do some low-level stuff on Xbox, though). That's why Sony's hardware utilization is so much better, and that's why Sony's exclusives look so much better.
4. Obviously you can brute-force some of these optimizations by employing less hardware-specific things on better hardware. But that would not magically utilize it better.

My point was that having a poor-performance hardware baseline is worse than having a high-performance one. That's all.
  • Did I say PC was a platform? No I did not, I said 100% of games are developed on PC then optimized on consoles. Steam, Uplay, Origin, GOG, etc. are platforms.

  • DirectX, OpenGL, and Vulkan are APIs. Microsoft typically optimizes the design of the Xbox around DirectX to make creating games easier for developers; since both the PC and Xbox utilize DirectX in some fashion, optimization is that much easier. I'm sure Sony is no different.

  • Sony uses the OpenGL API, which was created for PC originally. Optimizing from PC to PlayStation is the same as optimizing from PC to Xbox. Sony's and Microsoft's optimization is the same, which is why games look better and have better assets on the Xbox One X, just like how PS4 games look better on the PS4 Pro. The difference is that I've never downloaded a 4K asset pack on the PS4 Pro, as it only offers better resolution or better performance.

  • No shit. I've been talking about optimization the entire time. Read my entire post.
TFLOPS are derived from the calculated GPU power output; they say nothing about the rest of the system. So sure, the graphics may not be the same, but that's it. Sony and Microsoft are going for 4K. To run next-gen games at 1080p you need 4x less power, which also requires less RAM because you're not pushing 4K assets such as textures, models and meshes.
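For what it's worth, the "4x" figure is just raw pixel math, sketched below. It assumes rendering cost scales linearly with pixel count, which real renderers rarely achieve, as other posts in this thread point out:

```python
# Back-of-envelope pixel math behind the "4x less power" claim.
# Assumes cost scales linearly with pixel count, which real
# engines rarely do (many passes have fixed, resolution-independent cost).

res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

ratio = res_4k / res_1080p
print(f"4K pushes {ratio:.0f}x the pixels of 1080p")  # 4x
```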
 
Last edited:

Gavin Stevens

Formerly 'o'dium'
To run next-gen games at 1080p you need 4x less power, which also requires less RAM because you're not pushing 4K assets such as textures, models and meshes.

While I agree with your other points, this isn't true. You may not get the most out of the assets at 1080p, but you absolutely can run any quality asset at any screen resolution; the two are not mutually exclusive. You do need less power to run 1080p over 4K, of course, that's just obvious. But you can use assets of any quality and resolution even at 240p if you want.
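To put the "any asset at any resolution" point in concrete terms: the GPU picks a mip level from how many screen pixels a texture covers, so a high-res asset works at any output resolution, lower outputs just sample smaller mips of the same texture. A rough sketch (the formula is a simplification of real hardware LOD selection):

```python
import math

# Simplified sketch of mip selection: the mip level follows from the
# ratio of source texels to covered screen pixels, so the same
# high-res asset serves any output resolution.

def mip_level(texture_size, pixels_covered):
    """Approximate mip chosen when a texture of texture_size texels
    spans pixels_covered screen pixels (0 = full-resolution mip)."""
    if pixels_covered >= texture_size:
        return 0  # magnification: sample the top mip
    return math.log2(texture_size / pixels_covered)

# Same 4096-texel texture, different screen coverage:
print(mip_level(4096, 2048))  # 1.0 -- high coverage at 4K-ish output
print(mip_level(4096, 512))   # 3.0 -- the asset still works, just lower mips
```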
 

Journey

Banned
4 TF is a pathetic minimum spec for an entry-level next-gen Xbox.

Then don't buy it 🤷‍♂️

The 12 TF of the high-end Xbox won't matter; devs will never be able to fully use it, because it'll just be spent rendering the same games (built for the 4 TF console) at 4K with brute force.

So PS5 and PC multiplatform games are screwed, guys. You heard it first from the armchair developer; he's never wrong.

PS5 specs, with 10+ TF (or close to Anaconda) will be fully utilized, devs can always count on PlayStation base specs.

Yeah, the next Call of Duty will look ridiculously good on PS5 and PC, but because of Lockhart it will look crappy on Scarlett... makes sense.
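For what the raw numbers in that "brute force" claim are worth (and TFLOPS comparisons across scenes are crude at best), a quick sketch:

```python
# Sanity check on the "brute force 4K" claim: 1080p -> 4K is a 4x
# pixel increase, but 4 TF -> 12 TF is only a 3x compute increase,
# so a pure resolution bump already consumes most of the extra power.
# TFLOPS are a crude proxy; real scaling depends on the workload.

pixel_ratio = (3840 * 2160) / (1920 * 1080)   # 4.0
tflop_ratio = 12 / 4                           # 3.0

print(f"pixels: {pixel_ratio:.0f}x, compute: {tflop_ratio:.0f}x")
```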
 

Gavin Stevens

Formerly 'o'dium'
Oh Fak! now that is some good use of Project XCloud 👍

Yeah, THAT is a smart thing to do, actually... I really like that idea.

I also wish that MS would just let users download games from the store to HDD right away, as soon as they are available for pre-load, even without a disc or licence. That way us retail disc people can still pop the disc in and play right away most times, without a painful download.
 
While I agree with your other points, this isn't true. You may not get the most out of the assets at 1080p, but you absolutely can run any quality asset at any screen resolution; the two are not mutually exclusive. You do need less power to run 1080p over 4K, of course, that's just obvious. But you can use assets of any quality and resolution even at 240p if you want.
I agree with you, but when optimizing for 1080p I'm sure Microsoft will most likely scale back on texture resolution among other things, which is why I believe they would have less RAM. But you are not wrong. I'm taking a stab at the RAM thing and why less would be required. It could totally be due to lower resolution alone.
 

psorcerer

Banned
  • Did I say PC was a platform? No I did not, I said 100% of games are developed on PC then optimized on consoles. Steam, Uplay, Origin, GOG, etc. are platforms.

  • DirectX, OpenGL, and Vulkan are APIs. Microsoft typically optimizes the design of the Xbox around DirectX to make creating games easier for developers; since both the PC and Xbox utilize DirectX in some fashion, optimization is that much easier. I'm sure Sony is no different.

  • Sony uses the OpenGL API, which was created for PC originally. Optimizing from PC to PlayStation is the same as optimizing from PC to Xbox. Sony's and Microsoft's optimization is the same, which is why games look better and have better assets on the Xbox One X, just like how PS4 games look better on the PS4 Pro. The difference is that I've never downloaded a 4K asset pack on the PS4 Pro, as it only offers better resolution or better performance.

  • No shit. I've been talking about optimization the entire time. Read my entire post.
TFLOPS are derived from the calculated GPU power output; they say nothing about the rest of the system. So sure, the graphics may not be the same, but that's it. Sony and Microsoft are going for 4K. To run next-gen games at 1080p you need 4x less power, which also requires less RAM because you're not pushing 4K assets such as textures, models and meshes.

1. "Games are developed on PC" is irrelevant then. PC as a development tool is not the same as a target platform.
2. When you can't access the hardware directly, only through an API, the API becomes your platform.
3. Sony doesn't use OpenGL. Its shader language is similar, but that's about it.
4. You need a much smaller than 4x performance increase to target 4K from 1080p. A lot of low-frequency render targets may stay the same resolution; vertex buffers will probably stay the same; low-res textures (billboards, particles) will probably stay the same. Etc. It all depends on how you do it. The PS4 Pro at ~4K runs HZD much better than the PS4 at 1080p, even though the GPU power difference is only ~2x.
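A toy calculation of point 4 (all numbers below are made up for illustration, not measured from any real game): if only some buffers scale with output resolution while textures, vertex data and low-frequency targets stay fixed, total memory grows far less than 4x.

```python
# Illustrative sketch: only part of GPU memory scales with output
# resolution. The fixed figure below is invented for the example.

def frame_memory_mb(width, height, bytes_per_pixel=4, gbuffer_targets=4):
    # Resolution-dependent portion: G-buffer, depth, full-res targets.
    scaled = width * height * bytes_per_pixel * gbuffer_targets / 2**20
    # Resolution-independent portion: textures, vertex buffers,
    # particles, low-frequency render targets (illustrative figure).
    fixed = 2500
    return scaled + fixed

mem_1080p = frame_memory_mb(1920, 1080)
mem_4k    = frame_memory_mb(3840, 2160)
print(f"{mem_4k / mem_1080p:.2f}x total memory at 4K, not 4x")
```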
 

Gavin Stevens

Formerly 'o'dium'
I agree with you, but when optimizing for 1080p I'm sure Microsoft will most likely scale back on texture resolution among other things, which is why I believe they would have less RAM. But you are not wrong. I'm taking a stab at the RAM thing and why less would be required. It could totally be due to lower resolution alone.

Absolutely. There is nothing I'm seeing here that tells me that won't be the case. I see next-gen Scarlett games running and looking amazing on Scarlett, with the Lockhart version simply reducing a few things here and there, maybe running with lower mips, but at 1080p. There's ZERO that says otherwise to me right now, and that's a pretty decent thing. It would fit lower memory constraints too.

Sure, I would MUCH rather they just focus on one console and use xCloud for running on older consoles. But... Who knows with all that.

But Lockhart holding things back? Nah. Don't see it, at all. It's no different than now, where we have a PS4/Pro and a One/X version. No different.
 

Gavin Stevens

Formerly 'o'dium'
4. You need a much smaller than 4x performance increase to target 4K from 1080p. A lot of low-frequency render targets may stay the same resolution; vertex buffers will probably stay the same; low-res textures (billboards, particles) will probably stay the same. Etc. It all depends on how you do it. The PS4 Pro at ~4K runs HZD much better than the PS4 at 1080p, even though the GPU power difference is only ~2x.

This isn't accurate. You can't just dictate what sort of performance increase you will get from a resolution drop; not so easily. It all depends on the specifics of the engine. As for effects and things, these are all seriously fillrate-hungry areas, where resolution has a MASSIVE impact. Case in point: pretty much EVERY game on X right now. Most of them run grand, but the places where they start to drop frames more than the Pro? Effects-heavy scenes. Fillrate is and always has been the biggest performance killer, and it likely will be for a long time.
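Rough numbers on why effects-heavy scenes are where fillrate bites (the overdraw figures are illustrative, not measured from any title): every layer of transparent overdraw multiplies the pixels shaded, and that multiplier itself scales with resolution.

```python
# Illustrative fillrate sketch: transparent effects layers multiply
# the number of pixels shaded per frame. Overdraw values are invented.

def pixels_shaded_per_frame(width, height, overdraw):
    return width * height * overdraw

calm_4k    = pixels_shaded_per_frame(3840, 2160, overdraw=1.5)
effects_4k = pixels_shaded_per_frame(3840, 2160, overdraw=6.0)

print(f"effects-heavy scene shades {effects_4k / calm_4k:.0f}x the pixels")
```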
 

longdi

Banned
The more that I think about Lockheart, the more I think it actually makes sense. It sounds like a pain in the ass for devs, but if they can deliver a 1080p "next gen" box for $299 that plays the same games at lower settings I can see a whole lot of those filthy casuals jumping on the bandwagon pretty quickly - and MS absolutely needs this thing to perform well out the gate. Would I buy it? Fuck no. But I know a lot of people that would.

This doesn't make sense.
Two models, a pro and a non-pro, at launch may make sense for phones or tablets.
But for consoles, we should go all out at launch, and have a pro upgraded model 3 years later.
MS is acting "too" smart with this choice.
 
1. "Games are developed on PC" is irrelevant then. PC as a development tool is not the same as a target platform.
2. When you can't access the hardware directly, only through an API, the API becomes your platform.
3. Sony doesn't use OpenGL. Its shader language is similar, but that's about it.
4. You need a much smaller than 4x performance increase to target 4K from 1080p. A lot of low-frequency render targets may stay the same resolution; vertex buffers will probably stay the same; low-res textures (billboards, particles) will probably stay the same. Etc. It all depends on how you do it. The PS4 Pro at ~4K runs HZD much better than the PS4 at 1080p, even though the GPU power difference is only ~2x.
  • Games run on PC before they run on the dev kit

  • An API as a platform, that is a thing. But in regards to Microsoft, the DirectX API is on a plethora of devices at this point.

  • Sorry, I read that the GPU was capable of OpenGL, but you're right, it's GNM and GNMX.

  • HZD does not run at native 4K:

    "The team did give native 1500p rendering a shot, but the results were mixed at best. And after plenty of other failed experiments, they finally decided to use a custom implementation of checkerboard rendering, which involves rendering half of the pixels for the 2160p frame, while the other half are pulled from the previous frame. "

    The game looks great and is one of my favorite games I've ever played. But it's not native 4K; it uses checkerboard rendering to reach 4K, drawing half the pixels of each 2160p frame and borrowing the rest from the previous frame, sort of like 2160i but with a better implementation. If the system were 4x more powerful, it could have rendered native 4K.
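The pixel counts behind that quote, for reference:

```python
# Pixel counts for checkerboard rendering as described in the quote:
# half of each 2160p frame is rendered, the other half reconstructed
# from the previous frame.

full_4k      = 3840 * 2160          # 8,294,400 pixels
checkerboard = full_4k // 2         # 4,147,200 rendered per frame
native_1080p = 1920 * 1080          # 2,073,600 pixels

# Checkerboard renders twice the pixels of 1080p, half of native 4K.
print(checkerboard / native_1080p)  # 2.0
print(checkerboard / full_4k)       # 0.5
```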
 
Last edited: