
DF Direct: PlayStation 5 / Unreal Engine 5 Reaction - Now This Is Next-Gen!

eot

Banned
According to this chart which is based on 20/20 vision:


You need to be within about 7 feet of a 55” display before you even BEGIN to notice 4K over 1080p, and about 4 feet away to see the full benefit. (The full benefit of 1440p would fall between those figures.)
The display resolution (which is the only thing those charts are relevant for) and the internal rendering resolution aren't the same thing, and you'll have much better IQ if you render at a higher resolution, even if you then downscale it to a lower display resolution.
 
That graph has the years 2006-2012 on it, and 4K TVs only started releasing in the last year shown, and they weren't available at 55 inches. It's all theory as far as I'm concerned. I don't think any real-world tests were done.

Have you done real-world testing yourself? You own a 65-inch 4K TV. Have you hooked a PC up to it and compared games at 1080p, 1440p and 4K?

That graph isn’t observations about TVs. It’s just mathematically extrapolated from a single observation that is probably decades old. It’s generally accepted that people can resolve detail at about 60 pixels per degree. It’s objective information that’s the product of actual research (not consumer product research but scientific/medical). That chart just takes that figure and uses math to calculate how much detail you can see from your TV.

Your observations are subjective and are probably the result of some other factors. Earlier in this thread there was a brief discussion about how bad scaling can cause an image to become significantly softer. My guess is that this explains some of your personal experience.
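Those distances follow directly from the 60 pixels-per-degree figure; here's a quick back-of-the-envelope check in Python (a sketch assuming a 16:9 panel and treating 60 PPD as the acuity cutoff):

```python
import math

def max_useful_distance_in(diag_in, horiz_px, ppd=60):
    """Farthest viewing distance (inches) at which the panel still
    delivers ppd pixels per degree, i.e. the full-benefit distance."""
    width = diag_in * 16 / math.hypot(16, 9)   # 16:9 panel width from diagonal
    pitch = width / horiz_px                   # inches per pixel
    # one pixel must subtend at least 1/ppd degrees to be resolvable
    return pitch / math.tan(math.radians(1 / ppd))

for label, px in [("1080p", 1920), ("4K", 3840)]:
    ft = max_useful_distance_in(55, px) / 12
    print(f'{label} on a 55" set: full benefit within ~{ft:.1f} ft')
```

This lands close to the roughly 7 ft (1080p) and 4 ft (4K) figures quoted from the chart for a 55-inch set.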
 
The display resolution (which is the only thing those charts are relevant for) and the internal rendering resolution aren't the same thing, and you'll have much better IQ if you render at a higher resolution, even if you then downscale it to a lower display resolution.

I think you have it backwards. That chart is derived from the fact that people can see detail at about 60 pixels per degree (nothing really to do with screens, it could be dots on a chalkboard). It actually assumes your display has near perfect black/white contrast (like an eye chart). IQ is moot here, except that if your IQ falls far short of perfect then your image would be considered lower resolution as far as that chart is concerned.

If you’re interested in the topic, this might be helpful to you:

 

eot

Banned
I think you have it backwards. That chart is derived from the fact that people can see detail at about 60 pixels per degree (nothing really to do with screens, it could be dots on a chalkboard). It actually assumes your display has near perfect black/white contrast (like an eye chart). IQ is moot here, except that if your IQ falls far short of perfect then your image would be considered lower resolution as far as that chart is concerned.

If you’re interested in the topic, this might be helpful to you:

No, I know what you mean, but even if you don't resolve individual pixels with your eye, you can still see what is basically sub-pixel detail that the renderer drew. You can make explicit examples with very high-frequency details that illustrate this, for example moiré patterns. A supersampled image is different from one rendered at native resolution, and thus will look different. It can also make itself known temporally, as sub-pixel details will jump in and out of view if you don't supersample.
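The point about sub-pixel detail can be shown numerically: a single sample at the pixel centre misses high-frequency content entirely, while supersampling recovers its average. A minimal sketch (the 20-cycles-per-pixel `detail` function is an invented stand-in for fine geometry or texture):

```python
import math

def point_sample(f, x):
    # one sample at the pixel centre: misses anything finer than the grid
    return f(x)

def supersample(f, x, n=64, width=1.0):
    # average n evenly spaced sub-samples across the pixel footprint
    return sum(f(x + width * ((i + 0.5) / n - 0.5)) for i in range(n)) / n

# a hypothetical detail far finer than one pixel (20 cycles per pixel)
detail = lambda t: math.sin(40 * math.pi * t) ** 2

print(point_sample(detail, 0.3))   # ~0.0: the centre happens to land on a zero
print(supersample(detail, 0.3))    # ~0.5: the true average energy in the pixel
```

Shift the pixel centre slightly and the point sample jumps anywhere between 0 and 1, which is exactly the temporal shimmer described above; the supersampled value stays put.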
 

sol_bad

Member
That graph isn’t observations about TVs. It’s just mathematically extrapolated from a single observation that is probably decades old. It’s generally accepted that people can resolve detail at about 60 pixels per degree. It’s objective information that’s the product of actual research (not consumer product research but scientific/medical). That chart just takes that figure and uses math to calculate how much detail you can see from your TV.

Your observations are subjective and are probably the result of some other factors. Earlier in this thread there was a brief discussion about how bad scaling can cause an image to become significantly softer. My guess is that this explains some of your personal experience.

So have you done your own real world tests?
 

Redlight

Member
Hellblade 2 looks about the same.

It will have the same issues as the demo though, it's a controlled presentation designed to highlight particular elements in the way that portrays the product in the best light. The game itself will look great, I'm sure, but it won't be quite up to the quality shown in that trailer.
 
No, I know what you mean, but even if you don't resolve individual pixels with your eye, you can still see what is basically sub-pixel detail that the renderer drew. You can make explicit examples with very high-frequency details that illustrate this, for example moiré patterns. A supersampled image is different from one rendered at native resolution, and thus will look different. It can also make itself known temporally, as sub-pixel details will jump in and out of view if you don't supersample.

You're right, this is a good point. Though really what you're talking about is just mitigating rendering artifacts that obscure detail, and not really getting detail beyond the ability of our eyes.

I would argue that super sampling is not a great use of GPU resources unless you have a bunch to spare. You can at least partially overcome these issues with other techniques, and things like DLSS are really promising for this kind of problem.
 

Pizdetz

Banned
Saw the demo. The level of detail and being able to use extremely high-polygon assets is very impressive. Don't know that it's game-changing. Also, I felt the character design itself was nothing you couldn't do this gen.
Basically it was a pretty demo of cliffs and rocks, in some ways a Tomb Raider with a bit of a facelift. I think we're definitely in the zone of diminishing returns in terms of how much better graphics can improve the quality of the game. My biggest worry is that the bar gets set so that all games have to use these high-poly assets, and somehow that increases the toll on developers: games are already super expensive, not many devs take risks, and job conditions for game developers are not great. This may further exacerbate that.

I would prefer they focus more on interesting NPC AI, more reliance on art/aesthetic than polygons, and lots of player feedback through interesting dynamic visual effects. I might be in the minority but at this point I doubt I'll be impressed with more polygons or a slightly better lighting system (RTX has some promise, but in most comparisons it's indistinguishable from conventional methods).
 

CuNi

Member
I don't disagree - but gaming sites, "influencers", and other commentators were telling folk that Valhalla was what "next gen" was going to look like. So, clearly, they're wrong - but it's because Microsoft demoed this stuff as their big "next gen" reveal. So, people took Microsoft at their word, believed what we saw was what the Xbox Series X was going to deliver, and believed we needed to seriously lower our expectations. For that, I put the blame on Microsoft for shitting the bed because, frankly, it's their bed and they chose to shit in it. They could've shown us anything - anything at all. It is not Ubisoft's fault Microsoft themselves decided that Ubisoft's multi-gen multi-platform game was the best choice to introduce the world to the power of their "next gen" console. Whether they meant to or not, Microsoft created the expectation that Valhalla was Series X's "next gen" standard. It's Microsoft's fault for thinking a multi-platform multi-gen title could stack up against the absolute best Sony had to show... when that title is also on Sony's own console. An utterly baffling choice from Microsoft, frankly.

To get real specific: Sony made sure they had something - anything - that offered up a taste of what we can expect from their new console. Sure, this may not be an accurate reflection of Sony's launch titles - no arguments there - but they have provided players something to help us understand what the long term investment in their console can look like. Now we know where that 10.2-ish TFLOPs is going to go, and what our money is going to buy. And, frankly, it's pretty incredible. And look around - people are very excited for the PS5 because we can see what makes it "next gen". We get it - PS4 cannot do what we saw today. We've seen "next gen".
On the opposite side of the fence, to introduce the entire world to the incredible raw potential of their turbo-charged new "next gen" console, Microsoft selected... multi-platform and multi-generational third party titles, most of which will be on Xbox One. And now everyone expects Series X games to look like launch window cross-gen titles because that's what Microsoft selected to show. Months of PR, dozens of articles and blog posts, and an entire hour long presentation, and they still haven't shown off what their 12 TFLOPs can ultimately deliver. And look around - people are very underwhelmed by Microsoft's new console. That's on them. I have no doubt Series X can deliver visuals on par or better than the demo we saw today. But Microsoft haven't really proven it to anyone.

In short: Microsoft had their chance to get out and define next gen, and they dropped the ball in a big, big way. Now, Sony's intercepted the ball and defined "next gen" on their own terms. Microsoft let that happen, so it's absolutely on them.

I have to disagree with some of your points.
Your point is that Sony showed what is possible in the future, but right now I'm not interested in what games will release in 2023 or later. I want to know what next-gen can deliver for me... on day one with the release.
And that will mostly be the quality that Microsoft showed. Some of their first-party titles, just like a few from Sony, will obviously look better, but that "omg, that is next gen!" stuff won't arrive before 2022 at the earliest, I'd say.
So you could say "MS dropped the ball hard when showing the next-gen potential", but you could also say that Sony set fans up with expectations that will go unmatched for at least a year, if not more, after the console's release.

I'm not saying Sony makes bad games, but Sony's problem for some time now has been announcing titles too early. While the demo was neither a game nor a promise, it looks to me like people will go around saying "This is what Sony will deliver with the PS5!", and those people are going to be disappointed when they get the console and all they'll have are games of AC quality.

Also, I am sure there will be more events from both sides closer to release that will show a better release-day lineup.
Right now I don't see the UE demo as "That's what Sony can do!", but rather as what it is: "This is what gaming is going to be next!" And we should all be happy about it. This is a big step for games, no matter whether they release on PS5, XSX or PC. I'm more than thrilled to see what studios can cook up with those new tools!
 

ethomaz

Banned
Let’s remember that this is a demo, not an actual game.

Let’s assume that the demo was made specifically to demonstrate the PS5’s strengths, particularly the SSD speed, and can’t run like that on the Series X or PC. There’s no evidence of that at this stage, but let’s imagine.

If that is the case then third-party games are not going to be designed around those strengths. Third-party games won’t utilise game designs that can only run efficiently on one piece of hardware, so the only games that will take advantage of those specific strengths will be first party.

So nothing really changes. The vast bulk of games are third-party and multi-platform; how they compare is still to be seen.
If it is like Epic said... scalable, then it will automatically increase the number of triangles/details up to the limit of the SSD bandwidth.

Of course single hardware optimization will give even better results.
 
New fresh video:


A couple of summaries from the thread for the video/article:



Alex from DF just put up an Inside Unreal Engine 5 article


It's a long article; some snippets:

  • Lumen is not ray tracing; it uses another form of tracing
  • lumen also has specular reflections
  • large objects are traced through voxels
  • medium objects are signed distance fields
  • small objects are screen space (like Gears 5 on XseX)
  • you can see screen space artifacts in the demo
  • uses temporal accumulation like RT, so there's a latency to lighting
  • micro-polygon rendering is primarily used in offline rendering like film
  • nanite uses a high-resolution tiling normal map for fine details, helping conserve VRAM through virtual texturing
  • nanite scales the model complexity by how many pixels it takes up
  • micro sized objects are shadowed with SS shadows and combined with a virtualized shadow map
  • shadow map resolution is aligned with screen resolution
  • shadows are filtered to create a penumbra
  • unknown if nanite applies to animated objects like foliage or hair, or characters
  • demo is dynamic res (mostly 1440p) at 30fps
  • resolution scaling is more expensive with this technique
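The "scales model complexity by how many pixels it takes up" point can be illustrated with a toy LOD pick; the `tris_per_px` budget and the halving chain are assumptions for the sketch, not how Nanite's actual cluster hierarchy works:

```python
def lod_for_coverage(base_tris, screen_px_covered, tris_per_px=1.0):
    # Pick the first level of a halving LOD chain whose triangle count
    # fits a budget of roughly one triangle per covered pixel.
    budget = screen_px_covered * tris_per_px
    level, tris = 0, base_tris
    while tris > budget and tris > 1:
        tris //= 2          # each coarser level holds half the triangles
        level += 1
    return level, tris

# a 1M-triangle statue covering ~10,000 pixels on screen
print(lod_for_coverage(1_000_000, 10_000))   # (7, 7812)
```

Halve the screen coverage and the pick shifts one level coarser, which is why a distant statue costs far less than the same asset filling the screen.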
 
Imagine having to run your graphics-optimized demo at 1440p and 30fps on a platform that people want for 4K gaming at 60fps.

:messenger_neutral:

I assume a lot of people out there would be happy with this though as they couldn't tell the resolution difference and have no problem with 30fps as they are used to it.
 
Such an impressive demo, and it seems that the PS5 was designed for this specific type of workload. Sony’s recent acquisition of the Atom View software for managing volumetric datasets looks like it has some utility here. It will be interesting to see the repercussions of the optimizations/compromises that are needed for other platforms that do not offer the same level of I/O performance.

I suppose it is not surprising that many people continue to overlook the implications for sustained I/O throughput under heavy random load. 12 channels, 6 priority levels, fast synchronized caches, extensive co-processing, etc. just touch on the optimizations that we know of. Also, consider overall system utilization in addition to power metrics. Sony’s studios are legendary for pushing their hardware to the limit.

https://sites.sonypictures.com/sonyinnovationstudios/site/ (Atom View)
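The "6 priority levels" mentioned above are a hardware feature of the PS5's I/O complex; purely as a software analogy (the class and its behaviour are hypothetical, not Sony's API), a priority-ordered request queue might look like this:

```python
import heapq

class PriorityIOQueue:
    """Illustrative sketch: requests tagged with one of six priority
    levels are served most-urgent-first (0 = most urgent)."""
    def __init__(self, levels=6):
        self.levels = levels
        self._heap = []
        self._seq = 0           # FIFO tie-break within one level

    def submit(self, priority, request):
        assert 0 <= priority < self.levels
        heapq.heappush(self._heap, (priority, self._seq, request))
        self._seq += 1

    def next_request(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = PriorityIOQueue()
q.submit(5, "background texture prefetch")
q.submit(0, "geometry needed this frame")
print(q.next_request())   # the urgent frame-critical read comes out first
```

The point of prioritisation is exactly the streaming case in the demo: frame-critical geometry reads must not sit behind bulk prefetches.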
 
Yeah, pretty much... he had a big falling out with Phil over exclusives going to PC. We used to always argue with him and Nxtgen720 and Sonic Wolf, but then they all changed to PlayStation.
See, even the hardest of the hardcore can change sides... I read some research years ago that said the people most likely to adopt a religion were the ones who held deep religious beliefs to begin with.

I guess it works with console warriors too.
 
How come so little is animated in the demo? I'm curious why deformation, destructibility and animation weren't a priority, e.g. the 500 statues are just standing there. The graphics are undeniably awesome and the pipeline savings are next-gen level, but what of the gameplay? There is very little there in the way of animated enemies or AI reacting in real time to player choices/actions.

There's more to next gen that I want to see these hardware platforms used for.
 

Shmunter

Member
How come so little is animated in the demo? I'm curious why deformation, destructibility and animation weren't a priority, e.g. the 500 statues are just standing there. The graphics are undeniably awesome and the pipeline savings are next-gen level, but what of the gameplay? There is very little there in the way of animated enemies or AI reacting in real time to player choices/actions.

There's more to next gen that I want to see these hardware platforms used for.
Well, destructibility was on display. But definitely no AI beyond some bats itching to be put into a soup. That’s why it’s a tech demo and not a game, I guess.
 

Grinchy

Banned
Let’s be honest, most games aren’t even going to get near this level of fidelity and production.
It's starting to feel like every page in every thread needs the Unreal Engine 4 reveal video posted. Games didn't just match that one, they exceeded it.

What about this video even looks so impossible to attain? If next-gen games a few years in aren't looking at least this good on PS5, I'll be disappointed in the generation. There are some weak points in it already like the character model and the lack of anything else but rocks.
 
Most games won't hit 1440p 30fps without raytracing?!?
The image quality of Nanite surpasses traditional renderers at 1440p; it baffles pixel counters. Lumen might not be ray tracing, but it is global illumination and should give similar, perhaps indistinguishable, quality.
Imagine having to run your graphics-optimized demo at 1440p and 30fps on a platform that people want for 4K gaming at 60fps.

:messenger_neutral:

I assume a lot of people out there would be happy with this though as they couldn't tell the resolution difference and have no problem with 30fps as they are used to it.
Imagine that demo trashing every 4K PC game in existence, with superior image quality that baffled even expert pixel counters.
 
You might wanna clean your mouth later with all that shit coming out of it.

Eeewwww

I hope nobody has to go through that.

 

pawel86ck

Banned
The image quality of Nanite surpasses traditional renderers at 1440p; it baffles pixel counters. Lumen might not be ray tracing, but it is global illumination and should give similar, perhaps indistinguishable, quality.
Imagine that demo trashing every 4K PC game in existence, with superior image quality that baffled even expert pixel counters.
According to this video, Lumen works similarly to RT, but it's way more optimized and can run with good results even without HW acceleration.
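For a rough feel of the kind of tracing Lumen reportedly does against signed distance fields (per the DF breakdown earlier in the thread), here is a minimal sphere-tracing sketch. It is illustrative only; Lumen's real pipeline mixes voxel, SDF and screen-space traces, and the sphere scene here is invented:

```python
def sphere_sdf(p, centre=(0.0, 0.0, 3.0), r=1.0):
    # signed distance from point p to a sphere's surface
    dx, dy, dz = (p[i] - centre[i] for i in range(3))
    return (dx*dx + dy*dy + dz*dz) ** 0.5 - r

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4):
    # March along the ray; the SDF value at each point is a safe step size,
    # so we never overshoot the nearest surface.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf(p)
        if d < eps:
            return t            # hit: distance along the ray
        t += d
    return None                 # miss within the step budget

# ray straight down +z hits the unit sphere centred at z=3 at t ≈ 2
print(sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf))
```

Because the SDF gives a conservative step everywhere, this needs no triangle intersection tests at all, which is part of why it can run without RT hardware.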
 

PocoJoe

Banned
So have you done your own real world tests?

Are you asking this out of curiosity?

Or are you really, seriously saying "I don't believe it if YOU haven't PERSONALLY tested it"?

Because if it's the second option, that would be a sad lifestyle.

You couldn't believe in physics, maths, a round world or vaccinations unless you or someone you know had done the tests and the science.

You couldn't even believe in other countries until you had visited them, or planets/stars :messenger_beaming:


It is just physics vs human eyes: we don't see more detail past a certain distance.

You can't even see the difference between a 360p video and a 4K video if you place the TV in a stadium and watch it from halfway up the stands.

Same with home setups: unless the viewer sits close enough, 1080p vs 4K looks the same, or the differences are really, really small.

And most people simply sit too far away to get the (full) benefit. Beyond 2-3m you need a huge TV to gain a significant difference from Full HD to 4K; that is the point.

I still use a 1080p screen, as I have checked 4K screens many times in stores, and while they look great from 1-2m away, after 3-4m they don't look that special.
 
Some are still arguing about resolution here. With that demo we are dealing with millions of polygons, plus smart reconstruction to 4K and global illumination, making it indistinguishable from the most recent movie CGI.

When the best pixel counter can't actually pixel count an uncompressed frame and we only know the resolution because the developers disclosed it, native resolution doesn't mean anything anymore. Just forget about it.

Now it's all about: how many triangles can they put in one scene?
 

sol_bad

Member
Are you asking this out of curiosity?

Or are you really, seriously saying "I don't believe it if YOU haven't PERSONALLY tested it"?

Because if it's the second option, that would be a sad lifestyle.

You couldn't believe in physics, maths, a round world or vaccinations unless you or someone you know had done the tests and the science.

You couldn't even believe in other countries until you had visited them, or planets/stars :messenger_beaming:


It is just physics vs human eyes: we don't see more detail past a certain distance.

You can't even see the difference between a 360p video and a 4K video if you place the TV in a stadium and watch it from halfway up the stands.

Same with home setups: unless the viewer sits close enough, 1080p vs 4K looks the same, or the differences are really, really small.

And most people simply sit too far away to get the (full) benefit. Beyond 2-3m you need a huge TV to gain a significant difference from Full HD to 4K; that is the point.

I still use a 1080p screen, as I have checked 4K screens many times in stores, and while they look great from 1-2m away, after 3-4m they don't look that special.

I know there is science behind it, but I sure don't agree with that graph that was linked.
I was annoyed because I had to play Borderlands 2 at 1440p, since the frame rate wasn't very good at 4K in certain areas, and Borderlands looked far better in 4K on my 55-inch 4K TV from 10 feet away. At 1440p it looked like there was a layer of vaseline smeared on my screen.
 