
Unreal Engine 5 revealed! Real-Time Prototype Gameplay Demo Running On PS5

geordiemp

Member
That makes sense, but I think the real question, or what people really want to know, is why PS5 over XSX? Additionally, can the XSX run it as well as the PS5 (if it needs scaling down, by how much, etc.)? We understand what benefits the PS5's architecture has over a standard gaming PC, but I think what a lot of us want to know is how the XSX's architecture compares to the PS5's, not just on paper but in practice, which a demo comparison would shine some light on (if we ever see it running on XSX).

You have seen a demo already: Spider-Man vs. State of Decay loading speed, last-gen games that take up ~5 GB of RAM.

That is the only practical comparison we have for now, and PS5 was under 1 second vs. over 6 seconds. That is more than the raw SSD speeds suggest.
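A rough sketch of the implied effective bandwidth (the ~5 GB working set is from above; the exact timings are my assumptions for "under 1 second" and "over 6 seconds"):

```python
# Effective load bandwidth implied by the comparison above.
# ASSUMED: "under 1 second" -> 0.8 s, "over 6 seconds" -> 6.5 s, ~5 GB loaded.
data_gb = 5.0
ps5_s, xsx_s = 0.8, 6.5

ps5_eff = data_gb / ps5_s       # ~6.3 GB/s effective
xsx_eff = data_gb / xsx_s       # ~0.77 GB/s effective

print(f"PS5: ~{ps5_eff:.1f} GB/s, XSX (back-compat): ~{xsx_eff:.2f} GB/s")
print(f"raw spec gap: {5.5 / 2.4:.1f}x, observed gap: {ps5_eff / xsx_eff:.1f}x")
```

The observed gap (~8x) being bigger than the raw spec gap (~2.3x) is the point being made.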


 
Apparently the same demo can do 40 fps at 1440p on a notebook GPU (a 2080) and a simple regular SSD. 😂

[that's what the Epic engineer is saying at the 53-minute mark. What an EPIC achievement if true] >> https://www.bilibili.com/video/BV1kK411W7fK

Note that a notebook 2080 is around desktop 2060 performance.


Very interesting, when even the 5700 XT has far higher performance than that, and the PS5 is even greater than the 5700 XT. How can a significantly lower-performing card get a 30% boost in framerate? Wouldn't that require ~30% more performance, which is near a 2080 Ti?


edit:

My impression is that he's talking about the downscaled, lower-detail version that could get 60 fps on PS5.
 
Very interesting, when even the 5700 XT has far higher performance than that, and the PS5 is even greater than the 5700 XT. How can a significantly lower-performing card get a 30% boost in framerate? Wouldn't that require ~30% more performance, which is near a 2080 Ti?
I can't understand Mandarin or whatever language is spoken in that video; that's why I'm saying *if true*. And the PS5 could maybe get the same perf or a little greater, but the demo was locked to 30 fps because it didn't quite reach a locked 60? Oh, and you dug up a game benchmark where RDNA performs exceptionally well, well above average compared to, say, an RTX 2080. [5700 XT is ~2070S]
 
I can't understand Mandarin or whatever language is spoken in that video; that's why I'm saying *if true*. And the PS5 could maybe get the same perf or a little greater, but the demo was locked to 30 fps because it didn't quite reach a locked 60? Oh, and you dug up a game benchmark where RDNA performs exceptionally well, well above average compared to, say, an RTX 2080. [5700 XT is ~2070S]
Really, I dug up the first bench I found. And I heard AMD had almost abandoned DX11, and its performance sucked on DX11 compared to Nvidia's.
 

lukilladog

Member
Apparently the same demo can do 40 fps at 1440p on a notebook GPU (a 2080) and a simple regular SSD. 😂

[that's what the Epic engineer is saying at the 53-minute mark. What an EPIC achievement if true] >> https://www.bilibili.com/video/BV1kK411W7fK

Note that a notebook 2080 is around desktop 2060 performance.









If true, some comments in here would make this thread one for the ages 👌


https://media1.tenor.com/images/379e047a2ebcff9bddff353a6bb7f480/tenor.gif


You laugh, but I would cry at how underutilized PC hardware has been for the last few years. Sony/MS are running the show now.
 
Really, I dug up the first bench I found. And I heard AMD had almost abandoned DX11, and its performance sucked on DX11 compared to Nvidia's.
It's always best to look at multiple game benches when comparing GPUs, because one game can favour AMD's architecture, another Nvidia's. But yeah, the 2060S is not far off the 5700 XT, as the 5700 XT is not far off the 2070S, which is basically a 2080. >> https://www.tomshardware.com/reviews/amd-radeon-rx_5700-rx_5700_xt,6216-3.html

Would be nice if someone who understands what is being said in that video could confirm or deny this; let's call it a rumor for now.
 
You laugh, but I would cry at how underutilized PC hardware has been for the last few years. Sony/MS are running the show now.
Yep, multiplats are stuck at the lowest common denominator, which is always consoles. On PC you just get double or better fps, higher resolution, and some better details. On the other hand, even a low-end PC can give you 60 fps at the same settings as consoles. So it depends how you look at it.

Many casuals will go console-only and just keep their older PC for PC games! Unfortunately, as you heard, even next-gen games with very powerful processors in them are still 30 fps. [the new Assassin's Creed, for example]
 

Thirty7ven

Banned
I can't confirm what is said in the video; would be cool if we had somebody who could translate from the source.

Any talk about a specific configuration is coming from a forum poster who said he "called" the guy in the video and the guy in the video told him the specs, but apparently he had forgotten which SSD it had, or really anything other than the GPU. In a thread that is populated by 6 or so dudes fanboy-warring each other, I would take it with a huge grain of salt.
 

Andodalf

Banned
It said a decent SSD is OK, probably NVMe, unless it is the lower-detail version, since many articles say Epic noted even SATA SSDs were insufficient.


I'll admit here, my knowledge of IT is mostly from Linus Tech Tips videos, but when I think decent, I don't think NVMe; I think a nicer SATA drive that actually has a DRAM cache, unlike the cheapo ones.

However, Tim's quote from the DF article does indeed seem to imply that NVMe is an improvement, and I trust him!

"It's [PS5] got a God-tier storage system which is pretty far ahead of PCs, but on a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

So maybe SATA gets a solid experience and NVMe gets you balls to the wall?
 

D.Final

Banned
I can't confirm what is said in the video; would be cool if we had somebody who could translate from the source.

Any talk about a specific configuration is coming from a forum poster who said he "called" the guy in the video and the guy in the video told him the specs, but apparently he had forgotten which SSD it had, or really anything other than the GPU. In a thread that is populated by 6 or so dudes fanboy-warring each other, I would take it with a huge grain of salt.

OK...
So, other sources?
 

THE:MILKMAN

Member
I'll admit here, my knowledge of IT is mostly from Linus Tech Tips videos, but when I think decent, I don't think NVMe; I think a nicer SATA drive that actually has a DRAM cache, unlike the cheapo ones.

However, Tim's quote from the DF article does indeed seem to imply that NVMe is an improvement, and I trust him!

"It's [PS5] got a God-tier storage system which is pretty far ahead of PCs, but on a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

So maybe SATA gets a solid experience and NVMe gets you balls to the wall?

Of course all this will be the case, as UE is multiplatform and scalable, and undoubtedly Tim was doing a little PR for Sony, but he was also genuine about the I/O HW specifically, as well as the storage system as a whole. Funny you mention Linus, as I just watched a video of him calling out Tim about the PS5 SSD speed, and I can't believe the ignorance and point-missing!

I think at this point I'm just going to wait to see Sony first-party games, as clearly if someone like him, who I would assume would understand the PS5 I/O system (it isn't just an SSD!), doesn't, then it is pointless even trying to understand and discuss this with others. I say this as someone who isn't the most tech-savvy, but I have carefully watched the Cerny presentation and believe I understand the basic concept.
 

MadAnon

Member
Who says the console can't average 50 FPS? It was locked at 30 for obvious reasons: not enough power to handle a rock-solid 60 fps. If this demo rarely or never dropped below 30, then it suggests the PS5 could average way more fps in this particular demo with the framerate unlocked.
 
By the way, that Cyberpunk cityscape demo I posted in that old 'next gen expectations' thread on here, which uses the updated Unity engine with similar LOD tech, ran at 1080p/30-60 fps on my 980 Ti system with a SATA SSD.


Not as good-looking as the UE5 demo, but poly counts are as good if not better, and still almost no LOD.
The UE5 demo could definitely work on PC and XSX.


I've never seen this before; not sure how I missed it.
 

Andodalf

Banned
Apparently the same demo can do 40 fps at 1440p on a notebook GPU (a 2080) and a simple regular SSD. 😂

[that's what the Epic engineer is saying at the 53-minute mark. What an EPIC achievement if true] >> https://www.bilibili.com/video/BV1kK411W7fK

Note that a notebook 2080 is around desktop 2060 performance.


If true, some comments in here would make this thread one for the ages 👌


Very interesting, when even the 5700 XT has far higher performance than that, and the PS5 is even greater than the 5700 XT. How can a significantly lower-performing card get a 30% boost in framerate? Wouldn't that require ~30% more performance, which is near a 2080 Ti?


edit:

My impression is that he's talking about the downscaled, lower-detail version that could get 60 fps on PS5.



A mobile 2080 should not be as low as a 2060. Nvidia really stepped up their mobile game with the RTX 2000 series because manufacturers just started putting actual desktop parts in laptops. How good the GPU is will vary by what laptop it is in, based on its thermals. A desktop part can also vary by who you get it from; ROG, Asus, whoever, will have a custom cooling profile and boost clock.


Per NotebookCheck, a 2080 mobile trades blows with a 2070 Super desktop part. This makes sense, as they have a similar memory setup and, assuming both run at their highest listed frequency, near-identical TF. You can also see it lagging behind the desktop 2080 in this link. An important caveat here: I don't know their full testing parameters, but it would seem they have a platform for the 2080 that eliminates the need for throttling, which not every laptop can say.


On this page you first need to select mobile and desktop GPUs, then hit restrict; then you select the 2080 mobile and 5700 XT and hit restrict again. You should now see just the mobile 2080 and the 5700 XT. Per their benchmarks, it's relatively close in most titles, but the 2080 has a 5-20% advantage at 4K ultra on just about every title. Battlefield V on DX12 is one of the worst games for the 2080, with it running 10% or more faster in DX11 vs. DX12, but I believe this chart shows DX11 performance for both.

Mobile parts have come a long way, and Nvidia is still kicking AMD in the mid-to-high-end segment.

PS5 should be a solid bit ahead of a 5700 XT. I'm just posting this to clarify that, if the Chinese reports were correct, it doesn't mean the PS5 is 2060-level, and also that the reports aren't necessarily wrong because of their performance claim for a 2080 mobile, which is indeed a stud.
 

ethomaz

Banned
NVMe isn't just the software. It involves a lot of hardware. The interface itself is different, as NVMe SSDs connect to a PCIe bus instead of a SATA or SCSI bus.
It is software only.
It is a communication protocol.

It uses the PCIe bus to communicate... PCIe is hardware.

What is NVMe™?
NVM Express™ (NVMe™) is a specification defining how host software communicates with non-volatile memory across a PCI Express® (PCIe®) bus. It is the industry standard for PCIe solid state drives (SSDs) in all form factors (U.2, M.2, AIC, EDSFF). NVM Express is the non-profit consortium of tech industry leaders defining, managing and marketing NVMe technology. In addition to the NVMe base specification, the organization hosts other specifications: NVMe over Fabrics (NVMe-oF™) for using NVMe commands over a networked fabric and NVMe Management Interface (NVMe-MI™) to manage NVMe/PCIe SSDs in servers and storage systems.
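To make the protocol-vs-bus point concrete, here's a rough spec-sheet sketch of the two storage stacks (the usual published figures, not measurements):

```python
# Spec-sheet comparison (published figures, not benchmarks):
# NVMe is the command protocol; PCIe is the physical bus underneath it.
stacks = {
    "SATA SSD (AHCI protocol)": {
        "bus": "SATA III (~0.6 GB/s line rate)",
        "command_queues": 1,       # AHCI spec: single queue
        "queue_depth": 32,         # 32 outstanding commands
    },
    "NVMe SSD (NVMe protocol)": {
        "bus": "PCIe 3.0 x4 (~3.9 GB/s)",
        "command_queues": 65535,   # NVMe spec: up to 64K I/O queues
        "queue_depth": 65536,      # up to 64K commands per queue
    },
}
for name, s in stacks.items():
    print(f"{name}: {s['bus']}, {s['command_queues']} queue(s) "
          f"x {s['queue_depth']} commands")
```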

 
PS5 should be a solid bit ahead of a 5700 XT. I'm just posting this to clarify that, if the Chinese reports were correct, it doesn't mean the PS5 is 2060-level, and also that the reports aren't necessarily wrong because of their performance claim for a 2080 mobile, which is indeed a stud.
The demo is implying 33% higher performance than the PS5 (40 fps vs. 30), which the mobile 2080 does not deliver.
 

-Arcadia-

Banned
Hey everyone, I saw it mentioned a couple of times that Nanite only works on static geometry? Is that true? It would explain why there was kind of a gulf between the world and the (still fantastic-looking) character for me.
 

Thirty7ven

Banned
Cross-posting:

The Epic guy in the video says you don't need specs like the PS5's to run the demo. He has the demo running on his laptop in editor mode. They are targeting 60 fps for next-gen consoles but aren't there yet. He says an SSD will help because Lumen and Nanite require good IO (very weird; why does Lumen require good IO??).

This comes from a dude whose main language is Mandarin, on the purple forum.

Apparently the Epic guy doesn't mention specs, performance, or settings. He just talks about his notebook running the demo.

But it's really odd that he says IO helps with Lumen? What?
 
A mobile 2080 should not be as low as a 2060. Nvidia really stepped up their mobile game with the RTX 2000 series because manufacturers just started putting actual desktop parts in laptops. How good the GPU is will vary by what laptop it is in, based on its thermals. A desktop part can also vary by who you get it from; ROG, Asus, whoever, will have a custom cooling profile and boost clock.

Proper benchmarks confirm that the mobile 2070 is considerably weaker than the desktop part [~20%], not to mention Max-Q versions, which trail by ~40%.

Link: https://www.techspot.com/article/1849-desktop-vs-laptop-gaming-performance/
 

Andodalf

Banned
The demo is implying 33% higher performance than the PS5 (40 fps vs. 30), which the mobile 2080 does not deliver.

There's a lot we don't know. We know the PS5 ran with a dynamic scaler and could have spent time above 1440p. Add that in with the guy saying, according to a translation, "around 40", and it's not a huge leap to assume it was a bit less than 40 at times. With that in mind, it could very well be within the tolerances of the differences between GPUs. In the only UE title with a bench I quoted, Fortnite, the 2080 mobile puts up 61.4 fps to the 5700 XT's 45.4. That's about a 35% advantage in UE4.


Proper benchmarks confirm that the mobile 2070 is considerably weaker than the desktop part [~20%], not to mention Max-Q versions, which trail by ~40%.

Link: https://www.techspot.com/article/1849-desktop-vs-laptop-gaming-performance/

I compared a mobile 2080 here, not a 2070. The article you link seems solid, and it says, as should be obvious, that the mobile chip has 19% lower clocks and 19% lower performance. So let's look at the specs.

A 2080 mobile has 2944 cores at a max boost of 1590 MHz, which gives about 9.36 TF. A 2070 Super has 2560 cores at a 1770 MHz boost, about 9.06 TF. If performance scales like what you posted, we should expect a slight bump in performance for the 2080 mobile over the 2070 Super desktop part, assuming there's no throttling.
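Sketching that math out (clocks are the listed boost/peak figures; sustained laptop clocks run lower, and the PS5 number is the variable-frequency cap):

```python
# Theoretical FP32 throughput = shader cores x 2 ops/clock (FMA) x clock.
def tflops(cores: int, clock_mhz: int) -> float:
    return cores * 2 * clock_mhz * 1e6 / 1e12

for name, cores, mhz in [
    ("RTX 2080 Mobile", 2944, 1590),
    ("RTX 2070 Super ", 2560, 1770),
    ("RX 5700 XT     ", 2560, 1905),
    ("PS5 GPU        ", 2304, 2230),  # 36 CUs at the variable-frequency cap
]:
    print(f"{name}: {tflops(cores, mhz):.2f} TF")
# -> ~9.36, ~9.06, ~9.75, ~10.28 TF respectively
```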
 
A mobile 2080 should not be as low as a 2060. Nvidia really stepped up their mobile game with the RTX 2000 series because manufacturers just started putting actual desktop parts in laptops. How good the GPU is will vary by what laptop it is in, based on its thermals. A desktop part can also vary by who you get it from; ROG, Asus, whoever, will have a custom cooling profile and boost clock.


Per NotebookCheck, a 2080 mobile trades blows with a 2070 Super desktop part. This makes sense, as they have a similar memory setup and, assuming both run at their highest listed frequency, near-identical TF. You can also see it lagging behind the desktop 2080 in this link. An important caveat here: I don't know their full testing parameters, but it would seem they have a platform for the 2080 that eliminates the need for throttling, which not every laptop can say.


On this page you first need to select mobile and desktop GPUs, then hit restrict; then you select the 2080 mobile and 5700 XT and hit restrict again. You should now see just the mobile 2080 and the 5700 XT. Per their benchmarks, it's relatively close in most titles, but the 2080 has a 5-20% advantage at 4K ultra on just about every title. Battlefield V on DX12 is one of the worst games for the 2080, with it running 10% or more faster in DX11 vs. DX12, but I believe this chart shows DX11 performance for both.

Mobile parts have come a long way, and Nvidia is still kicking AMD in the mid-to-high-end segment.

PS5 should be a solid bit ahead of a 5700 XT. I'm just posting this to clarify that, if the Chinese reports were correct, it doesn't mean the PS5 is 2060-level, and also that the reports aren't necessarily wrong because of their performance claim for a 2080 mobile, which is indeed a stud.

Could this be more Discord FUD?
 
It could very well be within the tolerances of the differences between GPUs. In the only UE title with a bench I quoted, Fortnite, the 2080 mobile puts up 61.4 fps to the 5700 XT's 45.4. That's about a 35% advantage in UE4.

This suggests otherwise:
"even at Ultra 4K graphics settings the Radeon RX 5700 XT 8GB will still get a solid 80 FPS" [Fortnite] - Game-Debate
 
The demo is implying 33% higher performance than the PS5 (40 fps vs. 30), which the mobile 2080 does not deliver.
I'm purely speculating here, but I think the demo was locked at 30 fps on PS5 since it couldn't get a locked 60, so we can't really compare the PS5 GPU with anything yet.

The best part about this is that this tech doesn't require the super-fast SSD or I/O found in those new consoles, which also makes this thread the most hilarious on GAF.
 
I'm purely speculating here, but I think the demo was locked at 30 fps on PS5 since it couldn't get a locked 60, so we can't really compare the PS5 GPU with anything yet.

The best part about this is that this tech doesn't require the super-fast SSD or I/O found in those new consoles, which also makes this thread the most hilarious on GAF.
There are multiple articles saying Epic said that even a SATA SSD wouldn't be enough.

 

Andodalf

Banned
This suggests otherwise

Yeah, I don't love any BR game as a metric. Here's their page for Fortnite testing, which seems to have been done in Chapter 1.


It seems they ran the bench in Battle Royale, which would be problematic in how inconsistent it is, and new updates can constantly change performance. That said, TechSpot shows similarly bad 4K performance for the 5700 XT at 4K max.


Though it does seem they put the GPUs through the wringer:

"For this we used the "Team Rumble" 20 v 20 game mode, waited until the second final circle and then measured a 60 second passage of gameplay which includes quite a few fast mouse flicks left and right to check for enemies, doing this heavily reduces the 1% low performance."
 
There are multiple articles saying Epic said that even a SATA SSD wouldn't be enough.

I'm sure we will all be able to benchmark the demo ourselves once Epic releases it to the public or someone leaks it at UE5's launch, but I'll take the word of an engineer over a PR head every single time.
 
Interesting comments here... maybe PS5's max SSD speed isn't required for the non-scaled-down demo after all?

thicc_girls_are_teh_best you seen this?

Yep 😂. Basically what a lot of people were assuming the whole time. I mean, wouldn't it have actually been pretty worrying if the demo's asset streaming was only possible on PS5? That would basically mean:

1: PS5's SSD I/O was being pushed near max by a tech demo before the gen even started (that's bad).

2: That level of asset streaming could only be done at 1440p and 30 FPS (again, that's kinda bad).

3: The vast majority of 3rd-party games wouldn't be able to do it, since they'd have to be designed for platforms with slower SSDs, meaning you'd only see maybe 5% of all games ever doing what was in that demo (again, that's kinda bad).

What's even kinda funny is, you'll have some people who probably read what I just posted and say "but the engine's scalable, so it doesn't matter if other systems can run it; devs can still optimize for PS5's SSD I/O." And you know what? That's probably actually correct, but guess what else that sounds eerily similar to? Lockhart and engine/feature scaling between it and XSX. But I'm almost willing to bet some of the same people saying the asset streaming is scalable (thus not a hindrance to optimized throughput on PS5) were some of the same ones insisting Lockhart would hold back XSX because "features don't scale well".

Welp... lol.

That demo definitely benefited from having an SSD, and Sony's is obviously really solid stuff, but never did I think the reason Epic demoed it on PS5 was that it was the only platform able to stream that level of asset quality at that rate. There was probably a marketing agreement between Epic and Sony to associate it with their brand, and I say kudos, because that was a genuinely smart decision on Sony's part.

But probably what's really interesting about that post are the specs it was using on PC to run that version of the demo. The laptop 2080 is (from what I'm hearing) around a 2070 in terms of actual performance? Dunno if the 2070 has a Super variant. And the 970 Evo SSD has a 3.4 GB/s sequential read speed; much lower than PS5's 5.5 GB/s, but still higher than XSX's 2.4 GB/s. That said, MS's numbers were always sustained numbers; certain operations can have sequential read speeds higher than 2.4 GB/s. DrKeo mentioned it in a post in the Next Gen Speculation thread a long time ago, in fact.

And with that said, keep in mind PC SSD I/O has a lot of overhead currently not present on the consoles, so I doubt the demo was fully stressing the 970 Evo drive either. So basically, there's pretty much nothing in the UE5 demo that couldn't be done on a decent PC, nor on XSX, as many were speculating. I guess it's nice to have some (lowkey) confirmation of that, though, for whatever it's worth.
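For reference, the quoted sequential-read figures side by side (vendor numbers, not benchmarks; the 970 Evo pairing is the unverified claim from that thread):

```python
# Vendor-quoted sequential-read figures, raw vs. with hardware decompression.
drives = {
    "PS5 internal SSD":      (5.5, "8-9 'typical' with Kraken decompression"),
    "XSX internal SSD":      (2.4, "4.8 with BCPack decompression"),
    "970 Evo (PCIe 3.0 x4)": (3.4, "no hardware decompressor on PC"),
}
for name, (raw_gbps, note) in drives.items():
    print(f"{name}: {raw_gbps} GB/s raw ({note})")
```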

Ah, and they also said something about the demo on the laptop running at 40 FPS? It was uncapped IIRC; the demo on PS5 was 30, but was that capped? The resolution, I know for a fact, was dynamic in some areas at the very least. I don't think the laptop demo running at the same resolution and a higher framerate should be too indicative for any particular PS5 performance comparison; it's an early tech demo, and I think if it were optimized further it could run at 60 FPS on PS5 with no problems. But at the very least, it would suggest that:

1: Variable frequency wasn't being utilized on PS5 in the demo (it was running on a devkit, and the devkits have "profiles", as Cerny said).

2: Because variable frequency may not've been used, some component of the PS5 devkit (between CPU and GPU) may've been "lower-specced" compared to the advertised numbers (3.5 GHz CPU / 10.275 TF GPU), since hard profiles may've been used on the devkit (I would think it was the GPU that was "lower-specced").

3: If the laptop 2080 is more comparable to a desktop 2070 (2070 Super?), and the 2070/2070 Super is comparable to a 5700 XT, then we can probably guess that the PS5's GPU was operating somewhere around at least 9.2 TF, but not at its peak 10.275 TF (I'm just talking pure TF numbers here, not what was actually utilized), during the more GPU-intensive loads.

...and I know some are probably going to take that negatively, but it's actually a good sign for future PS5 performance. For starters, it means the demo was for sure not fully optimized for the hardware (as we know the retail system will not use "profiles" the way the devkit systems do), both in terms of GPU and SSD I/O. It also means that, most likely, with optimizations the demo could have run at 60 FPS on PS5 while still maintaining 1440p resolution and the same rate of asset streaming from the SSD.

That also bodes well for XSX; if the demo were running on that system, it wouldn't have any issues with the asset streaming displayed (and likely asset streaming beyond that level, too, if it were optimized well for XSX's specific architecture features), and it would probably also run at a higher resolution and slightly better FPS. Regarding the asset streaming, this doesn't mean things will stay relatively imperceptible between the two systems towards the end of the generation: I don't picture PS5's SSD I/O advantage being as big as the paper specs convey, but I do still expect it to have some kind of overall advantage in that area, which later-gen 1st-party games in particular will definitely utilize.

All the same, there's a reason MS has taken a bit more of a hardware-agnostic approach to resolving a lot of the SSD I/O issues currently facing PCs; they want to seamlessly bring a lot of that stuff to PCs through DirectStorage, and software features can always be iterated on and improved over time at a lower cost than hardware revisions/upgrades. There'll be more ways to reach intense levels of asset streaming later on than what Sony's approach provides, but once again, when it comes to XSX it'll probably mainly be 1st-party games that leverage those features at their peak.

Though there's an off chance that 3rd-party devs will more readily adopt optimizations for things like DirectStorage (which inherently benefits the Velocity Architecture), since it will be present across PC and mobile devices utilizing it and DX12U, whereas Sony's SSD I/O hardware will only be present on PS5.
 

Thirty7ven

Banned
Yep 😂. Basically what a lot of people were assuming the whole time. I mean, wouldn't it have actually been pretty worrying if the demo's asset streaming was only possible on PS5?

I don't want to troll you or anything, but it's important to have the actual information on our side.

- The guy doesn't say it's a 2080 with a 970 Evo. That was made up by a Chinese poster in a thread with 6 or 7 cats engaging in fanboy warfare. The poster says he called the Epic guy and was told the specs, but only the GPU; the SSD he guessed...

Yes, it's honestly pretty embarrassing how everyone is running with that as fact.

So with that bullshit out of the way:

- The Epic guy is saying the first scene (Lumen) can run at 40 fps on his notebook, not the whole demo.

- If it's a 1080p screen, 2 triangles per pixel, with some compression on vertex data, then you can still run this demo; there's no need for very high bandwidth and IO like the PS5's (rough math after this post).

- UE 4.25 implemented asynchronous/overlapped loading (because the bottleneck was the CPU). They overhauled their shaders to work well with the event-driven loader. This gave them a >50% loading-speed improvement.

- In the final UE5 scene, compression and careful disk layout avoided the need for a high-speed SSD. The workload wasn't that high.

- The guy mentioned they can run the demo in the editor at 40 fps, not 40+, but did not specify resolution.

- Currently Nanite has some limitations: it only works on static meshes, doesn't support deformation for animation, doesn't support skinned character models, and supports opaque materials but not masked ones.

- Lumen costs quite a bit more than Nanite. UE5 could eventually be a hybrid renderer using both Lumen and raytracing in the future.


You can find this at B3D; I linked it in the next-gen thread. There's a bunch of things the guy never said; it's just a crowd of people creating information. If more info comes through on translation, I'll post it.
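On the "2 triangles per pixel" bullet above, the rough math (my arithmetic, not from the video):

```python
# The "2 triangles per pixel" point, scaled by resolution: the visible
# triangle budget (and hence the streaming load) tracks pixel count.
def tri_budget_m(w: int, h: int, tris_per_px: float = 2.0) -> float:
    return w * h * tris_per_px / 1e6  # millions of triangles

for label, w, h in [("1080p", 1920, 1080),
                    ("1440p", 2560, 1440),
                    ("4K",    3840, 2160)]:
    print(f"{label}: ~{tri_budget_m(w, h):.1f}M triangles on screen")
# -> ~4.1M at 1080p vs ~7.4M at 1440p: a 1080p target needs a lot less IO
```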
 
I don't want to troll you or anything, but it's important to have the actual information on our side.

- The guy doesn't say it's a 2080 with a 970 Evo. That was made up by a Chinese poster in a thread with 6 or 7 cats engaging in fanboy warfare. The poster says he called the Epic guy and was told the specs, but only the GPU; the SSD he guessed...

Even if it isn't a 970 Evo, surely his laptop/SSD isn't anywhere near what PS5's SSD/storage bandwidth is capable of, right?
 

Thirty7ven

Banned
Even if it isn't a 970 Evo, surely his laptop/SSD isn't anywhere near what PS5's SSD/storage bandwidth is capable of, right?

Listen, I'm just posting what he said. He says it's scalable and you don't need a PS5 to run it; he says that.

I'm not of the opinion that you can only do this demo on a PS5; that wouldn't even make sense if we are talking about Unreal and Epic, who target multiplatform solutions that go from fucking smartphones to $3000 PCs. I've said all along this wasn't a Sony presentation.
 
I don't want to troll you or anything, but it's important to have the actual information on our side.

- The guy doesn't say it's a 2080 with a 970 Evo. That was made up by a Chinese poster in a thread with 6 or 7 cats engaging in fanboy warfare. The poster says he called the Epic guy and was told the specs, but only the GPU; the SSD he guessed...

Yes, it's honestly pretty embarrassing how everyone is running with that as fact.

So with that bullshit out of the way:




You can find this at B3D; I linked it in the next-gen thread. There's a bunch of things the guy never said; it's just a crowd of people creating information. If more info comes through on translation, I'll post it.

Appreciate that. I did come across some of that info on the Beyond3D thread too, but I didn't figure it was too important to stress for what I wanted to speculate on.

Which was, essentially, that the asset streaming very likely wasn't at a level where an SSD I/O as robust as PS5's was an absolute necessity. But I'm also of the particular opinion that the demo wasn't fully optimized for PS5 either, and I just remembered that the devkits have profile modes while the final system automates that (or something like it) with its variable-frequency setting.

So the profile modes alone probably mean full optimization of the PS5 devkit hardware was not possible with the demo, so some performance was left on the table. If it were optimized for a final PS5 retail unit, it could probably hit a stable 60 FPS at 1440p, or at least I'd like to think so.

You have seen a demo already: Spider-Man vs. State of Decay loading speed, last-gen games that take up ~5 GB of RAM.

That is the only practical comparison we have for now, and PS5 was under 1 second vs. over 6 seconds. That is more than the raw SSD speeds suggest.


It was 6.5 seconds. 16 GB / 6.5 s ≈ 2.46 GB/s, aka the sustained raw speed of the XSX SSD.
 

geordiemp

Member
Appreciate that. I did come across some of that info on the Beyond3D thread too, but I didn't figure it was too important to stress for what I wanted to speculate on.

Which was, essentially, that the asset streaming very likely wasn't at a level where an SSD I/O as robust as PS5's was an absolute necessity. But I'm also of the particular opinion that the demo wasn't fully optimized for PS5 either, and I just remembered that the devkits have profile modes while the final system automates that (or something like it) with its variable-frequency setting.

So the profile modes alone probably mean full optimization of the PS5 devkit hardware was not possible with the demo, so some performance was left on the table. If it were optimized for a final PS5 retail unit, it could probably hit a stable 60 FPS at 1440p, or at least I'd like to think so.



It was 6.5 seconds. 16 GB / 6.5 s ≈ 2.46 GB/s, aka the sustained raw speed of the XSX SSD.

Do you know the RAM amount minus OS in the current gen?

Last-gen games like State of Decay do not take 16 GB of RAM. The XB1 X has only 12 GB total, including the OS and everything else.

Hell, even the XSX has only 16 - 3 = 13 GB usable.

Also, have you watched the timing of the video?

It starts at 13 seconds and the screen comes up at 24 seconds, so 24 - 13 = ?

More like 11 seconds for at most 8 GB, maybe 10 seconds if we are generous.

The whole discussion is kind of difficult, as we don't know if extra processing was occurring while loading, to set the game up etc.
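Side by side, the two readings of that load (the data sizes are guesses either way):

```python
# The same State of Decay load, under the competing assumptions above.
scenarios = [
    ("16 GB in 6.5 s (ethomaz)", 16, 6.5),
    ("8 GB in 11 s (geordiemp)", 8, 11),
    ("8 GB in 10 s (generous)",  8, 10),
]
for label, gb, secs in scenarios:
    print(f"{label}: ~{gb / secs:.2f} GB/s effective")
# -> ~2.46 GB/s vs ~0.73 / 0.80 GB/s, so the assumed data size decides
#    whether the XSX SSD looks fully utilized or not.
```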
 

Lethal01

Member
So that's how games will look in 15-20 years; cool, I guess.

So basically, the engine sort of works like RT except for rasterization: instead of rays per pixel it renders polygons per pixel. It sounds so logical and so obvious once someone finally sorted it out. Still, 4K is around 8M pixels, so it should translate into 8M polygons, yet Epic mentioned there are about 20M polygons being rendered in each frame; that's a huge gap. And on top of that, the demo ran at 1440p, which is about 3.7M pixels, so the difference is even greater: the engine basically renders 5-6x more polygons than can actually be displayed. It just shows how much room for improvement there is.

The temporal GI looks nice as well; if that can be rendered on the CUs, that means more RT power can be left for reflections, shadows etc., which combined could create outstanding results.

Other than that, what a boring demo. It looked exactly like many current-gen open-world walking sims except with better visuals; the animation system, while looking great, makes everything even slower than it already is, and there were still forced slowdown sections... I just hope there will be more smaller-scale/linear games in the upcoming generation, like there were during the PS360 times, instead of even bigger, even more boring open-world games.

But like I said, it's just a tech demo like many others before it, so I don't expect the actual games to look anywhere near that good. Probably the next Gears will be the best utilization of the engine, while most 3rd-party/indie titles will, as always, produce average visuals with average performance (30 FPS). And by the time we get hardware on which those visuals are actually possible, new technologies will show up and take over. And the cycle continues.


You will most likely see stuff looking like this in 2 to 3 years.
When it comes to real-time demos for Unreal Engine, they usually look outdated long before the gen is over.
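As an aside, the polygon numbers in the quote above do check out (rough arithmetic, assuming a 2560x1440 native target):

```python
# Pixels vs. rendered polygons, per the figures quoted above.
px_4k    = 3840 * 2160   # ~8.3M pixels
px_1440p = 2560 * 1440   # ~3.7M pixels
polys    = 20e6          # Epic's quoted per-frame polygon count

print(f"4K: {px_4k / 1e6:.1f}M px; 1440p: {px_1440p / 1e6:.1f}M px")
print(f"polygons per 1440p pixel: {polys / px_1440p:.1f}")
# -> ~5.4 polygons rendered per displayed pixel, the "5-6x" in the quote
```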
 

Lethal01

Member
I simply cannot tell any obvious abnormalities in the lighting. In movement there are the minor screen-space artifacts, but that's about it.

Well sure, they are no longer as obvious as "this should have a shadow but it doesn't." Now it's that everything on screen looks a bit off: X should be dark, Y should be bright. Although I find the shadows and ambient occlusion to be very obviously wrong.

Think of it this way: you saw how obvious it was when you zoomed in on the screen-space error and then it got corrected? Imagine everything has that error, but you don't notice it because it never gets corrected. You don't have an easy "before and after" to make it clear how off everything is.

It's great that people don't notice it at all, and I'm happy that this gen I will actually be able to ignore it sometimes, instead of everything looking completely wrong. However, if you were to take this scene and throw it into a path tracer, it's likely it would look completely different.

Important to remember this is literally our first look at it, though; maybe when it comes out I will be completely wrong and most of the issues will get fixed.
 

ZywyPL

Banned
When it comes to real-time demos for Unreal Engine, they usually look outdated long before the gen is over.


I have to strongly disagree:



And wasn't that PS4 Elemental demo running at 60 FPS, as opposed to actual games that ran mostly at 30, and hence could look better?

Anyway, I won't believe it until I see it. There have been countless tech demos throughout countless generations that never had anything to do with the actual games, so why should it be any different this time around? "Real-time", "in-engine" etc. are just marketing buzzwords used to fool people. Show me an actual game played by some actual person, then we can talk; otherwise I'm not buying any of the BS the companies try to sell me.
 

Lethal01

Member
I have to strongly disagree:



And wasn't that PS4 Elemental demo running at 60 FPS, as opposed to actual games that ran mostly at 30, and hence could look better?

Anyway, I won't believe it until I see it. There have been countless tech demos throughout countless generations that never had anything to do with the actual games, so why should it be any different this time around? "Real-time", "in-engine" etc. are just marketing buzzwords used to fool people. Show me an actual game played by some actual person, then we can talk; otherwise I'm not buying any of the BS the companies try to sell me.

I did specifically say console demos, because yes, obviously if you just run the demo on dual Titan GPUs or something it's going to end up looking far better. But when the demo is running on an actual console, it gets beaten.

This demo was running on an actual console, and it would have been playable at GDC if not for... that.

That said, to compare to the Samaritan demo:


Also, another demo that targets PS4 specs:


Some of it holds up; the rest is clearly showing its age, and the only thing it really has going for it is that it's well animated. Otherwise it's far surpassed by FF7 and even FFXV.
 