
Unreal Engine 5 Benchmarks. 6900xt outperforms 3090 at 1440p. 28 GB RAM usage.

SlimySnake

Flashless at the Golden Globes
Some interesting results found by a YouTuber.

  • At 4K the 3090 outperforms the 6900 XT by 10-20%.
  • At 1440p the 6900 XT outperforms the 3090 by 10-15%.
  • The 6900 XT is only ~20 TFLOPs; the 3090 is ~36 TFLOPs.
  • VRAM usage at both 1440p and 4K is around 6 GB; the demo allocates up to 7 GB.
  • System RAM usage at both 1440p and 4K climbs past 20 GB, roughly 2x the total memory of the PS5 and XSX.
  • PS5 and XSX only have about 13.5 GB of total RAM available to games, which means their I/O is doing a lot of the heavy lifting.
  • 6900 XT: roughly 50-60 fps at 1440p, 28-35 fps at native 4K.
  • 3090: roughly 45-50 fps at 1440p, 30-38 fps at native 4K.
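Those numbers are striking when normalized per TFLOP. A rough back-of-the-envelope sketch, using the midpoints of the fps ranges above (the TFLOP figures are advertised paper specs, so treat the output as illustrative only):

```python
# Back-of-the-envelope: fps per paper TFLOP, using midpoints of the
# ranges above. TFLOP figures are advertised specs, not measured throughput.
specs = {
    "6900 XT":  {"tflops": 20.6, "fps_1440p": (50 + 60) / 2, "fps_4k": (28 + 35) / 2},
    "RTX 3090": {"tflops": 35.6, "fps_1440p": (45 + 50) / 2, "fps_4k": (30 + 38) / 2},
}

for name, s in specs.items():
    print(f"{name}: {s['fps_1440p'] / s['tflops']:.2f} fps/TFLOP at 1440p, "
          f"{s['fps_4k'] / s['tflops']:.2f} fps/TFLOP at 4K")
```

By this crude measure the 6900 XT delivers roughly twice the fps per paper TFLOP at 1440p, which is exactly why paper TFLOPs are a poor cross-vendor yardstick.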
EDIT: More benchmarks here.



6900xt 1440p


3090 1440p


4k Comparison. Timestamped.


 

longdi

Banned
Hope to see a retest once DirectStorage/RTX IO is out.

45 fps at 1440p with VRR isn't that bad, but this is a plain demo.

I guess it pays to wait for a 4080 Ti with 16 GB of VRAM; that should give me >60 fps in next-gen games.
 

CrustyBritches

Gold Member
The demo isn't nearly as memory intensive as those videos suggest. It only uses 5-6 GB of system RAM plus whatever is used by the GPU.
I was thinking the same thing, but I've been unable to build a standalone .exe to test it out for lack of the SDK.

Running as a standalone process, I get ~7 GB of RAM usage from the demo. VRAM consumption is ~7 GB. The rest is the editor and Windows. This demo seems tailor-made for a console with 16 GB of RAM and a couple gigs going towards the OS.
 

IntentionalPun

Ask me about my wife's perfect butthole
I’ve seen people run it at 4K with like 10-11 total memory usage across main and VRAM.

But it’s a demo app for an engine that doesn’t even go into preview until 2022 lol
 

Hoddi

Member
I was thinking the same thing, but I've been unable to build a standalone .exe to test it out for lack of the SDK.

Running as a standalone process, I get ~7 GB of RAM usage from the demo. VRAM consumption is ~7 GB. The rest is the editor and Windows. This demo seems tailor-made for a console with 16 GB of RAM and a couple gigs going towards the OS.
Ya, I had that same SDK issue. And they don't tell you which SDK you need, which was pretty frustrating.

But, yeah, people shouldn't read anything into those RAM numbers. They're completely meaningless because they're reporting system-wide utilization, not how much the actual application is using.
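For anyone who wants to sanity-check this themselves, a minimal sketch using Python's psutil that contrasts the two figures (the demo's process name below is an assumption; check Task Manager for the actual binary name):

```python
import psutil  # pip install psutil

# What the videos report: system-wide usage, which bundles in the UE5
# editor, the browser, capture software, and Windows itself.
print(f"System-wide RAM in use: {psutil.virtual_memory().used / 2**30:.1f} GiB")

# What actually matters: the demo's own working set. The process name
# below is hypothetical; the packaged demo may be called something else.
for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] == "AncientGame.exe":  # assumed binary name
        print(f"Demo working set: {proc.info['memory_info'].rss / 2**30:.1f} GiB")
```

The first number is roughly what those videos were reporting: everything running on the machine, editor and all, not the demo itself.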
 

VFXVeteran

Banned
This video is hard to judge. I don't know if he compiled the project to a Windows executable or not.

I didn't get anywhere near that much CPU RAM usage running it in the editor.

The 3090 being faster at 4K is expected: it has way more bandwidth and is also processing more triangles at 4K. AMD cards have always been beastly at lower resolutions.
 

Lethal01

Member
Some interesting results found by a YouTuber.

  • At 4K the 3090 outperforms the 6900 XT by 10-20%.
  • At 1440p the 6900 XT outperforms the 3090 by 10-15%.
  • The 6900 XT is only ~20 TFLOPs; the 3090 is ~36 TFLOPs.
  • VRAM usage at both 1440p and 4K is around 6 GB; the demo allocates up to 7 GB.
  • System RAM usage at both 1440p and 4K climbs past 20 GB, roughly 2x the total memory of the PS5 and XSX.
  • PS5 and XSX only have about 13.5 GB of total RAM available to games, which means their I/O is doing a lot of the heavy lifting.
  • 6900 XT: roughly 50-60 fps at 1440p, 28-35 fps at native 4K.
  • 3090: roughly 45-50 fps at 1440p, 30-38 fps at native 4K.
6900xt 1440p


3090 1440p


4k Comparison. Timestamped.




Is this being run in the Unreal Engine editor?

If so, it's not really a good comparison.
 

Rea

Member
Sony for some reason doesn't like giving out specs like that. We barely know much besides what they have told us or what people have found tearing the system down. But it should be right in that ballpark of 13.5 GB.
How can we guess the RAM allocation by tearing the system down? The RAM partition is done in the OS firmware. No developer has confirmed that the PS5 has 13.5 GB of usable RAM. I'm just curious; the available RAM could be more or it could be less.
 

Max_Po

Banned
How can we guess the RAM allocation by tearing the system down? The RAM partition is done in the OS firmware. No developer has confirmed that the PS5 has 13.5 GB of usable RAM. I'm just curious; the available RAM could be more or it could be less.

He is clueless and tries to give his 2 cents out of his arse... I have him on my ignore list.
 
How can we guess the RAM allocation by tearing the system down? The RAM partition is done in the OS firmware. No developer has confirmed that the PS5 has 13.5 GB of usable RAM. I'm just curious; the available RAM could be more or it could be less.
I never once said people found out RAM allocations based on a teardown. Not sure where or why you assumed that, as all I said was this:





Sony for some reason doesn't like giving out specs like that. We barely know much besides what they have told us or what people have found tearing the system down. But it should be right in that ballpark of 13.5 GB.



I'm only implying that Sony doesn't tell us shit. Not sure if it's NDA or lack of confidence. If I'm going to buy a processor or GPU, I will know its L2 and L3 cache, how much memory or VRAM it has, its operating speeds, etc. I wouldn't be left in the dark to assume. That's why many believe PS5 primitive shaders <<<<< mesh shaders. But we can't confirm, because Sony is so shy in this regard.

Max_Po has me ignored because I schooled his ass in previous threads lol! Had no clue about Wi-Fi 6 and said it's better than wired. Proved him wrong in multiple threads, and he/she put me on ignore instead of facing the truth lololol
 

muteZX

Banned
Recommended System Specs (100% Screen Percentage)

  • 12-core CPU at 3.4 GHz
  • 64 GB of system RAM
  • GeForce RTX 2080 / AMD Radeon 5700 XT or higher

Minimum System Specs (50% Screen Percentage)

  • 12-core CPU at 3.4 GHz
  • 32 GB of system RAM
  • GeForce GTX 1080 / AMD RX Vega 64

Valley of the Ancient Sample
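Worth spelling out what screen percentage means for the load here: it scales each axis of the output resolution, so the 50% minimum spec at 4K is internally a 1920x1080 render, one quarter of the pixels of the 100% recommended setting. A quick sketch of the arithmetic:

```python
def internal_resolution(out_w: int, out_h: int, screen_pct: float) -> tuple[int, int]:
    """Screen percentage scales each axis, so pixel count scales with its square."""
    scale = screen_pct / 100
    return round(out_w * scale), round(out_h * scale)

# Minimum spec renders at 50% screen percentage: 4K output -> 1080p internal,
# i.e. a quarter of the pixels of the recommended 100% setting.
print(internal_resolution(3840, 2160, 50))   # (1920, 1080)
print(internal_resolution(3840, 2160, 100))  # (3840, 2160)
```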
 

Rea

Member
I never once said people found out RAM allocations based on a teardown. Not sure where or why you assumed that, as all I said was this:









I'm only implying that Sony doesn't tell us shit. Not sure if it's NDA or lack of confidence. If I'm going to buy a processor or GPU, I will know its L2 and L3 cache, how much memory or VRAM it has, its operating speeds, etc. I wouldn't be left in the dark to assume. That's why many believe PS5 primitive shaders <<<<< mesh shaders. But we can't confirm, because Sony is so shy in this regard.

Max_Po has me ignored because I schooled his ass in previous threads lol! Had no clue about Wi-Fi 6 and said it's better than wired. Proved him wrong in multiple threads, and he/she put me on ignore instead of facing the truth lololol
You sounded like you got that 13.5 number from tearing down the system. Anyway, Sony has always been like this; they don't simply reveal the full specs of their hardware. I just wanted to know whether it has been confirmed by other developers or not.
 
You sounded like you got that 13.5 number from tearing down the system. Anyway, Sony has always been like this; they don't simply reveal the full specs of their hardware. I just wanted to know whether it has been confirmed by other developers or not.
What are you talking about lol? You do realize any operating system uses RAM... right? You seem like you don't understand this simple, basic concept. Besides that, do you think they will ever release these specs? Better yet, would there be a reason not to release these specs? What's the reasoning behind being so secretive? We already know it's only 16 GB of RAM; it's not like they are hiding another 16 GB in the power supply lol.
 

Rea

Member
What are you talking about lol? You do realize any operating system uses RAM... right? You seem like you don't understand this simple, basic concept. Besides that, do you think they will ever release these specs? Better yet, would there be a reason not to release these specs? What's the reasoning behind being so secretive? We already know it's only 16 GB of RAM; it's not like they are hiding another 16 GB in the power supply lol.
What??? I don't think they will divulge those specs; that's why I was asking whether any dev has confirmed that the PS5 has 13.5 GB of usable RAM, because I'm curious. It's just a simple question. Why are you making it complicated? LOL, a simple yes or no would be fine.
FYI, I'm fully aware that the console has 16 GB of system RAM max.
 
What??? I don't think they will divulge those specs; that's why I was asking whether any dev has confirmed that the PS5 has 13.5 GB of usable RAM, because I'm curious. It's just a simple question. Why are you making it complicated? LOL, a simple yes or no would be fine.
FYI, I'm fully aware that the console has 16 GB of system RAM max.
No, but can you provide specs as to what they use? Not sure if they are more efficient than Xbox, as they won't say. And any time anyone has a win in a department, they don't shut down when their opponent states their findings. But then again, if I were using outdated tech (primitive shaders), I would keep my mouth shut.
 
I'm sure PC GPU performance will improve but, damn, the 30xx series feels "last gen" already. By the time actual next-gen games come along we'll need a 40xx series GPU. I'm suddenly happy to stick with my 2060 until the next iteration of GPUs is available.
 

Bo_Hazem

Banned
My question is: if this is a barebones playground and they're struggling this much, how can they make any playable game for those cards and below? 1080p is more realistic here, as more typical action with NPCs will bring these two to their knees.
 

Md Ray

Member
No, but can you provide specs as to what they use? Not sure if they are more efficient than Xbox, as they won't say. And any time anyone has a win in a department, they don't shut down when their opponent states their findings. But then again, if I were using outdated tech (primitive shaders), I would keep my mouth shut.

What do primitive shaders have to do with his simple question regarding game RAM allocation?
 

llien

Member
On TFLOPs.

Before Ampere, each shader could do INT + FP per clock.
Ampere shaders can do either INT + FP or FP + FP.

This led to Huang claiming twice the number of shaders the cards really wield, AND to a sharp drop in real-game performance per TFLOP (as it's not like you need FP + FP all the time).
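To put rough numbers on that, a sketch of the paper math (shader counts and clocks are the advertised ones, so this is illustrative rather than measured):

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # One FMA = 2 FLOPs per shader per clock.
    return 2 * shaders * clock_ghz / 1000

# 3090: 10496 advertised FP32 "shaders", but half sit on the shared INT/FP
# datapath, so only ~5248 are guaranteed to do FP32 when integer work is queued.
print(f"3090 paper peak:       {fp32_tflops(10496, 1.695):.1f} TFLOPs")  # ~35.6
print(f"3090 if INT-saturated: {fp32_tflops(5248, 1.695):.1f} TFLOPs")   # ~17.8

# 6900 XT: 5120 shaders at the ~2.0 GHz game clock.
print(f"6900 XT:               {fp32_tflops(5120, 2.015):.1f} TFLOPs")   # ~20.6
```

Depending on how much integer work is in flight, the 3090's effective FP32 rate lands somewhere between ~18 and ~36 TFLOPs, which is why its paper lead over the 6900 XT doesn't necessarily show up at 1440p.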
 

CuNi

Member
I will eagerly await those threads when one or all of the following have happened:

  • Engine gets a big performance upgrade
  • Engine leaves Alpha and enters Beta
  • Engine leaves Beta and gets its full release
  • DirectStorage is released in Beta
  • DirectStorage is released in full
  • GPUs get driver updates

All of those points will shift everything around.
The Engine Preview we have now is a good starting point from which we can start to see where this will all go.
They even said so themselves: this is barely the first release. This engine will see big performance changes before it gets released, tools will ship that lower the required specs, and drivers will get better as well.
I wonder what performance uplifts we will see over a 1-year and a 2-year timespan. I could see RAM requirements get slashed down to 16 GB, or with DirectStorage even 12/10. With RTX I/O I could see VRAM requirements get reduced to near nothing as well.

So while it is all nice and fun to test the engine out currently, those benchmarks mean nothing: not only are games that release on this engine at least 2 years away, the engine itself is not even optimized yet.
 

What do primitive shaders have to do with his simple question regarding game RAM allocation?
You sound triggered, Md Ray. Go to bed. Before you jumped in, I was talking about how we don't know how much RAM is allocated to the OS. Which is true; there's no indication of what it uses, but would 2.5 GB of RAM be out of the norm? Do you have any reason to believe otherwise, Md Ray?

To address your concern, how is that downplaying the PS5?



This is him any time he sees a Sony fan or a post related to Sony or PlayStation.

The dude never misses the chance to downplay anything PS related.


The whole OP doesn't even mention Sony or PlayStation. Wtf are you talking about 😂😂😂😂?! This thread is about the performance of two different GPUs in a demo only released on PC.
 

martino

Member
You sound triggered, Md Ray. Go to bed. Before you jumped in, I was talking about how we don't know how much RAM is allocated to the OS. Which is true; there's no indication of what it uses, but would 2.5 GB of RAM be out of the norm? Do you have any reason to believe otherwise, Md Ray?

To address your concern, how is that downplaying the PS5?






The whole OP doesn't even mention Sony or PlayStation. Wtf are you talking about 😂😂😂😂?! This thread is about the performance of two different GPUs in a demo only released on PC.

The OP is completely misrepresenting the current context, using the demo running on PC to draw conclusions about RAM saved on console (without knowing the RAM usage there either).
Here, RAM is being used to run the editor and maintain the project data... the engine is also most probably running in some form of debug mode...
But let's ignore all that and pretend this is representative of the RAM usage and performance of a released binary on PC.
 
The OP is completely misrepresenting the current context, using the demo running on PC to draw conclusions about RAM saved on console (without knowing the RAM usage there either).
Here, RAM is being used to run the editor and maintain the project data... the engine is also most probably running in some form of debug mode...
But let's ignore all that and pretend this is representative of the RAM usage and performance of a released binary on PC.
The baked project barely uses any RAM, and honestly it's beyond stupid to compare the two. I completely understand why the editor can use up so many resources: you have EVERYTHING loaded in at once. In the baked demo, you don't. Apples to oranges, but people eat it up and say "it requires 64 GB of RAM though".
 

ANIMAL1975

Member
To address your concern, how is that downplaying the PS5?




Sony for some reason doesn't like giving out specs like that.
I'm only implying that Sony doesn't tell us shit. Not sure if it's NDA or lack of confidence.
That's why many believe PS5 primitive shaders <<<<< mesh shaders. But we can't confirm, because Sony is so shy in this regard.
do you think they will ever release these specs? Better yet, would there be a reason not to release these specs?
What's the reasoning behind being so secretive? We already know it's only 16 GB of RAM; it's not like they are hiding another 16 GB in the power supply lol.
But then again, if I were using outdated tech (primitive shaders), I would keep my mouth shut.

Ssssecretive
 