
Performance Test (GigaPixels/s): PS5 outperforms RTX 2080 by a wide margin under heavy load

LordOfChaos

Member

Guilty_AI

Member
"There will be surprises". Most scoffed it off. Maybe it IS dat PSTriple.

So the surprise was rendering specific text fonts all over the screen a few microseconds faster than similar equipment? Can't say I wouldn't be surprised at this turn of events though.
 

Deleted member 17706

Unconfirmed Member
Yeah, I've been super impressed by the PS5. I've got a Ryzen 9 3900x and Vega 64 (around GTX 1080 performance) build and it really can't come close to matching the PS5 it seems. I'm still waiting to upgrade my GPU until probably next year, so for now, I think I'll be getting multiplat stuff on PS5 instead of PC (Immortals: Fenyx Rising is the first game I made this decision for).

They just need to patch in VRR support!
 

LordOfChaos

Member
So the surprise was rendering specific text fonts all over the screen a few microseconds faster than similar equipment? Can't say I wouldn't be surprised at this turn of events though.

It's also sustaining higher resolutions in many cases, this test may be a hint as to why.
 

diffusionx

Gold Member
Yeah, I've been super impressed by the PS5. I've got a Ryzen 9 3900x and Vega 64 (around GTX 1080 performance) build and it really can't come close to matching the PS5 it seems. I'm still waiting to upgrade my GPU until probably next year, so for now, I think I'll be getting multiplat stuff on PS5 instead of PC (Immortals: Fenyx Rising is the first game I made this decision for).

They just need to patch in VRR support!

Vega 64... that GPU is like an anchor dragging down the rest of your system. Not that you could get a new GPU if you wanted. I'm in a similar boat (1080TI).

I still would rather get games on PC though.
 

Krappadizzle

Gold Member
 

Zadom

Member
The OP took no shots at XBox but lots of XBox guys came in to take shots at PlayStation. Especially odd considering the Twitter poster said he expected XBox to perform similarly. Some Xbox fans on full attack mode recently. But no need here, just a benchmark without comparing XBox. XBox fans can be happy with their console and it’s ok to let PlayStation fans be happy with theirs.
 

Deleted member 17706

Unconfirmed Member
Vega 64... that GPU is like an anchor dragging down the rest of your system. Not that you could get a new GPU if you wanted. I'm in a similar boat (1080TI).

I still would rather get games on PC though.

Yeah, I know. I got it back in August of 2017 and it hasn't been the best GPU purchasing decision I've ever made... It's fine enough for 1440p gaming, though.

I'm holding out for the 6900XT to see if it's worth the price tag, but I'll probably end up trying to get a 3080 Super whenever they release those.
 

S0ULZB0URNE

Member
Of course. They are better now, but monitors keep getting better too.

I sit close, so even on the "smallest" 55-inch 4K TV the PPI is awful.
27-32 inches at 1440p or 4K is the way to go for me. Also less lag, VRR, higher refresh rates, etc., AND a lower price for the same features. Calibration is easier too.

TVs currently have the advantage: HDMI 2.1, as low if not lower input lag, better HDR, non-IPS (IPS PQ sux), bigger, and sometimes cheaper.
 

Dr Bass

Member
I can't even begin to unpack the complex levels of delusion where people find value in this kind of synthetic nonsense. Furthermore, beyond people trying to make the PS5 seem to be more than it is, this man is not only a former developer at Naughty Dog, he's plugging his own test software...

Get out of here... Jesus..

Yes, who needs the opinions of ex-Naughty Dog developers when we have DynamiteCop[!] around to tell us about the true reality of the situation.
 
Yes, who needs the opinions of ex-Naughty Dog developers when we have DynamiteCop[!] around to tell us about the true reality of the situation.
I'm pretty sure even that guy would be the first to admit that this is never going to make a noticeable difference in an actual game.
 

Aarbron

Member
Do you really only game on PC for graphics?

More out of old habit, to be honest.

My gaming has always been done on computer/PC desktop:

Commodore 64 -> Amiga 500 -> Amiga 1200 -> Pentium PC etc etc.

Consoles have largely been absent, but that started to change with the PS3.

However, these days I do the majority of my "serious" computing on an old Lenovo T470. Also, the older I get, the more I prefer the convenience of consoles. The graphical upgrades of the new gen (both PS5 and XSX) are a welcome bonus.

The only game that really anchored me to a PC desktop was World of Warcraft, but I barely played the last expansion and have no interest in the new expansion.
 
Yeah, I know. I got it back in August of 2017 and it hasn't been the best GPU purchasing decision I've ever made... It's fine enough for 1440p gaming, though.

I'm holding out for the 6900XT to see if it's worth the price tag, but I'll probably end up trying to get a 3080 Super whenever they release those.

At this point only idiots and bitcoin miners are buying GPUs. If you spend over MSRP on a GPU you are not helping.
 

Armorian

Banned
4K 60 Ultra with that GPU? You're not kidding anyone here, most of us have high-end PCs & that statement is false.

Low Vs Ultra GeForce RTX 2080 Super 8GB Performance Review

Running a GeForce RTX 2080 Super 8GB to play Assassin's Creed: Valhalla, we expect a bearable 57 FPS, with that performance recorded at 1920x1080 running High graphics.

Playing Assassin's Creed: Valhalla at 4K Ultra is certainly possible with this graphics card, and we expect it will return around 41 frames per second at that top resolution.

Low vs Ultra summary: a good resolution range for this game would certainly be 1080p. Assassin's Creed: Valhalla will get 108 FPS at 1080p Low, 85 FPS at 1080p Medium, and a passable 57 FPS at 1080p High, whereas 1080p Ultra can still reach 41 FPS.




You know that the PS5 version is ~1440p and on who knows what settings (not on ultra, that's for sure)? Plus this game just performs badly on Nvidia GPUs; they are not in their normal performance tiers at all.

Too bad he doesn't have a 5700XT or other RDNA1 GPU to compare to. Something in the direct lineage of the PS5 would be nice.

The 6700 will have 36 CUs and could be clocked exactly like the PS5; we will see how that one performs.
 

synce

Member
The PS4 was on par with the older 780, so it makes sense for the PS5 to match a 2080, which is 2 years old now. So should the 3060, which will likely be $300. This is how every GPU gen has gone, so nothing too impressive.
 

Deleted member 17706

Unconfirmed Member
At this point only idiots and bitcoin miners are buying GPUs. If you spend over MSRP on a GPU you are not helping.

Yeah, the eBay prices are just absurd and they've caused the RTX 2000 series to basically stay at MSRP despite not being nearly worth it anymore.

Really hate how hard it's been to get quality GPUs in recent years.
 

Deleted member 17706

Unconfirmed Member
The PS4 was on par with the older 780, so it makes sense for the PS5 to match a 2080, which is 2 years old now. So should the 3060, which will likely be $300. This is how every GPU gen has gone, so nothing too impressive.

It's an even better value proposition than it was back then, though. The price difference is bigger and the consoles don't have crippled CPUs, which means they should be able to compete for longer.
 
That dude seems like a smart MF'er lol

If you've ever used Illustrator and opened or worked on a file with a shit ton of vector points, you'll know how taxing that is on your system... If my understanding of what's going on here is accurate (but I could totally be wrong, lol), that's what his program measures: the number of these vector points it can render/handle, or something like that.

Edit: I think people are taking the "font" thing too literally; vector fonts are made of points.
 

rnlval

Member
Under heavy workload the PS5 is showing very impressive computational power, far exceeding that of the RTX 2080. At the moment the guy doesn't have an RTX 3080, but soon he will try to get one and compare.






Cerny has done an amazing job given the budget constraints they had designing the PS5. Impressive stuff.


His biography:
Eric Lengyel
From Wikipedia:


Eric Lengyel is a computer scientist specializing in game engine development, computer graphics, and geometric algebra. He holds a Ph.D. in Computer Science from the University of California, Davis and a master's degree in Mathematics from Virginia Tech.

Lengyel is an expert in font rendering technology for 3D applications and is the inventor of the Slug font rendering algorithm, which allows glyphs to be rendered directly from outline data on the GPU with full resolution independence.[1] Lengyel is also the inventor of the Transvoxel algorithm, which is used to seamlessly join multiresolution voxel data at boundaries between different levels of detail that have been triangulated with the Marching cubes algorithm.[2]

Among his many written contributions to the field of game development, Lengyel is the author of the four-volume book series Foundations of Game Engine Development. The first volume,[3] covering the mathematics of game engines, was published in 2016 and is now known for its unique treatment of Grassmann algebra. The second volume,[4] covering a wide range of rendering topics, was published in 2019. Lengyel is also the author of the textbook Mathematics for 3D Game Programming and Computer Graphics[5] and the editor for the three-volume Game Engine Gems book series.[6][7][8]

Lengyel founded Terathon Software in 2000 and is currently President and Chief Technology Officer at the company, where he leads development of the C4 Engine. He has previously worked in the advanced technology group at Naughty Dog, and before that was the lead programmer for the fifth installment of Sierra's popular RPG adventure series Quest for Glory. In addition to the C4 Engine, Lengyel is the creator of the Open Data Description Language (OpenDDL) and the Open Game Engine Exchange (OpenGEX) file format.[9]
Lengyel is originally from Reynoldsburg, Ohio, but now lives in Lincoln, California. He is a cousin of current Ohioan and "Evolution of Dance" creator Judson Laipply.





The RTX 2080 still has the edge in the gigapixel fill test with the ordinary font, which can indicate delta color compression superiority.

For a new-product-against-new-product comparison, try it against the RTX 3070 with 96 ROPS or the RTX 3060 Ti with 80 ROPS (1).

AMD's RX 6700 XT's 64 ROPS may have problems against the RTX 3060 Ti's 80 ROPS. AMD may need to increase the RX 6700 XT's clock speed.

Reference
1. https://www.techpowerup.com/gpu-specs/geforce-rtx-3060-ti.c3681




[TechPowerUp relative performance chart at 3840x2160]


The RTX 3060 Ti beats the RX 5700 XT, RTX 2080, and RTX 2080 Super.

The RTX 3060 Ti has a $400 price tag. Any PC with 8 cores with 256-bit AVX2 at >3.5 GHz should be able to handle an RTX 3060 Ti upgrade.

For my gaming PC's MSI RTX 2080 Ti 11 GB Gaming X Trio AIB OC, I plan to upgrade to an RTX 3080 Ti 20 GB in Q1 2021.
 

rnlval

Member
SO PS5 is not just a RTX 2070?! You mean it's above a RTX 2080 in performance even approaching a RTX 2080TI? Who would have thought :messenger_open_mouth:

Oh wait, when I called this exact conclusion and broke this down in detail back in April 2020, how many people believed me?

NeoGaf PS5 GPU Analysis (2080 -2080Ti)
The PS5's 64 ROPS run at a 2230 MHz clock speed vs the RTX 2080 FE's 1897 MHz (1).

Reference
1. https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/37.html

RTX 2080 Ti has 88 ROPS.

NVIDIA is aware of AMD's raster operation fill rate improvements.
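The ROP and clock figures quoted above can be sanity-checked with quick arithmetic. A minimal sketch of the theoretical peak only; sustained fill rates in a real workload also depend on memory bandwidth and color compression:

```python
# Peak pixel fill rate = ROPs x clock. The figures below are the ones quoted
# in this thread (PS5 boost clock, RTX 2080 FE observed clock); treat the
# results as theoretical ceilings, not measured throughput.

def peak_fill_rate_gpix(rops, clock_mhz):
    """Theoretical peak fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

ps5 = peak_fill_rate_gpix(64, 2230)       # ~142.7 Gpix/s
rtx_2080 = peak_fill_rate_gpix(64, 1897)  # ~121.4 Gpix/s

print(f"PS5: {ps5:.1f} Gpix/s, RTX 2080 FE: {rtx_2080:.1f} Gpix/s")
```

Same ROP count on both, so the gap here is purely the clock-speed difference.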
 
And suddenly - a Slug website appears. I found this after doing my own search, then combed the thread, and it turns out someone used @sluglibrary to denote a website and my mind simply glazed over the fact that this website was already posted. www.sluglibrary.com

Surprisingly, having looked into how this benchmark works based on the material provided on the website - this software does in fact use polygonal outline data for its fonts and works as I previously surmised: by condensing fonts into 2D polygonal assets that essentially rely on raw vertex Bézier curve data. The standard benchmark utilizes a plain bounding box with a texture sprawled out over the polygon, while the more sophisticated variant uses tight font outlines built from the point data. It then separates these fonts into 2 categories: plain polygon box and tight polygon box. For a more detailed breakdown read my previous thread posts.

"A specialized shader efficiently determines how much each pixel inside the box is covered by the glyph using the original Bézier curve data stored in the font. Because no precomputed images or distance fields are involved, the result is a sharp outline at all resolutions with no faceting or blurring. Slug has the ability to apply a couple speed optimizations at large font sizes, one of which is rendering a tight bounding polygon instead of a plain box. "
 

rnlval

Member
Yeah, the eBay prices are just absurd and they've caused the RTX 2000 series to basically stay at MSRP despite not being nearly worth it anymore.

Really hate how hard it's been to get quality GPUs in recent years.
Companies such as HP had very good PC sales in 2020. Increased PC sales lead to increases in discrete GPU sales on top of DIY AIB GPU sales.

AMD has ended the PC market's Intel quad-core CPU stagnation, i.e. the PC market is healthy again.
 

Deleted member 17706

Unconfirmed Member
Companies such as HP had very good PC sales in 2020. Increased PC sales lead to increases in discrete GPU sales on top of DIY AIB GPU sales.

AMD has ended the PC market's Intel quad-core CPU stagnation, i.e. the PC market is healthy again.

I've actually been thinking of just buying a pre-built PC since that seems to be the only way to get an RTX 3000 series GPU for a reasonable price.
 
And suddenly - a Slug website appears. I found this after doing my own search, then combed the thread, and it turns out someone used @sluglibrary to denote a website and my mind simply glazed over the fact that this website was already posted. www.sluglibrary.com

Surprisingly, having looked into how this benchmark works based on the material provided on the website - this software does in fact use polygonal outline data for its fonts and works as I previously surmised: by condensing fonts into 2D polygonal assets that essentially rely on raw vertex Bézier curve data. The standard benchmark utilizes a plain bounding box with a texture sprawled out over the polygon, while the more sophisticated variant uses tight font outlines built from the point data. It then separates these fonts into 2 categories: plain polygon box and tight polygon box. For a more detailed breakdown read my previous thread posts.

"A specialized shader efficiently determines how much each pixel inside the box is covered by the glyph using the original Bézier curve data stored in the font. Because no precomputed images or distance fields are involved, the result is a sharp outline at all resolutions with no faceting or blurring. Slug has the ability to apply a couple speed optimizations at large font sizes, one of which is rendering a tight bounding polygon instead of a plain box. "


I have no idea what any of this means.
 

rsouzadk

Member
If you watch the video:

He says the 2060S reaches 32fps at native 4K in that scene, but it will need DRS to hold a locked 30fps... so the resolution needs to drop to reach a locked 30fps.
About the Xbox Series X, he doesn't know how far above 30fps it is running, and he doesn't say how far below 4K the console is dropping the resolution.
The filtering quality is a bit better on PC, even with the same settings.

To finish, nVidia does have a better RT solution than AMD... that is why the Series X is comparable to the RTX 2060S in RT performance.

If the game had a 60fps option on consoles we could draw better comparisons.

Series X and PS5 render at 1440p with DRS up to 4K to make RT possible. Add to that, RT in that game has an internal resolution of 1080p on consoles, and a lot of compromises had to be made.
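The dynamic resolution behavior being debated here (render scale sliding between a 1440p floor and native 4K depending on GPU load) can be sketched as a simple frame-time feedback loop. This is purely illustrative - a hypothetical controller, not the implementation any of these games actually use; the 1440p floor and 4K target are the figures quoted in the thread:

```python
# Hypothetical dynamic-resolution-scaling (DRS) controller. Each frame, pick a
# per-axis render scale so the GPU frame time stays near the budget, clamped
# between the quoted 1440p floor and native 4K. Illustrative only.

NATIVE_W, NATIVE_H = 3840, 2160
MIN_SCALE = 1440 / 2160  # ~0.667 per axis, the minimum quoted for consoles

def next_scale(current_scale, gpu_ms, budget_ms=33.3):
    """GPU cost grows roughly with pixel count (scale squared), so solve for
    the scale that would have hit the budget exactly, then clamp."""
    target = current_scale * (budget_ms / gpu_ms) ** 0.5
    return max(MIN_SCALE, min(1.0, target))

def render_resolution(scale):
    return round(NATIVE_W * scale), round(NATIVE_H * scale)

# Light scene: GPU is fast, scale stays at native 4K.
print(render_resolution(next_scale(1.0, gpu_ms=20.0)))   # (3840, 2160)
# Very heavy scene: scale clamps at the 1440p floor.
print(render_resolution(next_scale(1.0, gpu_ms=200.0)))  # (2560, 1440)
```

This is also why a synthetic fill-rate number and an in-game resolution can tell different stories: the controller reacts to whole-frame cost, not to any single throughput metric.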
 

ethomaz

Banned
Series X and PS5 render at 1440p with DRS up to 4K to make RT possible. Add to that, RT in that game has an internal resolution of 1080p on consoles, and a lot of compromises had to be made.
He never saw 1440p lol
It was the minimum resolution the game can drop to, per the config files.

“Minimum resolution is close to 1440p (confirmed again by the PC config files) and I noted that general gameplay shifts between 80 to 100 per cent of full 4K”

80-100% of 4K.

I’m really not sure about your point.

Only RT reflections are rendered at 1080p on Series X & PS5.
 

rsouzadk

Member
He never saw 1440p lol
It was the minimum resolution the game can drop to, per the config files.

“Minimum resolution is close to 1440p (confirmed again by the PC config files) and I noted that general gameplay shifts between 80 to 100 per cent of full 4K”

80-100% of 4K.

I’m really not sure about your point.

Only RT reflections are rendered at 1080p on Series X & PS5.

Yes, he says it uses DRS to a minimum of 1440p to make RT possible, albeit on light scenes/scenarios it is native 4K. But even then, it has compromises. My point is that these synthetic benchmarks don't translate into game performance, at least not now.
 

ethomaz

Banned
Yes, he says it uses DRS to a minimum of 1440p to make RT possible, albeit on light scenes/scenarios it is native 4K. But even then, it has compromises. My point is that these synthetic benchmarks don't translate into game performance, at least not now.
No.

He says the gameplay is most of the time at 80-100% of 4K.
He says he found drops in resolution close to 1440p... the config files later confirmed a 1440p minimum for consoles, but he never saw it hit 1440p.
He says RT reflections run at 1080p.

Spinning to make a point is not a point.

This synthetic bench is about how many text pixels you can draw on the screen... what did you think it was?
 

rsouzadk

Member
No.

He says the gameplay is most of the time at 80-100% of 4K.
He says he found drops in resolution close to 1440p... the config files later confirmed a 1440p minimum for consoles, but he never saw it hit 1440p.
He says RT reflections run at 1080p.

Spinning to make a point is not a point.

This synthetic bench is about how many text pixels you can draw on the screen... what did you think it was?

Ok man.
 