
[DF] Hitman 3 PS5 vs Xbox Series X|S Comparison


Tchu-Espresso

likes mayo on everything and can't dance
This is hysterical!

I can't believe we've come FULL CIRCLE. These charts were presented by Xbox fanboys back when PS4 had the resolution advantage.

This has to be the most ridiculous chart out there. Who sits more than 45ft from their gaming set? Oh oh, I know :lollipop_raising_hand::lollipop_raising_hand::lollipop_raising_hand: people trying to prove a ridiculous argument. I sit 6ft away so I guess I'm completely fine with my 85" screen.

Dates back to Xbox 360/PS3 days. You'll only notice the difference if you have an HDTV. Sucks to be you I guess?
No one sits 45 feet away. Sad it needs to be pointed out that’s not the part you should be focusing on. Or are you just being disingenuous for a laugh?

Are you saying that you can continue to perceive the benefit of resolution differences from any viewing distance?

6 feet from an 85 inch screen? Good for you I guess, although I’m not sure why you would do that to yourself.
 
Pretty Insane..............................Metal.
I thought the cyber dude was bad, but damn, some of these guys "don't care" about the game or comparison, yet have more posts in here than all the fans of the game combined. People need to just learn to play the game on the system they currently own, or pick the system to get the game on, or just get a PC so they don't have to worry about bad performance. Too bad someone who has absolutely no clue who I am or what I own decided I "supposedly" can't afford a 3090.

my headphone setup costs more than his computer
 
Last edited:
I thought the cyber dude was bad, but damn, some of these guys "don't care" about the game or comparison, yet have more posts in here than all the fans of the game combined. People need to just learn to play the game on the system they currently own, or pick the system to get the game on, or just get a PC so they don't have to worry about bad performance. Too bad someone who has absolutely no clue who I am or what I own decided I "supposedly" can't afford a 3090.

my headphone setup costs more than his computer
Actually, you accused that user of being a fanboy and as a result did not believe his claims of having said GFX card. He then proved it to you and from my perspective it agitated you to start talking about income.

Edit: He did mention you not being able to afford one.
 
Last edited:
Actually, you accused that user of being a fanboy and as a result did not believe his claims of having said GFX card. He then proved it to you and from my perspective it agitated you to start talking about income.
No no no, I said exactly this, verbatim: "If you had a 3090, I doubt you would care so much about ps5 having inferior performance." The dude might actually have one, while at the same time trying to say I'm practically poor. Obviously the dude doesn't have the slightest clue, just to correct your statement. I don't care about dick measuring, especially online, but you can't throw stones at bigger buildings. That's why a mere $1500 GPU flex was funny to me, hence my prior comment. A proper flex would be something rare or unaffordable. But it still displays a smoll PP and insecurity to bring it up in the first place.
 
Last edited:

JackMcGunns

Member
No one sits 45 feet away. Sad it needs to be pointed out that’s not the part you should be focusing on. Or are you just being disingenuous for a laugh?

Are you saying that you can continue to perceive the benefit of resolution differences from any viewing distance?

6 feet from an 85 inch screen? Good for you I guess, although I’m not sure why you would do that to yourself.

You don’t get it, do you? The chart is rigged because it covers a 50ft range, which distorts the focus. The focus should be the 6ft - 10ft range where most people actually sit, but that ends up as a TINY section at the very bottom of the chart, with the MEDIAN of the axis sitting at 25ft. No one games at that distance.

There are much better charts out there
 

assurdum

Banned
No, I'd like to continue, because I think you're scared of your rhetoric getting blown open. Welp, my C4s are already placed and I'm about to detonate.

Reducing RAM bandwidth to a percentage is a ridiculous way to compare bandwidths, because you're doing it from a reductive POV. Operations like RT and feeding higher-resolution assets to the GPU rely heavily on bandwidth. The AMD cards only compete with Nvidia's in traditional rasterization performance, and it's arguable whether the narrower GDDR6 bandwidth helps or hurts; you always have to keep in mind that RDNA 2 cards have more RAM capacity than the Nvidia cards, in some cases almost 2x more, and that's almost as important as the actual bandwidth.

You do realize that features of Series X's I/O, like DirectStorage, are not yet widely deployed on PC, right? DirectStorage is a restructuring of the filesystem as a whole; it's not just a means of getting more realized bandwidth from NVMe SSDs. Other features tied to the I/O but not necessarily in it (like SFS) have no hardware equivalents on PC. So again, you're completely incorrect on that one as well. There's been no confirmation of whether the RDNA 2 PC GPUs have cache scrubbers; IC, as known so far, is just a fat 128 MB L3$ embedded on the GPU. Very similar to the eSRAM the XBO had; both were for framebuffers. IC is enough for 4x 4K framebuffers or 1x 8K framebuffer, and we don't exactly know the bandwidth or latency of IC either.

Seeing that, again, AMD's RDNA 2 equivalents can keep up with the new Nvidia cards in rasterization performance only, we can assume both the bandwidth and latency of IC are quite good. You don't actually need cache scrubbers for an L3$, but we can throw a bone and say RDNA 2 GPUs might have them. The point is that all of this only benefits AMD in rasterized performance; in anything that can leverage RT or DLSS-style image upscaling, AMD loses out massively. That's because things like RT benefit from bandwidth, capacity AND actual hardware acceleration in the GPU; IC only helps with one, maybe two of those, but you need all three.

As for what inspired IC, Sony may have had some influence, but a fat L3$/L4$ is nothing new in system design, at all. Again, MS did this with the 32 MB embedded eSRAM in the XBO, and Intel did it with embedded eSRAM (maybe eDRAM?) cache on some of their CPUs from the early 2010s. AMD just took an age-old idea and put their own spin on it. I can guarantee you they weren't simply copying what Sony was doing, and you'd have to verify that the RDNA 2 PC cards have cache scrubbers before crediting this to Sony anyway.

You've been going all over the place, because you seemingly don't understand how bus contention works, nor that you cannot aggregate access to the fast and slow pools as a simple average, since an average doesn't reflect how the two pools are used in practice. We should also assume that if effective bandwidth from mixed fast and slow pool access dropped anywhere near PS5's peak bandwidth, let alone below it, Microsoft would have taken the hit and gone with 20 GB. It's not like they're a team of gasping seals that just learned about electronics yesterday 🤷‍♂️
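To put numbers on why a simple average is wrong, here's a toy model (my own simplification, using the public 560/336 GB/s Series X figures; a real memory controller interleaves requests far more cleverly than this). The point it illustrates: effective bandwidth depends on what share of the traffic hits the slow pool, not on the 10/6 GB capacity split.

```python
# Toy model only, NOT how the real memory controller behaves: if a fraction
# of all bytes moved targets the 336 GB/s pool and the rest targets the
# 560 GB/s pool, the effective rate is the traffic-weighted harmonic mean.

FAST_BW, SLOW_BW = 560.0, 336.0   # GB/s, public Series X figures
PS5_BW = 448.0                    # GB/s, PS5 peak, for reference

def effective_bw(slow_share: float) -> float:
    """Effective GB/s when `slow_share` of the traffic hits the slow pool."""
    return 1.0 / ((1.0 - slow_share) / FAST_BW + slow_share / SLOW_BW)

for share in (0.0, 0.10, 0.25, 0.50):
    flag = "above" if effective_bw(share) > PS5_BW else "below"
    print(f"{share:4.0%} slow-pool traffic -> {effective_bw(share):5.1f} GB/s ({flag} PS5's 448)")
# 0% -> 560.0, 10% -> 525.0, 25% -> 480.0, 50% -> 420.0
```

On this crude model the workload has to push a bit over a third of its traffic through the slow pool before the mix dips below PS5's peak, which is exactly why the answer depends on the traffic mix rather than on a capacity-weighted average.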



The amount of willful ignorance and disinformation in this thread about Hitman 3 supposedly not being a native port on PS5 is astounding.

I bet someone is probably calling me a Microsoft fanboy as we speak, even though I was just pointing out technical deficiencies in The Medium yesterday :LOL:.
Man, you really can't get to the point, can you? First you say the PS5's cache scrubbers are nothing new, but you don't have a clue how they work on the PS5 or what their benefits are; even if similar ideas existed in the past, that doesn't mean they work the same way here. Secondly, what research lets you state with absolute certainty that the split memory configuration on Series X has a minimal, negligible impact on performance, when it has never been done before in a unified architecture? You're talking about nothing here. I'll repeat it: that configuration is not there to prioritize performance. It's there primarily to favor the multiplatform development environment, so the assumption that the impact will be minimal hasn't been demonstrated yet, especially considering the significant FPS issues we've already seen when bandwidth is stressed (but sure, it's all just early-tools issues; look at how a modded Skyrim runs at 60 FPS on both consoles and prepare to be shocked).
Another ridiculous argument is selling the Series X's software data management as revolutionary, something never done before (ah, the MS propaganda), comparable to a custom hardware solution; like, what? Nothing precludes it being possible on any other kind of machine sooner or later. And to finish: how the hell do you know how Hitman 3's code works on PS5 well enough to call it an absolutely 100% native port and laugh at anyone who claims otherwise, when that engine also has to handle both Hitman 1 and 2 via PS4 Pro BC? And believe me, you don't need to be Carmack to understand something is missing there.

Christ, the paragraph about how "reducing the bandwidth to a percentage number is unfair": what the hell were you even trying to claim there? But apparently reducing the PS5's custom hardware features to a bunch of stuff already seen in the past is fair.

To summarize your posts: every advantage of the Series X is notable, but whatever relates to the PS5 is nothing new and diminishing returns.
 
Last edited:

Tchu-Espresso

likes mayo on everything and can't dance
You don’t get it, do you? The chart is rigged because it covers a 50ft range, which distorts the focus. The focus should be the 6ft - 10ft range where most people actually sit, but that ends up as a TINY section at the very bottom of the chart, with the MEDIAN of the axis sitting at 25ft. No one games at that distance.

There are much better charts out there
I actually think you are again completely missing my point (which again was that at normal viewing distances people would not notice the difference between 1800p and 2160p, hence it being a waste).

But I’ll indulge you. Here’s Rtings’ chart:

[Image: Rtings chart of optimal viewing distance vs. screen size]


There’s your much smaller y-axis (something you felt the need to focus on). Notwithstanding, according to the chart, one would begin to notice an improvement over 1080p from around 8.5 feet and would perceive the maximum benefit of 4K at 4 feet - when playing at 4K on a 65 inch screen. I’ll leave you to draw your own conclusions about what that means for discerning the difference between 1800p and 2160p (hint: you need to sit closer to 4 feet than 8.5 feet).

Comparing that outcome to my original chart, neither chart suggests any material difference when viewing 4K on 65 inches.

To use your own range of normal viewing distances (i.e. 6-10 feet), the full benefit of 4K is lost (i.e. wasted). At 10 feet, you aren’t even perceiving any benefit at all if you have a 65 inch set.
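As a sanity check on those chart numbers, here's the usual 1-arcminute (20/20 vision) rule of thumb worked through in a short sketch; this is my own back-of-envelope arithmetic, not Rtings' exact methodology:

```python
import math

# Rule of thumb: a resolution's full benefit lasts as long as each pixel
# subtends at least 1 arcminute (the classic 20/20-vision acuity limit).
ARCMIN = math.radians(1 / 60)

def full_benefit_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Farthest distance (ft) at which each pixel still subtends >= 1 arcmin."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # width of a 16:9 panel
    return (width_in / horizontal_px) / math.tan(ARCMIN) / 12

for res, px in (("1080p", 1920), ("1800p", 3200), ("2160p", 3840)):
    print(f'65" {res}: full benefit out to ~{full_benefit_distance_ft(65, px):.1f} ft')
# 65" 1080p: ~8.5 ft, 1800p: ~5.1 ft, 2160p: ~4.2 ft
```

Those endpoints land right on the chart's 8.5 ft / 4 ft figures, and the 1800p row shows why the 1800p-vs-2160p gap only matters inside roughly 5 feet.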
 

FritzJ92

Member
GDK is MORE of a simple UPDATE than the PS5 environment is; in fact, I would even call it just a rename, to be honest.

DirectX 12 Ultimate was already in the GDK and is well known to all PC devs anyway (it's also a basic update of DirectX 12, which has been out for years).

PS5 has GNMX, which is a high-level API much like DirectX: pretty simple to use and master, but it also comes with a performance hit. Unlike GDK, though, the PS5 environment also includes GNM, a low-level API specific to PS5 hardware.

So, as you are clearly in the know, please explain just why you think the jump from XDK to GDK is more complicated than the jump from PS4 to PS5.

Is it because, since the Xbox 360 (which had a superb dev platform), MS have really dropped the ball with their dev environment and lag WAY, WAY, WAY behind Sony, whose dev kit was utter shite on PS3?

Sony's devkit is probably far better than GDK, but that's not because of any changes; it's because MS's changes have not caught up.

This of course doesn't change the fact that DirectX 12 (Ultimate, if you prefer) is still DirectX 12 (Ultimate), and Visual Studio is still Visual Studio. In the wash, when all the clowns stop lapping up this tools nonsense, the only real changes from XDK to GDK will be to aid cross-platform games, and will have very little, if anything, to do with anything specific to Series X|S.

Slight correction: the DX12u implementation used by the Xbox is specific to the Xbox; it's not the exact same thing used on PC.
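On the high-level vs low-level point from the quote: GNM and GNMX are proprietary and undocumented, so the sketch below is purely conceptual, with made-up names throughout. It only illustrates the generic trade-off, namely that a high-level layer pays validation and state-tracking costs on every call, while a low-level API lets the engine record command buffers once and replay them cheaply.

```python
import time

# Purely conceptual sketch: every name below is invented. This is NOT GNM,
# GNMX, or DirectX; it just shows where a high-level layer's overhead lives.

class HighLevelAPI:
    def __init__(self):
        self.state = {}

    def draw(self, mesh, material):
        # Validation and implicit state management are paid on *every* call.
        if mesh is None or material is None:
            raise ValueError("invalid draw")
        if self.state.get("material") != material:
            self.state["material"] = material   # hidden state change
        return ("draw", mesh, material)

class LowLevelAPI:
    @staticmethod
    def record(draws):
        # Validation cost is paid once, off the hot path, at record time.
        return [("draw", mesh, material) for mesh, material in draws]

    @staticmethod
    def submit(command_buffer):
        # Replaying pre-validated commands is the cheap per-frame part.
        return command_buffer

draws = [(f"mesh{i}", "stone") for i in range(100_000)]

hl = HighLevelAPI()
t0 = time.perf_counter()
for mesh, material in draws:
    hl.draw(mesh, material)
hl_ms = (time.perf_counter() - t0) * 1e3

cb = LowLevelAPI.record(draws)            # one-time cost
t0 = time.perf_counter()
LowLevelAPI.submit(cb)                    # per-frame cost
ll_ms = (time.perf_counter() - t0) * 1e3

print(f"high-level path: {hl_ms:.2f} ms per frame, low-level replay: {ll_ms:.4f} ms")
```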
 

Shmunter

Member
44 pages of Sony fanboys trying to defend something indefensible. 😏 That’s the only reason for all these pages. But keep up the great work.
Any Sony fan entering the thread defending framerate over resolution becomes the unwitting subject of a vigorous Xbox Bukakke session.

Frankly it’s good to see such vigour after everything that’s happened since the next gen launch.
 

assurdum

Banned
44 pages of Sony fanboys trying to defend something indefensible. 😏 That’s the only reason for all these pages. But keep up the great work.
The "indefensible". Years ago, when PS4 regularly beaten Xbox one if fps was smoother, 1080p Vs 900p in the DF narrative was " just a touch soft, smoothness is overall always preferable". Now, 4k native is what Xbox fan waiting for, this game is phenomenal in 4k output, a tangible difference, fuck smoother FPS who cares of some drops, praise the glory of 4k native ...Christ I literally annoyed to DF narrative, every single time the same story and people even defend them. They act like a rabid fanboy when Xbox get it's win. It's annoying see a supposed tech site feed the console war masses in this way and the hypocrisy they have when for months talked of native resolution as diminished return, Jeez at least have some dignity.
 
Last edited:
Man, you really can't get to the point, can you? First you say the PS5's cache scrubbers are nothing new, but you don't have a clue how they work on the PS5 or what their benefits are; even if similar ideas existed in the past, that doesn't mean they work the same way here. Secondly, what research lets you state with absolute certainty that the split memory configuration on Series X has a minimal, negligible impact on performance, when it has never been done before in a unified architecture? You're talking about nothing here. I'll repeat it: that configuration is not there to prioritize performance. It's there primarily to favor the multiplatform development environment, so the assumption that the impact will be minimal hasn't been demonstrated yet, especially considering the significant FPS issues we've already seen when bandwidth is stressed (but sure, it's all just early-tools issues; look at how a modded Skyrim runs at 60 FPS on both consoles and prepare to be shocked).
Another ridiculous argument is selling the Series X's software data management as revolutionary, something never done before (ah, the MS propaganda), comparable to a custom hardware solution; like, what? Nothing precludes it being possible on any other kind of machine sooner or later. And to finish: how the hell do you know how Hitman 3's code works on PS5 well enough to call it an absolutely 100% native port and laugh at anyone who claims otherwise, when that engine also has to handle both Hitman 1 and 2 via PS4 Pro BC? And believe me, you don't need to be Carmack to understand something is missing there.

Christ, the paragraph about how "reducing the bandwidth to a percentage number is unfair": what the hell were you even trying to claim there? But apparently reducing the PS5's custom hardware features to a bunch of stuff already seen in the past is fair.

To summarize your posts: every advantage of the Series X is notable, but whatever relates to the PS5 is nothing new and diminishing returns.

Clearly, you don't know what you're talking about.
 
Last edited:

assurdum

Banned
Clearly, you don't know what you're talking about.
What a convincing argument, based on nothing. I'm starting to suspect I know more about these things than you do, if that's what your arguments look like. Most of the defensive posts about the Series X hardware are basically based on raw spec numbers, while most of what I've seen from the MS "panel" is overpromised tech improvements rather than indisputable data. And the funny thing is people blame Sony for not being clear. If being clear means promising the sky like MS does, I prefer Sony's silence and letting the games speak for themselves. Every single time I try to raise some criticism of the Series X, it's "you don't know what you're talking about"; but when the PS5 is the subject, it's "yo dude, look, cache scrubbers are nothing special, the Geometry Engine was recycled from Vega, nothing new, nothing new, goodnight pal", when we don't have a single piece of official data about any of it. Whatever helps you sleep at night, I guess.
 
Last edited:
What a convincing argument, based on nothing.

Mixing up cache scrubbers and Infinity Cache, for example...

I don't understand how so many people think that MS, when they decided on this memory configuration (to reduce cost), NEVER measured or simulated the impact on performance, NEVER had any exchange with internal game devs (and with game-engine companies, for example), and would have kept that configuration purely for the server use case even if it tanked the memory side of the system to a level near or below the PS5's.
You clearly think the MS engineering team and the AMD side are completely dumb. If it really reduced memory performance to an important degree, it is obvious that MS would have gone with 20GB (at a bad cost impact) or simply reduced the bus to 256 bits (bad for the server use, but it would have decreased the Series X's cost).
Every time we make important hardware changes for any reason (cost, performance), we run so many simulations and have so many exchanges with customers or the internal software team that, to me, it is CLEARLY impossible for the MS hardware team not to have data showing that this configuration would have some impact but never tank the memory side, and that Series X would keep its bandwidth advantage over the PS5. On these points, and from my past experience, I'd say the impact of the split memory is overestimated.

Another thing: you point, as if completely sure, to "the significant FPS issues we've already seen when bandwidth is stressed", when you don't actually know the reason for those FPS issues, and you forget that most of the drops we saw in ACV, Hitman 3, Skyrim, COD CW, etc. seem more likely linked to heavy alpha-effect usage, so more plausibly down to fillrate performance...
Last thing: you answered him "what research lets you state with absolute certainty that the split memory configuration on Series X has a minimal, negligible impact on performance, when it has never been done before in a unified architecture", and that premise is simply, totally false; I'm well placed to speak about it. Do you really think such a configuration has never been used in any other hardware? There are more than just game consoles and PCs in the hardware landscape...
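For what it's worth, the fillrate point is easy to put rough numbers on (public ROP counts and clocks; real alpha-blend throughput also depends on memory bandwidth and pixel format, so treat these strictly as peak figures):

```python
# Peak pixel fillrate = ROPs x GPU clock. Both consoles have 64 ROPs
# (public figures). Upper bound only: sustained alpha blending is also
# limited by bandwidth and the pixel format being blended.
ROPS = 64
sx  = ROPS * 1.825   # Series X, fixed 1.825 GHz -> ~116.8 Gpixels/s
ps5 = ROPS * 2.23    # PS5, up to 2.23 GHz       -> ~142.7 Gpixels/s
print(f"PS5 peak pixel fill ~{(ps5 / sx - 1) * 100:.0f}% higher")   # ~22%
```

That gap is one plausible reason alpha-heavy scenes could dip on Series X even with its bandwidth advantage.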
 
Last edited:
What a convincing argument, based on nothing. I'm starting to suspect I know more about these things than you do, if that's what your arguments look like. All the defensive posts about the Series X hardware are basically based on raw spec numbers and nothing more. But every single time I try to raise some criticism of it, oh, I don't know what I'm talking about. Sure, dude. Whatever helps you sleep at night.

You can think what you want about me; in fact, I clearly don't care. I just dislike it when people claim a lot of wrong things based on knowledge they clearly don't have.
 

assurdum

Banned
You can think what you want about me; in fact, I clearly don't care. I just dislike it when people claim a lot of wrong things based on knowledge they clearly don't have.
So practically I don't know what I'm talking about because I expected some issues with the Series X's performance, since it seemed more geared toward multiplatform development than optimal performance? It's just my opinion.
 
Last edited:

assurdum

Banned
Mixing up cache scrubbers and Infinity Cache, for example...

I don't understand how so many people think that MS, when they decided on this memory configuration (to reduce cost), NEVER measured or simulated the impact on performance, NEVER had any exchange with internal game devs (and with game-engine companies, for example), and would have kept that configuration purely for the server use case even if it tanked the memory side of the system to a level near or below the PS5's.
You clearly think the MS engineering team and the AMD side are completely dumb. If it really reduced memory performance to an important degree, it is obvious that MS would have gone with 20GB (at a bad cost impact) or simply reduced the bus to 256 bits (bad for the server use, but it would have decreased the Series X's cost).
Every time we make important hardware changes for any reason (cost, performance), we run so many simulations and have so many exchanges with customers or the internal software team that, to me, it is CLEARLY impossible for the MS hardware team not to have data showing that this configuration would have some impact but never tank the memory side, and that Series X would keep its bandwidth advantage over the PS5. On these points, and from my past experience, I'd say the impact of the split memory is overestimated.

Another thing: you point, as if completely sure, to "the significant FPS issues we've already seen when bandwidth is stressed", when you don't actually know the reason for those FPS issues, and you forget that most of the drops we saw in ACV, Hitman 3, Skyrim, COD CW, etc. seem more likely linked to heavy alpha-effect usage, so more plausibly down to fillrate performance...
Last thing: you answered him "what research lets you state with absolute certainty that the split memory configuration on Series X has a minimal, negligible impact on performance, when it has never been done before in a unified architecture", and that premise is simply, totally false; I'm well placed to speak about it. Do you really think such a configuration has never been used in any other hardware? There are more than just game consoles and PCs in the hardware landscape...
Oh really, there is other hardware with a virtually split RAM/bandwidth configuration in a unified architecture like the Series X? That's crazy to hear. Why would anyone even have done that in the past? Can you give me an example? And who ever said MS hadn't tested their hardware properly? I'm just pointing out that the Series X hardware prioritizes multiplatform development, and the new GDK consolidates that. If you think such a choice won't negatively affect performance compared to a whole hardware design built around a low-level API and dedicated tools, well, that remains to be seen, and saying so is far from spreading misinformation. There are other developers who argue the same. Oh, and keep in mind I never said the PS5 is more powerful or anything. In my opinion the "issues" in some multiplats are caused by that choice, but we'll see.
 
Last edited:
44 pages of Sony fanboys trying to defend something indefensible. 😏 That’s the only reason for all these pages. But keep up the great work.
What work? I didn't read a single page, to be honest.

So I'll change my post

WOW 44 pages of Sony fanboys trying to spin this into "that's not true" stuff

Amazing really


Is that ok now?
 

Concern

Member
How do you find the number of posts someone has in a thread? It would be very interesting to see those numbers for this thread. Lol

I don't think there's a way to see it, other than actually counting. I was really bored at work 🤣🤣

What made me do it is how many times I've seen him say "im done" or "im gonna stop here" but he clearly has no self control lol.
 

JackMcGunns

Member
I actually think you are again completely missing my point (which again was that at normal viewing distances people would not notice the difference between 1800p and 2160p, hence it being a waste).

But I’ll indulge you. Here’s Rtings’ chart:

[Image: Rtings chart of optimal viewing distance vs. screen size]


There’s your much smaller y-axis (something you felt the need to focus on). Notwithstanding, according to the chart, one would begin to notice an improvement over 1080p from around 8.5 feet and would perceive the maximum benefit of 4K at 4 feet - when playing at 4K on a 65 inch screen. I’ll leave you to draw your own conclusions about what that means for discerning the difference between 1800p and 2160p (hint: you need to sit closer to 4 feet than 8.5 feet).

Comparing that outcome to my original chart, neither chart suggests any material difference when viewing 4K on 65 inches.

To use your own range of normal viewing distances (i.e. 6-10 feet), the full benefit of 4K is lost (i.e. wasted). At 10 feet, you aren’t even perceiving any benefit at all if you have a 65 inch set.

Well at least now we know where the issue lies... you’re blind :messenger_grinning_sweat:

Your own chart shows Ultra HD at 6’ is fine even for a 55”, and my screen is WAY larger. Who the hell sits 10’ away? You’re grasping at straws.

The bottom line is I’m getting the full benefit of 4K and even beyond; based on the chart, I could even go up to 8K and start seeing the benefit. So speak for yourself.

Again, sucks for you if you’re still gaming on a 30” 720p set, and I’m not judging, but don’t come at me with some bs charts to tell me what I can or can’t see lol.
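Out of curiosity, here is the same 1-arcminute rule of thumb from the earlier sketch applied to an 85-inch panel (back-of-envelope arithmetic only, not any chart's exact methodology):

```python
import math

# Same 20/20-vision rule of thumb as the earlier sketch, for an 85" 16:9
# panel: full benefit lasts while each pixel subtends >= 1 arcminute.
ARCMIN = math.radians(1 / 60)
width_in = 85 * 16 / math.hypot(16, 9)              # ~74.1" panel width
for res, px in (("2160p (4K)", 3840), ("4320p (8K)", 7680)):
    d_ft = (width_in / px) / math.tan(ARCMIN) / 12
    print(f'85" {res}: full benefit out to ~{d_ft:.1f} ft')
# 85" 2160p: ~5.5 ft, 85" 4320p: ~2.8 ft
```

On this simple model a 6 ft seat sits right around the edge of the full-4K range for an 85-inch set, and well inside the range where 4K beats 1080p.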
 
Last edited:

Concern

Member
This is the thread that doesn’t end,
it just keeps going on and on, my friends.
Some people thought the PS5 version was better than it was,
and they can’t stop posting it over and over just because...

Why did I actually just read that in tune? 😭😭😭

I don't know why they haven't just let this thread die. They're actually the ones keeping it on the first page.
 

Gatox

Banned
Why did I actually just read that in tune? 😭😭😭

I don't know why they haven't just let this thread die. They're actually the ones keeping it on the first page.
Even worse, I read his post in tune, then thought you had tried to rhyme and read yours in tune... then was confused as to why it didn't rhyme :messenger_tears_of_joy:
 

assurdum

Banned
Wasn't the excuse for the whole Demon's Souls coming-to-PC debacle that PC stands for "playstation console"? Fucking 😂
Christ. It must really hurt to be on my ignore list, doesn't it? You waste your time searching for and counting the posts of someone who just wants to be left in peace by you, and for what? Do some of you realize how extremely close this is to harassment? If you or people like Concern, Ricky and so on find my opinion ridiculous, fine, ignore it; I even put you on ignore. What the hell more am I supposed to do beyond that?
 
Last edited: