
Digital Foundry: Resident Evil 4 Remake First Look: A Classic in the Making? PS5, PS4, Xbox Series X and PC Tested

TonyK

Member
After days of going back and forth playing the game on PS5 and Xbox Series X, trying to decide which version to buy, whether the one with bad controls or the one with bad IQ, I decided to reset the controls on the Series X to the default values and played the demo three times in a row. After a while, I simply got used to the default aim sensitivity and ended up making headshots with no problem. So I purchased the Xbox version. End of the debate for me, ready for release day!!
 

MarkMe2525

Gold Member
I love the fact that we can get hundreds of posts arguing over the technical superiority of a platform with unfinalized code. Why would one plant a flag in the ground when there are going to be multiple patches coming within the next couple of weeks?

I can't speak for everyone, but I don't like to eat crow. Making definitive statements at this point is just opening yourself up to that.
 

Lysandros

Member
And what about these two examples is "blatantly biased" exactly?
None will satisfy you no matter what, and again, that's fine by me. Just a few from earlier years, for the last time:

55yLUxy.jpg
hQ181El.jpg

fMVeMVb.jpg

B9FHuOn.jpg

(In the context of PS5's variable frequencies, which is pure misinformation)
fRr7Be8.jpg
 

DenchDeckard

Moderated wildly
After days of going back and forth playing the game on PS5 and Xbox Series X, trying to decide which version to buy, whether the one with bad controls or the one with bad IQ, I decided to reset the controls on the Series X to the default values and played the demo three times in a row. After a while, I simply got used to the default aim sensitivity and ended up making headshots with no problem. So I purchased the Xbox version. End of the debate for me, ready for release day!!

That's me too. I've played the demo over ten times now and I'm not missing headshots or anything on Xbox. It's like my brain has just adapted to it.

It looks considerably better to me so sticking with that.
 

DenchDeckard

Moderated wildly
None will satisfy you no matter what, and again, that's fine by me. Just a few from earlier years, for the last time:

55yLUxy.jpg
hQ181El.jpg

(In the context of PS5's variable frequencies, which is pure misinformation)
fRr7Be8.jpg

Literally everything he types here is factually correct apart from what he says is "my hunch"

So you're basically attacking him for nailing actual facts but being wrong with an assumption that Xbox games would perform better out of the gate?

How was he to know that DirectX 12 wasn't, and still isn't, 100 percent ready for the limelight? That's on Microsoft, not him.
 
literally everything stated here is factual...
not a single word here is wrong or unreasonable

Not really.

He said he thought the XSX perf advantage would materialize based on some niche hardware features that only the XSX has. It hasn't. Most games run better on PS5. So clearly his interpretation of how much these features "matter" isn't really true at all.
 

01011001

Banned
Not really.

He said he thought the XSX perf advantage would materialize based on some niche hardware features that only the XSX has. It hasn't. Most games run better on PS5. So clearly his interpretation of how much these features "matter" isn't really true at all.

that was a reasonable speculation based on the hardware specs presented.

also, we've barely seen a single game use any of these features yet.
 

damidu

Member
And what about these two examples is "blatantly biased" exactly?
nah, I can easily put that second post in biased territory,
which is already side-eyed further down the same thread.

still the same stupid narrative of the "lot better" failing to materialize three years into the gen because of "lazy devs"
 

onQ123

Member
The PS5 has the PS4 version holding it back, that's all, so all they have to do is cancel the PS4 version and boom, the PS5 will magically be better.

Isn't that how it works? Last gen is holding us back on the PlayStation platform, but it's current gen only on Xbox. You just wait until they patch away the PS4 and give the PS5 love and optimizations. 😂
 

01011001

Banned
Seemed more like a huge jump to conclusions when viewing the facts more objectively. Each system had pros and cons but they were otherwise very close in perf, and he just seemed to think the XSX's were more meaningful based on a bunch of buzzwords

well, look at Dead Space; that showed what would happen if a game needed VRS to keep the performance stable.

if a game were to use VRS as a technique and it weren't possible to just patch it out without a performance penalty, as it was in Dead Space's case, you now know how that would look.

so it wasn't unreasonable for him to say that VRS Tier 2 is an advantage for the Series X.

furthermore, we don't have a single implementation of SFS yet, nor mesh shaders.

we have to see how these will impact games as soon as they actually start getting used.
and that's where Microsoft honestly failed more than anything. their tech team should have been working with developers from the start to make sure their hardware gets used optimally.
they can't just sit back and hope that developers optimise their games for the platform that sells fewer units.
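The trade-off being argued about here can be sketched with a toy cost model (hypothetical numbers, nothing to do with Capcom's actual renderer): VRS lets the GPU shade low-detail screen tiles at a coarser rate, so fewer shader invocations are needed per frame.

```python
# Toy model of Variable Rate Shading (VRS): each screen tile is tagged with a
# shading rate; a (2, 2) rate shades one sample per 2x2 pixel block, cutting
# shader invocations to a quarter for that tile.

def shading_cost(tiles, tile_pixels=64):
    """Total shader invocations across tiles, each tagged (rate_x, rate_y)."""
    cost = 0
    for rate_x, rate_y in tiles:
        cost += tile_pixels // (rate_x * rate_y)
    return cost

full = shading_cost([(1, 1)] * 100)                  # every tile at full rate
mixed = shading_cost([(1, 1)] * 50 + [(2, 2)] * 50)  # half at quarter rate
print(full, mixed)  # 6400 4000: noticeably fewer invocations with VRS
```

Patching VRS out, as happened with Dead Space, corresponds to forcing every tile back to full rate, which is exactly where the performance penalty comes from.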
 
well, look at Dead Space; that showed what would happen if a game needed VRS to keep the performance stable.

if a game were to use VRS as a technique and it weren't possible to just patch it out without a performance penalty, as it was in Dead Space's case, you now know how that would look.

so it wasn't unreasonable for him to say that VRS Tier 2 is an advantage for the Series X.

furthermore, we don't have a single implementation of SFS yet, nor mesh shaders.

we have to see how these will impact games as soon as they actually start getting used.
and that's where Microsoft honestly failed more than anything. their tech team should have been working with developers from the start to make sure their hardware gets used optimally.
they can't just sit back and hope that developers optimise their games for the platform that sells fewer units.

I guess we'll see. I'm not hoping for rainbows and unicorns at this point, though. We're into year 3 of these consoles. We pretty much know where each system stands. At this point, just play where you prefer; the performance difference isn't really meaningful and I doubt that changes.

For there to be a much bigger difference in performance, these consoles would need to use different architectures, or at least feature 50% or greater performance stats across all metrics and not just a few.
 

01011001

Banned
I guess we'll see. I'm not hoping for rainbows and unicorns at this point, though. We're into year 3 of these consoles. We pretty much know where each system stands. At this point, just play where you prefer; the performance difference isn't really meaningful and I doubt that changes.

For there to be a much bigger difference in performance, these consoles would need to use different architectures, or at least feature 50% or greater performance stats across all metrics and not just a few.

there was never going to be a big difference even if both systems were used 100% optimally for every game, that's absolutely true
 

Lysandros

Member
Not really.

He said he thought the XSX perf advantage would materialize based on some niche hardware features that only the XSX has. It hasn't. Most games run better on PS5. So clearly his interpretation of how much these features "matter" isn't really true at all.
Not to mention that his posts directly contradict the explanation given by Cerny on how PS5's variable clocks work.
 

01011001

Banned
No.

4fRD8Pk.jpg


"I think the raw flop and bandwidth will matter most at first..."

Boy, was he ever wrong! Now he has adopted the Microsoft strategy: "the better performance is coming, just you wait"


zE4bAus.jpg



Ibz6lVZ.jpg


"the PS5 will not be 2080 level"

Results from his own analysis:

k7vnNvE.png


V5BJwid.png

nothing said was unreasonable given the details available at the time he said it.

also, in the vast majority of games the PS5 does in fact not reach 2080 performance. there are outliers where it happens, but they are rare.
using a game that was previously console exclusive, and whose engine was developed with Sony's own API in mind, is also a very handpicked example to use.
 

onQ123

Member
Not to mention that his posts directly contradict the explanation given by Cerny on how PS5's variable clocks work.
Speaking of variable clocks, the PS5 is capped at a certain limit. I wonder if they will lift that limit a little for the V2 PS5, just to get a little more out of the system.
 

ChiefDada

Gold Member
nothing said was unreasonable given the details available at the time he said it.

also, in the vast majority of games the PS5 does in fact not reach 2080 performance. there are outliers where it happens, but they are rare.
using a game that was previously console exclusive, and whose engine was developed with Sony's own API in mind, is also a very handpicked example to use.

Therein lies the issue, and my point. Everything you're saying here was considered by others on that very forum during that time period. And they don't even have the large platform and the expected responsibility to be objective that he does/should.

Quotes from regular members of the other forum, pre-PS5/XS release:

tQLkEJs.png


2WTBey7.png




for that single engine...

Ah yes, UE5, "that single engine" that is the underlying tech behind much of the AAA multiplatform games being made. "That single engine" that so many developers, large and small, are abandoning their proprietary tech for.

Even still, Nanite-like tech is the future, and if you're rendering via the "Full RDNA2" mesh shader API, by definition your rendering pipeline is less efficient than Nanite and less efficient than the proprietary engines to be used by PS Studios (based on the Cerny/Matt Syke discussions on the PS5 GE).
 

01011001

Banned
Therein lies the issue, and my point. Everything you're saying here was considered by others on that very forum during that time period. And they don't even have the large platform and the expected responsibility to be objective that he does/should.

Quotes from regular members of the other forum, pre-PS5/XS release:

tQLkEJs.png


2WTBey7.png

he is not responsible for the fact that the average people on these forums are idiots.



Yep, literally 100% factual. 🙂

Alex:

gPrViXl.jpeg


Mark Cerny:

MPgiaH4.jpeg



Because "factual" means "wrong", right? Please excuse my ignorance, I'm still learning English.

Cerny can say what he wants; it is a fact that the PS5 can hit a power limit, upon which it will downclock either the CPU or the GPU.
if developers want to optimize their games as well as possible, they will indeed need to take this into account and adjust the game accordingly. even if they don't handpick how fast any given component runs, they will be able to notice clock changes when running the code on dev kits
 
I love the fact that we can get hundreds of posts arguing over the technical superiority of a platform with unfinalized code. Why would one plant a flag in the ground when there are going to be multiple patches coming within the next couple of weeks?

I can't speak for everyone, but I don't like to eat crow. Making definitive statements at this point is just opening yourself up to that.

One reason is that the chances the final game will be any different from the demo are slim to none, especially at launch. Sure, we could get lucky and they'll fix all the major issues, but their track record (and most devs' track records) is not good in this respect.
 

Arioco

Member
Cerny can say what he wants; it is a fact that the PS5 can hit a power limit, upon which it will downclock either the CPU or the GPU.
if developers want to optimize their games as well as possible, they will indeed need to take this into account and adjust the game accordingly.

So the PS5 needs that extra optimization, time and effort, but in the last few months we've seen Monster Hunter Rise, Hogwarts Legacy, Callisto Protocol, Atomic Heart... running better on PS5 because devs are lazy and don't optimize... for the Series X, which doesn't require that optimization at all to begin with. Ok, now I get it. 🙂👍
 

01011001

Banned
So the PS5 needs that extra optimization, time and effort, but in the last few months we've seen Monster Hunter Rise, Hogwarts Legacy, Callisto Protocol, Atomic Heart... running better on PS5 because devs are lazy and don't optimize... for the Series X, which doesn't require that optimization at all to begin with. Ok, now I get it. 🙂👍

read what Cerny himself said in the post... "developers may learn to optimise in a different way"
it's just different. nothing about effort or difficulty was mentioned by anyone.

and it is pretty clear from this very game, RE4, that the Xbox versions do not get the amount of care the PS5 version does.
how else do you explain the ridiculously dogshit controls on Xbox that make the game borderline unplayable at times?
the fact that this isn't even a technical issue, but simply an issue of NO ONE on the team actually giving a flying fuck about how this version plays, or making sure the settings are the same as the PS5 version, tells us A LOT.
this is literally a thing that could be changed in less than 10 seconds... hell, you could change this without even using a keyboard; you could use a mouse and Microsoft's on-screen keyboard to fix this in less than 20 seconds.

but it's not fixed, is it? and we know the day 1 version also hasn't fixed it... because no one cares... no one at Capcom gives a shit.


that is the truth about multiplatform development: the lower the sales, the less attention you get.
last gen, Microsoft was lucky that their One X was so much more powerful that it was simply brute-forcing games to run better than on the Pro. I would bet that if the PS4 Pro and the One X had released on the same day with exactly the same hardware and the exact same clock speeds, the Pro versions would still run better... the simple fact that the APIs are different and that they would need to actually adjust stuff for the One X means it would run worse.

and then we have Ace Combat 7 of course, which the devs somehow managed to port over so sloppily that it actually ran worse on the One X than on the Pro. same res, same settings, way lower, unstable framerate.
and if the One X weren't so undeniably more powerful than the Pro, you can bet your ass some fanboys would have taken that example to show that the Pro is superior.

simply moving a game over to a different API has many implications for performance.
you can run the exact same game on PC in DX12 and then again in Vulkan, and even though both in theory offer the same access to the hardware, one will almost always perform worse than the other, and that is usually down to how much time the devs spend working on one or the other API.
so if you have 2 systems that are extremely close in terms of hardware... simply the API being different can lead to one system performing worse.
and that is what I meant above: if the One X and Pro had been identical in hardware, the API alone would have meant the One X version ran worse in many games, and you can bet on that.

and btw, part of this can be Microsoft's fault. maybe the DX12 version they run on Xbox is just terrible to work with 🤷‍♂️ this could also be the case... but it's not the issue in the example of how RE4 plays worse on Xbox, of course, since that's literally just a config file discrepancy that no one cared to correct.
 

Hoddi

Member
Cerny can say what he wants; it is a fact that the PS5 can hit a power limit, upon which it will downclock either the CPU or the GPU.
if developers want to optimize their games as well as possible, they will indeed need to take this into account and adjust the game accordingly. even if they don't handpick how fast any given component runs, they will be able to notice clock changes when running the code on dev kits
I agree. I don't want to get into this, but I don't think some of you guys fully understand how power limits work. The CPU/GPU can obviously run at 3.5/2.3 GHz if they're doing very little work, because then they won't be consuming 250W in the first place.

Does that mean both can run at those frequencies when they're fully loaded? No, it doesn't, and there wouldn't be any need for variable clocks if that were the case.
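As a rough illustration of that point, here is a made-up model (not Sony's actual power-management algorithm; the budget and coefficients are invented for the sketch): the clock only has to drop when the workload's estimated power at full frequency would exceed a fixed budget.

```python
POWER_BUDGET_W = 200.0  # assumed SoC budget for this sketch; the real figure is not public

def estimated_power(activity, freq_ghz, base_w=10.0, k=25.0):
    """Toy model: power rises with workload activity and roughly the cube of frequency."""
    return base_w + k * activity * freq_ghz ** 3

def capped_frequency(activity, f_max=2.23):
    """Highest frequency (GHz) whose estimated power fits under the budget."""
    f = f_max
    while f > 0.5 and estimated_power(activity, f) > POWER_BUDGET_W:
        f -= 0.01  # step the clock down until the model fits the budget
    return round(f, 2)

print(capped_frequency(0.3))  # light load: stays at the 2.23 GHz cap
print(capped_frequency(1.0))  # fully loaded: has to downclock
```

The point matches Hoddi's: a lightly loaded chip sits at max clock for free, while a fully loaded one forces the arbitration to give something up.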
 

ChiefDada

Gold Member
he is not responsible for the fact that the average people on these forums are idiots.

So Alex, who made a blanket statement that the PS5 could not perform at 2080 level and ended up being wrong, is the smart one. And the people who correctly theorized that the PS5 could perform similarly to a 2080 in certain scenarios are the idiots.

Yeah, that totally makes sense. None of what you're saying is backwards at all.
 

01011001

Banned
So Alex, who made a blanket statement that the PS5 could not perform at 2080 level and ended up being wrong, is the smart one. And the people who correctly theorized that the PS5 could perform similarly to a 2080 in certain scenarios are the idiots.

Yeah, that totally makes sense. None of what you're saying is backwards at all.

are you one of them too or something?
how does 1 game performing better than on a 2080 discredit what he said?

guess what... the Xbox 360 runs Dark Souls 1 better than a PC with an RTX 4090. so if Alex said that the Xbox 360 cannot perform on the level of an RTX 4090, would you bring Dark Souls up as an example of how he is wrong?

if he said that the PS4 Pro, which is LITERALLY WORSE IN EVERY WAY than an Xbox One X, cannot perform on the level of an Xbox One X... would you then bring up Ace Combat 7 as an example of why he is the dumb one?

outliers always exist; developers making suboptimal ports exist.
if you take a game that runs on a SONY engine, developed to create SONY exclusive games on SONY hardware, and then port that to PC, you can't be surprised to find that SONY hardware, using SONY's API, for which this SONY engine was developed, runs it better than equivalent or slightly more powerful PC hardware running a different API.
 

SlimySnake

Flashless at the Golden Globes
are you one of them too or something?
how does 1 game performing better than on a 2080 discredit what he said?

guess what... the Xbox 360 runs Dark Souls 1 better than a PC with an RTX 4090. so if Alex said that the Xbox 360 cannot perform on the level of an RTX 4090, would you bring Dark Souls up as an example of how he is wrong?

if he said that the PS4 Pro, which is LITERALLY WORSE IN EVERY WAY than an Xbox One X, cannot perform on the level of an Xbox One X... would you then bring up Ace Combat 7 as an example of why he is the dumb one?

outliers always exist; developers making suboptimal ports exist.
if you take a game that runs on a SONY engine, developed to create SONY exclusive games on SONY hardware, and then port that to PC, you can't be surprised to find that SONY hardware, using SONY's API, for which this SONY engine was developed, runs it better than equivalent or slightly more powerful PC hardware running a different API.
there are plenty of games that run better than or equivalent to a 2080 on PS5, according to Alex's own benchmarks.

He was talking out of his ass for over a year prior to the PS5 launch, and within a few weeks of launch he had been proven wrong.
 

01011001

Banned
there are plenty of games that run better than or equivalent to a 2080 on PS5, according to Alex's own benchmarks.

He was talking out of his ass for over a year prior to the PS5 launch, and within a few weeks of launch he had been proven wrong.

which games?
 

DenchDeckard

Moderated wildly
there are plenty of games that run better than or equivalent to a 2080 on PS5, according to Alex's own benchmarks.

He was talking out of his ass for over a year prior to the PS5 launch, and within a few weeks of launch he had been proven wrong.
Can you list the games? I'm sure the only one we have seen was Death Stranding, which was a PlayStation game running on a PlayStation engine designed for their API and ported to PC.
 

Hoddi

Member
there are plenty of games that run better than or equivalent to a 2080 on PS5, according to Alex's own benchmarks.

He was talking out of his ass for over a year prior to the PS5 launch, and within a few weeks of launch he had been proven wrong.
Why are people even comparing these consoles to Nvidia GPUs? They have AMD GPUs, and there are a billion differences between them.

UE4 is notorious for performing worse on AMD GPUs, and it's been that way for literally a decade. It doesn't say anything about these console GPUs.
 

ChiefDada

Gold Member
if he said that the PS4 Pro, which is LITERALLY WORSE IN EVERY WAY than an Xbox One X, cannot perform on the level of an Xbox One X... would you then bring up Ace Combat 7 as an example of why he is the dumb one?

I never resorted to such petty insults towards anyone. That was you, remember?

he is not responsible for the fact that the average people on these forums are idiots.


outliers always exist; developers making suboptimal ports exist.
if you take a game that runs on a SONY engine, developed to create SONY exclusive games on SONY hardware, and then port that to PC, you can't be surprised to find that SONY hardware, using SONY's API, for which this SONY engine was developed, runs it better than equivalent or slightly more powerful PC hardware running a different API.

Interesting. But doesn't that also work the other way around, or no? For games designed around DirectX, what handicap should be applied to the PS5?

Hmm, maybe we should use a more platform-agnostic benchmark for a more objective analysis:

 

01011001

Banned
I never resorted to such petty insults towards anyone. That was you, remember?






Interesting. But doesn't that also work the other way around, or no? For games designed around DirectX, what handicap should be applied to the PS5?

Hmm, maybe we should use a more platform-agnostic benchmark for a more objective analysis:



benchmarks never correlate with real-world performance.
the PS2 in theory could display a shitload of polygons... the problem, though, is that it would never be able to shade them all.
so a polygon-crunching benchmark would give unrealistic results if you applied it to the PS2.

even more down-to-earth benchmarks are unreliable for judging game performance.
 

ChiefDada

Gold Member
Why are people even comparing these consoles to nvidia GPUs? They have AMD GPUs and there's a billion differences between them.

Precisely! You are smartly identifying the limitations of comparing two GPUs with different architectures. Alex fails to do this on many occasions and often speaks in absolutes.

I agree. I don't want to get into this, but I don't think some of you guys fully understand how power limits work. The CPU/GPU can obviously run at 3.5/2.3 GHz if they're doing very little work, because then they won't be consuming 250W in the first place.

Does that mean both can run at those frequencies when they're fully loaded? No, it doesn't, and there wouldn't be any need for variable clocks if that were the case.

Incorrect. Cerny explained this with the "race to idle" concept. As he said, it would be pointless to brag about the SmartShift variable clock setup if they were talking about situations where the CPU/GPU wasn't fully utilized in a given frame.

From Eurogamer Interview with Sony:
https://www.eurogamer.net/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
"We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU; based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

So if the CPU and GPU rarely drop frequency in situations where they are both fully utilized, in what situations would it make sense to downclock? IN SITUATIONS WHERE YOU ARE PERFORMING ACTIVITY THAT REQUIRES HIGH POWER CONSUMPTION BUT DOESN'T NECESSARILY REQUIRE MAX CLOCKS FOR PERFORMANCE REASONS.

1. Simple geometry / map screens (Do you care if the GPU downclocks when presenting maps? Does this make the PS5 weak or smart?)




2. AVX 256 Instructions



Think about the hypothetical implications here: you could theoretically see a situation where developers are more limited in AVX workloads on other platforms, because they can't easily program for hardware that isn't able to smartly downclock the CPU or GPU in a given frame where said CPU/GPU doesn't require the frequency (and thus doesn't require the power) but the AVX workload does.
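Cerny's "race to idle" caveat quoted above boils down to a filtering rule: clock samples only say something about the power limit in frames where the GPU is busy for (nearly) the whole frame time. A minimal sketch with made-up telemetry:

```python
# Made-up per-frame telemetry: GPU busy time, frame time, and sampled clock.
frames = [
    {"busy_ms": 8.0,  "frame_ms": 33.3, "clock_ghz": 2.23},  # mostly idle frame
    {"busy_ms": 33.3, "frame_ms": 33.3, "clock_ghz": 2.20},  # fully utilised frame
]

def meaningful_clocks(frames, threshold=0.99):
    """Keep clock samples only from frames with ~100% GPU utilisation."""
    return [f["clock_ghz"] for f in frames
            if f["busy_ms"] >= threshold * f["frame_ms"]]

print(meaningful_clocks(frames))  # only the fully loaded frame's clock counts
```

In the mostly idle frame the clock is pegged at maximum but tells you nothing; only the fully utilised frame's sample is evidence of where the clocks actually settle under load.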
 

Rykan

Member
None will satisfy you no matter what and again that's fine by me. Just a few from earlier years for the last time:



(In context of PS5's variable frequencies which is a pure misunformation)
Instead of clarifying why the earlier examples you posted are clear signs of "blatant bias," you have now posted a list of new posts from March 2020, months before the PS5 console even came out, in which he made mostly reasonable predictions based on the data available at the time.

If you're being completely honest with yourself, you must surely acknowledge that your accusations of bias were unfounded and hyperbolic.

Here's the thing: Alex is a professional working for Digital Foundry, a group that analyzes and compares video game technology. This includes comparing the performance of games on different hardware to one another.

I don't think you quite understand how severe an accusation of bias really is when you level it against DF or one of its members. Their entire business depends entirely on objectivity and unbiased analysis. When you claim that he's biased, you MUST be able to at least prove some ulterior motive as to why he's somehow biased against the PS5.

Why would someone with a history of PC gaming be "biased" against the PlayStation 5 when comparing it to the Series X, and potentially put both his employment and his colleagues in jeopardy? It just doesn't make any sense to suggest that.

And this is another thing that you don't seem to fully grasp: Being wrong about something or being wrong about a prediction is not at all a sign of "bias." It's just that.
 

SlimySnake

Flashless at the Golden Globes
Can you list the games? I'm sure the only one we have seen was Death Stranding, which was a PlayStation game running on a PlayStation engine designed for their API and ported to PC.
I don't remember all of them, but AC Valhalla was the big one right at the start of the gen. Call of Duty, DMC and Borderlands were all performing equivalent to a 2080 (on both the XSX and PS5), but he latched on to the Watch Dogs RT results and kept saying these consoles are a 2060 Super at best. Then Doom comes out AFTER the MS acquisition and the 2060 Super was way behind both the PS5 and XSX, even in RT mode. The Matrix demo is definitely performing like a 2080 or 2080 Super.

The PlayStation engine games like Uncharted 4 and Spider-Man run way better than 2080 Supers, hitting 3070 or 2080 Ti levels of performance. The most recent TLOU Part 1 remake has Sony suggesting a 10.7 tflops 6600 XT for 1080p 60 fps while the 10.2 tflops PS5 does 1440p 60 fps, a massive 75% boost in performance.

His methodology is flawed anyway. He consistently used these expensive $600 12-core, 24-thread CPUs running at 5.2 GHz in his PCs while comparing them to consoles because, according to him, these games aren't CPU bound... utter fucking nonsense when comparing GPU performance. He should've been using equivalent Zen 2 CPUs while downclocking them to PS5 and XSX clocks. I give NX Gamer a lot of shit for his equally flawed comparisons, but at least he's using CPUs like the Ryzen 3600 and Ryzen 2700X, which are far closer to the CPU in the PS5 than the freaking 12900K Alex used for his comparisons.
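For reference, the teraflop figures thrown around in these comparisons all come from one simple formula: peak FP32 throughput = shader ALUs × 2 operations per clock (a fused multiply-add counts as two) × clock speed. It is a paper ceiling, not a measure of real-game performance, which is the whole point of locking clocks for a like-for-like test. A quick sketch using the publicly quoted shader counts and clocks:

```python
def tflops(shader_alus, clock_ghz):
    """Peak FP32 teraflops: ALUs x 2 ops per clock (fused multiply-add) x GHz / 1000."""
    return shader_alus * 2 * clock_ghz / 1000

print(round(tflops(2304, 2.23), 2))   # PS5: 36 CUs x 64 ALUs at 2.23 GHz -> 10.28
print(round(tflops(3328, 1.825), 2))  # Series X: 52 CUs x 64 ALUs at 1.825 GHz -> 12.15
```

Since the formula is linear in clock speed, locking a desktop card's clock with a tool like MSI Afterburner pins its paper tflops to a known value, which is what makes the proposed console-versus-AMD-card comparisons meaningful.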
 

SlimySnake

Flashless at the Golden Globes
Why are people even comparing these consoles to Nvidia GPUs? They have AMD GPUs, and there are a billion differences between them.

UE4 is notorious for performing worse on AMD GPUs, and it's been that way for literally a decade. It doesn't say anything about these console GPUs.
Ask Alex. I have been begging NX Gamer and DF to go and buy a 6600 XT or 6700 for PS5 comparisons and a 6700 XT for XSX comparisons. They are perfect for comparisons, especially since you can lock the clocks on PC using MSI Afterburner to get a perfect tflops comparison.

What's interesting is that UE5 is performing more or less equivalently on AMD cards. So it's possible that Epic's work on the PS5 demo helped not just the PS5 but all AMD cards. The early Nvidia advantage in RT is also no longer as prevalent, with games like Doom, Far Cry, Spider-Man and the Matrix demo more or less performing the same on AMD cards. It will come down to developer implementation.
 

Thief1987

Member
Instead of clarifying why the earlier examples you posted are clear signs of "blatant bias," you have now posted a list of new posts from March 2020, months before the PS5 console even came out, in which he made mostly reasonable predictions based on the data available at the time.

If you're being completely honest with yourself, you must surely acknowledge that your accusations of bias were unfounded and hyperbolic.

Here's the thing: Alex is a professional working for Digital Foundry, a group that analyzes and compares video game technology. This includes comparing the performance of games on different hardware to one another.

I don't think you quite understand how severe an accusation of bias really is when you level it against DF or one of its members. Their entire business depends entirely on objectivity and unbiased analysis. When you claim that he's biased, you MUST be able to at least prove some ulterior motive as to why he's somehow biased against the PS5.

Why would someone with a history of PC gaming be "biased" against the PlayStation 5 when comparing it to the Series X, and potentially put both his employment and his colleagues in jeopardy? It just doesn't make any sense to suggest that.

And this is another thing that you don't seem to fully grasp: Being wrong about something or being wrong about a prediction is not at all a sign of "bias." It's just that.
Just imagine living in a fairytale where gaming press is professional and doesn't have any biases.
 