
Rendering Engineer at Epic Games: DirectX Raytracing and Vulkan/OptiX hold everything back in PC land

VFXVeteran

Banned
Right now, for next-gen consoles, Spider-Man: Miles Morales has set the benchmark for the best RT reflections among launch games. I don't understand why you'd call it garbage when a rendering engineer at Epic Games considers it a technical achievement.

I was a bit harsh in my wording there... but it's definitely not the best RT reflections across ALL platforms, as many Sony gamers want to think. WD:L does indeed have better reflections, because they are rendered at a higher resolution (comparing against the PC Ultra RT mode) and they are more accurate (i.e. they don't cut off nearly everything).

I don't know why he called them a technical achievement, since that could be interpreted in so many ways. Perhaps he never saw WD:L's reflections, or he's comparing against the fact that the game has a crapload of buildings with reflections in them.
 
Last edited:

VFXVeteran

Banned
It totally is though:

FhQ9R5y.jpg


From a distance it's still noticeable:

cZhVqrG.jpg


You could argue it's "squarey", but that's just how the pole looks (as I said, not a graphical benchmark of a game):

VGytClR.jpg


And before you point it out: the people here are sprites, not 3D objects, which might be the reason they don't cast any shadows (again, not exactly a graphical benchmark of a game).

OK... I couldn't see that pole from the angle you had. Thanks for the clarification.
 

VFXVeteran

Banned
It's Dictator, and obviously these are just his guesses. I'm asking for tech data behind your earlier assumption; without that, I imagine it's just your personal presumption (not the first and not the last).

I did read it somewhere... but even if I can't verify what they said, I can bet a dollar to a dime that Dictator's opinion and mine are correct, since we know how the reflections in Spiderman MM could be replicated. Since they don't mention any "code to the metal" to achieve that performance (how they look doesn't matter, since they don't mimic real reflections in the first place), you would be assuming something more illogical than our claims.
 
Last edited:

VFXVeteran

Banned
Sony releasing games on PC means they port the games to PC, or more correctly port the engine. It doesn't mean they aren't originally targeting the console with their engine, especially if the game isn't releasing simultaneously.

The reason the game isn't releasing simultaneously has to do with business politics, not the porting. They would be working on the PC version of a particular game concurrently with the game itself, just like the AAA third-party developers are doing. Right now, a Death Stranding 2 or HFW would be getting concurrent PC development (even if it releases 6 months later) because the graphics engine has already been ported to PC. ND is doing the same (this statement is NOT a guess).

The original game may target low-level functions on the console and then run on PC without the same optimization. Another example: engines like Unreal Engine can have special optimizations for consoles that aren't used on PC when porting. It's invisible to the average developer, but there are changes.

That is true. But those optimizations would be needed in order to implement a certain feature that can already be reached on a PC - for example, RT reflections doing much more than just tracing proxy geometry, or showing reflections within reflections like in WD:L. They would NOT be used to somehow make the console version a BETTER version than the PC's.
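To make the "proxy geometry" part concrete, here's a minimal sketch in plain C++ (my own toy code with made-up triangle counts, not Insomniac's or Ubisoft's engine): the reflection BVH is built from simplified proxies, or drops some objects entirely, while the main render still uses the full meshes.

#include <cstdio>
#include <vector>

// Hypothetical mesh record: a full-detail render mesh plus a cheap "proxy"
// version (far fewer triangles) used only inside the reflection BVH.
struct Mesh {
    const char* name;
    int fullTriangles;   // triangle count of the on-screen render mesh
    int proxyTriangles;  // triangle count of the simplified proxy (0 = culled)
};

// Count what goes into the ray-tracing acceleration structure. Tracing
// against proxies keeps BVH build and traversal cheap; the cost is that
// reflected objects look simpler than their on-screen counterparts.
int buildReflectionScene(const std::vector<Mesh>& meshes, bool useProxies) {
    int totalTris = 0;
    for (const Mesh& m : meshes) {
        int tris = useProxies ? m.proxyTriangles : m.fullTriangles;
        totalTris += tris;
        std::printf("  %-8s -> %6d triangles in BVH\n", m.name, tris);
    }
    return totalTris;
}

int main() {
    std::vector<Mesh> city = {
        {"building", 250000, 8000},
        {"car",       80000, 2500},
        {"tree",      60000,    0},  // culled from reflections entirely
    };
    std::printf("Proxy reflection scene:\n");
    int proxy = buildReflectionScene(city, true);
    std::printf("Full reflection scene:\n");
    int full = buildReflectionScene(city, false);
    std::printf("BVH triangle budget: %d vs %d\n", proxy, full);
}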

Bottom line: when you guys talk about a console being more optimized for a game, it's bogus, because the console will never outperform or look better than its PC counterpart... so you really should stop trying to make it seem like optimization gives the consoles some advantage over the PC, because it really doesn't and never will.
 
Last edited:
Ray tracing is the worst feature developed in recent gaming. It compromises lots of graphical features in exchange for realistic puddles and reflections that most people wouldn't even care about if it weren't for this next-gen "must have" bullshit.
 
Last edited:

theclaw135

Banned
The reason the game isn't releasing simultaneously has to do with business politics, not the porting. They would be working on the PC version of a particular game concurrently with the game itself, just like the AAA third-party developers are doing. Right now, a Death Stranding 2 or HFW would be getting concurrent PC development (even if it releases 6 months later) because the graphics engine has already been ported to PC. ND is doing the same (this statement is NOT a guess).



That is true. But those optimizations would be needed in order to implement a certain feature that can already be reached on a PC - for example, RT reflections doing much more than just tracing proxy geometry, or showing reflections within reflections like in WD:L. They would NOT be used to somehow make the console version a BETTER version than the PC's.

Bottom line: when you guys talk about a console being more optimized for a game, it's bogus, because the console will never outperform or look better than its PC counterpart... so you really should stop trying to make it seem like optimization gives the consoles some advantage over the PC, because it really doesn't and never will.

I'd argue achieving the same results on lower-cost hardware, with considerably less effort, is an advantage for consoles. PC has multiple GPU vendors, each of which offers product lines spanning from low-end to high-end. A console game's developer can program down to the metal for a single chip, if inclined to do so.
 

Allandor

Member
Another developer who doesn't understand why standards are needed. Standards are never optimal, but they lead to things that can be supported in the future and on multiple platforms. Yes, he can use e.g. the Nvidia stuff more freely, but it isn't open and then only runs on Nvidia cards, until Nvidia drops support for the old interfaces.
 
I'd argue achieving the same results on lower-cost hardware, with considerably less effort, is an advantage for consoles. PC has multiple GPU vendors, each of which offers product lines spanning from low-end to high-end. A console game's developer can program down to the metal for a single chip, if inclined to do so.
You can only optimize so much with a given amount of shaders, ROPs, CUs, etc. PC has outclassed the new consoles by several years already. It's not a bad thing, but it's the reality of it. You won't get the same performance with a $500 budget for hardware on PC, so it is cost effective, but it only goes so far.
 
Watch Dogs: Legion ULTRA REFLECTIONS
535YiJM.jpg



Spiderman MM reflections



0BkBi1.jpg


VFX and Don


cNKeaaS.gif


Come on dude. Please stop this nonsense, man.

Are you really willing to show cutouts of window reflections in Spiderman MM compared to WD:Legion, which has already been proven to have reflections within reflections (i.e. the proper use of recursive ray tracing)? Are you willing to show culled-out leaves on trees compared to all the leaves available in WD:L? Are you willing to compare reflections at half of 4K res (i.e. 2K) against 1080p res between the two games (i.e. higher-resolution ray-traced rendering yields better results)?

Pedestrians not disappearing is more relevant than a reflection within a reflection a few feet away that, a few feet later, changes to screen-space reflections - or leaves, though I think later versions of Spiderman do have leaves.



edit: Thought I was editing a new post, but it seems I edited the original post. Regardless, the point still stands.
 
Last edited:
Bottom line: when you guys talk about a console being more optimized for a game, it's bogus, because the console will never outperform or look better than its PC counterpart... so you really should stop trying to make it seem like optimization gives the consoles some advantage over the PC, because it really doesn't and never will.

Is this some kind of joke?

I think it's perfectly clear what "optimized" means, and I think there's no need to explain that "PC" does not imply a particular hardware specification the way a console name does. If you want to talk about bogus comparisons: I have an old 486; it is a PC, so by what you say it should be better for RT than any console 😉
 

VFXVeteran

Banned
RT on Watch Dogs is nowhere close to its use in Miles Morales... they are not even comparable.

Even on PC, Watch Dogs is not a showcase for RT.

Come on dude. Please stop this nonsense, man.

Are you really willing to show cutouts of window reflections in Spiderman MM compared to WD:Legion, which has already been proven to have reflections within reflections (i.e. the proper use of recursive ray tracing)? Are you willing to show culled-out leaves on trees compared to all the leaves available in WD:L? Are you willing to compare reflections at half of 4K res (i.e. 2K) against 1080p res between the two games (i.e. higher-resolution ray-traced rendering yields better results)?
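Since "recursive ray tracing" and reflection resolution keep coming up, here's a toy C++ sketch of the two knobs (my own illustration with made-up values, not code from either game): bounce depth is what gives you reflections within reflections, and tracing the reflection buffer at half resolution per axis cuts the ray count to a quarter.

#include <cstdio>

struct Color { float r, g, b; };

// Trace one reflection ray. With maxBounces >= 2, a mirror seen inside a
// mirror still resolves; at the depth limit we return a flat fallback color
// (a real engine would fall back to cubemaps or SSR instead).
Color traceReflection(int depth, int maxBounces) {
    if (depth >= maxBounces)
        return {0.2f, 0.2f, 0.2f};          // no further bounce allowed
    // ...intersect the scene here; pretend we hit another mirror...
    Color nested = traceReflection(depth + 1, maxBounces);
    return {nested.r * 0.9f, nested.g * 0.9f, nested.b * 0.9f};
}

int main() {
    // Resolution knob: a half-res-per-axis reflection buffer needs 1/4 the
    // rays of a full-res one - the 4K-vs-1080p part of the argument above.
    const int fullW = 3840, fullH = 2160;
    for (float scale : {1.0f, 0.5f}) {
        long rays = (long)(fullW * scale) * (long)(fullH * scale);
        std::printf("scale %.1f -> %ld reflection rays per frame\n", scale, rays);
    }
    Color one = traceReflection(0, 1);   // nested mirror falls back
    Color two = traceReflection(0, 2);   // nested mirror is traced
    std::printf("1 bounce: %.3f, 2 bounces: %.3f\n", one.r, two.r);
}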
 

VFXVeteran

Banned
Is this some kind of joke?

I think it's perfectly clear what "optimized" means, and I think there's no need to explain that "PC" does not imply a particular hardware specification the way a console name does. If you want to talk about bogus comparisons: I have an old 486; it is a PC, so by what you say it should be better for RT than any console 😉

It's not a joke. Look at the Sony guys and learn why I say what I say. It's quite clear they have an agenda: thinking their precious box, thanks to the devs' optimization code, is more powerful than a high-end PC that completely dwarfs the console's power.
 

VFXVeteran

Banned
I'd argue achieving the same results on lower-cost hardware, with considerably less effort, is an advantage for consoles. PC has multiple GPU vendors, each of which offers product lines spanning from low-end to high-end. A console game's developer can program down to the metal for a single chip, if inclined to do so.

It's never less effort coding for specific hardware. I've never seen that in practice. Programming against a standard API takes much less effort. That's what the API does: it frees you from having to figure that out.

Also, you don't get the same results. That's the big issue with the console warriors and why their theories break down into wishful thinking. WD:L reflections had to be changed via a .ini file to a setting lower than the "Low" in the UI in order for the PC version to match the look of the consoles.
 
Last edited:

VFXVeteran

Banned
So pedestrians don't disappear a few meters away from you like they do in WD:L

This

and this

and this
Yup. Ultra for you.



This is next gen. I saw the WD:L reveal and it was totally unimpressive.

Wouldn't surprise me if the assets are higher quality than in some of the earlier Spiderman movies, which look far worse. Hollywood wanted these kinds of assets just a few years ago.

This

Hollywood wants that back; it shouldn't be possible anywhere. You could easily mistake it for a screenshot from a new CG film.

Yup it's that good.

Omega Supreme Holopsicon - if you are going to argue with me about screenshots, then eliminate that cinematic-gameplay and photo-mode shit. I will ignore any comments made with those screenshots, as I'm tired of these bullshots when the game doesn't look like that in gameplay. I've argued this for 7 yrs with the PS4 crew and I'm not going into it again for another 7 yrs.

Bring actual gameplay screenshots: no photo mode that cleans shit up, and no cinematics where rendering shaders at a much higher quality is on the table. Show the game as it should be shown - while PLAYING THE GAME.

P.S. Wait till I get my hands on a PS5. I will take the screenshots that should be taken and even run streamed video to show all the discrepancies between running a console game and a PC at Ultra settings.
 
Last edited:
Omega Supreme Holopsicon - if you are going to argue with me about screenshots, then eliminate that cinematic-gameplay and photo-mode shit. I will ignore any comments made with those screenshots, as I'm tired of these bullshots when the game doesn't look like that in gameplay. I've argued this for 7 yrs with the PS4 crew and I'm not going into it again for another 7 yrs.

Bring actual gameplay screenshots: no photo mode that cleans shit up, and no cinematics where rendering shaders at a much higher quality is on the table. Show the game as it should be shown - while PLAYING THE GAME.

P.S. Wait till I get my hands on a PS5. I will take the screenshots that should be taken and even run streamed video to show all the discrepancies between running a console game and a PC at Ultra settings.
gameplay




Watch Dogs Legion gameplay
Guys, wanna laugh? See the pedestrians disappearing into oblivion on that 3090 behemoth! (timestamped)




That 36 TF must be creating some kind of black hole inside the game.

This
 
Last edited:

Lethal01

Member
You won't get the same performance with a $500 budget for hardware on PC, so it is cost effective, but it only goes so far.

I feel like that's all I'm really hearing people say: that they are getting more out of the hardware than they would on PC. I'm not hearing claims that the PS5 will regularly beat the RTX 3090, but it seems like that's what people hear every time someone mentions consoles having optimizations that get more out of the hardware.

At launch, consoles give you stronger hardware for cheaper, and the hardware performs slightly better than a PC of equal specs.
 
Last edited:
It's not a joke. Look at the Sony guys and learn why I say what I say. It's quite clear they have an agenda: thinking their precious box, thanks to the devs' optimization code, is more powerful than a high-end PC that completely dwarfs the console's power.

:pie_eyeroll:

Who says that? Quote them.

And if a dev optimizes a game for a system, it doesn't mean the system is more powerful; it only means a certain game is running to the strengths of that system, so it will perform better than on another system with similar specs, such as a PC. A more powerful and costlier system should, in theory, be able to outperform it by brute force if it truly "dwarfs" the other - talking about the same game, of course, and assuming the developer is competent, because you can have the most powerful PC and nothing will protect you from bad software.
 
I feel like that's all I'm really hearing people say: that they are getting more out of the hardware than they would on PC. I'm not hearing claims that the PS5 will regularly beat the RTX 3090, but it seems like that's what people hear every time someone mentions consoles having optimizations that get more out of the hardware.

At launch, consoles give you stronger hardware for cheaper, and the hardware performs slightly better than a PC of equal specs.
Cheaper performance, for sure. But once you hit the threshold of an RTX 2060 in ray tracing, you can only do so much. That's a low-end GPU in today's standings... Ooof
 

Lethal01

Member
Cheaper performance, for sure. But once you hit the threshold of an RTX 2060 in ray tracing, you can only do so much. That's a low-end GPU in today's standings... Ooof

We don't really know what the threshold is. I really don't think judging the console by its launch games is very smart when we have developers saying they have a clear roadmap for how they can optimize their games to get far better results.

It would seem that they disagree that there is "only so much you can do".
 
We don't really know what the threshold is. I really don't think judging the console by its launch games is very smart when we have developers saying they have a clear roadmap for how they can optimize their games to get far better results.

It would seem that they disagree that there is "only so much you can do".
Not so much the launch games as the actual hardware, compared to PC hardware. The PS5 is like a 2060, the XSX like a 2060S in ray tracing. Both will struggle with 4K, checkerboard rendering, high framerates, etc. You can see this currently in the COD and AC games. You can only optimize; you can't suddenly get 2070/3090 graphics out of the same hardware you currently have.
 
Last edited:

VFXVeteran

Banned
gameplay




Watch Dogs Legion gameplay

This


I have a video of WD:L myself that I'm going to upload in 4K on YouTube soon.

From the DF video I just watched: if you guys want to claim MM is best because it has infinite draw distance for objects in reflections, no matter how bad those reflections look, then I'll concede that I can see why you feel MM is the best use of RT reflections. That's a purely subjective opinion, but one I will accept.

But to simply say the reflections are BETTER in quality than WD:L's is just lying, as they are clearly not of better quality.
 

VFXVeteran

Banned
Not so much the launch games as the actual hardware, compared to PC hardware. The PS5 is like a 2060, the XSX like a 2060S in ray tracing. Both will struggle with 4K, checkerboard rendering, high framerates, etc. You can see this currently in the COD and AC games. You can only optimize; you can't suddenly get 2070/3090 graphics out of the same hardware you currently have.

Exactly my point.
 

Bo_Hazem

Banned
Watch Dogs: Legion ULTRA REFLECTIONS
535YiJM.jpg



Spiderman MM reflections



0BkBi1.jpg


VFX and Don


cNKeaaS.gif




Pedestrians not disappearing is more relevant than a reflection within a reflection a few feet away that, a few feet later, changes to screen-space reflections - or leaves, though I think later versions of Spiderman do have leaves.



edit: Thought I was editing a new post, but it seems I edited the original post. Regardless, the point still stands.


It's funny to argue between the two when one needs SSR to compensate for its shitty, extremely low-res RT reflections. WD:L has very few pedestrians, yet it still makes them disappear after a few meters.
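For anyone lost in the SSR-vs-RT back and forth, this is the kind of hybrid fallback being described, as a minimal sketch (my own toy C++ with an invented distance cutoff, not Ubisoft's code): real rays up close, screen-space reflections further out, cubemaps as the last resort.

#include <cstdio>

// Three possible sources for a reflection sample, cheapest last.
enum class ReflSource { RayTraced, ScreenSpace, Cubemap };

ReflSource pickReflectionSource(float hitDistance, bool visibleOnScreen) {
    const float kRtMaxDistance = 30.0f;  // meters; a made-up tuning value
    if (hitDistance < kRtMaxDistance)
        return ReflSource::RayTraced;    // accurate, sees off-screen geometry
    if (visibleOnScreen)
        return ReflSource::ScreenSpace;  // cheap, but only reflects what the camera sees
    return ReflSource::Cubemap;          // static approximation, last resort
}

int main() {
    struct Case { const char* what; float dist; bool onScreen; };
    const Case cases[] = {
        {"puddle at your feet",        2.0f, true},
        {"window across the street",  60.0f, true},
        {"object behind the camera",  60.0f, false},
    };
    const char* names[] = {"ray traced", "SSR", "cubemap"};
    for (const Case& c : cases)
        std::printf("%-27s -> %s\n", c.what,
                    names[(int)pickReflectionSource(c.dist, c.onScreen)]);
}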
 
LOL, you're all taking the bait. All I know is, as a PC gamer: man, I could run every single game from every single generation at ultra-high detail, but unless a new card had juuuuuuuuuuuuuuust released and I upgraded to it immediately, those max settings made the game run far more poorly than console launches that had just released a massive game - every time.

And even afterwards, BF3 as an example: it looked glorious on my PC at high and ultra-high detail, but I couldn't run it and actually expect to play it at the highest settings, even after maxing out my hardware specs. That is the difference between consoles and PCs cross-platform. And don't mention how long it took in actuality - get a copy of Gears of War 1, 2 or 3 on PC; when they released, PC had nothing that compared.

This occurrence happens every generation. Last gen, with Horizon, no game on PC came close graphically, and a couple of other console titles followed that did this too; it will happen this gen as well. Not sure why others continue to insinuate consoles won't eventually have a visual masterpiece that PCs have to catch up with, as it is an exclusive console title, particularly when history shows otherwise.
 
Last edited:

theclaw135

Banned
It's never less effort coding for specific hardware. I've never seen that in practice. Programming against a standard API takes much less effort. That's what the API does: it frees you from having to figure that out.

Also, you don't get the same results. That's the big issue with the console warriors and why their theories break down into wishful thinking. WD:L reflections had to be changed via a .ini file to a setting lower than the "Low" in the UI in order for the PC version to match the look of the consoles.

We shouldn't be praising PC developers who can't be bothered to include the console version's settings.
 

llien

Member
Are you saying that Epic is coding RT for each and every configuration their engine will be available on?
No.

I'm referring to Lumen's structures, which are the same across the board and are, to put it softly, very incompatible with DXR.
UPDATE: better link
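A toy sketch of the mismatch, under my own reading of the public docs (Lumen's software path traces signed distance fields, while DXR accelerates triangle BVHs; none of this is Epic's code): tracing an SDF is sphere tracing, a loop over distance queries, so a triangle acceleration structure has nothing to accelerate here.

#include <cmath>
#include <cstdio>

// A signed distance field: distance from point p to the nearest surface.
// Here the whole "scene" is one sphere of radius 1 at the origin.
float sceneSDF(float px, float py, float pz) {
    return std::sqrt(px * px + py * py + pz * pz) - 1.0f;
}

// Software ray tracing against an SDF = sphere tracing: repeatedly step
// along the ray by the distance the field guarantees is empty.
// No triangles, no BVH - the structure DXR is built around never appears.
bool sphereTrace(float ox, float oy, float oz,
                 float dx, float dy, float dz, float* tHit) {
    float t = 0.0f;
    for (int i = 0; i < 128; ++i) {
        float d = sceneSDF(ox + dx * t, oy + dy * t, oz + dz * t);
        if (d < 1e-4f) { *tHit = t; return true; }  // close enough: a hit
        t += d;                                     // safe step forward
        if (t > 100.0f) break;                      // ray left the scene
    }
    return false;
}

int main() {
    float t;
    // Ray from z = -5 straight at the sphere: expect a hit near t = 4.
    if (sphereTrace(0, 0, -5, 0, 0, 1, &t))
        std::printf("SDF hit at t = %.3f (no BVH involved)\n", t);
}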
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Not so much the launch games as the actual hardware, compared to PC hardware. The PS5 is like a 2060, the XSX like a 2060S in ray tracing. Both will struggle with 4K, checkerboard rendering, high framerates, etc. You can see this currently in the COD and AC games. You can only optimize; you can't suddenly get 2070/3090 graphics out of the same hardware you currently have.

Of course a lot of third-party devs will take a similar approach on all HW, especially near launch, as it's the practical thing to do. But as much as you may not want to like it, you need to either call the Epic guy out or accept that, with lower-level access, developers with time and resources can make more efficient use of the hardware, plus the benefits you get from a vertically integrated design. That is sensible, and I'm not sure why it's now being disputed (although I have my PCMR theories ;)). Still, first-party teams have that mission: to explore and push the platform forward. The RT in MM is not bad for a launch title.

So, taking the above in along with your 2060/2060S ratings, how do you mix it all together? What does extracting more performance out of the HW mean to you, then? Is it another way of saying "punching above its weight"?

Not sure creating strawmen and exaggerations does any good (as if people thought a $399 console with a peak power consumption around 200+ watts was going to have stronger raw HW performance than cards that cost 2x as much and consume more than 200 watts on their own), but you do you.
 

llien

Member
Let's recap.

Someone said ray tracing is an overrated technology.
I said ray tracing is indispensable for better lighting.
You asked me if I had seen the UE5 demo, implying it shows we don't need ray tracing for better lighting.

unknown.png


The UE5 demo uses raytracing to give better lighting.

I didn't ask you anything, confused stranger.
You were told ray tracing (in this context: hardware RT) is not needed to have said effects.

No, Lumen is not using it.

No, its structures are not even compatible with DXR and the like.

No, you do not need "ray tracing" (as in hardware with DXR-like capabilities) to get said effects.

Yes, it is time for you to get over it.
 

Lethal01

Member
I didn't ask you anything, confused stranger.
Have you seen the UE5 demo?
:messenger_smirking:
You not being able to remember what happened less than a day ago explains a lot, confused stranger.

You were told ray tracing (in this context: hardware RT) is not needed to have said effects.
No, you do not need "ray tracing" (as in hardware with DXR-like capabilities) to get said effects.
No, I was told that raytracing is useless and overrated before you butted in.
This explains why you are having such a hard time communicating. We are talking about raytracing; whether it's hardware-accelerated or not isn't the point. And yes, currently raytracing is the only way to achieve these effects accurately in real time; it's the best method by far. Any method you are thinking of is either inferior or often just an attempt at using raytracing as efficiently as possible.

No, Lumen is not using it.

The creator of it disagrees.

No, its structures are not even compatible with DXR and the like.

They are looking into whether it's possible to use the built-in ray-tracing hardware to improve Lumen's performance.

It seems you just don't know what you are saying. You want to force your hatred of DXR into the conversation I was having about raytracing, and then get confused when I use the definition of what was actually being discussed... maybe you should move on.
 
Last edited:

Darius87

Member
Somehow people believe they will get 4090 ti specs from "optimization", and I just laugh.
You can get up to 2x better performance by optimizing for a particular spec; many devs have said so.
So if a game for PS5 is coded to the metal, it could match the performance of a 20 TFLOPS PC card; and on top of that there's a lot of custom ASIC hardware in the PS5, which would give even more headroom, not necessarily in graphics.
 
Last edited:

acm2000

Member
Yes I do.

Add RT Audio and Shadows and it makes matters worse.

GPU resources can be better spent elsewhere.

And you are still wrong: ray-traced shadows are miles above shadow maps and make a HUGE difference in making games look more realistic, cleaner, etc., and they're well worth the GPU calls. Admittedly, RT GI is mind-melting, but probably too much even for the 30xx generation.
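The difference is easy to show in miniature. A toy 1D sketch (my own invention, nobody's engine code): a shadow map quantizes occlusion into texels, so points near a shadow edge can get the wrong answer, while a shadow ray is exact per point.

#include <cstdio>

const int kMapRes = 8;  // deliberately tiny shadow map to expose quantization

// Shadow-map lookup: snap x to the center of its texel; every point in that
// texel shares one occlusion answer - the blocky edges shadow maps show.
bool shadowMapLit(float x, float occStart, float occEnd) {
    int texel = (int)(x / 10.0f * kMapRes);              // world spans [0, 10)
    float center = (texel + 0.5f) * 10.0f / kMapRes;
    return !(center >= occStart && center <= occEnd);
}

// Shadow ray: ask whether the segment from x up to the light hits the
// occluder. Exact per point; no texels, no bias tricks.
bool shadowRayLit(float x, float occStart, float occEnd) {
    return !(x >= occStart && x <= occEnd);              // light straight above
}

int main() {
    const float occStart = 3.0f, occEnd = 4.0f;          // occluder over [3, 4]
    for (float x = 2.5f; x <= 4.51f; x += 0.25f)
        std::printf("x=%.2f  shadow map: %s   shadow ray: %s\n", x,
                    shadowMapLit(x, occStart, occEnd) ? "lit " : "dark",
                    shadowRayLit(x, occStart, occEnd) ? "lit " : "dark");
}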
 
Last edited:

llien

Member
You not being able to remember what happened less than a day ago explains a lot, confused stranger.


No, I was told that raytracing is useless and overrated before you butted in.
Yes, and "ray tracing" meant "DXR-like hardware RT" in this discussion; nobody questioned the basic laws of optics.

We are talking about raytracing; whether it's hardware-accelerated or not isn't the point.
Yes, it is exactly the point.

It seems you just don't know what you are saying.
Well, in short: you are a full-of-shit, pathetic poster who did this:

tenor.gif


It was DEDICATED HARDWARE RT that people were talking about.

Nobody has ever argued about exactly how it is done in software, just that no hardware RT is needed to have those effects, as we've seen for years and now also with Epic's Lumen.
 

assurdum

Banned
I did read it somewhere... but even if I can't verify what they said, I can bet a dollar to a dime that Dictator's opinion and mine are correct, since we know how the reflections in Spiderman MM could be replicated. Since they don't mention any "code to the metal" to achieve that performance (how they look doesn't matter, since they don't mimic real reflections in the first place), you would be assuming something more illogical than our claims.
Honestly, I don't care what you or Dictator think you see in Spiderman's raytracing because "you know how such things work": you are not a sophisticated AI machine that detects graphics tech with scan-vision, and flatly saying WD Legion uses far superior raytraced reflections compared to Spiderman proves it. I don't know where the hell this absurd conviction comes from, because there isn't any evidence of it on screen; quite the contrary. And the funny thing is, it only takes a simple capture from the same DF videos to destroy that assessment. Heck, raytracing in WD is extremely limited in LOD and it abuses a mix of cubemaps/SSR; in what absurd universe can it be superior to something which appears more raytraced? Where is this superiority supposed to be seen, at least? And don't hide your assumptions behind tons of tech conjecture, as always; just show us poor humans a comparison screenshot of both which proves your claim.
 
Last edited:

ripeavocado

Banned
I mean, are you familiar with late-'90s PCs? Where you don't have sound, or your game runs with fewer colors, and all that shit? On PC it is a fucking challenge to make a piece of software work on literally millions of combinations of HW. So yeah, his bitching does not really mean much.

What does he know? He's just a rendering engineer at Epic Games, while you are a random guy on the internet talking about '90s PCs that have nothing to do with what he was talking about.
 

assurdum

Banned
Watch Dogs: Legion ULTRA REFLECTIONS
535YiJM.jpg



Spiderman MM reflections



0BkBi1.jpg

Like, seriously. The supposedly professional people are spitting out absurdities this time, but they can, because they see "things" which we poor humans can't catch with our limited knowledge. The difference between the two is almost embarrassing, and you don't need to be an expert to notice it. It's like comparing a PS2 POV vs a PS3 and saying "you know, it has better transparency; the PS3's better POV doesn't count". It's the same logic.
 
Last edited:

Lethal01

Member


Yes, and "ray tracing" meant "DXR like hardware RT" in this discussion, nobody questioned basic laws of optics.


Yes, it is exactly the point.


Well, in short: you are a full-of-shit, pathetic poster who did this:

tenor.gif


It was DEDICATED HARDWARE RT that people were talking about.

Nobody has ever argued about exactly how it is done in software, just that no hardware RT is needed to have those effects, as we've seen for years and now also with Epic's Lumen.

The only one moving goalposts here is you, Mr. "I didn't ask you anything".

That aside, the fact remains: those effects are done with raytracing. Lumen has the same issues as most of the solutions before it because it doesn't raytrace enough, which is why it cannot properly replicate those effects. So raytracing is pretty much the most important thing when it comes to 3D lighting.
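Concretely, "enough" comes down to ray budget. A toy sketch (my own, with plain averaging of random samples standing in for gathering light; no engine code): Monte Carlo error shrinks only like 1/sqrt(N), which is why a low ray count leaves visible noise that denoisers then have to hide.

#include <cmath>
#include <cstdio>
#include <cstdlib>

int main() {
    std::srand(42);
    const double trueValue = 0.5;  // mean of uniform [0,1) "incoming light"
    for (int n : {4, 64, 1024}) {
        double sum = 0.0;
        for (int i = 0; i < n; ++i)
            sum += std::rand() / (double)RAND_MAX;  // one "ray" sample
        double estimate = sum / n;
        std::printf("%5d rays: estimate %.4f (error %.4f)\n",
                    n, estimate, std::fabs(estimate - trueValue));
    }
}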

As a favor to you, I will go on a tangent from what was originally discussed (raytracing) and address your point specifically.
If you already agree Lumen looks great due to the raytracing being implemented, then you agree that hardware to make that raytracing go even better would be fantastic. But you are free to keep on being a walking contradiction.

adieu my confused friend, to the shadow realm with ya.
no more polluting this thread with you.
 
Last edited:

llien

Member
That aside, the fact remains: those effects are done with raytracing
The fact remains: it's not hardware RT, which is what everyone, including yourself, meant when saying "ray tracing" until that goalpost moved.
Nobody argued with that re-defined version of what "ray tracing" is.

Lumen has the same issues as most of the solutions before it because it doesn't raytrace enough.
A citation for what "enough raytracing" means is desperately needed to accompany this amazingly insightful take.

If you already agree Lumen looks great due to the raytracing being implemented...
I don't know or care why Lumen looks great; I see that it looks great, and I know it didn't use any hardware RT.

..then you agree that hardware to make that raytracing go even better...
Which hardware "go even better"? I need to see it to judge.
 
Last edited:

Trimesh

Banned
I'm referring to Lumen's structures, which are the same across the board and are, to put it softly, very incompatible with DXR.
UPDATE: better link

So they fucked up their software architecture.

They basically designed their software around the way that nVidia chose to do things and are now whining that it doesn't map well to other platforms. That's not the fault of the other platforms; it's just shitty, incompetent software engineering.
 
Last edited:

llien

Member
So they fucked up their software architecture.

They basically designed their software around the way that nVidia chose to do things...



They designed a system that runs without hardware RT on a wide range of hardware configs.
The only people whining in this thread are DXR proponents, who seem to struggle with the thought that DXR ain't at all needed to have good looking games.
 
Last edited:
LOL, you're all taking the bait. All I know is, as a PC gamer: man, I could run every single game from every single generation at ultra-high detail, but unless a new card had juuuuuuuuuuuuuuust released and I upgraded to it immediately, those max settings made the game run far more poorly than console launches that had just released a massive game - every time.

And even afterwards, BF3 as an example: it looked glorious on my PC at high and ultra-high detail, but I couldn't run it and actually expect to play it at the highest settings, even after maxing out my hardware specs. That is the difference between consoles and PCs cross-platform. And don't mention how long it took in actuality - get a copy of Gears of War 1, 2 or 3 on PC; when they released, PC had nothing that compared.

This occurrence happens every generation. Last gen, with Horizon, no game on PC came close graphically, and a couple of other console titles followed that did this too; it will happen this gen as well. Not sure why others continue to insinuate consoles won't eventually have a visual masterpiece that PCs have to catch up with, as it is an exclusive console title, particularly when history shows otherwise.
So... can I ask about the elephant in the room: what does this mean?

UKN5iGf.png
 