
Rendering Engineer at EpicGames: DirectX RayTracing and Vulkan Optix holds everything back in PC land

No. It's not the greatest thing since sliced bread, any more than tessellation was the greatest thing since sliced bread, or any more than bump mapping was the greatest thing since sliced bread.


I'm not sure why you're bringing up SSDs.


Oh so that's why... I'd say that Game Pass is bigger than RT, but whatever. I guess for you that doesn't count. I'll give you another one. Vulkan. Let me guess. That one doesn't count either... What about machine learning...? Hm... Doesn't count either, unless it's DLSS, Am I Rite?


So...? New techniques for real-time graphics are always evolving. RT is on everyone's radar simply because nVidia shoved it down everyone's throats.
RT is not new at all by the way, since you're counting SSDs and whatnot... RT has been around since the 1970s... It's new to modern games, just like ML, the new APIs, and yes, even SSDs.


It can all depend on what your definition of 'new' is. It's still new to PC, considering the games that support it and the amount of people that can actually use it. In fact, 4K is arguably still new to gaming in general.


Uhuh... It doesn't work like that. The performance hit is still too high compared to the visual improvement that it offers. You only think like that because your own mind works that way, and you project your own aggressive fanaticism onto others.

If you stopped posting stupidity, I might.


1uv808.jpg
Was there hardware to run RT in real time in the 1970s? There was a demo of a vehicle on PS3 running raytraced light at a very low framerate, but nothing in real time until recently. No one can disagree with that, or with the fact that it's one of the biggest, if not the biggest, graphical breakthroughs. Maybe since Nvidia did it first, you don't care, or is it because AMD struggles with it in comparison?

DLSS is definitely up there, and complements raytracing in a big way, no one can argue that, not even you can. But whatever man, you have too much fanboyism and bias to understand where we are headed from a technical and graphical standpoint. Which is fine and all, but it's hard to have meaningful conversations with those who live in an alternate reality and buy things out of pity. The sad part is, you can't even say what's bigger than DLSS or raytracing! Let me guess, Rage© mode in GPUs 😂?
 

MrFunSocks

Banned
Well, nothing's holding this rendering engineer back from writing his own bunch of raytracing APIs if the ones available aren't good enough for his liking. Go for gold, put your money where your mouth is.
 

Ascend

Member
Was there hardware to run RT in real time in the 1970s? There was a demo of a vehicle on PS3 running raytraced light at a very low framerate, but nothing in real time until recently. No one can disagree with that, or with the fact that it's one of the biggest, if not the biggest, graphical breakthroughs. Maybe since Nvidia did it first, you don't care, or is it because AMD struggles with it in comparison?
Maybe it's because RT was an obvious move to artificially jack up GPU prices, despite nVidia already being one of the most profitable companies around. And as already stated, the performance drop is too high in comparison to the visual improvement that it offers. That is also a fact that no one can disagree with.
I can also say that you only care about RT exactly because nVidia has an advantage there. But I'm tired of going down to your level with your retarded backhanded personal slander.

DLSS is definitely up there, and complements raytracing in a big way, no one can argue that, not even you can.
It does. Too bad it came at the cost of milking the audience for higher GPU prices than necessary, right?

But whatever man, you have too much fanboyism and bias to understand where we are headed from a technical and graphical standpoint.
Here we are again with the backhanded slander and passive aggressive comments... I know where we are headed. And not to brag, but I'm quite sure that my knowledge on hardware is more extensive than yours. You only care about the shiniest new virtual toys that you can only look at, rather than the actual workings and progress of the industry. Just like the majority. And the majority is always wrong. That's why democracy can never work, but that's another story for another time.

it's hard to have meaningful conversations with those who live in an alternate reality and buy things out of pity.
Another backhanded comment. You know what's hard? Having a conversation with;
1uv808.jpg


The sad part is, you can't even say what's bigger than DLSS or raytracing! Let me guess, Rage© mode in GPUs 😂?
I just did in my last post. But... Since you're pigeon-chessing, you deliberately filter out whatever is out of your own bubble. You have comprehension problems. It's like there can be 100 black swans in front of you, and the white one is the only one you see. You don't post to have an actual conversation or to learn anything, but just to try and be right and shove your view as some universal standard. That's why we constantly see this from you;

Red-herring-fallacy.jpg
 

Lethal01

Member
Maybe it's because RT was an obvious move to artificially jack up GPU prices, despite nVidia already being one of the most profitable companies around. And as already stated, the performance drop is too high in comparison to the visual improvement that it offers. That is also a fact that no one can disagree with.
I can also say that you only care about RT exactly because nVidia has an advantage there. But I'm tired of going down to your level with your retarded backhanded personal slander.


It does. Too bad it came at the cost of milking the audience for higher GPU prices than necessary, right?


Here we are again with the backhanded slander and passive aggressive comments... I know where we are headed. And not to brag, but I'm quite sure that my knowledge on hardware is more extensive than yours. You only care about the shiniest new virtual toys that you can only look at, rather than the actual workings and progress of the industry. Just like the majority. And the majority is always wrong. That's why democracy can never work, but that's another story for another time.


Another backhanded comment. You know what's hard? Having a conversation with;
1uv808.jpg



I just did in my last post. But... Since you're pigeon-chessing, you deliberately filter out whatever is out of your own bubble. You have comprehension problems. It's like there can be 100 black swans in front of you, and the white one is the only one you see. You don't post to have an actual conversation or to learn anything, but just to try and be right and shove your view as some universal standard. That's why we constantly see this from you;

Red-herring-fallacy.jpg

When it comes to graphics, raytracing is definitely the biggest innovation in ages. Bump mapping was indeed the hottest thing since sliced bread, but this is even bigger.
 
You think you have more knowledge in this than me? Ohhh boy, I'll bet you on that!! As a matter of fact, I've already won based on your response.

"Ray tracing is too taxing. Nvidia is just the shiniest toy. Its more expensive and a way to milk the customers."

Your knowledge is beyond lacking if this is how you view new strides in technology and consumer-level products. I'd love to hear your thoughts on Tesla and other EVs. Actually, never mind. "It's more expensive, the technology for self-driving cars isn't 100% perfect, there are fewer charging stations than gas stations, etc." That's literally how salty you sound in regard to Nvidia.

I'm almost convinced you either:

A) are an employee at AMD,
B) are related to Lisa Su,
C) have investments in AMD, or
D) all of the above.








Also quit being so sensitive. I'm not personally attacking you. I'm just telling it how it is. I'm not offended that you constantly shit talk Nvidia, cause I'm not an Nvidia fanboy. I just like where they are headed, and know how to differentiate between someone who knows what they are talking about, and those who have no clue. You can figure out which one you fall into, right? I'm sure others can.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Well, nothing's holding this rendering engineer back from writing his own bunch of raytracing APIs if the ones available aren't good enough for his liking. Go for gold, put your money where your mouth is.
... and writing his own GPU driver why not? As if that were practical.... c’mon.
Not sure I understand the anger and saltiness around the statements he made unless you were either a PCMR warrior, a MS warrior, or both.

Did they criticise precious MS? Is that the problem?
 
Last edited:

Bo_Hazem

Banned
I think it's about time a company mans up and creates an OS that is 100% standalone for gaming, to maximize hardware capabilities, and that runs separately from Windows from the moment the PC starts up. With that, most cards and hardware would deliver more than they do now.
 

raul3d

Member
Maybe it's because RT was an obvious move to artificially jack up GPU prices, despite nVidia already being one of the most profitable companies around. And as already stated, the performance drop is too high in comparison to the visual improvement that it offers. That is also a fact that no one can disagree with.
This is so wrong. Tessellation and bump mapping are rendering techniques invented solely to work around a geometry bottleneck. Raytracing is different and is here to stay. Despite being used for 50 years in offline rendering, it is still the best method for generating high-quality renderings. If you want to compare it to something, use hardware T&L, which basically did the same, just for rasterization. It was revolutionary when it came out.
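For anyone who wasn't around for it, hardware T&L moved the per-vertex work (transform each vertex by a matrix, then compute lighting) off the CPU and onto the GPU. Below is a minimal sketch of that operation itself; the matrix, vertex, and light values are made-up illustration data, not taken from any real pipeline.

```cpp
// Minimal sketch of the per-vertex "transform and lighting" work that hardware
// T&L moved from the CPU to the GPU. All values are made-up illustration data.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Transform a point by a 4x4 row-major matrix (w assumed 1, no perspective divide).
static Vec3 transform(const float m[16], Vec3 v) {
    return { m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3],
             m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7],
             m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11] };
}

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

int main() {
    // Identity transform with a translation of +2 along z.
    const float modelView[16] = {1,0,0,0, 0,1,0,0, 0,0,1,2, 0,0,0,1};
    Vec3 position = {0.f, 1.f, 0.f};
    Vec3 normal   = {0.f, 1.f, 0.f};
    Vec3 lightDir = {0.f, 1.f, 0.f};

    Vec3 p = transform(modelView, position);   // the "T": transform each vertex
    float diffuse = dot(normal, lightDir);     // the "L": simple per-vertex lighting
    if (diffuse < 0.f) diffuse = 0.f;

    std::printf("transformed position: (%.1f, %.1f, %.1f), diffuse: %.2f\n",
                p.x, p.y, p.z, diffuse);
    return 0;
}
```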

I take it that you are too young to have seen a "new" technology emerge and become the new standard?

It does. Too bad it came at the cost of milking the audience for higher GPU prices than necessary, right?
It does? I would argue that the value proposition of an RTX 3070 or 3080 is pretty solid. The price premium comes mostly from the fact that Nvidia is the undisputed leader in new technology and can ask for a premium. Apple does the same, and it doesn't even offer technology that is unavailable elsewhere.
 
Last edited:

Haggard

Banned
Maybe it's because RT was an obvious move to artificially jack up GPU prices, despite nVidia already being one of the most profitable companies around. And as already stated, the performance drop is too high in comparison to the visual improvement that it offers. That is also a fact that no one can disagree with.
A fact it is not... it is your personal opinion. Nothing more, nothing less.
RT has been widely used in CGI for decades. It is the way to get correct lighting.
I would agree that partial implementations, like for reflections only, are not necessarily worth the performance impact, but if you have the hardware power to use it for something as impactful as global illumination with a sufficient ray count and bounce depth, it absolutely transforms the look of a game.
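To put very rough numbers on that ray count and bounce depth tradeoff, here is a minimal back-of-the-envelope sketch; the resolution, sample counts, and bounce counts are made-up illustration values, not tied to any particular engine or GPU.

```cpp
// Back-of-the-envelope sketch (hypothetical numbers) of why GI cost scales with
// ray count and bounce depth: every extra sample per pixel and every extra
// bounce multiplies the number of rays traced per frame.
#include <cstdint>
#include <cstdio>

int main() {
    const std::uint64_t width = 3840, height = 2160;   // 4K target
    const std::uint64_t pixels = width * height;

    for (int spp : {1, 2, 4}) {
        for (int bounces : {1, 2, 4}) {
            // One GI ray per bounce per sample, per pixel.
            std::uint64_t raysPerFrame = pixels * spp * bounces;
            std::printf("%d spp, %d bounces -> %.1f million rays/frame\n",
                        spp, bounces, raysPerFrame / 1e6);
        }
    }
    return 0;
}
```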
People should be in awe that we get real-time raytracing at all... instead it's become another juvenile team red/green quarrel.
 
Last edited:

Ascend

Member
I would agree that partial implementations, like for reflections only, are not necessarily worth the performance impact,
Then why are you telling me I'm wrong... And why are you telling me it's "just my opinion"...? Technically you agree, but you pretend you don't.
I guess ganking is a thing in forums too.

but if you have the hardware power to use it for something as impactful as global illumination with a sufficient ray count and bounce depth, it absolutely transforms the look of a game.
I never said it didn't.

People should be in awe that we get real time raytracing at all...instead it`s become another juvenile team red/green quarrel.
Well, to some of us, some things are more important than fancy graphics. But obviously if you don't jump on the RT hype, you get trashed. It happens to Hardware Unboxed too, so much so that they, as reviewers who have trashed AMD harshly in the past (rightfully so in the majority of cases), are now being called AMD fanboys. 🤷‍♂️
I'll just say that there's a reason Nintendo is quite successful despite constantly having the weakest hardware in pretty much all of gaming. A game can have the fanciest graphics and still be garbage. Or a game can have retro graphics and be great. A strong desire to have RT right now is really just wanting the dopamine rush of a shiny new toy, because in reality, it is not that useful yet.

Now don't get me wrong. Can RT bring gameplay innovation? Yes it can, simply because rasterization is limited in certain aspects. For example, a stealth game requiring you to stay in the shadows can be implemented much more easily with RT. Or a game like Alan Wake can be taken to a whole other level.
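As a hedged illustration of that stealth idea: with ray tracing, "is the player in shadow?" can literally be a visibility (shadow) ray cast from the player towards each light. The toy scene, types, and function names below are invented purely for illustration.

```cpp
// Minimal sketch of a stealth check via a shadow ray: if the segment from the
// player to the light is blocked by an occluder, the player counts as hidden.
// Everything here (sphere occluders, positions) is made-up illustration data.
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Sphere { Vec3 center; float radius; };   // stand-in occluder

// Returns true if the segment from 'from' to 'to' hits any occluder.
static bool occluded(Vec3 from, Vec3 to, const std::vector<Sphere>& scene) {
    Vec3 d = sub(to, from);
    float len2 = dot(d, d);
    for (const Sphere& s : scene) {
        Vec3 m = sub(s.center, from);
        float t = dot(m, d) / len2;                 // closest point on the segment
        if (t < 0.f || t > 1.f) continue;
        Vec3 p = {from.x + d.x*t, from.y + d.y*t, from.z + d.z*t};
        Vec3 pc = sub(s.center, p);
        if (dot(pc, pc) < s.radius * s.radius) return true;
    }
    return false;
}

int main() {
    std::vector<Sphere> scene = {{{0, 1, 5}, 1.0f}};      // one pillar between player and light
    Vec3 player = {0, 1, 10}, light = {0, 1, 0};
    bool inShadow = occluded(player, light, scene);       // the shadow ray
    std::printf("player is %s\n", inShadow ? "hidden in shadow" : "lit");
    return 0;
}
```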

But that is still a long way off, which is also one of the reasons I say that the performance impact is still too high for its visual impact. No one is going to build a game around a feature like RT when only a marginal part of the audience has access to it. As long as it is graphical only, it really has no true value from a gameplay perspective. The consoles can potentially change that in a couple of years, and I welcome that. But I would not base my hardware purchasing decision solely on RT at this very moment.

This is so wrong. Tessellation and bump mapping are rendering techniques invented solely to work around a geometry bottleneck. Raytracing is different and is here to stay. Despite being used for 50 years in offline rendering, it is still the best method for generating high-quality renderings. If you want to compare it to something, use hardware T&L, which basically did the same, just for rasterization. It was revolutionary when it came out.
Right now RT is being used more as an add-on technique, like tessellation, bump mapping, ambient occlusion, or screen space reflections, rather than as the actual rendering method. It is still at most a couple of effects on top of rasterization. It is nowhere near as important as hardware T&L was.
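A rough sketch of that add-on structure, assuming a hypothetical engine: a conventional rasterized frame with optional ray-traced passes layered on top for individual effects. The function names and the FrameSettings struct are placeholders, not any real engine's API.

```cpp
// Sketch of hybrid rendering as an "add-on": the image exists without RT at
// all, and each ray-traced effect is an optional extra pass on top of it.
// All names here are hypothetical placeholders for illustration only.
#include <cstdio>

struct FrameSettings { bool rtReflections; bool rtShadows; bool rtGI; };

static void rasterizeGBufferAndLighting() { std::puts("raster: G-buffer + lighting"); }
static void traceReflections()            { std::puts("RT pass: reflections"); }
static void traceShadows()                { std::puts("RT pass: shadows"); }
static void traceGlobalIllumination()     { std::puts("RT pass: global illumination"); }
static void composite()                   { std::puts("composite + post-processing"); }

void renderFrame(const FrameSettings& s) {
    rasterizeGBufferAndLighting();           // base frame via rasterization
    if (s.rtReflections) traceReflections(); // each RT feature is opt-in
    if (s.rtShadows)     traceShadows();
    if (s.rtGI)          traceGlobalIllumination();
    composite();
}

int main() {
    renderFrame({true, false, false});       // e.g. ray-traced reflections only
    return 0;
}
```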

It does? I would argue that the value proposition of an RTX 3070 or 3080 is pretty solid. The price premium comes mostly from the fact that Nvidia is the undisputed leader in new technology and can ask for a premium. Apple does the same, and it doesn't even offer technology that is unavailable elsewhere.
The value proposition of the RTX 3070 and RTX 3080 is indeed OK, unlike that of the RTX 2000 series. There is not really a 'price premium' for the RTX 3000 series anymore, except for the RTX 3090. Basically, since the RTX 2000 series, the whole graphics card industry has become a 'price premium'. And that's what most people don't get.

Apple is yet another company that I refuse to buy anything from. Apple is worse though, because you literally pay a lot more for actually less. At least nVidia offers some additional features.
 

Haggard

Banned
Then why are you telling me I'm wrong... And why are you telling me it's "just my opinion"...? Technically you agree, but you pretend you don't.
I guess ganking is a thing in forums too.
If you want to have a civil discussion with people, you should maybe refrain from half-quoting things out of context to change the message, and maybe also tone down the victim complex a little. :messenger_neutral:
My issue with your statement was just that you tried to generalize and elevate your personal opinion/taste to fact status.

Well, to some of us, some things are more important than fancy graphics. But obviously if you don't jump on the RT hype, you get trashed. It happens to Hardware Unboxed too, so much so that they, as reviewers who have trashed AMD harshly in the past (rightfully so in the majority of cases), are now being called AMD fanboys. 🤷‍♂️
I'll just say that there's a reason Nintendo is quite successful despite constantly having the weakest hardware in pretty much all of gaming. A game can have the fanciest graphics and still be garbage. Or a game can have retro graphics and be great. A strong desire to have RT right now is really just wanting the dopamine rush of a shiny new toy, because in reality, it is not that useful yet.

Now don't get me wrong. Can RT bring gameplay innovation? Yes it can, simply because rasterization is limited in certain aspects. For example, a stealth game requiring you to stay in the shadows can be implemented much more easily with RT. Or a game like Alan Wake can be taken to a whole other level.

But that is still a long way off, which is also one of the reasons I say that the performance impact is still too high for its visual impact. No one is going to build a game around a feature like RT when only a marginal part of the audience has access to it. As long as it is graphical only, it really has no true value from a gameplay perspective. The consoles can potentially change that in a couple of years, and I welcome that. But I would not base my hardware purchasing decision solely on RT at this very moment.
RT as an advanced rendering technique is for eye candy/visual realism, so I do not understand at all why you bring gameplay into this or talk about "games built around RT", as that is completely missing the point.
The principle of the tradeoff between eye candy and performance is as old as the first graphics-slider in a video game and has always been a personal choice.

So what exactly are you even discussing? I really don't get it...
 

yurinka

Member
Why do you care about what PC players have anyway? It's almost like a jealousy thing when you console warriors try and downplay PC.
I don't care about PC players and I don't downplay anything; I just show gaming market facts that it seems you PC fanboys can't accept.

The facts are, PC has much better hardware that can push higher framerates, higher resolution, better textures, modding, etc. The PS5 can barely maintain 60fps in cross-gen games, and here you are trying to downplay PCs that can run more than double or triple that.
High-end PCs, yes, are more powerful. What the facts (Steam stats) show is that only a tiny portion of Steam users have high-end PCs.

Regarding the PS5, it performs great at 60fps (and at 120fps too in many games).

Please find me a gaming computer that you can buy right now that is running the OS on an HDD. Even cheapo laptops from several years ago come with an SSD as standard. There are way more PC gamers with SSDs than there are PS5s out in the wild. Don't even mention Tim Sweeney, as all of that was debunked by Epic China. Why are all multiplatform games multitudes better on PC?
Yes, there are probably more PCs with SSDs in the market than PS5s, and new ones include an SSD even if it's worse than the PS5's and has more bottlenecks. But again, the average PC that people have at home isn't a new one, in the same way that most console users still don't have a PS5.

Epic China obviously didn't debunk him; it was a mistranslation/misinterpretation of a video they published.

Multiplatform games run better or worse on PC depending on the PC you have. On high-end/top PCs they perform better than on modern consoles; on the average PC that Steam users have, they perform worse than on modern consoles.

I seriously don't believe you are a game developer, when you are oblivious to these facts... Unless you develop mobile games??
I work on multiplatform games for console, PC, and arcade. I did work on mobile and browser games too. I've been working for 15+ years on over 40 published games, several of them award-winning, and at a top publisher (now indie, creating my own studio).
 

VFXVeteran

Banned
I seem to remember a room in a Minecraft YouTube vid where the reflections were heavily nested. Maybe I'm misremembering.

Heavily nested - yes. Infinite? Nah. Maybe 16 levels of recursion. We do that a lot to make sure that our ray switch shader works properly.
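For readers unfamiliar with how such a cap works, here is a minimal sketch of recursion-limited reflection tracing; the 16-level limit mirrors the number mentioned above, while the Hit type and the shading logic are invented purely for illustration.

```cpp
// Minimal sketch of a recursion cap on reflection rays: reflections recurse,
// but bail out once a maximum depth is reached (16 here, matching the post).
// The Hit type and the shading logic are invented purely for illustration.
#include <cstdio>

constexpr int kMaxRecursionDepth = 16;

struct Color { float r, g, b; };
struct Hit { bool mirror; Color base; };       // stand-in intersection result

static Hit intersectScene() {
    // Pretend everything is a mirror so the recursion always runs to the cap.
    return Hit{true, Color{0.1f, 0.1f, 0.1f}};
}

static Color traceRay(int depth) {
    if (depth >= kMaxRecursionDepth)           // hard cap instead of "infinite"
        return Color{0.f, 0.f, 0.f};
    Hit h = intersectScene();
    Color c = h.base;
    if (h.mirror) {
        Color refl = traceRay(depth + 1);      // nested reflection bounce
        c.r += 0.8f * refl.r;
        c.g += 0.8f * refl.g;
        c.b += 0.8f * refl.b;
    }
    return c;
}

int main() {
    Color c = traceRay(0);
    std::printf("color after capped recursion: %.3f %.3f %.3f\n", c.r, c.g, c.b);
    return 0;
}
```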
 

rnlval

Member
... and writing his own GPU driver why not? As if that were practical.... c’mon.
Not sure I understand the anger and saltiness around the statements he made unless you were either a PCMR warrior, a MS warrior, or both.

Did they criticise precious MS? Is that the problem?
A GPU driver is still software. Hit-the-metal programming on modern game consoles is like writing a userland GPU driver. Sony wouldn't allow game programmers to write a kernel-level GPU driver/software.
 
A GPU driver is still software. Hit-the-metal programming on modern game consoles is like writing a userland GPU driver. Sony wouldn't allow game programmers to write a kernel-level GPU driver/software.
Shhh, you can't do that in this thread. Gotta leave some leeway for people to make mindless assumptions.
 

Panajev2001a

GAF's Pleasant Genius
A GPU driver is still software. Hit-the-metal programming on modern game consoles is like writing a userland GPU driver. Sony wouldn't allow game programmers to write a kernel-level GPU driver/software.
Lots of things are software, and there is a wide gap between accessing the HW directly, which the current PC GPU drivers still abstract away enough, and replacing the current driver (and ensuring it works on all users' PC HW setups... they are smart enough to write to-the-metal code on consoles but not enough to achieve the same on PC by "just" circumventing the driver), and you know that. You are smart and not clueless at all, but you are playing semantics here. I do not know why this topic is controversial or why some feathers are being ruffled... it is not even putting consoles against each other: it is saying consoles give lower-level access in this area, which developers prefer to have.

As VFXVeteran often points out, despite the complaints by the Epic dev (who wants the HW he writes for to sing the best way it can), on PC you do have GPUs being able to brute force their way out of these inefficiencies (albeit if they weren't there, more users would have a GPU meeting the engine requirements).

What does it matter if Sony allowed developers to write a kernel module or not? They allow sufficient low-level access and document it, so developers can talk to the GPU directly and decide what they want to do themselves and what they want to let the system do for them. I do not think the PS5 has anywhere near as complex and opaque a GPU driver as you see on a Windows or Linux box. I do not think the XSX hides as many low-level details from developers as any RDNA2 driver does on Windows.
On PC you also often have undocumented HW features, and the original Windows GPU driver still wraps all GPU access unless you wrote your own full GPU driver and asked users to install it and replace what was given. Unless the PC driver by AMD or nVIDIA exposes the right level of access, there is a chasm between console-like low-level access to the HW and writing a replacement driver for the GPU.
 
Last edited:

Spukc

always chasing the next thrill
You know ray tracing isn't just reflections, right? Ray-traced shadows are miles better than shadow-mapped ones, ray-traced global illumination is mind-blowing, and even audio can be ray traced.
RT SHADOWS in WoW are fucking sheit.

RT is overrated as fuck right now.
And screen space reflections are enough 99% of the time.
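For context on why SSR covers "most" but not all cases: screen space reflections can only reuse what was already rendered on screen, so a reflected sample that marches off-screen simply has no data behind it. Below is a minimal sketch of that limitation; the coordinates and march step are made-up illustration values.

```cpp
// Sketch of the core SSR limitation: the reflection ray is marched in screen
// space, and once it leaves the [0,1] screen rectangle there is nothing to
// sample, because that geometry was never rendered this frame.
#include <cstdio>

struct Vec2 { float x, y; };

// March a reflection direction in screen space; report whether we stayed on screen.
static bool ssrSampleAvailable(Vec2 start, Vec2 dir, int steps) {
    Vec2 p = start;
    for (int i = 0; i < steps; ++i) {
        p.x += dir.x; p.y += dir.y;
        if (p.x < 0.f || p.x > 1.f || p.y < 0.f || p.y > 1.f)
            return false;   // reflected geometry is off-screen: SSR has no data
    }
    return true;            // stayed on screen: SSR can reuse the rendered color
}

int main() {
    Vec2 pixel = {0.5f, 0.9f};                                     // a point near the bottom, e.g. a floor
    bool upward  = ssrSampleAvailable(pixel, {0.0f, -0.05f}, 10);  // reflects on-screen scenery
    bool offside = ssrSampleAvailable(pixel, {0.2f,  0.05f}, 10);  // walks off the screen edge
    std::printf("upward reflection usable: %s, off-screen reflection usable: %s\n",
                upward ? "yes" : "no", offside ? "yes" : "no");
    return 0;
}
```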

Minecraft RT is dope tho
 
Last edited:

FALCON_KICK

Member
Not even offline rendering does infinite levels of recursion. Of course, it's just a 2nd level here. It's still more accurate than no recursion at all, like in MM.

@assurdum



There is my video. Full 4K. Ultra settings. You can see the popping across the street. That's not the point of the video, though.


How does Spider-Man PS4 Remastered for PS5 hold up? Are the reflections impressive for a console, disregarding recursive reflections?

 
How does Spider-Man PS4 Remastered for PS5 hold up? Are the reflections impressive for a console, disregarding recursive reflections?


I don't know if the Watch Dogs: Legion video you quoted is representative, but it seems it is very deserted, with significantly fewer moving objects and pedestrians than Spider-Man. Maybe there are busier sections with denser pedestrian and car counts, but if not, it seems that Watch Dogs' ray tracing may need to sacrifice moving object and pedestrian density in addition to cutting the distance at which the ray tracing works.
 
Last edited:

rnlval

Member
Lots of things are software, and there is a wide gap between accessing the HW directly, which the current PC GPU drivers still abstract away enough, and replacing the current driver (and ensuring it works on all users' PC HW setups... they are smart enough to write to-the-metal code on consoles but not enough to achieve the same on PC by "just" circumventing the driver), and you know that. You are smart and not clueless at all, but you are playing semantics here. I do not know why this topic is controversial or why some feathers are being ruffled... it is not even putting consoles against each other: it is saying consoles give lower-level access in this area, which developers prefer to have.

As VFXVeteran often points out, despite the complaints by the Epic dev (who wants the HW he writes for to sing the best way it can), on PC you do have GPUs being able to brute force their way out of these inefficiencies (albeit if they weren't there, more users would have a GPU meeting the engine requirements).

What does it matter if Sony allowed developers to write a kernel module or not? They allow sufficient low-level access and document it, so developers can talk to the GPU directly and decide what they want to do themselves and what they want to let the system do for them. I do not think the PS5 has anywhere near as complex and opaque a GPU driver as you see on a Windows or Linux box. I do not think the XSX hides as many low-level details from developers as any RDNA2 driver does on Windows.
On PC you also often have undocumented HW features, and the original Windows GPU driver still wraps all GPU access unless you wrote your own full GPU driver and asked users to install it and replace what was given. Unless the PC driver by AMD or nVIDIA exposes the right level of access, there is a chasm between console-like low-level access to the HW and writing a replacement driver for the GPU.
From https://www.neogaf.com/threads/rend...back-in-pc-land.1579267/page-5#post-261330583
My post for VFXVeteran

You keep forgetting



Doom 2016 (Vulkan+AMD extensions) was the 1st PC game to use this direct access method.

id Software is not Epic Games.

AMD may need to update GCN Intrinsic Functions for RDNA 2.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
From https://www.neogaf.com/threads/rend...back-in-pc-land.1579267/page-5#post-261330583
My post for VFXVeteran

You keep forgetting



Doom 2016 (Vulkan+AMD extensions) was the 1st PC game to use this direct access method.

id Software is not Epic Games.

AMD may need to update GCN Intrinsic Functions for RDNA 2.


Which still means this may not be available, and the point about DXR stands. On PC you also have a large variation of hardware that AMD itself has to support across all their GPU generations. Have they updated this path for RDNA1 cards? Despite the fancy name, does it deliver direct access to all the HW, or only some of it?

I am well aware the needs of Epic are not the needs of id... and that is precisely the point: this "could" be cool and all, and still be irrelevant for their PC business (great HW variation), not apply to RDNA2 yet, and be more PR than anything, given that AMD is lagging behind in upkeep.


Also, how many games on PC are using these extensions, and how deep is their use? How does it affect access to the RT HW? How do you get most PC devs to use this on the PC platform (... and why are GCN Intrinsics not lighting the world on fire)?
 
Since you're all on about bump mapping: bump mapping in its original form, as designed by John Carmack, was an overhead API shading algorithm that relied very little on artist input parameters and far more on the program interpreting how bounced light refracts across player model/asset details within a scene.

The artist originally had no ability to tweak or control the bump map (this was later resolved by creating a non-API-overhead solution that allowed the artist fine adjustability), as that was handled by the overhead API.

Due to this, bump mapping, when applied to a scene (a scene specifically using the overhead API), was limited in that it only had one setting, unless reprogrammed by the developer.

Meaning API-based bump mapping essentially oversaturated much of the scene if a dev didn't come back and reprogram the API. This was of course before Carmack created an artist-centric solution to his bump mapping shader algorithm.

Due to this, VRSS is essentially bump mapping on steroids, or next-gen bump mapping.

It is also an overhead shading API that deduces how best to interpret details by light refraction, as well as a myriad of other scene-based attributes spread across an entire scene, such as distance from the object affecting asset LOD and pixel densities. But this technique is far more nuanced, as it is capable of adjusting detail across numerous pixel saturation levels at per-pixel density precision. VRSS will allow for amazing performance boosts when properly utilized in the future.
 
Last edited:

rnlval

Member
Which still means this may not be available, and the point about DXR stands. On PC you also have a large variation of hardware that AMD itself has to support across all their GPU generations. Have they updated this path for RDNA1 cards? Despite the fancy name, does it deliver direct access to all the HW, or only some of it?

I am well aware the needs of Epic are not the needs of id... and that is precisely the point: this "could" be cool and all, and still be irrelevant for their PC business (great HW variation), not apply to RDNA2 yet, and be more PR than anything, given that AMD is lagging behind in upkeep.


Also, how many games on PC are using these extensions, and how deep is their use? How does it affect access to the RT HW? How do you get most PC devs to use this on the PC platform (... and why are GCN Intrinsics not lighting the world on fire)?
NAVI 10 can run the GCN wave64 instruction set at a lower latency, i.e. from GCN's 12 clock cycles down to 8 clock cycles, which is about a 33% improvement.

NAVI 10 can run the RDNA 1 wave32 instruction set in 7 clock cycles.

PC GPU variations are less diverse when there are only two major PC GPU vendors and many SKUs are just scaled-down versions of flagship chip designs, hence differences in power but not in instruction set.

The microarchitecture shift from GCN to NAVI 10 was not like the shift from the Radeon X1900 (SIMD) to the Radeon HD 2900 XT (VLIW5), i.e. Sony requires hardware backward compatibility with the PS4.

The basic idea with AMD's GCN Intrinsic Functions is to recycle GCN-based game console programming work for PCs with GCN GPUs, i.e. AMD is being lazy with PC driver-side shader program replacements.

AMD's GCN Intrinsic Functions wouldn't solve rasterization bottlenecks inherent in the Vega GCN architecture.
 
Last edited:

Sophist

Member
I don't know about the PS5, but the PS4 doesn't provide "lower access" to the GPU than PC; its API is Vulkan-level. But the PS4 has unified memory, which makes CPU-GPU cooperation much easier.
 
Last edited: