
Intel turns to emulation for DirectX 9 games after ditching native support

Draugoth

Gold Member
https://www.theverge.com/2022/8/15/23306160/intel-directx-9-support-emulation-direct-x-12





Intel is removing native support for DirectX 9 (DX9) from its new Arc graphics cards and Xe integrated graphics on 12th Gen processors, with support now coming from emulation thanks to DirectX 12.

News of the DirectX 9 change comes from a recently updated support page on Intel’s website in which the company states it will drop support for the 20-year-old graphics API but offers reassurance that “applications and games based on DirectX 9 can still work through [the] Microsoft [Windows 10 and Windows 11] D3D9On12 interface.”

MICROSOFT IS NOW SHOULDERING RESPONSIBILITY FOR OPTIMIZATIONS TO DIRECTX 9 GAMES AND ASSOCIATED BUGS
In essence, while modern Intel GPUs won’t have dedicated drivers for older games that solely run on DX9, such as Unreal Tournament, Star Wars: Knights of the Old Republic, and Team Fortress 2, you should still be able to play them on this newer hardware, and the new dependency on D3D9On12 mapping layers shouldn’t have a noticeably negative impact on gameplay.

That’s not to say issues won’t occur, but Microsoft is now shouldering responsibility for optimizations to DirectX 9 games and associated bugs. That means if game developers experience graphical bugs or Microsoft’s mapping layer refuses to run entirely, they’ll have to take that up with Microsoft directly rather than Intel.

Intel also points out that this isn’t necessarily bad news. There are very few games that still run solely on DX9 (as opposed to supporting DX9 alongside more recent APIs), and those will still work on Intel hardware via the mapping layer. It’s also extremely unlikely that any new games will be built using DirectX 9, not to mention that the quality of in-game graphics has come on leaps and bounds in the 20 years since DX9 was released. The only reason support has likely endured this long in the first place is how much aging PC gaming hardware is still in use today: Nvidia’s GTX 1060 is still the most popular GPU in Valve’s Steam survey, six years after its original release.
 

Reallink

Member
They think repeating "20 years old" will divert attention from the fact that there's a metric shit-ton of big-budget DX9-only games that aren't even a few years old. I'm pretty confident there are still DX9-only indies releasing today, and there will be more in the future. Sucks for iGPU users, 'cause ain't nobody buying an Arc dGPU.
 

Xyphie

Member
Except for a handful of games (say, CSGO or Rocket League for DX9, and Minecraft for OpenGL), legacy performance for those APIs shouldn't matter too much as long as compatibility is good and performance is good enough. The difference between getting 200 fps and 500 fps in some game from 2005 is mostly academic from a user's POV. The legacy API where performance will really matter is DX11, as there are still some new games coming out with it occasionally (God of War and Stray, for instance), and many 2010-2015-era games like The Witcher 3 can still make the most expensive hardware grind to a halt.
 

mrcroket

Member
So they have a not-especially-powerful GPU series that can barely run newer games at high resolutions and/or framerates, and that also won't natively support the API that was the standard for ages and has a vast legacy of games.
 

kiphalfton

Member
In short, don't buy our arc products.

Thanks.

Problem is for those who decide to play a game on something with integrated graphics. Guess those people shouldn't expect much from the get-go... but it still sucks.

That’s not to say issues won’t occur, but Microsoft is now shouldering responsibility for optimizations to DirectX 9 games and associated bugs. That means if game developers experience graphical bugs or Microsoft’s mapping layer refuses to run entirely, they’ll have to take that up with Microsoft directly rather than Intel.

Okay so Intel is removing native support and is pawning the issue off on Microsoft. Neat.
 

nkarafo

Member
Oh, another reason to not touch these cards. Because lackluster performance and bad drivers weren't enough.

Edit: What about DX8 games though? Or even older? Most of these games are still supported by modern cards. I can still play such games on my current build.


Except for a handful of games (say, CSGO or Rocket League for DX9, and Minecraft for OpenGL), legacy performance for those APIs shouldn't matter too much as long as compatibility is good and performance is good enough. The difference between getting 200 fps and 500 fps in some game from 2005 is mostly academic from a user's POV. The legacy API where performance will really matter is DX11, as there are still some new games coming out with it occasionally (God of War and Stray, for instance), and many 2010-2015-era games like The Witcher 3 can still make the most expensive hardware grind to a halt.
I'm currently enjoying a lot of older dx8 and dx9 games @ 240fps on my 240hz monitor. Such high frame rates matter if you can reach them because this is the only way shitty LCD tech panels will reduce their motion blur. Not to mention you can achieve more natural motion and lower input lag this way, for instance, 120fps minimum is essential for Visual Pinball.

Intel simply caters to the average "consume product and get excited about next product" consumers who only care about the latest popular games.
 

ShirAhava

Plays with kids toys, in the adult gaming world
For someone like me who only cares about older games Arc might as well not even exist
 

supernova8

Banned
Unless Arc is devilishly well priced (i.e. extremely cheap), I don't see any reason for anyone to gamble on their first generation of discrete GPUs. It just sounds like all bad news at this point. You'll probably be able to get a 6700 XT for like $300 (which will match/beat even their top Arc card) by the time Arc actually comes out. I'm already seeing them used for $400-ish (in Japan).
 

Shifty

Member
I mean, DXVK under Linux sets some precedent that emulating older graphics APIs by translating them into newer ones is broadly viable for gaming.

However, that's an open-source project maintained by people who have a vested interest in making old stuff work where it shouldn't.

MICROSOFT IS NOW SHOULDERING RESPONSIBILITY FOR OPTIMIZATIONS TO DIRECTX 9 GAMES AND ASSOCIATED BUGS


I'll be nice and call my confidence in this 'limited', given that I still can't run Operation: Inner Space or the non-DOS editions of MechWarrior 2 on modern versions of their OS.
 

Xyphie

Member
I'm currently enjoying a lot of older dx8 and dx9 games @ 240fps on my 240hz monitor. Such high frame rates matter if you can reach them because this is the only way shitty LCD tech panels will reduce their motion blur. Not to mention you can achieve more natural motion and lower input lag this way, for instance, 120fps minimum is essential for Visual Pinball.

Intel simply caters to the average "consume product and get excited about next product" consumers who only care about the latest popular games.

Hence why I wrote 200 fps: that covers the typical 144-165Hz gaming monitor VRR range. Obviously there's the super-niche use case of 240-500Hz monitors, but those people probably buy an Nvidia GPU in the first place.
 

RobRSG

Member
Support is key: if they can ensure compatibility, it will probably get good by the 3rd iteration of cards.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
I laughed at the last paragraph. Well yeah, as long as new games aren't dx9 only it's not bad news. Who cares about a library of thousands of existing older games? Certainly not PC gamers.

Right? RIGHT?


Seems like Intel copies from the market leader.

These GPUs are DOA.
 

BlackTron

Member
When your normal, non-nerd friend asks you to come over and help fix their computer, you look in the case and say "oh no". Thanks to Intel we can experience that feeling again, just like we did with eMachines or Gateway. Leaning in hard on nostalgia, Intel. Firing on all cylinders. But while you're skating by on BS, don't forget to, like, design a chip or something in the shadows, instead of just trying to connect your existing best CPU to the clock tower from Back to the Future. THX
 

coffinbirth

Member
Except for a handful of games (say, CSGO or Rocket League for DX9, and Minecraft for OpenGL), legacy performance for those APIs shouldn't matter too much as long as compatibility is good and performance is good enough. The difference between getting 200 fps and 500 fps in some game from 2005 is mostly academic from a user's POV. The legacy API where performance will really matter is DX11, as there are still some new games coming out with it occasionally (God of War and Stray, for instance), and many 2010-2015-era games like The Witcher 3 can still make the most expensive hardware grind to a halt.
I was agreeing with you until you mentioned Witcher 3...as that game can run on a potato. It's just not greatly optimized at full blast. You can actually get pretty good performance with integrated graphics. Shit, they shoehorned that bastard onto Switch after all!
 
Anyone who ever used an Intel iGPU and thought Intel knew how to write drivers for a dGPU product was delusional. It's hilarious that they are completely giving up on supporting anything old and just punting it over to an emulation layer written by Microsoft.

This is going more or less exactly how things at Intel have been going the last decade. In terms of companies which have never had any software competency whatsoever, Intel has been #1 on that list for decades. Too bad a GPU is both hardware and software, and you need to be good at software. Ask Nvidia and AMD for more information.
 

Shifty

Member
Hmm, to some extent this is apparently also open-source. Not sure how it compares, though.

That uh, hm.

Going through the readme, it seems like the program being translated from DX9 would have to be aware of the translation layer and actually request it from the OS, instead of it being an automatic override like DXVK or other similar DLL hooks that you drop in next to the game.

So it sounds like the expectation is for developers to integrate it with their DX9 titles via patch, though I guess Intel could override that in their driver to make it work with any old title.
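For reference, the explicit opt-in looks roughly like this. This is a sketch based on Microsoft's d3d9on12 header docs, not something I've compiled; `Direct3DCreate9On12` and `D3D9ON12_ARGS` are the documented entry points, but treat the exact usage here as approximate. Windows 10 2004+ only.

```cpp
// Sketch: explicitly requesting the D3D9On12 mapping layer instead of
// creating a plain D3D9 device. Requires <d3d9on12.h> from the Windows SDK.
#include <d3d9on12.h>

IDirect3D9* CreateD3D9OverD3D12()
{
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;  // force 9-on-12 even if a native D3D9 driver exists
    // args.pD3D12Device may optionally point at an existing D3D12 device so
    // both APIs share it; leaving it null lets the layer create its own.

    // 9On12-aware sibling of the classic Direct3DCreate9 entry point.
    return Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
}
```

From there the returned `IDirect3D9` is used exactly like a normal D3D9 object, which is presumably how a driver-level override like Intel's could slot it in under unmodified games.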

Though as a systems programmer, reading
the device will expose an IDirect3DDevice9On12 interface which enables applications to submit work to both the D3D9 API and the D3D12 API with lightweight sharing and synchronization
makes me die ever so slightly inside.

I get it, I do. It's for stuff like the Crysis remaster that has the OG code running with a DX12 layer on top to add modern niceties like RT, but bleurgh. If they were doing it properly they'd port the old stuff to the new API instead of trying to dual-wield :messenger_face_steam:
 

Drew1440

Member
The most popular game on steam - csgo only supports dx9.🤣
Wait what?
Didn't Valve introduce OpenGL support into their Source games shortly after introducing SteamOS? Why on earth are they still pushing updates for games that use such an old API?

As for Intel, dgVoodoo should help get those DX9 games working again.
 