
As Nvidia/AMD push out the high-end standards, Intel ARC aims to PULL IN the low end.

Trogdor1123

Member
I don't understand how this makes sense. What is a decent card today is a slow card tomorrow. You can't “standardize” this stuff
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm probably late in reacting to Intel's new ARC discrete GPUs.

At first glance these GPUs could be overlooked in favor of Nvidia and AMD.

But looking at them a little differently, the ARC GPUs seem to standardize the low-end spec, establishing 1080p 60fps (across mobile, console and PC) as the baseline.

This is good for gaming.

Their ARC cards are being compared to the RTX 3070 Ti.....if that's low end, we are all in big trouble.

Not only that, but their Raytracing implementation may well be better than even Nvidia's (rumors point to their way of doing Raytracing being more performant than Nvidia's, to the point that DXR 1.0 is basically stupid to Intel engineers). ARC Alchemist is almost certainly not going to be attacking the RTX 3080 12G, but it's definitely not fighting with the RTX 3050.
I'd say the RTX 3050 and RTX 3060 are where the low end really is.
Note that the RTX 2060, which might soon become minspec, is still faster than an RTX 3050.
The RTX 3050 competes with the GTX 1660S...its only advantages are DLSS and Raytracing.

ARC is right at the mid-high range of things....just a little late to the party.
Battlemage is expected to be at 3090 Ti levels, but again a little late to the party.

Basically Intel are focusing on the mid-high range GPU sector.
If Intel always has a chip that fights with the xx70 then they will always (hopefully) be in the right sector to gain real market share.
IF their Raytracing and XeSS truly are as good as they've been reported to be.....then they are in for a good few years.

Nobody actually cares about RTX 3090s; that's a silly sector to focus on. The xx70 will forever be the golden child, with the xx60 (a true xx60, not "RTX 3060 barely faster than the card I'm replacing" bullshit) being the golden goose. Have a card that's cost performant against those cards and you are always set to make money and gain market share.
 

01011001

Banned
Ridiculous.

AMD will be fine.

AMD will only be fine if they bring their RT performance to the level of Nvidia's cards and if their cards start to support a DLSS equivalent.

if XeSS runs well enough on AMD hardware then Intel might have done that last step pretty much for them.
 

poppabk

Cheeks Spread for Digital Only Future
It's definitely a good time for them to jump in with all the changes going on. DLSS, ray tracing, etc. have made the GPU market a little less 'traditional'.
 

manfestival

Member
They did state some time back that they weren't going to be initially going after the high-tier cards like the 3080/3090+ (halo tier) / 6800 XT/6900 XT+ (halo tier). They were targeting "mid grade" and lower, so the 3070 and below. Seems like they squeezed out more juice and it's looking more like a 3070 Ti. Now, I also remember reading some time back that they wanted to price their cards to undercut the market. So yes, this would be a solid move for the competitive space, since they would be undercutting the competition and targeting the biggest group in PC gaming. Keep in mind that as far as Steam hardware goes, the 1060 is STILL relevant. Check the link and see for yourself. Granted, this is a voluntary survey, but most people do it just because. I believe the first run of their cards was said to be sold at really low pricing (relatively speaking), but we know that it would be close to impossible to get with scalpers existing. Granted, most of the pricing stuff is all speculative until they finalize actual numbers.

 

01011001

Banned
RT hasn't been a factor at all. General masses don't care about it.

they do if they look at what card to buy. would you rather get a card that can't run future games with RT, or one that more easily can?

RayTracing is the future, and who knows how fast games will use it more and more... we already have at least one big game out there that released a whole version that only works with RayTracing cards.

if you are currently looking at both AMD cards and Nvidia cards, and you'd be able to get both at MSRP (which thankfully seems to be becoming more of a possibility again), you'd be a fool to currently buy the AMD card instead of the Nvidia equivalent, quite frankly.

DLSS and the faster RayTracing hardware are just too much of an advantage currently.
and if you combine the 2, the advantage becomes even bigger: you already get better performance with RT on compared to AMD, and now you can also enable DLSS in most games for an additional performance boost, often even with an improvement in image quality thanks to the superior antialiasing that usually comes with DLSS compared to most games' TAA.

if that advantage stays that way AMD would have a really hard time.

XeSS being multiplatform could save AMD in the AI upsampling department, so now they would only need to get their RT performance up before it's too late.
 
I'm probably late in reacting to Intel's new ARC discrete GPUs.

At first glance these GPUs could be overlooked in favor of Nvidia and AMD.

But looking at them a little differently, the ARC GPUs seem to standardize the low-end spec, establishing 1080p 60fps (across mobile, console and PC) as the baseline.

This is good for gaming.
Problem is they've been 'coming' for 2 years now. They are gonna be so late; by the time they launch, Nvidia and AMD will have their next-gen GPUs out. Also, Intel has an uphill battle with the drivers; they've never cared about quality drivers, but now they need to. I think these GPUs won't hit their stride till their gen 3 or gen 4 versions launch. This is very much trial and error for Intel.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
AMD will only be fine if they bring their RT performance to the level of Nvidia's cards and if their cards start to support a DLSS equivalent.

if XeSS runs well enough on AMD hardware then Intel might have done that last step pretty much for them.
I've heard "if AMD can keep up with Nvidia" for so many years that I can't even remember when people started with this argument.
 

Ev1L AuRoN

Member
RT hasn't been a factor at all. General masses don't care about it.
I disagree. I think people that buy that type of hardware are very aware not only of RT but of DLSS as well. It's not the be-all and end-all of things, sure, but it is a very desirable feature, and no doubt weighs on the decision.
 

manfestival

Member
I partially agree with Santa. RT's appeal is exaggerated. It is definitely a factor for some, but the implementation still hasn't reached a point where it is all that important. Also, the technology is still taxing on the current generation of hardware. The 40 series will be the third generation, and maybe, just maybe, it will be worth considering then. Also, AMD, according to rumors (grain of salt, folks), will finally be putting dedicated ray tracing hardware on the 7k series.
As far as DLSS being thrown around: I also view that as more of a marketing appeal, like ray tracing. There are alternatives that exist (FSR being the main one), but most people either downplay them (this is inevitable) or just don't know they exist. Sadly, FSR is only now getting around to its second generation, but it is likely to see wider implementation than DLSS, since Nvidia is trying to lock DLSS behind the 20 series and up, whereas AMD has the "advantage" of providing for both consoles, which means developers, with their limited resources, MAY be more inclined to develop for FSR. I never cared for DLSS or FSR, but these things dominate the conversation these days. I just hate the shimmering, ghosting, and blurriness that comes with the tech.
 

PaintTinJr

Member
AMD might be in trouble.
I would expect them to see margins hit but volume increase, because they'd then have a value-focused competitor to play off of and help raise awareness of their products when they focus on decent performance at widest-market pricing.

Nvidia might even be pushed out of the laptop market, except for the top-tier workstation and premium tier, because the discounting Intel and AMD might do in a competitive mid-to-low market with their high-end mobile CPUs might make the price difference too much for mobile RTX 3050/3060/3070-level products to pair with an AMD or Intel CPU.
 

Buggy Loop

Member
Even low end AMD/Nvidia GPUs are better. The only advantage Intel has is in vendor relations.

What? Did I miss the top Arc embargo for benchmarks or something? The A700 series is in the wild? Cause last I heard, it was supposed to fight against the 3070, which is no slouch and probably the best market share to attack (the 3080 & 3090 barely make a dent in market share compared to the 3060/3070 range)
 

SlimySnake

Flashless at the Golden Globes
lol take one look at how poorly even the 2080 and 2070 are performing in that game and ask yourself if anyone should be entering the 1080p market in 2022.

PC gamers are not going to settle for 30 fps, which means they need whatever the PS5 and XSX specs are, doubled, to get a 60 fps mode. That means a 3080 or better going forward. The console makers have phoned it in this gen so far, but next-gen games are coming, and if you are buying a card today to last you the next 3 years, you should not be buying anything around 10 tflops. Even with Intel's XeSS and DLSS you are going to need at least a 3070-level card to get 60 fps in next-gen-only games that are targeting 30 fps on the PS5 and XSX.
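
For rough scale, here's a back-of-envelope sketch of that doubling argument in paper TFLOPS (published peak FP32 figures only; frame rates don't actually scale linearly with TFLOPS, so treat this as illustrative, not a benchmark):

[code]
# Back-of-envelope: "double the console GPU" in paper FP32 TFLOPS.
# Published peak figures; real game performance doesn't scale linearly.
ps5_tflops = 10.28   # PS5 GPU
xsx_tflops = 12.15   # Xbox Series X GPU

for name, tf in [("PS5", ps5_tflops), ("XSX", xsx_tflops)]:
    target = tf * 2  # naive 30fps -> 60fps doubling at matched settings
    print(f"{name}: {tf:.2f} TF -> ~{target:.2f} TF for a 2x frame-rate target")

# A desktop RTX 3070 is ~20.3 paper TF and a 3080 ~29.8 TF, which is roughly
# where the "3070 to 3080 or better" argument lands (Ampere's dual-FP32 design
# inflates its paper number, so take the comparison loosely).
[/code]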
 

Buggy Loop

Member
… and ask yourself if anyone should be entering the 1080p market in 2022.

/squints at Steam hardware survey..

Yes actually, it's by far the most adopted resolution. People have to understand that 4K didn't make a dent on PC; it has even less market share than the already niche ultrawide format.

People have the totally wrong perception that the high end matters on PC, an epeen war of top cards versus console games, as if it mattered at all. Hint: it does not. PC is popular due to scalability, super popular multiplayer games that can run on a potato, and the affordable GPUs in the xx60 range at 1080p resolution. Even top egamers often play at 1080p, but at >300 fps refresh rates.

Intel legit has the most logical entry point into this market if they can attack the xx60/xx70 segment. The main problems are getting the price right, good drivers, and manufacturing enough cards to make a difference.
 

SlimySnake

Flashless at the Golden Globes
/squints at Steam hardware survey..

Yes actually, it's by far the most adopted resolution. People have to understand that 4K didn't make a dent on PC; it has even less market share than the already niche ultrawide format.

People have the totally wrong perception that the high end matters on PC, an epeen war of top cards versus console games, as if it mattered at all. Hint: it does not. PC is popular due to scalability, super popular multiplayer games that can run on a potato, and the affordable GPUs in the xx60 range at 1080p resolution. Even top egamers often play at 1080p, but at >300 fps refresh rates.

Intel legit has the most logical entry point into this market if they can attack the xx60/xx70 segment. The main problems are getting the price right, good drivers, and manufacturing enough cards to make a difference.
Read the rest of my post please. The games people are running at 1080p 60 fps right now were designed to run at 1080p 30 fps on a 1.8 tflops console GPU. That's going to change soon. We are not going to be in cross-gen forever. The 1060 is 2.5x more powerful than the PS4, and that's why it can run almost everything at 60 fps. That won't be the case when 10 tflops becomes the base.
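
(Quick sanity check on that 2.5x figure, using paper FP32 peaks only; again, paper TFLOPS are a rough proxy, not a benchmark:)

[code]
# Rough check of "the 1060 is 2.5x more powerful than the PS4" via paper TFLOPS.
gtx_1060_tflops = 4.4  # GTX 1060 6GB, peak FP32
ps4_tflops = 1.84      # base PS4 GPU

print(f"GTX 1060 / PS4 = {gtx_1060_tflops / ps4_tflops:.1f}x")  # ~2.4x
[/code]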
 

CuNi

Member
Read the rest of my post please. The games people are running at 1080p 60 fps right now were designed to run at 1080p 30 fps on a 1.8 tflops console GPU. That's going to change soon. We are not going to be in cross-gen forever. The 1060 is 2.5x more powerful than the PS4, and that's why it can run almost everything at 60 fps. That won't be the case when 10 tflops becomes the base.

But you also forget that the 10 tflop base is aimed at 4K games. The 1060 most likely won't be enough for 1080p60, but I can see it still pushing 30fps on mixed settings for a while.
 
Make me money, Intel :) We need a big shake up to the graphics market.
You're high if you think Intel cares about maintaining a low-margin position in the GFX space. As soon as they make any gains in market share, they'll sit right up there with AMD and Nvidia. Shareholders only care about profit.

I guess if the shake-up you want is 3 GFX designers ripping you off instead of 2, it makes sense.

Also, people calling this trouble for AMD is amusing. People love to complain about AMD's drivers. Intel's drivers are worse.

They'll be fine.
 

Tams

Member
AMD will only be fine if they bring their RT performance to the level of Nvidia's cards and if their cards start to support a DLSS equivalent.

if XeSS runs well enough on AMD hardware then Intel might have done that last step pretty much for them.
Raytracing is still a meme.

DLSS is alright, I guess. FSR is good enough though.
 
I'm probably late in reacting to Intel's new ARC discrete GPUs.

At first glance these GPUs could be overlooked in favor of Nvidia and AMD.

But looking at them a little differently, the ARC GPUs seem to standardize the low-end spec, establishing 1080p 60fps (across mobile, console and PC) as the baseline.

This is good for gaming.

Consoles have been doing 4K (or close to it) at 60fps for a while now. And Intel's pricing would push them well out of reach for console manufacturers. So I don't get the console mention.

Intel has also yet to reveal their pricing, which is important in judging which of their competitors' chips these stack up against.
 
Shareholders care about profit.
I certainly do, which was the point of my post 😁

Weird how you try to spin more competition as a bad thing though. They're already ahead of AMD in having a proper DLSS competitor, for example.

My main concern is that it's Raja... but he did design some decent stuff, like Polaris and Vega 56. For the server side of the business it was probably a good move to pick him up.
 

KungFucius

King Snowflake
I want a decent, smallish form-factor GPU that isn't a ripoff for a secondary gaming unit. I couldn't fit my 1080 in it when I upgraded, and I couldn't score a 3060 Ti easily enough, so I gave up. I would try an Intel GPU, though their fucking iGPU on that machine died, and it was a bitch to figure out that it was just the iGPU and not the mobo. I also used to work for them and they are a shit employer. Regardless, a decent GPU that will work in an HTPC / family gaming setup is something I will be in the market for once the market can provide me something.
 
I certainly do, which was the point of my post 😁

Weird how you try to spin more competition as a bad thing though. They're already ahead of AMD in having a proper DLSS competitor, for example.

My main concern is that it's Raja... but he did design some decent stuff, like Polaris and Vega 56. For the server side of the business it was probably a good move to pick him up.
I'm not saying competition is bad. I'm just saying competition probably won't mean what you think it will.
These manufacturers will do everything in their power to keep prices high.
 

SeraphJan

Member
People seem to forget RDNA3 will have way better power efficiency compared to Lovelace. RT will still lag behind, but it's going to be very close this time. FSR 2.0, although without ML, still looks pretty good, since it's a temporal solution, meaning better than FSR 1.0. It all comes down to pricing: if they price it fairly they are going to be fine, unless AMD gets arrogant.

As for Intel ARC, I'm looking forward to XeSS
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
People seem to forget RDNA3 will have way better power efficiency compared to Lovelace. RT will still lag behind, but it's going to be very close this time. FSR 2.0, although without ML, still looks pretty good, since it's a temporal solution, meaning better than FSR 1.0. It all comes down to pricing: if they price it fairly they are going to be fine, unless AMD gets arrogant.

As for Intel ARC, I'm looking forward to XeSS
Stares at 5000X series launch prices.
Yeah, AMD are gonna be arrogant if they feel they've done enough to trump Nvidia at pricepoint X.
But let's be real.....no one is actually gonna buy the AMD cards, right.....right?
SteamSurvey has one RX 6000 card on it....the 6700.....there are more 3090s on Steam than there are RX6-anythings; that's rough.
 
Low end at high price knowing Intel.
They really can't. They would have gotten away with it more than 6 months ago, but when they release in a few months' time they need to be the cheaper option if they hope to grab any market share at all, because by that time AMD and Nvidia cards will be close to their respective MSRPs. And who would pick a card from a new and unproven player if it is just as expensive as, or more expensive than, the competition?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
They really can't. They would have gotten away with it more than 6 months ago, but when they release in a few months' time they need to be the cheaper option if they hope to grab any market share at all, because by that time AMD and Nvidia cards will be close to their respective MSRPs. And who would pick a card from a new and unproven player if it is just as expensive as, or more expensive than, the competition?
Yup.
Their MSRP will almost certainly be ~500 dollars for the A770.
Off-hand calculations put its TFLOPs right at 6700 XT levels of performance....the 6700 XT is AMD's most popular card.
Assuming the drivers are on point, it's priced properly, and it can handily beat the RTX 3070,
then Intel have themselves a really good product that, supply permitting, will easily make the Steam Survey charts.
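
For anyone wanting to redo that off-hand math: the usual paper estimate is shader count x 2 FLOPs per clock x clock speed. A sketch below, where the A770 figures (4096 ALUs at ~2.0 GHz) are rumored, not confirmed specs:

[code]
# Generic peak-FP32 estimate: shaders * 2 FLOPs/clock * clock(GHz) / 1000 = TFLOPS.
def peak_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

a770 = peak_tflops(4096, 2.0)       # rumored Arc A770 config -> ~16.4 TF
rx6700xt = peak_tflops(2560, 2.58)  # RX 6700 XT at boost clock -> ~13.2 TF
print(f"A770 ~{a770:.1f} TF vs 6700 XT ~{rx6700xt:.1f} TF")
# Paper TFLOPS don't compare cleanly across architectures, which is why a
# nominally higher-TF Arc part could still land at "6700 XT-ish" in games.
[/code]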




If they actually have this design then they are also really handsome cards in reference form.
[image: Arc reference card render]
 

SeraphJan

Member
Stares at 5000X series launch prices.
Yeah, AMD are gonna be arrogant if they feel they've done enough to trump Nvidia at pricepoint X.
But let's be real.....no one is actually gonna buy the AMD cards, right.....right?
SteamSurvey has one RX 6000 card on it....the 6700.....there are more 3090s on Steam than there are RX6-anythings; that's rough.
I'm talking about RDNA3 and Lovelace
 

winjer

Gold Member

Intel Arc Alchemist desktop series including A770, A750, A580 and A380 SKUs reportedly delayed till late Q2/early Q3


If this becomes fact, then Arc will release very close to RDNA3 and Ada Lovelace, while having performance similar to Ampere and RDNA2.
And with the probability of a flood of used GPUs from mining, it might mean that Arc won't be able to compete with the new generation in performance, and won't be able to compete with the old generation in price.
Intel could have had a great opportunity to make an impact if they had released in early 2022. But if this rumor becomes true, then it's going to release at the worst time.
 
I agree, their iGPUs were total shit. Now they have raised the bar. This is also good for pixel pushing, because Windows laptops have stylus pen support, which is needed for photopaint and various other drawing and painting software that supports Windows Ink. If laptops get 120Hz refresh rates with VRR, the latency for the pen is gonna be buttery smooth, just like the Samsung Galaxy Note :messenger_savoring:

They now just need to improve the front cameras on laptops, cause they are still shit, and NVIDIA needs to make their own x86 CPU for better competition!
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm talking about RDNA3 and Lovelace
AMD are almost certainly gonna price RDNA3 at or above Nvidia's competing product.
From the rumor mill, RDNA3 is 2.5x more powerful than RDNA2, which should translate to AMD having the more powerful raster hardware in the coming generation, like by quite a margin, if Ada is "only" 2x as powerful as Ampere.
The same way they had a better chip in Ryzen 5000 and priced up accordingly.....I'm pretty confident they will price their RX 7000 GPUs higher than Nvidia's.

I'd hope to be proven wrong.
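
For scale, a back-of-envelope sketch of what those rumor-mill multipliers would mean in paper TFLOPS, using the current flagships as baselines (the 2.5x and 2x figures are rumors, not specs):

[code]
# Rumored generational multipliers applied to current flagship paper TFLOPS.
rx_6900xt_tf = 23.0  # RDNA2 flagship, ~23 TF FP32
rtx_3090_tf = 35.6   # Ampere flagship, ~35.6 TF FP32 (dual-FP32 counted)

print(f"RDNA3 rumor: ~{rx_6900xt_tf * 2.5:.0f} TF")  # ~58 TF if 2.5x holds
print(f"Ada rumor:   ~{rtx_3090_tf * 2.0:.0f} TF")   # ~71 TF if 'only' 2x
# The baselines aren't directly comparable (Ampere's dual-FP32 inflates its
# paper number), so the raw multipliers alone don't settle who wins raster.
[/code]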
 

Shubh_C63

Member
Realistically, Intel ARC ONLY has to nail games like DotA, LoL, PUBG and Fortnite.

There are a lot of people who don't know squat about games and just want to play these games on their laptops.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Yup.
Their MSRP will almost certainly be ~500 dollars for the A770.
Off-hand calculations put its TFLOPs right at 6700 XT levels of performance....the 6700 XT is AMD's most popular card.
Assuming the drivers are on point, it's priced properly, and it can handily beat the RTX 3070,
then Intel have themselves a really good product that, supply permitting, will easily make the Steam Survey charts.

If they actually have this design then they are also really handsome cards in reference form.
[image: Arc reference card render]
I look at that and all I can see is piss poor airflow...
 

JohnnyFootball

GerAlt-Right. Ciriously.

Intel Arc Alchemist desktop series including A770, A750, A580 and A380 SKUs reportedly delayed till late Q2/early Q3


If this becomes fact, then Arc will release very close to RDNA3 and Ada Lovelace, while having performance similar to Ampere and RDNA2.
And with the probability of a flood of used GPUs from mining, it might mean that Arc won't be able to compete with the new generation in performance, and won't be able to compete with the old generation in price.
Intel could have had a great opportunity to make an impact if they had released in early 2022. But if this rumor becomes true, then it's going to release at the worst time.
Intel may need to consider selling these at a loss in order to simply break through. I'm hopeful and optimistic for Intel's success, but I still live by what I have always found to be true:

NEVER TRUST RAJA KODURI. He is the king of overpromising and underdelivering.
 

Reallink

Member
Intel may need to consider selling these at a loss in order to simply break through. I'm hopeful and optimistic for Intel's success, but I still live by what I have always found to be true:

NEVER TRUST RAJA KODURI. He is the king of overpromising and underdelivering.

They would have to eat launch PS3 losses to compete with next generation dGPUs. Sounds like Intel is dead before they even get to the starting line.
 

JohnnyFootball

GerAlt-Right. Ciriously.
They would have to eat launch PS3 losses to compete with next generation dGPUs. Sounds like Intel is dead before they even get to the starting line.
Intel could afford to do that, though. I would absolutely love for them to do it, since it could potentially provide a nice pricing reset on the spiraling-out-of-control GPU market. The days of a sub-$400 price point for the 70-series of Nvidia cards are likely over, but sub-$500 would be a huge boon.

I have no problem admitting that I have no faith in Intel's GPUs and fully expect them to be a failure, but I very much hope and will be happy if I'm wrong.
 

JohnnyFootball

GerAlt-Right. Ciriously.
AMD are almost certainly gonna price RDNA3 at or above Nvidia's competing product.
From the rumor mill, RDNA3 is 2.5x more powerful than RDNA2, which should translate to AMD having the more powerful raster hardware in the coming generation, like by quite a margin, if Ada is "only" 2x as powerful as Ampere.
The same way they had a better chip in Ryzen 5000 and priced up accordingly.....I'm pretty confident they will price their RX 7000 GPUs higher than Nvidia's.

I'd hope to be proven wrong.
Let's be real here. Isn't it great that we can now have confidence that AMD can pull something like this off? AMD being in a place where their cards can trade blows with Nvidia's is something considered unfathomable even two years ago. The most impressive part is that it's not like Nvidia has stumbled or done a shitty job like Intel did with their stagnation getting to 10nm. Having said that, Nvidia is such a household name that the 12GB 3080 at $1000 will likely be preferred over a $1000 6900 XT, despite the fact that in non-RT applications the 6900 XT usually offers superior performance.
 