
Intel Arc GPUs review thread

DaGwaphics

Member
No one is throwing away GPUs because of something that simple.
You are really insane if you believe that.

We will agree to disagree on that. I think you're insane for believing otherwise, tbh.

If the fans are dying early in the useful life of the product, that's a problem, but most fans will far outlive the usefulness of any GPU. Combine that with the average customer not being as quick to tinker, and you'll end up with most of these in the landfill when the fans break.
 

winjer

Gold Member
We will agree to disagree on that. I think you're insane for believing otherwise, tbh.

If the fans are dying early in the useful life of the product, that's a problem, but most fans will far outlive the usefulness of any GPU. Combine that with the average customer not being as quick to tinker, and you'll end up with most of these in the landfill when the fans break.

Fans can break; it's not common, but it happens. What is common is a GPU getting filled with dust, or the thermal paste losing some effectiveness.
And no one is throwing away a GPU that cost hundreds of dollars just because of such small things.
If a whole bunch of capacitors blew up, I could understand it, although even those can be replaced.

Just look at Steam: the most used card is a GTX 1060. Do you think millions of people are throwing away their GPUs because they can't do basic maintenance? Get real.
 
Buy cheap Intel GPU to play DX12 and raytraced games
Swap in your old Nvidia GPU for DX11 games
 
lol that teardown video. 56 screws in a "screwless" design. It's designed poorly from an industrial design perspective, while also being extremely difficult to maintain. Lol at the glue and double-sided tape.

I would avoid Intel cards until they fix those drivers.
 

DaGwaphics

Member
Fans can break; it's not common, but it happens. What is common is a GPU getting filled with dust, or the thermal paste losing some effectiveness.
And no one is throwing away a GPU that cost hundreds of dollars just because of such small things.
If a whole bunch of capacitors blew up, I could understand it, although even those can be replaced.

Just look at Steam: the most used card is a GTX 1060. Do you think millions of people are throwing away their GPUs because they can't do basic maintenance? Get real.

I'm not saying repairing them isn't a thing. I myself would probably give it a go if a fan broke early. That doesn't change the fact that the majority of these things will never be opened for any purpose whatsoever. Same with your monitor or Blu-ray player or TV. Can an owner make repairs? Sure. Will a meaningful number of real-world users actually try to do that? NO :messenger_tears_of_joy: It is what it is.

Regarding the 1060s, no, I don't think a meaningful number of those users have done one bit of maintenance on them at all. Maybe some superficial dusting of the exterior.

I often try to find a teardown video of a GPU I'm interested in, but I'm looking more at whether the card works correctly or not: does the cooler make contact where it should, are the component temps okay, that kind of thing.
 

BreakOut

Member
It will probably take time for them to build up proper driver support? I feel like AMD has that shit on lockdown, but it came over time.
 

CrustyBritches

Gold Member
He's got different prices from what I thought these were going for. I thought the 770 LE was $350?

I'm significantly less interested in the 770 if it is $380 or $390 vs. $350. LOL
Everywhere else is saying the A770 is $349 for the 16GB, $329 for 8GB, and $289 for the A750.
 

Buggy Loop

Member
So, an interesting first go, at least for RT & AI performance. The price is OK, and I'm sure the most glaring performance issues, à la CS:GO, will be fixed in a driver; I doubt it's just because of legacy DX9.

Also, you have to think about what you would actually use this card for. Are you really an e-sports gamer who will play competitive shooters at >350 fps? Even I have a 3080 Ti, and I don't give a shit beyond 120 fps.

BUT

It's so strange that they have more transistors and silicon area than Nvidia's 3070, ditched legacy DX feature sets to dedicate more silicon to DX12 performance, and still managed to perform worse. I mean, I'm not that surprised; it's not Nvidia's first rodeo. But still, somewhere they mangled their silicon.

By the 2nd or 3rd iteration they might honestly become a serious contender.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
The problem today is no longer who's making the fastest cards.

Imo the struggle is whether the cards support FreeSync or G-Sync.

If your monitor is dedicated to only one of these, then you are guaranteed to keep buying GPUs from one manufacturer.

I can't imagine playing without G-Sync.
 

RoboFu

One of the green rats
Saw Jayz2cents' review and I am impressed. These cards are really good even without the driver optimizations. Given that it's Intel's first go, driver optimizations should bring significantly more performance. The RT performance is especially impressive. Much better than AMD.
I hope they succeed. We really need some competition.

Nah, I doubt its RT will be better than RDNA3, which is the card line launching in the same time frame, but maybe 🤷‍♂️
 

DaGwaphics

Member
Everywhere else is saying the A770 is $349 for the 16GB, $329 for 8GB, and $289 for the A750.

Yeah. I'm genuinely interested in the A770 @ $350. It has tantalizing upside at that price point when you consider that, even with the state of the drivers, some of the titles specifically optimized for it can approach 3060 Ti/3070 levels of performance. As the drivers improve it could be a real smart buy for the money on new titles. It might even hold up when the next wave of cards hits, because the 4060 is rumored to only bench around the 3070; in that scenario you might have similar performance with 8GB more VRAM for less money (assuming Nvidia pushes the price upwards on the 4060). At $40 or $50 more you lose a bit of the value, and it gets harder to overlook the performance deficits on older versions of DX.

It will be one of those cards for a certain kind of buyer. I can see how, if you already have an RX 5600 or RTX 2060, this might not be a good upgrade, since performance would likely drop in many older games; it would depend on the type of games you play, I guess. For me, I've got the old 960, so it might be more of a wash or a modest improvement in most titles (looks like AC Unity would still be a loss, though, LOL).

I'll keep an eye on the reviews and benchmarks that pop up. I might try to jump on that LTT live stream and ask them to test the UE5 City Sample; I'm curious about how that runs (especially since the UE4-based Gears didn't do as well as most of the other newish titles). FreeSync support would be a big one too; I can't see getting a new GPU without that. If they don't have it yet, I'd hope it would be on the roadmap in short order.
 

Neo_game

Member
Unfortunately, Intel's biggest problems are the requirement for a newer CPU and the DX11 performance. The power draw, even at idle, is also too high. The impressive part is the RT performance and how well it scales with resolution: it is more competitive at 1440p, and even at 4K, than at 1080p. I hope they can fix the power usage and DX11 performance via software, and don't give up despite some negative reception.
 

poppabk

Cheeks Spread for Digital Only Future
No one is throwing away GPUs because of something that simple.
You are really insane if you believe that.
I'm pretty sure he is right. Most people won't know it's the fan; the card will just start acting janky, and they will dump it and buy a new one. A lot of people don't even build their own PC. You really think they are taking apart their GPU?
 
Buy cheap Intel GPU to play DX12 and raytraced games
Swap in your old Nvidia GPU for DX11 games
Would be funny if, for certain games, a Ryzen APU actually handled DX11 better than the certainly much beefier but somehow struggling Arc.


The first review videos in this thread were not really encouraging, but watching some other videos, where at least RT and more modern games were emphasized (XeSS not so much), it looked more promising. The hardware seems decent, and XeSS might add a bit of value, so maybe it's actually in 3070 territory. But drivers for the thousands of old, and probably also new, small, terribly optimised Unity indie games, which are usually not in reviews at all, will be a major headache for months and years, and the high power draw melts the price advantage away too. It seems to be a proper option, but only for adventurous adopters who expect better drivers soon, less a carefree recommendation for everyone.
 

winjer

Gold Member

The results may vary, but overall, Arc really may lose up to a quarter of its performance without ReBAR. According to TPU's tests, the lack of PCIe Gen4 support is not as important as the lack of ReBAR. The older PCIe Gen3 standard only shows a 0-2% variance throughout the tests. On the other hand, the lack of ReBAR may leave only 76-80% of full performance, which is a significant difference.

What this means is that systems that lack Gen4 support but still have ReBAR can be used with Arc GPUs without major performance drawbacks. This includes Intel 10th Gen Core or Ryzen 5000G series CPUs, which are limited to the Gen3 standard, but also many systems that received ReBAR support unofficially (such as Intel Coffee Lake).
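
A quick way to sanity-check ReBAR on your own rig before buying: on Linux the kernel exposes every PCI BAR in sysfs, and with ReBAR active the GPU's VRAM aperture grows past the classic 256 MiB window. Here's a rough Python sketch of that check (my own heuristic, nothing from the TPU article; the 256 MiB threshold and the sysfs layout are the assumptions):

Code:
#!/usr/bin/env python3
# Rough ReBAR check on Linux: reads /sys/bus/pci/devices/*/resource and
# flags any display controller whose largest memory BAR exceeds the
# classic 256 MiB window, a common sign that Resizable BAR is active.
from pathlib import Path

PCI_ROOT = Path("/sys/bus/pci/devices")
GPU_CLASS_PREFIX = "0x03"  # PCI class 0x03xxxx = display controller

for dev in sorted(PCI_ROOT.iterdir()):
    if not (dev / "class").read_text().startswith(GPU_CLASS_PREFIX):
        continue
    largest = 0
    # Each line of the resource file is "start end flags" in hex;
    # unused BARs are all zeros and get skipped.
    for line in (dev / "resource").read_text().splitlines():
        start, end, _flags = (int(tok, 16) for tok in line.split())
        if end > start:
            largest = max(largest, end - start + 1)
    status = "likely ON" if largest > 256 * 1024 * 1024 else "likely OFF"
    print(f"{dev.name}: largest BAR {largest // 2**20} MiB -> ReBAR {status}")

If the biggest BAR still reports 256 MiB, check that Above 4G Decoding and Resizable BAR are both enabled in the BIOS. On Windows, GPU-Z's Resizable BAR field tells you the same thing.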
 

DaGwaphics

Member

Good to know. I was wondering about the PCIe 3.0 support. A lot of the better-value ITX boards for CPUs with ReBAR only offer 3.0 on the x16 slot.
 