This is what my control panel displays:
What happens when you change the dropdown of "prefered graphics processor" to the 970?
I don't even get that drop down. I'm guessing you have onboard graphics as well?
So we are through with the shit storm I guess? No free game codes?
Alright I am about to throw my fucking computer out of the fucking window.
So I uninstall EVERYTHING associated with the new GPU.
Reinstall cleanly and lo and behold, there it is - DSR SETTINGS! Hallelujah, right?
Nope!
So I am in the process of updating the drivers to the newest ones and what happens? Control Panel stops responding. I close out and re-open and guess what? No Fucking DSR settings.
I swear to god, if consoles were more powerful this gen I would throw every fucking gaming PC and part I ever bought into a pile and burn them while celebrating in joy. Fuck PC gaming sometimes.
I would hope for something like a trade-up scheme, with some cost, to a 980. The whole situation has soured me, unfortunately. My brother has two 970s in his rig and a 1440p monitor, which I recommended to him BECAUSE I thought it "had" 4GB...
Since I have a better paying job... for some reason... I'm willing to buy SLI 980s and dual G-Sync monitors lol. I know I'm going to get some shit for it though from some friends haha.
welp.
http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/
I will just leave this here....
970 SLI on an overclocked i7 @ 1440p.
welp.
I guess people have to decide how important Ultra textures are to them.
I have about a month to decide if I want to "step up" to a 980 for $160. idk if it's worth that vs just putting that $160 towards something that can run well with an Oculus Rift in 2016.
i just don't know.
I do not want to spend $550+ on a card, especially not when it's such an incremental upgrade over the $330 card.
$160 is a good chunk toward the next card which you can grab long before the 970's performance is done in.
I know the loss of effective use of that last 0.5GB sucks, but if Nvidia does some good memory management in their drivers, it won't be felt as much. Nothing we've learned so far has convinced me that the 980 is a better buy. The only choice I'd put forward is getting a 970 or waiting to see what the next line of cards brings.
It is the better buy for high-res or SLI setups that are aiming for higher than 1080p.
It's funny, I've seen the performance difference between the 970 and 980 touted as 5%, 10%, 15%, and 20%.
Which is it, folks?
I'd disagree. The 980 isn't that much better for that either, especially considering the price tag. We're talking half a gig here. It's not that significant.
If you really want higher resolutions for modern games with ultra textures, then you're waiting for another round of cards with more memory. The 980 just isn't worth the extra cash. You're paying something like 50% more for maybe 5% more performance. It just doesn't make sense.
The 970 packs a punch. It runs things on high for middle of the pack money. That hasn't changed.
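If you want it in back-of-the-envelope terms, here's a quick sketch using the $330/$550 figures from above; the 10% performance gap for the 980 is just an assumption, since reviews quote anywhere from 5% to 20%:

# Rough perf-per-dollar comparison. Prices are the $330/$550 figures above;
# the 1.10 relative performance for the 980 is an assumed ~10% average gap.
cards = {"GTX 970": (330, 1.00), "GTX 980": (550, 1.10)}
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")

# GTX 970: 3.03 perf per $1000
# GTX 980: 2.00 perf per $1000

Even with a generous gap, the 970 comes out way ahead per dollar.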
Personally speaking, Shadow of Mordor does not behave like those benchmarks at my end. I'm getting much better results (although I have a single card instead of two).
Monolith recommends a 6GB GPU for the highest possible quality level - and we found that at both 1080p and 2560x1440 resolutions, the game's art ate up between 5.4 and 5.6GB of onboard GDDR5.
Just saying, but if you want Surround you need 3 displays, and I'm not sure, but you might need 1 GPU per display if you use G-Sync (or, more generally, a DP connection).
Don't the developers themselves say Ultra textures require 6GB of VRAM anyway? Why would anyone expect it to run smoothly on 4?
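If anyone wants to check for themselves how much VRAM the game actually grabs while running, here's a minimal sketch using NVML via the nvidia-ml-py package (device index 0 is an assumption; adjust it if you have more than one card):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes your card is device 0
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {info.used / 1024**3:.2f} GB of {info.total / 1024**3:.2f} GB")
pynvml.nvmlShutdown()

Run it while the game is going and you can watch the usage climb past 3.5GB.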
Man, I would have waited longer, but my card can't play DX11 games, The Witcher 3 is coming out this May, and I'm doubtful smaller-than-28nm cards are going to be revealed and purchasable by then. It definitely is not 5% more performance, we know that. More like 15 to 20... and that's without even looking at the fact that the GTX 980 won't have this odd stutter problem and overclocks just as well.
I would argue EVERY bit of extra VRAM you can get for an SLI setup is extremely important. As someone who has had 3 SLI rigs now, where the bottleneck each time eventually became VRAM and not shading performance... I can only say VRAM is of the utmost importance. SLI doubles your shading power but mirrors, rather than pools, your memory, hence why the 980 is much more recommendable than it was before in light of everything that has come up.
Nonetheless, I agree with you that I will now be waiting for a GPU beyond even the 980, which I still find cost-inefficient for my preferences. Plus there is the knowledge that it is a 28nm GPU, and GPU performance has stagnated for about 3 years.
Is there any reason to ever install the drivers that Windows Update pushes on you? I'm having a problem with my PC staying asleep that I think only started happening after I installed my 970 G1, and was wondering if installing the WDDM signed drivers could resolve the problem. Do they typically offer the same functionality/performance as the ones from Nvidia's website?
I'm playing Shovel Knight on my 970. Ballin'
Actually I'm also playing the original Bioshock (which I never got far in before). Playing it at 3840x2400 (4x DSR), everything maxed out including .ini and Inspector tweaks, and 8x MSAA + 8x transparency supersampling. Runs @ a smooth 60fps.
I've hit the limit of my monitor's pixel density and that's preventing me from achieving perfect IQ, as I can't get rid of some fine jaggies that appear on thin/far away lines and edges. Makes me really want a 4K display to have that perfect pixel density that I'm familiar with from phones.
MFAA seems to be acting wonky for me though, it's as if it's decreasing performance instead of increasing it, even at lower AA settings. While other times it doesn't seem to have an effect at all.
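For anyone puzzling over the DSR numbers: the factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sanity check, assuming my 1920x1200 native panel (the function name is just illustrative):

import math

def dsr_resolution(width, height, factor):
    # DSR factors multiply the total pixel count, so each axis scales by sqrt(factor)
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1920, 1200, 4.0))  # -> (3840, 2400)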
What resolution is your monitor? My ROG Swift seems perfect at hiding the jaggies while downsampling from 4K. Makes me not want a 4K monitor, but if I need to run games at 4K to get rid of jaggies completely, then why not get one down the line. Still, I paid too much for the Swift; the next monitor I purchase will be a curved 21:9 144Hz 34-inch display with G-Sync, after at least 2 years.
It's 1920x1200 (16:10 ratio), the LG 2420R (IPS RGB LED 30-bit panel), which originally retailed for 1500 euros or so; somehow I managed to find two of them for 360 each back in 2011 and have been rocking them since. I'm honestly not sure if the remaining jaggies I see in Bioshock are due to the monitor's pixel density or if it's game related; I need to check other games. I also play on the Sony HMZ-T1, which has dual 720p OLED panels for 3D... it's low-res by today's standards, but downsampling from 5K with GeDoSaTo helps, and the immersion is unbeatable compared to the monitor. Was playing some Dead Space just now and the IQ was better than Bioshock, I think.
It also depends on the game. I read a while ago it has something to do with shaders or something like that; I'm ignorant when it comes to technical stuff. But I notice that with some games I can easily eliminate jaggies with FXAA (or at least make them hard to spot), while with games like Battlefield 4, even with 4x MSAA and FXAA, I cannot eliminate the ugly jagged lines.
I've been avoiding FXAA because people keep saying it blurs everything and thus sucks. Although it looks fine to me in screenshots, heh, but I'm OCD about using "the best" settings when I can. Yeah, I'm largely ignorant about the technical stuff too; I used to be more knowledgeable back in the day.