
Nvidia CES 2019: Adaptive Sync Support and RTX 2060 coming January 15th. RTX Laptops coming January 28th.

Leonidas

Member
Just announced at CES 2019.
RTX 2060 was shown running BFV at 1440p60 with RTX on
The RTX 2060 is 60 percent faster on current titles than the prior-generation GTX 1060, NVIDIA’s most popular GPU, and beats the gameplay of the GeForce GTX 1070 Ti.
Source

Adaptive Sync Support is coming to Nvidia GPUs starting January 15th.
RTX Laptops will be out by the end of the month.

This laptop has twice the performance of a PS4 Pro.
Source

Great move by Nvidia to support adaptive sync, as a lot more people will be able to experience variable-refresh gaming. The RTX 2060 at $349 is easily the best-value RTX GPU, and it performed well with ray tracing on in the BFV demo.

Adaptive-Sync on Nvidia is huge; it's one of the only reasons I've considered an AMD GPU in the past few years, and that reason gets erased with an upcoming driver update.
 
Last edited:
I'll take a wait and see on the RTX 2060.

Same MSRP as 1070 in 2016 but 2GB less memory.

Adaptive Sync support is only certified for 12 monitors.
 
Last edited:

dirthead

Banned
You know Nvidia's only doing it because they don't want to be left out in the cold regarding HDMI 2.1. Still a good thing, though. Not complaining.
 
Last edited:

twdnewh_k

Member
Glad to see Nvidia back down from the G-Sync-only approach; that's a big change in stance. They must be starting to worry about AMD gaining traction.

While the $350 price on the 2060 isn't terrible, it isn't great either. I'll wait to see some more benchmarks, but this might be my new GPU.
 

JohnnyFootball

GerAlt-Right. Ciriously.
You know Nvidia's only doing it because they don't want to be left out in the cold regarding HDMI 2.1. Still a good thing, though. Not complaining.
How will they be left out in the cold? Nvidia can still refuse to support HDMI 2.1, opt to make G-Sync an exclusive tech and their GPUs will still sell as long as they offer the best performance.

I am grateful they are NOT doing that obviously, but pressure from AMD is the only reason they are doing it.
 
Last edited:

Soltype

Member
I don't know why everyone's happy about adaptive sync; G-Sync costs more, but it works better. I'd rather they have two options, one premium and one not.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I don't know why everyone's happy about adaptive sync; G-Sync costs more, but it works better. I'd rather they have two options, one premium and one not.
They do. Proprietary G-Sync monitors are still going to exist. Nvidia has opened the door for G-Sync-lite which I imagine will be fine.
 

Alexios

Cores, shaders and BIOS oh my!
I don't know why everyone's happy about adaptive sync; G-Sync costs more, but it works better. I'd rather they have two options, one premium and one not.
Nvidia says here that their approved monitors deliver a G-Sync experience. If you tested on older or lesser monitors than the ones they end up approving, or with weaker graphics cards (which previously meant AMD by necessity), or in otherwise sub-par conditions, why would your results be more valid than Nvidia's? Nothing about this says they'll stop selling G-Sync monitors either, and FreeSync isn't necessarily non-premium: there are expensive monitors that are top-spec across the board, and you may want to keep them at full capability, primary or secondary, even if you switch GPU brands from AMD to Nvidia or vice versa.
 
Last edited:
Nvidia fucked the tech world with their proprietary G-Sync bullshit. They needlessly created a format war where G-Sync had no tangible benefits over FreeSync. As an Nvidia owner for the past 15 years, I am so happy. I want Nvidia to eat crow and pay the piper for business practices that have needlessly hurt the games industry through fragmentation.
Nvidia is not alone here. Apple and Sony have gotten their feet toasted heavily for pushing tech standards down consumers' throats, only to see them become abandoned.

Tech parity standards are important. It's important that different hardware speaks together. When hardware companies needlessly try to peddle closed, walled-garden standards (like MiniDisc, FireWire, etc.), the result is that consumers get caught in the middle of a format war between two near-identical technologies that are not different enough to warrant the massive cost of avoiding compatibility.
There was never a legitimate case for G-Sync. FreeSync was the open standard. It did everything. But Nvidia thought they could use their market share to peddle their own standard, which wasn't better, charge a markup, and then get other companies to pay a royalty to use that standard even though there was already a free and available one.

Nvidia is almost comically evil in their douchiness, and it's hard for consumers (even Nvidia fans) not to enjoy them getting pie on their face for all this fuckery. What a terrible year 2018 has been for them in the wake of all the greed, markups, lies about inventory, the release of the RTX cards and distorting the truth with regard to their crypto boom.
The only major company that had a worse 2018 than Nvidia was Intel. For these two companies, who've DOMINATED the enthusiast market since the early-mid 2000s, 2019 brings an air of potential change. People want to see Intel and Nvidia significantly change their ways, and even their fans are rooting for AMD to take a big piece of the pie. Go AMD!
 

Leonidas

Member
I'll take a wait and see on the RTX 2060.

Same MSRP as 1070 in 2016

1070 MSRP in 2016 was $379/$449, and most 1070s were selling at $420 or higher at launch.

FE to FE, the 2060 is $100 cheaper, is a bit more powerful, has much better performance in more modern engines, supports ray tracing, supports DLSS, comes with Anthem, has higher memory bandwidth, and comes on a bigger, more expensive-to-produce chip.

If you want to paint that as bad value then fine, I guess, but it's far and away the best-value card at the $349 MSRP.
 

Ivellios

Member
Considering this comes with RTX and now FreeSync support, it looks like it could be my next upgrade.

Let's see now what AMD Navi has to offer
 

dorkimoe

Member
Wow... support for FreeSync, aka monitors that are half the price of G-Sync ones. Might be time to upgrade my second monitor
 

Soltype

Member
Nvidia says here that their approved monitors deliver a G-Sync experience. If you tested on older or lesser monitors than the ones they end up approving, or with weaker graphics cards (which previously meant AMD by necessity), or in otherwise sub-par conditions, why would your results be more valid than Nvidia's? Nothing about this says they'll stop selling G-Sync monitors either, and FreeSync isn't necessarily non-premium: there are expensive monitors that are top-spec across the board, and you may want to keep them at full capability, primary or secondary, even if you switch GPU brands from AMD to Nvidia or vice versa.
https://www.pcworld.com/article/297...rate-displays-make-pc-games-super-smooth.html

Going from this article, it seems Nvidia has stricter standards. I just hope they continue to make a steady stream of G-Sync monitors from here on out.
 

dirthead

Banned
Really, every screen should support at least 24Hz → max refresh rate just so you can play movies without juddering. It's kind of sad that people haven't prioritized better video-player programs that take advantage of VRR for movies. Judder in movies is so fucking annoying.

People with Nvidia cards can take advantage of FreeSync on this monitor now, too.

https://www.theverge.com/circuitbre...d-monitor-resolution-ur59c-curved-4k-ces-2019
 
Last edited:

CuNi

Member
Why do we have two of those threads now?

Anyway, just as I said in the other one, this only works (if we can take Nvidia at its word) on 10-series and 20-series GPUs, which I think is a waste.
My 970 paired with my 144Hz AOC FreeSync monitor feels betrayed by this move. I hope someone will hack those drivers and enable it on 900-series GPUs too, or at least find a way to mask the 970 and report it as a 1060 or so to the driver to use FreeSync.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Really, every screen should support at least 24Hz → max refresh rate just so you can play movies without juddering. It's kind of sad that people haven't prioritized better video-player programs that take advantage of VRR for movies. Judder in movies is so fucking annoying.

Juddering is largely non-existent on TVs these days since most no longer use 3:2 pulldown and instead just drop to 48Hz so they can display each frame an even number of times. While judder is largely gone, stuttering of 24fps content is still an issue. That's my knock against my LG OLED. The stutter can be bad at times.

I am very curious to see what effect VRR can have on TVs. That could be a nice game changer.

Or it could look terrible.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Why do we have two of those threads now?

Anyway, just as I said in the other one, this only works (if we can take Nvidia at its word) on 10-series and 20-series GPUs, which I think is a waste.
My 970 paired with my 144Hz AOC FreeSync monitor feels betrayed by this move. I hope someone will hack those drivers and enable it on 900-series GPUs too, or at least find a way to mask the 970 and report it as a 1060 or so to the driver to use FreeSync.
Or just upgrade the GPU. The 970 isn't cutting it much anymore.
 

llien

Member
Admittedly, there aren't as many FS2 monitors, but is it version 1 or 2 of FreeSync that they are supporting?

JohnnyFootball
The only "light" thing about FreeSync (this is what "adaptive sync support" actually means, without nitpicking into semantics) is that it does not mandate any motion compensation tech on the monitor. Other than that, as tests have shown, FS is at least on par, at times superior to GS.

Going from this article, it seems Nvidia has stricter standards. I just hope they continue to make a steady stream of G-Sync monitors from here on out.

Stricter standard of what? It was hard to find what specs "G-Sync" actually requires, besides, well, a $150-$250 premium and lack of compatibility with competitors.

FreeSync2 is quite ahead already, addressing HDR lag and having mandatory Low Framerate Compensation.
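For anyone wondering what LFC actually does in practice, here's a rough, illustrative Python sketch of the idea (made-up function, not real driver code): when the frame rate drops below the panel's minimum VRR refresh, the driver shows each frame 2x, 3x, etc. so the effective refresh stays inside the supported window, and if the window is too narrow for any multiple to fit, LFC simply can't engage.

```python
# Illustrative sketch of the Low Framerate Compensation (LFC) idea.
# Not real driver code; function and variable names are made up.

def lfc_refresh_rate(frame_rate: float, vrr_min: float, vrr_max: float):
    """Return the refresh rate frame-repetition would use, or None if no
    integer multiple of the frame rate fits the panel's VRR window."""
    if frame_rate > vrr_max:
        return vrr_max                          # cap at the panel's maximum
    if frame_rate >= vrr_min:
        return frame_rate                       # fits the window natively
    multiplier = 2
    while frame_rate * multiplier <= vrr_max:   # try showing each frame 2x, 3x, ...
        if frame_rate * multiplier >= vrr_min:
            return frame_rate * multiplier
        multiplier += 1
    return None                                 # window too narrow, LFC can't help

print(lfc_refresh_rate(30, 48, 144))  # 60   -> 30fps shown twice per refresh
print(lfc_refresh_rate(35, 48, 60))   # None -> a narrow 48-60Hz range can't do LFC
```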
 

Leonidas

Member
Stricter standard of what? It was hard to find what specs "G-Sync" actually requires, besides, well, a $150-$250 premium and lack of compatibility with competitors.

FreeSync2 is quite ahead already, addressing HDR lag and having mandatory Low Framerate Compensation.

Nvidia's standards for G-Sync VRR are much stricter than AMD's. There are FreeSync monitors out there with a 1.25:1 VRR range ratio (48-60Hz). Nvidia requires a 2.4:1 ratio for G-Sync, and there can't be any flickering, blanking or stuttering, which is present in many cheap FreeSync panels. That disqualifies the vast majority of FreeSync monitors, but you can still enable it yourself to see if the experience is good enough for you...
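A quick back-of-the-envelope check of those ratios (Python, purely illustrative; the 2.4:1 figure is the one quoted above, not a published Nvidia spec):

```python
# Compute the max:min VRR range ratio for a couple of example panels.
panels = {"48-60Hz FreeSync panel": (48, 60), "48-144Hz panel": (48, 144)}

for name, (vrr_min, vrr_max) in panels.items():
    ratio = vrr_max / vrr_min
    verdict = "meets" if ratio >= 2.4 else "misses"
    print(f"{name}: {ratio:.2f}:1 -> {verdict} a 2.4:1 bar")

# 48-60Hz FreeSync panel: 1.25:1 -> misses a 2.4:1 bar
# 48-144Hz panel: 3.00:1 -> meets a 2.4:1 bar
```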
 

Tygeezy

Member
Really, every screen should support at least 24Hz → max refresh rate just so you can play movies without juddering. It's kind of sad that people haven't prioritized better video-player programs that take advantage of VRR for movies. Judder in movies is so fucking annoying.

People with Nvidia cards can take advantage of FreeSync on this monitor now, too.

https://www.theverge.com/circuitbre...d-monitor-resolution-ur59c-curved-4k-ces-2019
I think you can do 48Hz and still watch movies at their native framerate because 48 is double 24. It's why 60Hz screens are perfectly fine with 30fps content, which is just about everything.
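That's the gist of it; a tiny illustrative Python check of which refresh rates divide evenly by which content frame rates:

```python
# Playback judders when the refresh rate isn't an integer multiple
# of the content frame rate (e.g. 24fps on 60Hz needs 3:2 pulldown).
for refresh_hz in (48, 60, 120):
    for content_fps in (24, 30):
        repeats = refresh_hz / content_fps
        smooth = repeats.is_integer()
        print(f"{content_fps}fps on {refresh_hz}Hz: each frame shown {repeats:g}x"
              f" ({'even, no judder' if smooth else 'uneven, judders'})")

# 24fps: clean on 48Hz (2x) and 120Hz (5x), judders on 60Hz (2.5x).
# 30fps: clean on 60Hz (2x) and 120Hz (4x), judders on 48Hz (1.6x).
```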
 

Kenpachii

Member
Admittedly, there aren't as many FS2 monitors, but is it version 1 or 2 of FreeSync that they are supporting?

JohnnyFootball
The only "light" thing about FreeSync (this is what "adaptive sync support" actually means, without nitpicking into semantics) is that it does not mandate any motion compensation tech on the monitor. Other than that, as tests have shown, FS is at least on par, at times superior to GS.



Stricter standard of what? It was hard to find what specs "G-Sync" actually requires, besides, well, a $150-$250 premium and lack of compatibility with competitors.

FreeSync2 is quite ahead already, addressing HDR lag and having mandatory Low Framerate Compensation.

FreeSync was a disaster a while back; dunno about FreeSync 2. But there is a reason why they have a 2 to start with.

The panels that were used had shitty ranges, terrible ghosting, and only worked in full-screen mode. G-Sync gave you a quality check that everything would just work well and was actually tested. FreeSync was also completely dependent on the AMD driver team, which had to support every single panel in their drivers, which with AMD's driver track record = gg.

Test after test showed FreeSync fell flat on its face against G-Sync.

However, that was a few years ago; they've probably improved a lot since then.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
FreeSync was a disaster a while back; dunno about FreeSync 2. But there is a reason why they have a 2 to start with.

The panels that were used had shitty ranges, terrible ghosting, and only worked in full-screen mode. G-Sync gave you a quality check that everything would just work well and was actually tested. FreeSync was also completely dependent on the AMD driver team, which had to support every single panel in their drivers, which with AMD's driver track record = gg.

Test after test showed FreeSync fell flat on its face against G-Sync.

However, that was a few years ago; they've probably improved a lot since then.
The price premium for G-Sync is pretty substantial, but there is little doubt that it will work flawlessly.
 

Wag

Member
So are they supporting all FreeSync displays? I'd love to replace the crappy first-gen 2015 Samsung 4K TV that I'm using as a monitor with one of their new FreeSync-enabled ones for my 1080Ti.
 
Last edited:

sendit

Member


Twice the performance of a PS4 Pro.
"The most powerful game console out there" -Nvidia CEO :messenger_grinning_squinting:

Funny how he completely disregarded the XB1X.
 
Last edited:

DeepEnigma

Gold Member


Twice the performance of a PS4 Pro.
"The most powerful game console out there" -Nvidia CEO :messenger_grinning_squinting:

Funny how he completely disregarded the XB1X.


(reaction GIF)
 

llien

Member
The price premium for G-Sync is pretty substantial, but there is little doubt that it will work flawlessly.
Quick googling of "gsync problem" shows it's a lie.

Freesync was a disaster
Meh.

The panels that were used had shitty ranges, terrible ghosting, and only worked in full-screen mode.
AMD maintains a filterable list of FS monitors, with useful info.

FreeSync was also completely dependent on the AMD driver team
Only a handful of broken implementations had such dependencies.

Test after test showed FreeSync fell flat on its face against G-Sync
Nope:


But one shouldn't compare a $900 monitor to a $300 monitor if you want apples to apples.

flickering, blanking or stuttering
Unlike gsync, right? Fear, Uncertainty, Doubt.

Nvidia's standards for G-Sync VRR are much stricter than AMD's.
None have ever been published, so we can only guess.
The FS2 label needs at least LFC etc., but VESA VRR has nothing to do with how manufacturers certify monitors anyhow.
 
Last edited:

Leonidas

Member


Twice the performance of a PS4 Pro.
"The most powerful game console out there" -Nvidia CEO :messenger_grinning_squinting:


Probably because it sounds a lot more impressive to say your flagship laptop GPU has 2x the power of a PS4 Pro than to say it's only 50% more powerful (compared to the X).
Or perhaps they agree with that popular YouTuber who said the Xbox is a PC.
 

Aintitcool

Banned
The only reason Nvidia is making this move, and making it now, is the TVs and monitors coming with HDMI 2.1, which offers adaptive sync via VRR. Nvidia couldn't let AMD own that. Including it in a few monitors is a nice move, but the reality is Nvidia didn't want people having to buy AMD just to get VRR on their fancy new TVs.
 
Last edited:
I've been waiting to pull the trigger on a 34" ultrawide but didn't feel like paying the G-Sync tax. This will probably push me over the edge :)
 

Xdrive05

Member
Sorry to bump this thread, but I didn't want to start a new one just for my question. This new FreeSync support from Nvidia only covers the 10- and 20-series cards, right? Well, I still like my GTX 980 4GB for 1080p gaming, and I need to get a 24” monitor. So if these FreeSync monitors won't work on my 980, how do I find a *real* G-Sync monitor that will? Because when I search Amazon, they now list these newly supported FreeSync monitors as "G-Sync Compatible", which in my case is not true (980). So what do I need to look for in a monitor that will work with it for G-Sync? Thanks in advance! I do appreciate it!
 

Leonidas

Member
Sorry to bump this thread, but I didn't want to start a new one just for my question. This new FreeSync support from Nvidia only covers the 10- and 20-series cards, right? Well, I still like my GTX 980 4GB for 1080p gaming, and I need to get a 24” monitor. So if these FreeSync monitors won't work on my 980, how do I find a *real* G-Sync monitor that will? Because when I search Amazon, they now list these newly supported FreeSync monitors as "G-Sync Compatible", which in my case is not true (980). So what do I need to look for in a monitor that will work with it for G-Sync? Thanks in advance! I do appreciate it!

Check the list below and avoid "G-SYNC Compatible" monitors at the bottom, since they won't work with the 980.

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
 