I feel like if I buy an RX 6800 XT it's gonna last me for the next 7 years at 1440p. Am I being too optimistic?
That's a lot of assumptions. I won't hold my breath.
Quality over quantity is what I see the end game being.
Time will tell.
Let's also remember this is the first release and first use of it, while DLSS has been out for years. Similar to DLSS 1.0, which was not great, it's going to take time for developers to implement this.
I think the biggest beneficiaries here are consoles.
There is no quality-over-quantity tradeoff here: one is basically open, the other is closed. One is being used on consoles and, profile-wise, can run on PC, so the configuration work developers do on the console versions of games can be reused on PC.
So in my opinion, in practical use, more people will use this, beyond the games that are currently contracted with NVIDIA.
Video and screens so far did not make a marry-me first impression, it is significantly blurring the image while in motion from what I can tell. I'd like to see more though.
AMD oddly made a lot of mistakes in this showcase (they showed split-screen instead of mirroring or dual-screening the footage... and WTF, every shot seemed to have all the good parts of the scene, like leafy trees and pillars and entryways, on the native left, while the FSR right side was half the time a wall or an empty ravine?!). Looking forward to seeing FSR in a better light next time.
Okay, first release, but I would say everybody was expecting AMD to leapfrog into competition with NVIDIA (since it is so late to the game, but also since the bar of quality has been set) with FSR that competed with DLSS 2.0, not 1.0. There are certainly things to look forward to with FSR, this isn't the end of the conversation or the battle for best visual output (this being open beyond the typical specialized devices is an interesting new wrinkle,) but I think it makes sense why people might have expected more in this announcement.
Maybe, although there are a lot of good display choices on consoles. This was kind of sold as the DLSS challenger for the consoles, soon to come.
But speaking of consoles... there was no mention of consoles, right? Nothing in that section of the presentation, nothing I saw in the larger AMD keynote, and nothing about consoles on the FSR website. They talk about it being open to "almost any graphics card", they talk about it being made for a wide range of products ("desktop and laptop devices", in this case,) they say it works on over 100 commercially available products... then they say nothing about PS5 or Xbox Series X/S, which AFAIK was the first community that was anticipating the FSR announcement.
So what is the story with FidelityFX Super Res on consoles? Is it not in the first-phase rollout, is it not happening, or is AMD just not able to say "Xbox" or "PlayStation" in its show for whatever contractual reasons (though they could have said "console platforms" and been fine if that were the case) and it's still coming as previously rumored?
I disagree. It's one of my favorite channels; I watch all of their videos. They criticize everyone; the fact that some companies are more worthy of criticism than others doesn't make them biased. They have complained about the price of AIB graphics cards, they have complained about AMD not releasing non-X CPUs, they have complained about Intel's 11th-gen i9 being ridiculous but praised them for their i5/i7 CPUs. They don't think that RT is a big deal that should convince you to go toward one card or the other, but they have said all along that Nvidia offers the better value thanks to DLSS.

They don't criticise everyone with the same level of energy, this is well known.
When it comes to drivers and a new architecture that no third-party game developers had been working with until the last year or so, why is everyone expecting the moon? Nvidia had tons of time to make DLSS, and to improve upon it, with an unlimited R&D budget.
Not the moon, just a competitive product.
AMD is doing things differently from NVIDIA (being open and multi-hardware), so it makes sense that they're on a different timeline, but DLSS debuted in Sept 2018 and 2.0 has been out for over a year now. AMD took their time coming to market for their own reasons, which may be good overall, but they missed the console launch (and, as I mentioned, they frustratingly omitted saying anything about consoles, which would at least feel like a slingshot of momentum to me, since DLSS is never going to be on Xbox or PlayStation), and they're coming way late to the competition. AMD now has its work cut out for it.
But again, I'm eager to see more and get a clearer understanding. (...And I'd like to see a demo that wasn't full of wonky problems and YT compression crinkle, one that excited you when you looked at the video, not just the stats.)
Great explanation of what FSR is:
It's from the marketing image they chose.. and it's incredibly blurry right at the "line" they chose to split this same image with. In fact, if anything, the 2 pillars look closer than right at the line.

Sorry to be blunt, but this comparison is terrible. You can't compare it like that, because different parts of the frame are moving at different speeds, the viewport isn't uniformly sharp, and there are many other reasons related to the geometry of the image at different sample points.
We need to wait for proper screenshot or image comparisons. I'm not writing this to attack you; I'm writing it to make others who just see this and think that's the difference aware of why it's an unfair comparison.
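To illustrate why the chosen crop matters, here's a minimal sketch (my own toy example, not a real image-quality pipeline): a common no-reference sharpness proxy, variance of the Laplacian, scores a flat wall near zero no matter how sharp the render actually is, so pointing it (or your eyes) at different regions tells you nothing about the upscaler.

```python
import numpy as np

def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian: a crude sharpness proxy."""
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(0)
# Synthetic "frame": one crop is textured (leafy trees, pillars),
# the other is a featureless wall.
textured = rng.random((64, 64))
flat = np.full((64, 64), 0.5)

# The flat crop scores ~0 even though nothing was upscaled or blurred.
print(laplacian_variance(textured))  # large
print(laplacian_variance(flat))      # 0.0
```

Any fair comparison has to put the same region, at the same moment, on both sides.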
Well, they barely showed anything here; nothing moving much, for instance.. hard to judge anything but the obvious blurriness in the 1440p stuff.

I know you've been critical of DLSS 2.0 artifacts; did you notice anything with AMD's solution? And which one do you prefer?
FidelityFX CAS (not to be confused with SR) was widely accepted as giving a way better render, so what the heck do you mean by "comparable to 1.0"?

Interesting that AMD have managed to provide something comparable to DLSS 1.0
Nah, I too believe 6800 XT will easily last for the next 7 years and even more. Don't worry.
The card is more future-proof than anything Nvidia has produced besides the 3090.
The good outcome of this tech would be that people would stop bitching about "but no DLSS".
AMD stays firmly behind Nvidia when it comes to tech. It's good they're making progress for those who are die hard AMD fans and maybe will help things on the console side but overall Nvidia stays winning.
Only by buying enough reviewers who would focus on the right things. Why would people stop bitching about that? lol
This is so delusional it hurts.
The 6900 XT beats the 3090 in the newest games, despite:
1) having fewer transistors
2) consuming less power
3) using MUCH slower VRAM
All that merely 1 year after 5000 series.
I thought 2.0 still uses deep learning, just in different areas?

DLSS 2.0 and 1.0 are two very different beasts: the first one is a true (but failed) AI solution. 2.0 is a glorified TAAU.
It's from the marketing image they chose.. and it's incredibly blurry right at the "line" they chose to split this same image with. In fact if anything the 2 pillars look closer than right at the line:
Original Image
Thanks much for the reply; that is indeed blurry and disappointing. I thought the pillars were pulled from the video. Hopefully 4K and/or the "ultra quality" preset work better, but I wouldn't hold my breath.
What a bizarre marketing image to release if you want to drum up excitement; it's obviously disappointing even without maximizing it on my display, let alone zooming.
I'll wait for more information/comparisons before I condemn it fully, just in case something weird is going on, but it's pretty baws looking at that. Why is the geometry of the native image so distorted?
AMD beats NV at RT (whatever the reason to care about that tech, which is still in a "maybe in the future" promising state years after release) in WoW RT, Dirt 5, and Fortnite.

Does AMD compete when it comes to offering a solution as good as DLSS 2.0 and Ray tracing performance?
1.0 used datacenter training on each game's higher-resolution assets (and, mind you, IT DID MAKE PERFECT SENSE to try that, nothing silly about it, but it didn't work).

I thought 2.0 still uses deep learning just in different areas?
Sorry, I actually misunderstood a post from another forum (they were actually complaining there were NO marketing images released lol); it IS from the video, you were right.
It still just looks bad.. and I couldn't find any paused frames that looked much better when I found the scene in the video.
It is the 1440p render though, and we don't know the source resolution.
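On the source-resolution question, FSR's announced quality modes each scale the output by a fixed per-axis factor, so you can back the internal render resolution out of the output resolution. A small sketch (the factors below are the commonly cited launch figures; treat them as an assumption until AMD publishes final documentation):

```python
# Per-axis scale factors for FSR's announced quality modes
# (commonly cited launch figures -- assumed, not confirmed here).
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def source_resolution(out_w, out_h, mode):
    """Back out the internal render resolution for a given output."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_MODES:
    print(mode, source_resolution(2560, 1440, mode))
# e.g. "Quality" at 1440p would render internally at roughly 1707x960,
# "Performance" at exactly 1280x720.
```

So if that demo was the "Performance" preset, it was reconstructing 1440p from a 720p render, which would go a long way toward explaining the blur.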
I think it starts at 3:40. I think FidelityFX is a stepping stone needed for RDNA 3 to be ready.
This video is one of the bigger jokes I've ever seen... why in the world did anyone use this guy as a source pre-console launch?
Everyone already knew AMD's solution would be supported all over the place.. why is he acting like that's a surprise, other than click-baity nonsense?
As long as games are being made for Xbox Series S (which I assume will be for the next 7 years minimum), people with current RDNA 2 GPUs on PC needn't worry about anything.

How about my factory-OC'd 6800 (Powercolor Red Dragon)?
So both DLSS 1.0 and 2.0 use tensor cores?

You really need to stop talking about ML bro lol
Training on the actual source material made it a bad solution, not a good one (and a really odd one at that).. and.. all ML involves neural networks...
It's also incredibly unlikely DLSS 2.0 uses a "static" model; they are likely continuing to train on more imagery to build a better one over time.
Sorry.. not trying to sound aggressive towards you.. this video is just so bad; like the TMZ of tech videos. He's making up things nobody ever said (while saying things like "I think nVidia said this"), and he's still talking about DLSS 1.0, which isn't being used anymore lol :/
I dunno, thought it was interesting when he mentioned RDNA3 and Fidelity FX..
I agree.
I mean whatever, his theory you are mentioning is fine and all.. but it's all going to boil down to quality, and what they showed wasn't really promising so far so we have no clue how much it'll get used. Lots of devs have their own solution, Unreal Engine has a solution, etc. All things that work on various graphics cards.
DLSS certainly has competition but it's also at this point way ahead in quality.
Yes; there was a version of 1.0 (or an implementation of it in Control) that didn't use them, but outside of that, they've always used tensor cores.. both are ML, they just do things differently.

So both DLSS 1.0 and 2.0 use tensor cores?
It doesn't look bad.
What does that have to do with the Series S, though? All of these cards are more powerful than any of the current-gen consoles.

As long as games are being made for Xbox Series S (which I assume will be for the next 7 years minimum), people with current RDNA 2 GPUs on PC needn't worry about anything.
Even a hypothetical, less powerful RX 6500 non-XT with 20-22 CUs will last as long as Series S... Provided it has 6+ GB of VRAM.
I think it's a question of expectations: if you lower the details to console level and target 30fps (sometimes less), just like the consoles, you can still play on a 750 Ti like you would on a base PS4 (probably even better than on a PS4 in some games)... now if you expect to always play at 60fps on the PC, then yeah.

but it is also the most brutal example i can get
it's the biggest GOTY contender game the PC received in the entirety of the generation
i can get you more examples though, if you like... but then it would be horizon zero dawn, ac valhalla and such, and you would say it's old and it's normal and this discussion would go on forever...
there's no proper discrepancy between consoles and desktop gpus. it is clear, i'm not saying x1x is magically faster. x1x has the advantage of years and years of development; developers have mastered it and manage to extract more efficient performance out of it.
that series x being equal to 6700xt these days is irrelevant, because sx will be rivaling 6800xt by the time nextgen gpus arrive (same applies to ps5 as well; they're already doing a mighty fine job in re:village against their desktop counterparts)