
Why you shouldn't underestimate Radeon 6000/Big Navi

Hudo

Member
AMD has spent the last five or six years overpromising and underdelivering. I see no reason to believe that's going to change.
This argument can be applied to nVidia as well, especially when you consider nVidia's absolutely insane pricing policies. Every piece of nVidia hardware is overpriced (because they can get away with it) and not worth the asking price. The PC hardware industry is in a shitty place right now. The only good bit of news recently was that the new Ryzen CPUs are actually good and somewhat reasonably priced, which actually prompted Intel to adjust their insane prices a bit. I'm not saying that nVidia's stuff is crap (their software stack is, however, especially when you develop with CUDA), but their asking prices are insane.
 

kuncol02

Banned
Frequencies on consoles are not conservative at all this time.
2.1 GHz is not "maintainable" for 32 CUs.
The XSX indicates the sweet spot for 52 CUs is 1.8 GHz.
And you expect a bigger chip to clock faster?
That would be more of a surprise than the reverse.
Those frequencies are not conservative (especially Sony's) for a small, low-powered box. Consoles have around a 250 W power budget for the whole system. The new Nvidia cards have a 350 W TDP for the GPU alone.
 

Ascend

Member
And that means the 30 series maxes out at 1.7 GHz :messenger_grinning_sweat: Please.
The RTX 3070 with 46 SMs is rated at 1.7 GHz, the same as the RTX 3090 with 82 SMs. And their TDP is already above 300 W. So...

You somehow conveniently "missed" that all 20-series FE cards, including the 2080 Ti, 2080, and 2070, could run at 2 GHz and even higher.
Different architecture, different node. There is zero evidence that RTX 3000 series cards can reach much higher.

Oh no, the 2080 Ti with its tiny 1.55 GHz has a boost clock a whopping 200 MHz LOWER than the 3080's official 1.7 GHz :messenger_face_screaming: Does that mean the 3080 will do 2.2 GHz since the 2080 Ti easily does 2 GHz? NO, of course not. Still, that is a higher possibility than maxing out at 1.7 GHz....
I never said it maxes out at 1.7 GHz. I said it cannot reach much higher than 1.7GHz. If you look at the power consumption of the cards, it's quite obvious.


....now if only there were an AMD card with 36 CUs so we could test this

oh snap! The plain 5700 has 36 CUs, and also 448 GB/s of bandwidth, just like the PS5.
What do we know from the 5700? Game clock 1.625 GHz // boost clock 1.725 GHz.
But let's say those limits were only there to make way for the 5700 XT:
with BIOS hacking, you can unlock the restrictions and overclock as much as you like. No matter the cooling solution, it can't get close to 2.2 GHz for the life of it.

Let's move on to its bigger brother, the 5700 XT: game clock 1.755 GHz // boost clock 1.905 GHz.
Here is a "no limits" overclock test for the 5700 XT, where the BIOS and registry were modified, a fat liquid-cooling block was attached, and the card was overclocked to its absolute limit.
*Notice that even though all barriers were lifted and the card could be set to 2.300 GHz, the silicon was no good for anything above 2.100 GHz. So that's where the clock was set, along with a lock so it doesn't dip below 2.050 GHz.


As anyone who reads the test can see, overclocking the 5700 XT up to its hardware limits, taking temperatures completely out of the equation,
an 18% increase in clock speed brought a 40% increase in power consumption and only a ~7% average performance increase.

The article calls the results "disappointing" but enlightening as to why AMD put these limits in place, and suggests it makes much more sense to buy a stronger GPU than to spend money on fancy cooling to raise the 5700 XT's clocks.
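A quick sanity check on those figures (my arithmetic in Python, using the article's numbers as given):

clock_gain = 1.18   # +18% clock speed
power_gain = 1.40   # +40% power draw
perf_gain  = 1.07   # ~+7% average performance
print(f"perf/watt vs stock: {perf_gain / power_gain:.2f}x")   # ~0.76x, i.e. ~24% worse

In other words, the overclock trades away roughly a quarter of the card's efficiency, which is exactly why AMD ships it locked down.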


Now, what efficiency improvement margin do we expect for AMD's latest? 5%? 8%? Even at a full 10%, Cerny's magic words do not compute.

I guess we will find out this Christmas who the troll is. It could prove to be me, as I go through these numbers and draw conclusions, but I'd suggest you not be so hasty to rule out that Cerny was the troll all along.
There is a reason I used XSX and PS5 data as well. RDNA2 is not RDNA1. You cannot directly use the 5700(XT) data on its own. We don't even know if the node will be the same. Since the PS5 can reach 2.2GHz, RDNA2 should also be able to. Maybe the larger die size limits clocks, which is why I added data for 2 GHz. I would be surprised if it cannot reach 2GHz.
I'll ask you the same thing. Are you expecting the consoles to have a higher clock speed than the PC graphics cards with the same architecture?

As for the power consumption, the XSX already has proven that power consumption for RDNA2 is way down compared to the 5700(XT) cards.
 

notseqi

Member
'Unlike Intel, Nvidia took AMD extremely seriously. Hence, the absolute beast of a reveal today.' - some dude under an Nvidia RTX30 video

I hope so, I damn well hope so.
 
There's a couple of AMD fanboys I always see in Nvidia threads downplaying them. Can't say I'm surprised you made this topic.
When I read the title, I thought this would be another llien thread. That guy swears by AMD, despite AMD not having been a competitor in the GPU space for many years. I don't understand shilling for AMD GPUs (their CPUs are great). I understand they are the underdog, and I truly want them to succeed. I just don't think they can hold a candle to Nvidia.
 
There is a reason I used XSX and PS5 data as well. RDNA2 is not RDNA1. You cannot directly use the 5700(XT) data on its own. We don't even know if the node will be the same. Since the PS5 can reach 2.2GHz, RDNA2 should also be able to. Maybe the larger die size limits clocks, which is why I added data for 2 GHz. I would be surprised if it cannot reach 2GHz.
I'll ask you the same thing. Are you expecting the consoles to have a higher clock speed than the PC graphics cards with the same architecture?

As for the power consumption, the XSX already has proven that power consumption for RDNA2 is way down compared to the 5700(XT) cards.
I did not use the RDNA1 data as a conclusion for RDNA2, but as a reference, a baseline.
I pointed out that even if there is a whole 10% silicon improvement, Sony's numbers for regular usage still don't hold up.
You are right, of course, in your assumption that "if a closed-box console with a much smaller thermal and power envelope can do X GHz, the real-deal hardware without those restraints can do at least that".
The caveat is that a sustained 2.23 GHz on the PS5 does not look feasible, unless nothing else in the console is working hard at the time of that clock speed.
Anyway, you were obviously smart to put the second guess (2.0) in parentheses, along with the triple-star note. I agree with those.
Also, about the Xbox's consumption, don't forget that its clock speeds are set at what seems to be the optimal point for AMD GPU tech.
And don't forget that power consumption scales roughly with the cube of the clock speed (see the OC test I linked in my previous post).
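A minimal sketch of that rule of thumb (assuming dynamic power scales with frequency times voltage squared, and voltage rises roughly linearly with frequency near the top of the curve, so power goes roughly with the cube of clock speed; a rough model, not measured data):

f_gain = 1.18                 # the +18% overclock from the linked test
p_predicted = f_gain ** 3     # ~1.64, i.e. roughly +64% power
print(f"predicted: +{(p_predicted - 1) * 100:.0f}% power")   # the test measured ~+40%

The measured +40% comes in under the cube prediction, but the point stands: the last few hundred MHz are disproportionately expensive.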
 
Last edited:
I did not use the RDNA1 data as a conclusion for RDNA2, but as a reference, a baseline.
I pointed out that even if there is a whole 10% silicon improvement, Sony's numbers for regular usage still don't hold up.
You are right, of course, in your assumption that "if a closed-box console with a much smaller thermal and power envelope can do X GHz, the real-deal hardware without those restraints can do at least that".
The caveat is that a sustained 2.23 GHz on the PS5 does not look feasible, unless nothing else in the console is working hard at the time of that clock speed.
Anyway, you were obviously smart to put the second guess (2.0) in parentheses, along with the triple-star note. I agree with those.
Also, about the Xbox, don't forget that its clock speeds are set at what seems to be the optimal point for AMD GPU tech, and that power consumption scales roughly with the cube of the clock speed.
If the PS5 doesn't sustain 2.23 GHz at all times, imma demand the mods remove my previous ban on my account. And I'm sure many will be doing the same. It just doesn't seem feasible, especially as Cerny himself said it will clock lower.
 
....now if only there were an AMD card with 36 CUs so we could test this

oh snap! The plain 5700 has 36 CUs, and also 448 GB/s of bandwidth, just like the PS5.
What do we know from the 5700? Game clock 1.625 GHz // boost clock 1.725 GHz.
But let's say those limits were only there to make way for the 5700 XT:
with BIOS hacking, you can unlock the restrictions and overclock as much as you like. No matter the cooling solution, it can't get close to 2.2 GHz for the life of it.

Let's move on to its bigger brother, the 5700 XT: game clock 1.755 GHz // boost clock 1.905 GHz.
Here is a "no limits" overclock test for the 5700 XT, where the BIOS and registry were modified, a fat liquid-cooling block was attached, and the card was overclocked to its absolute limit.
*Notice that even though all barriers were lifted and the card could be set to 2.300 GHz, the silicon was no good for anything above 2.100 GHz. So that's where the clock was set, along with a lock so it doesn't dip below 2.050 GHz.


As anyone who reads the test can see, overclocking the 5700 XT up to its hardware limits, taking temperatures completely out of the equation,
an 18% increase in clock speed brought a 40% increase in power consumption and only a ~7% average performance increase.

The article calls the results "disappointing" but enlightening as to why AMD put these limits in place, and suggests it makes much more sense to buy a stronger GPU than to spend money on fancy cooling to raise the 5700 XT's clocks.


Now, what efficiency improvement margin do we expect for AMD's latest? 5%? 8%? Even at a full 10%, Cerny's magic words do not compute.

I guess we will find out this Christmas who the troll is. It could prove to be me, as I go through these numbers and draw conclusions, but I'd suggest you not be so hasty to rule out that Cerny was the troll all along.

Your wall of info misses the most fundamental point (taking a fundamentally flawed point and running with it is something you keep doing).

1. The 5700 XT released on July 7, 2019. It is an RDNA1-generation product and is quite old now. The PS5 GPU is RDNA2 and releases in November 2020.

[Image: AMD RDNA roadmap]


Now what does it say under RDNA2?
PERFORMANCE/WATT improvement.

AMD confirms higher clock speeds on its next-generation RDNA 2 graphics cards
In the company's latest investor presentation, AMD has once again hinted at its improved RDNA 2 graphics architecture, reconfirming that it will offer a 50% boost to performance/watt over their original RDNA architecture as well as enhanced clock speeds, and more performance per clock.

Thanks to RDNA 2's architectural innovations, AMD has been able to deliver Improved Performance-per-Clock (IPC), which means that AMD's latest graphics architecture will be able to do more graphics work during each of its clock cycles. On top of that, AMD also promises an increase in clock speeds, giving AMD two methods of performance enhancement with RDNA 2.

Not only that, but a chip with the same architecture produced on 7 nm in late 2020 is going to reach higher clock speeds at the same wattage than one made back in mid-2019, even before taking into consideration the work AMD has done to increase clock speeds from one generation to the next.

So, in summary, the 5700 XT has little bearing on the clock speeds of the next-gen console GPUs.
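To put rough numbers on what a 50% perf/watt gain buys (illustrative figures, not AMD's own breakdown; the 225 W is the 5700 XT's rated board power):

old_perf, old_power = 100.0, 225.0           # performance index, watts
print(f"{old_power / 1.5:.0f} W")            # ~150 W for the same performance...
print(f"{old_perf * 1.5:.0f} perf index")    # ...or +50% performance at the same 225 W

AMD can spend that budget anywhere between those two extremes: some on clocks, some on IPC, some on power.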

Now put your money where your mouth is: are you telling me there won't be an RDNA2 GPU released with a BOOST CLOCK (even though the PS5 is not boosting, it is capped) in excess of 2.2 GHz and 40 CUs or under? Please answer so I can see the level of understanding you have.

EDIT:

Oh boy you are clueless, I didn't see this bit in your post:

TheGreatWhiteTroll said:
now, what efficiency improvement margin do we expect for AMD's latest? 5%? 8%? Even at a full 10%, Cerny's magic words do not compute.

5%?!!!?!? As above, they're only claiming about 50%....
 
Last edited:

thelastword

Banned
AMD is life....Thanks AMD for making consoles powerful with a neat roadmap for enhanced consoles down the line.....Thanks for subbing with Sony to build on Navi and RDNA 2+3 features...It's really a great initiative for the industry and it boosts everybody including PC gamers.....

Complete AMD build later this year....Ryzen 4000 (the gaming killer CPU) + (6900XT + 80 CUs)......Faster clocks, lower power draw, screaming performance. Here I come...
 

RoboFu

One of the green rats
AMD is life....Thanks AMD for making consoles powerful with a neat roadmap for enhanced consoles down the line.....Thanks for subbing with Sony to build on Navi and RDNA 2+3 features...It's really a great initiative for the industry and it boosts everybody including PC gamers.....

Complete AMD build later this year....Ryzen 4000 (the gaming killer CPU) + (6900XT + 80 CUs)......Faster clocks, lower power draw, screaming performance. Here I come...

that reads like a prayer to AMD lol.
 
Your wall of info misses the most fundamental point (taking a fundamentally flawed point and running with it is something you keep doing).

1. The 5700 XT released on July 7, 2019. It is an RDNA1-generation product and is quite old now. The PS5 GPU is RDNA2 and releases in November 2020.

[Image: AMD RDNA roadmap]


Now what does it say under RDNA2?
PERFORMANCE/WATT improvement.



Not only that, but a chip with the same architecture produced on 7 nm in late 2020 is going to reach higher clock speeds at the same wattage than one made back in mid-2019, even before taking into consideration the work AMD has done to increase clock speeds from one generation to the next.

So, in summary, the 5700 XT has little bearing on the clock speeds of the next-gen console GPUs.

Now put your money where your mouth is: are you telling me there won't be an RDNA2 GPU released with a BOOST CLOCK (even though the PS5 is not boosting, it is capped) in excess of 2.2 GHz and 40 CUs or under? Please answer so I can see the level of understanding you have.
Looks like you got blurred vision after a point, since I accounted for a full 10% silicon improvement, and IMO the PS5 numbers still don't hold up. UNLESS, as I already wrote, everything else on the console underperforms at that time. And even then, there are the thermals for sustainability.
Lucky us that Sony put a "cap" on the PS5's GHz performance, otherwise it would put the 3090 out of business and eclipse the sun :messenger_grinning_squinting:

So tell you what: if you are itching to lose some money, let's bet that the PS5 won't be able to run real-case next-gen gaming scenarios at a sustained 2.23 GHz GPU & 3.5 GHz CPU.
After all, it's the PS5 comment that got you all riled up, so put your money where your cause is.
 
Last edited:

Deleted member 17706

Unconfirmed Member
I'll believe it when I see it. AMD can often introduce a card that competes well with a mid-to-high-range Nvidia GPU for a lower price, but usually well after that Nvidia GPU first became available. They haven't competed at the high or ultra-high end for many years now.
 
I'll believe it when I see it. AMD can often introduce a card that competes well with a mid-to-high-range Nvidia GPU for a lower price, but usually well after that Nvidia GPU first became available. They haven't competed at the high or ultra-high end for many years now.

Exactly. These rumors sound great and all, but so did the RX 4xx/5xx ones, as well as the 57xx series and so forth. I've heard it a million times, but I want it to come true for real this time. Not that I would switch from Nvidia, but it would make them stay competitive, price-wise.
 
Looks like you got blurred vision after a point, since I accounted for a full 10% silicon improvement, and IMO the PS5 numbers still don't hold up. UNLESS, as I already wrote, everything else on the console underperforms at that time. And even then, there are the thermals for sustainability.
Lucky us that Sony put a "cap" on the PS5's GHz performance, otherwise it would put the 3090 out of business and eclipse the sun :messenger_grinning_squinting:

So tell you what: if you are itching to lose some money, let's bet that the PS5 won't be able to run real-case next-gen gaming scenarios at a sustained 2.23 GHz GPU & 3.5 GHz CPU.
After all, it's the PS5 comment that got you all riled up, so put your money where your cause is.

It's not about getting riled up, it's about you coming into a thread about graphics cards to continue your trolling of the PS5 with what has just been proven to be a complete load of nonsense, and you being corrected.

'5% improvement in efficiency between RDNA1 and RDNA2' lmao that's a great one. And what does '10% of silicon improvement' mean?! I'm wasting my precious weekend discussing this with you, my IQ has dropped just reading your posts. Go away.
 

RoboFu

One of the green rats
I have both a 2080 and a 5700 XT.

The biggest failure of the 5700 XT is that it doesn't have any RT features. That harmed it the most. It is a pretty nice GPU that fits nicely between a 2070 and a 2080; it just had a lot of negative mindshare, IMO, because of the lack of RT.
 
It's not about getting riled up, it's about you coming into a thread about graphics cards to continue your trolling of the PS5 with what has just been proven to be a complete load of nonsense, and you being corrected.

'5% improvement in efficiency between RDNA1 and RDNA2' lmao that's a great one. And what does '10% of silicon improvement' mean?! I'm wasting my precious weekend discussing this with you, my IQ has dropped just reading your posts. Go away.
Well, if you decide to put your money where your yap is, like you suggested, you know where to find me.
 

Rikkori

Member
What is this? This is nothing.

Does the person who made this video even understand what it is they are comparing?

What FidelityFX settings were used? What resolution is it upscaling from? It doesn't say.

What DLSS settings were used? What resolution is it upscaling from? (Performance or Quality?) It doesn't say.

What performance was gained by using either technique? It doesn't say. There's no FPS shown at all.

This video isn't designed to inform. It's designed to make the average Joe think that maybe AMD actually has a response to DLSS.

The video is designed to test whether you can tell the difference between the two in terms of image quality. Nothing more!

He does this all the time even with PS4 Pro vs PC comparisons.
 

Kenpachii

Member
Looks like you got blurred vision after a point, since I accounted for a full 10% silicon improvement, and IMO the PS5 numbers still don't hold up. UNLESS, as I already wrote, everything else on the console underperforms at that time. And even then, there are the thermals for sustainability.
Lucky us that Sony put a "cap" on the PS5's GHz performance, otherwise it would put the 3090 out of business and eclipse the sun :messenger_grinning_squinting:

So tell you what: if you are itching to lose some money, let's bet that the PS5 won't be able to run real-case next-gen gaming scenarios at a sustained 2.23 GHz GPU & 3.5 GHz CPU.
After all, it's the PS5 comment that got you all riled up, so put your money where your cause is.

It doesn't matter if the PS5 can reach 2.23 GHz on the GPU and 3.5 GHz on its CPU at the same time.

What matters is that the GPU in the PS5 can run 2.23 GHz stable; that's all that matters. Since the PS5 will be sold in the millions if not tens of millions, the chips should be easy to produce, i.e. yields must be fantastic. That means PC cards will most likely go even higher, or can go even higher than that, without issues.

His 2.2 GHz claim, as he renders it, is completely valid. He could even go higher and it would still be valid.

Let's not forget that the 980 had extreme amounts of issues running at 1550 MHz and wouldn't get anywhere near 1600 MHz unless you had a golden card. We are now sitting at 2+ GHz on overclocks without issues. Clocks go up naturally, and RDNA2 could be another bump in exactly that, which would make 2.2 GHz not hard to hit.

Based on the PS5 we could see 2.3 GHz, 2.4 GHz, or even 2.5 GHz chips.

Do I think such high clocks will be a thing? No, probably not, because I realize you need a die shrink for that, which they really don't have at this point. This is also why the PS5 clocks feel shady as shit and like nothing but PR fluff to hide an actual clock of 2 GHz at best, or even lower. But could it be possible? Sure, if RDNA1 was just a massive fail fest.

AMD stopped being relevant ever since the 9600XT/9800XT.

AMD curbstomped Intel the last few years, something nobody thought was ever possible. They basically own every single market on the CPU front other than the highest core-for-core clock performance, and even that is only a tiny gap at this point, which the 4000 series should close.

AMD GPUs in the past were also highly competitive with, if not better than, Nvidia's; there is really nothing stopping them from doing exactly that all over again. They have a superior process node under their belt, one that is more mature and probably also cheaper.

There is a reason Nvidia moved forward with those massive cards with a way-over-the-top performance increase over their 2000 series: they know RDNA2 is coming. Otherwise you would have gotten another 2000-series rebrand with a 30% increase.
 
Last edited:
Based on the PS5 we could see 2.3 GHz, 2.4 GHz, or even 2.5 GHz chips.

Do I think such high clocks will be a thing? No, probably not, because I realize you need a die shrink for that, which they really don't have at this point. This is also why the PS5 clocks feel shady as shit and like nothing but PR fluff to hide an actual clock of 2 GHz at best, or even lower. But could it be possible? Sure, if RDNA1 was just a massive fail fest.
So, basically, you quote me to say no, but then yes.

I agree with the assumption I bolded, though: IF RDNA1 was a massive fail fest, THEN we could see such a big rise in clock speeds.
But that IF is a big pill to swallow, especially when all we have to base the RDNA1-failure hypothesis on is a Sony PR (butthurt) presentation.
Let me counter your hypothesis with another hypothesis: if that were the case, why would Microsoft clock their chips so much lower? Are they stupid? Did they not get the AMD memo? What do you think?
 
Last edited:
AMD is life....Thanks AMD for making consoles powerful with a neat roadmap for enhanced consoles down the line.....Thanks for subbing with Sony to build on Navi and RDNA 2+3 features...It's really a great initiative for the industry and it boosts everybody including PC gamers.....

Complete AMD build later this year....Ryzen 4000 (the gaming killer CPU) + (6900XT + 80 CUs)......Faster clocks, lower power draw, screaming performance. Here I come...
AMD is like the worst religion you could possibly choose bruh
 

Ascend

Member
All this speculation is maybe interesting, but when do we expect a reveal and release?
Reveal is rumored to be Oct 7th. We know for a fact that the RDNA2 PC graphics cards are planned to be released before the consoles release. So... I guess it's somewhere in October or early November.
 
Last edited:
The other issue is that AMD still doesn't seem to have working drivers for Navi 1 more than a year later. What's the driver situation for Navi 2 going to be like?

 

Zathalus

Member
It's not about getting riled up, it's about you coming into a thread about graphics cards to continue your trolling of the PS5 with what has just been proven to be a complete load of nonsense, and you being corrected.

'5% improvement in efficiency between RDNA1 and RDNA2' lmao that's a great one. And what does '10% of silicon improvement' mean?! I'm wasting my precious weekend discussing this with you, my IQ has dropped just reading your posts. Go away.
He basically gets triggered when anybody posts anything that can somehow be construed as positive news for the PS5. Also, RDNA 1 is somehow supposed to be relevant when discussing RDNA 2, despite all the evidence from AMD pointing to the performance-per-watt and clock speed advantages of RDNA 2.
 
He basically gets triggered when anybody posts anything that can somehow be construed as positive news for the PS5. Also, RDNA 1 is somehow supposed to be relevant when discussing RDNA 2, despite all the evidence from AMD pointing to the performance-per-watt and clock speed advantages of RDNA 2.

Yep, his complete lack of knowledge of what he is talking about and his low-effort posts expose an agenda to spread confusion and misinformation.

I mean, christ on a bike, his comment on an efficiency improvement of "5%, 8%, even 10%" going from RDNA1 to RDNA2 was tragic!

 

nochance

Banned
AMD curbstomped Intel the last few years, something nobody thought was ever possible. They basically own every single market on the CPU front other than the highest core-for-core clock performance, and even that is only a tiny gap at this point, which the 4000 series should close.
18% market share is curbstomping now? Now I understand the logic behind the rest of your speculation.

AMD GPUs in the past were also highly competitive with, if not better than, Nvidia's; there is really nothing stopping them from doing exactly that all over again. They have a superior process node under their belt, one that is more mature and probably also cheaper.
AMD was never competitive with Nvidia; ATi was. Efficient, forward-thinking architectures don't exactly grow on trees.
 
Yep, his complete lack of knowledge of what he is talking about and his low-effort posts expose an agenda to spread confusion and misinformation.

I mean, christ on a bike, his comment on an efficiency improvement of "5%, 8%, even 10%" going from RDNA1 to RDNA2 was tragic!

I don't think you understand what you are talking about.

here's an example:
The Radeon VII is a GCN card. It comes in at 1.400 GHz base, 1.850 GHz boost, and can be overclocked to 2.0 GHz.
So, according to what you are trying to imply and say here to ...school me,
an RDNA1 card, the 5700 XT, would go: 2.100 GHz base clock (1400 + 50%), 2.775 GHz boost (1850 + 50%), and could be overclocked to ...3.0 GHz (2000 + 50%).

I don't think I need to tell you how far this is from the truth.

Well, at least you were smart enough to not place a bet with me...
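For what it's worth, here is the rough math on why +50% perf/watt cannot be read as +50% clocks (a sketch reusing the cube rule of thumb from earlier in the thread):

# Cash the whole 1.5x efficiency gain in as clock speed at the same power:
# P_new = (P_old / 1.5) * (f_new / f_old)**3 = P_old  =>  ratio = 1.5 ** (1/3)
clock_gain = 1.5 ** (1 / 3)               # ~1.14, i.e. about +14%, not +50%
print(f"{1.905 * clock_gain:.2f} GHz")    # 5700 XT boost 1.905 GHz -> ~2.18 GHz

Read that way, the implied clock bump lands in the 2.1-2.2 GHz neighborhood, which is roughly where this whole argument lives.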


He basically gets triggered when anybody posts anything that can somehow be constructed as positive news for the PS5. Also, somehow RDNA 1 is relevant when discussing RDNA 2 for some reason, despite all the evidence from AMD pointing out the performance per watt and clock speed advantages of RDNA 2.
I just don't like bullshit.
 
Last edited:

Silver Wattle

Gold Member
I don't think you understand what you are talking about.

here's an example:
The Radeon VII is a GCN card. It comes in at 1.400 GHz base, 1.850 GHz boost, and can be overclocked to 2.0 GHz.
So, according to what you are trying to imply and say here to ...school me,
an RDNA1 card, the 5700 XT, would go: 2.100 GHz base clock (1400 + 50%), 2.775 GHz boost (1850 + 50%), and could be overclocked to ...3.0 GHz (2000 + 50%).

I don't think I need to tell you how far this is from the truth.

Well, at least you were smart enough to not place a bet with me...



I just don't like bullshit.
Saving this post for future lols.
 
While I hope this comes true, I think a 134% increase in performance is wishful thinking. The 80% leap from the 2080 Ti to the 3080 is impressive; 134% is unprecedented. I really hope it comes true, but I'll believe it when I see it.

If Big Navi can beat the 2080, but not the 3090, at 2080 prices, then that's something I'd definitely be interested in.

Good times for PC hardware; it feels like 11 years ago, when I last upgraded. Lots of big jumps happening all the time. Can't wait to see what Intel does to smack down AMD in the CPU space now that they've woken the slumbering giant.
 
The video is designed to test whether you can tell the difference between the two in terms of image quality. Nothing more!

He does this all the time even with PS4 Pro vs PC comparisons.

Without more information that video is literally nothing. It tells us nothing.

Is it showing that AMD has a solution that internally renders at 1080p but competes in image quality with DLSS 2.0 in quality mode (1440p) upscaling to 4K? That would show that AMD's solution is amazing.

Or is it showing that AMD's solution, internally rendering at 90% of 4K, looks the same as DLSS in performance mode (1080p)? That would show that AMD's solution is a joke.

All without showing the settings or framerate.

The person who made that video is either trying to mislead people or is a genuine idiot.
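The pixel math shows why the internal resolution is the whole story here (plain arithmetic, no assumptions about either upscaler):

native_4k = 3840 * 2160
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    print(f"{name}: {native_4k / (w * h):.2f}x fewer pixels than native 4K")

An upscaler reconstructing 4K from a quarter of the pixels is doing nearly twice the work of one starting from 1440p, so a blind comparison that hides the inputs says nothing.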
 
Last edited:

Ascend

Member
Has everyone forgotten Intel is also currently making a GPU? That thing is buried before it even sees the light of day now.
No one is expecting Intel to compete at the high end with their first graphics card. Mid range at best.

In other news, RDNA2 IPC increase is supposedly 10%.
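If that 10% figure is real, quick arithmetic (mine, not AMD's) shows how much of the claimed 50% perf/watt gain would be left to come from clocks and power tuning:

perf_per_watt_target = 1.50
ipc_gain = 1.10
print(f"{perf_per_watt_target / ipc_gain:.2f}x")   # ~1.36x still needed from clocks/power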
 

longdi

Banned
I never said it maxes out at 1.7 GHz. I said it cannot reach much higher than 1.7GHz. If you look at the power consumption of the cards, it's quite obvious.


Nvidia's own testing of a stock 3080 has it running at 1.92 GHz on average 🤷‍♀️ :messenger_savoring:

Obviously you are just wrong. Probably a console peasant who has never played with an Nvidia GPU.
 
Last edited:

Ascend

Member

Nvidia's own testing of a stock 3080 has it running at 1.92 GHz on average 🤷‍♀️ :messenger_savoring:

Obviously you are just wrong. Probably a console peasant who has never played with an Nvidia GPU.
The 3080 and 3090 cards are rated at a 1.7 GHz boost at 300 W+. I'm not interested in cherry-picked golden samples. I guess we'll see when the reviews come out, which are also not immune to golden samples, btw, but whatever.
 

Ascend

Member
Latest leak (always take leaks with a grain of salt):

Big Navi is supposedly somewhere between the RTX 3070 and RTX 3080 when it runs at 275 W.
If it reaches 300 W, it can match the RTX 3080.
That is with 16 GB of RAM.
 