
Rumor: Radeon 6900 XT to feature 2.4GHz clock speed

regawdless

Banned
Such high clocks make me instantly think about the cooling solution. But I'm confident that at least the AIBs will have no issues there.

Either way, looks like AMD will deliver a good alternative for people who want great performance and are fine with some shortcomings regarding raytracing.
 

Bo_Hazem

Banned


That is just a beast of a clock speed if true. Apparently he has been credible in the past.


All RDNA2 GPUs appear to be 2.2GHz+. Hopefully tech helps GPUs reach the same clocks as CPUs, to get much more out of the same die. Liquid metal + protective housing + liquid cooling? Seems like it could push above 2.5GHz if the logic doesn't get messed up.
 

longdi

Banned
I'm skeptical. The game clocks are supposedly sustained at around 2.4GHz, which means AMD can advertise boost clocks of 2.5~2.7GHz?

Ryzen 5000 had lots of pie-in-the-sky clocks being rumoured.

But the actual clocks announced were only conservatively higher than Ryzen 3000's.

Both were built on the 7nm process.
 

The Skull

Member
Price is all that's going to matter at this point. Sadly, their Ryzen prices indicate they will probably bump it up.

I think they could only do that if they absolutely obliterated Nvidia's performance, which I don't think will happen. Ryzen prices are up because now the CPUs, at least from their presentation, will be the absolute best at everything.

I think Big Navi will be competitive with the 3080 but priced slightly lower due to lacking a DLSS alternative, and then they may have a balls-to-the-wall water-cooled overclocked edition to get within reach of the 3090, perhaps? Either way, competition with Nvidia at the high end again near launch, and not a year later, is a good step for Radeon.
 

psorcerer

Banned
Maybe they can get their hands on a dev kit. Not sure if it could be done on a consumer version. But a dev kit would definitely have this info.

AFAIK, when you get a dev kit you sign an NDA saying that you cannot tell anybody anything. And you cannot even tell anyone that you've signed that NDA...
 

SantaC

Member
I'm skeptical. The game clocks are supposedly sustained at around 2.4GHz, which means AMD can advertise boost clocks of 2.5~2.7GHz?

Ryzen 5000 had lots of pie-in-the-sky clocks being rumoured.

But the actual clocks announced were only conservatively higher than Ryzen 3000's.

Both were built on the 7nm process.
You got it wrong.

Zen 3 was rumored to boost up to 5GHz. The 5950X can actually boost up to 4.9GHz.

It's not like that was far off.
 

PhoenixTank

Member
I'm skeptical. The game clocks are supposedly sustained at around 2.4GHz, which means AMD can advertise boost clocks of 2.5~2.7GHz?

Ryzen 5000 had lots of pie-in-the-sky clocks being rumoured.

But the actual clocks announced were only conservatively higher than Ryzen 3000's.

Both were built on the 7nm process.
Important to note that there is more than one TSMC 7nm process.
 

JimboJones

Member
2.4GHz and less power than the 3080?

That's the question; high clocks won't make a lick of difference if the benchmarks come in and show it performing the same as, or under, competitors' cards with lower clocks.
I just want to see some game benchmarks at this point; this noise is just marketing nonsense for the time being.
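Rough back-of-the-envelope math on that point (a sketch only: the 3080 numbers are public specs, the Navi numbers are the rumored 80 CUs at the rumored 2.4GHz, and peak TFLOPS translate into frames very differently across architectures):

```python
# Peak FP32 TFLOPS = shader count x 2 ops/clock (FMA) x clock in GHz / 1000.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

# RTX 3080: 8704 CUDA cores at a 1.71 GHz boost clock (public specs).
print(f"3080 peak:     {peak_tflops(8704, 1.71):.1f} TFLOPS")  # ~29.8

# Rumored big Navi: 80 CUs x 64 shaders = 5120, at the rumored 2.4 GHz.
print(f"Big Navi peak: {peak_tflops(5120, 2.40):.1f} TFLOPS")  # ~24.6
```

On paper the higher-clocked card still ends up with fewer peak TFLOPS than the lower-clocked one, and even then Ampere's peak numbers famously don't map linearly to fps, so benchmarks really are the only thing that settles it.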
 

Kenpachii

Member
I think they could only do that if they absolutely obliterated Nvidia's performance, which I don't think will happen. Ryzen prices are up because now the CPUs, at least from their presentation, will be the absolute best at everything.

I think Big Navi will be competitive with the 3080 but priced slightly lower due to lacking a DLSS alternative, and then they may have a balls-to-the-wall water-cooled overclocked edition to get within reach of the 3090, perhaps? Either way, competition with Nvidia at the high end again near launch, and not a year later, is a good step for Radeon.

My gamble was that the next Radeon card would push 2x 5700 XT performance, as the 5700 XT didn't really push the limits for AMD, which would result in about 50-60% above a 2080 Ti, aka 3080, performance-wise. This is also the reason I think Nvidia pushed their top core in the 3080. I also think the GPU was planned to launch at 700 bucks, much like their Radeon VII, and compete against Nvidia's top-end GPU as a result.

There is absolutely no reason Nvidia would push these heaters + massive cards and their top core for 700 bucks. It's because AMD is going to drop an actual next-gen card.

Now obviously this could all have changed the moment DLSS 2.0 got released by Nvidia. Because frankly, look at this stuff, a 67% performance increase... it's no joke.

[Image: DLSS 2.0 benchmark chart showing a ~67% performance increase]
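Most of that gain is just from shading fewer pixels internally; here's a quick sketch of the arithmetic (illustrative resolutions, and actual DLSS gains vary per game because not all frame time is pixel-bound):

```python
# Pixels shaded per frame at native 4K vs a 1440p internal resolution
# (roughly what DLSS "Quality" renders at for a 4K output).
native_4k = 3840 * 2160        # 8,294,400 pixels
internal_1440p = 2560 * 1440   # 3,686,400 pixels

print(f"Shading work drops by {native_4k / internal_1440p:.2f}x")  # 2.25x

# If shading were the entire frame budget, fps could rise ~125%.
# Reported gains like +67% are lower because geometry, RT traversal
# and the reconstruction pass itself still take time.
```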


This means AMD could be in a real pickle against Nvidia, as raw GPU performance is no longer going to carry it if they have no DLSS alternative to offer. They will never beat the 3080 or 3090 as a result, and will probably even struggle against a 3070 with DLSS 2.0 active. It straight up pushes them into a budget bracket and a 400 price point. However, they could sidestep this by simply offering more VRAM and only benchmarking demonstration games that do not support DLSS, to give themselves a favorable look.

However, it could explain AMD's memory setup. It could very well be that they are aiming for a cheap price point instead of chasing the top again.

About AMD CPUs:

AMD CPU prices were already up with the 3000-series Ryzens. Intel's CPU prices went up around 60% over the last two generations; AMD's went up probably 40-50%. I am not a big fan of this outcome, and frankly, as AMD follows Intel's suit here, prices are not coming down; both are going up. AMD will do the exact same shit with their GPU department.
 
... This means AMD could be in a real pickle against Nvidia, as raw GPU performance is no longer going to carry it if they have no DLSS alternative to offer. They will never beat the 3080 or 3090 as a result, and will probably even struggle against a 3070 with DLSS 2.0 active. ...

I don't think AMD have to sweat DLSS just yet. DLSS is still something that has to be done on a game-by-game basis, but I'm pretty sure Nvidia are working very hard to change that very soon. Once that happens is when DLSS becomes a genuine threat.

One thing I definitely agree with though is that things will be shifting big time towards other innovations to provide more sensible performance gains outside of raw power and clock increases. Stuff like VRS 3.0 and especially DLSS 2.0 are just the start.

I'm skeptical. The game clocks are supposedly sustained at around 2.4GHz, which means AMD can advertise boost clocks of 2.5~2.7GHz?

Ryzen 5000 had lots of pie-in-the-sky clocks being rumoured.

But the actual clocks announced were only conservatively higher than Ryzen 3000's.

Both were built on the 7nm process.

No, the 2.4 GHz clock is the Boost clock, at least from what I can tell. They are listing these GPU specs at Boost clocks.

The Base clock and Game clock are most likely lower than the numbers being put out, but very likely even the Base clock shouldn't be much lower than 2 GHz, if lower at all.
 
... which would result in about 50-60% above a 2080 Ti, aka 3080, performance-wise.

Actually, the 3080 is around 25-30% faster than the 2080 Ti at 4K, depending on the title. But I get your overall point.
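Spelling out the ratio math (ballpark aggregate numbers, nothing precise): a hypothetical 2x 5700 XT card would indeed land ~50% past a 2080 Ti, but that's only ~18% past a 3080, not "aka 3080":

```python
# Rough 4K relative performance with the 2080 Ti as the baseline (1.00).
# These ratios are ballpark aggregates, not precise figures.
r_2080ti = 1.00
r_5700xt = 0.75             # 5700 XT lands around 75% of a 2080 Ti at 4K
r_big_navi = 2 * r_5700xt   # the "2x 5700 XT" gamble -> ~1.50
r_3080 = 1.27               # 3080 is roughly 25-30% ahead of a 2080 Ti

print(f"vs 2080 Ti: +{(r_big_navi / r_2080ti - 1) * 100:.0f}%")  # +50%
print(f"vs 3080:    +{(r_big_navi / r_3080 - 1) * 100:.0f}%")    # +18%
```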

This means AMD could be in a real pickle against Nvidia, as raw GPU performance is no longer going to carry it if they have no DLSS alternative to offer. They will never beat the 3080 or 3090 as a result, and will probably even struggle against a 3070 with DLSS 2.0 active. It straight up pushes them into a budget bracket and a 400 price point. However, they could sidestep this by simply offering more VRAM and only benchmarking demonstration games that do not support DLSS, to give themselves a favorable look.

I don't think we are quite there yet. DLSS is very impressive technology and a good way to mitigate the extra performance hit required to run ray tracing at 4K. However, it still needs to be implemented by developers on a game-by-game basis; right now there are only around six games that support it. That will obviously increase with time, but I wouldn't say it's the silver bullet that some are hyping it up to be just yet. Raw performance will still matter more for the foreseeable future, and for the 10,000+ titles currently available on Steam, for example, that do not now, and likely never will, support a game-by-game implementation of DLSS.

Now, a DLSS 3.0, for example, that worked at the driver/control-panel level as an on/off toggle? That would be a game changer and would make DLSS the silver bullet that some are preemptively hyping it to be at the moment.

Incidentally, there are some rumours and rumblings about AMD potentially having some kind of upscaling tech of their own to compete with DLSS, perhaps some kind of evolution of FidelityFX or maybe something new altogether. Of course, that could all be nonsense or wishful thinking from AMD fanboys, or even just tech tubers desperate for clicks. AMD haven't said anything about it, so until we see/hear something from them on the 28th, we can only assume they don't have anything as of right now.

However, if they did have some kind of hypothetical upscaling/reconstruction tech that was, say, 80-90% as good as DLSS but could be applied automatically to every game at the driver level as an on/off toggle, then that would certainly be something to see. Chances are they probably don't, but it would be in their best interests to be hard at work on something like that before Nvidia perfects DLSS to the point of working on every game without work from developers.
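As a toy sketch of why a driver-level toggle is plausible for plain spatial upscaling but not for DLSS-style reconstruction (using Pillow's stock resampling as a stand-in; none of this is AMD's or Nvidia's actual tech):

```python
from PIL import Image  # Pillow

def driver_level_upscale(frame: Image.Image, target=(3840, 2160)) -> Image.Image:
    """Generic spatial upscale: needs nothing but the finished frame,
    so a driver could in principle apply it to any game."""
    return frame.resize(target, Image.LANCZOS)

# DLSS-style temporal reconstruction additionally needs engine-side data
# (motion vectors, jittered sample history), which is why it currently
# requires per-game integration.
low_res = Image.new("RGB", (2560, 1440))   # stand-in for a rendered frame
print(driver_level_upscale(low_res).size)  # (3840, 2160)
```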
 

longdi

Banned
No, the 2.4 GHz clock is the Boost clock, at least from what I can tell. They are listing these GPU specs at Boost clocks.

The Base clock and Game clock are most likely lower than the numbers being put out, but very likely even the Base clock shouldn't be much lower than 2 GHz, if lower at all.

2.4GHz boost clocks would be more reasonable.
AMD boost clocks are unlike Nvidia's when it comes to sustaining them.
 
Did you see the latest rumors...
Guess there are no magic bullets to hit that 2.4GHz game clock on TSMC 7nm...

I believe the current rumours this thread is based on mention a 2.4GHz game clock on an OC AIB model, not the reference design.

Of course, there could be a mix-up and it might actually be the boost clock instead. Hard to know; only around a week now until we find out for sure with the official reveal. Exciting times!
 

thelastword

Banned
All this hype for DLSS is similar to all the hype DLSS 1.0 had. It means nothing. How many DLSS games are there so far? How many RTX games? People are so concerned about ray-tracing performance on RDNA 2, yet everybody is saying AMD's RT performance is better than Turing's, just not better than Ampere's, as if Ampere's RT performance is so much better than Turing's anyway.

In any case, we have not seen Navi's RT performance yet, except in PS5 titles, which have shown some impressive RT performance. Higher-end AMD cards with more VRAM, CUs and cache should do even better. And of course, everybody was saying that Navi would barely measure up to a 3070 in normal rasterized games, but now the goalposts have shifted to 1080p DLSS games to boost RT performance in a handful of titles. If AMD is competitive against the 3080 and 3090 in non-raytraced games, that's suddenly no longer important, a far cry from AMD targeting the 3070 and below... Now, those six RT games at 1080p DLSS to boost performance are all the rage; they're the only important metric...

Because guess what... NV is the only one in town that can do image reconstruction, that can do resolution scaling and AI reconstruction. How could AMD survive and combat that...? AMD is in so much trouble, cause DLSS... :rolleyes:
 

regawdless

Banned
In any case, we have not seen Navi's RT performance yet, except in PS5 titles, which have shown some impressive RT performance.

Huh. Sorry for picking out that line, but can you show me examples of impressive RT performance on the PS5?
Everything I've seen so far has been heavily downscaled reflections and very spotty RT lighting lite (Spider-Man, Ratchet & Clank, etc.).
Looks like they have a rather limited RT budget so far. Of course, we're judging from very early launch games, which is not saying much.
 

PhoenixTank

Member
Did you see the latest rumors...
Guess there are no magic bullets to hit that 2.4GHz game clock on TSMC 7nm...
Nope. My point was that they are not the same process, and comparing the resulting CPU clock increases on N7 to potential GPU clock increases on N7P gives you nothing of worth.
 