
Nvidia bans HardwareUnboxed from receiving review samples

NVIDIA.jpg

-Hey Linus (great name, by the way), we want you to review this new card we made. We designed it to be really, really good at RTX and DLSS, so could you, like, talk about that?

Dor3TSJUUAAouRu.jpg

-What did you just say about me? Stop trying to control me, you fucking mafia nazi scum!!1!


NVIDIA.jpg

-But we will give you the card for free.

maxresdefault.jpg

-Hi y'all!!! Ready to see the best card ever made? :D
He definitely knows how to piss off a crowd and reel them back in for the views. Positive or negative, it's all about the views. He's done it with consoles, PC hardware, Nvidia, AMD, etc.

If I were AMD, I would stop giving free shit to reviewers that only do RT benchmarks, because it would ultimately make them look bad, especially on a channel with a big following. But of course the hate bandwagon only allows people to show their bias.
 

IntentionalPun

Ask me about my wife's perfect butthole
The companies should work to get good reviews from the critics. It should definitely not be the case that the critics have to adapt to appease the companies. That defeats the whole purpose of reviews. If you cave in and fall in the second category, you're just a shill.

My point was that critics helped create this scenario by prioritizing getting reviews out early, which makes them depend on companies for pre-release samples. They could all break this dependency if they wanted to, but they don't; they just want to bitch about it when they don't get sent something.

They aren't a bunch of ethical pillars of light; they are people trying to make money while participating in being bribed.
 

Rentahamster

Rodent Whores
Yeah, sure, okay. Your argument is totally convincing and I am going to retreat back into my Nvidia bunker.
Do you really think that if you showed Linus that post of yours and the point you're trying to make, he'd say, "Yeah I agree 100%! That really represented the main thrust of my argument perfectly!"?
 

bargeparty

Member
I saw the Cyberpunk video HU made. Let's assume it was planned from the start. As someone who only watches the videos and doesn't follow their social media, I think that if Steve had put a statement in the 3060 Ti video that Cyberpunk would be tested separately, blah blah, that could have made a difference in people's perception.
 

longdi

Banned
The issue with this HU saga is that HU is hardly 100% objective; they have an agenda themselves, with sneaky commentary to tilt their reviews toward their preference.

It's the pot calling the kettle black, and they're using the internet to rage against Nvidia.

Here is their written Ryzen 3000 review.

Whenever Intel wins, they mention the price difference and use forgiving words about AMD's loss.
Whenever AMD wins, they forget the price difference and use big terms.
At the same time, they're shilling for MSI, an AIB with questionable ethics!

Deal or No Deal? What We Learned
With all of that data out of the way, I’ll share some of my opinions. Having reviewed the i9-10900K and i7-10700K recently, we have a good idea of how those models fit into the current CPU landscape. In short, both are objectively good, but they’re also niche products.

The Core i9-10900K is unquestionably the world's fastest gaming processor, but its nearest rivals are within spitting distance and happen to offer a number of advantages. Because games don't benefit in a meaningful way from 10 cores in 2020, the 10700K is just as good, and by extension so is the outgoing 9900K. Meanwhile, the Ryzen 9 3900X will mostly deliver a gaming experience that is indistinguishable from the Core i9's, but there's a lower asking price, less power draw, and it's also much more powerful for productivity tasks.

The Core i7-10700K matches the price of the 3900X and that makes it a better value choice for gamers when compared to the 10900K, but it falls well short of Ryzen when it comes to productivity performance and performance per watt. In summary, the 10700K is either on par with the 3700X for a massive 40% price premium or much slower than the 3900X for the same price.

On the other hand, the Core i5-10600K has the potential to deliver 10700K and 10900K-like performance at an even cheaper price point. So for those seeking strong gaming performance and good value, the 10600K is an appealing option. When overclocked it can match the $400 and $500 parts, or at least get so close it doesn’t matter. That in my opinion makes it Intel’s best value high-end gaming CPU.

For those of you exclusively gaming, I can see how the Core i5-10600K makes sense. I should point out that under realistic gaming conditions, it's ~6% faster than the 3700X, as seen in our 1080p data with the RTX 2080 Ti across 7 games. That margin is reduced to just 3% at 1440p. That being the case, I can also see how buying a processor with 25% fewer cores for roughly the same price might not be the wisest of investments moving forward. But if you're playing games such as Fortnite or PUBG, and you're using competitive settings, then you'll see double-digit gains with the 10600K.

In short, if you’re after a powerful gaming CPU but don’t care about entering the realm of diminishing returns, then the Core i5-10600K is the part to get. However, if you’re after something a little more well-rounded it’s hard to pass on the Ryzen 7 3700X with its ~30% better productivity performance.

If you're seeking maximum value -- for both gaming and application tasks -- then it's very difficult to ignore the Ryzen 5 3600 at just $175. That's an incredible deal which sees the 10600K cost 60% more while offering a minor 6% speed bump in 1080p gaming.

But let's not stop at the CPU. The Core i5-10600K is even more expensive than that as it doesn’t come with a cooler and decent Z490 boards are more expensive than a good quality B450 or upcoming B550 motherboard. The cooler issue can be solved for just $30, but when you add in the price of the motherboard it starts to get more expensive. For example the MSI B450 Tomahawk Max costs $115 and supports the Ryzen 9 3950X, whereas the cheapest Z490 boards we’d recommend investing in cost around $190, like MSI’s Z490 Tomahawk.

Factor in $30 for an affordable tower cooler and $75 more for a decent Z490 motherboard, and you're at a little over $100 in additional costs to support the 10600K. If you plan to overclock to at least 5 GHz, you can double the price of the cooler. Factoring all that in, you're looking at having to pay at least $500 for the 10600K with a decent entry-level Z490 motherboard and a budget tower cooler. The Ryzen 5 3600 will set you back $290 with something like the MSI B450 Tomahawk Max. Coincidentally, that's about what you'd pay for just the 10600K.
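To put the platform-cost arithmetic from that review in one place, here's a rough sketch. Prices are the ones quoted above; the ~$280 10600K price is inferred from the "60% more than the $175 Ryzen 5 3600" line, so treat that figure as an assumption rather than a quote.

```python
# Rough platform-cost sketch using the prices quoted in the review above.
# The ~$280 Core i5-10600K price is inferred from the "60% more than the
# $175 Ryzen 5 3600" line, so it is an assumption, not a listed price.
intel_platform = {
    "Core i5-10600K (inferred)": 280,
    "MSI Z490 Tomahawk": 190,
    "Budget tower cooler": 30,
}
amd_platform = {
    "Ryzen 5 3600": 175,
    "MSI B450 Tomahawk Max": 115,  # stock cooler included, no extra line item
}

intel_total = sum(intel_platform.values())  # ~$500, matching the review
amd_total = sum(amd_platform.values())      # ~$290, matching the review

print(f"Intel platform: ${intel_total}, AMD platform: ${amd_total}")
print(f"Platform premium: {intel_total / amd_total - 1:.0%} for ~6% more fps at 1080p")
```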
 
The amount of NV bootlicking here would make a dominatrix moist. If you like leather that much, find a healthier outlet than pledging loyalty to NV, because it seems to have broken your critical thinking skills.
If you'd be confident saying that if roles were reversed, then I applaud you. If not, then you are an AMD bootlicker. This isn't about Nvidia vs AMD, but the obvious anti raytracing rhetoric. "Raytracing isn't important because only Nvidia has it, so let's disregard it completely... Now that AMD supports it, but isn't anywhere as good as Nvidia, or on consoles, let's still call it irrelevant"

That's the issue at hand, as RT is now going to be implemented in not just a few games, but many games going forward. Because his favorite hardware team isn't in first place, even though they support it, he still disregards it. Although, if they had better performance, he'd be white knighting it to the moon.

If you can only see it as green vs red at this point, that's your problem.
 

Rentahamster

Rodent Whores
"Raytracing isn't important because only Nvidia has it, so let's disregard it completely
That wasn't the position of Hardware Unboxed.

Now that AMD supports it, but isn't anywhere as good as Nvidia, or on consoles, let's still call it irrelevant"
Neither was that.

They actually elaborate on their thought process in multiple videos.
 

Marlenus

Member
If you'd be confident saying that if roles were reversed, then I applaud you. If not, then you are an AMD bootlicker. This isn't about Nvidia vs AMD, but the obvious anti raytracing rhetoric. "Raytracing isn't important because only Nvidia has it, so let's disregard it completely... Now that AMD supports it, but isn't anywhere as good as Nvidia, or on consoles, let's still call it irrelevant"

That's the issue at hand, as RT is now going to be implemented in not just a few games, but many games going forward. Because his favorite hardware team isn't in first place, even though they support it, he still disregards it. Although, if they had better performance, he'd be white knighting it to the moon.

If you can only see it as green vs red at this point, that's your problem.

What a load of nonsense.

He does not focus on it for 2 main reasons.

1) It is not widely used yet. This will change in the future, and I am sure that as it does, more focus will go to RT performance.
2) The performance hit for the visual fidelity gain is out of whack on current hardware. This will also improve over time, but current hardware is not good enough to get the best out of RT yet.

If you look back at GPU history, we are at a point analogous to the pre-9700 Pro era for AF. Before the 9700 Pro, turning AF on was really expensive and the visual fidelity gain was not worth the performance cost. Then the 9700 Pro came out, turning it on became a no-brainer because there was little cost, and so it became worthwhile.

RDNA2 and Ampere are not at that level. I doubt RDNA3 / Hopper will be but they will be closer.

DLSS is another good tech but that will go the way of Glide and be superseded by DX ML upscaling and the Vulkan implementation. With consoles being RDNA2 this will probably happen faster than Glide dying because once it is available devs will 100% use it to hit '4k' on consoles.
 
Why would that be something not to get annoyed at, especially when some of these channels blur the line between review and sponsored video, then lash out at the very company that gives them their material, comparing it to the Mafia and Nazis (Linus), when it threatens to withhold the goods over perceived mistreatment?

There are no heroes in this story, though some people desperately want one.

Without these people you wouldn't have a way to drool over the nonexistent (in stores) RTX 3000 lineup, unless you wanna 100% trust Nvidia's marketing.
However, I do agree reviewers should buy their own shit, even though it wouldn't really be feasible for most of them, except Linus and a few others.
 

jigglet

Banned
they just straight up disregarded DLSS and raytracing.

Agree about ray tracing, but you’ve misrepresented their stance on DLSS. They said they don’t want to do benchmarking using DLSS which is 100% the correct thing to do. Otherwise it’s impossible to compare apples and apples. I tried recently to find benchmarking videos for R6 Siege and it’s impossible as everyone uses render scaling so one benchmark was using 50%, another 80% etc. So I found myself looking through dozens of videos only to find maybe 1/10 who knew what they were doing. The rest were utterly useless.

What happens 12 months from now when there's DLSS3.0? And some games support both? What level scaling do you use? Maybe some games can benefit from 50% scaling while others don't look so great if you go below 70%? The moment you introduce DLSS into benchmarking it becomes a clusterfuck when trying to compare cards. There's a million videos each comparing completely different things. It's always beneficial to have benchmarking performed on native settings.

(sure there's room for DLSS benchmarking videos but it can't be the main benchmark / content, and given there's only two guys they need to prioritise the content they produce)
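To illustrate why mixed render-scale settings wreck comparability, here's a minimal sketch of the internal resolutions involved. It assumes the scale slider applies per axis; some games scale total pixel area instead, which only makes the comparison problem worse.

```python
# Minimal sketch: internal render resolution at different render-scale
# settings. Assumes the slider scales each axis; some games scale total
# pixel area instead, which is exactly why mixed-setting benchmarks are
# so hard to compare.
def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    return int(width * scale), int(height * scale)

output_w, output_h = 2560, 1440  # example output resolution
for scale in (0.5, 0.8, 1.0):
    w, h = internal_resolution(output_w, output_h, scale)
    share = (w * h) / (output_w * output_h)
    print(f"{scale:.0%} render scale -> {w}x{h} ({share:.0%} of native pixels)")
# A "1440p" result at 50% scale is really a ~720p workload, so it simply
# isn't comparable to a 100% scale run of the same game.
```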
 

bargeparty

Member
Agree about ray tracing, but you’ve misrepresented their stance on DLSS. They said they don’t want to do benchmarking using DLSS which is 100% the correct thing to do. Otherwise it’s impossible to compare apples and apples. I tried recently to find benchmarking videos for R6 Siege and it’s impossible as everyone uses render scaling so one benchmark was using 50%, another 80% etc. So I found myself looking through dozens of videos only to find maybe 1/10 who knew what they were doing. The rest were utterly useless.

What happens 12 months from now when there's DLSS3.0? And some games support both? What level scaling do you use? Maybe some games can benefit from 50% scaling while others don't look so great if you go below 70%? The moment you introduce DLSS into benchmarking it becomes a clusterfuck when trying to compare cards. There's a million videos each comparing completely different things. It's always beneficial to have benchmarking performed on native settings.

(sure there's room for DLSS benchmarking videos but it can't be the main benchmark / content, and given there's only two guys they need to prioritise the content they produce)

DLSS needs to be included, it's an important feature of the Nvidia cards and a reason why someone might purchase over AMD. You can't just omit it.

I also want to echo what some others have said. Yes RTX is in a limited number of games, but that's increasing and it's also in some big games that a lot of people play. How is it not important to test in those scenarios?
 

jigglet

Banned
DLSS needs to be included, it's an important feature of the Nvidia cards and a reason why someone might purchase over AMD. You can't just omit it.

As I said, sure make a vid about it or talk about it, but it simply cannot be used in the main benchmarking content.
 
Agree about ray tracing, but you’ve misrepresented their stance on DLSS. They said they don’t want to do benchmarking using DLSS which is 100% the correct thing to do. Otherwise it’s impossible to compare apples and apples. I tried recently to find benchmarking videos for R6 Siege and it’s impossible as everyone uses render scaling so one benchmark was using 50%, another 80% etc. So I found myself looking through dozens of videos only to find maybe 1/10 who knew what they were doing. The rest were utterly useless.

What happens 12 months from now when there's DLSS3.0? And some games support both? What level scaling do you use? Maybe some games can benefit from 50% scaling while others don't look so great if you go below 70%? The moment you introduce DLSS into benchmarking it becomes a clusterfuck when trying to compare cards. There's a million videos each comparing completely different things. It's always beneficial to have benchmarking performed on native settings.

(sure there's room for DLSS benchmarking videos but it can't be the main benchmark / content, and given there's only two guys they need to prioritise the content they produce)


You compare DLSS in addition to raster. At this point there are about 30 RT games and 32 or so DLSS games, and some of the biggest games of this year have both. You can't just selectively ignore multiple features that are actually key selling points because you have some sort of bizarre fetish against them that you ridiculously explain away, like Steve does. Call of Duty, World of Warcraft and Cyberpunk are some of the biggest names in the entire industry. You can't just skip crucial features that absolutely make people buy GPUs, just because. Because what? AMD doesn't have them and it's not fair? How is it not fair? Anyone who buys a 2000 or 3000 series card is gonna use them. It's absolutely fair game.

In addition to this, it boggles the mind how these guys call themselves a tech channel and yet aren't interested in new and fresh graphical techniques, especially something like RT. You would expect them to be curious simply from an academic standpoint. Instead, when Steve even bothers to spend 2 minutes showing 2 games with RT, he spends those 2 minutes repeating in every video how it's not worth it, how he can't tell the difference, how the perf impact is too big, how there aren't enough games. It's like he's reading a script for shitting on RT. It's hard to grasp his intense multi-year hatred of this.

They've lost me for good as a viewer for GPUs now. I could never trust a GPU review from them from now on. I don't think any sane person can say that this situation won't form or extend existing biases. Does anyone actually think these guys are made of stone and this won't impact their results at all?

As I said, sure make a vid about it or talk about it, but it simply cannot be used in the main benchmarking content.

Why not? It absolutely can and should. You're omitting crucial information if you ignore it.
 

ethomaz

Banned
As I said, sure make a vid about it or talk about it, but it simply cannot be used in the main benchmarking content.
It can and should be used in benchmark comparisons.

In your graphs you should have the framerates for both native and DLSS resolutions, with picture comparisons of the trade-off.

If you get even better performance without losing image quality, then why not use it?

DLSS has been a reality for years already.
 

jigglet

Banned
some sort of bizarre fetish against them that you ridiculously explain away, like Steve does.

I love how you assume I'm some AMD fanboy. I have an Nvidia GPU right now and my next one will be Nvidia. And I've been going out of my way to tell everyone to avoid AMD simply cause of the clusterfuck of driver issues with their 5000 series cards even 9 months into launch.

Why not? It absolutely can and should. You're omitting crucial information if you ignore it.

As I said it's really hard to do benchmarking like this. Go pick two random setups (e.g. Ryzen 3600 + 2070 Super GPU, i5 10400f + 2080ti) and find me just one benchmark video each of any random game that supports render scaling or DLSS (e.g. R6 Siege). You'll spend ages. Because some vids use 50% render scaling, others use 80%, others use 100%...it takes too long to compare apples with apples. I struggle to find some basic stats to help me figure out what card I want. DLSS IS a great feature you aren't getting my point here. It's fucking brilliant. But it makes it very difficult to figure out what the hell I'm buying, even when comparing between two RTX cards.
 
Hahahahahaha, and he crawls straight back into Nvidia's asshole after they shat him out and flushed it.

What fucking clowns, holy shit.
What's the difference between what Nvidia is doing and when mods here don't let you start threads that are contrary to their personal opinions?
 

Kenpachii

Member
I love how you assume I'm some AMD fanboy. I have an Nvidia GPU right now and my next one will be Nvidia. And I've been going out of my way to tell everyone to avoid AMD simply cause of the clusterfuck of driver issues with their 5000 series cards even 9 months into launch.



As I said it's really hard to do benchmarking like this. Go pick two random setups (e.g. Ryzen 3600 + 2070 Super GPU, i5 10400f + 2080ti) and find me just one benchmark video each of any random game that supports render scaling or DLSS (e.g. R6 Siege). You'll spend ages. Because some vids use 50% render scaling, others use 80%, others use 100%...it takes too long to compare apples with apples. I struggle to find some basic stats to help me figure out what card I want. DLSS IS a great feature you aren't getting my point here. It's fucking brilliant. But it makes it very difficult to figure out what the hell I'm buying, even when comparing between two RTX cards.

Yea man, it's fucking hard to make a chart with DLSS performance... like, how do dead YouTube channels all do it perfectly fine, man? Damn.

6aaef37341f9560ffdf7e6bd262413ee.png


And then look at this.

0185664f2bccd8a3cd2631dc060c1fa2.jpg



2070 goes from 45 to 69 fps. People freaking upgraded their GPUs from a 1080 Ti to a 2080 Ti for just that kind of performance increase and burned 1200 bucks on it a year ago.

Totally not relevant, man. It just pushes your GPU into a league above its own and makes all of GamersNexus' benchmarks completely useless, because he doesn't tell you the full story.
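For what it's worth, here's a quick sketch of the uplift implied by those 45 -> 69 fps numbers; the same pair can be quoted as roughly 53% more frames per second, or roughly 35% less frame time, depending on which baseline you pick.

```python
# Quick check of the uplift implied by the 45 -> 69 fps chart above.
# The same pair of numbers can be quoted two different ways depending
# on the baseline: more frames per second, or less time per frame.
fps_off, fps_on = 45, 69

fps_gain = fps_on / fps_off - 1          # ~0.53 -> roughly 53% more fps
frametime_saved = 1 - fps_off / fps_on   # ~0.35 -> roughly 35% less frame time

print(f"FPS uplift:       {fps_gain:.0%}")
print(f"Frame time saved: {frametime_saved:.0%}")
```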
 
Yea man, it's fucking hard to make a chart with DLSS performance... like, how do dead YouTube channels all do it perfectly fine, man? Damn.

6aaef37341f9560ffdf7e6bd262413ee.png


And then look at this.

0185664f2bccd8a3cd2631dc060c1fa2.jpg



2070 goes from 45 to 69 fps. People freaking upgraded their GPUs from a 1080 Ti to a 2080 Ti for just that kind of performance increase and burned 1200 bucks on it a year ago.

Totally not relevant, man. People will damn sure not use that DLSS option, because fuck logic. Just a 35% performance increase.
Yeah, it doesn't make sense to not talk about a feature just because it's exclusive to that brand. Shouldn't it be the other way around? You should talk about every advantage that specific card has over others. DLSS and RT are a must for people upgrading.
 

ethomaz

Banned
I think it should be used when comparing against itself when not using DLSS. If the competition has no equivalent, you are not comparing anything.
Of course you are.

You are comparing how each card delivers the same game with its own features.

If AMD has some features that nVidia doesn't, then they should be used too.

IMO, benchmark comparisons not using DLSS in games that support it are pretty useless... they basically don't show how well the card can run the game in performance terms.

HU is basically telling you nonsense and you are eating that nonsense up.
 

jigglet

Banned
Yea man, it's fucking hard to make a chart with DLSS performance... like, how do dead YouTube channels all do it perfectly fine, man? Damn.

As I said, I struggled hard to find good R6 benchmarks. Did I just imagine that? No one outlet will have the resources to test literally every single combination of CPU + GPU, so it's something that's spread out over dozens of outlets. It's infuriating when every outlet uses a different render scale setting, making it impossible to compare apples and apples.

I never said DLSS benchmarks were a bad thing, I just said I want native CPU performance to be the main benchmark. Everything else should either be a footnote or a supplementary video. Not all games will have the benefit of 3-4 easy defaults that everyone will use. Lots of games will have percentage sliders, which just fucks things up from a comparability perspective.
 

Kenpachii

Member
As I said, I struggled hard to find good R6 benchmarks. Did I just imagine that? No one outlet will have the resources to test literally every single combination of CPU + GPU, so it's something that's spread out over dozens of outlets. It's infuriating when every outlet uses a different render scale setting, making it impossible to compare apples and apples.

I never said DLSS benchmarks were a bad thing, I just said I want native CPU performance to be the main benchmark. Everything else should either be a footnote or a supplementary video. Not all games will have the benefit of 3-4 easy defaults that everyone will use. Lots of games will have percentage sliders, which just fucks things up from a comparability perspective.

Who the fuck benchmarks at 60% resolution scale? What kind of bootleg benchmarks are you watching? Just type into Google 'Rainbow Six Siege' or whatever plus the GPU, go to Images, and TechPowerUp will most likely show up with a list of performance numbers.
 

jigglet

Banned
Who the fuck benchmarks at 60% resolution scale? What kind of bootleg benchmarks are you watching?

Go to Youtube. Type in any random CPU + GPU setup with "R6 Siege benchmark". Watch every video on the first 2 pages. I shit you not you will struggle to find more than one or two that leave the render scaling at 100%. Don't believe me? Go have a look. Seriously just go try it.
 

ethomaz

Banned
If I say "Game at 4K" and use DLSS, is it 4k?
Yes... both in personal opinion and technically, it is a full 4K image using DLSS.

That is how games achieve 4K nowadays.
And the difference between native 4K and temporal rendering like DLSS is becoming smaller and smaller, to the point that gaming at native 4K basically just means losing performance.

If you get similar IQ with a 50% increase in framerate, why ignore it?

Native 4K is something that no dev or engine is targeting anymore, because it is a waste of resources that could instead be used to increase image quality or graphical fidelity.
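As a rough illustration of the pixel counts behind that argument, here's a sketch using the commonly cited DLSS 2.x scaling factors (Quality at about two-thirds per axis, Performance at one-half per axis); treat the exact factors as assumptions.

```python
# Back-of-the-envelope pixel math for 4K output with DLSS, assuming the
# commonly cited DLSS 2.x scaling factors (Quality ~2/3 per axis,
# Performance 1/2 per axis). The factors are assumptions for illustration.
OUT_W, OUT_H = 3840, 2160
modes = {
    "Native 4K": 1.0,
    "DLSS Quality (assumed 2/3 per axis)": 2 / 3,
    "DLSS Performance (assumed 1/2 per axis)": 0.5,
}

native_pixels = OUT_W * OUT_H
for name, scale in modes.items():
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    print(f"{name}: renders {w}x{h} "
          f"({w * h / native_pixels:.0%} of native pixels), outputs 3840x2160")
```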
 

Kenpachii

Member
Go to Youtube. Type in any random CPU + GPU setup with "R6 Siege benchmark". Watch every video on the first 2 pages. I shit you not you will struggle to find more than one or two that leave the render scaling at 100%. Don't believe me? Go have a look. Seriously just go try it.

This is what I do

2081ca595875d4826c2c5b0d822c883a.png
 

Closer

Member
That is how games achieve 4K nowadays.

Wrong.

And the difference between native 4K and temporal rendering like DLSS is becoming smaller and smaller, to the point that gaming at native 4K basically just means losing performance.

If you get similar IQ with a 50% increase in framerate, why ignore it?

Native 4K is something that no dev or engine is targeting anymore.

I'm not saying you should ignore it; I'm saying you should compare different companies at native, and the same company's features on vs. off. That's it. You get the base performance of a card against the competition, then compare the same card with all features on vs. the same card with all features off, so you can base your preferences around that.
 

jigglet

Banned
This is what I do

2081ca595875d4826c2c5b0d822c883a.png

And those stats are shit cause they don't show you the underlying settings. When you actually run the same benchmarks yourself they rarely line up. Then when you watch the videos where they actually take you into the settings menu and show you what they've used, you realise why...cause so many of these numb nuts apply render scaling. I bet if you looked closely at your screenshot they won't even be in alignment, you'll have the same cards with the "same" settings with pretty different results.
 

ethomaz

Banned
Wut?

Games achieve 4K nowadays using similar techs... that is the reality.

No game or engine developer is focusing on native resolution anymore... they are all working on better techs that deliver better graphics at a lower render resolution.

The trade-off for native 4K was never good to begin with, and today you can play games with better graphics at a non-native 4K that reaches the same clarity as native 4K, plus 50% more framerate.


I'm not saying you should ignore it; I'm saying you should compare different companies at native, and the same company's features on vs. off. That's it. You get the base performance of a card against the competition, then compare the same card with all features on vs. the same card with all features off, so you can base your preferences around that.
Why should you ignore the same image quality results with way better framerate performance in your benchmarks?

That makes no sense.

Card A - Similar IQ 40fps (there is no DLSS)
Card B - Similar IQ 60fps (but using DLSS)

Why should I ignore card B's DLSS? That is nonsense... everybody that bought card B can run the game with DLSS.

I mean 99% of the consumers with card B will use DLSS.
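Here's a minimal sketch of that Card A / Card B comparison as a buyer would actually experience it, using only the fps figures from this post.

```python
# Sketch of the Card A / Card B comparison as the buyer experiences it,
# using only the fps figures from the post above.
card_a_fps = 40  # no DLSS available, native rendering
card_b_fps = 60  # DLSS on, similar image quality per the post

experienced_gap = card_b_fps / card_a_fps - 1
print(f"Card B, as actually used, is {experienced_gap:.0%} faster")  # 50%
# A native-only chart would hide that gap entirely, which is the point
# being argued here.
```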
 

Closer

Member
Games achieve 4K nowadays using similar techs... that is the reality.

No game or engine developer is focusing on native resolution anymore... they are all working on better techs that deliver better graphics at a lower render resolution.

The trade-off for native 4K was never good to begin with, and today you can play games with better graphics at a non-native 4K that reaches the same clarity as native 4K, plus 50% more framerate.

If you are talking about consoles, I agree with that. If you are talking about PC, developers don't target resolutions at all. If you got the power, you render at whatever resolution you want, and they all know it.


Talking about graphics cards + a tiny typing error.
 

ethomaz

Banned
If you are talking about consoles, I agree with that. If you are talking about PC, developers don't target resolutions at all. If you got the power, you render at whatever resolution you want, and they all know it.
PC devs moved on from native 4K years ago... almost all games have better options.

You can still choose to use native 4K and try to brute force it (well, even high-end cards won't be enough for that).
 

Closer

Member
PC devs moved on from native 4K years ago... almost all games have better options.

You can still choose to use native 4K.

I'm not saying it's the better use of the power, but that's the better representation of a product's power against a similar, but different, product.
 

ethomaz

Banned
I'm not saying it's better, but that's the better representation of a product's power against a similar, but different, product.
That is the point... it is not a better representation.

I mean, 99% of the cards that support DLSS will play with DLSS enabled.

So who are the benchmarks where you show results without DLSS aimed at?
 

Kenpachii

Member
And those stats are shit cause they don't show you the underlying settings. When you actually run the same benchmarks yourself they rarely line up. Then when you watch the videos where they actually take you into the settings menu and show you what they've used, you realise why...cause so many of these numb nuts apply render scaling. I bet if you looked closely at your screenshot they won't even be in alignment, you'll have the same cards with the "same" settings with pretty different results.

Dunno, I don't have the game, so no clue how it works in that game. I found 1 YouTube video and 2 benchmarks, and they all come to the same conclusion of 255 fps or something around that. So no clue about it. They probably just click the standard Very High setting or something, bench, and be done with it.
 

Buggy Loop

Member
Yeah, it doesn't make sense to not talk about a feature just because it's exclusive to that brand. Shouldn't it be the other way around? You should talk about every advantage that specific card has over others. DLSS and RT are a must for people upgrading.

Naw, Nvidia just put, what, roughly 50% of the silicon area into RT & DLSS; let's not bother looking into it too much...

AMD's Smart Access Memory though... OH WOW, EXCITING TECH, and I can't wait to see their Super Resolution.

The age of brute forcing your way into rendering is almost over and many reviewers are stubbornly resisting.

Same with VRAM usage: with the paradigm shift in I/O management on consoles and the DirectStorage API coming to PC soon, it's a very archaic view to think that VRAM might be a problem in the future, something that HWUB also tried to use as a boogeyman to recommend the AMD cards. (While they also say they don't think 4K is that common/important and the cards aren't that hot for 4K... bu-but VRAM!?)

Sadly, these cards are ML beasts and there's no real method of testing that in games yet, at least until DirectML starts appearing in games, which will most likely happen soon enough with Xbox Series X supported games.


If I saw a competing card released tomorrow which heavily outperformed the GeForce 3080 in current-gen games, it would actually set my alarm bells ringing, because it would mean that the competing GPU’s silicon has been over-allocated to serving yesterday’s needs.

Let's just say, Nvidia did not triple the tensor ops of these cards compared to Turing only to run DLSS 2.

Nvidia designed the equivalent of a McLaren P1 for the Nürburgring, with straight lines (rasterization) and hard curves (RT & ML) for upcoming games, and then reviewers insist on bringing that McLaren P1 to a dragster straight-line race... GG :messenger_grinning_sweat:
 

Marlenus

Member
That is the point... it is not a better representation.

I mean, 99% of the cards that support DLSS will play with DLSS enabled.

So who are the benchmarks where you show results without DLSS aimed at?

People who want to see raw raster performance, which is a lot of people.

You might as well ask who cares about 720p gaming benchmarks when reviewing a CPU. The point is to test the hardware, because DLSS does not look as good as native 4K. If you are struggling to hit 4K 60 in a game, then using DLSS, where it is available, is probably a better option than turning other settings down, but not always.
 