<edit> Didn't finish reading the thread yet.
You played Miles Morales in 8 hours on a 3080? Tell us more about your experience!
Really? You're trying to use raytraced shadows in WoW to claim that "ahkchuallly RDNA 2 raytracing scales just as good as Ampere. PS. Nvidia is evil"
Go outside. ALL YOU DO is shill for AMD and their inferior GPUs. Don't you get tired of it?
Seriously. I can't imagine putting this much time and effort into patrolling threads and shilling all day long. As much as I love my computer, I'm not going to go on a tirade for the entirety of my GAF account about how this company is so much better than that company. Speak with your wallet instead of shilling with your keyboard.
With that said, I don't think WoW is a great example of raytracing "dominance". I'm sure there are multitudes of better and more recent examples out there.
Mate, I missed you.
I mostly just haunt the Xbox threads since the speculation phase is behind us for gen 9. LOL
He's admitted awhile back that he doesn't actually buy video cards, his job is apparently to just do this for a living.
Imagine using WoW as a talking point to try and compare the architectures? Imagine being fueled so much by your own ego, that you can only post Positive AMD/Negative Nvidia stuff. So egotistical that you think others feel you are so important, and that we can't put you on ignore because we would miss your posts soooo much.
I simply repeated the results with some additional insight. That you don't like it is your problem.
I'm not interested in your fanboy wars. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually entails the architectures, rather than pointless namecalling and attempts at shaming and discrediting other users for your own grandeur. Maybe you should look in the mirror and spend your own time better, rather than looking at what you think I am doing.
Even better, put me on ignore so you don't have to see my posts. But you can't, can you? I'm flattered that I'm so important to you though. Unfortunately, it's not mutual. Just FYI.
Now let's keep things on-topic.
So in actuality, RDNA2 is scaling equal here to Ampere (and better with minimums) except at the high setting where it falls behind. And even at the high setting, it's not THAT much worse... That is actually quite interesting. nVidia will definitely push to crank RT higher to put their cards in the best light, even when not necessary. I'm getting GameWorks/Tessellation vibes.
The 720p results are a whole other story.
Based on the DF video, RT shadows are a best case for AMD. It's with reflections and/or GI that the gap becomes big.
Yes, this is why Spider-Man: Miles Morales basically only uses RT shadows with very sparing use of 1/4 resolution reflections. That's pretty much what PS5 hardware with RDNA2 can handle. We will not be seeing any real RTX GI on the consoles this gen, for example Metro Exodus with full RTX GI is probably beyond anything these consoles can handle.
It doesn't seem very sparing to me based on the videos. You can see it on the sides of buildings, puddles, cars, polished floors and metallic objects. John from DF said you can see reflections "around every corner" of the game world.
AMD doesn't even hide that they have an army of unpaid shills on forums and social media.
Nah, I don't think so. We will definitely see some form of ray-traced GI on consoles. But those games would have to target 30fps. The UE5 demo already showcases ray-traced real-time dynamic global illumination called Lumen. And it was running on the shaders, i.e. it's a software-based, non-triangle RT bounce lighting. That said, it's a lighter solution compared to the hardware-accelerated RT GI found in Metro Exodus, which is more demanding.
I am not interested in paying over $700 to lower graphics settings for playable framerates.
I never said to ignore it, but to not put a high priority on it.
Take Cyberjunk for example. Raytracing effects are a part of the graphical settings that you can choose to enable. I'd think enthusiasts who buy GPUs for 700+ bucks would want to be able to max out their games.
No, spending money, having no compromise on settings without putting any brain in what you do doesn't make you an enthusiast; just an annoying rich person.
edit: but yes RT will be the difference this gen....Lumen excuse or not.
AMD vs NVIDIA (AMD = dark purple shirt). They keep trying but cannot really even land a punch on NVIDIA.
Not watching, Digital Foundry are biased af and don't even hide it when it comes to AMD vs Nvidia. Not to mention Nvidia sponsors them. Ray tracing is way overblown, we don't have the horsepower to do that yet. I'll gladly take a lower cost AMD build over an Nvidia/Intel one.
So because the "right places" miss the tech to do something correctly, people reporting progress doing it are biased?
That statement is kinda confusing to me.
Take Cyberjunk for example. Raytracing effects are a part of the graphical settings that you can choose to enable. I'd think enthusiasts who buy GPUs for 700+ bucks would want to be able to max out their games.
If you don't enable raytracing, you are already lowering the settings by definition, because it's designed with raytracing in mind and it has huge visual impacts.
Buying an AMD card would therefore mean that you have to lower the settings by a bigger margin in the increasing number of games with raytracing for playable fps compared to Nvidias offerings. And again, devs are optimizing their RT implementations for the currently available cards for playable framerates.
If you want even higher fps, you can only enable RT reflections with lower quality. That's still resulting in better visuals at lower performance cost. Or only use RT lighting. Or disable DLSS and reduce the RT quality if you don't like DLSS for some reason.
That's the thing, you can customize it to your individual liking for the best balance of visuals vs fps. Just like with any other visual setting.
Not trying to attack you, only wanting to wrap my head around your statement:
As I've explained, raytracing is part of the visual settings. And it is important for you to not lower the settings on a 700+ bucks card, meaning you want to use the highest settings, while simultaneously saying that the highest settings don't have a high priority.
That's confusing me.
Or do you somehow exclude raytracing from visual settings? Which in the case of again Cyberpunk for example, would be pretty weird because they are one of the most impactful settings for the visual quality of the game.
RT is there to make the life of developers easier in the long run. And at this point, all consoles have RT. But there is a reason that the games that have it in a noticeable manner, have an option to turn it off.
Wtf does PS5 have to do with this?
Huh. That's like, prime trolling or incredibly stupid.
So PS5 owners can play the great looking Spider-Man with cool RT reflections on hardware that is significantly slower than a 3080.
And on high-end PCs, people are playing stuff with different raytracing effects like Control.
But we don't have the horsepower yet?
You know that devs implement and optimize these effects with target hardware and target performance in mind? And they then adjust the RT settings to run on these cards?
Not to mention the rest of your comment. Jeez, what's wrong with you.
But the AMD build isn't lower priced?
The PS5 is around half as powerful as a 6800 XT, yet is still capable of impressive looking ray tracing effects.
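For what it's worth, the "around half" figure roughly matches a back-of-envelope FP32 comparison. The CU counts and clocks below are public spec-sheet numbers I'm assuming here, not anything stated in the thread, and raw TFLOPS is a crude proxy rather than a benchmark:

```python
# Back-of-envelope FP32 throughput comparison. CU counts and clocks are
# assumed from public spec sheets (PS5: 36 CUs @ ~2.23 GHz; RX 6800 XT:
# 72 CUs @ ~2.25 GHz boost). Raw TFLOPS is a crude proxy, not a benchmark.
def tflops(compute_units, clock_ghz):
    # RDNA2 CU: 64 FP32 ALUs, 2 FLOPs per ALU per clock (FMA) = 128 FLOPs/clock
    return compute_units * clock_ghz * 128 / 1000

ps5 = tflops(36, 2.23)       # ~10.3 TFLOPS
rx6800xt = tflops(72, 2.25)  # ~20.7 TFLOPS
print(round(ps5 / rx6800xt, 2))  # ~0.5, i.e. about half
```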
You know what's confusing to me? That you're arguing that people paying 700+ bucks don't want to lower settings, and that lowering settings is a viable option at the same time.
You can't run RT at max settings in Cyberpunk with playable framerates, even with respectable DLSS settings (which is a fancy resolution-lowering method with AI upscaling), so by default you're lowering settings. That means you're paying 700+ bucks to play with lowered settings.
That means, RT is too large a performance hog to be considered a priority for the majority of people. Especially for the price. The game still looks extremely good without RT. Sure. It does look better with RT. Sure, the Ampere cards are faster. But that doesn't mean much when every current hardware drowns with it enabled.
So either you think lowering settings for a 700+ graphics card is not justifiable, or, that RT has too high a performance impact. You can't have it both ways.
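As a side note on the "resolution-lowering" point: DLSS renders at a reduced internal resolution and upscales with AI. A rough sketch using the commonly cited per-axis scale factors (the factors are my assumption, not something from the thread):

```python
# Rough sketch of DLSS internal render resolutions. The per-axis scale
# factors below are the commonly cited ones and are assumptions here,
# not official specifications.
DLSS_SCALE = {
    "quality": 2 / 3,     # ~66.7% of output resolution per axis
    "balanced": 0.58,     # ~58% per axis
    "performance": 0.5,   # 50% per axis
}

def internal_resolution(width, height, mode):
    """Approximate internal render resolution before AI upscaling."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_resolution(2560, 1440, "quality"))      # 1440p Quality -> (1707, 960)
print(internal_resolution(3840, 2160, "performance"))  # 4K Performance -> (1920, 1080)
```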
Maybe I didn't write it in the best way. People want the maximum visual effects and performance possible for the price. Of course gamers adjust and scale the settings to their target fps, that's why PC gaming is so great.
That, I agree with.
What is a playable framerate for you?
Considering variable refresh rate monitors, 60 fps average is fine, with minimums above 40 fps.
I'm playing Cyberpunk at 1440p with all raytracing maxed at 50 - 70 fps (depending on the scene) with DLSS quality mode. I add a bit of reshade sharpening and clarity, and it looks very clean and extremely close to native.
Same with Control, maxed out with all RT, it runs at 40+ fps even without DLSS at 1440p. While with DLSS obviously way better.
That's fair. Which card? If your card cost more than $500, I would not find that good value. But that's just me.
Curious what your definition of playable fps is.
It's a combination of fps, resolution and price for me. That is one package. For playing at 1080p-ish, I consider anything under $350 reasonable, but preferably sub $250. For playing at 1440p-ish, I consider up to $500-ish reasonable. For playing at 4K, I consider up to $750 reasonable. Anything above that is a waste of money. All need to be 60 fps average.
Those are my standards. It doesn't have to be everyone's. I value performance for the money. All that is achievable with the current cards (if there was actual stock). We just moved to 4K being viable with max settings, after 4K has been around for a long time.
But if you enable RT, everything shifts down two tiers, i.e. the $750 card becomes a $350 performance card, so to speak. And I doubt performance will get better over time for these cards. This is the best RT performance we're going to get from them.
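To make that tier-shift claim concrete, here's a toy illustration using the price tiers stated above. This is purely illustrative of the argument being made, not benchmark data:

```python
# Toy illustration of the "RT shifts everything down two tiers" claim,
# using the price tiers stated earlier in the thread. Purely illustrative.
TIERS = ["1080p ($350)", "1440p ($500)", "4K ($750)"]

def effective_tier(index, rt_enabled, shift=2):
    """With RT on, a card behaves like one `shift` tiers lower (per the claim)."""
    return TIERS[max(0, index - shift)] if rt_enabled else TIERS[index]

print(effective_tier(2, rt_enabled=True))   # the $750 "4K" card acts like the $350 tier
print(effective_tier(2, rt_enabled=False))  # unchanged without RT
```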
Thanks for the clarification. I'm playing with a 3080 and an 8700k at 5GHz.
Yeah. If you want to pay $800 for ray tracing, you can use it at 1080p and to some extent 1440p, so you can argue that RT can be used. Possible it definitely is right now. The whole argument is based on the fact that the performance loss is too big. You will pay with a kidney for it, so to speak. So to me, I can't currently call it a feasible solution considering the prices.
If it's not worth it for you, that's perfectly reasonable. All a matter of budget. I just don't like the general sentiment that raytracing isn't feasible and not possible yet. Because it is, at least for the people like me who are stupid enough to drop 800 bucks on a GPU.
And because of the consoles, I truly believe that we'll see more toned down, cheaper implementations that'll run well on cheaper cards.