We've already seen RT on Minecraft running on an SX at WAY higher quality than anything else on the consoles.
Using Minecraft as an example of 'high quality' should be considered a crime against humanity.
We've already seen RT on Minecraft running on an SX at WAY higher quality than anything else on the consoles.
Nah, don't give this poor soul any attention; he is clearly butthurt. Just ignore the poor dude. As I said before, any mod can PM me and I will give them the receipts.
Using Minecraft as an example of 'high quality' should be considered a crime against humanity.
It’s a night and day difference too
Ghost of Tsushima looks so much better than Infamous SS
The leap from Uncharted to TLOU2 isn’t as big, but it’s still very much present. And UC4 was more of a “mid gen” game anyhow
I think it’s delusional to think we won’t see much better looking games than what we see in the launch window
1080p and barely hitting 30fps too. Path Tracing is 10 years away from AAA games.
Using Minecraft as an example of 'high quality' should be considered a crime against humanity.
I said you won't see more performance and/or features at 0 cost. The fact that you guys can't really come up with several examples from last gen means it's just a wishful dream you have. You HOPE that you'll get more RT features than what's being shown now. You HOPE that you'll get faster FPS at higher resolutions, etc.
Maybe he's a dev, but the way he talks, you'd think he's a dev in some entirely different field.
Well, it's great to know I can dismiss anything you ever say about graphics technology from this point forward, since you clearly have no idea what you're talking about.
Better performance at no or totally indistinguishable cost is something that constantly happens in software when you are working with it for 7 years.
No hope needed, look at any game engine update log and you will see constant patches saying things like "30% more in (X feature)", or often specifically "(X Feature) now runs up to 2x faster on (Y platform) with lower memory utilization"
Because I don't believe he is a developer relevant to this context; otherwise he would have stated it. It's just funny to see you type that after hearing you people say (paraphrasing), "Attack my logic, but trying to downplay/question my qualifications is really low."
Yup, better than what we have seen until now on PS5 and most Xbox Series X games. If the devs are to be believed, this implementation can also run on games with much higher polygon counts; I watched the DF videos in which they explain this. It was path traced, ffs! That's pretty high quality.
We are talking about consoles here. You have never seen a game get a higher frame rate after a patch unless its performance was subpar to begin with.
I think the fact that we are seeing native 4K games at the start of a gen, while still providing that wow factor, should be proof enough that games will look better once devs switch to the UE5 demo's 1440p/30 fps target instead. That's more than twice the GPU power left on the table, and seeing as no one was able to tell the UE5 demo was 1440p until Epic told DF, I think it's obvious that devs will go with 1440p as the gen progresses.
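A quick back-of-the-envelope check of that "more than twice the GPU power" claim (my own arithmetic, not from the thread), under the rough assumption that shading cost scales linearly with pixel count:

```python
# Rough sanity check: if GPU cost scales ~linearly with pixel count,
# native 4K pushes more than twice the pixels of 1440p.
# (A simplifying assumption; real workloads don't scale perfectly.)
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
ratio = pixels_4k / pixels_1440p
print(ratio)  # 2.25, i.e. 4K renders 2.25x the pixels of 1440p
```

Which is where the "more than twice" figure comes from: dropping from native 4K to 1440p leaves roughly 2.25x the per-pixel budget for everything else.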
Wasn't it 1080p at a somewhat low frame rate?
In any case, even on that I'm not sure he's correct. We've already seen RT on Minecraft running on an SX at WAY higher quality than anything else on the consoles.
...ran at around 45fps on Series X at 1080p after only a few weeks of development to implement path tracing and port it over
Wouldn't that translate to around 12fps at 4K?
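The "around 12fps" estimate checks out if you assume frame time grows linearly with pixel count (my own illustration; real path-tracing workloads rarely scale this cleanly, so treat it as a first-order estimate):

```python
# Naive resolution scaling: 4K has 4x the pixels of 1080p,
# so assume each frame takes ~4x as long to render.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
fps_1080p = 45
fps_4k = fps_1080p * pixels_1080p / pixels_4k
print(fps_4k)  # 11.25, i.e. roughly the "around 12fps" figure
```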
First of all, you seriously need to rewrite that, because I'm having to guess what you are attempting to communicate.
I assume you are basically trying to say that there hasn't been a game that has gotten better performance after a patch.
Even if that were true, it doesn't affect what I said at all. You use engines to make games. When an engine like Unreal Engine or a company's internal engine gets updated with something that lets devs ray trace 20% faster or use 10% less memory on particles, that translates into future games having better performance than earlier ones.
Wasn't it 1080p at a somewhat low frame rate?
That's the point: at a low frame rate it doesn't show power or prowess as much as it should.
That's not the point. It's just not true that there's some fixed amount of (for example) RT that the consoles can leverage. It's all relative to the other work the games are doing. So it's not true that the fidelity we're seeing now is something close to the maximum. We don't know how far what we have seen so far is from the maximum potential of the consoles.
There is a set amount, and you explained why: a console's GPU/CPU (its specs, really) is a known quantity in terms of computing power. You can apparently move the needle by making more efficient use of it, doing more of one thing and less of another (as you say), and that is exactly why there is a set amount of any one thing that can be done at any given time.