phil_t98
#SonyToo
> They lie.

Do you have proof of that?
> Yes, it should be faster. The fact the PS5 is so close to the XSX, and the XSX is so close to a PC NVMe drive, says it all.

Lol, okay. Gotcha.
They have rejiggered parts, but not all.
> Do you have proof of that?

Yes, it's in front of us.
> The main bound on what can be on screen (after moving to SSDs) is the GPU and the RAM/VRAM subsystem. The thing most people overlook is that pure data bandwidth is not enough; you have to schedule your tasks in a way that supports peak efficiency. Consider that consoles still don't get 16x anisotropic filtering everywhere, which has been standard on PCs for around two decades, and no, it's not some special procedure: it's a toggle in OpenGL ES (mobile), DirectX and Vulkan alike. It's not some crazy algorithm, it's a setting for the texture units. I know PR is a wonderful thing, but simple real-world scenarios haven't really shown it being properly utilised. Not that it doesn't work, but you have a lot of other constraints, mainly... drumroll... people. Optimisation is not an easy thing to do, and it will stay that way until someone builds a super-advanced profiler, powered by a neural network, that structures everything in a way that allows all these crazy numbers to be utilised. But then again, that's against the manufacturers' interests; you can see how much you're paying for 20-30% extra power.

Good writeup! You sound like you have some experience in game development. Have you been involved in any games we might know of?
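On the point that anisotropic filtering is "just a toggle": here is a toy Python model (my own illustration, not engine code) of why the cost is bounded. The hardware takes extra texture taps only where a surface is viewed at a steep angle, and clamps them to the configured maximum.

```python
import math

def af_taps(aniso_ratio: float, max_aniso: int = 16) -> int:
    """Toy model: texture taps taken by anisotropic filtering.

    aniso_ratio is the ratio of the pixel's texture footprint
    (major axis over minor axis); hardware clamps it to max_aniso.
    """
    return max(1, min(math.ceil(aniso_ratio), max_aniso))

# Head-on surfaces pay nothing extra; only oblique surfaces pay
# more, and the cap bounds the worst case.
print(af_taps(1.0))    # head-on wall: 1 tap, same as trilinear
print(af_taps(6.3))    # oblique floor: 7 taps
print(af_taps(40.0))   # extreme grazing angle: clamped to 16
```

In the real APIs it is indeed a per-sampler setting, e.g. `VkSamplerCreateInfo.maxAnisotropy` in Vulkan or `GL_TEXTURE_MAX_ANISOTROPY_EXT` in OpenGL.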
And again, something from my experience: sound and animations are the biggest offenders on memory. You can put textures onto a surface relatively easily. But damn, it's hard to take a standard dry sound, slap some effects on it, position it in the world and make it sound real. Obviously shaders are expensive, but that's a compute problem. Then again, compute needs fast memory bandwidth, not SSD bandwidth but RAM-class bandwidth. Presumably you have that with GDDR6, good, right? Not really, because GDDR6 (not X) has pretty bad latency, so you have to schedule all your operations: once you fire off one operation, you can't start another until it completes. GDDR6X mitigates this to a certain degree, but so far those chips are power-hungry and expensive. And still, nobody runs a whole system on just the GPU.
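The latency point above can be sketched with a toy queueing model (my own made-up figures, purely illustrative): if memory requests are issued one at a time, latency eats the effective bandwidth, whereas keeping several requests in flight hides it.

```python
def effective_gbps(request_bytes: float, peak_gbps: float,
                   latency_ns: float, in_flight: int = 1) -> float:
    """Toy model of sustained memory bandwidth.

    Each request pays latency_ns before data moves; with in_flight
    outstanding requests, that latency is overlapped across them.
    peak_gbps in GB/s conveniently equals bytes per nanosecond.
    """
    transfer_ns = request_bytes / peak_gbps          # time on the bus
    time_per_request = max(transfer_ns,
                           (latency_ns + transfer_ns) / in_flight)
    return request_bytes / time_per_request

# 2 KiB requests on a 448 GB/s bus with 250 ns latency (illustrative
# numbers, not real GDDR6 timings):
serial = effective_gbps(2048, 448.0, 250.0, in_flight=1)
overlapped = effective_gbps(2048, 448.0, 250.0, in_flight=64)
print(round(serial, 1))      # a small fraction of peak
print(round(overlapped, 1))  # close to peak
```

The qualitative takeaway matches the comment: peak bandwidth only materialises if the workload is scheduled so enough requests overlap.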
Animations are indeed a much-overlooked aspect! And I understand your point about having access to a variety of animations that can be loaded in an instant. My doubt is the fact that these animations still have to be rendered out. Complex animations need complex models to "perform" them. In Hitman, the LOD setting actually reduces the animation update rate of far-away NPCs, not due to I/O constraints, but to save on rendering budget. When we are already close to maxing out the render budget (like in Control), there is not much more an SSD, with all its benefits in terms of access and availability of assets, can do.
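The Hitman-style trick described above, dropping the animation update rate with distance, can be sketched like this (hypothetical thresholds, not IO Interactive's actual values):

```python
def anim_update_interval(distance_m: float) -> int:
    """Frames to wait between animation updates for an NPC.

    Near NPCs animate every frame; distant ones update less often,
    saving CPU/GPU time rather than I/O.
    """
    if distance_m < 15.0:
        return 1   # every frame
    if distance_m < 40.0:
        return 2   # every other frame
    if distance_m < 80.0:
        return 4
    return 8       # barely visible crowd members

def should_update(npc_distance_m: float, frame: int) -> bool:
    return frame % anim_update_interval(npc_distance_m) == 0

# A crowd member 60 m away only pays animation cost on 1 in 4 frames.
updates = sum(should_update(60.0, f) for f in range(120))
print(updates)  # 30
```

This is exactly why the saving lands on the render/CPU budget: the assets were already resident, only the per-frame work shrinks.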
But let's see, maybe some Dev will surprise us, who knows?
> Yes, it's in front of us.

But they said it's different to the last-gen version. You call them liars with no proof other than load times?
Three platforms showing such close loading times proves a common bottleneck among the three. The bespoke I/O is not it, for obvious reasons.
> But they said it's different to the last-gen version. You call them liars with no proof other than load times?

Yes, it's bogged down by last-gen techniques, not fully built for next-gen I/O.
> Good writeup! You sound like you have some experience in game development. Have you been involved in any games we might know of?

Kingdom Come: Deliverance. I wasn't really anything more than a support role, QA, but I was at the meetings where we discussed the challenges. I also helped with some animations for Amanita Design, and with sound loading (that was Flash, so nothing advanced); my high-school teacher is the guy who does animation at Amanita, and we are a small country, or rather, everyone involved is from Prague.
> Limitations exist of course, but please don't use Control as a benchmark. And the point of streaming is that data doesn't need to be resident in RAM for longer than it needs to be. You are only thinking about what's in the frame at a given moment, when this is just as much about what happens next. If you can't swap data fast enough, you get stuck with the same data for longer than you would've liked...

You are talking about scenarios where completely different scenes happen within a few seconds of each other? So then, every few frames, a whole load of new assets needs to be in memory, ready to go. I can think of two such scenes: one is the Cyberpunk montage, the other the dimension-hopping scene from Ratchet & Clank. But these kinds of scenes are not what general gameplay is made of.
> How is this bad PR?

How is badmouthing the RT performance of a console based on... nothing bad PR? Is that really the question?
> Good writeup! You sound like you have some experience in game development. Have you been involved in any games we might know of?

That is a great way to invalidate an argument someone made: just attack the character if you can't say anything, and you win!
Another example of some fine console "optimization" here, directly from Remedy. No guesswork involved.
> That is a great way to invalidate an argument someone made: just attack the character if you can't say anything, and you win!

Huh? But I agreed with him, and he has worked on KCD? No need for fake outrage now.
> Let's take time, relax and appreciate the Series S version. A lot of hard work went into this work of art.

It can look terrible on everything. There is something fundamentally wrong with how the Remedy engine handles light and shadows, which is a shame, because outside of that it's a great game.
> No, that's not what I mean. What I mean is that the pool of animations you can pull from is suddenly much wider, because the latency is low enough. You are not stuck with the same data for seconds, and therefore forced to reuse it constantly. This will be highly liberating for developers.
>
> Stop thinking in terms of what swapping the whole of RAM can do. Ratchet is doing that, right? It's impressive, sure, but as you say, you can't make a whole game around that. That's just one possibility.
>
> On PS5, as an example:
>
> Is there a limit to the amount of animation you can have in a moment? Answer: depends on how large the RAM pool is.
>
> Is there a limit to how diverse a sequence of animations can be? Answer: depends on how much storage you've got.
>
> This will all be made clear in the very near future. The problem is that this part of game production isn't widely talked about; people are usually more focused on lighting and special effects, because that's what you can easily see and what sites like DF focus on.

You are still ignoring the cost of rendering these animations out, in my opinion. But let's wait and see.
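The "wider pool, low latency" argument quoted above can be illustrated with a toy streaming cache (entirely hypothetical, not any console's API): RAM holds only the clips needed soon, and fast storage makes evicting and refetching cheap enough that the usable library is bounded by storage, not RAM.

```python
from collections import OrderedDict

class AnimStreamer:
    """Toy LRU cache of animation clips resident in RAM.

    The full library lives on 'storage' (a plain dict here); only
    recently requested clips stay resident, up to ram_budget entries.
    """
    def __init__(self, storage: dict, ram_budget: int):
        self.storage = storage
        self.ram_budget = ram_budget
        self.resident = OrderedDict()
        self.fetches = 0

    def get(self, clip: str):
        if clip in self.resident:
            self.resident.move_to_end(clip)        # RAM hit
        else:
            self.fetches += 1                      # fast SSD fetch
            self.resident[clip] = self.storage[clip]
            if len(self.resident) > self.ram_budget:
                self.resident.popitem(last=False)  # evict oldest
        return self.resident[clip]

# 1000 clips on storage, only 8 resident at once: the *playable* pool
# is limited by storage size, the *simultaneous* pool by RAM.
library = {f"clip{i}": f"<data {i}>" for i in range(1000)}
s = AnimStreamer(library, ram_budget=8)
for i in [0, 1, 2, 0, 999, 0]:
    s.get(f"clip{i}")
print(len(s.resident), s.fetches)  # 4 4
```

Whether this is "liberating" in practice then hinges on the fetch latency being low enough that a miss doesn't stall the animation system, which is exactly the claim being debated.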
> Let's take time, relax and appreciate the Series S version. A lot of hard work went into this work of art.

Seems they put in the same amount of effort Microsoft put into designing the Series S.
> It can look terrible on everything. There is something fundamentally wrong with how the Remedy engine handles light and shadows, which is a shame, because outside of that it's a great game.

I think it captures reality in all of its beautiful richness.
> We're seeing games at around 3060-or-better performance already, and, as is the case every generation, further optimisation will continue to provide better results on console as time goes on, while those equivalent PC specs fall behind.
>
> Like last gen, where at first a 750 Ti was more than enough to match or exceed the PS4, but was woefully inadequate in the last few years.
>
> But yes, anyone wanting the maximum possible settings isn't getting that with a $400 PS5. If you want the absolute best, you need to pony up the few grand needed for it, and keep doing so each year as those parts become out of date.

That could definitely be the case, but I had a feeling that at least in RT, Nvidia should still be superior for a while, unless DirectML starts to work its magic…
> I think it captures reality in all of its beautiful richness.

Even on the Series X, the standard version was a shimmering mess of temporal artifacts. I have no idea how somebody could think it was a great idea to calculate light that way.
> The whole thing looks and sounds like a half-arsed cash grab. I started playing it on last gen but stopped to wait for this gen; I doubt I'll even bother now.

It's free on PS+.
> The question is, why are you adding the cost? They aren't being rendered at the same time.
>
> Are you adding the cost when it comes to geometry? Why would it be rendering geometry that isn't there anymore? The same goes for animation.

I will fully admit that I don't have the technical knowledge to give you an informed answer. My brain can't process how this might be used in the future.
> Can't remember a gen where consoles could actually stand side by side with high-end PCs at launch. Nothing new here. Consoles are low-budget hardware, after all.

NES, Super Nintendo, PlayStation, PlayStation 2, Xbox 360. To name a few.
> Miles Morales is out there rocking RT, destruction, 4K, particles up the whazoo, higher-detail models, open world... but clowns want Control to be anything other than small tits port.

Low-quality RT, even in Miles Morales. Go play Cyberpunk on a beast PC, or Control or Metro Exodus with full RT effects in action with DLSS. You will forget Miles' RT.
> That pop-in on the PC version!!! Do we know if it's running off an NVMe?

An SSD doesn't always mean no pop-in. Control has issues on PC; it suffers from minor drops here and there. The SX version seems like a direct port of the PC one, since both use a Windows-based OS and DirectX.
> Is there a way to disable step rumble without turning off vibration completely? I find it SO ANNOYING, jeez.

I noticed that too...
Main reason I would want to upgrade, really, if Nvidia drops an actual half-decent upgrade of a card on the market.
Yeah, stick a cube map on it, done.
> As I said earlier, this is a smash-and-grab port. Nothing more.

Pretty much.
> A cubemap only looks good in a game that isn't filled with 80% dynamic objects that you can't include in it.

Disagree; accuracy is secondary to the illusion of a reflective surface, on balance.
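For context on why the cubemap is the cheap fallback being debated here: a static cubemap reflection is just one texture lookup along the reflected view vector. A sketch of the math (the standard reflection formula, not any engine's code):

```python
def reflect(incident, normal):
    """r = i - 2*(i.n)*n: the direction used to index a cubemap."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

# Looking straight down at an upward-facing floor: the reflection
# points straight back up, so the cubemap's +Y face would be sampled.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

Because the cubemap is baked ahead of time, nothing that moves after the bake can appear in it, which is exactly the dynamic-objects complaint; the counter-argument is that a plausible but inaccurate reflection is often good enough.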
Still have no idea what you are on about dude
> Let's take time, relax and appreciate the Series S version. A lot of hard work went into this work of art.

One of the reasons I stopped playing it on the PS4: that shimmering effect plagued the game.
do you have proof of that?
Huge disappointment for me.
Playing both versions on XSX I'm not wowed by the upgrade but I am grateful.
I don't think it was worth waiting for.
Just played for about half an hour on the PS5.
Not really impressed. Looks significantly worse than the game running on my 2070 and the controller support feels undercooked.