James Sawyer Ford
Member
Every platform is maxed out. Including PC. I'm at 96% GPU utilization and can't sustain 60FPS @4K on a 3090.
Optimizations and new techniques therefore don't exist, and games can't possibly look better...
Even though I love PlayStation and I am a PS fan, I have to agree on this. GPU rendering techniques existed on PC long before consoles got them, and some PC-exclusive features don't even exist on consoles. But game engines do evolve over time and will get more optimization for consoles; the same goes for PC.
Optimizations do exist. But I've never heard of a game being "so" optimized that you can start adding more features to it because your rendering budget was suddenly BELOW what it was before them. In other words, have you ever had a development company tweet saying they made an incredible optimization that allowed them to use 8K textures instead of 4K textures running at the same FPS? That's the silliness that you guys are proposing.
New techniques do exist, always done on PC first and then adopted by the consoles. PBR shaders, HBAO, HDAO, SSR, volumetric light shafts, PhysX smoke, RT, etc.: all there for everyone to witness.
I'm not even speaking of new techniques. I'm just speaking of ways to reduce cycle time, through better programming/memory management, that allow you to place more on the screen (regardless of whether the technique is "new").
See the tweet an ND dev made about the PS5 hardware, using the PS3 as a perfect example. I'm sure Uncharted 1 was using 99% of the GPU; in fact that game suffered from screen tearing. But compare it to TLOU and it is a night-and-day difference in what they were able to put on the screen in terms of overall quality. They got better at the hardware, which resulted in huge efficiency gains.
This happens every gen but you seem to be on a crusade to deny it exists
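The efficiency-gains claim above is, at bottom, frame-budget arithmetic: at a fixed framerate, every millisecond an optimization saves is a millisecond available for drawing more. A toy Python sketch (every number here is invented for illustration, not taken from any real engine):

```python
# Toy frame-budget arithmetic: a fixed framerate gives you a fixed
# millisecond budget per frame, so cheaper work means more work fits.

FRAME_BUDGET_MS = 1000 / 30    # 30 fps target -> ~33.3 ms per frame
COST_PER_OBJECT_MS = 0.02      # hypothetical average cost to draw one object

# Launch-window engine: baseline number of objects that fit in the budget.
baseline_objects = int(FRAME_BUDGET_MS / COST_PER_OBJECT_MS)

# Later in the gen: suppose better memory layout / scheduling shaves 25%
# off the per-object cost. Same hardware, same budget, more on screen.
optimized_cost = COST_PER_OBJECT_MS * 0.75
optimized_objects = int(FRAME_BUDGET_MS / optimized_cost)

print(baseline_objects)   # 1666
print(optimized_objects)  # 2222
```

Whether real optimizations deliver 25% is exactly what the thread is arguing about; the sketch only shows that if they do, the extra budget is real.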
Sort of like throwing the keys of a Formula 1 car to your grandma and watching her max out the machine at Monaco.
Ignoring talent, experience & technique in driving hardware is a child’s depth of thinking.
You can't possibly give a real-world example. This just isn't true. The "more on screen" is completely dependent on GPU bandwidth. Optimization is more about hitting that FPS budget, because what you do have is slowing it down.
PS3 was a very very hard platform to develop for. I know they struggled with that hardware for years. That's NOT the same as it is now. PS5 is basically a faster PS4. Same architecture in AMD with a basic x86 CPU instruction set.
The problem with you guys is you assume things stay the same even though life around you is changing. The PS3 was a radical architecture and took time to master. The PS4 was more PC-friendly which developers could master rather quickly. The PS5 is just like the PS4, so not much to master other than RT. Why stick to something you know was a different situation and apply it to every single generation from here on out? Life doesn't work that way so why try to MAKE it so?
This discussion has no merit. Poor software can run away maxing RAM and CPU, leak memory, etc. So what is the proverbial point of such a useless conversation? It's misleading and has no basis in any coherent conversation.

It just means that the hardware is maxed out relative to the software that's pushing it. There's nothing more the hardware can do, given the way it's being utilized, to make the game run faster or at higher resolutions. If that weren't the case then the engine wouldn't have to constantly lower the resolution to maintain its framerate. That's not to say that AnvilNext is representative of the full potential of the PS5 hardware, but rather of the full potential of the current iteration of the AnvilNext engine on the PS5.
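The resolution-lowering behavior described here is dynamic resolution scaling. A minimal sketch of such a controller, with invented thresholds and step sizes (real engines use far more sophisticated heuristics):

```python
# Minimal dynamic-resolution controller: when the last frame ran over
# budget, drop the render scale; when there is clear headroom, creep
# back up. All constants are invented for illustration.

FRAME_BUDGET_MS = 1000 / 60   # 60 fps target -> ~16.7 ms per frame

def update_render_scale(scale, last_frame_ms,
                        lo=0.70, hi=1.00, step=0.05):
    """Return the render scale for the next frame (fraction of native res)."""
    if last_frame_ms > FRAME_BUDGET_MS:          # over budget: shed pixels
        scale -= step
    elif last_frame_ms < FRAME_BUDGET_MS * 0.9:  # headroom: add pixels back
        scale += step
    return max(lo, min(hi, scale))               # clamp to the allowed range

# A GPU-bound stretch forces the scale down; it recovers afterwards.
scale = 1.0
for frame_ms in [15.0, 18.0, 19.0, 18.5, 14.0, 13.5, 13.0]:
    scale = update_render_scale(scale, frame_ms)
    print(round(scale, 2))   # 1.0, 0.95, 0.9, 0.85, 0.9, 0.95, 1.0
```

The point of the sketch: a DRS engine sitting at its minimum scale is, by construction, giving the hardware everything the current code can give it.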
Prove me wrong, dude. Otherwise you have no leg to stand on. All of you are declaring ghosts exist, and I'm constantly trying to find one where there is none. Then you throw the PS3's completely radical architecture out as an example that clearly took time to master, but then you apply that same "radical architecture requires mastering" to the PS4 and then the PS5, when both are pretty much home-grown PCs? The fact that you can't show any sort of "mastery" of the hardware within the PS4 generation is telling. You only get 1 or 2 games to prove it from the same company before the next generation comes. Choose your company wisely and hope they release a 2nd game within the generation.
PS5 has some customizations that have not been fully utilized on launch day.
The speed of the I/O in conjunction with things like cache scrubbers and potentially some unified cache on the CPU isn't something that just suddenly works to all of its potential.
The examples were given, accompanied by the developers of those examples saying that the PS5 also has tons more potential once they get to use it more. The one plugging their ears and ignoring reality is you. Instead of pretending to be a professional, try listening to them some more, I guess.
A 1080ti at 1440p with similar settings dips a lot more frequently, and lower, than the PS5.
And many of these scenes the ps5 is scaling above 1440p and holding closer to 60fps, if not locked most of the time
The 4K quality mode comparison argument is sort of a bad one. Of course neither could do 4K 60fps, so we're stuck at a locked 30fps on PS5, unable to see how far above that it can go. There's a ton of room for power comparison between 30 and 60.
PS5 is performing better than a 2060S by at least 20% in this title; that was Alex's ballpark before release. He is guessing on clouds, when it's quite clear PS5 has the best settings there. He is benchmarking at a straight 1440p on PC, when PS5 fluctuates from a 1440p minimum up to 2160p as per GPU load. So as much as I like the video for what it shows, and he did good work there, PS5 is still not accurately represented here. It's even more performant than what we see in this video, based on the DR, one higher setting than PC, and mostly PC's top settings otherwise. Curiously enough, PS5 was even more performant in that scene where it fell to 51fps prior to the patch.
So overall PS5's performance is maybe at least on par with a 2080ti on rasterization, but this is only a cross-gen game which is not even using the advantages/strengths of the PS5 per se. So as this gen progresses, more videos like this would be interesting...
Sorry but no. The PS5 does NOT match a 2080ti in this game. What it does is match a 1080ti. A bit of difference there.

My thoughts exactly. In this game, because of the res used, PS5 should be a good match for a 2080 Ti.
Name something in the graphics pipeline that hasn't been utilized that will make AC:Valhalla run at native 4k/60FPS instead of dynamic res?
Let's go to Spiderman MM. Name something in the RT architecture that Insomniac hasn't mastered yet, that should get them even better reflections along with RT shadows, GI, and Area lights for the next and last Spiderman of this generation?
How does this translate to better visuals? I'm looking for 8k textures instead of 4k. I'm looking for several shadow casting light sources instead of 1 like in Demon Souls. How will that enhance the RT hardware to support more than just 1 RT feature? Will we see more tessellation than what has ever been seen before in any game even on PC? At what FPS will that game run at? What resolution? Where is the RT GI from Crysis Remake in a game @ 4k/30FPS with massive foliage? Will we see that kind of RT in Horizon: Forbidden West or will we have to wait for their last game of the generation before then?
I'm looking for these enhancements because I know that this is what you guys are trying to imply, despite it never existing last generation at all.
But it did exist last gen even though you claim otherwise. Games later in the gen looked demonstrably better.
It seems you cannot distance yourself from the idea that if a technique is used throughout the generation, then the games ALL LOOK OF SIMILAR quality. This, of course, is a false notion.
You don't need a difference of 4K vs 8K textures to make a difference. And again, I bring up UE5 which hasn't had any game launch on it yet (because it's not out), and it's already proven to run with visuals that surpass what we have seen for released games.
There was nothing technically improved due to lack of mastering the hardware dude. Just because a game's art and overall scope changes doesn't mean that you can throw a label on that team to say they mastered the hardware. You don't even know anyone there and they have never made such comments in public.
I never said that. Go back and reread what I said. A game can look good but you have nothing to judge it against. You can't judge KZ:SF to Horizon and declare "they mastered the hardware because Horizon looks better than KZ:SF!!" That's completely a gamer perspective and nowhere near a developer that knows what's going on in the graphics engines' perspective.
UE5 is a standard graphics engine for MULTIPLE platforms. You can't use that because that doesn't help your claim one bit. It's not a company that can't master the PS5 and suddenly mastered it to make the UE5 demo. They have a demo that can run on ALL platforms. They just used the PS5 to showcase it.
You would be hard pressed to find a SINGLE DEV that would suggest that their games do not look better over time as they optimize across the generation.
And just because UE5 runs on multiple platforms DOES NOTHING against my argument. EVEN UE4 games throughout the gen started looking better. Compare UE4 early vs later, there are improvements.
So, why are some ppl only focusing on the 5700XT when this was done using a CPU way better than what's in the PS5?

Oh, isn't that more capable than the Zen 2 at 3.5GHz in the PS5?
Well, this also goes with my question. This article is using a way better CPU, but for some reason ppl are focusing on 5700XT results.

I find this post to be somewhat disingenuous.
Of all the vids you could post yours maxes out at 720p. Everything is set to Ultra where that is definitely NOT the case for the PS5 version, and this is a 1080ti paired with a CPU from 2013 and his specs say he's still rocking slow ass DDR3 RAM.
I own a 1080ti with a modern CPU (3900X) and proper 3600cl16 DDR4 RAM to go along with it. I can lock out a 99% solid 60fps with my 1080ti at 1440p no problem and can manage a bit higher. 4K with resolution scaling set to .7 or maybe .8 for example and still hold that 60fps the vast majority of the time.
There is nothing to indicate that the PS5 offers performance anything above a 2080/1080ti ( a 4 year old GPU ). And this has been a BEST CASE SCENARIO so far with comparing PS5 to PC performance. Usually the PS5 performs worse than this. It's just that Valhalla is an outlier where RDNA 2 performs better on average against Nvidia. In the other 90% of cases, Nvidia GPUs perform better.
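For scale, the raw pixel counts behind the resolutions in this comparison can be worked out directly. A sketch, assuming the .7 slider scales each axis (actual performance obviously also depends on settings and scene load):

```python
# Pixel-count arithmetic for the resolutions being compared. A resolution
# scale applies per axis, so the pixel count scales with its square.

def pixels(w, h, scale=1.0):
    """Pixels rendered at a given axis scale of a w x h output."""
    return round(w * scale) * round(h * scale)

native_1440p = pixels(2560, 1440)        # 3,686,400 pixels
scaled_4k_07 = pixels(3840, 2160, 0.7)   # 2688 x 1512 = 4,064,256 pixels
native_4k    = pixels(3840, 2160)        # 8,294,400 pixels

print(scaled_4k_07 / native_1440p)   # 1.1025 -> ~10% more pixels than 1440p
print(native_4k / scaled_4k_07)      # ~2.04  -> still half of native 4K
```

So a 1080ti holding 60fps at a 0.7 scale of 4K is pushing only slightly more pixels than it does at native 1440p, which is consistent with the benchmarks described above.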
Bro. In ONE generation, you MIGHT see 2 games from the same company - at MOST! Typically you get ONE shot for an entire generation to display your talent and release a game.
PS4 - Uncharted 4, TLOU2.. that's it. They are very similar in look and rendering tech. Not the LEAP that you guys are trying to claim.
PS4 - Infamous SS and Spiderman. Nothing too dramatic there. Both using the same engine and you can NOT tell that there was any super optimization code going on that allowed Spiderman to be possible.
PS4 - KZ:SF and Horizon and Death Stranding. Same graphics engine. All of them using the same techniques. HZD had new systems in place like procedural foliage, DTOD, cloth physics, etc.. but NOTHING glaring like the lighting, shading, textures, etc..
PS4 - GoW. Only one shot. One game.
All 3rd party games will follow the PC architecture and have graphics engines that are portable to multiple platforms, so we'll go with you guys' assumption that they aren't optimized so we can leave their evolution alone.
There you go.. 4 companies all in one generation. NONE of them have games that prove what you guys are trying to say. Period.
UE5 is agnostic of hardware. Their algorithms are PC-centric. No optimized code I'm finding for the PS5. Should I post some source code of what I'm working on to prove that too?
Bugsnax has framerate drops on PS5, oh no PS5 is maxed out!
They cannot port every UE4 game or demo to Nintendo Switch or a cellphone for a reason. And they cannot port this UE5 demo to just any platform.
The engine is multi platform. But the software you create with the engine doesn't need to work on every platform.
I don't think that they just used the PS5 to showcase it. They used the PS5 because it was the best platform for that showcase.
I have yet to see that flight scene on other systems.
The PS5 is designed to do certain things that are either very hard or impossible to do on other platforms.
I really can't take this seriously. TLOU2 does see noticeable improvements over UC4, and UC4 itself came out fairly late in the gen.
Spider-Man looks significantly better than Infamous SS. Horizon/Death Stranding look significantly better than KZ:SF.
So I don't agree with any of your points. Are they GENERATIONAL leaps? No. But they are leaps nonetheless.
Both graphics engines support the PC (ND engine will soon) so there isn't a fully optimized from scratch graphics engine JUST for a next-gen console. Sorry those days are over.
So since you have absolutely no clue how to counter points, you start to make shit up? Just like the other threads I've seen you in. Do you agree that there is better hardware out there than PS5? Like for instance better processors from AMD and Intel? Or better GPUs from AMD or Nvidia? If you agree, then your response shows an extreme immaturity and a lack of being able to hold a mature discussion. Just because he knows more about these things doesn't give you the right to downplay their experience, especially as he's schooled you multiple times, in multiple threads already. Keep seething bro.

VFXVeteran, man, I don't know why you post so much here. Let me put it in a few single phrases, you can just copy paste this:
1. Graphics on any platform cannot and will never get better, performance improvements don't exist
2. I'm an EXPERT!
3. PS5 is a mediocre PC shitbox, the Geometry Engine is shit and useless, no developers will use it.
4. PS5's SSD is shit and even if it's not, PC is better anyway at some point in time, for a lot of money.
5. PC is the bestest ever, buy one with the latest card and get on my level you pleb.
6. I'm an ExPeRt!
7. Consoles are shit.
8. There's nothing better to add except ray tracing.
9. Sliders are king, even when they don't affect visual quality all that much but it lets everyone know my PC is the best
10. I'm an ExPeRt!
11. Watchdogs is the greatest graphics benchmark even if it has visuals that are made for 8 year old mid range GPUs and mobile CPUs.
12. The best raytracing is the one that can't reflect anything past 10 meters.
13. I'm an ExPERt!
14. Notice my PC build sempai. Baka!
There you go, you can just put this text in a bash script and set up a cron job to launch this every 20 minutes. Voila! I've just optimised your time spent by hours and hours. You are welcome!
The simple fact is that you can't pinpoint, technically, what makes those games look better, so your entire premise is based on a subjective opinion that "the game looks better, therefore the developers must have optimized their techniques (without any objective proof) to make it look better".
Sorry, can't carry on the convo anymore bud. Until you can get one of these developers to tell you that they mastered some form of hardware technique that they couldn't in their previous game and they pushed out significantly better performance out of the card lending to better features, your argument is dead in the water.
Been trying to cancel/not update PS5 Valhalla after hearing it makes the game run worse, but it managed to install somehow... The performance post-patch on PS5 is dogshit, stutter and jitter galore, and tearing is like the Ubi of old all over again... WTF?
It was running a lot better before the patch? How can something like this even happen? I was considering getting AC:V, but now...maybe not
The problems addressed were issues on XSX with bad tearing and stutter. Since that patch, PS5's performance runs like ass when it was fine before with only minor mishaps. It's quite horrific now imo; in a busy town, especially when it rains, it is absolute shit.
The last gen of cards are what people tend to make comparisons with, so it's good to see that the PS5 stands up rather than getting smoked day 1.
But the "Sony wins and no need for PC gaming" crowd is getting a little too lit in here. In a couple years the PS5 will be getting demolished in these comparisons.
Or, in a few days once Cyberpunk drops.
Yes..in a few years. Catch-up.
Digital Foundry is starting to become super hacky lately. Did they seriously pick AC Valhalla to compare PS5 with the RTX 20 series? The game runs like shit even on 30-series cards; as a matter of fact, it gets blown out of the water by AMD hardware by at least 20%.
I felt that they could have tested busy areas and rainy environments to actually show how bad the tearing/stutter and frame drops can be in performance mode post-patch. The game is bound to run fine in a lowly populated town or fighting 4 or 5 guards.
I'm so agitated by the performance drop that I might drop the game and move on.
Do you have the opportunity to put it on YouTube? It may help in generating some blowback.
I understand you’ve been twirling VFXVet's balls like some kind of exotic stress reducer.

Your logic is flawed, just like the weak example you've provided. Your inability to comprehend the fact that the AnvilNext engine is maxing out the PS5 hardware is bizarre.
More proof that you either didn't read what was said or didn't understand it.
It makes me laugh when PC dudes get insecure about the hardware they have, involving themselves in console talk because they want to justify the ridiculous price they probably paid to match a $399/$499 console.

Your CPU cost as much as a PS5, or at least well over half with a deal. I would hope that paired with a top-tier GPU from a few years back it could outpace a $399 console. Hell, you could probably pair an even less capable GPU with the right CPU and RAM setup and get very close to PS5 in some benchmarks.
But we're not in an ideal environment to compare optimized console hardware against PC specs. It's the opposite, actually: COVID-19, the launch window, and the fact that AC and most of the other games have 5-6 versions. Dropping the old PS4 and XB1, and the Pro version, will allow devs to max out what they have. Basically, we're seeing up-res'd last-gen games right now. We're seeing the absolute worst-case scenario for console hardware that thrives on optimization. PC is far less reliant on optimization because of the wide array of specs.
Your brain seems to be dropping frames and tearing.
LMAO, perfect.

You know how in some boss battles there are some annoying mobs coming along for the ride that you have to get rid of before being able to attack the main boss? That's basically you in every thread, cheerleading for the guy.
No need to get so triggered buddy, really, I was just having a bit of fun since he posts the same thing over and over.