Dust-by-Monday
Member
> The more frames you buffer the greater your latency.

Yeah, all those laggy games with vsync. SMH.
Spider-Man and RE:Village have vsync.
> Yeah all those laggy games with vsync. SMH.

And vsync adds input latency.
Is that one of the twins from the Matrix movies?
This can only be explained by sub-900p resolutions. Sorry, but I never saw such bad artifacts at 1080p.
> And vsync adds input latency.

Okay, the thing is, WDL has vsync, but it gets disabled when the action gets heavy. So you're saying there's latency until the frame rate drops and the screen tears? Why would it matter at that point? Just leave vsync on all the time like RE:V.
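To put numbers on the buffering claim in the quote above: each buffered frame holds the image back by one frame-time, which at 60 fps is about 16.7 ms. A back-of-the-envelope sketch (illustrative arithmetic only, not a measurement of any particular game):

```python
# Each buffered frame delays display by one frame-time.
def added_latency_ms(fps: float, buffered_frames: int) -> float:
    frame_time_ms = 1000.0 / fps
    return buffered_frames * frame_time_ms

# Double-buffered vsync at 60 fps holds one extra frame:
print(added_latency_ms(60, 1))  # ~16.7 ms
# Triple buffering can hold up to two:
print(added_latency_ms(60, 2))  # ~33.3 ms
```

Triple buffering trades that extra frame of latency for smoother delivery when the GPU misses vsync, which is the trade-off being argued about here.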
> Holy shit, 0.03%? Is that real? Is that not like within a margin of error?

lol, come on man, be better than this.
Even though it's all embarrassing, I'd say a 10% higher res on Xbox is the winner here.
Your point doesn't stand at all. You literally have a texture filtering issue on XSX on the first page. There is no need to ignore it, and even worse, to SPIN IT.
> Why? It is a "next-gen" console after all. There is no need to ignore it in game comparisons. Then go to DF, NXG and VGTech and tell them not to include the XSS in comparisons. After all, MS said that the difference in games between the XSS and X would only be in resolution. Let us see whether MS was right about that. So far, no.

You're replying to someone who made this cringe af thread and literally takes his Series S to his bed...

(Linked thread: "I LOVE Series S" - www.neogaf.com)

No, seriously...
> Disagree. I'm not seeing the issues you're speaking of. No spin, just telling it as I see it.

You don't have the say in this, so your point doesn't stand. Unless you have the knowledge, the equipment, the games, the experience, and the reputation of VGTech and DF, and you provide your own tests to dispute both VGTech and DF.
> uhm, for the most part it is only a resolution difference tho... it is still weird that DMC5 for example didn't have an RT mode on Series S, because it clearly can handle it as demonstrated by the very game this thread is about.
> until now, differences beyond resolution and texture res are usually due to development issues it seems. the system is definitely capable of doing everything the Series X can do.

It's definitely not capable of doing everything the SX can do. There's a memory limitation, as stated by id Software engine programmers. Remedy Entertainment omitted ray tracing entirely from the Series S version of Control, and their reason was again "hardware limitation".
> I was playing it a few days ago and the quality difference between 30 and 60 fps is H U G E... it literally looks like generic shit once in 60 fps mode...
> Clearly they can do a lot better... and follow Insomniac's approach for both consoles...
> W E A K and a huge letdown...

They should probably have sacrificed more resolution for better FX in perf mode.
> Well, the RTX 3080 in 2020 has ~30 teraflops, so depending on how tech develops over the next 4 years, it's quite possible to have something like an RTX 7060 in 2024 with 40 TF. So who knows; nobody even thought we would have a $699 30 TF GPU in 2020.

At this rate the 7060 will have the same price tag as its part number.
> 40 tflops? We may not even see that in the PS6. We're in diminishing-returns territory with die shrinks and how much performance you can get in a console form factor. Most likely you'll get double the performance with Pro consoles (20-24 tflops), and that should be enough for 4K60 and 120fps at lower resolution.

Yeah, I really get annoyed with these over-the-top wishes for future hardware - and all at $500 too. It's one of the main reasons the console tech thread dragged on forever with completely unrealistic expectations.
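For reference, the teraflop numbers being thrown around come from a simple formula: FP32 TFLOPS = shader cores × 2 ops per clock (fused multiply-add) × clock in GHz. A quick sketch using publicly listed specs (boost/peak clocks, so the results are approximate):

```python
def tflops(shader_cores: int, clock_ghz: float) -> float:
    # FP32 throughput: each shader core retires 2 ops per clock (one FMA)
    return shader_cores * 2 * clock_ghz / 1000

print(round(tflops(8704, 1.71), 1))   # RTX 3080: ~29.8
print(round(tflops(2304, 2.23), 1))   # PS5 GPU (36 CUs): ~10.3
print(round(tflops(3328, 1.825), 1))  # Series X GPU (52 CUs): ~12.1
```

This is why "30 TF" for the 3080 and "10 vs 12 TF" for the consoles keep coming up; the formula says nothing about real-world utilisation, which is the crux of the diminishing-returns point above.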
PS5 has always had a slight FPS advantage in multiplats. It's due to the higher clock speeds.
> It's not weaker, it's different.

lol, this gen Sony fans continue to amaze me... ahahahaah
> Series X CPU: 3.8GHz Custom Zen 2 (constant)
> Series S CPU: 3.6GHz Custom Zen 2 (constant)
> PS5 CPU: 3.5GHz Custom Zen 2 (variable)
> The PS5 does not have a CPU edge in any instance.

He's talking about GPU clocks: the PS5 runs its GPU at 2.2 GHz against the Xbox's 1.8 GHz.
> I am curious though as to why, in this case, performance seems a little more solid on PS5, albeit at the cost of native resolution (both are upscaling anyway, right?). The XSX is more than capable; is there some type of rule that says "go for highest resolution first" over anything else on the XSX? I would think you'd want to balance resolution/performance. But I watched the video, and I can't really tell the difference all that easily up front. I've noticed this exact same result in several other 3rd party titles.

It's the tools. People love memeing about it but it's the only reason. Early cross-gen games built on the new Xbox GDK are having some issues. It's also the reason why Series games struggle with AF sometimes. There's no technical reason the XSX shouldn't have perfect AF, yet here we are. It's gonna get better soon. The GPU compiler has had a lot of work done since launch already.
> Series X CPU: 3.6 GHz (w/ SMT) - 3.8GHz Custom Zen 2 (constant)
> Series S CPU: 3.4 GHz (w/ SMT) - 3.6GHz Custom Zen 2 (constant)
> PS5 CPU: 3.5GHz (w/ SMT) Custom Zen 2 (variable)
> The PS5 does not have a CPU edge in any instance.

On the XSX|S side, take 200 MHz off if you enable Hyper-Threading (are we back to the old pre-launch monster-specs gloating, eh? Making a big deal of constant vs variable while omitting the HT/SMT clock-speed difference?), take a few percent off because the XSX|S run games in a fully virtualised environment, and take a few more percent off as I/O still taxes the CPU a bit more (1/10th of a core or so, optimistic or realistic as that estimate may be).
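The adjustments in the post above are plain arithmetic. Here is a sketch of that reasoning; note the overhead percentage is the poster's own rough estimate, not a measured figure:

```python
# Effective CPU clock per the post above: start from the SMT-enabled clock,
# then shave off an estimated overhead (illustrative only, not measured).
def effective_clock_ghz(smt_clock_ghz: float, overhead_pct: float) -> float:
    return smt_clock_ghz * (1 - overhead_pct / 100)

# Series X with SMT at 3.6 GHz, assuming ~3% combined virtualisation/I-O cost:
print(round(effective_clock_ghz(3.6, 3), 2))  # ~3.49
# PS5 with SMT at 3.5 GHz (variable), with no such deduction claimed:
print(round(effective_clock_ghz(3.5, 0), 2))  # 3.5
```

Under those assumed overheads the gap the spec list implies largely evaporates, which is the poster's point; whether the overheads are actually that large is the contested part.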
> Compare all you want. But warring and comparing are two different things.
> People look for chinks in the armour while warring.
> Well, Series S doesn't wear armour at all, cause it's not here for war.
> The very notion of holding its sub-1080p resolution against it is outdated. It belongs to the PS4/XBOne era.
> This generation is all about 60fps and good upscaling to desired resolutions from a variable-resolution image. We even have games with a 1080p base image upscaled to 4K that look good (Returnal).
> As for doing everything that Series X does - isn't it 1/3 the power? As long as it makes sensible sacrifices (like ray tracing in most games currently) and the image looks sharp (it does at 1080p), it's working well.

People will dump on anything MS does. The system is the cheapest current-gen console on the market and you can't get more performance for less.
> People will dump on anything MS does. The system is the cheapest current gen console on the market and you can't get more performance for less.

Hivebusters shows what the XSS can do in capable hands. It pushes up to 1440p60 and looks gorgeous.
The XSS needs to lean on SFS and VA to address some of the RAM limitations. When it is running cross gen titles it is pretty much just brute forcing those titles to get 60 fps. Since frame rate is king according to some here it is doing what it is supposed to be doing. It is not designed to run games at 4k and it requires more optimization if you aren't going to use its RAM saving features. When we get out of the cross gen period we'll see what it is capable of. Look to MS for titles that show what the system can do.
> Hivebusters shows what the XSS can do in capable hands. Pushes up to 1440p60 and looks gorgeous.

I wonder why the detractors don't use that title to gauge what the console can do? Why focus on a launch Ubisoft title that, despite not having the best graphics, has the best frame rate? It's very interesting.
> People will dump on anything MS does. The system is the cheapest current gen console on the market and you can't get more performance for less.

Always the MS victim card. No, people do not dump on the XSX, as it is a great console.
> I wonder why the detractors don't use that title to gauge what the console can do? Why focus on a launch Ubisoft title that despite not having the best graphics has the best frame rate? It's very interesting.

You know why. As long as ethomaz-tier console war shitposts are allowed (muh 675p), not much will change.
> I still will take a better resolution that you always notice over a few frame dips that last milliseconds at times. It always confuses me when people opt for an extra frame or two at random times versus a higher resolution you see at all times.

Good luck finding the better resolution on Series X in this game. I mean, yeah, in still shots the difference seems notable, but in actual gameplay there's no way you will notice the extra native pixels with this DRS. On the other hand, the tearing is quite apparent. Wouldn't it be better to have less tearing than some extra native pixels? Just saying.
> It's the tools. People love memeing about it but it's the only reason. Early cross gen games built on the new Xbox GDK are having some issues. It's also the reason why Series games struggle with AF sometimes. There's no technical reason the XSX shouldn't have perfect AF, yet here we are. It's gonna get better soon. The GPU compiler has had a lot of work done since launch already.

XSX is just getting a bit of the same shit PS4 got for AF being worse than on Xbox One in some third-party games. In both cases it is a mix of tools and performance settings (not sure why people think AF has to be free on a console, where every MB/s of bandwidth and KB of texture cache matters and is optimised for).
Higher clock speeds of the GPU obviously.
lol, even a 5-year-old 1060 pushes a locked 1080p60 with 60-70% utilization (probably headroom for 1440p)
> XSX is just getting a bit of the same shit PS4 got for AF being worse than on Xbox One in some third party games. In both cases it is a mix of tools and performance settings (not sure why people think AF has to be free on console where every MB/s of bandwidth and KB in texture caches matters and is optimised for).

AF is free now. It hasn't been before, but it is now. 10-year-old graphics cards can do 16x AF with zero notable performance impact.
> Yeah, even on a 6-7 year old GTX 970 there was headroom for higher than 1440p in some sections w/ dynamic res active.

It is true that Gears 5 has excellent optimization, so it's not surprising that all systems run it fine; same for Forza Horizon 4.
> AF is free now. It hasn't been before, but it is now. 10 year old graphics cards can do 16xAF with zero notable performance impact.

Nothing is free; it's sometimes hidden by other costs, like on PCs driving insane resolutions and frame rates, but it is still not free. Your texture units are essentially the load and store units for your compute code, and getting them to spend more time on one thing (fetching 8x or more texture samples as inputs) is extra time some devs could find a use for elsewhere.
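To illustrate why the sample counts mentioned above add up: a trilinear lookup fetches 8 texels (two bilinear taps across two mip levels), and N-tap anisotropic filtering repeats that up to N times along the axis of anisotropy. A worst-case sketch (real GPUs hit texture caches heavily, so the actual bandwidth cost is far lower):

```python
def worst_case_fetches(aniso_taps: int) -> int:
    # Trilinear base cost: 2 mip levels x 4 texels (bilinear) = 8 fetches,
    # repeated once per anisotropic tap in the worst case.
    trilinear_fetches = 8
    return trilinear_fetches * aniso_taps

for taps in (1, 2, 4, 8, 16):
    print(f"{taps}x AF: up to {worst_case_fetches(taps)} texel fetches")
```

So 16x AF can cost up to 16 trilinear probes per lookup in the worst case, which is why it historically wasn't free; caches and early-out heuristics are what make it nearly free in practice, as the reply below argues.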
> With this game, we even know that the AF settings are exactly the same between both XSX and PS5, since all the settings are hidden in the PC config file. It's 100% a GDK issue.

Fine, maybe it is a tools issue then… still not sure why the inventors of DirectX, the ones who architected this change, are getting a bit of a free pass on their tools from their fans for so long after launch. But hey, PS5 fans gloss over some Sony stuff too…
> You can't possibly calculate that accurately knowing that the clock speeds are variable.
> The PS5 is not running at peak GPU performance 100% of the time.

It is running at peak performance when needed by the code, with very minor and very quick variations that, beyond F.U.D. and rumours, have yet to be noticed or mentioned negatively by any dev.
> Nothing is free, hidden by other costs sometimes on PC's driving insane resolutions and FPS's, but it is still not free. Your texture units are essentially your load and store units for your compute code and getting them to spend more time on one thing (fetching 8x or more texture samples as inputs) is extra time some devs could find a use for elsewhere.

It's not literally free of course, but in practice we've reached a point where the amount of GPU time used by AF is so minuscule that it doesn't really matter much.
> Fine, it is maybe a tools issue then… still not sure why the inventors of DirectX and the ones that architected this change are getting a bit of a free pass on their tools from their fans for so long after launch, but hey, PS5 fans gloss over some Sony stuff too…

I give them a pass because I know it's gonna get fixed soon. It's also not a huge deal; AF is fine in the vast majority of games.
Yeah, in BC games.
You can see it in the car sections. I hope you notice the blurrier road on XSX in the distance.
> Sorry for going a bit off-topic, but is the game still stuttering like crazy on PC after all the patches?

Yes.
Can't see any difference if I'm honest. The only thing I notice is that on the PS5 version the double yellow lines get jaggy after the puddle, where the Xbox's don't.
I no longer understand the point of comparison threads if such blatant trolling is allowed, Mod of War... even when a console is objectively pushing far fewer pixels at the same performance (0.03%, and one also has VRR that the other console doesn't), allowing this creates a precedent for future comparisons, and it becomes impossible to understand anything.
I understand that not having the edge on performance can hurt someone's feelings, but these threads lose their value completely. I don't know how many posts I've read with "another PS5 win" BS trolling in them.