
Ratchet & Clank: Rift Apart will run at dynamic 4K resolution, targets 60 FPS for performance mode

Md Ray

Member
This was argued at length here for months: the GPU being the bottleneck on these consoles. Just announcing the TFLOPS should have tempered expectations, but people still believed unrealistic claims of "secret sauce" coding and "the Zen CPUs will make a helluva difference over the Jaguars".

It has and will always be about the GPU.
But you can't deny the Jaguars were pretty crappy even before the current-gen consoles were launched.
 

pawel86ck

Banned
Then you don’t know Polyphony very well.

Ray tracing is going to be the biggest power user this generation. It’s not a mere 8%.
A benchmark from Metro Exodus proves the RT performance impact is only 8% when you don't exceed the ray budget, and that's the beauty of HW RT. Polyphony has butchered RT quality to the extreme just to hit their performance target (they're scaling reflections up from checkerboarded 1080p), and at this point the RT performance impact is probably minimal.
 

Mister Wolf

Member
Has this game been seen running on XSX, or has Microsoft said that the demo is running on XSX?

Seen by DF and Austin Evans. Epic even assisted the Coalition in porting it to the Series X. It's the whole reason we know the GPU in the Series X can match an RTX 2080 as far as UE4 games are concerned.
 

pawel86ck

Banned
1080p? What about 4K?


You realise that not all ray tracing APIs and methods will be the same. Do you know if PS5 is using the same method or something different?
4K is another story, because you need a 4x bigger ray budget for the same performance target. That's why Polyphony is rendering RT reflections at a much lower resolution: thanks to that, they only need to optimize RT reflections for a 1080p target instead of 4K. No game on PS5 will run at 4K with RT effects done at native resolution.
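For scale, a quick back-of-the-envelope on that 4x figure (assuming one reflection ray per pixel; purely illustrative):

```python
# Rays per frame scale linearly with pixel count, so native 4K needs
# 4x the ray budget of 1080p at the same rays-per-pixel rate.

RAYS_PER_PIXEL = 1  # assumption for illustration

def rays_per_frame(width: int, height: int) -> int:
    return width * height * RAYS_PER_PIXEL

r1080 = rays_per_frame(1920, 1080)  # 2,073,600 rays
r2160 = rays_per_frame(3840, 2160)  # 8,294,400 rays
print(r2160 / r1080)                # 4.0 -> the "4x bigger budget"
```

Dropping the reflection pass to a quarter-resolution buffer claws that 4x straight back, which is exactly the trade described above.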
 

mitchman

Gold Member
Assuming the RT workload stays within the ray budget (for a given resolution and framerate target), there's only a small performance hit with RT. For example, in Metro Exodus, if you balance RT with the rendering resolution, there's only an 8% performance hit on a 2060S.
At 1080p, this game is clearly CPU-limited, as was to be expected at such a low resolution. 1440p should be the minimum resolution for comparisons, to avoid CPU limitations.
 

theddub

Banned
I was not referring to PC parts or Big Navi's 80 CUs or whatever it turns out to be. Big Navi is leaked to have 10-CU shader arrays and a frequency focus like the PS5. Strange, that.

XSX is not Big Navi, LOL.

If only we had something called maths, where an L1 cache feeding 10 CUs at 2.23 GHz will perform much better than an L1 cache feeding 14 CUs at 1.825 GHz. I guess the numbers 12 and 10 are the limit of people's maths capability.
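Rough numbers on that (the per-clock L1 width here is an assumed placeholder; only the ratio between the two configs matters):

```python
# If each shader array's L1 delivers a fixed number of bytes per clock,
# then L1 bandwidth available per CU scales with clock / CUs-per-array.
# 128 bytes/clock is an assumed figure, not a real spec.

L1_BYTES_PER_CLOCK = 128

def l1_gbs_per_cu(clock_ghz: float, cus_per_array: int) -> float:
    return L1_BYTES_PER_CLOCK * clock_ghz / cus_per_array

ps5 = l1_gbs_per_cu(2.23, 10)   # ~28.5 GB/s per CU
xsx = l1_gbs_per_cu(1.825, 14)  # ~16.7 GB/s per CU
print(ps5 / xsx)                # ~1.71 -> ~71% more L1 feed per CU
```

That's the arithmetic behind the claim; whether games are actually L1-bound often enough for it to matter is a separate question.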

Feeding the CUs is just as important as the number of them. It's only MS who gauge performance in TF; everyone else uses benchmarks and how games perform.

And TF has nothing to do with the method of ray tracing. Do you think PS5 is doing it the same as the others? Have you heard of local ray? If PS5 is using FP16 for ray tracing and other tricks from the local ray patents, then it could be doing half the RT work. What does that do for your performance maths?

Let's see, power divided by half, erm...

Have you noticed the number of PS5 ray-traced titles coming? Strange.

Wow, the logic and understanding you've got there is not so good. When XSX has a good-looking next-gen title with some ray tracing, come back and talk.
Nvidia is now using TFs to measure ray tracing performance in the RTX 3000 series, Ampere.
 

geordiemp

Member
Nvidia is now using TFs to measure ray tracing performance in the RTX 3000 series, Ampere.

LOL, they started doing that MS-style TF marketing when the Ampere Samsung TF is not as good.

So how does it go again? I can't recall exactly: 20 TF of Ampere = 14 TF of Turing performance?

Something like that. I can see why they went with the TF narrative, left the 2080 Ti results off all the performance charts, and cherry-picked the best performance settings. But that's another story.

I wonder what RDNA2 PC parts running at 2.3 GHz will be performance-wise. Will the TF match up?
 

NullZ3r0

Banned
"That's just the start though, as Xbox Velocity Architecture brings near instant loading, Smart Delivery will pull the optimal version for your device, and we will be running campaign at 60FPS with up to 4K resolution,"


It was dynamic, and it still looked bad.
You do realize that the Series S also uses VA?
 
You do realize that the Series S also uses VA?

Yeah, I know it does. I was focusing on the bolded part, which says "up to 4K resolution", meaning the XSX also uses dynamic resolution where it's needed. People talk about dynamic resolution like it's the worst thing in the world, but it's not. It's hardly noticeable.
 

NullZ3r0

Banned
Yeah, I know it does. I was focusing on the bolded part, which says "up to 4K resolution", meaning the XSX also uses dynamic resolution where it's needed. People talk about dynamic resolution like it's the worst thing in the world, but it's not. It's hardly noticeable.
I was focused on that part too. "Up to 4K" here refers to performance across the architecture which includes the Series S.
 
Maybe I'm just months behind news-wise. Wasn't this the claim earlier in the year?


It doesn't matter what the marketing department tells you. Cerny also told you, because his console turned out weaker in every aspect than the Xbox, that the storage device, the drive on which you store games, is gonna change the way games are made. Just take a moment to process that: the storage device that holds the games. Not the CPU, not the GPU, not the RAM speed, not the RAM bandwidth. No, because all of those are weaker in the PS5. It's the storage unit.

The smallest amount of common sense tells us how heavy native 4K is, and how heavy high detail settings on top of 4K are. Add ray tracing into the mix and it's not happening.

This was argued at length here for months: the GPU being the bottleneck on these consoles. Just announcing the TFLOPS should have tempered expectations, but people still believed unrealistic claims of "secret sauce" coding and "the Zen CPUs will make a helluva difference over the Jaguars".

It has and will always be about the GPU.

It never was and never will be solely about the GPU. The CPU and GPU go hand in hand, and you always need both.
 
I was focused on that part too. "Up to 4K" here refers to performance across the architecture which includes the Series S.

No, I'm right here. They are referring to the XSX version alone. The Series S wasn't even announced at that point, and the way you're describing it doesn't even make sense.

Just to make it crystal clear for you:

On Xbox Series X, enjoy enhanced features like up to 4k resolution at 60FPS in campaign and greatly reduced load times creating seamless gameplay that ushers in the next generation of gaming.*


So the XSX also has dynamic 4K where it's needed, but at 60fps, if that makes you feel any better. The sacrifice is that the game looks like an Xbox 360 title, which should tell you how expensive 4K is.
 

Batiman

Banned
I need to play this at 60fps over any other option. Love R&C on PS4, but 30 dragged the experience down big time.
 

Tchu-Espresso

likes mayo on everthing and can't dance
The same people on here spouting that resolution doesn't matter were probably beating their dicks and slinging feces when PS4 games were in 1080p and Xbox One was 900p.

I wonder why the change of tone?
Image reconstruction.
 
Still looks better than any Xbox game I've seen so far... I already have both new consoles pre-ordered.
It's a fair point, because we haven't seen much of anything in regards to Series X exclusive games. But to think Series X games won't outdo Ratchet is silly, just based on raw numbers.

Hellblade 2 looks significantly better than Ratchet, if the trailer is even 50% accurate.
 

Allandor

Member
LOL, they started doing that MS-style TF marketing when the Ampere Samsung TF is not as good.

So how does it go again? I can't recall exactly: 20 TF of Ampere = 14 TF of Turing performance?

Something like that. I can see why they went with the TF narrative, left the 2080 Ti results off all the performance charts, and cherry-picked the best performance settings. But that's another story.

I wonder what RDNA2 PC parts running at 2.3 GHz will be performance-wise. Will the TF match up?
You can't compare TF across different architectures. But the architectures of the PS5 and XSX should be quite similar, so you can say which one is faster in graphics calculations.
Turing and Ampere are different architectures. Yes, Nvidia made "small" modifications to get more TF out of more or less the same die area, and therefore the TF can't be used as efficiently as in Turing. But overall it has so much more that it is the faster architecture (compared to Turing). And we might see workloads where the TF of Ampere can really be used. All I see so far is that the CPU is still a bottleneck for those chips.

How fast RDNA 2 is, we will see in a few weeks. Rumors say that AMD threw out everything that is not good for games (like they hinted at in their presentation), which could lead to smaller chips with the "same" performance (in games), or bigger chips with higher performance (than RDNA1).
 

geordiemp

Member
But the architectures of the PS5 and XSX should be quite similar.
Why do you think that? Just because they both use some RDNA2? Both are heavily customized and so different it's chalk and cheese:

  • 4 shader arrays of 14 CUs fed by an L1 cache at 1.825 GHz (XSX)
  • 4 shader arrays of 10 CUs fed by an L1 cache at 2.23 GHz (PS5)
So one has got more mouths to feed, and one has got a faster spoon and fewer mouths to feed. They are not remotely similar, and that's just the shader arrays.

The GE (geometry engine) on both will be totally different and customised as well.

So no, I don't agree, not even close. A 20% clock difference and a massive CU difference: they are worlds apart, as the quick sums below show.
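For the paper-TFLOPS side of that comparison (using the widely published active-CU counts, 36 for PS5 and 52 for XSX, since each console disables some CUs per array for yield):

```python
# Paper TFLOPS = CUs x 64 lanes x 2 FLOPs per lane per clock x clock (GHz) / 1000.

def tflops(active_cus: int, clock_ghz: float) -> float:
    return active_cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))   # ~10.28 TF (PS5: 4 arrays of 10, 36 active)
print(tflops(52, 1.825))  # ~12.15 TF (XSX: 4 arrays of 14, 52 active)
```

Same formula, very different shapes: one gets there with width, the other with clocks.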
 

KyoZz

Tag, you're it.
WTF, you are missing my point totally.

Here is the 1st post I responded to:



Then all I was doing was pointing out that DF showed today that the brand-new RTX 3080 could not hit native 4K60 with ray tracing on a last-gen game. So I am perfectly fine with R&C being dynamic 4K60. I'm just trying to get these people who are disappointed by dynamic 4K, and saying it's unacceptable, to realize that the RTX 3080 had to use DLSS to get to 60fps, which is not native 4K.

"So these idiots thinking that..."
"so these idiots"
"THESE IDIOTS"
That's my only problem with you: calling others idiots. It's gratuitous, and it's actually dumb, because there are games that do 4K60, so...
That will be my last reply to you; hope you understand.
 

OutRun88

Member
It doesn't matter what the marketing department tells you. Cerny also told you, because his console turned out weaker in every aspect than the Xbox, that the storage device, the drive on which you store games, is gonna change the way games are made. Just take a moment to process that: the storage device that holds the games. Not the CPU, not the GPU, not the RAM speed, not the RAM bandwidth. No, because all of those are weaker in the PS5. It's the storage unit.

The smallest amount of common sense tells us how heavy native 4K is, and how heavy high detail settings on top of 4K are. Add ray tracing into the mix and it's not happening.



It never was and never will be solely about the GPU. The CPU and GPU go hand in hand, and you always need both.
Fair enough. I've just always been super impressed with what consoles have been capable of in the past given their specs, so I don't put the same focus on them.
 

Blond

Banned
This was argued at length here for months: the GPU being the bottleneck on these consoles. Just announcing the TFLOPS should have tempered expectations, but people still believed unrealistic claims of "secret sauce" coding and "the Zen CPUs will make a helluva difference over the Jaguars".

It has and will always be about the GPU.

100%

I was constantly saying before this that the SSD would make for amazing gameplay opportunities in the realm of being able to access resources instantly. God of War's traversal system would have to be completely rethought, because the current one would be made irrelevant when you can teleport to different places instantly.

They still think the Chinese engineer who revealed that the Unreal 5 demo was running on a gaming laptop was lying, despite the CEO talking all that shit but never actually denying or disproving him.

I'm buying the console for the exclusives, not graphics, but I feel like a lot of people have yet to realize that this power argument is lost. The moment 9.2 TF was unveiled, I instantly threw in the towel on that argument and started caring more about the games, because playing games in 4K on my X1X sure as hell didn't make Gears 5 more fun, Doom Eternal more visceral, Injustice less clunky, Dragon Ball FighterZ less shit online, or Red Dead less boring.
 

DeepEnigma

Gold Member
The same people on here spouting that resolution doesn't matter were probably beating their dicks and slinging feces when PS4 games were in 1080p and Xbox One was 900p.

I wonder why the change of tone?

It flip-flops. For everyone.

That's why console fanatics are some of the most petty cunts on gaming boards. Gamers are fickle in general.

It began in July actually.


We're going to memory hole this.
 
Last edited:

SafeOrAlone

Banned
Once again, this thread is showing its stupidity by expecting native 4K/60fps with next-gen visuals + ray tracing, like Ratchet and Clank. Like someone else said, if you want all of those things, go buy yourself an RTX 3090 and an i9-10900K and go play on PC. It is not happening on a $500 console.

This post is hella funny. Not because it clears up an innocent bit of naivety regarding console tech, but because it's so socially foul. Guy walks into a room calling people stupid in a friendly video game tech conversation, then tops it off with a prideful display of PC power. Shit is funny.

Also worth pointing out that not one person claimed they expected ray tracing, 4K, and 60fps at once, as your post implies.
 
Last edited:

Redlancet

Banned
The same people on here spouting that resolution doesn't matter were probably beating their dicks and slinging feces when PS4 games were in 1080p and Xbox One was 900p.

I wonder why the change of tone?
You mean the same ones who were beating their dicks over the 4K Xbox One X, and suddenly 1440p is the best thing ever because of the Series S?
 

jaysius

Banned
This next gen was a mistake; hopefully the mid-gen replacements will be more than half-assed upgrades this time around.
 

jaysius

Banned
Actually, DF tested the RTX 3080 today at native 4K with ray tracing, with frame rates hitting the mid-30s; I believe the game was Fallout. So these idiots thinking that a $500 box is going to outperform a $1,000 graphics card are clueless.



These people you're calling idiots are mentally ill obsessives who are so drawn into a corporation's narrative that they can't see facts anymore; they are a minority.

Also don't call people idiots over stupid fucking video games... it's idiotic.

Console gamers never think they're getting the same tech as cutting-edge PC gaming; it's the price compromise that console gamers understand.

The corporations have taken great liberties with the truth about how 4K will be delivered this coming generation; they've misled, and people have eaten it up.
 

GametimeUK

Member
I'm so glad I'm aware there's more to visuals than a native 4K image. I've been a fan of cheating 4K since checkerboard rendering. Also I've kept my expectations in check on hitting 60fps on the new hardware. I'll be more than happy with the results as long as the framerate is solid.

Pixel count doesn't tell the full story.
 

Hairsplash

Member
the truth...


the lie...


The bane of my existence on PC: max all the eye candy, and that is what Afterburner or FRAPS shows... then you adjust the graphics settings and just want that "little bit more" eye candy... Set the minimum to 39 and it describes adaptive sync on my 4K BenQ 60Hz monitor.

The 22 low is funny: with a 4K adaptive sync monitor, that means below 40fps, and on a G-Sync monitor, 30 (generally).

Make the minimum frame rate 39 and it describes the hell of a PC with adaptive sync on a 4K monitor (range 40-60). Even on an RTX 2080, a $1,000 Canadian card, I could not get above 40 in GTA V with everything on ULTRA, no MSAA. Meanwhile, a GTX 1660 Ti watercooled to 2115 MHz GPU / 7000 MHz RAM ($380 Canadian) can get 30fps at 4K 99% of the time...
 

Dogman

Member
This post is hella funny. Not because it clears up an innocent bit of naivety regarding console tech, but because it's so socially foul. Guy walks into a room calling people stupid in a friendly video game tech conversation, then tops it off with a prideful display of PC power. Shit is funny.

Also worth pointing out that not one person claimed they expected ray tracing, 4K, and 60fps at once, as your post implies.
Really? Nobody expected that? Here are four from this thread that did:

If it's just "targeting" 60fps then it better be native 4k. I don't see how they could be using any render scaling and still struggling to hit 60fps.
I thought we were getting 4K/60 FPS this gen? Am I stupid?
never been into Ratchet games. But "dynamic 4k" and then "targeting" 60fps for performance mode is not a good look. What res would performance mode be on? 1440p?
I'm sorry, but TARGETS 60 fps in performance mode is not acceptable out of these machines. Hopefully it was just worded poorly.

And if they didn't know it had ray tracing, it's not my fault they didn't read the webpage.

Besides, I wasn't displaying any sort of PC power like you're talking about. I didn't say I had those specs. It's a fact that a PC with those specs is far more likely to pull native 4K/60 with RT than a fucking PS5.
 
The same Zen 2 CPU that is beaten in gaming by an ancient Intel architecture based on 14nm, lol. Zen 3 is actually the real deal, and the consoles should have launched with it a year later, along with more powerful GPUs. The amount of nonsense people were spouting here for months about how these consoles were something they're not was very funny. People overhype consoles every time, only to have their fantasy hopes crushed faster than the PS5's SSD can load data.

The problem with Zen 2 is only the worse latencies, and they're far better here on these APUs.
And no matter if "there are better CPUs out there", what matters is that these new CPUs are much, much faster than the old Jaguars. Look at what current video games achieved with those weak CPUs, and think what can be done with these new CPUs that are 6 to 8 times faster.
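A crude way to sanity-check that multiplier (the IPC factor is an assumption, not a measurement):

```python
# Crude speedup estimate: clock ratio x assumed IPC ratio.
# Jaguar (PS4/XB1): 8 cores @ ~1.6 GHz. Zen 2 (PS5/XSX): 8 cores @ ~3.5 GHz.

JAGUAR_GHZ, ZEN2_GHZ = 1.6, 3.5
IPC_RATIO = 2.5  # assumed: Zen 2 does very roughly 2-3x the work per clock

print(round((ZEN2_GHZ / JAGUAR_GHZ) * IPC_RATIO, 1))  # ~5.5x before SMT
```

Add SMT and the vastly better caches, and the 6-8x ballpark is plausible.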

But I agree, they should have waited another year to get Zen 3 and a more mature RT solution on RDNA2+.
 

mistakable

Neo Member
Yes! Give me that 60 FPS goodness. I don't mind if it's 1440p.
Yes, this all the way. I don't even really want 4K... at 3.5m, I don't sit close enough to my TV to benefit from it... so improve the visuals all the way, but hang back at 1440p and I'm more than happy.
My PC monitor is 1440p... same issue again, as 4K would need a bigger screen, or I'd have to sit closer to notice.

I mean, it's just wasted performance, is it not... halving frames for a barely noticeable bump in pixel fidelity?
 

Andodalf

Banned
The problem with Zen 2 is only the worse latencies, and they're far better here on these APUs.
And no matter if "there are better CPUs out there", what matters is that these new CPUs are much, much faster than the old Jaguars. Look at what current video games achieved with those weak CPUs, and think what can be done with these new CPUs that are 6 to 8 times faster.

But I agree, they should have waited another year to get Zen 3 and a more mature RT solution on RDNA2+.

If you wait a year, you might as well wait another year or so in hopes of GDDR7; memory bandwidth is going to be a major limiting factor this gen, after all.

And at that point, I'm sure there's something new to wait for too. There's always something new.
 

ZywyPL

Banned
Truth be told, "dynamic" resolution + "targeting" 60FPS is essentially what the PS4/XB1 generation was about: many, many games barely hit their native resolution, if ever, while at the same time never hitting 60FPS, running mostly at 45-50FPS, because the dynamic scaling wasn't aggressive enough or the game was CPU-bound. That's one of the main reasons I completely skipped the current generation of consoles; that awful lack of consistent performance is what seriously pushed me away. The moment the prices were revealed, I knew something wasn't right. This gen is starting better in that regard, with many titles being 60FPS, or at least having an optional performance mode, but yeah, the R&C example does put up a red flag for me. Let's hope that's just the exception rather than the rule.
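(For context, a dynamic resolution scaler is typically a small feedback loop on GPU frame time; a minimal sketch, with all thresholds and step sizes invented:)

```python
# Minimal dynamic-resolution controller (illustrative only). Each frame,
# compare measured GPU time against the 60 FPS budget and nudge the render
# scale, clamped so output never drops below a chosen floor.

TARGET_MS = 1000.0 / 60.0            # 16.67 ms budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0      # e.g. 60%..100% of 2160p per axis

def update_render_scale(scale: float, gpu_ms: float) -> float:
    if gpu_ms > TARGET_MS * 0.95:    # running hot: drop resolution fast
        scale -= 0.05
    elif gpu_ms < TARGET_MS * 0.80:  # plenty of headroom: creep back up
        scale += 0.025
    return max(MIN_SCALE, min(MAX_SCALE, scale))

print(update_render_scale(1.0, 18.0))  # 0.95 -> shrinking under load
```

A scaler that "isn't aggressive enough" is exactly one whose steps are too small or thresholds too lax, and no scaler can save a CPU-bound frame, since lowering resolution only shrinks the GPU's share of the work.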
 

SafeOrAlone

Banned
Really? Nobody expected that? Here are four from this thread that did:






And if they didn't know it had ray tracing, it's not my fault they didn't read the webpage.

Besides, I wasn't displaying any sort of PC power like you're talking about. I didn't say I had those specs. It's a fact that a PC with those specs is far more likely to pull native 4K/60 with RT than a fucking PS5.

Nope, those are quotes asking about 4K/60FPS. They say nothing about ray tracing, and there is no reason to automatically assume they expect ray tracing in a 4K/60fps package when we are accustomed to having different performance options. Hoping for 4K/60FPS is not the same as hoping for 4K/60fps/ray tracing; there's a difference in expectation there. It suited you to exaggerate their expectations, though, because it gave you some more slack to call people stupid.
 
It's funny reading this thread full of people saying that 4K/60fps/ray tracing is completely impossible, while the very next thread down is about Gran Turismo 7 trying for exactly that.


Most games that go for 4K and a high framerate on consoles are gonna be severely compromised in many areas. Racers are one genre that can do high res and high framerate. Didn't Nvidia demo some racing game on their 3090 that ran at 8K max settings?
 

VFXVeteran

Banned
It never was and never will be solely about the GPU. The CPU and GPU go hand in hand, and you always need both.

With the complexity of rendering these days, it's the GPU. We can disagree, but unless a game intentionally becomes CPU-limited, we'll see more GPU-limited games in the future. Let's agree that you need a pretty good CPU to feed data to the GPU at a good clip, but as scene complexity and rendering complexity increase, the GPU becomes more of a bottleneck, NOT the CPU.
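(The usual way to picture this: a frame can't finish before both processors do, so the slower side sets the frame time. A toy illustration of that steady-state view, ignoring that real pipelines overlap CPU and GPU work across frames:)

```python
# Whichever processor takes longer sets the frame time; speeding up
# the other side buys nothing until the bottleneck flips.

def frame_ms(cpu_ms: float, gpu_ms: float) -> float:
    return max(cpu_ms, gpu_ms)

print(frame_ms(8.0, 20.0))  # 20.0 ms -> GPU-bound: a faster CPU won't help
print(frame_ms(8.0, 10.0))  # 10.0 ms -> still GPU-bound, now ~100 FPS
```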
 

StreetsofBeige

Gold Member
With the complexity of rendering these days, it's the GPU. We can disagree, but unless a game intentionally becomes CPU-limited, we'll see more GPU-limited games in the future. Let's agree that you need a pretty good CPU to feed data to the GPU at a good clip, but as scene complexity and rendering complexity increase, the GPU becomes more of a bottleneck, NOT the CPU.
Are the PS5 and Series S/X CPUs, in your opinion, good enough for the generation, for what it is and what it costs?

Or are they anywhere near the Jaguar shit we had last gen, which people said was really bad to start with?
 