
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

Evil Calvin

Afraid of Boobs
Agreed. PCs couldn't possibly hope to run all that next-gen goodness.

[attached screenshots]
Is that supposed to look great? Or were you being sarcastic? Because that doesn't look very next-gen at all.
 

REDRZA MWS

Member
I have a PC with specs: i9-9900K OC'd, 32 GB of Ripjaws RAM, and an RTX 4090. How are these stuttering ports still a problem? Guess I'm getting Jedi Survivor on my Series X. This is ridiculous.
 

rodrigolfp

Haptic Gamepads 4 Life
See, I don't think this is right either. There's this bizarre schism in the gaming space where it's a mentality of my platform vs. the other platforms, but it shouldn't be the case. Console gamers shouldn't shit on PC gamers for this botched job, and PC gamers shouldn't have to retaliate. We should be on the same side and shit on the publishers for delivering a shoddy product. Our side is that of the consumers, not PC or console or what have you.
It is what it is. Consoles always have their limitations, be it resolution, performance, controllers, etc. People choose to play on PC to avoid those problems...
 

MiguelItUp

Member
Doesn't seem as bad as reported:


My understanding (for the most part) is that it's very much playable and can still be enjoyed, but everyone's threshold is different. I've noticed a lot of people with exceptionally beefy rigs are up in arms about it. I'm not concerned.

A lot of people are and will be streaming it today. I watched Cohh stream it for a bit on PC and stopped because I don't want to be spoiled. He was having some weird horizontal black bar issue, but it was fixed and disappeared. Everything I saw looked smooth, but this was still the very beginning of the game; my understanding is that the open-world segments are where you notice the instability.

SkillUp, who's one of the few I tend to agree with, said that if you're playing on PC it's best to avoid it until it's patched and improved. Which, come launch, I'm sure it'll be BETTER. Maybe not perfect, but better. But we'll see.
 

CSJ

Member
Why are so many devs on other platforms up in arms defending this common issue?

You got paid and are either moving on to your next project or stuck fixing your busted work while continuing to get paid.
Sick of them acting like they're the holiest of industry workers, the only ones with problems others don't have.

I cannot think of any other type of job where people defend shoddy work (or lesser work because of an actual issue), other than VFX artists who are being worked to the bone.
Also, yeah, most of this is upper management pushing shit out before it's ready, but let's not pretend other developers can't make equally decent-looking games without these issues.

Otherwise I can't really say much until tomorrow and then the praise, or shit-show can begin.

Also, "But my bonus". lmao.
 

SlimySnake

Flashless at the Golden Globes
In RE4 there was no difference between the 3-4 GB and 8 GB texture settings; people just wanted to max out textures even if their system was weak. Not my fault if people want to play everything on ultra even when turning down a setting has no visual quality loss and can fix problems. I had a crash as well when I was putting everything on ultra; I just needed to turn 1 or 2 settings down a notch to not have any problem.

In Hogwarts you just needed to turn down RT to solve most of the problems. I clearly remember you saying the same thing in the OT; the game was faaaaaaar from unplayable. Same for Dead Space. Callisto on PS5 had more stuttering than these 2 combined.

For every one of those bad ports you named, which weren't even broken for everyone, we had Atomic Heart, Dead Island 2, Forspoken and some others that run well.

And I'll tell you a secret: stuttering in most cases is not related to how good your GPU is. If a game has shader compile stuttering, it is the same for everyone, so I experienced the same stuttering as you in most cases.

I think Wild Hearts was way more broken than RE4 and the DS remake tbh; that one was really unplayable on PC, no matter how strong your PC is.

Is it a good year for PC gamers? No, but it's hardly this hell on earth that some of you want to promote.
No, the difference between the 4 and 8 GB texture settings is the LOD pop-in, not the actual textures themselves. I see pop-in all the time now that I'm forced to settle for 4 GB, when I was running the game just fine at native 4K 60 fps with hair strands on high a couple of weeks ago.

RT does cause problems in both RE4 and Hogwarts since it's a VRAM hog, but I turned it off in both games. And yes, once you do that the stuttering goes away. But I bought the 3080 for its RT performance; otherwise I would've just bought a 6950 XT, which has 20% better rasterization performance and way more VRAM.

P.S. Go back and look at the Hogwarts PC thread. The game only became playable, even without RT, when we added 32GB of system RAM and made changes to the config file that were patched in a couple of weeks later by the devs themselves. Without those changes to the config file the game was damn near unplayable, RT off or on. Everyone who stepped out into the courtyard and saw their framerate dip below 30 fps can attest to that.

P.P.S. Gotham Knights with RT runs fine at a locked 60 fps at 4K DLSS Quality... until it doesn't. It literally drops to single-digit fps every 20-30 minutes, even after they fixed all the UE4 shader-related stutters. I played the game without RT because I hated those single-digit fps drops. Hogwarts was a mess until I upgraded my RAM, changed the config files thanks to Reddit, and turned off RT. Witcher 3 went from 70-80 fps with RT on to fucking 25 fps even in smaller towns. RE4 kept crashing with RT on.

Hell, RE4 looked like shit in the PS5 performance mode; they fucked up the XSX modes, then fixed it, then fucked up the PS5 IQ in the latest patch. This is not a PC-exclusive issue. Just look at the Star Wars Jedi: Survivor PS5 video we posted earlier in the thread: 15 seconds to load textures. At least RE4 loads them within a second or two; this thing is taking 15 seconds to load textures on the PS5. We all know how shit Forspoken looked on the PS5. This is affecting everyone.
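For anyone wondering why just toggling RT off is enough to calm these games down on 8-10 GB cards, here's a rough back-of-the-envelope VRAM budget. Every number below is made up purely for illustration (each engine allocates differently); the point is only how little headroom is left at 4K once the RT structures get added:

```python
# Hypothetical 4K VRAM budget in GB. Real games allocate very differently;
# this only illustrates why RT can push an 8-10 GB card over the edge.
budget = {
    "render_targets_4k": 2.0,   # G-buffer, depth, post-process targets
    "texture_pool_high": 6.0,   # the "8GB"-class texture setting streams a large pool
    "geometry_and_misc": 1.5,   # meshes, shaders, driver overhead
    "rt_structures": 1.5,       # BVH + denoiser buffers, only allocated with RT on
}

card_vram = 10.0  # e.g. an RTX 3080

with_rt = sum(budget.values())
without_rt = with_rt - budget["rt_structures"]

for label, need in (("With RT", with_rt), ("Without RT", without_rt)):
    verdict = "fits" if need <= card_vram else "spills into system RAM over PCIe"
    print(f"{label}: {need:.1f} GB needed vs {card_vram:.1f} GB -> {verdict}")
# Once the working set spills, textures stream in late (LOD pop-in) and the
# frame rate hitches, which matches what people are describing above.
```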
 

Thebonehead

Banned
I’m just going to stop you right there, because you have no clue what you’re talking about in regards to this dogshit port.
Fixed that for you

In this thread:

Corporate-loving Nvidia fans triggered and desperate that a pre-release version of a game

Strangely, it's also full of corporate-loving Sony fangirls shitposting
 

Sleepwalker

Member
Every AMD-sponsored game has bad performance
Yep, I've said it before. AMD-sponsored games end up having cancer performance on PC.

The moment they start getting bundled with AMD parts is like a kiss of death

Callisto Protocol
Company of Heroes 3
The Last of Us
Redfall (loading)
Dead Island 2 (loading)
Now Jedi Survivor. Dead Island 2 is the exception so far, but that game is also not that demanding afaik.
 

Alex11

Member
See, I don't think this is right either. There's this bizarre schism in the gaming space where it's a mentality of my platform vs. the other platforms, but it shouldn't be the case. Console gamers shouldn't shit on PC gamers for this botched job, and PC gamers shouldn't have to retaliate. We should be on the same side and shit on the publishers for delivering a shoddy product. Our side is that of the consumers, not PC or console or what have you.
Yes, but how many people on forums can really have this mentality, 20-30%? Maybe the ones that simply play games and don't comment anywhere; they can enjoy games and have no problem with others enjoying games on different platforms.

When I see people genuinely happy about someone else's misery over a shit port, or vice versa, I'm like "wtf is wrong with you?". It's like they have some sort of investment in, or ownership of, the companies behind their preferred platform.
 

Romulus

Member
Forget why it doesn't run well on a 4090 PC; why the hell isn't this on the base PS4? The game looks dated as hell for all the hype.
 

SmokSmog

Member
Let me be clear: for most of the time I spent with the game before publishing the review, the game's optimization was a disaster on PlayStation 5. The graphical "performance" mode had little to do with performance, and if I had to estimate how much of the time the game hit 60 FPS, it would be maybe 5%; the rest sits, by eye, around 40 frames per second. In a flash, however, we received access to the "day 0" patch, which actually improved the game's performance. It's still not perfect, though.
[performance screenshots from the review]
 

Kataploom

Gold Member
Learn how to use paragraphs when you post your utter nonsense.

Even better: don't post at all

Why do all these single platform ps5 warriors insist on posting in PC threads?
They've not been told that if PC players downgrade settings to PS5 levels, games not only avoid most of the aforementioned issues but also run better on mid-range hardware than on the console, as long as the port isn't an absolutely broken mess like TLOU.

BTW, I saw the TLOU Part I news and it seems like most issues are getting fixed, though some are still complaining about "even more crashes" on the Steam forums. I was thinking of getting it while I wait for SW:JS to be fixed, but it seems I'm better off just playing something else.
 

SlimySnake

Flashless at the Golden Globes
3080 with patch

Looks like a patch today fixed the GPU utilization issues:
- 3080 has 99% GPU utilization
- Game runs at 70 fps at 1440p using Ultra settings, RT off.
- 1% Lows are just 53 fps
- VRAM usage is less than 9GB
- RAM usage is around 16GB. 32GB might be recommended here.

It's still the opening area so it could get worse as you go into the more open world areas.

I played the first one at native 4K 60 fps on Epic settings on my 2080, so settling for 1440p 60 fps on my 3080 is going to be lame, but I will take something that is playable on day one.
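Quick aside on those "1% lows," since the number gets thrown around a lot: it comes from the frametime capture, not the average. Here is a minimal sketch of one common way tools derive it, with made-up frame times standing in for a real PresentMon/CapFrameX log:

```python
# Made-up frametime capture (ms): mostly ~14.3 ms (~70 fps) with an occasional
# ~19 ms frame. A real log would come from PresentMon, RTSS, CapFrameX, etc.
frame_times_ms = ([14.3] * 99 + [18.9]) * 10

def one_percent_low_fps(frame_times):
    # One common definition: average the slowest 1% of frames, then convert to fps.
    slowest_first = sorted(frame_times, reverse=True)
    worst = slowest_first[: max(1, len(slowest_first) // 100)]
    return 1000.0 / (sum(worst) / len(worst))

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"Average fps: {avg_fps:.0f}")                              # ~70
print(f"1% low fps:  {one_percent_low_fps(frame_times_ms):.0f}")  # ~53
# So a 70 fps average with 53 fps 1% lows means the occasional frame takes
# ~19 ms instead of ~14 ms: noticeable as micro stutter, not a broken frame rate.
```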
 

Gaiff

SBI’s Resident Gaslighter
The game wasn't even officially out, so of course we had to wait for a patch. Based on what I've seen though, the requirements for such a visual payoff are way too high. The game looks like it belongs in the PS4 era but has PS5+ demands.
 
"With DirectStorage PCs are hitting 20GB/s."

Those maximum speeds depend not only on the speed of the PCIe interface, but also on the speed of the NVMe drive and your graphics card. A 4090 is going to perform better than a 1660 here since texture decompression happens on the graphics card. I ran the DirectStorage 1.1 benchmark on my PC (with an RTX 3080) and I got great results (16 GB/sec), but I would need a 4090 to hit the max.

So the truth would be that "absolute high-end PCs are hitting 20GB/s bandwidth", but it all depends on your particular configuration, whereas every PS5 can reach that target.

Another thing: since DirectStorage uses the graphics card for texture decompression, this will have an effect on graphics performance. That's not the case on consoles, which have dedicated I/O hardware for it.

You do realize that DirectStorage 1.2 would get you the very 20GB/s bandwidth on your RTX 3080 that you claim you would need a 4090 to hit?
The "God-tier I/O" cult will never cease to amaze me!
 

KyoZz

Tag, you're it.
The game runs fine on 5800X and 3080, so false alarm.

Also:

"I've seen some tests on RTX 4090 struggling to get 50fps, well, they weren't completely wrong, I was having the same issue when I got the game yesterday, even at the lowest settings I wasn't able to reach 60fps due to extremely low GPU Usage (it was around 60-70% and I was getting 40 to 55fps), but thankfully, a new update was released today, and my GPU Usage issue was fixed.
You SHOULD NOT trust tests released before today's update (6,5GB update), even though the port is still problematic, the performance is way better, and it is possible to get more than 60fps now."
 

Buggy Loop

Member
So is it really fixed? Audio bugs too?

Oh, it's not fixed performance-wise...

This is the last thing I wanted to share, but somehow CPU performance has gotten *worse* with the day-1 patch. I have no idea why; my best guess, though I never bothered to check, is that the pre-release build didn't incorporate Denuvo and the day-1 patch does, which would add some extra CPU overhead. Regardless of the reason, the patch didn't help.
 

Alex11

Member


Disappointed Chris Farley GIF


Ooouf, those RT effects.

Thanks AMD!

That looks rough; I've seen better non-RT reflections in older games. What the hell happened? I was so looking forward to this game, and all I see are disappointments, both graphically and gameplay-wise, though granted, the gameplay ones are personal opinions.
 

Kataploom

Gold Member
Usually at this point, PC CPUs were anywhere from 5-10x faster (going all the way back to the PS1), so only 2x is a drastic slowing of pace.

The GPU delta is a bit harder to compare directly since architectures didn't line up so neatly until the PS4, but even there, 3x was far more accessible than paying 4x the price of a console for the GPU alone.
I said it in a reply post, but those estimations seem very conservative. My 5600X could run A Plague Tale: Requiem at more than 80 fps before being bottlenecked by the GPU at 1080p; at 1440p that CPU + an RX 6700 XT ran the game from around 60 to over 70 fps. That's a game that is said to be CPU-bottlenecked on consoles, which is why it's limited to 30-40 fps at 1440p upscaled to 4K. With Forspoken the story is basically the same; it just loads about 80% slower on my PC compared to the PS5 (but that's just 3 seconds on PS5 vs. 5 or so on my PC). I'm bottlenecked by PCIe 3.0, but people with a better GPU and PCIe 4.0 even beat the PS5 in that regard, and it has the most powerful I/O system of the consoles.

BTW, I haven't seen new reports on the PC version; how is it running for people here on GAF? Or did basically everybody decide to hold off lol
 

StereoVsn

Member
It’s laziness, with a big side of not giving a fuck about the product you’re putting out there for customers to spend their hard-earned $70 on.

It’s a “we’ll fix it later” mentality.

Which, by the way, the console versions suck just as well, so it looks like priorities went right out the window. Maybe delay these shit games to please paying customers instead of shareholders?
But that's not a developer call. It would be their publisher's call, and that publisher is EA in this case.

And yes, this port appears to suck, same as TLOU, Hogwarts and others. Then they get patched up.

Basically, publishers don't want to wait anymore because they know that gamers will buy titles anyway. Then (hopefully) the game gets patched up.

The key is basically to just stop buying at release. After a month or three things usually get settled more or less, but not always, of course.
 

SlimySnake

Flashless at the Golden Globes
Ok, first impressions. Not bad. My Specs:
  • 3080 + 32 GB RAM + i7-1700k
  • 1440p Epic Settings No Ray Tracing 55-65 fps
  • VRAM usage 7GB
Quite a few micro stutters, but nothing like Hogwarts or other games with shader compilation stutters. The game starts with a gigantic stutter which made me laugh, but other than that it's been fine. Performance doesn't fluctuate too much. Some cutscenes do go up to 120 fps, but gameplay mostly hovers around 55-65 fps in the linear intro.

Aside from the micro stutters every now and then, I haven't seen any glitches, bugs, or crashes. GPU utilization is roughly 75-99%. Definitely CPU bound in some areas. I didn't get the HDR bug GamingTech was talking about, so maybe the day one patch fixed all these issues.

The game feels really, really good. Kind of a slow opening, but the combat feels tight; visuals remind me of Ratchet, though not as pretty of course. I really like all the new changes to the stances and NPCs.

P.S. The EA App uses 600 MB by itself, but I did see it go down to 200 MB after I quit out of the game. No idea why these apps need dedicated VRAM.
 

Kataploom

Gold Member
The game runs fine on 5800X and 3080, so false alarm.

Say whatever you want about the game's animations and graphics "not being so much better than last gen", but damn, I can see the areas being waaaaay bigger, and the graphics are a little better on characters and noticeably better on environments. Jedi: Fallen Order was already crawling on last-gen consoles, so I think the graphics are good enough for me.
 

Raelgun

Member
BTW, I haven't seen new reports on the PC version; how is it running for people here on GAF? Or did basically everybody decide to hold off lol
Running 5120x1440 (32:9), max graphics, ray tracing on, widest FOV, FSR2 off, vsync off

Getting like 55-65 FPS in the opening mission on a 4090

Not noticing stutters, crashes, or audio bugs so far

Looks and plays fine - but a 4090 should be getting way higher
 

hollams

Gold Member
Ok, first impressions. Not bad. My Specs:
  • 3080 + 32 GB RAM + i7-1700k
  • 1440p Epic Settings No Ray Tracing 55-65 fps
  • VRAM usage 7GB
Quite a few micro stutters, but nothing like Hogwarts or other games with shader compilation stutters. The game starts with a gigantic stutter which made me laugh, but other than that it's been fine. Performance doesn't fluctuate too much. Some cutscenes do go up to 120 fps, but gameplay mostly hovers around 55-65 fps in the linear intro.

Aside from the micro stutters every now and then, I haven't seen any glitches, bugs, or crashes. GPU utilization is roughly 75-99%. Definitely CPU bound in some areas. I didn't get the HDR bug GamingTech was talking about, so maybe the day one patch fixed all these issues.

The game feels really, really good. Kind of a slow opening, but the combat feels tight; visuals remind me of Ratchet, though not as pretty of course. I really like all the new changes to the stances and NPCs.

P.S. The EA App uses 600 MB by itself, but I did see it go down to 200 MB after I quit out of the game. No idea why these apps need dedicated VRAM.
4090 + Ryzen 7700

Same experience as you. I am playing 4K, max everything, with ray tracing, and even though I didn't look at my FPS it felt pretty smooth except for the micro stutters. I wish they weren't there, but it's a very minor gripe. The game looks awesome and the HDR really pops on the OLED. The combat is good and the dismemberment is awesome. I finished a quick battle and there were arms and legs all over the place. I cut off another enemy's arm, and as I did you could see the bright red burn mark that reminded me of the old cigarette lighters in cars that would glow red on the ends.
 

SlimySnake

Flashless at the Golden Globes
4090 + Ryzen 7700

Same experience as you. I am playing 4K, max everything, with ray tracing, and even though I didn't look at my FPS it felt pretty smooth except for the micro stutters. I wish they weren't there, but it's a very minor gripe. The game looks awesome and the HDR really pops on the OLED. The combat is good and the dismemberment is awesome. I finished a quick battle and there were arms and legs all over the place. I cut off another enemy's arm, and as I did you could see the bright red burn mark that reminded me of the old cigarette lighters in cars that would glow red on the ends.
Yep. Combat feels weighty yet responsive and far more brutal than the first game. I legit said wow when the game switched stances on me and I cut through half a dozen stormtroopers.

I really like that they give you all the abilities at the very start of the game instead of stripping him of them like they do in GoW. The game feels fun to play from the start.
 

rnlval

Member
Do some people read the stupid shit that they tap before posting? PC gamers are paying customers, just like you are. Devs shit out a garbage port, and instead of going after the ones responsible, you go "hur hur, serves you right, PC gamers"?

Shit on NVIDIA for cheating you out of VRAM, but mocking PC gamers for what is essentially a botched port that is solely the fault of the company is utterly moronic.
The RTX 3070's 8GB VRAM issue is the same garbage tactic as expected from NVIDIA's GTX 970 and its fake 0.5 GB of VRAM.
 

rnlval

Member
No, it's just incompetent developers who don't give a shit.
2GB VRAM PC GPUs were rendered obsolete in the PS4 / XBO era.

The 4 GB to 8 GB VRAM target is the mid-gen PS4 Pro / Xbox One X target. The GTX 970's 3.5 GB of VRAM + 0.5 GB of fake VRAM is soured milk.

History is not your strong point.
 