
Cyberpunk 2077 E3 2019 demo was running at 1080p & Ultra settings with Ray Tracing enabled on a Titan RTX

CyberPanda

Banned
A few days ago, we shared with you the specs of the PC system that was running the E3 2019 demo of Cyberpunk 2077. And today, we’ve got some additional details about the resolution and the settings that were used in order to showcase this demo.

CD Projekt RED’s Alvin Liu stated that the game was running at 1080p on Ultra settings at E3 2019, with its Ray Tracing effects enabled.

“The game was running on Ultra, but we are continuing to improve our visuals.
The game demo was running at 1080p, but our trailers and publicly released assets are at 4K. The UI is designed mostly at 4K (eventually it will entirely be at 4K native), but we have the technology to swap assets and do intelligent scaling to handle 1080p, widescreen, 720p, 1440p, and so on. We can also design specific UI at 1080p and other resolutions on a need-by-need basis, such as on a screen or graphics with heavy icons that might look bad otherwise.”
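The asset-swapping-plus-scaling approach Liu describes can be sketched roughly as follows. This is purely an illustrative sketch, not CD Projekt RED's actual code; all names (`OVERRIDES`, `pick_asset`, the asset naming scheme) are hypothetical.

```python
# Hypothetical sketch of resolution-aware UI asset selection: author the UI
# at a 4K design resolution, swap in hand-made variants for the few screens
# that need them, and uniformly scale everything else down.

NATIVE = (3840, 2160)  # the 4K design resolution

# Resolutions with purpose-built overrides (e.g. icon-heavy screens)
OVERRIDES = {
    (1920, 1080): {"minimap_icons", "inventory_grid"},
}

def ui_scale(target):
    """Uniform scale factor from the 4K design resolution to the target."""
    return min(target[0] / NATIVE[0], target[1] / NATIVE[1])

def pick_asset(name, target):
    """Return (asset_variant, scale) for a UI element at a target resolution."""
    if name in OVERRIDES.get(target, set()):
        # A hand-made variant exists for this resolution: use it unscaled.
        return f"{name}@{target[0]}x{target[1]}", 1.0
    # Otherwise fall back to the 4K master and scale it down.
    return f"{name}@4k", ui_scale(target)

print(pick_asset("minimap_icons", (1920, 1080)))  # hand-made 1080p variant
print(pick_asset("health_bar", (1280, 720)))      # scaled 4K master
```

The point of the override table is the one Liu makes: most elements survive uniform downscaling from the 4K masters, but icon-heavy screens get a dedicated variant so they don't look bad at lower resolutions.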
Regarding the Ray Tracing effects, Alvin Liu claimed that the team used RTX for Ray Traced Emissives, Sky Light, and Ambient Occlusion, though they are still looking into further improving the Ray Tracing effects.

“Ray Tracing was on [in the demo]. We were showing off Ray Traced Emissives, Sky Light, and Ambient Occlusion. However, I’ve seen super impressive screenshots internally about raytracing (they get sent out in a digest e-mail), so we’re clearly still working on it, as they looked more impressive than what I remember seeing in the demo. Especially at night and with neon reflections. NVIDIA also has representatives who work with our studio to continue to improve and utilize this technology, similar to Witcher 3 and Hairworks.”
As is pretty obvious, the E3 2019 demo did not feature all of the performance optimizations that CD Projekt RED will implement in the final game. As such, the PC requirements for running this demo at 1080p/Ultra settings do not reflect those of the final product.

Before closing, here are the specs of the PC that was used to run the Cyberpunk 2077 E3 2019 demo.

  • CPU: Intel i7-8700K @ 3.70 GHz
  • Motherboard: ASUS ROG STRIX Z370-I GAMING
  • RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
  • GPU: Titan RTX
  • SSD: Samsung 960 Pro 512 GB M.2 PCIe
  • PSU: Corsair SF600 600W
Cyberpunk 2077 is currently scheduled for an April 16th, 2020, release.

Thanks WCCFTech

 

SliChillax

Member
I know people will jump to conclusions about a shit port, but to me that doesn't sound all that bad. They still have plenty of time to optimize the game, and Nvidia drivers tuned specifically for Cyberpunk will improve it even more. If you're gonna push graphical boundaries, even the most expensive GPU will get smacked hard, and I like that.
 
It's running at 1080p because of the raytracing.

Turn off raytracing and it would run at 4K on that GPU, no problem.

And that's exactly my problem with current raytracing. I can have raytracing but at the expense of 4k. That's a hard pass for me. Will wait for RTX 4000 or even 5000.
 

TeamGhobad

Banned
I highly recommend upgrading to an Xbox One X or a PS4 Pro. No way the baseline consoles can handle anything above 720p, and with limited crowds.
 

ZywyPL

Banned
They still have plenty of time to optimize the game, and Nvidia drivers tuned specifically for Cyberpunk will improve it even more.

Like they had time with WC3? Or Ubi with WD and Division? It's not gonna happen. You would've thought that, given time, they'd optimize the code much better, but there are infinite PC configurations, and in the end they just cut everything down and call it a day instead of leaving those absurdly heavy options for future hardware...
 

Agent_4Seven

Tears of Nintendo
I'm starting to wonder if I should wait for a 4080 Ti (or AMD's equivalent) to play this game maxed out at at least 1440p 60 instead of shitty 1080p with a shit ton of aliasing and other nasty looking stuff which for sure will ruin the whole experience for me. I mean, granted, we'll be playing a game and not GFX, but man, 1080p looks like shit now 😩
 

CyberPanda

Banned
I'm starting to wonder if I should wait for a 4080 Ti (or AMD's equivalent) to play this game maxed out at at least 1440p 60 instead of shitty 1080p with a shit ton of aliasing and other nasty looking stuff which for sure will ruin the whole experience for me. I mean, granted, we'll be playing a game and not GFX, but man, 1080p looks like shit now 😩
You don’t need to wait that long. You can easily run two 2080s in SLI.
 

Agent_4Seven

Tears of Nintendo
You can easily run two 2080s in SLI.
We don't know for sure how the release version of the game will behave and how well it'll be optimised. Plus, you don't need a 2080; two 1080 Tis are a much better and cheaper option if you can get two of the exact same card.
 
I'm starting to wonder if I should wait for a 4080 Ti (or AMD's equivalent) to play this game maxed out at at least 1440p 60 instead of shitty 1080p with a shit ton of aliasing and other nasty looking stuff which for sure will ruin the whole experience for me. I mean, granted, we'll be playing a game and not GFX, but man, 1080p looks like shit now 😩
Why not just play at 4K 60fps with RTX off?
 
Ouch... As others have said, they have time for optimisations, but tbf each of CD Projekt RED's games has been impossible to run at launch with all settings active. Anyone remember ubersampling in Witcher 2?

It's clear that it's going to be a stunner of a game even without RTX features though. I'm assuming the public gameplay footage didn't have that tech, and it looked great, and that was 1080p on a 1080 Ti.
 