
David Cage: Many Devs Will Prefer Ray Tracing @1080p Rather Than Limited Lighting @4K; We’ll Have a New Engine

Stop exaggerating. I have YouTube on my TV and can also put PS4 games on there; 1080p doesn't look that much worse than 4K. The PS5 is a different beast, the 4K videos do look notably better than the 1080p stream, but that stream might have had a low bitrate.

I am not exaggerating. Assuming you have a 6" phone screen, 144p will have approximately 20% more PPI than a 55" 1080p TV.

And wait, have you never even seen a game running natively at 4k? You are just basing this all off of youtube streams?
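For anyone who wants to check that figure, here's a quick back-of-the-envelope sketch (assuming 16:9 panels, so "144p" means a 256x144 frame filling the screen):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

phone = ppi(256, 144, 6.0)    # 144p frame filling a 6" phone screen
tv = ppi(1920, 1080, 55.0)    # 1080p on a 55" TV

print(f"phone: {phone:.1f} PPI, TV: {tv:.1f} PPI")
print(f"phone is {100 * (phone / tv - 1):.0f}% denser")
```

The phone comes out a little over 20% denser, so the comparison holds up.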
 
Let's exaggerate with 144p: sure, it's higher PPI. But again, I watch videos in 480p and 720p on YouTube on a 55-inch screen; the 480p looks a bit blurry, but generally it looks fine.

For 1080p, you can see the side-by-side comparison between the 4K Blu-ray and the 1080p Blu-ray.

Here, look at the side-by-side pictures.

There's a notable improvement with 4K, but 1080p looks quite good too.
 

Why are you comparing a movie? Not to mention one that wasn't even shot in 4K.

I have a PC; I can literally toggle games between 1080p and 4K on the fly... It is a massive difference.

But if you think 480p looks fine, I think you just have straight-up bad eyesight.
 
Movies shot on film are basically 4K in terms of detail. That is why old films like Lawrence of Arabia make spectacular 4K Blu-rays. Many modern Hollywood films use 2K effects and are often finished below 4K resolution.

On PC you're sitting about a foot or two away from the monitor. That's quite different from sitting 5 to 6 feet away from a TV.

That said, while there is a difference between 4K and 1080p, the point is that 1080p still looks quite good.




Link to source image for bigger image

Again, the 4K looks better, but 1080p is still quite good.

Now here's the issue. Take the RTX 3090: about 36 TFLOPS.

Have you heard of the Xbox Series X vs. the Xbox Series S? The Series S can run the same games with similar assets just by dropping the resolution.

Now, 4K has 4x the pixels of 1080p, and if you go 4K 60fps, that's 8 times the requirements of 1080p 30fps.

A game with mind-blowing graphics that would take 80 TFLOPS to run at 4K 60fps can easily run at 1080p 30fps on a PS5. That is, you can do far more effects and detail, as if you had 80 TFLOPS (RTX 4090+), just at a lower resolution. Designing for 1080p 30fps means the game can be almost beyond next-gen in graphics.
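The arithmetic behind that trade can be sketched quickly (a rough model that assumes rendering cost scales linearly with pixels per second, which real renderers only approximate):

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixel throughput a renderer must sustain at a given resolution and frame rate."""
    return width * height * fps

work_4k60 = pixels_per_second(3840, 2160, 60)     # native 4K at 60fps
work_1080p30 = pixels_per_second(1920, 1080, 30)  # native 1080p at 30fps

# 4K has 4x the pixels of 1080p; doubling the frame rate doubles that again.
print(work_4k60 / work_1080p30)  # 8.0
```

So under this simple model, a 1080p 30fps target frees up roughly 8x the per-pixel budget of a 4K 60fps target.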

EDIT:

Again, take Quake 2 and put it at 8K resolution. It will still look inferior to the latest Doom at 720p.

You have to keep in mind that an 8x reduction in resolution/framerate effectively gives you a generational jump, as generations tend to deliver about an 8x jump. You're in essence seeing what would likely take a PS6 to run at 4K 60fps.

As the Doom Eternal at 720p vs. Quake 2 at 8K example shows, if the jump in graphics is big enough, resolution doesn't matter.
 

Paracelsus

Member
This is one thing I will never get.
What is the point of real-life lighting if the image quality takes you out of the game anyway?
It's like having a real-life-ish character model with Nintendo 64 textures, horrible jaggies, and 20 frames per second; it makes no sense.
 

Neo_game

Member
Glad to read this. Resolution does make a difference, but it is a small factor when it comes to graphics. We went from 720p to 1080p, roughly a 2x jump in pixels, this current gen. So 1440p or 1600p should be the jump for next gen; they just need a decent scaling technique. 4K is 4x the pixels and will definitely compromise the graphics.

The Xbox Series S is the worst jump of any console generation when it comes to graphics. Devs should ignore it for good.
 
This is one thing I will never get.
What is the point of real-life lighting if the image quality takes you out of the game anyway?
It's like having a real-life-ish character model with Nintendo 64 textures, horrible jaggies, and 20 frames per second; it makes no sense.
Have you seen Horizon on PS4? Or Ratchet & Clank on PS4? The image quality is pristine. Next gen can use even more advanced AA techniques.

1080p is not 900p, where you get severe issues on either a 1080p or a 4K screen.
 

GiJoint

Member
1080p is too low on a decent sized 4K TV, graphics are noticeably more blurry. 1440p though? That’s a nice step up and allows that headroom to improve frame rate.
 

SlimySnake

Flashless at the Golden Globes
It doesn't have to be 1080p, nor does it have to use ray tracing. I like David Cage, but UE5 has shown that you don't need ray tracing to have fully dynamic global illumination.

A lot of games are linear, especially his games, and they don't need a full day/night cycle anyway; you can always bake GI in those games.

1440p 30fps is the sweet spot.
 

Chiggs

Member
David Cage might be the worst game developer in the entire industry right now...and that includes Randy Pitchford. Cage knows full goddamn well he can't cut it in the film business, so he takes his shit ideas and makes them into interactive experiences so fucking dreadful, they somehow actually make me both thankful and appreciative of garbage like Death Stranding.

If I were to rank David Cage on a scale of 1-10, with 1 being the lowest form of garbage in the games industry (for instance, Digital Foundry) and 10 being the apex of creativity and excellence (Bloodborne), I would rank David Cage a 2, because nobody is as bad as Framerate Rain Man, his failed model friend, the old mothering whore Richard Leadbetter, and that guy who is never on camera (and therefore more tolerable).
 

MilkyJoe

Member
20/20. At 6ft of distance one's sharper, but the other one ain't blurry.

If I'm seeing something sharp and you're seeing it blurry, maybe it's you who needs an optician.

Stop exaggerating. I have YouTube on my TV and can also put PS4 games on there; 1080p doesn't look that much worse than 4K. The PS5 is a different beast, the 4K videos do look notably better than the 1080p stream, but that stream might have had a low bitrate.

Errr... No
 

MilkyJoe

Member
when you get to a monitor or TV, check the link with the bigger pictures. You'll see that while 4K looks better, 1080p is still quite fine.
I didn't say it wasn't, but it ain't indistinguishable from 4K like the graph tells us.
 
I don't think the graph claims it's indistinguishable, just that the perceived additional detail might not be that significant at those distances and TV sizes. Perceiving it as a bit sharper is not enough to justify it.
 
The subtext here is that while these new machines are powerful, they're not doing ray-traced-everything without resolution shenanigans. It's a tradeoff that we all saw coming.

That said, I agree with other posters who said animation will be just as important, if not more important, than lighting. It's really strange to think about, but we have, through experience and daily interactions, an expectation of how humans and objects move and interact with one another. One of the fastest ways for a game to break its immersion is for the on-screen animations to betray our expectations of movement. For example, even though the next-gen version of NBA 2K21 looks fantastic, there's still a noticeable stiffness, pausing, and 'stutter' to how the players move. So even though the players' faces look accurate and the court is reflective as fuck, the way the players move breaks the realism in the graphics.

It's a tough problem to solve. But I believe the studios that can deliver the most accurate animations without a noticeable penalty to player input and character response time will be the ones that deliver the most 'next-gen' games.
 

sinnergy

Member
Ray tracing is less labour-intensive, and the end result is better and more realistic. I worked for years in offline rendering; our renderer was V-Ray, ray tracing. Even at 1080p the results will be impressive in games.
 
Movies shot on film are basically 4K in terms of detail. [...] If the jump in graphics is big enough, resolution doesn't matter.


Like I said, I have a PC. Here is a quick Borderlands 3 pic I just took, both with identical graphics settings. If you fullscreen that and can't see a massive difference in image quality, then I really don't know what to tell you. Everything even remotely far away just looks like a blurry mess to me.

xwrpVo0.jpg


Also, TFLOPS are not comparable across architectures, and the 3000-series TFLOPS are especially misleading.
 

sinnergy

Member
Yes, a final movie image can be composed of any kind of asset at different resolutions. If you are strict about it, there aren't that many fully 4K movies once you count the added CGI, and some are even full-HD native. 🤣 Fake it till you make it.

I do know that for Transformers: Revenge of the Fallen, the CGI for the IMAX scenes was also made in 4K.

You could say that VRS for consoles is about the same as authoring a movie this way, as it also allows different resolutions in the framebuffer.
 
He does have a point... the audience you can wow is far larger. Next-gen games can still look incredible on a 1080p TV, and if you can push more effects and better lighting by sacrificing only resolution, I say do it.

The leap in everything else is massive; give me all of those other cool features, and push them hard. That is where the engagement is, not in the resolution; we are so close to diminishing returns.

I think I would rather have 1080p and a solid FPS taking up only 30% of the processing power, while the rest is used for other amazing advances: real destruction, better AI, more simulation and NPCs on screen, interaction, animation, and better physics.

Rather than solid 4K at 30fps, taking up 70% of the same power and having much worse everything else.
 
Like I said, I have a PC. Here is a quick Borderlands 3 pic I just took, both with identical graphics settings. [...]
Opened in full screen, the 4K looks a bit better. Zooming in, the 1080p looks a bit blurry in the distance while the 4K looks sharp. But again, no one looks at these zoomed in; unless you sit like one foot away from the monitor, this is unlikely to be that noticeable.
 

Agreed, nobody looks at games zoomed in. Luckily you shouldn't have to, as I can very easily see a difference from several feet away even without my glasses.

It is a massive difference, because it is the difference between feeling like I am just looking at something naturally vs there is something wrong with my eyes preventing them from focusing. And that is not even taking into account all the shimmering once motion is involved.

Serious question, when was the last time you went to an eye doctor? Because if this isn't some weird bias and you actually can't easily see the difference, it might be time for a checkup.
 
I can see the difference; it's just that it ain't massive. One's sharper, but the other doesn't look bad.

I've played plenty of PS4 games at 1080p, and they look quite good. 4K looks better, but not some out-of-this-world better. This is usually what most people say. That's why 4K Blu-ray sales aren't as high as they should be.
 

Well, things looking sharper is very important for a lot of people. Most folks aren't happy with everything looking blurry, and once they are able to see things clearly, it is difficult to go back to everything being blurry. That's why glasses are so popular.

And yes, most people say 4K isn't a big upgrade, because most people can't play games at 4K. I suspect that with 4K gaming becoming more mainstream and accessible this next gen, a lot of folks will miraculously start changing their tune.

And again, movies are not comparable to games. Most of them are fake 4K or upconverted to 4K. Most gamers hate things like film grain, chromatic aberration, motion blur, etc. that try to make games look more "cinematic". But most importantly, unlike video games, a movie's source material (aka the real world) isn't internally rendered at 1080p; it is effectively captured at infinite resolution and downsampled to 1080p.
 

Iced Arcade

Member
Calm down guys, David is still over in the corner confused trying to figure out the difference between the big black box console and small white slate console that looks like a speaker.
 
Well, things looking sharper is very important for a lot of people. [...]

720p looks blurry, 480p looks blurry. 1080p looks sharp; 4K looks sharper still.

What you're telling me is that if I sit at 7 feet and get a 110-inch 4K TV, all of a sudden everything will look blurry, since that is about the same as having four 55-inch 1080p TVs stitched together. So everywhere I look, everything will look blurry, and I'm better off with a 55-inch 4K TV than a 110-inch 4K TV.

Sadly, that's not what most people think. People want 100+ inch TVs, and many are going to sit 7 to 8 feet away. It won't look blurry, because 1080p doesn't look blurry. Even if each 55-inch chunk of the TV has 1080p resolution, it won't matter.
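The geometry behind that comparison checks out: a 110-inch 16:9 panel is exactly twice the width and height of a 55-inch one, so each 55-inch quadrant of a 110-inch 4K screen carries a 1920x1080 patch of pixels at the same density as a 55-inch 1080p TV. A quick sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

# Doubling both the pixel counts and the diagonal leaves density unchanged,
# so a 110" 4K panel matches a 55" 1080p panel exactly.
print(math.isclose(ppi(3840, 2160, 110), ppi(1920, 1080, 55)))  # True
```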
 
D

Deleted member 471617

Unconfirmed Member
I actually agree with David Cage here. 1080p with ray tracing at hopefully 60fps. Hell yeah!!! Resolution is overrated. Give me that great performance and that sweet ray tracing goodness!!!
 

Warnen

Don't pass gaas, it is your Destiny!
If playing on a 60 inch tv I want 4k, if I’m playing on a 15 inch laptop 1080p is good. 60 inch 1080 can blow me.
 

M16

Member
That's great news. 4K is a waste of resources. 1080p/1440p should be the target for the best next-gen visuals.

I look forward to his next game; I enjoyed Detroit.
No, especially since consoles are mostly played on TVs.
1440p is sufficient on a computer monitor, but once you get to bigger TVs, there is a huge difference.
 

Reindeer

Member
To all those who are making fun of 1080p: can you really tell me that 1080p CGI movies from the early 2000s, like Finding Nemo or The Incredibles, look worse than anything we have today running in 4K? The answer is obviously no, so Cage is right about advanced visual effects being much more important than resolution. People are comparing the 1080p games we have today and think that's how a 1080p game will look on next-gen consoles, but obviously much higher-resolution textures, better image quality, better AA, and advanced visuals will make 1080p much more bearable in the future, just like those CGI movies from almost 20 years ago. 1440p would be better, yes, but 1080p can still look good under the right conditions.
 