Movies shot on film capture roughly 4K worth of detail. That's why older films like Lawrence of Arabia make spectacular 4K Blu-rays. Many modern Hollywood films, by contrast, use 2K effects and are often finished below 4K resolution.
On PC you're sitting a foot or two from the monitor. That's quite different from sitting 5 to 6 feet from a TV.
That said, while there is a difference between 4K and 1080p, the point is that 1080p still looks quite good.
Link to source image for bigger image
Again, the 4K looks better, but 1080p is still quite good.
Now here's the issue: take the RTX 3090, roughly 36 TFLOPS.
Have you heard of the Xbox Series X vs Xbox Series S? The Series S can run the same games with similar assets just by dropping resolution.
Now 4K has 4x the pixels of 1080p, and if you go 4K 60fps, that's 8 times the requirements of 1080p 30fps.
A game with mind-blowing graphics that would take 80 TFLOPS to run at 4K 60fps can easily run at 1080p 30fps on the PS5's roughly 10 TFLOPS. That is, you can do far more effects and detail, as if you had 80 TFLOPS (RTX 4090 territory), just at a lower resolution. Designing for 1080p 30fps means the game can be almost beyond next-gen in graphics.
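To make the arithmetic above concrete, here's a quick back-of-envelope sketch. The TFLOPS figures (36 for the RTX 3090, ~10 for the PS5, 80 for the hypothetical game) are the rough numbers from this post, not exact hardware specs, and the math assumes GPU cost scales linearly with pixels times framerate, which is a simplification:

```python
# Pixel counts at each resolution
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

# 4K pushes 4x the pixels of 1080p
pixel_ratio = pixels_4k / pixels_1080p        # 4.0

# Doubling the framerate on top of that gives 8x the total work
total_ratio = pixel_ratio * (60 / 30)         # 8.0

# A hypothetical game needing 80 TFLOPS at 4K 60fps would need
# only about a PS5's worth of compute at 1080p 30fps
needed_at_1080p30 = 80 / total_ratio          # 10.0 TFLOPS

print(pixel_ratio, total_ratio, needed_at_1080p30)  # 4.0 8.0 10.0
```

The same ratio read the other way is why an ~8x generational jump in hardware buys roughly one resolution-and-framerate step.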
EDIT:
Again, take Quake 2 and put it at 8K resolution. It will still look inferior to the latest Doom at 720p.
You have to keep in mind that an 8x reduction in resolution/framerate effectively gives you a generational jump, since generations tend to bring roughly an 8x increase in power. You're in essence seeing what would likely take a PS6 to run at 4K 60fps.
Like the Doom Eternal at 720p vs Quake 2 at 8K example, if the jump in graphics is big enough, resolution doesn't matter.