
new 4K monitors @ CES: Will it become the next PC standard?

1-D_FTW

Member
I'd rather see them in a projector. Knowing that $1k 1080p DLP projectors are $5 away from 120 Hz and you can do nothing about it is ANNOYING AS HELL. See, it makes me use caps and italics in the same sentence!

Haha. What really burns is that BenQ projector has a VGA port! A VGA port! WHAT ARE YOU GUYS DOING TO GEORGE? THESE PRETZELS ARE MAKING ME THIRSTY! Ditch the VGA port and at least go with DVI-D if you don't wanna spring for the extra 6 bucks!
 

Dynamic3

Member
Seeing as virtually every game is designed in 16:9, don't you actually lose real estate on the sides when using 16:10? They are, in a sense, expanding the image to fill the extra vertical space, pushing the sides out of view.
 

Mononoke

Banned
I doubt 4K will be the standard anytime soon. For a lot of gamers, the 1440p 27 inch monitors have been too expensive. This has led to very slow adoption in favor of 1080p. Unless 4K can hit the $500-700 range for 27 inch in the next 2 years, I'm not seeing it. At best, if 4K starts to flood the market and is pushed as the new standard for all TVs/monitors, then you might see 1440p prices drop, which will lead more people to buy that.

Even then, we aren't considering that 99% of monitors with resolutions that high don't offer 120hz (well, some Korean monitors can be overclocked to get close to that), and there has been a shift in PC gamers favoring smoothness/response time over resolution. So until major companies can sell 1440p/1600p 27-30 inch monitors at $500 that can do 120hz, I don't see there being a new standard over 1080p anytime soon.

Also, 4K resolution will do a number on most graphics cards, further alienating it from becoming the standard.
 

onQ123

Member
The only thing really holding 4K back is content & standard inputs; it's not as hard to push as some people think.

The Nexus 10 has a 2560x1600 screen & it's a $399 portable tablet.
 

mkenyon

Banned
Yep yep. The new standard is here, and it is 120hz. If it isn't offering that, it will not be adopted by enthusiasts.
 

Somnid

Member
The only thing really holding 4K back is content & standard inputs; it's not as hard to push as some people think.

The Nexus 10 has a 2560x1600 screen & it's a $399 portable tablet.

Yeah it is. Rendering at it, especially for more intense console/PC graphics, is not feasible. Even multi-GPU systems can choke on 2560x1600, let alone something double that. Video is generally fine, but bandwidth is not. HDMI won't even do 4K@60Hz under current standards, and 4K 3D isn't even possible over DisplayPort.
 

1-D_FTW

Member
Yeah it is. Rendering at it, especially for more intense console/PC graphics, is not feasible. Even multi-GPU systems can choke on 2560x1600, let alone something double that. Video is generally fine, but bandwidth is not. HDMI won't even do 4K@60Hz under current standards, and 4K 3D isn't even possible over DisplayPort.

Current standards? Even the new 300 MHz HDMI (that nobody seems to care about) can only do 4K@30Hz.
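For anyone who wants to check those numbers, the pixel-clock arithmetic is simple. A quick sketch (assuming the standard CEA-861 4K frame totals of 4400x2250 including blanking, and HDMI 1.4's 340 MHz TMDS ceiling):

```python
# Rough sketch of the pixel-clock math behind the HDMI limits above.
# The 4400x2250 totals (active 3840x2160 plus blanking) follow the
# standard CEA-861 4K video timings; 340 MHz is HDMI 1.4's TMDS ceiling.

def pixel_clock_mhz(total_w, total_h, refresh_hz):
    """Pixel clock in MHz for a total frame size (with blanking) at a refresh rate."""
    return total_w * total_h * refresh_hz / 1e6

uhd_30 = pixel_clock_mhz(4400, 2250, 30)  # 297 MHz: fits a ~300 MHz link
uhd_60 = pixel_clock_mhz(4400, 2250, 60)  # 594 MHz: way past HDMI 1.4's 340 MHz

print(uhd_30, uhd_60)
```

So 4K@30 lands just under that "300 MHz" figure, while 4K@60 needs roughly double the clock, which is why it has to wait for a new standard.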
 

MrBig

Member
Seeing as virtually every game is designed in 16:9, don't you actually lose real estate on the sides when using 16:10 as they are in a sense expanding the image to fill the extra vertical space, pushing the sides out of view?

If you're referring to the topic in the title, PC games can be played at nearly any resolution; there is no "designed" resolution.
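To put a number on the scenario Dynamic3 describes: if a game naively blew up a 16:9 frame to fill a 16:10 panel's height, this sketch shows how much would fall off the sides (a hypothetical illustration only; as noted, PC games generally just render at the panel's native resolution instead):

```python
# Hypothetical illustration: scale a 16:9 frame up until it fills a
# 16:10 panel's height, then measure what spills off the sides.
# (PC games normally render at native resolution, so no cropping occurs.)

def side_crop_fraction(src_w, src_h, dst_w, dst_h):
    """Fraction of the scaled image width that falls outside the panel."""
    scale = dst_h / src_h          # enlarge until heights match
    scaled_w = src_w * scale
    return (scaled_w - dst_w) / scaled_w

crop = side_crop_fraction(1920, 1080, 1920, 1200)  # 16:9 frame on a 16:10 panel
print(crop)  # 10% of the width would be pushed out of view
```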
 

1-D_FTW

Member
HDMI 2.0 can, according to Wikipedia.



Yeah, kinda gross.

I wasn't even aware there was an HDMI 2.0. Hopefully it's not as lousy and backwards-thinking as the first standard was. But, hey! Encryption so people couldn't make VCR copies of the movies they were watching! Score. I take it all back. HDMI ruled.
 

Mononoke

Banned
I just think, given the prices being thrown around, and the fact that most people today think $600 is too much for a 27-30 inch monitor, I can't see 4K becoming a standard in the next 2 years unless prices drop really low. And again, most gamers are now opting for higher refresh rates over resolution. Seriously, I can't tell you how many people I know who will not buy an expensive monitor unless it does 120hz. Doesn't matter how nice the picture quality or resolution is; if it doesn't have the refresh rate for gaming, a lot of people won't touch it. And while I'm not the voice of the entire gaming community (I wouldn't be so bold), my experience is that the general feeling among gamers is that 120hz is a must for PC gaming (where graphics and performance are pushed beyond the console counterparts). I personally haven't bought into this (I like higher resolution over response time), but I can understand why it's a popular view. Why would you spend all that money on a nice GPU that can push over 100 fps, and have a 60hz monitor?

So it all comes down to price and refresh rate. I just don't see people giving up 1080p for 4K right away. Then again, what do I know? Definitely not a business major. All the major companies/retailers sell their cheapest 2560x1440 27 inch IPS for $700, and yet Korean models that are just as good (or even better, since they can hit higher refresh rates) are much cheaper. So what the hell do I know.
 

Durante

Member
I wasn't even aware there was an HDMI 2.0. Hopefully it's not as lousy and backwards thinking as the first standard was. But, hey! Encryption so people couldn't make VCR copies of their movies they were watching! Score. I take it all back. HDMI ruled.
That reminds me that if we go beyond HDMI 1.4 I'll need a new HDCP stripper :(
 

onQ123

Member
Yeah it is. Rendering at it, especially for more intense console/PC graphics, is not feasible. Even multi-GPU systems can choke on 2560x1600, let alone something double that. Video is generally fine, but bandwidth is not. HDMI won't even do 4K@60Hz under current standards, and 4K 3D isn't even possible over DisplayPort.

So you just ignore the fact that I said


"The only thing really holding 4K back is content & standard inputs; it's not as hard to push as some people think.

The Nexus 10 has a 2560x1600 screen & it's a $399 portable tablet."
 

mkenyon

Banned
Why would you spend all that money on a nice GPU - that can push over 100 fps, and have a 60hz monitor?
Lower input latency.

Games poll input based on frames rendered. If you are pushing 60fps, that is 16.7ms of latency. If you are pushing 120fps, that is 8.3ms of latency. Every bit counts where you can eliminate it.
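The numbers above are just frame-time arithmetic, which a quick sketch makes explicit (assuming, as the post says, one input poll per rendered frame):

```python
# Frame-time arithmetic from the post above, assuming input is polled
# once per rendered frame.

def frame_latency_ms(fps):
    """Interval between input polls in milliseconds at a given frame rate."""
    return 1000.0 / fps

at_60 = frame_latency_ms(60)    # about 16.7 ms between polls
at_120 = frame_latency_ms(120)  # about 8.3 ms: half the input-sampling delay

print(round(at_60, 1), round(at_120, 1))
```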
 

Mononoke

Banned
Yeah, it's a common misconception. Glad you did say it, because now you know why!

Thanks for clarifying! I'm pretty aware of my own ignorance and lack of knowledge in certain areas, and would never pretend to know something I don't. So I'm always open-minded about learning from others. Appreciate it.
 
CES has just started, and already some 4K-ish monitors have been revealed.
This is possible thanks to the same new LCD technologies that have made high-res mobile screens possible (Nexus 10, iPad, etc.)

Additionally they introduced some very expensive ones like the ViewSonic VP3280-LED ($20,000+)

Resolutions have barely increased from the 1600x1200 CRT monitors available at the start of this gen (with 1440p & 1600p monitors making up only around 1% of Steam users)

What do you think the average mid-range setup will look like next gen?
  • PC gamers playing at 1440p or higher
  • PC gamers playing at 1080p with more demanding graphic options
  • PC gaming getting cheaper with games running on tablets/All in ones/mini-PCs etc



Edit 1: Official Press Releases with more info
added price
Edit 2: Detailed Specs of the sharp monitor (japanese)

Surprised it's still this expensive. 4K monitors have been around for a while in the world of medical radiology.
 

MrBig

Member
what about refresh rates and color depth?

Haven't advanced any of that shit.

Standardizing all panels to be true 8-bit and factory calibrated, as well as a higher prevalence of 120hz+, would be fantastic, but most consumers don't know or care about those options.
 

jmdajr

Member
standardizing all panels to be real 8bit, as well as higher prevalence of 120hz+, would be fantastic but most consumers don't know or care about those options.

wow, are we that far behind? Not even real 8-bit per channel?
 
Question... does anyone game successfully on a 30+ inch monitor? I can't imagine so... at that point you have to physically dart your eyes everywhere to see what's going on... It's the same reason I don't get multiple displays in games... Is it really more immersive to play if your eyes aren't even going to see most of it for the vast majority of gameplay?
 

jmdajr

Member
Question... does anyone game successfully on a 30+ inch monitor? I can't imagine so... at that point you have to physically dart your eyes everywhere to see what's going on... It's the same reason I don't get multiple displays in games... Is it really more immersive to play if your eyes aren't even going to see most of it for the vast majority of gameplay?

I think for monitors the main advantage would be graphics work. I mean, they shoot film at 4K, and cameras keep going way up in megapixels. I don't know if you need it to be 30 inches, but I'm sure the extra pixels help.

I guess picture-in-picture stuff would be cool too. But who is going to be making games run at that rez? You know, games that actually play optimally.
 
I'd rather just have a 24" 1920x1200 monitor that is lag/blur free and has true black levels

4K resolution is way too much for my system to handle
 
I think for monitors main advantage would be graphic work. I mean they shoot film at 4k, and cameras go way up in megapixels. I don't know if you need it to be 30 inches, but I am sure the extra pixels help.

I guess picture in stuff would be cool too. But who is going to be making games run at that rez? You know, that actually play optimally.

Resolution I can understand, but even in the opening post it mentions 30+ inch monitors, and I just don't get it. Before switching to my TV as my gaming PC's monitor (which sits a good 5-6 feet from the couch), I was using a 22" monitor, and even that could at times be difficult to track while playing certain games (mainly RTS games; with shooters you can usually stay pretty focused in the center of the screen and not miss much)
 
JordanKZ said:
The graphics card required to push 4K is magnitudes higher than 1080p. So right now? No. A few years? Probably.
As a guy doing stereoscopic gaming I've done quite a bit of 1920x1080x2 on mid-range graphics cards. I'm sure the beefier cards could double that resolution and not sacrifice effects/frame rate too much, especially since one could skimp on the AA a bit.
 

zoku88

Member
Yep, the majority of mid to lower end consumer monitors are 6 bit + FRC.

Which is really sad, given that my monitor, from 2007, has 8-bit color depth...

doesn't the Asus PA246Q (or something like that) have 10-bit color depth or something?
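For reference, the gap between those panel types is easy to quantify. A quick sketch ("colors" here means the native per-pixel palette, before any FRC dithering):

```python
# Quantifying panel bit depths: the native palette is (2^bits)^3 across
# the three RGB channels. 6-bit+FRC panels temporally dither to
# approximate the missing shades rather than display them natively.

def native_colors(bits_per_channel):
    """Number of distinct colors a panel can show without dithering."""
    return (2 ** bits_per_channel) ** 3

six_bit = native_colors(6)    # 262,144 colors (what a 6-bit+FRC panel really has)
eight_bit = native_colors(8)  # ~16.7 million colors
ten_bit = native_colors(10)   # ~1.07 billion colors

print(six_bit, eight_bit, ten_bit)
```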
 

x3sphere

Member
Question... does anyone game successfully on a 30+ inch monitor? I can't imagine so... at that point you have to physically dart your eyes everywhere to see what's going on... It's the same reason I don't get multiple displays in games... Is it really more immersive to play if your eyes aren't even going to see most of it for the vast majority of gameplay?

I game all the time on my 30", feels more immersive than gaming on my 42" TV, but that is mostly due to the resolution bump I think. I can see what's going on just fine. Is it hitting the upper limit of what my eyes can handle, yes... I don't think I'd go past a 30". Maybe a 32".
 

MrBig

Member
Which is really sad, given that my monitor, from 2007, has 8-bit color depth...

doesn't the Asus PA246Q (or something like that) have 10-bit color depth or something?

It is 10-bit, yes. The aRGB colorspace isn't widely supported or used for anything besides specialized purposes, though; 8-bit panels cover the sRGB standard, and 10-bit panels do with sRGB emulation.
 
I was gaming on a 1920x1200 widescreen from Dell before the 360 even launched. Don't feel like I need a higher res than 1080p though (I would murder for a 4K projector though. Imagine that shit in your front room)
 
Some people think 4K (3840x2160) is far away or almost impossible with current hardware. But that's not true; right now we only need the 4K monitors.

Mass Effect 3 at (3840x2160) 4k if you want

Not impossible, but a GTX 680 is pretty much top-end gear, maybe something like 5% of PC gamers have hardware like that. And with a more graphically intensive game like Far Cry 3, the results won't be nearly that smooth.

I'm sure 4K will be standard at some point, hopefully sooner rather than later, but I'd be surprised to see it pass up 1080p in adoption in the next couple of years. If you look at Steam stats, the trend is actually away from resolutions higher than 1080p.
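The raw pixel math behind that performance worry is easy to sketch (a first approximation only; actual GPU cost doesn't scale perfectly linearly with pixel count, but fill-rate and memory-bandwidth demands roughly track it):

```python
# Pixel counts relative to 1080p: a first-order estimate of how much
# more a GPU has to shade per frame at higher resolutions.

def pixels(w, h):
    return w * h

ratio_4k = pixels(3840, 2160) / pixels(1920, 1080)     # exactly 4x the pixels
ratio_1440p = pixels(2560, 1440) / pixels(1920, 1080)  # about 1.78x

print(ratio_4k, ratio_1440p)
```

Which is why a GTX 680 managing Mass Effect 3 at 4K says little about heavier titles: the card is doing four times the per-frame work of 1080p before any settings change.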
 

zoku88

Member
it is 10-bit, yes. The aRGB colorspace isn't widely supported/used for anything besides specialized purposes though; 8-bit panels cover the sRGB standard, or 10-bit do with sRGB emulation.

That's true.

Can you even output 30-bit color on consumer gfx cards w/o driver hacks?
 

dr_rus

Member
Yep yep. The new standard is here, and it is 120hz. If it isn't offering that, it will not be adopted by enthusiasts.
There is some truth in this. If we're talking about 24-30" display sizes I'd rather go with 120Hz refresh rate instead of 4K resolution.
 

xenist

Member
Not impossible, but a GTX 680 is pretty much top-end gear, maybe something like 5% of PC gamers have hardware like that. And with a more graphically intensive game like Far Cry 3, the results won't be nearly that smooth.

I'm sure 4K will be standard at some point, hopefully sooner rather than later, but I'd be surprised to see it pass up 1080p in adoption in the next couple of years. If you look at Steam stats, the trend is actually away from resolutions higher than 1080p.

The thing with PCs is that there's no "this can't be done," only "wait until next year."

And resolutions being stuck on PCs happened because there are few monitors being made above 1080p. Once the industry makes up its mind about 4K, the race will begin again.
 
The thing with PCs is that there's not "This can't be done" there's "Wait until next year."

And resolutions being stuck on PCs happened because there are few monitors being made at above 1080p. Once the industry make up their minds about 4k the race will begin again.

Sure, I just think it's still a few years off. Especially once we get new consoles and no longer have the luxury of "free" IQ and framerate because our games are being built for 7 year old hardware. Jumping from 1080p to 4K is going to be a pretty massive performance hit that I don't think most people will want to take (until it becomes cheap enough to be a no-brainer).

And there are plenty of monitors being made >1080p today... the mainstream is just not adopting them very quickly. Maybe the mobile resolution arms race will accelerate the process, but at the moment the trend seems to be toward lower-resolution laptops, things are pretty stagnant/shrinking for higher-resolution desktop displays (based on the Steam HW survey, anyway).
 