
What's your image quality index?


  • 2160p and q<0.9

    Votes: 4 11.4%
  • 2160p and 0.9<q<1.1

    Votes: 2 5.7%
  • 2160p and q>1.1

    Votes: 13 37.1%
  • 1440p and q<0.9

    Votes: 0 0.0%
  • 1440p and 0.9<q<1.1

    Votes: 4 11.4%
  • 1440p and q>1.1

    Votes: 2 5.7%
  • 1080p and q<0.9

    Votes: 2 5.7%
  • 1080p and 0.9<q<1.1

    Votes: 2 5.7%
  • 1080p and q>1.1

    Votes: 6 17.1%

  • Total voters
    35

Haemi

Member
Since most discussions about resolution are very subjective, I want to try a more objective approach here and compare numbers with the other people in this thread.

I don't want to just post a graph and ask where you are on it. Instead I want to show you my calculations, so that you can actually follow them and give me your opinion on them.

First of all, the resolution of the human eye can be measured in arc minutes. A typical human can tell two dots apart when their angular separation is larger than about one arc minute. One arc minute is one sixtieth of a degree. And because this limit is an angle, it holds regardless of how far away the objects are.

Now, to get the distance between two pixels on a screen, we need the screen height and its vertical resolution.

Because the screen size is defined by its diagonal, we first need the proportional length of the diagonal on a 16:9 screen:

diag = sqrt(16 * 16 + 9 * 9) = 18.3575...

and now we can calculate the pixel (center to center) distance:

pixDist = (screenSize * 9) / (diag * vertRes)

But what we have now is a distance in inches, not in arc minutes. To get the angular distance, we need the sitting distance and the arctangent:

pixRadianDist = arctan(pixDist / sitDist)

and now converting the angular distance from radians to arc minutes:

pixArcDist = (60 * 360 * pixRadianDist) / (2 * pi)

Now we can simplify the calculation by doing two things:

1. For angles close to zero we can drop the arctan: arctan(x) ≈ x
2. We can combine all constants into one

The result is:

pixArcDist = (1685 * screenSize) / (sitDist * vertRes)
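
In case anyone wants to check where the 1685 comes from, here is a quick Python sketch (my own, just redoing the steps above; the variable names are mine):

Code:
import math

# Combine the constants from the steps above:
# (60 * 360 / (2 * pi)) * (9 / sqrt(16^2 + 9^2))
arcmin_per_radian = 60 * 360 / (2 * math.pi)    # ~3437.75
height_per_diag = 9 / math.sqrt(16**2 + 9**2)   # ~0.4903 (screen height / diagonal)
print(arcmin_per_radian * height_per_diag)      # ~1685.4 -> the 1685 in the formula

# The arctan shortcut barely matters at these angles. With a 55" screen,
# 2160 vertical pixels and a 157" sitting distance:
pix_dist = (55 * 9) / (math.sqrt(16**2 + 9**2) * 2160)   # inches, center to center
exact = math.atan(pix_dist / 157) * arcmin_per_radian    # with arctan
approx = 1685 * 55 / (157 * 2160)                        # simplified formula
print(exact, approx)                                     # both ~0.273 arc minutes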

Since we prefer a bigger number to mean better, I'm inverting the formula and introducing the "image quality index" q:

q = (sittingDistance * verticalResolution) / (1685 * screenSize)

Take your calculator, plug in your numbers and tell us your (perceived) "image quality index":

q = (sittingDistance * verticalResolution) / (1685 * screenSize)

q < 1: Pixels get more and more noticeable the smaller q gets. AA will only bring small improvements.
q > 1: Pixels start to blend into each other for higher q values (your eye is doing SSAA). Increasing resolution beyond q = 2 brings almost no improvement.
q = 1: Great experience with a good AA solution.




For example in my living room:

55" 4k (2160p) and 157" inch sitting distance

q = (157 * 2160) / (1685 * 55) = 3.66

=> I need a bigger TV for 4K. I would like a q value of 1.2 - 1.5, but I don't think going beyond 85" will be an option.

My PC as another example:

28" 4k (2160p) and 40 inch sitting distance

q = (40 * 2160) / (1685 * 28) = 1.83

=> Awesome picture. AA not necessary at 4k.
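
If you don't want to grab a calculator, here is a tiny Python helper (my own naming, same formula as above) that reproduces both examples:

Code:
def image_quality_index(sitting_distance_in, vertical_resolution, screen_size_in):
    # OP's q: sitting distance (inches) * vertical resolution / (1685 * diagonal in inches)
    return (sitting_distance_in * vertical_resolution) / (1685 * screen_size_in)

print(round(image_quality_index(157, 2160, 55), 2))  # 3.66 -> living room TV
print(round(image_quality_index(40, 2160, 28), 2))   # 1.83 -> PC monitor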
 

Kuranghi

Member
So since I got ~1.8 with my figures, are you saying I need a 130" 4K TV at my sitting distance of 96" (8') to get the full benefit of 4K?

What exactly are you getting at with this? Not knocking it, I just don't understand what we are supposed to draw from it.
 

cormack12

Gold Member


2160 and 0.47?
 

Haemi

Member
So since I got ~1.8 with my figures, are you saying I need a 130" 4K TV at my sitting distance of 96" (8') to get the full benefit of 4K?

What exactly are you getting at with this? Not knocking it, I just don't understand what we are supposed to draw from it.

With good AA, everything above 1.0 is not really necessary for a good experience, and you would get more out of the image by getting a bigger screen or sitting closer.

I just have the suspicion that most people don't have the right conditions for 4K, and that publishers and studios should concentrate more on 1080p60 or 1080p30.
 

GymWolf

Member
1440p at worst, 1800p/4K if I can tweak the settings or there is DLSS (or the fake 4K on consoles), and I play 1 m to 1.4 m away from my 55" OLED. Am I a big boy or what?!
 
q = (sittingDistance * verticalResolution) / (1685 * screenSize)

For example in my living room:

55" 4K (2160p) and a 157-inch sitting distance

q = (157 * 2160) / (1685 * 55) = 3.66

=> I need a bigger TV for 4K. 200" would be ideal...
Hmm, with this calculation you personally would get q = 1 with a vertical resolution of 590p as well. Can we really draw meaningful conclusions from this?
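
To spell out where that 590p comes from, just rearrange your own formula for q = 1 (quick Python sketch, my own naming):

Code:
def resolution_for_q1(screen_size_in, sitting_distance_in):
    # q = (sitDist * vertRes) / (1685 * screenSize) = 1, solved for vertRes
    return 1685 * screen_size_in / sitting_distance_in

print(round(resolution_for_q1(55, 157)))  # ~590 vertical pixels for your 55" at 157"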
 

Kuranghi

Member
With good AA, everything above 1.0 is not really necessary for a good experience, and you would get more out of the image by getting a bigger screen or sitting closer.

I just have the suspicion that most people don't have the right conditions for 4K.

Okay, but what does my number tell you about my particular setup?
 

Haemi

Member
Hmm, with this calculation you personally would get q = 1 with a vertical resolution of 590p as well. Can we really draw meaningful conclusions from this?
When I'm watching a YouTube video at 720p, the sharpness is the same as at 1080p. The only difference I notice is the compression artifacts.
Okay, but what does my number tell you about my particular setup?
1440p would be enough for you. Go for performance mode.
OP sits 13 feet away from his TV? WTF.
Big living room, and I did not want to put the sofa in the middle of the room...
 

Mithos

Member
If I understood it correctly...

q = (80 * 1080) / (1685 * 50) = 1.0255

So I guess (1080p and 0.9<q<1.1).
 
When I'm watching a YouTube video at 720p, the sharpness is the same as at 1080p. The only difference I notice is the compression artifacts.

1440p would be enough for you. Go for performance mode.

Big living room, and I did not want to put the sofa in the middle of the room...
Hmmm OK. I'm 118 inches away from my 65" TV, which gives me an "optimal" resolution of 928p. I think anyone with two eyes could spot the difference between <1080p and 4K on my TV; I sure know I can. In movies, YouTube and games.
 

Kuranghi

Member
1440p would be enough for you. Go for performance mode.

You are saying there are diminishing returns between 1440p and 2160p for a 65" 4K TV being viewed from 96"?

I just can't agree, there is a clear difference for me between even 1800p and 2160p with my conditions. Let alone 1440p and 2160p.

When I'm watching a YouTube video at 720p, the sharpness is the same as at 1080p. The only difference I notice is the compression artifacts.

Sorry to be blunt, but if this is how you feel then: you aren't close enough to the TV, your upscaler is complete crap, and/or you need glasses. That's more than double the pixels; there should be a noticeable difference.

I don't think your math makes any sense, as others have already pointed out above.

What exact TV model do you have?
 

R6Rider

Gold Member
Sorry to be blunt, but if this is how you feel then: you aren't close enough to the TV

This. 13 feet away from a 55-inch TV is insane. Generally you can sit further away for movies and TV, but for gaming it's an even bigger issue.

I find the inch-to-feet "algorithm" to be nearly perfect. On my 27" monitor I sit just over 2 feet away. From my 55-inch TV I sit between 5 and 6 feet away. Essentially sitting one foot away for every 10 inches of screen when gaming.
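
For what it's worth, if you plug that rule of thumb into the OP's formula, the screen size cancels out and q only depends on resolution (my own quick check, in Python):

Code:
def q_for_rule_of_thumb(vertical_resolution):
    # "One foot (12") of sitting distance per 10 inches of screen" means
    # sitting_distance = 1.2 * screen_size, so screen size drops out of q.
    return 1.2 * vertical_resolution / 1685

print(round(q_for_rule_of_thumb(2160), 2))  # ~1.54 at 2160p
print(round(q_for_rule_of_thumb(1080), 2))  # ~0.77 at 1080p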
 

Reizo Ryuu

Gold Member
I'm generally leaning back so at UHD it would be:
(45x2160)/(1685x55) = 1.04882654438
For 1080p it would be :
(45x1080)/(1685x55) = 0.52441327218

I use both and 1080p is noticeably blurrier, but not enough for me not to pick it for higher framerates.
 

killatopak

Member
I’m not reading all that.

Just give me 1080p at least, and ideally 1440p, with all the bells and whistles at a minimum of 60fps.

4K is too wasteful for me, and the difference I can perceive is not worth the price or the framerate cost.
 

Kuranghi

Member
This. 13 feet away from a 55-inch TV is insane. Generally you can sit further away for movies and TV, but for gaming it's an even bigger issue.

I find the inch-to-feet "algorithm" to be nearly perfect. On my 27" monitor I sit just over 2 feet away. From my 55-inch TV I sit between 5 and 6 feet away. Essentially sitting one foot away for every 10 inches of screen when gaming.

It's funny you mention that one-foot-per-10-inches thing, because I always say I wouldn't want to be closer than I am. I thought I was 8 feet away, but I got a new chair not long ago and didn't remeasure; I just did, and it's actually 7' now because it's closer than before.

So that lines up with your thinking: if I were more than a foot closer than I am now, I'd start to lose that instant noticing of things at the edges of the screen.

OP, IMO at a 13-foot viewing distance you should be using a 65" minimum, but preferably a 75", if you want to get the full benefit of 4K.
 

GrayDock

Member
2.11 = (107 x 2160)/(1685 x 65), which I think is good overall, but sometimes I sit a little closer (85") due to some dumb small fonts.
To get q = 1 I would need a 137", which I think would be too big.

The rtings calculator says a 70" would be ideal for my distance, and I agree with them.
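
(Same formula plugged in, if anyone wants to double-check my numbers in Python:)

Code:
print(round((107 * 2160) / (1685 * 65), 2))  # ~2.11 for the 65" at 107"
print(round(107 * 2160 / 1685))              # ~137" diagonal needed for q = 1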
 

Amey

Member
How come the 'sitting distance' value is in the numerator? I'm supposed to experience better image quality by sitting farther away from the screen?
Seems to me this q is a measure of the screen-door effect rather than image quality.

My q = (24" x 1080) / (1685 x 24") = 0.64
 

Miles708

Member
42" and I sit 2 meters away, which is 78 inches in that absurd measurement system of yours. Also 1080p, so the result is around 1.19. So, in your system, I'm OK?
 
Mine is (98 x 2160)/(1685 x 55) = 2.28
I sit 2.5 m from my 55-inch LG CX, and with a PS5 on it, it looks gorgeous. But I play Ratchet & Clank: Rift Apart in RT performance mode, which is more like 1440p-ish, so:
(98 x 1440)/(1685 x 55) = 1.52. Fine by me! Looks absolutely amazing!
 

MetalRain

Member
27" 4K display roughly one meter away (39 inches) so q is (39 * 2160) / (1685 * 27) or about 1.85

I don't know how much resolution would be too much for my eyes detect, but I think using these calculations it would be at least q > 2 maybe even 3 or 4.
 

Fbh

Member
55" minimum. It will be glorious! The prices on quality tv's has really come down in recent years.
Yeah, used to have a 55" Samsung (JS8500). Really liked it.

Then I moved back to my home country and the thing was too big to take with me so I sold it. Have spent the last year with other priorities so I just used my old 32" set, but I'll be going back to 55" as soon as I see some good deal.
 

Haemi

Member
How come the 'sitting distance' value is in the numerator? I'm supposed to experience better image quality by sitting farther away from the screen?

Yes, because the pixels are getting "smaller", and this looks better than having to look at huge squares.
 
28" 4K monitor, normal desk sitting distance. I notice all the detail and 1440p and below look like ass to me. Tons of people on here say 1440p is basically indistinguishable from 4K and I strongly disagree.
 

Haemi

Member
28" 4K monitor, normal desk sitting distance. I notice all the detail and 1440p and below look like ass to me. Tons of people on here say 1440p is basically indistinguishable from 4K and I strongly disagree.
The reason for this is that 2160 is not divisible by 1440, so the image gets blurry. And even if it were divisible, I don't think most monitors or GPU drivers do integer scaling, so you would still get a blurry image.
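
A quick way to see the divisibility point (my own sketch in Python, not anything a monitor actually runs):

Code:
# Which common render resolutions map to a whole number of lines on a 2160-line panel?
panel = 2160
for render in (1800, 1440, 1080, 720, 540):
    factor = panel / render
    note = "integer scale" if factor.is_integer() else "non-integer -> interpolation blur"
    print(f"{render}p -> x{factor:.2f} ({note})")
# 1080p, 720p and 540p map each source line to a whole block of panel lines;
# 1440p lands at x1.5, so detail gets smeared across neighbouring pixels unless
# the upscaler does something smarter.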
 
The reason for this is that 2160 is not divisible by 1440, so the image gets blurry. And even if it were divisible, I don't think most monitors or GPU drivers do integer scaling, so you would still get a blurry image.
Yes, but doesn't the resolution vary by game? I have no control over that. A game might be 1440p and then upscaled to 4K. Upscaled 4K never looks as good as native 4K.
 

Haemi

Member
Yes, but doesn't the resolution vary by game? I have no control over that. A game might be 1440p and then upscaled to 4K. Upscaled 4K never looks as good as native 4K.
You can compensate for this by sitting further away. That's hardly possible on a PC, and IMO this is the reason higher resolutions only really matter for PCs.
 

DaGwaphics

Member
This seems a bit rigged in favor of desktop users. A 24" 1080p monitor at about 2.5 feet nets me about 0.8, if I've done that right at all.
 

Bo_Hazem

Banned
Don't know, but I'm about 2 meters away from a 55" 4K HDR TV and would happily upgrade to 8K in the future, and to 12-bit or more. I'm happy with 4K now, though I could easily notice it being inferior to 8K. I can still watch native 1080p, though.
 

rofif

Can’t Git Gud
1.07
48", 1 meter away, 4K. Maybe closer than 1 meter if I play PC games; 1 meter leaned back with a controller.
Absolutely cannot see pixels or jaggies. Looks fantastic. Of course, 27" 4K Windows text looked like vectors, but I could not see all the 4K benefits at 27".
 