Old 03-18-2019, 05:55 PM   #26
AnotherCat
Posts: 1,547
Karma: 18068960
Join Date: May 2012
Device: ....
Quote:
Originally Posted by maximus83
I love it when I've read these articles of the type saying: the human eye cannot distinguish resolutions higher than X, or more colors than Y.

Uh, yes it can. :-) I absolutely grasp the theoretical principles behind why you shouldn't, in theory, be able to tell the difference between an HD and a UHD movie when sitting more than about 4 feet from your TV. But I can--and everyone I know can. Same thing between a 200 DPI screen and a 300 DPI screen. Or photos taken at 12 MP and those taken at 20 MP. The difference is not hard to distinguish, particularly if you work in front of screens all day as I have for over 25 years.

Maybe everyone doesn't CARE enough about higher resolution to spend the money for it, and maybe it doesn't make a huge difference in the end--that I'll buy. That's a value decision. But people can tell the difference between 200 and 300 DPI, HD and UHD, and non-HDR and HDR pictures. I love the crispness of the 323 PPI screen on my Nexus 7, for instance, and can see a clear difference in text clarity between it and the cheaper Samsung and Amazon tablets I've seen at Best Buy in the ~200 PPI range.
If you reread my post you will see that I did not use the word "resolution" in connection with displays at all; I think that has confused you into believing that the angular resolution of the human eye is somehow the same thing as "screen resolution". They are, in fact, different things. I referred only to the angular resolution of the eye, which, allowing for some small variability between people, is what limits our ability to resolve adjacent small points.

It has become common practice to use the term "screen resolution" when referring to the pixel density of displays. But pixel density has no bearing on the angular resolution of the human eye; the eye's resolution is limited angularly, which means (if one does the trigonometry) that individual pixels at the densities of the display examples I gave cannot be resolved by the human eye at the reading distances of those displays.
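To put some numbers on that trigonometry, here is a rough back-of-the-envelope sketch in Python. The 300 PPI density, the 35 cm reading distance and the ~1 arcminute figure commonly cited for normal vision are illustrative assumptions, not measurements of any particular device:

Code:
import math

EYE_RESOLUTION_ARCMIN = 1.0   # ~1 arcminute is the commonly cited limit for normal (20/20) vision
ppi = 300                     # assumed pixel density of a high-density reading display
distance_cm = 35              # assumed reading distance

pixel_pitch_mm = 25.4 / ppi                                   # physical width of one pixel
pixel_angle_rad = math.atan(pixel_pitch_mm / (distance_cm * 10))
pixel_angle_arcmin = math.degrees(pixel_angle_rad) * 60

print(f"One pixel subtends about {pixel_angle_arcmin:.2f} arcmin")            # ~0.83 arcmin
print("Resolvable by the eye?", pixel_angle_arcmin > EYE_RESOLUTION_ARCMIN)   # False

At those assumed numbers a single pixel falls below the eye's angular limit, which is all I was saying.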

As I pointed out (without any reference to "screen resolution"), "Other aspects of the screen are more important to the clarity of the display for reading", and another poster has made a similar point regarding quality.

You refer to TVs; I am not sure why, because TV displays are of quite poor quality (by "quality" I am not referring to pixel density) compared with even mid-range small-device LCD displays from a name maker, so they do not bear comparison. And if you are comparing 1080p to 4K, the main thing you are seeing is the result of the higher bandwidth available, which allows better encoding, fewer artifacts, and less compression. If you ever have the opportunity to view professionally produced, uncompressed 1080p video, you may be surprised at how well it compares with the 4K you see on a TV set, even though 4K has four times the pixel count (twice the linear resolution).
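As a purely illustrative comparison (the bitrates below are assumptions for a typical 1080p Blu-ray and a typical 4K stream, not measurements of any particular service), the 1080p source can carry far more data per pixel than the 4K stream:

Code:
# Hypothetical, illustrative bitrates: ~35 Mbit/s for 1080p Blu-ray, ~16 Mbit/s for streamed 4K
def bits_per_pixel_per_frame(bitrate_mbps, width, height, fps=24):
    return (bitrate_mbps * 1e6) / (width * height * fps)

print(f"1080p at 35 Mbit/s: {bits_per_pixel_per_frame(35, 1920, 1080):.2f} bits/pixel/frame")  # ~0.70
print(f"4K    at 16 Mbit/s: {bits_per_pixel_per_frame(16, 3840, 2160):.2f} bits/pixel/frame")  # ~0.08

Which is why encoding and bandwidth, not the pixel count, dominate what you actually see.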

You also refer to photography. I think photographers familiar with in-camera sensor technology, in-camera processing, and digital post-processing will tell you that camera sensor resolution in pixels is not a good indicator of the "resolution", as the eye perceives it, of the finished output (when the same camera optics are used). That applies even to the Raw output from the camera (with no in-camera compression of the Raw file). In fact, photography is one area where being mesmerized by megapixels does not afflict informed users the way it does uninformed users of consumer-grade products: a simple example is the informed camera community's ready acceptance that a 24-megapixel full-frame (35 mm) sensor camera is generally better than a 24-megapixel APS-C sensor camera, despite the smaller sensor having a much higher pixel density.
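To illustrate that last point with rough numbers (the sensor dimensions are nominal, and this ignores optics and processing entirely), at the same 24 MP the full-frame photosites come out substantially larger:

Code:
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    # assume square pixels spread evenly over the sensor area
    area_um2 = (sensor_width_mm * 1000) * (sensor_height_mm * 1000)
    return math.sqrt(area_um2 / (megapixels * 1e6))

print(f"Full frame 24 MP: ~{pixel_pitch_um(36.0, 24.0, 24):.1f} um pixel pitch")  # ~6.0 um
print(f"APS-C      24 MP: ~{pixel_pitch_um(23.5, 15.6, 24):.1f} um pixel pitch")  # ~3.9 um

Roughly 6 um versus 4 um, which is a big part of why the larger, lower-density sensor tends to produce the better-looking output.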

If working for over 25 years in front of screens has any relevance to one's knowledge of display technologies, then you will have to concede that I am considerably better informed than you, as I have spent many more than 25 years in front of displays of various technologies in many services (including industrial applications as well as the more common desktop ones).

But again, I am not claiming that pixel density cannot be important; only that, despite your disputing it, it is a physical fact that for the examples I gave in my earlier post the eye cannot resolve the pixels at normal reading distances, and that the quality of the display (and of the processing before it) has much to do with how we perceive "resolution".

Last edited by AnotherCat; 03-18-2019 at 05:59 PM.