> It is recommended that the reference pixel be the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. For a nominal arm's length of 28 inches...
It would seem that the reference pixel is simply 1/2688 of the typical distance between your eyes and the device. If a device is meant to be used at half the "arm's length" distance (14 in), the reference pixel on that device would be only half as large. If a device is meant to be used at three times the distance (84 in), the reference pixel would be three times larger. Much easier than angular diameters.
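To make that concrete, here's a small sketch of the arithmetic (the 28in/96dpi numbers are the spec's; the function name is just mine):

```python
# Numbers from the CSS spec's reference-pixel example:
ARMS_LENGTH_IN = 28      # nominal viewing distance
REFERENCE_DPI = 96       # density of the reference display

# Physical size of one pixel on the reference display.
px_size_in = 1 / REFERENCE_DPI

# Ratio of pixel size to viewing distance -- the "1/2688" figure.
ratio = px_size_in / ARMS_LENGTH_IN
print(ratio)

# At other design distances, the reference pixel scales linearly
# (small angles, so no need for actual angular diameters):
def reference_px_size(viewing_distance_in: float) -> float:
    """Physical size (inches) of one reference pixel for a device
    meant to be viewed at the given distance."""
    return viewing_distance_in * ratio

print(reference_px_size(14))  # half the distance -> half the size
print(reference_px_size(84))  # 3x the distance  -> 3x the size
```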
I suspect I know why I was downvoted; I used unnecessary emotive language ("dumb") and didn't explain my point clearly. Most of the rest of the commenters were focused on one part of the article's point, which is very relevant -- the idea that a pixel is no longer a pixel, but a particular fraction of an inch of screen space. I was complaining about a different part, which is the article author's claim that the function mapping real pixels to CSS pixels is nonlinear (which I think is just a misreading of what the spec intended.)
Personally, I'm disgusted at the W3C standard. It's a great idea to have an angular measure (really great) but to call it a "pixel" is horrible. A pixel is the smallest controllable dot on a physical display, and nothing else. Call it an "aixle" abbreviated "ax" and short for "angular pixel" but don't overload the term "pixel".
Personally I've made peace with it. I say either "CSS pixels" or "device pixels", depending on what I want to express.
And although high-resolution screens have only made the difference between the two more visible, it has been there since Opera introduced full-page zoom many years ago, and since Mobile Safari introduced the "viewport" meta tag in 2007.
I mean, think about it. Say you have a 600dpi printer. Take a typical web page that sets its body element to be 1000px wide, because the person writing it was using a 96dpi display. If "px" really meant "smallest controllable dot", that web page would print about 1.66 inches wide, which is obviously undesirable. On the other hand, if "px" means "the length that looks about as long as one pixel on a 96dpi display", the same web page would print about 10.4 inches wide, which is probably much closer to what both author and user wanted.
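The two interpretations are just two divisions; a quick sketch of the 600dpi example:

```python
PAGE_WIDTH_PX = 1000      # the page's body width
PRINTER_DPI = 600         # dots per inch of the printer
CSS_REFERENCE_DPI = 96    # the display the author designed on

# Interpretation 1: "px" = smallest controllable dot on the output device.
naive_width_in = PAGE_WIDTH_PX / PRINTER_DPI
print(round(naive_width_in, 2))   # 1.67 -- unusably small on paper

# Interpretation 2: "px" = the length of one pixel on a 96dpi display.
css_width_in = PAGE_WIDTH_PX / CSS_REFERENCE_DPI
print(round(css_width_in, 2))     # 10.42 -- roughly what was intended
```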
This is also exactly why Apple did the "pixel doubling" thing on iPhone 4 and iPad 3: it was done to prevent existing content that made certain assumptions about the visible size of "px" from breaking.
Screens are more accurately measured in PPI (pixels per inch), while the smallest elements a printer can produce (more akin to each of the 8-bit sub-pixels on a screen) are measured in DPI (dots per inch). Since ink is 1-bit, more of these smaller elements (dots) are needed in some sort of dithered pattern to represent grays and colors.
Using halftone screening [1], the image elements are called lines, so a 600dpi printer is capable of producing 85–105 LPI (lines per inch) [2].
The lines per inch of print are more analogous to the pixels per inch of a screen than dots per inch are.
So, that 96ppi LCD and the 600dpi printer have around the same information density for practical purposes.
[1] http://en.wikipedia.org/wiki/Halftone [2] http://en.wikipedia.org/wiki/Halftone#Resolution_of_halftone...
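The usual rule of thumb connecting the two: each halftone cell is (dpi/lpi) dots on a side, so the number of representable gray levels is (dpi/lpi)^2 + 1. A sketch applying it to the 600dpi figures above (the function name is mine):

```python
def gray_levels(printer_dpi: int, screen_lpi: int) -> int:
    """Halftoning rule of thumb: a cell of (dpi/lpi) dots per side
    can show (dpi/lpi)^2 distinct dot counts, plus all-white."""
    dots_per_cell_side = printer_dpi / screen_lpi
    return int(dots_per_cell_side ** 2) + 1

# A 600dpi printer across the LPI range quoted above:
print(gray_levels(600, 85))    # coarser screen, more gray levels
print(gray_levels(600, 105))   # finer screen, fewer gray levels
```

This is why the same printer trades spatial resolution for tonal resolution: pushing LPI up toward the printer's DPI leaves too few dots per cell to fake grays.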
To my knowledge, all desktop browsers ignore this spec and treat each pixel as a pixel. (This will likely change with the upcoming Retina MacBook Pros.)
For a while, all mobile devices treated a pixel as a pixel too. But then iOS and Android devices began to dramatically increase their DPI. In the case of iOS, the math is easy: everything gets multiplied by 2 (though chasing pixel precision in a browser does still require hacks [1]).
Android is much more fragmented (go figure). System-wide, there is a DPI setting that influences the viewport pixel size the browser claims. For an 800x480 screen, a 1.5x multiplier is used, and the browser advertises the mobile-standard 320px viewport width.
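Both the Android density buckets and the iOS doubling are the same linear mapping; a minimal sketch (function name is mine, not a browser API):

```python
def css_viewport_width(device_px_width: int, device_pixel_ratio: float) -> float:
    """CSS pixels the browser reports for a given physical width,
    assuming simple linear density scaling."""
    return device_px_width / device_pixel_ratio

# The 800x480 example: a 1.5x multiplier yields the
# mobile-standard 320px-wide viewport.
print(css_viewport_width(480, 1.5))   # 320.0

# iOS retina doubling is the same idea with a factor of 2:
print(css_viewport_width(640, 2.0))   # 320.0
```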
For the most part, this is good because websites are easier to design for and look roughly as designed on more devices. On ultra-high DPI devices, they even appear pixel-precise.
The problem is on the very common mid-dpi devices like the millions of 4" 800x480 devices out there. Pixel-level control is lost, and the pixels are large enough for this to be visible. Some people don't care about pixel-level design precision, some people do. Most people, though, will recognize that a webpage looks not-quite-perfect even if they can't put a finger on it.
We're almost out of the woods on phones, as DPI is quickly approaching the upper 200s across the board. Unfortunately, we're just entering them for non-iOS tablets.
That's not ignoring the spec, though, it's following it. Where the device pixel is close to the reference pixel (as it is, on desktop browsers), the px measurement is supposed to represent one device pixel. See the CSS 2.1 spec: http://www.w3.org/TR/CSS2/syndata.html#length-units
Not really. Most desktop browsers support pixel scaling with Ctrl+Plus and Ctrl+Minus or user preferences, and they may even remember this setting for each domain.
So not only is a CSS pixel not always a device pixel, but it may be a different (fractional) number of device pixels on different websites.
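As a hypothetical illustration of that fractional mapping (the exact zoom steps vary by browser, so the 1.1 and 0.75 factors here are just examples):

```python
def device_px_per_css_px(device_pixel_ratio: float, page_zoom: float) -> float:
    """How many device pixels one CSS pixel covers, given the screen's
    density ratio and a per-site zoom level the browser remembered."""
    return device_pixel_ratio * page_zoom

print(device_px_per_css_px(1.0, 1.0))    # 1.0 -- the "pixel is a pixel" case
print(device_px_per_css_px(1.0, 1.1))    # 1.1 -- zoomed in one step: fractional
print(device_px_per_css_px(2.0, 0.75))   # 1.5 -- retina screen, zoomed out
```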
Using absolute measurements to size things on a web page that is then viewed on a TV (viewing distance in the 5-15ft range), a tablet (viewing distance in the 1-2ft range), and an eyeglass HUD (viewing distance in the 1-3in range) would be a disaster.
The fact that people _were_ using inches and millimeters on the web and expecting them to somehow work across all these devices is why they're all now defined in terms of CSS reference pixels...
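A sketch of what the reference-pixel definition buys across those devices: the physical size one CSS pixel should have at each typical viewing distance (distances are the rough figures from the comment above):

```python
# Reference-pixel ratio: 1/96 inch viewed from 28 inches.
RATIO = (1 / 96) / 28

for device, distance_in in [("eyeglass HUD (2in)", 2),
                            ("tablet (18in)", 18),
                            ("TV (10ft)", 120)]:
    size = distance_in * RATIO   # inches per CSS pixel
    print(f"{device}: {size:.5f} in/px (~{1/size:.0f} CSS px per inch)")
```

An absolute "1in" would be a thumb-width on the HUD and invisibly small on the TV; the reference pixel scales each of them to look the same size to the viewer.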