Where Are The High Resolution Displays?

“I can’t think of any other piece of computer hardware that has improved so little since 1984.”

Batteries.

What a non sequitur. Video has improved where it’s mattered; just think back to CGA graphics and the other primitive graphics modes of years past. Why don’t you complain about still using a mere three primary colours? We should be up to 30 by now. Or why don’t you complain about sound, or the number of keys on a keyboard?

Looking at long-term trends, you’re missing the causes of those trends and drastically missing the near-term trend.

Yes, computer displays have been stuck at about 75dpi for a very long time, only recently advancing to 100dpi and beyond. Why?

  1. Too many pixels. A 600dpi display would be awesome to behold. However, given the standard DVI connector and a 60-75Hz refresh rate, the max size of that beast would be about 5" diagonal (don’t even get me started on the VGA limitations we lived with until about seven years ago!). Now we have dual-link DVI connectors, of course, but even that only brings the max screen size up to about 10" (see the rough sketch after this list). Given a choice between a crystal-clear 10" display and a good-enough 40-45" display, a significant portion of the market would choose the giganto-screen. And, honestly, you get a lot more usable data on that than on a 10" printed-paper-quality screen.
  2. Dumb OSes. Until OS X Tiger (prototype) and Leopard on the Apple side, and until Vista on the Windows side, having a DPI of twice the “norm” meant that all your screen targets were half the size. Having a 600dpi screen would mean your on-screen square-inch icon target becomes one-36th the size (6x resolution == 36x reduction in area). People can’t deal with targets that small, and until very recently OSes couldn’t reliably provide larger targets.
  3. Increased processing needed for larger screens. Obviously it doesn’t matter here whether you have a 30" 2000-pixel-wide screen or a 20" 2000-pixel-wide screen; the processing cost tracks the pixel count, not the physical size. However, given that the average person will be able to view significantly less information on the 20" screen relative to the 30" screen, the extra processing power needed isn’t offset by added utility.
  4. Disappointing lack of human evolution. The constant here is human eyesight and hand/eye coordination. That hasn’t changed at all. Yes, 600dpi is ideal, but 75dpi is “reasonable”. The quality increase in moving from, say, 100dpi to 200dpi is significant, but not so large that it outweighs the negatives above.
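
Here’s a rough back-of-the-envelope sketch of the bandwidth limit in (1) and the target-size math in (2). The link budgets are my own assumptions (roughly a 165 MHz pixel clock for single-link DVI and 330 MHz for dual-link, a 60 Hz refresh, a 4:3 panel, and no blanking overhead), so the diagonals are ballpark upper bounds rather than exact figures:

    # Point (1): the largest 600 dpi screen a DVI link could plausibly drive.
    # Assumed link budgets: ~165 MHz pixel clock (single-link DVI), ~330 MHz
    # (dual-link), 60 Hz refresh, 4:3 aspect, no blanking overhead.
    import math

    def max_diagonal_inches(pixel_clock_hz, refresh_hz, dpi, aspect=(4, 3)):
        """Largest diagonal a link can drive at a given dpi and refresh rate."""
        pixels_per_frame = pixel_clock_hz / refresh_hz
        w_ratio, h_ratio = aspect
        height_px = math.sqrt(pixels_per_frame * h_ratio / w_ratio)
        width_px = pixels_per_frame / height_px
        return math.hypot(width_px, height_px) / dpi

    for name, clock in [("single-link DVI", 165e6), ("dual-link DVI", 330e6)]:
        d = max_diagonal_inches(clock, refresh_hz=60, dpi=600)
        print(f'{name}: at most ~{d:.1f}" diagonal at 600 dpi / 60 Hz')

    # Point (2): an icon that filled a square inch at 100 dpi keeps its pixel
    # dimensions, so at 600 dpi it shrinks by the square of the dpi ratio.
    ratio = 600 / 100
    print(f"fixed-pixel target shrinks to 1/{ratio ** 2:.0f} of its former area")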

Point (4) isn’t changing any time soon, but (2) and (3) have already changed drastically, and (1) might see further improvements in the near future.

Looking at near-term trends, we’ve gone from 75-ish dpi to 100dpi fairly slowly, and from 100dpi to 133dpi pretty darned quickly.

Now, obviously given point (1) we’re more likely to see 200dpi screens in less-than-gargantuan proportions in the next few years. But, honestly, if you’re using a 30" monitor to read type at 6 or 8 points, you’re a little off your rocker.

Overall, in the average-size screen range, what’s holding us back is not the ability to pack LCD pixels in tighter; we can make 200dpi screens. What is holding us back is that there aren’t enough people willing to buy the higher-dpi screens, primarily because of (2) above.

I love my 1920x1200 17" laptop (133dpi?).
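
A quick sketch of how that dpi figure falls out of the panel geometry, assuming the 17" is the true viewable diagonal:

    # dpi = diagonal pixel count divided by the nominal diagonal in inches
    # (actual viewable size varies slightly by panel).
    import math

    def dpi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(dpi(1920, 1200, 17.0)))  # ~133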

Most desktop LCDs look blurry to me now when I’m forced to use them. The last company I worked for wanted to purchase 19" LCDs. I could not find higher-resolution units through normal channels at any price! Why are these not available? So we ended up buying on price.

I don’t believe 133dpi is enough yet, and I hope my next machine goes higher.

That said, I don’t recommend high-res displays to just anyone right now, only power users. As others have stated, you have to know how to make Windows XP and a lot of websites jump through hoops. Hopefully the new versions of Windows and Mac OS address this.

Some applications don’t even work properly (under XP). The one that comes to mind first is Google Earth: the driving-directions box assumes a fixed dpi, so the bottom row of fields barely shows up. Having to change system settings just to run different software is not fun.

I’m not a fan of wide-format displays. I believe a more square aspect ratio is more functional for the majority of users, outside of niche activities such as movie editing and programming.

In a related argument, I once operated a small portrait studio on the side (I’m a left-brain guy by nature… so it was a challenge). When going digital from medium-format film cameras, the long format was really awkward to compose with. How do you tell Nikon or Canon to make a 4:5-format digital camera for its pro users?

I think the important thing we need to focus on is that we need choice. I have a Dell Inspiron 8600 with the WUXGA screen and I love the little thing to death. Yes, some people can’t read type that small. Some people don’t want their OS windows to scale. But I do, and there are more like me. Some people have bad vision, I have 20/10. Do you know how much it pisses me off that I can’t even buy a good screen like what I want? Do you know how much it pisses me off that I can’t buy a 19" 1600x1200 LCD for my desktop?

And yes, I am a gamer. I don’t understand all those whiners who complain about low-resolution gaming. The LCDs usually upscale anyway, and frankly they do a good job of it. UT2004 still looks fantastic at 1280x800 on my laptop, World of Warcraft looks great, and I’m sure a lot of other newer games would look great too if they actually ran on my (now) puny Mobility Radeon 9600.

Somewhat off-topic, but:

As I read your latest post I couldn’t help but think of an analogy with respect to music synthesisers. Not one manufacturer today builds a keyboard bed with polyphonic aftertouch, despite the fact that this feature is highly desired by true performance musicians who value the expressive power of poly AT.

Even KORG’s top-of-the-line, no-feature-unimplemented, ultimate keyboard, the $8000 OASYS, does not feature poly AT. The reason always seems to be that it’s too expensive to justify for the limited market. New model keyboards will sell without it. The “less good” monophonic aftertouch that they do have is “good enough”.

I’m wondering if Hi-DPI displays are in a similar position.

The original Mac’s screen was as close to 72 dpi as possible, since that number is special in typography (one point is 1/72 of an inch).

http://lowendmac.com/lowendpc/tech/pixels.shtml

Great post Jeff. You really opened my eyes as to why my laptop display looks sooo much better than my desktop display. As I type this, I’m running a 15.4 inch Asus G1 laptop with a 1680x1050 resolution that’s hooked up to a 17 inch LCD monitor at 1280x1024.

Needless to say, the picture on the laptop blows away the picture on the external monitor. At first I thought it was the resolution + glossy screen, but your post makes it clear that it’s actually the DPI that makes the difference.

The original Mac’s screen was as close to 72 dpi as possible

Thanks Andrew and others who pointed this out; as I mentioned in the original post, no CRT actually uses the entire edge-to-edge diagonal size of the tube; there’s always a border. I updated the post to assume the 1984 CRT’s border is about 5%, which makes the Mac’s DPI 72.
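
For anyone who wants to check that arithmetic, a quick sketch assuming the classic Mac’s 512x342 pixels on a nominal 9" tube:

    # 512x342 pixels on a nominal 9" tube, with ~5% of the diagonal lost to
    # the border around the visible area.
    import math

    diagonal_px = math.hypot(512, 342)     # ~616 pixels corner to corner
    print(diagonal_px / 9.0)               # ~68 dpi, edge to edge
    print(diagonal_px / (9.0 * 0.95))      # ~72 dpi with a 5% border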

Tom Dibble made an excellent point about software limitations in scaling DPI. Here’s what Vista does to address that:

http://www.istartedsomething.com/20061211/vista-dpi-scaling/

I understand the new release of OS X “Leopard” will also have DPI scaling, but I can’t find any screenshots of it in action:

http://www.appleinsider.com/articles/06/10/24/resolution_independence_in_leopard_confirmed_by_apple.html

See http://daringfireball.net/2007/06/high_res_macbook_pro for a comparison of current Apple offerings. 200 DPI ThinkPads have been around for years, but haven’t been popular.

It’s a chicken-and-egg issue, and one that’s changing. People don’t buy high-res displays because the software is designed for low-DPI displays. If you use one of those 200 DPI ThinkPads, you either have to have really good eyesight or set all your fonts to huge sizes, plus use a web browser that will scale images up.

Apple has been telling developers for a few years now that they need to start migrating away from bitmaps and resolution assumptions. And, from what I understand, Microsoft wants people to use a resolution-independent API that’s new with Vista.

The iPhone and high-DPI MacBook Pro are only Apple’s first volley. Once OS X 10.5 is out, I would expect Apple to introduce more high-DPI devices in quick succession.

One more thing…

I have a friend who just replaced his ancient 200 DPI ThinkPad with the new MacBook Pro. When he bought the ThinkPad, it was pricey, but not $20k pricey. He’s been holding on to the ThinkPad for years because, once he got it configured right, he didn’t want to give it up.

The growth rate is considerably lower than even you have indicated. In late 1991, I bought my first computer (the first with my own money, anyway). It had a 14" (13.3" viewable) screen because that was pretty much all I could afford, but I got a pretty expensive ATI graphics card. I ran at 1152x864 on that itty-bitty display. According to TAG, that’s a DPI of 108, better even than your 2004 Apple Cinema HD display.

Basically, in the last fifteen years, the cheaper monitors have done around 96 DPI, and the more expensive monitors have done 110, lately up to slightly over 120, with few exceptions.

Those numbers were true 15 years ago. They’re true now. Based on our growth rate, I’d expect mainstream 200 dpi monitors, oh…never. At least not in my lifetime.

I do hope that the growth rate accelerates, but I’m not counting on it.

I read a post a ways up about computer hardware lacking support for high resolution when it comes to gaming. By the time 200 DPI is standard, hardware and software will be leaps and bounds ahead of where we are now. Plus, look at the rate of change in resolution over the years, compare it to the advances in processing power and memory, and draw your own conclusion about where the real bottleneck will be.

Perhaps mobile phone displays will drive high-dpi development. I believe the Openmoko is projected to use a 280 DPI display, and I don’t think it’s being custom-made for it or anything.

“To reach 200 DPI, that same 15.4" laptop display would have to pack in 2560 x 1600 pixels. Imagine a 30" Apple Cinema HD display shrunken by half, and you’ll get the idea.”

The size is measured diagonally, so going from 30" to 15.4" isn’t shrinking the display by half.

Going from 30" to 15.4" is roughly a 50% reduction on both the horizontal and vertical axes, which works out to a reduction of around 75% in display area.
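
A quick check of that scaling, assuming both displays share the same aspect ratio:

    # Halving the diagonal halves each linear dimension and quarters the area.
    linear = 15.4 / 30.0
    print(f"linear scale: {linear:.2f}")       # ~0.51, i.e. about half
    print(f"area scale:   {linear ** 2:.2f}")  # ~0.26, i.e. roughly 75% less area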

[)amien

Yesterday I read Jakob Nielsen’s “Designing Web Usability” (http://www.useit.com/jakob/webusability/). He notes that reading from a screen is much more tiring than reading from the printed page, because of flicker (a problem LCDs have eliminated) and lower resolution.
I think that switching to a 200 DPI (or, even better, 300 DPI) monitor could seriously improve a person’s efficiency. Imagine working with your code when the characters look like they’ve been printed on a laser printer.
There is an “electronic paper” technology developed at E Ink Corporation (http://www.eink.com/), but unfortunately their largest display so far is only 9.7" (http://www.eink.com/products/matrix/High_Res.html), and yes, it’s grayscale only.

Creating an actual panel with 200 DPI is not a problem. Think about it: 200 dpi is really 600 dpi horizontally (counting the RGB sub-pixels separately). That means a single sub-pixel is 0.0254 m / 600 ≈ 0.042 mm across, which is enormous given that it’s hundreds of times larger than the features on a modern CPU.

Sure, it’s a different technology, but the point is that making smaller pixels is not the bottleneck. It’s far harder to get the other bits working: a 21", 16:9, 200 dpi screen means driving roughly 7.5 million pixels, or about 22 million sub-pixels. At a 60 Hz refresh rate and 24 bits per pixel, that requires a data stream of roughly 11 Gbit/s before blanking overhead. That’s completely out of the question with analog connections, and the digital standards are only emerging now. Even the latest HDMI spec defines a clock of “only” 340 MHz, which results in a bandwidth of 10.2 Gbit/s. Still not enough for a reasonably sized 200 dpi screen.
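
A minimal sketch of that arithmetic, under my own assumptions of 24 bits per pixel, a 60 Hz refresh, and no blanking overhead (a real link needs another 20-25% on top):

    # Pixel count and raw data rate for a 21", 16:9 panel at 200 dpi.
    import math

    diagonal_in, dpi = 21.0, 200
    w_ratio, h_ratio = 16, 9
    height_in = diagonal_in * h_ratio / math.hypot(w_ratio, h_ratio)
    width_in = height_in * w_ratio / h_ratio

    pixels = (width_in * dpi) * (height_in * dpi)
    print(f"{pixels / 1e6:.1f} Mpixels ({3 * pixels / 1e6:.1f} M sub-pixels)")

    bits_per_second = pixels * 24 * 60          # 24 bpp at 60 Hz, no blanking
    print(f"{bits_per_second / 1e9:.1f} Gbit/s vs. HDMI 1.3's 10.2 Gbit/s")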

According to the calculator, my laptop has a resolution of 168 dpi, which would explain why some things on it look particularly crisp.

For most things, I find that portrait orientation is much better, so I normally keep my LCDs rotated. The growing popularity of widescreen makes this harder, as I find 1200 pixels to be the minimum useful horizontal resolution.

So now we have three days of articles about why you think ClearType is better? Seriously, get over it; you’re sounding like a child who’s upset that people didn’t agree with him the first time.

Regarding if ClearType is better or not:

I think it’s fairly obvious that aligned vertical/horizontal lines are more distinct with ClearType. While I’m all for font designers handling this themselves, correct me if I’m wrong: I don’t believe that the 9-point Arial font is specified separately from the 10-point Arial font, right? And I also don’t believe that the 9-point Arial font on a low-resolution (say, 100dpi) display is specified differently from the 9-point Arial font on a high-resolution (say, printed) display. Which, to me, says that font designers, as wonderful and intelligent as they are, cannot hope to make up for display pixelation. Right?

In any case, there is still the argument that ClearType makes some text less readable, not more readable. An obvious example was given (then pooh-poohed by a few other commenters) in the first posting here: when the under-baseline parts of “g” and “j” get absorbed into the underline because both get aligned to the same pixel row, you definitely lose readability. Of course there are solutions to this: don’t vertically hint below the baseline, or move the default underline so that it cannot align with below-baseline strokes, or make underlining additive instead of masking (i.e., if the underline coincides with another line, the pair “thickens”). But, as best I can tell, ClearType doesn’t do that yet.