Welcome to the Post PC Era


But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays.

There’s a reason for this: pixel density above ~100ppi is largely irrelevant on desktops and laptops. These devices are typically viewed from about 25" away, which is double the viewing distance of tablets! At that distance you can’t distinguish individual pixels, so a typical 92ppi 24" monitor already qualifies as “Retina”.

Even the low ppi on HDTVs is fine. Since you’re typically sitting 6–8 ft away, your eyes can’t distinguish individual pixels on a 50" 1080p display, so these devices qualify as “Retina” too.

So relatively speaking, the new iPad really didn’t achieve anything astounding in terms of display technology – in fact, computers and TVs have had “Retina” displays for years. The new iPad just brought tablets up to speed.


I think a lot of commenters here are missing a couple of key points. First, “Post-PC” doesn’t mean that PCs disappear; it only means that the world ceases to revolve around them. Everything you create assumes an audience of tablets and phones, and you only limit yourself to a PC when you have to. Second, the retina display isn’t just about being nicer - it crosses a threshold. Like crossing the threshold from really cheap to free, it removes a perceptual barrier.


Luigi wrote:

Actually, the original Droid, released in October 2009, had a 265ppi display.

Yes - at a resolution of 854×480.


Are calibration tweaks possible on the iPad 3, as mentioned above?


To flip the coin over, I am so glad the tablet era has arrived. So much easier for lots of simple things. Email, movies, books, news and most of my little games—on the tablet.

Still. I think it’s too soon to declare the desktop dead. Maybe dead for being exciting and new, but still very useful and very relevant for most of the work being done in industry.

That said, the PC could be killed off by a few feature additions:

  • Really fast and complete tablet docking stations
  • Really effective voice to text
  • Better gestures to do more complicated work

All of these exist, but none is of high enough quality yet. Make high-quality versions of all three and suddenly the PC is unnecessary.


Wojtek Swiatek: “The graph, although interesting, tries to compare devices which are not comparable.”

Agreed, but even more so, I’m really skeptical that you can draw any real conclusions from that graph (let alone the conclusions the author makes) based on the differences in time periods measured.

The Apple ]['s starting point is 1977, and the Mac’s starting point is 1984. You can’t compare the speed of market penetration for expensive and exotic personal computers in 1977, when nobody really knew what a personal computer could do and there was no established software industry, with the potential of selling a much less expensive and well-understood device to a marketplace of highly tech-savvy people.

You may as well put the automobile and the television set on that chart too - you’d find that their curves are much less steep than the personal computer’s, even though those products were absolutely revolutionary and became ubiquitous in everyday life. America is just a different consumer environment now than it was when the PC was new, or when the TV and automobile were new.

Specifically, America in the last decade has been far quicker to learn about and embrace new technology of any kind than it was to embrace computers in 1977–1984. The marketplace is far too different to assume that the difference in the curves can be attributed to the factors called out in the article.

Overall I enjoyed the article and the comments, but that really stood out as a poor interpretation of data.


Since this is a “programming” blog, I’ll comment on “PC”.

To me, as a “programmer”, the PC represents a “programmable computer.” This is the origin of my attraction to it.

The iPad is not a “PC” because it is not directly “programmable” - you need to get a “PC”, more specifically an Apple “PC”, to create a program for the iPad, then distribute that program to the iPad through a strictly regulated application signing and distribution mechanism controlled by Apple. I’ve done it; loads of fun. Fascinating that the hardest part of writing an iPad app is getting the thing digitally verified and running on the iPad.

I find the iPad an amusing device for “consuming” media; as a “programmer” I tend to spend more time producing software.

Now if there were some sort of verbal compiler for the iPad…

As Shakespeare might have put it today:

For integer I equals zero. I, less than one thousand? I, plus plus!


Your posts are always so insightful. I love reading your blog.


Let’s have a look at what I use my computer for:

Can I do this with an ipad? yes
Can I do it as fast as on a PC? no

Can I do this with an ipad? no

Can I do this with an ipad? no

Am I living in the post PC era? no


…You’re telling me that you bought two oversized and overpriced iPod Touches (because, let’s face it, that’s exactly what the iPad is) just because they have a high resolution?


Is this one of those “PC gaming is dead” kinds of speeches?

These devices cannot fully replace a PC unless they have enough flexibility to be called PCs themselves. Not in the strict technical sense, but in the original sense of the word. Once I can hook up some decent input devices to an iPad and run a sensible operating system on it (iOS doesn’t qualify), then I won’t hesitate to call it a Personal Computer.


So, I assume you wrote this post on your new iPads, right?


I have to disagree with you here in some respects. The tablet is very useful; I’ve been waiting for decades to see them come to fruition. Apple, however, hasn’t yet won the game, nor are they likely to. Their operating system is piss-poor, especially in the UI department. (Seriously? Fields of infinite icons?) Language input is a nightmare (on-screen keyboards can GO TO HELL >:|)

I type at between 80 and 100 WPM; I’m a writer and programmer. I talk at maybe 30–40 WPM, so even if we got perfect speech-to-text it would be a pain. Not to mention that punctuation and grammar through speech-to-text are a double pain.

Also, I game, and I’m sorry, but neither the iPad nor any Android device out there is quite powerful enough to handle my favorite games. This is more a symptom of lazy and inefficient coding than anything else, but that has become the standard.

I admit, completely, that the iPad 3’s display is amazing: crisp, clear, and high-DPI. I’ve been waiting for a display like that for eons. But that isn’t enough; they don’t do enough with it. Not to mention that trying to hit many of those small buttons with your finger is a trial of patience. Not everyone has fingers that small!


Using only DPI/screen resolution as the sign of the “Post PC Era” is a little restrictive. If you allow other criteria to be considered, you will suddenly see the iPad as a loser and the Asus Transformer Pad Infinity as the winner (as of today) in the so-called “Post PC Era”. I made a comparison of these two devices and the new iPad 2012 loses badly (and I didn’t even include the overheating problems in the comparison):



I partially disagree. At least with the terminology.

What does “Post PC” mean? The fact is that now your phone is a “PC”. More than “Post PC”, we are in the era of the “Pervasive PC”.

That said, I believe there is still space in the world for conventional computers. Both the client devices and the servers hosting the services need to be programmed somehow. You don’t want to call it a “PC”? More than welcome, let’s call it a “Workstation” then.
Moreover, I don’t believe that people doing CAD will move to tablets and touch screens anytime soon: they still need large monitors and a precise pointing device (and fingers are not precise).

On the other hand, I think a tablet is everything my mom needs to do whatever she does with a PC today. I.e., writing some emails, reading recipes, and checking what other people are doing on Facebook.

To go back to the point of your post, many things are “dead” if you read the blogs these days: mainframes are dead, Fortran is dead, and even C is dead (in the words of a friend of mine). They are not. They just represent a smaller slice of the market, and they work very well in their niche. The smaller market share of these products is not because fewer people use them, but because a lot more people use other (newer?) technologies. The number of Fortran programmers did not decrease over time; there are just more PHP programmers around.


“they achieved their mission years ago”…Right.
So, what do you think they use most in Somalia? Vaio or Apple?

Jeff, I usually read your posts with great pleasure, but it makes me sad that even smart, educated people like you mistake (Western Europe + USA) for “the world”.
Let me remind you: 80% of the Earth’s population does NOT live the way you do. I can’t be bothered to look up numbers right now, but a very, very large percentage of those people don’t own a computer, and a considerable number of them wouldn’t even know what to do with one.

"When was the last time you saw a desktop or a home without a computer?"
Every day, everywhere, here in Lebanon, even though it is a fairly rich country (richer than other under-developed countries). In the capital, Beirut, a good number of people own a desktop, but in many cases they still don’t. Outside the capital, you sometimes find one desktop per street, and people knock on their neighbor’s door to use it, or go to network centers.
In these countries, most email checking and communication goes through mobiles, but not tablets mind you, and not iphones either.
And if they do own a machine, it HAS to be all-purpose, because there just isn’t enough money to get yet another box for playing and another for office work. That one desktop has to be able to cater to the daughter’s school research needs, the son’s office work, the father’s porn browsing, and the mother’s Facebook, plus the occasional game or specialized software install.

So next time you plan to get all prophet on us and make predictions, be sure to specify that you are talking about your own geographical surroundings, because obviously you have no idea how the rest of us live.


I think this is the “second” post PC era. The first was after the fad with cheap home computers—the Vic 20, the Apple II, the PCjr, ah the days! They went away “when people got tired of boxes that go ‘bing’”.

The second PC era began with web/multimedia/etc.

Perhaps there will be a third PC era—I don’t know. Driven by in-home manufacturing through the follow-on to 3D printing? Telepresence that doesn’t have the problems of today’s teleconferencing? Will we all lock ourselves in our homes, Asimov-style?


Todd Vance: "I think this is the “second” post PC era. The first was after the fad with cheap home computers—the Vic 20, the Apple II, the PCjr, ah the days! They went away “when people got tired of boxes that go ‘bing’”. "

I have to disagree with this. Yeah, once the novelty wore off, there was a burnout in the early 1980s on the absolute lowest price point computers that didn’t really do much (the Timex Sinclair and some of the TRS-80s come to mind, maybe the TI-99), but of the computers you describe above only the VIC-20 might qualify as a computer that just goes ‘bing,’ and the VIC was still pretty capable for its time. The Apple II was an extremely versatile computer that spawned the first spreadsheet program and some of the first word processors, and with development was a viable competitive product into the 1980s. The PC Jr. had its issues, but the very similar Tandy machines were successful, as obviously was the full-strength IBM PC and the dissimilar but still very capable Commodore 64.

You seem to be implying that PC buying hit a big trough after the early 1980s, until the second PC era began in the early 1990s with the Internet and multimedia, and I just don’t think that’s true. It’s not as if people purchased most early PCs as a fad, and subsequently got rid of computers altogether until the 1990s. Instead, they upgraded. People who bought VIC-20s may have replaced them with a Commodore 64. People who bought IBM machines may have replaced them with 286s, then 386s. People who bought Apple IIs may have subsequently moved on to Macs. Customer adoption of the PC just kept building through the 1980s as technology improved, but that doesn’t mean that the early PCs were a fad.

I do agree that the web and multimedia accelerated the process, though. Access to amazing new quality and quantity of content really helped increase adoption.


I apologize, but I had to stop reading at your three-point bulleted list.

No IT-literate person could possibly want to live in a world of devices that are highly specialized to perform ONLY basic tasks and that by design lack any capability of expansion (for any purpose whatsoever).

How can you possibly advocate this kind of opinion?

…I feel inclined to conclude this comment with a very cheap but compelling citation from the realm of /b/ or /g/, but I am bigger than that.