The PC is Over

I can’t help replaying this clip from Portlandia in my head every time I hear someone proclaim something is “Over.”

Jesus Christ.
No… no it’s NOT over.
A NEW form-factor is NOT the end of existing form factors.
Not EVERYONE who has a computer will buy or even CARE to buy a smartphone.

Ugh, the idiot doomsayers have now jumped to the tech realm. *facepalm*

" Our phones are now so damn fast and capable as personal computers that I’m starting to wonder why I don’t just use the thing I always have in my pocket as my “laptop”, plugging it into a keyboard and display as necessary. "

Tell me that when these devices, as well as laptops, last more than a year or two. The lifespan of laptops is so predictably short (that is, they break a few months after the warranty expires) that I decided to buy a desktop for heavy use instead.

“What do you do when you have all the computing performance anyone could ever possibly need, except for the freakish one-percenters, the video editors and programmers?”

“Not EVERYONE who has a computer will buy or even CARE to buy a smartphone.”

But do you have a reason to upgrade your computer anymore?

The Achilles’ heel of tablets (and even more so of phones) is input. I guarantee that I can type faster and more accurately on a physical kbd than any but the most adept can on a virtual kbd. And WRT voice input, don’t make me laugh! I’m “using” top-tier voice-recognition s/w with one of the highest-rated mics, and it’s like hunt-and-peck typing in terms of speed, and dictating to a not-too-bright 10-year-old in terms of accuracy.

Nothing will pry my physical kbds out of my hands until giving them up doesn’t entail a 90% reduction in speed and accuracy. Oh, and auto-spell-correction? Fuhgeddaboudit. No better than voice recognition.

P.S. Previous comment written on a tablet. I just can’t be bothered to try to clean up every bit of slop the virtual kbd lays down, and on a real kbd I’m obsessive about accuracy…

Bigger desktop form factors are generally easier to repair and cool better. IMO one of the problems with laptops was, and is, a lack of good cooling; they get too hot. Laptops are harder to repair and fail more frequently. Also, I find my HTC Evo an incredible pain to use, but this seems to vary between vendors; my Samsung was not.

I think this is why the makers of Ubuntu have been trying to get their system to run in parallel with Android on phone hardware. Imagine dropping the phone you carry around into a docking station and running two monitors and a keyboard/mouse on a full desktop OS.

I definitely need two or three monitors to get my work done, but increasingly it doesn’t really matter what form factor of device is plugged into them. One of my monitors is hooked to an E350 desktop case that could just as easily be an android device, as I mainly use it for logging into other systems and web browsing. Another monitor switches between being a secondary for a PC and running various console devices.

Small form factors are great for computing on the go and do a decent job for consumption. It will take a breakthrough in interfaces and input controllers before we can actually be productive with a phone, but really we’re just air keyboards and mid-air projection away from a decent workstation. Or we just plug it in. Phones are still a very long way from the performance of PCs despite their MHz and core counts, but CPU core power consumption is gradually decreasing to meet them.

But people keep saying that current desktops are so fast that people can do anything on them, and it’s simply not true. Imagine I gave you 100,000x the computing power: you and I would find novel new apps involving progressively smarter and more capable learning software. Algorithms we currently run in the “cloud” could be executed on the local machine at far less expense. The software you find on PCs today is simply what today’s hardware will run; there is, after all, no point writing a program that needs computational power far in excess of what anyone can build.

What we are seeing is the end of the MHz race, as single-threaded performance crawls to a meagre 10% gain every 2 years instead of the more than 100% every 2 years we got off the back of Moore’s law (compounded over a decade, that’s roughly 1.6x instead of 32x). Further to that, the core explosion stopped at quads because no one could utilise the additional cores and the memory bus couldn’t keep up. Modern CPUs are a cache with a small bit of logic on the side that represents the cores.
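To make the memory-bus point concrete, here is a minimal sketch (my own illustration, not anything from the post; it assumes GCC or Clang with OpenMP, built with `gcc -O2 -fopenmp scaling.c -o scaling`) contrasting a memory-bound loop, which flattens out once the bus saturates, with a compute-bound loop, which keeps scaling as threads are added:

```c
#include <omp.h>
#include <stdio.h>
#include <stdlib.h>

#define N (1L << 26) /* 64M doubles = 512 MB, far bigger than any cache */

int main(void) {
    double *a = malloc(N * sizeof *a);
    if (!a) { perror("malloc"); return 1; }
    for (long i = 0; i < N; i++) a[i] = 1.0;

    for (int threads = 1; threads <= 8; threads *= 2) {
        omp_set_num_threads(threads);

        /* Memory-bound: one multiply-add per 8-byte load; the memory
           bus is the bottleneck, so extra cores stop helping early. */
        double t0 = omp_get_wtime();
        double sum = 0.0;
        #pragma omp parallel for reduction(+:sum)
        for (long i = 0; i < N; i++) sum += a[i] * 2.0;
        double mem_s = omp_get_wtime() - t0;

        /* Compute-bound: ~10,000 dependent flops per iteration and
           almost no memory traffic; this one scales with the cores. */
        t0 = omp_get_wtime();
        double acc = 0.0;
        #pragma omp parallel for reduction(+:acc)
        for (long i = 0; i < 100000; i++) {
            double x = (double)i;
            for (int k = 0; k < 10000; k++) x = x * 1.0000001 + 0.5;
            acc += x;
        }
        double cpu_s = omp_get_wtime() - t0;

        printf("%d thread(s): memory-bound %.3fs, compute-bound %.3fs (checks %g %g)\n",
               threads, mem_s, cpu_s, sum, acc);
    }
    free(a);
    return 0;
}
```

On typical desktop hardware, the compute-bound times drop roughly in proportion to the thread count, while the memory-bound times stall after two or three threads; that stall is exactly the wall described above.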

The tech industry is finding that one way it can go is smaller and lower power consumption, so it will pursue that until it too stops scaling. But don’t misconstrue the hardware scaling crisis for a lack of need for the performance; we can always do cleverer things with more performance, and it’s still a concern we have to think about constantly as programmers.

You notice how MG doesn’t have comments on his blog? That’s probably because he doesn’t want anyone to call him out for hyperbolic crap. :)

Tablets and phones really are great computers for couch consumption. I don’t really care for iOS (I’m a bigger Windows 8 fan every day), but I’ll freely admit that my iPad is a great device for catching up on RSS feeds and Web browsing. Like others who have commented here, I still need a “real” computer to write code, write a novel, or edit photos. I’m also thankful that I can do that on a light and powerful 13" MacBook Air.

Where I really think Siegler is wrong is this nonsense about apps. I so wish that Steve Jobs had stuck to his guns (or not lied) about the Web being the app platform on the iPhone. So many apps could just as easily exist as browser-based apps that work on every phone. The more apps there are, the more closed systems come online that you can’t use (let alone discover) without downloading them and hoping they run on your model of phone. Apps don’t have hyperlinks to share with your friends.

I have seen this so many times: a new technology becomes available, and everyone goes around saying: oh, look how cool this is, we don’t need anything else anymore! It’s the well-known hype cycle of emerging technologies.

Yes, tablets are good for certain kinds of things. Specifically, when you entertain yourself lying on the sofa and do nothing but surf the Web and watch movies. Yeah, it’s great then. But why do people have to say that the tablet is a silver bullet? Just because you spend your day on the sofa doesn’t mean that everyone else does. There are lots of things that are inconvenient to do on a tablet. Yes, you can accept the inconvenience and do that stuff on a tablet too, since a tablet is a PC too, but why would you?

Portable devices such as tablets and smartphones are great for consuming web content. But, they are absolute crap for creating content.

I have an Ultrabook and absolutely love it.

The battery lasts a loooong time, it’s super light, and it has a large screen, a keyboard, multiple USB connectors, an SSD, and Wi-Fi, and it runs all of my Windows applications. I’m able to run Office, Visual Studio, Paint.NET, and SQL Server.

You can give up your PC if you want. But, I’ll be happy to keep mine.

And, don’t forget…

You can’t build an iPad app with an iPad.


If I dropped an SSD in it, do you honestly think you could tell the difference in real-world non-gaming desktop usage between a high-end 2009 personal computer and one from today?

Yes. But I’m a developer, and I notice my build speeds.

(Plus, gaming, which you explicitly bracket. Gaming is real stuff, even for people who don’t buy $600 video cards.)

Also, Paul Keeble said: “Further to that the core explosion stopped at quads because no one could utilise the additional cores and the memory bus couldn’t keep up. Modern CPUs are a cache with a small bit of logic on the side that represents the cores.”

Did it? Last I checked, both AMD and Intel ship 6-core designs, and Intel has 8-core parts as well. They’re expensive and mostly aimed at the server market, but the idea that nobody could utilize them, or that it “stopped at quads”, seems untenable given that people keep buying them and apparently using them pretty well…

Memory buses also keep getting faster.

Can you give us a source for the idea that most of the content of a modern CPU die is cache rather than cores? It sure doesn’t look that way from pictures of, say, an Ivy Bridge i7…
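For anyone who wants to check their own machine, here is a small sketch (assuming Linux with glibc; these `_SC_LEVEL*` constants are a glibc extension, not standard POSIX) that reports the cache sizes the running CPU advertises:

```c
#include <stdio.h>
#include <unistd.h>

/* Print one cache level's size, or note that the system doesn't report it. */
static void report(const char *name, int sc_name) {
    long bytes = sysconf(sc_name);
    if (bytes > 0)
        printf("%s: %ld KB\n", name, bytes / 1024);
    else
        printf("%s: not reported on this system\n", name);
}

int main(void) {
    report("L1 data cache", _SC_LEVEL1_DCACHE_SIZE);
    report("L2 cache", _SC_LEVEL2_CACHE_SIZE);
    report("L3 cache", _SC_LEVEL3_CACHE_SIZE);
    return 0;
}
```

An Ivy Bridge i7 will report an 8 MB L3, which is a sizeable block of silicon, but whether that plus the L1s and L2s adds up to “most of the die” is exactly the question being asked above.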

In a word, Nope.
Again, you’ve missed the ball here, Jeff.

It’s a fairly easy ball to miss, though. Yes, the devices are becoming more common and PCs are becoming less common, but this has nothing to do with the PC and everything to do with who /was/ using the PC. For the past 20 years it has been everyone from every walk of life. You want to view the web? Use a computer. You want to chat with your friends? Use a computer. You want to write an email? Use a computer. You want to watch a video? Use a computer.
Most of those people now don’t need a desktop computer to do any of that, and they have no need for anything else.

But now I will make a very important distinction: a desktop is a PC, and a small computing device that is for personal use… is a PC… Most tablets and phones are NOT PCs; in fact, most people don’t even really own them. Sure, they paid for the hardware and software, but they are licensing it. That iPhone you got? You are licensed to use the software on it, but if you go grab an Android device, you cannot carry that license over and install the same software on it; you have to re-license it.

Many games I play I buy once and then play on Linux, Windows, and Mac… because I have a broad license. You are buying into the gated community that is iPhone or Android. It is not a personal product; it is a licensed gateway to your software. There are ways around this (jailbreaking, rooting, and whatnot), but these are hacks, and difficult for the average person. Most of these people will never even know what they are giving up.

The PC isn’t dead, but it does need to catch up. Some day soon, I’m sure, we will have PCs we can assemble, just like our desktops, made from smaller components that we can swap out as needed, where we can choose to put Ubuntu, Windows, iOS, Android, or whatever else we want on them… At least that is my hope… And we can take them wherever we go, then plug them in at home or work to our docks, with large storage systems and co-processors. Drop your tablet in, and suddenly you have a full-on desktop, as powerful as, if not more powerful than, what we have today.

I cannot write without a keyboard, and I cannot do my 3D design without a mouse. The PC is not dead. But I do hope it is changing.

@BoltBait: “You can’t build an iPad app with an iPad.”

You almost can with a network connection and a VNC client for the iPad. A Mac somewhere running Xcode is still required though.

Or if your app fits into the model of one of those online services, you can build something via their web interface.

Pretty soon there will be a way to at least edit user interfaces on an iPad, since that’s the most logical place to do it. Xcode/Interface Builder was only necessary because the iPhone/iPad didn’t exist yet. Now it’s a legacy app.

@Cryptnotic
SSH/VNC/whatever does not count; that still requires a desktop. :P

I wonder what you or MG Siegler wrote your blog posts on? I hope it wasn’t a PC!

That is so sad!

Back when I started with computers, there were 2 kinds of computer people:
1 - “users” (developers, graphic artists, musicians)
2 - “lamers” (most of them gamers, just “consuming” software, usually games)

We never respected gamers, as they were in it for cheap thrills and never respected the machines or the software that empowered users to be GODs (yes, we were gods on these machines!). They switched platforms as soon as something new was around (later mostly consoles: NES, Super NES, Saturn, PlayStation, etc.).

Now, with the advent of tablet/mobile computing, people can be broken down into 3 categories:
1 - developers: creators of software, a highly specialized task usually only applicable in a small niche
2 - creators (graphic artists, musicians)
3 - lamers (all people who are just consuming; facebooking and blogging are also part of this category)

Nevertheless, this third category thinks it is some kind of “IT professional” or “digital native” or similar nonsense.

But they are still the lamers from before: changing platforms without contributing anything, just consuming software (nowadays called “apps”).

Tablets and smartphones are like game consoles: you are in a walled garden, a golden cage.

That is not what our fathers (e.g. Steven Levy, John Perry Barlow, Richard Stallman, Jerome H. Saltzer, David P. Reed, David D. Clark, and many more) fought for!

So even the most tech-savvy people throw away decades of war against big corporations to embrace the brave new world in a golden cage.

This is the future?

This is the end of independent software creators.

Would Linux have been created on an iPad?
Would GNU have been created on an iPad?
Would anything platform-independent have been created on an iPad?

You buy a cheap piece of silicon with hard-coded functions buried deep in proprietary layers. Not quite the same as what you have standing on your desk:
a universal computing machine under your own power.

PS:
I won’t even start on environmental concerns (compare the usage of the average phone to a computer/laptop)…