The Infinite Space Between Words

Computer performance is a bit of a shell game. You're always waiting for one of four things:
  • Disk
  • CPU
  • Memory
  • Network


This is a companion discussion topic for the original entry at http://blog.codinghorror.com/the-infinite-space-between-words/

Warning: Old Movie Spoiler Alert

I just saw “Star Trek: First Contact” again, and at the end, after Data has been outfitted with “real” skin but then decides to give up his “gift” in order to save the day, he says:

Data: [about the Borg Queen] She brought me closer to humanity than I ever thought possible. And for a time, I was tempted by her offer.
Picard: How long a time?
Data: 0.68 seconds, sir. For an android, that is nearly an eternity.


It is very interesting to compare CPU cycles to our own concept of time. I will have to remember this when I am waiting on my phone to go from one app to another. What seems like minutes (most likely less than 20 seconds) must be an eternity to the CPU.

I think the times in the AT&T chart are round-trip. Right now, from Brooklyn, NY, I’m getting ~80ms ping times to www.stanford.edu, which I’m going to presume is in Palo Alto (the last host on the traceroute is a Palo Alto Hurricane Electric router).

The speed-of-light travel time is just under 14ms one way, so 35-40ms one-way is plausible with less-than-c propagation and router delays.
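As a sanity check, here is the light-speed bound in a few lines of Python (the ~4,100 km Brooklyn-to-Palo Alto great-circle distance is my own assumption, not from the comment):

```python
# Light-speed lower bound on coast-to-coast latency.
C_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s

distance_km = 4_100        # rough Brooklyn -> Palo Alto great-circle distance
one_way_ms = distance_km / C_KM_PER_S * 1000
print(f"one-way at c:    {one_way_ms:.1f} ms")      # ~13.7 ms
print(f"round trip at c: {2 * one_way_ms:.1f} ms")  # ~27.4 ms
# Light in fiber travels at roughly 2/3 c, so an ~80 ms round-trip ping
# with router and queueing delays on top is entirely plausible.
```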


Some other interesting related links from Twitter:

It’s interesting that the Norvig numbers are a bit different, in inconsistent ways. The post’s numbers are listed first, Norvig’s second:

  • CPU cycle: 0.3 ns vs. 1 ns
  • L1 cache: 0.9 ns vs. 0.5 ns
  • L2 cache: 2.8 ns vs. 5 ns
  • Main memory: 120 ns vs. 100 ns

I would rather consider the seemingly infinite amount of processing our minds are capable of, and how much work and effort would actually occupy a machine’s mind in all that infinite space.

The post’s line about how slowly we humans work from a computer’s point of view should maybe be flipped around:

“To humans, those computers work at an irrationally simplistic, low level, nearly completely inept. Which is completely mind-bending. The faster computers get, the less reasonable this disparity seems.”

We are only beginning to get computers to successfully do some of the things humans can do almost without thought. They still have a long way to go. Maybe they will eclipse us some day, but consider how much machine “performance” will have to grow to close that gap with humanity.

As my colleague repeatedly proclaims: “Computers are stupid!”

.68 seconds would be fast for my Android.

But consider the converse. Assume there is another intelligence in the universe, who is to us, what we are to the microprocessors. Lightyears across. Takes centuries to utter one syllable. Maybe creating devices out of proteins and nucleic acids and sending them to those tiny dots known as “planets”. Maybe we are the artifacts, as in artificial … intelligence.

Radiolab did a great episode on just this idea. I highly recommend it:


Strangely I can sympathize with the machines. Those spaces between words are the reason I have trouble with lectures, speeches, podcasts, etc.

The “how far away” picture shows travelling to the Andromeda galaxy as only 1000 times longer than travelling to Pluto, which is ridiculous. The straight-line distance to Pluto is about one thousandth of a light year, while the distance to the Andromeda galaxy is about 2.5 million light years, so it’s about 2.5 billion times farther from Earth than Pluto. So basically the author of the picture sees no difference between fifty feet and the circumference of the Earth.
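A quick check of that ratio, using the comment’s own round numbers:

```python
# Distance ratio check (both figures are the comment's approximations).
pluto_ly = 1 / 1000       # Pluto: ~0.001 light years (about 39 AU)
andromeda_ly = 2.5e6      # Andromeda galaxy: ~2.5 million light years
print(andromeda_ly / pluto_ly)  # 2.5e9, i.e. ~2.5 billion times farther
```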

Please don’t write distances in miles; you have an international audience. If you really must, then please add the equivalent in meters in parentheses.


Since computers already know how to multitask and buffer data from slow input channels, I don’t really see how or why AIs would have a problem with human speech.

I do agree, though, that “everyone walking around whispering to their AIs” is a stupid idea.

If we can convert from km to miles when reading international web sites, you can just as easily convert from miles to km when reading American sites.

The more I study AI & machine learning, the more amazed I am by the human brain. Google ran a computer vision learning algorithm for 3 days on something like 60k cores. The algorithm watched every single YouTube video, which a person probably couldn’t do in a lifetime. The result: a state-of-the-art vision algorithm that comes nowhere near human levels of performance.

Even though the computer used was incredibly fast, it only simulated around a million neurons. That’s 0.001% of the human brain’s one hundred billion neurons, and that’s only considering simple neurons. Just recently, neuroscientists discovered that dendrites (the inputs to the neuron) perform computations; before, they thought that only the soma (the neuron cell body) did. Considering that there is an average of 10,000 dendrites per neuron, that 0.001% could be way off. It could be more like 0.0000001%. Suffice it to say, computers have a long way to go.
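The scale arithmetic behind those percentages, using the comment’s round numbers:

```python
# Scale comparison (all figures are the comment's approximations).
simulated_neurons = 1e6        # neurons simulated in Google's experiment
brain_neurons = 1e11           # ~100 billion neurons in a human brain
dendrites_per_neuron = 1e4     # ~10,000 dendrites per neuron

print(simulated_neurons / brain_neurons * 100)    # 0.001 (percent)
# If every dendrite also computes, the effective ratio shrinks further:
computing_units = brain_neurons * dendrites_per_neuron
print(simulated_neurons / computing_units * 100)  # 1e-07 (percent)
```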

I have done considerable work in this area already regarding the difference in the speed of thought of humans, AIs (and any kind of life form), and how that difference affects the ability of one to measure the L8-IQ of another. The L8-IQ Scale (IntractableStudiesInstitute.org/communications/L8_IQ_Scale.pdf) has meta rules (like the modeling rules), prerequisites for the scale, and then the actual L8-IQ scale. In order to measure the L8-IQ of an AI, or any life form, the 2nd meta rule is that the measurer and the measuree have to be at the same or nearly the same speed of thought. This does not mean that a slower-thinking life form has less L8-IQ; it could in fact have more and higher-quality IQ, but if the speeds of thought are too different, the slower one may not be able to determine the L8-IQ of the faster life form. We should not assume that a faster speed of thought implies more intelligence.

This means that the human “think-brain” may have some difficulty evaluating the L8-IQ of a much faster AI. However, the human “feel-brain”, being much faster than the human “think-brain”, may have a chance. [I model human brains as 2 distinct processing types, think and feel, with the feel-brain being much faster.] When I complete Project Andros at the Institute and copy my mind into the android robot, I will be able to provide an answer from experience; I am halfway there now.

–>Pat Rael, Director, Intractable Studies Institute


He was probably referring to traditional spinning-rust hard drives, so let’s adjust that extreme endpoint for today: the latest, fastest spinning HDD (49.7) versus the latest, fastest PCI Express SSD (506.8). That’s an improvement of roughly 10x.

Jim was referring to storage latency, but you compare sequential read speeds.

I took the numbers from storagereview.com for the 4K 100% read, 16 threads, 16 queue test (avg latency):

1800 ms — 7200 RPM HDD
4 ms — average SSD
0.34 ms — fastest PCIe SSD

The overall improvement is from 450x to 5000x, which gives 10 million and 1 million miles; that’s about 1/6 of the distance to Mars and 4x the distance to the Moon, respectively.
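The ratios, computed from the latency numbers quoted above:

```python
# Improvement ratios from the storagereview.com latency figures.
hdd_ms = 1800     # 7200 RPM HDD
ssd_ms = 4        # average SSD
pcie_ms = 0.34    # fastest PCIe SSD

print(hdd_ms / ssd_ms)    # 450.0 -> the "average SSD" improvement
print(hdd_ms / pcie_ms)   # ~5294 -> the "fastest PCIe SSD" improvement
```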


Can we please finally get rid of the imperial system in the US? It wasn’t such a big deal in Ireland, so why would it be in the US…

Funny that, I often wait on the GPU/graphics card to complete.


To all of you who are referring to the superior computational performance of the human brain based on the number/layout of the neurons:

Please consider that the human brain is the product of a trial-and-error process, while computers are the product of literally generations of engineering. While certain parts of the human brain are capable of the complex ballistic computations needed to calculate the flight of a non-spherical ball through windy air, those parts can’t be used to perform similar computations to determine the course of a guided missile. While the human brain is capable of storing a lifetime of audiovisual information in an associative manner, it fails to recall where you put those damn keys seconds ago.

Please also consider that the human brain has no fast-access interface. Our fastest input port is our vision, and even that has a throughput of only about 10 MB/s, not to mention our poorly designed output ports. I can definitely type faster than I can speak, but I can barely type 120-150 words (est. 1-1.5 KB) a minute, roughly 25 bytes/s.
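A rough check of that input/output asymmetry, using the comment’s own estimates:

```python
# Human I/O bandwidth, from the comment's rough estimates.
input_bytes_per_s = 10_000_000   # vision: ~10 MB/s
typing_kb_per_min = 1.5          # fast typing: ~1.5 KB per minute

output_bytes_per_s = typing_kb_per_min * 1000 / 60
print(output_bytes_per_s)                      # 25.0 bytes/s
print(input_bytes_per_s / output_bytes_per_s)  # input outpaces output ~400,000x
```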


Probably because the US has 46x more people and – even taking Alaska out of the mix – 96x more land.

Something as simple as replacing all those speed limit and “distance to city” signs would be incredibly expensive, for no appreciable gain.

Yes. This is one of those areas many programmers are oblivious to. The other is the extreme so many go to in order to avoid these fetch operations: caching everything and then looping through it for every operation. The problem is that looping through 100,000 records in memory can be more expensive than reading one record from disk.

This could lead to a lot of other discussions that would bore everyone if I got into them. Suffice it to say that reading data is not the only operation you need to be aware of. Stay away from extremes; there is no one-shot solution (i.e., cache everything, or put everything in the DB, or everything on disk). You need to use the best solution for your problem, and that takes thinking.
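A minimal Python sketch of the anti-pattern (the record layout and names are invented for illustration). Caching is fine; rescanning the whole cache on every lookup is what can make it slower than a single disk read:

```python
# Hypothetical cache of 100,000 records.
records = [{"id": i, "name": f"user{i}"} for i in range(100_000)]

# The anti-pattern: an O(n) scan of the cache for every single lookup.
def find_scan(target_id):
    for record in records:
        if record["id"] == target_id:
            return record
    return None

# The fix: index the cache once, then each lookup is a single O(1) hash probe.
by_id = {record["id"]: record for record in records}

def find_indexed(target_id):
    return by_id.get(target_id)
```

With the index, the cached lookup easily beats a disk read; without it, every call pays for a full scan of all 100,000 records.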