In my upcoming novel, Raider (releases on Tuesday), there is a background thread of just how fast computers are. The bad guys have one size machine, and the good guys…well, that’s part of the story. We’re talking about only a few dozen lines in the whole novel, but I thought I’d talk about what’s behind all that.
In a funny way, computers are very slow. I know. I know. Computers certainly seem to be fast…
That’s the year I learned how to crash the Cornell mainframe computer from our high school dial-up 200 miles away. (They served a few hundred high schools on dial-up telephone lines with their insanely powerful IBM 360/165.) You can read about it here:
It wasn’t intentional. I just asked a language capable of building a multi-dimensional numerical array [I still miss APL–short for A Programming Language] a question it didn’t know how to answer. Essentially, I asked, “What is a negative numerical space?” Not fractional, but logically negative.
The computer crashed…hard!
I hung up the phone connection.
The computer was gone for at least 4 hours (the school closed and we had to go home). Curious if I was the one who’d caused it (it seemed pretty unlikely), I tried it again the next day after school.
The computer crashed…hard!
I hung up the phone connection.
About three seconds later the phone rang. No one called into a computer’s phone. I answered with more curiosity than trepidation–until I heard the other guy’s tone, then the trepidation kicked in.
“What the hell did you just do to my machine?” They’d put a tracer in place for the next crash.
“Uh, I asked it to build me a negative dimensional array. Not a big one. Just -1 by -1 by -1.” I could have stopped at two dimensions, but if the computer could describe such a space, I wanted to be able to go there. Think of it as an extension of Edwin Abbott Abbott’s still brilliant 1884 classic Flatland.
“Hunh! That’s new.” The guy who could ban me from ever logging on again seemed somewhat mollified. “Don’t do it again for at least a day.”
“Okay.” I waited a week.
Man, was that a major letdown. Those were the early days of computing, when imagination still had a place all the way down at the systems level. Couldn’t the SysOp at least have inserted an error message like “You can’t get there from here.” or “Entropy only travels in one direction.”? (At least until the movie Tenet.)
It had still taken him hours and hours of work to reboot the computer after my second test.
1982 PC – Slow Boot to Nowhere
My first PC, a non-IBM-compatible NEC 8800, was an 8-bit machine for running CP/M, but if I held down three keys for what seemed like forever, I could usually get it to bootstrap into a full 16-bit DOS machine on a $500 plug-in card. Then I was rocking and had amazing tools to play with like WordStar.
The full 16-bit boot took at least 4-5 extra minutes and had a 50-50 success rate for loading properly. If it failed, you waited another 5 minutes to be certain it failed, then you powered down, waited 30 seconds for energy dissipation out of the chips, and tried again.
Now, I tap my phone and it says, “Yeah? Wudda ya want, punk?” (My wife feels that it’s always lurking there, waiting for her, watching her–with attitude! It kind of creeps her out. I try not to tell her that she’s right. We don’t talk about Siri or Alexa in polite society.)
However, the computers you and I tinker with every day are NOT fast. Not even the banks of computer servers we so blithely tap into every single time we hit a website or check our e-mail. Slow as molasses (I’ll get to why I say that in a moment).
(Oh, here’s the moment.) All of those server-farm computers are “wide” not “fast.”
Huh? Yeah, I know. It means that they’re really good at moving great masses of information. Images are so small that they’re passé in the computing world, and YouTube videos almost as much so. Now we can stream HD and 4K television without even thinking about it (Netflix, Prime, Hulu, Disney…). 5G will bring us the width to go virtual, moving great masses of digital stuff around quickly–while we’re anywhere. But again, that’s width, not computing speed.
Some Computers Are Fast!
Since 1993, there’s been a ranking called the TOP500. It lists the 500 fastest computers in the world and tracks their computing capacity.
Here’s the site (and the latest article is a good, geeky read): https://www.top500.org
As always, Wikipedia has a simplified listing, mostly lifted from the site above:
These are the machines to go to when speed really matters.
So, How Fast Is Fast?
Computing speed is measured in FLOPS (Floating Point Operations Per Second). Think of a FLOP as a line of math with a variable decimal point and you get the idea.
Your average, high-end, multi-core desktop home computer can do a few hundred GFLOPS. That’s GigaFLOPS, or a few hundred billion operations per second. (Of course, it’d probably melt if you did that in a sustained run.) [There’s another story here about how I used this technique to make my program the #1 (and essentially only) priority over 300 other users on a different IBM 370. Again the phone rang. Again I was asked to do less of that, but it got me an A in computer class.]
So, if you had a penny for every FLOP, you’d earn about $2 billion per second (just trying to show some scale here).
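For concreteness, here’s a minimal sketch of that penny-per-FLOP arithmetic in Python. The 200 GFLOPS figure is my own assumed stand-in for “a few hundred GFLOPS”:

```python
# Penny-per-FLOP thought experiment: earnings rate for a home desktop.
# 200 GFLOPS is an assumed stand-in for "a few hundred GFLOPS."

PENNY = 0.01  # dollars credited per floating-point operation


def dollars_per_second(flops: float) -> float:
    """Dollars earned each second if every operation paid one cent."""
    return flops * PENNY


desktop_flops = 200e9  # 200 GFLOPS
rate = dollars_per_second(desktop_flops)
print(f"Desktop at a penny per FLOP: ${rate:,.0f}/second")
# → Desktop at a penny per FLOP: $2,000,000,000/second
```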
The 500th computer on November 2020’s TOP500 list belongs to an Internet company in China. It is capable of 29,920 TeraFLOPS (a TeraFLOP is a thousand GigaFLOPS, so that’s 30-ish PetaFLOPS).
At a penny per FLOP, this kind of breaks down. The entire world’s 2019 Global GDP was just over $87 trillion USD. (https://www.worldometers.info/gdp/)
At a penny per FLOP, 30 PetaFLOPS works out to $300 trillion per second. That’s more than triple the entire world’s annual GDP, every single second.
That’s the slowest of the TOP500.
The present #1 fastest computer in the world is named Fugaku and works in Japan. It has over 7 million processor cores and can sustain 442 PetaFLOPS and peak at 537. It’s about 2 million times faster than your high-end desktop home computer. So, at that same penny per FLOP, Fugaku mints about $4.4 quadrillion every second: richer than Elon Musk ($184B) in well under a millisecond, and richer than the world’s largest company, Apple ($2.3T), in about half a millisecond.
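The same back-of-envelope arithmetic, sketched in Python. The 200 GFLOPS desktop figure is again an assumption, and the wealth figures are the ones quoted in the text:

```python
# Penny-per-FLOP arithmetic for Fugaku vs. a high-end desktop.
# Assumptions: 442 PFLOPS sustained (Fugaku), 200 GFLOPS for the desktop.

PENNY = 0.01            # dollars per floating-point operation
FUGAKU_FLOPS = 442e15   # 442 PetaFLOPS sustained
DESKTOP_FLOPS = 200e9   # assumed high-end home machine

speedup = FUGAKU_FLOPS / DESKTOP_FLOPS   # how many desktops Fugaku equals
rate = FUGAKU_FLOPS * PENNY              # dollars earned per second
musk_seconds = 184e9 / rate              # time to pass $184B
apple_seconds = 2.3e12 / rate            # time to pass $2.3T

print(f"Fugaku is ~{speedup:,.0f}x a desktop")
print(f"Richer than Musk in {musk_seconds * 1e6:.0f} microseconds")
print(f"Richer than Apple in {apple_seconds * 1e3:.2f} milliseconds")
# → Fugaku is ~2,210,000x a desktop
# → Richer than Musk in 42 microseconds
# → Richer than Apple in 0.52 milliseconds
```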
Now that’s fast!
What Is the Fast For?
Well, that’s a question for a different time, but here are a few examples:
- Modeling weather patterns (one of the most difficult and complex systems ever encountered). Global climate change research? Oh yeah, supercomputers are all over that.
- Modeling nuclear explosives (with the data gathered from past tests, these incredibly complex atomic interactions continue to be studied and refined on supercomputers). Molecular interactions of many forms.
- If my memory serves, two full minutes of supercomputer time were used to model the record-breaking design parameters of the Oracle catamaran raced in the 2010 America’s Cup (https://en.wikipedia.org/wiki/USA_17).
- This of course includes anything with complex aerodynamics, i.e., moving through something like air, water, space, gravity…: cars, planes, rockets to Mars, etc. (It would be a waste of a supercomputer to do the flight itself, but to design the rocket and model the lander trying to slow down in Mars’ thin atmosphere? Absolutely.)
- AI, real AI? Yeah, you need a supercomputer to gobble “wide” and process “fast.”
- The imagination reels.