Off Into the Wild Blue Yonder

By Bradley Harrington

“Cyberspace … Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding …” — William Gibson, “Neuromancer,” 1984

Every now and then, back when I was a young child, our family would head out of the city to go someplace special on a Sunday.

“Where are we going, Daddy?” I’d always ask.

“Off into the wild blue yonder,” he’d always tell me, and we’d never find out where until we got there. Consequently, that phrase has always had a special meaning for me: a journey to some magical place we’d never discover until we actually arrived.

Just like what’s happening in the computer field, for instance …

Consider: When IBM released its original 5150 PC back in 1981, it came with an Intel 8088 CPU that clocked in at 4.77 megahertz and supported up to 256 kilobytes of RAM — but had no hard drive. That didn’t arrive until 1983 with the 5160 PC (the “XT”), whose drive could store 10 megabytes of data.

Now, 34 years later, your standard higher-end PC comes equipped with a multi-core CPU running at 4 gigahertz, 16 gigabytes of RAM and a 2-terabyte hard drive.

So, what’s the trend? In that time frame, CPU clock speeds have increased by a factor of about 838. Since today’s CPUs are capable of executing more than one instruction per clock cycle, however, it’s safe to say that processing speeds have mushroomed by a factor of at least 1,000 — or 100,000 percent.

And RAM? The jump from 256 kilobytes to 16 gigabytes is a leap of 62,500 times, or 6,250,000 percent.

But even such tremendous boosts pale in comparison to secondary-storage disk capacity, which — from 10 megabytes to 2 terabytes — represents a gargantuan gain of 200,000 times, or 20,000,000 percent.
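For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope sketch in Python (my own illustration, using the decimal units quoted above, where 1 gigabyte equals 10^9 bytes):

# Growth factors: the 1981/1983 baselines vs. a current high-end PC
specs = [
    ("CPU clock", 4.77e6, 4e9),   # 4.77 MHz (Intel 8088) -> 4 GHz
    ("RAM",       256e3,  16e9),  # 256 KB -> 16 GB
    ("Disk",      10e6,   2e12),  # 10 MB (IBM 5160) -> 2 TB
]
for name, then, now in specs:
    factor = now / then
    print(f"{name}: {factor:,.0f} times ({factor * 100:,.0f} percent)")

# Prints roughly:
#   CPU clock: 839 times (83,857 percent)
#   RAM: 62,500 times (6,250,000 percent)
#   Disk: 200,000 times (20,000,000 percent)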

So, based on those trends, what would our projections be for the next 34 years?

By current growth curves, the computers of 2051 will possess CPUs clipping along at 4 terahertz; contain 1 petabyte (a quadrillion bytes) of RAM; and come with a storage capacity of 400 petabytes.
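For those who’d like to tinker with that projection, here’s a minimal sketch of the same straight-line extrapolation, again my own illustration, which simply assumes each growth factor repeats over the next 34 years:

# Naive extrapolation: apply each 34-year growth factor once more
cpu_hz = 4e9 * 1_000      # roughly 4 terahertz
ram_b  = 16e9 * 62_500    # roughly 1 petabyte (10^15 bytes)
disk_b = 2e12 * 200_000   # roughly 400 petabytes

print(f"CPU:  {cpu_hz / 1e12:.0f} THz")
print(f"RAM:  {ram_b / 1e15:.0f} PB")
print(f"Disk: {disk_b / 1e15:.0f} PB")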

On the software side, consider the magnitude of the evolution there as well. Starting off as mere number-crunchers, computers have swiftly matured into incredibly powerful and indispensable tools in thousands of fields: database management, communications, graphics, banking, desktop publishing, gaming, 3D rendering, entertainment, medical and industrial instrumentation control, engineering, manufacturing and much, much more.

Furthermore, over the last 34 years, two other factors have presented themselves for consideration: (1) the rise of the Internet and its skyrocketing growth rate (the cloud currently contains over 1 exabyte, or a quintillion bytes, of data); and (2) the emergence of “rogue” (malware) entities, along with their resultant “protective” program counterparts, onto the cyberspace stage.

What the future holds on such hardware and software fronts is staggering in its implications: For, at what point will the processing, storage and interconnectivity capacities of the forthcoming computers begin to rival the levels of intricacy and complexity of the human brain?

With the biological advances made in nanotech and cloning capabilities in recent years, consider some of the potentialities. Do you want to live for hundreds of years, or maybe thousands? Keep a few copies of yourself vegetating in cloning tanks, and periodically have the ENTIRE CONTENTS OF YOUR MIND backed up to secondary storage. Next time you get killed or die of old age, activate a clone, restore your backed-up consciousness, and … Presto! Time for another century of life …

And how difficult is it, really, to envision human beings beginning to “computerize” themselves in order to augment either strength or intelligence? Testing for steroids will be so 20th century; how about testing for the presence of an Intel i9999 512-core PowerMaster muscle-oscillator in that character’s head instead?

And learning? Just let the information flow directly into your brain from cloud-based data repositories through your “always-on” wireless connection. That’s what I’d call a “crash” course … one you’ll never forget.

And, on the “Artificial Intelligence” (AI) side, true AI might be a little closer than we think — and that AI, just like our own, will be capable of being either good or evil. Once the required level of complexity is attained, who’s to say computers can’t think for themselves? And what will they be thinking about US?

Clearly, as a race of beings on Planet Earth, we stand on the verge of choices and actions that promise — or threaten — to make our current modes of existence resemble the deserts of Ancient Mesopotamia by comparison.

So, it’s “off into the wild blue yonder” we go — and, like my family’s Sunday outings, we’ll just find out what it’s like when we get there.

Bradley Harrington is a computer technician and a writer who lives in Cheyenne. Email: bradhgt1776@gmail.com.
