
Limitless Levels of Unused Potential

“Joy comes from using your potential.” - Will Schultz

I'm part of the generation that grew up both in awe of and, more and more, subservient to computers. Thirty-somethings have never known a world without computers, although not all of us had them in the home until our teens. But they've always had an impact on society, business, and education in our lives. They are not mysterious machines that beep for unknown reasons (well, ok, sometimes the reasons are unknown). They are not, as some family members label them, necessary evils. Computers have always, to us, been tools--hammers in an era of data and the need to manipulate that data.

We bathe in technology now on a daily basis, the likes of which would have blown our minds as children; the phone on my hip has more computing power than all the TRS-80s in the computer lab where I first entered 10 PRINT "55378008" / 20 GOTO 10 / RUN and then snickered on my way to lunch. The Web, for all its erudite uses, is still approached as a child's toy by most people, cataloging our cats and dogs, making us giggle at life's idiosyncrasies as if we were still that kid with a BASIC command line.
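For anyone who never met a TRS-80: that one-liner just prints the same number forever (the classic upside-down calculator joke). A minimal modern sketch in Python, bounded so it actually halts -- the function name and line limit are mine, not part of the original program:

```python
# Sketch of the BASIC loop:
#   10 PRINT "55378008"
#   20 GOTO 10
#   RUN
# BASIC's GOTO 10 jumps back to the PRINT line endlessly; here the
# repetition is capped so the program terminates.
def calculator_joke(lines=5):
    """Return what the BASIC program would print, limited to `lines` rows."""
    return "\n".join("55378008" for _ in range(lines))

print(calculator_joke(3))
```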

The influence on science and research is the most pronounced, with grid and distributed computing growing in importance. Worldwide computing power is nearly inconceivable; over 1 billion PCs are out there already, with projections of 2 billion by 2015. We have a global network that could potentially link a significant portion of these machines together (and already has in some cases) looking for cures for diseases, extraterrestrial life, or prime numbers.

And yet, in an age where consumer-level desktop computers contain the computing power of supercomputers from 15 years ago, what have we done with this potential? Aside from a subset of the general population donating computing cycles to distributed projects, what has the average person done with the machine sitting in their den or perched on their lap?

We play games, surf the web, email our friends, and update our statuses. We bid on auctions, vote on articles, and blog (hey, wait...). But, unlike many of our other tools, we don't shape computers to do what we want them to do. How many things have you used a screwdriver for other than to drive a screw? How many odds and ends sit in a drawer for that just-in-case moment? I know I have a dental pick sitting in a drawer, and I've never used it to clean my teeth.

But we don't view computers the same way. Sure, we install programs to collect and sift through our data, catalog our photos, and index our documents. By and large, though, we don't know how to make the computer bend to our will, to make it do something we can't find a command or app to do for us. It's not that we don't know how to code, although relatively few people do know how to do so (when compared to the general population). It's that we don't even know how to use the cruft already on our machines to its full potential. The thought of using programs many of us own to build a list of addresses for sending out Christmas cards, merge it into a template, and print those addresses onto envelopes so eludes us that, even today, getting a card with a printed address from a non-business is shocking.
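That Christmas-card mail merge is less exotic than it sounds. A minimal sketch of the idea in Python, using only the standard library -- the names and addresses here are invented for illustration:

```python
from string import Template

# Hypothetical address list -- in practice this might come from a
# spreadsheet or CSV file instead of being typed in by hand.
addresses = [
    {"name": "Ada Lovelace", "street": "12 Analytical Way", "city": "London"},
    {"name": "Grace Hopper", "street": "7 Compiler Ct.", "city": "Arlington"},
]

# The envelope template; $name, $street, $city are placeholders
# filled in per recipient.
envelope = Template("$name\n$street\n$city")

def merge(template, records):
    """Merge each address record into the envelope template."""
    return [template.substitute(r) for r in records]

for env in merge(envelope, addresses):
    print(env)
    print("---")
```

Word processors hide exactly this loop behind their mail-merge wizards; the concept is just "one template, many records."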

So, the obvious question here is: why? Why have we failed to utilize a greater part of the potential power in a computer?

I only have my own thoughts here (so, chime in if you feel like it), but here's my punch list.

They're still new

From a cultural standpoint, we're still educating the second real generation to grow up digital, with computers as a part of their lives from Day 0. Computers are a relatively new concept culturally, despite the influence they wield. Because of this, we consider them powerful, mysterious machines; we don't know how, but they do all these amazing things, things we then take for granted.

We don't understand them

This goes along with the cultural newness. Arguments can rage for many pages of search results as to what constituted the first personal computer, but no matter where you draw the line (the Apple ][ in 1977 or the Simon in 1949), the general acceptance of computers as de facto members of the household's electric-slurping ecosystem took some time. As recently as 1997, computers were in only 36.6% of homes [source: census (PDF)]. By 2003, it was just short of 62% [source: census (PDF)]. Think about that: 7 years ago, 1 out of 3 houses didn't have a computer. I'm not going to get into the socio-economics of who did and didn't have them within those groups, but as a collective, we haven't had much chance to actually learn how the machines work.

Never mind that, fundamentally, computers are somewhat difficult to grasp for someone uninitiated in their use. My mother, who can sort of describe to a mechanic what she believes is wrong with her car, couldn't tell you the first thing about how a computer works. And why should she? She interacts with the interface without needing to know how the program runs.

There's no need for her to know about pointers or buses or floating point errors to make a computer work. She doesn't need to know about serpentine belts, brake pads, or alternators in order to drive, either, but she's picked that up throughout her life because she has lived in an automotive world. People around her just knew about cars.

We have yet to have a generation that grew up not only in a world gone digital, but one where knowledge of computers is absorbed simply by living.

It is hard

Because computers are dumb, it is difficult to tell them what to do. Instructing a computer to do something used to be actual rocket science. Programming languages have simplified the process a bit, but brought along necessary cruft as well: syntax, objects, variables. We need them for modern programming, but these are high concepts for a non-programmer.

In short, it takes a degree of knowledge to be able to bend a computer to your will. The average person isn't just going to sit down with their shiny new laptop and start banging out code.

We're not interested

We've accepted that computers will be "easy to use" and that we have no need to learn their inner workings. Like cars, we have moved computers into a category of Commodity and woven them into our daily lives. Which, really, is the point of many people's careers: making computers easy to use and easy to integrate with our existences. We're not interested because we don't have to be.


I feel ok with the current situation. Honestly, I do. There are many, many people around the world working on harnessing the power that sits on a vast majority of desks to solve problems.

I do hope, however, that others will come to appreciate that same power and supplement it with some knowledge. Every day brings more computing power than at any point in history; as a species, we need to learn to ask the right questions of it. And that's something worth paying for.
