Thursday, February 2, 2012

Computer Literacy in the Age of the iPad

Back in the early 1980s, when personal computers were still a relative rarity in most homes and numerous platforms (Apple II, IBM PC, TRS-80, Commodore, Atari, etc.) proliferated, people talked a lot about "computer literacy." Kids needed to achieve this skill in school if they were to succeed in life, and adults were advised to get with it or be left behind. The problem was that no one was sure exactly what computer literacy meant. About the only thing that seemed obvious was that you needed to know how to type. I was in high school at the time, and the number of males taking typing classes increased dramatically. In those days, secretaries typed up whatever their (usually male) bosses needed, and almost all secretaries were women. Indeed, many executives proudly stated that they didn't know how to type and didn't need to, since they had someone to do that for them.

My family was a little different in that my mother insisted that all her kids, regardless of gender, learn to type. She wasn't looking forward to a day when everyone had a computer on their desk (indeed, she was generally baffled and even a bit scared of computers); she was looking back to her college days, when people (mostly males) who couldn't type had to hire someone to type up their papers.

But beyond typing, what did computer literacy mean? Was it learning to program, and if so, in what language? Was it knowing how a computer operated? Knowing how to fix or troubleshoot one? Or maybe it was learning to use the predominant software of the day (WordStar, VisiCalc, etc.) or learning the various operating systems? There didn't seem to be a cohesive strategy for imparting, or even defining, this nebulous skill.

While there were those who predicted it, it wasn't at all clear at the time that things would eventually settle down to essentially two choices: DOS/Windows and the Mac. I grew up in a county that was among the richest in the country and had one of the highest-rated public school systems in the U.S. Despite this, there was only one computer class available, taught by a math teacher who had taken some computer classes in college. What's more, there were only five computers in the classroom: two NECs that ran their own proprietary OS and three dumb terminals connected over phone lines to an HP mainframe at a remote location. With 30 or so kids in the class, computer time was quite limited.

As I recall, we spent a fair amount of time just learning how to log in and operate the two different systems. This was around 1982, before the Mac, mouse, and GUIs, so we had to learn the various commands for each. We didn't learn any software packages, probably because we were using two incompatible systems. We did learn, in a very general way that really didn't help us much, about CPUs and memory and the like. Finally, we were taught to do some rudimentary programming, mostly in BASIC but also a little FORTRAN.

For those like me who were little proto-geeks, this was fine – we were into it, digging into the manuals to learn more and eventually knowing more about the systems and programming than our teacher, who, after all, was a math teacher and still had her regular classes to teach. But for most of the class, it was ultimately kind of useless. Most of the kids weren't looking to make computers their careers; they were in the class because their parents or a guidance counselor had told them they had to be "computer literate" if they wanted to succeed in this brave new world. I can't think of anything from that class that would end up helping any of those kids.

In just a few years, all that stuff we learned didn't really matter except to those of us who went off to become Computer Science majors in college. Graphical user interfaces and the mouse made memorizing the arcane commands of older systems unnecessary. Off-the-shelf software replaced any need for programming skills. Once you grokked pointing, clicking, and dragging, you could generally figure out how to use a program well enough to suit your needs. Understanding the inner workings of a computer became unnecessary for most – as long as it did what you needed it to, who cared how it did it? And if there was a problem requiring repair or troubleshooting, you called in a geek.

I was thinking about all this recently while watching a friend's four-year-old daughter use an iPad. And I mean use it: creating artwork and music and even sending messages to her friends. According to my friend, no one showed her how to use it; she just picked it up and went to it. And there are numerous stories of octogenarians who've never used a computer picking up an iPad and beginning to use it almost immediately. They didn't spend time learning to use it, they just used it. The same holds true for my friend's daughter.

So it's no surprise that no one talks about computer literacy anymore. Much of the energy put into creating it, at least in my experience, was channeled into curriculums and activities that weren't relevant to the lives of most of the people involved. Not that the people who created these programs weren't sincere, and I'm not trying to put them down. But I do find it interesting that the one thing they were right about was something we didn't learn in our computer class: you still need to know how to type.
