Essays on Technology and Culture

Kids Can’t Use Computers? Depends What You Mean By “Use”

Marc Scott of Coding 2 Learn has raised a few hackles with a recent piece making the rounds. The title alone should explain why: “Kids can’t use computers… and this is why it should worry you”. Here’s the gist of the article, quoted:

…[A]ren’t all teenagers digital natives? They have laptops and tablets and games consoles and smart phones, surely they must be the most technologically knowledgeable demographic on the planet…

The truth is, kids can’t use general purpose computers, and neither can most of the adults I know.

He’s absolutely right on the money with the line I italicized. Well, he’s right for certain values of the word “use,” at least. There are two ways to know how to use a computer: task-based knowledge and skill-based knowledge. The former is much easier to acquire than the latter.

Task-based knowledge is functional. If someone wants to check their email, go on Facebook, or watch a cat video on YouTube, they will figure out a way to do it that works for them. Their method might seem completely roundabout to a more savvy person, however. A while back, a company’s article on Facebook logins ended up as the top result on Google for “Facebook login,” leading to thousands of confused comments and emails from people whose Facebook workflow was to search their way to Facebook’s login page. This is the downside of task-based knowledge: when something changes, it can break the workflow. It’s also an advantage that mobile and tablet OSes have over traditional desktop computing. When all you have to do is tap the little blue box with the white “f” to get to Facebook, there’s a lot less cognitive load involved.

It’s skill-based knowledge where people fall short, and using a computer to its full potential is largely a skill. Turning on Wi-Fi is a task, and that’s something a person can learn; the skill is knowing how to find the settings for whatever needs to be turned on or fixed, no matter what it may be. The car metaphor, which always comes up in discussions like this, is apt. Most of us view our car as a way to get from point A to point B. We know how to drive it, park it, and fill the gas tank. When something goes wrong, or even when it needs maintenance, we quickly turn to a professional. We could learn to fix our cars ourselves, but why should we? It’s much the same with computers.

Why this sorry state of affairs? Marc claims:

Being a bunch of IT illiterates themselves, the politicians and advisers turned to industry to ask what should be included in the new curriculum. At the time, there was only one industry and it was the Microsoft monopoly. Microsoft thought long and hard about what should be included in the curriculum and after careful deliberation they advised that students should really learn how to use office software. And so the curriculum was born.

I don’t know if it’s quite as insidious as that. The most common applications many people use computers for in their professional lives are word processing, spreadsheets, and presentations. Even now, most of our jobs don’t require us to get deep enough into a computer to need to learn how to code. A basic computer literacy course that imparts task-based knowledge of office software is probably all the education most students need from a vocational perspective. That doesn’t make it the best possible education in computing, however; it’s just the lowest common denominator.

Marc does have a point about the locking down of educational computers: “…preventing kids and teachers access to system settings, the command line and requiring admin rights to do almost anything. They’re sitting at a general purpose computer without the ability to do any general purpose computing.” While a locked-down environment is easier for schools to administer and reduces the potential for security issues, we lose the ability to teach people, if not the skill of maintaining a computer, at least the task-based knowledge to configure a network connection. At a certain point, computer education needs to move beyond how to use office software and into how to use the computer as a whole.

While having parents and schools alike teach children to try to solve problems themselves is sound (if impractical in the bureaucracy of a school system), the rest of his suggestions border on lunacy. Things fall down with suggestions like using Linux, even on a cell phone. Marc himself admits that his phone “can’t use 3G… crashes when I try to make phone calls and the device runs so hot that when in my jacket pocket it seconds as an excellent nipple-warmer…” If you want to teach an ordinary, non-technically inclined person how to be frustrated and give up on computing, by all means sit them in front of a Linux box. [1]

The problem needs to be approached from both sides: how we teach computers, and how computers work. Apple seems to be the only company on the right track, though given Marc’s outright dismissal of iOS, I’m sure he’d disagree. He might at least agree that we should be making technology easier to use and more intuitive without sacrificing the things that make it powerful. More often than not, the attempt leaves us with interfaces that are either too dumbed down to help users learn anything, or interfaces that stubbornly refuse to change for marketing reasons. [2]

Marc lists a series of events that, to him, prove people don’t know how to use computers. I ask: Why is the OS insecure enough, out of the box, that a user needs to install software to prevent viruses and malware? Why should someone need to reinstall the insecure operating system to fix it? Why is there a hardware switch to turn on Wi-Fi on a laptop, and why is it on the side of the machine, out of sight? Why are error messages for simple issues written in complex language, or easily dismissed without the problem actually being resolved? Why does a cell phone not automatically back itself up remotely? The only item in the litany that isn’t rooted in a technical shortcoming is the user who lost their Internet Explorer icon in a mess on the desktop, and even there a technical solution exists.

We’re making progress in this area, but not fast enough. Any change that makes a computer easier to use, and that reduces the potential for things to go wrong and a technician to be called in, is usually fought with knee-jerk kicking and screaming from the technical elite. The majority of people just want something to work, and to work with a minimum of fuss. It’s not that kids don’t know how to prevent malware or configure a wireless network; it’s that they shouldn’t have to. Until that day comes, let’s at least teach them.

  1. I used Linux as my primary OS for a few years, and while it’s certainly improved since 2005, it’s still not ready for average users to make it their primary OS.  ↩

  2. Another thing I have to agree with Marc on: Windows 8 sucks.  ↩