Computing Is Dangerous Because We’re All Idiots (Even Me)

The dream of personal computing is unfettered access to, and control of, powerful hardware that you can make do anything your little heart desires. The reality of personal computing, at least in the internet age, is that you, and everyone else with a connection, have barely fettered access to and control of your hardware. I don’t know if you can still plug a Windows XP machine into the Internet without a malware filter and have it turn into part of a botnet overnight, but it sure was that way for a while. I wouldn’t dare stick even a modern Windows 10 machine on the open web without something to protect it.

When anything can be accessed, and countless people (individuals, businesses, and states) are poking and prodding to find any possible weakness in the software and hardware we use, something has to give. In the case of Apple, what gives is the freedom to run any random app on your iOS devices. Apple vets what is allowed to run on your iPhone and iPad, and it takes modifying the core software to change that. The big fear among some is that this will, eventually, come to the Mac, and that it will be the end of days for free, open, personal computing. Hence the concern over recent changes to Gatekeeper, the macOS tool for ensuring software is safe, which make it harder to run unsigned apps.

Code signing lets a user know that the app in question was created and distributed by a developer who has, at bare minimum, coughed up $99 a year for an Apple Developer Account. The intention is not so much a money grab as a security measure: it lets Apple, and the user, know whether an app has been modified, and identify its author.
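
To make that concrete, here’s a minimal Swift sketch of the kind of static signature check the system can perform, using the public Security framework. The helper’s name and the app path are my own stand-ins for illustration; this shows basic signature verification, not how Gatekeeper itself is implemented.

    import Foundation
    import Security

    // Returns true if the bundle at `path` passes a basic static code-signature
    // check: the code's seal is intact and it carries a valid signature.
    // (Roughly what `codesign --verify` does from the command line.)
    func hasValidSignature(at path: String) -> Bool {
        var staticCode: SecStaticCode?
        let url = URL(fileURLWithPath: path) as CFURL
        guard SecStaticCodeCreateWithPath(url, [], &staticCode) == errSecSuccess,
              let code = staticCode else {
            return false
        }
        // A nil requirement means "any valid signature". A real policy would also
        // check who did the signing, which is where the Developer ID comes in.
        return SecStaticCodeCheckValidity(code, [], nil) == errSecSuccess
    }

    // Example only; any installed app bundle would do.
    print(hasValidSignature(at: "/Applications/Safari.app"))
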

It’s not perfect, by any stretch. The BitTorrent client Transmission got malware injected into it, and its code-signing key compromised. Twice. If the purpose of code signing is a trust measure to confirm that an app you download and install off the open web is safe, this is a failure, though likely a failure on the Transmission team’s part.

Code signing is also inconvenient. If you don’t have $99 a year to spend on an Apple Developer ID, you’re stuck up a creek with an unsigned app that is now harder for your potential audience to use. For open source apps, this is a huge pain in the behind, since someone now needs to serve as a benefactor if the project wants a signed Mac version. Not every open source project will want, or be able to afford, that.

Yet, nearly five hundred words in, I can’t say I’m terribly upset by this development, for the reasons I brought up at the start. Free, unfettered, open computing is dangerous as hell. Being able to arbitrarily execute any piece of code that lands on your computer is a massive risk, and we’ve seen what happens when there’s no protection. If you need a reminder, go spin up an unpatched Windows XP VM or two.

Let’s face it. Users are idiots. All users. Even you and me. Even the top computer security experts, whether they’re working for Apple or the NSA, are liable to do something very stupid when they have to make a decision to keep their computers safe. Hell, case in point, some NSA operative left a powerful hacking tool on a server where it was compromised. What makes you think that you, smart and savvy computer user, won’t accidentally install a compromised executable and turn your machine into part of a botnet?

Restricting the average user’s ability to run random software is, as painful as it might sound, for their own safety. It’s protection not just from malicious actors, of which there are many, but from their own idiocy. There’s no way to let a computer run arbitrary code and also protect the user from the consequences. With so much of our lives entangled in a web that treats privacy like garbage, not preemptively locking things down is downright stupid.

Conversely, it should be possible for a user to open those locks. It shouldn’t be easy, it shouldn’t be obvious, and it should require them to absolve their OS provider of choice of responsibility if, or when, something blows up in their face. Digital security is doomed to be a cat-and-mouse game for eternity. That doesn’t mean we should make it easier for the cats in the interest of… what, exactly? The average person doesn’t want to think about how to protect themselves; they just want their stuff to work.

That needs to be the priority: keep everyone safe, and keep everyone running. You can do this without crushing the freedom to make software, even free software. It’s also valid to quibble about Apple making its $99/year Developer Program mandatory for developers who just want to make and distribute an app. Even so, a system that hands a signing certificate to anyone with a pulse isn’t going to be secure either. We’re doomed no matter what, but a solution that keeps the user as safe as possible for as long as possible is the best option we have.