Essays on Technology and Culture

The Tension of Encrypted Messaging

Facebook just rolled out encrypted messaging on its Messenger service, but you’ll have to opt in on each chat. This comes not long after Facebook started collecting information from WhatsApp users—data that used to be private. Google promised end-to-end encryption in its new Allo messenger, at least as an option. On release, Google went back on that promise so hard that Edward Snowden told the world not to use it.

There’s a tension between technology companies and encrypted messaging. Snowden’s revelations about PRISM and other NSA spying through tech companies have them promising more encryption to protect their public image. Yet if they use strong encryption that governments can’t get their tendrils into, and if they turn it on by default, someone else loses the ability to listen in on people’s conversations: the companies themselves.

If Facebook encrypts users’ conversations, it can’t mine the data for its own uses. That includes things like the News Feed algorithm, its digital assistant M, and—biggest of all—the data it sells to advertisers. That last one directly affects the company’s bottom line. The same goes for Google, Microsoft, Snapchat, and any other advertising-supported company that isn’t end-to-end encrypting messages by default. Whatever claims they make about valuing user privacy, and all that jazz, as long as they’re peeking into what you’re saying and doing, your conversations aren’t private. End of story.

Even with encrypted messaging, the provider has to store something to make it work. Signal, which is end-to-end encrypted, revealed that the FBI subpoenaed its user data. It doesn’t have much: “only account creation date & last login time,” according to Edward Snowden. Apple, too, logs some user data, such as who you messaged and when, but not the content.

In Apple’s case, this is the sort of metadata the NSA claims to have collected on phone calls. It’s still dangerous if it gets out—or gets subpoenaed—but it’s not much use for marketing. Advertisers are less interested in who you’re talking to and more interested in what you’re talking about. This is why chatbots are so sinister. By presenting a friendly, playful personality that promises to do whatever you ask, a chatbot is an excellent tool for extracting your personal data. And what better way to get a good deal on a partnership with a company that wants to integrate with your chatbot than promising to share valuable user data?

Messaging, even when you’re not talking about anything “important,” is a gateway into our most intimate selves. That’s why the data is so precious to the NSA, to other governments, and to advertisers. By presenting a messaging service as private and secure, even when it isn’t by default, a tech company can override yet another defense mechanism savvy users rely on to keep prying eyes out of their lives. Even worse, most ordinary users aren’t even going to know or care, as long as the service does what they want, and does it well.

This is what everyone is banking on. Without education about the potential for mass data collection by private companies and government agencies alike, most people won’t be aware of the risks. Without a compelling narrative about why they should care, education about those risks will just be ignored. We all have something to hide: not necessarily illegal things, but aspects of ourselves we want to keep between us and the human being on the other end of the line. If we can’t keep people from prying into this most intimate space of our digital lives, what will convince them to butt out?