In all the mystery around Snapchat’s insane valuations and its dismissal of a $3 billion buyout by Facebook, there are claims that Snapchat is being used by Wall Street for insider trading. Even if this isn’t the case, it’s Snapchat’s responsibility to ensure that its service is used in an ethical manner. That may be too much to ask from a company that made its bones by giving people a way to share temporary pictures of their naughty bits.
Hardware, software, and services offer their makers different levels of control over how they can be used. Apple’s iTunes EULA has a clause requiring that it not be used to make nuclear weapons, but if Kim Jong Un uses iTunes on his Mac, there’s precious little Apple can do about it. When was the last time you heard of anyone taken to court over a EULA? Once a piece of hardware or software is in the user’s hands, most bets are off.
It’s online services that have the most ability to influence what they’re used for. Many online services do enforce their terms of service, to a degree. Child porn is guaranteed to get services to act, for obvious reasons: Google and Microsoft have both created programs to block it from their search results. On Facebook, nudity is often a ticket to account suspension. Yet these same services are notoriously slow to shut down users who sexually harass and threaten other users, behavior which is also against their terms of service.
Why? It’s easier to algorithmically identify porn than it is to identify hate speech and harassment. In Snapchat’s case, having people scan the 350 million-plus photos shared on Snapchat each day for questionable or illegal content would require more manpower than is feasible. (That’s nearly a quarter million photos per minute.) Complicating matters further, adding that oversight from the start would have hampered Snapchat’s growth.
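A quick back-of-the-envelope check makes the scale problem concrete. The per-minute figure follows directly from the 350 million-per-day number cited above; the reviewer-throughput and shift-length figures below are purely hypothetical assumptions, chosen only to illustrate the order of magnitude:

```python
# Back-of-the-envelope check of the moderation numbers above.
photos_per_day = 350_000_000  # figure cited in the text

per_minute = photos_per_day / (24 * 60)
per_second = photos_per_day / (24 * 60 * 60)

print(f"{per_minute:,.0f} photos/minute")  # ~243,056 -- "nearly a quarter million"
print(f"{per_second:,.0f} photos/second")  # ~4,051

# Hypothetical assumptions: one reviewer clears a photo every 2 seconds
# and works an 8-hour shift. Head count needed for full human review:
reviews_per_shift = (8 * 60 * 60) / 2  # 14,400 photos per reviewer per shift
reviewers_needed = photos_per_day / reviews_per_shift
print(f"~{reviewers_needed:,.0f} reviewer-shifts/day")  # ~24,306
```

Even with those generous assumptions, full human review would take tens of thousands of reviewer-shifts every day, which is why enforcement leans on algorithms wherever it can.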
All of the tools we create can be used for good or evil. Yet as long as we are in a position to influence how our tools are used, it is an ethical requirement that we use that influence to make everyone’s lives better, not just line our pockets. Companies that acknowledge the potential use of their products for evil, who can act to prevent it, and do not, are committing evil by proxy. I am certain that safeguards can be put into place to, if not prevent unethical behavior on the part of users, at least mitigate its consequences. These safeguards have to be ingrained in the very fabric of the service from the start, and that requires thought far beyond what most people put into their apps. Perhaps we’ll see it in time.