The marquee feature of the new iPhone 5S is a fingerprint scanner that can be used not only to unlock the phone, but also to make purchases from iTunes, the iBooks Store, and the App Store. It’s not the first smartphone to include a fingerprint scanner; the Motorola Atrix 4G got there first, and it didn’t exactly set the world on fire. Elsewhere, the Moto X may have an always-on microphone so the phone can respond when you summon it, [1] but it’s not tuned specifically to your voice. The iPhone 5S is the first phone with a real chance of making biometrics mainstream, and it has the potential to revolutionize security for the average user, in the mobile space and beyond.
The passcode on smartphones is an easy point of failure, and there have been plenty of alternative solutions. There’s the pattern-swipe lock screen that’s standard on Android phones, crackable by reading the smudges on the glass. Some phones use face detection to unlock, but that can be defeated simply by holding up a photograph. While thumbprint scanners can be exploited, it’s unlikely someone’s going to bother making a gummy thumbprint just to get into a phone. Short of a crook cutting off your thumb, fingerprint identification is far more secure than any other common form of smartphone lock. That is, if it works right, and that remains to be seen. Since Steve’s return to Apple, though, they’ve tended not to release features that are half-baked. Three-quarters baked, sure, but not half-baked.
Biometric technology hasn’t been huge in the consumer space. You can find it on some business laptops, but most of us are still stuck memorizing strings to log into our computers, if we even bother with that. [2] Fingerprint readers are known to be somewhat finicky, and cheaper readers that use less sophisticated scanning mechanisms and identification algorithms will either frustrate a user by not letting them in, or be laughably insecure. From the Touch ID setup process, it looks like Apple is trying to capture as much information about a fingerprint as possible, which should reduce the chances of both a legitimate user being denied access and an illegitimate one getting in, but only Apple knows how much leeway the algorithm has.
If Touch ID in the iPhone 5S works well enough, it gives Apple a new way to tie a user to a device—and to their other devices. I don’t think it’s a leap to imagine an upcoming version of OS X (or perhaps OS XI) that will allow you to use your iPhone’s fingerprint scanner to log into your Mac, perhaps over Bluetooth 4.0. Half of that setup is already used by the two-factor authentication app Authy. Touch ID also overcomes one of the inherent security issues with NFC-based payment methods like Google Wallet: instead of a guessable PIN, someone could use the fingerprint scanner to authorize a transaction, possibly with Bluetooth 4.0 or iBeacon as the transmission mechanism instead of NFC, reducing the steps in the process. Touch ID-based purchases from Apple could easily be a dry run for this.
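Apple hasn’t said a word about opening Touch ID up to third-party developers, so treat the following as a speculative sketch rather than anything real: a minimal Swift example of how a biometric-gated authorization could work, assuming Apple eventually exposes something like its LocalAuthentication framework to apps. The key property is that the fingerprint never leaves the device; the code only ever learns “matched” or “didn’t match,” and the sendApprovalToMac call is a hypothetical stand-in for whatever Bluetooth 4.0 or iBeacon transport would carry that result.

```swift
import LocalAuthentication

// Speculative sketch: gate an action (unlocking a Mac, authorizing a payment)
// behind a local fingerprint match. The print never leaves the device; the
// app only gets back a yes or a no.
func authorizeWithFingerprint(reason: String, onApproved: @escaping () -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that biometrics are enrolled and available on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        return
    }

    // Prompt for a fingerprint; the OS owns the sensor and the matching.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, evalError in
        if success {
            onApproved() // e.g. sign a challenge and ship it over Bluetooth 4.0
        } else {
            print("Not authorized: \(evalError?.localizedDescription ?? "cancelled")")
        }
    }
}

// Usage: the approved action here is a hypothetical transport call, nothing more.
authorizeWithFingerprint(reason: "Unlock your Mac") {
    // sendApprovalToMac() // hypothetical Bluetooth 4.0 / iBeacon channel
}
```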
The success of Touch ID all comes down to making it work well enough in these early days. If Apple’s support forums are flooded with users complaining that their iPhones won’t unlock no matter which way they put their thumb on the darn thing, Touch ID will not take off. I’ll give Apple the benefit of the doubt. I don’t think they’re so desperate that they’d ship a product-defining feature before problems like these could be caught in internal testing. Who knows how long Apple’s been working on making this work, after all? I’m also sure that Apple will keep tweaking the algorithms that identify fingerprints once people are busily rubbing their thumbs against the sensor. We’ll have to wait: first until the press embargo lifts and reviewers get to share their experiences, and then until the feedback comes in from ordinary users.
People who claim the iPhone 5S isn’t “revolutionary” enough aren’t thinking long term. Today it’s unlocking our phones with a thumbprint. We won’t be unlocking our entire lives with a thumbprint tomorrow, but in a few years we may look back on the humble fingerprint scanner in the home button of an iPhone as a sublimely brilliant idea that changed the face of security for the average user. Technological revolutions don’t happen in huge jumps, or by creating “new product categories.” They come from the adjacent possible: using the tools we’ve already developed in new and exciting ways, and that’s what Apple excels at. I just wish I didn’t have to wait two years to try it out myself.
This is the sort of thing that should make people more paranoid than the idea of someone’s fingerprints being stored in a database, whether locally or networked. ↩
Recently, Marco Arment tweeted about jobs that make the world worse. Naturally, because Marco is both wealthy and comes from a place of privilege—neither of which I think he has denied—his views are tainted. Speaking as someone who worked in one of the careers Marco demonized (hint: not Patent Enforcement), I disagree with those who call him out. Marco is right. Maybe not 100% right, but he’s right about telemarketing—well, some forms of it.
First, some background. While attending college, I landed a part-time telemarketing job with the Walnut Street Theatre in Philadelphia. I spent half the year calling subscribers to raise money for the theatre’s Angels fund, a couple of months calling existing subscribers to renew their subscriptions for the next season, and the rest of the year calling to sell new subscriptions. I worked this job, with a few short gaps, from January 2005 to August 2012. After graduating college in 2008, I struggled to find full-time work until a headhunter convinced me to take a job with Market Resource Partners, a Philadelphia-based “lead generation” firm—business-to-business telemarketing—where I worked from October 2008 to February 2010.
I took both jobs because I needed money, end of story. In the annals of telemarketing, the job with the Walnut Street Theatre was actually a lucky break. It involved selling a legitimate product, with very little truly cold calling. The environment was supportive, the pay was good, and the free theatre tickets were a great perk. While nobody wants to be called at dinner time, as long as I was friendly, polite, and brief, most people were okay with it. [1] I made a good living there during college, and even stuck with it for the extra income after I graduated. I also kept it up because I love the performing arts, and felt that I was actually making the world a slightly better place by raising money and putting butts in the seats of a 200-plus-year-old arts institution. There are few telemarketing jobs that can inspire that feeling.
Case in point: the fourteen months I spent as a business-to-business telemarketer. This is where I realized how awful the job truly was, and why B2B telemarketing as a profession deserves to wither and die—hopefully soon. My role with Market Resource Partners was to contact IT departments and convince them to take a second call from a representative of one of the most beloved companies in technology at the time, Sun Microsystems. [2] As much as people hate being called during dinner, they really hate being called at work—especially IT people. When you’re trying to get work done, the last thing you want is someone interrupting you and demanding your time, especially to sell you on new IT hardware. It’s why large organizations establish safeguards to prevent people like me from reaching the people who make IT purchasing decisions. They have enough to do.
It’s worth noting here that I sucked at my job. After fourteen months, and enough bosses tut-tutting at my lack of performance, I was shown the door. I was glad, too. Part of the reason I sucked was that my skills at “warm” calling theatre fans and subscribers didn’t translate easily to calling harried IT professionals and pushing a product I didn’t believe in. Another reason I sucked was that I knew I wasn’t doing anything worthwhile. I was interrupting people to get them to take a call from an actual sales person—not even one who worked for Sun, but for a Sun Value-Added Reseller—and taking them further away from the actual work they had to do.
I realized pretty quickly that my role was that of an easily ditched middleman. If an IT person needed new hardware, what was stopping them from doing their own research? We have the Internet for that, and they can squeeze a Google search or two into their day far more easily than they can squeeze in a sales call. When I was told to pack up and leave, I did so gladly, knowing that I would no longer be debasing myself at a job with no intrinsic value beyond the money the company made per lead generated—money I never actually saw. It’s telling that the Monday after I was let go, the company laid off about half its employees when Oracle, which had recently purchased Sun, terminated the contract. Thus always to middlemen, though I felt bad for the co-workers who were unceremoniously shitcanned.
“Dinner time,” I quickly learned, runs from about 5PM to 9PM. Shifts at the Walnut were typically from 6PM to 9PM, so any call could conceivably come as someone was sitting down to dinner. ↩
N.B.: the epithet “most beloved company” applied to Sun Microsystems should be read with as much sarcasm as the human brain can muster. ↩
On September 10th, Apple will announce the iPhone 5S, a slightly updated iPhone 5 with the same basic form factor and one new hardware-specific bonus feature, possibly a fingerprint scanner. They will also announce a “low-cost” model, the iPhone 5C, with a plastic shell in multiple colors. They will announce iOS 7’s release date, and a small revision to the Apple TV and iPod lines, including (possibly) a new iPod nano design (again).
Apple’s stock price will go down on the news that Apple is not innovative enough with their latest updates to the iPhone line, despite the fact that they have always released smaller updates in off-years, going back to the iPhone 3G[S]. The Apple TV will be criticized for not being an actual TV, despite there being no indication that Apple will actually manufacture a television set. Pundits will also decry the lack of an iWatch, again despite there being no indication Apple has such a product anywhere close to ready to announce. Some will also complain that Apple did not announce new iPads or Mac hardware, despite Apple typically holding stand-alone events for those announcements. More articles will appear complaining about Tim Cook’s lack of leadership and what a miserable CEO he is compared to Steve Jobs.
The iPhone 5S and iPhone 5C will go on to sell a stupid number of units the weekend of their release, and the stock price will rebound slightly. Meanwhile, the rumor mill will wind up for the upcoming iPad and Mac event that is likely to happen in October. Pundits will complain that the iMac does not have a Retina display, that the new Mac Pro is overpriced, and that Mavericks is not “revolutionary” enough. Then the hype cycle can begin in earnest for WWDC, where Apple is certain to announce the new Apple TV SDK, the xMac, and the Jony Ive-redesigned OS X 10.10.
Repeat ad nauseam, each year. Wake me when my AT&T contract’s up for renewal.
There was a time when being on the Internet allowed you to reinvent yourself. You could be an acne-ridden fifteen-year-old sleeping in an unfinished basement, but to the folks in a chatroom, you could be anyone or anything. This was in the days before services like Facebook tied in heavily with our offline lives. With nothing tying our online identity to a real person except an IP address, and maybe an email address, we could reinvent ourselves. The practical upshot was that, in different communities online, we could be different people. In some corners of the Internet we still can, but those corners are in the minority.
Those of us who came of age online in this period know that on one server or message board we were upstanding citizens and contributors, while on another we were the thorn in the moderator’s side. We knew which communities we valued, and we knew what we wanted to keep secret. If there’s anything we’ve lost in the transition to a social media-dominated culture, it’s the idea that we can be different people in different places. Perhaps this is coming back with decentralized and ephemeral photo and messaging services like Kik and Snapchat, popular with the kids, but there’s too much money to be made from preventing true anonymity. Facebook and Google have too much to gain by tying our youthful indiscretions to our accounts. Of course, we could always just change our names.
As Facebook, Twitter, and Google become the identity backbone of the Internet, we will continue to lose the ability to switch our personalities depending on who we’re dealing with online and where. It’s true that anonymity comes at the price of providing cover for troublemakers—4chan’s /b/ board comes to mind. But not everyone wants to live in public, and the option should exist for us to have an alternate identity online for our shadow self, the one we don’t want our parents, employers, and government to see—and one that can’t be algorithmically tied to our legit, public identity for the purpose of selling advertisements (or worse). I know this can be done, though not easily. I worry that it will become harder in the future, and that the people who need it most will lose out. There’s no conspiracy-theory thinking here. It’s economics.
The problem is that technology is not a democracy. Even open source. Especially open source. This is not a polemic against open source as either philosophy or technology. [1] Open source has brought us many awesome things, including the fundamental technologies this website is built upon. Open source proponents just tend to be the most vocal among the group that insists that things like Internet access, ubiquitous computing, and learning to code will make the world freer and more democratic. Any time someone promotes anything as a cure-all, even if it’s something you’re generally in favor of, it should set off red flags.
Bill Gates said of Google’s Loon project to bring the Internet to third-world countries using balloons: “When you’re dying of malaria, I suppose you’ll look up and see that balloon, and I’m not sure how it’ll help you. When a kid gets diarrhoea, no, there’s no website that relieves that.” He’s right. Access to information sits way up at the top of Maslow’s Hierarchy of Needs, far above food, water, and safety. There are more important things we can do—using technology, too—to improve the lives of people around the world. Just hooking them up to a network is solving the wrong problem. It’s hard to write a web app on your $100 laptop with a crank charger when you can’t get food, or are being forced out of your home by people with guns. Let’s use the tools we have to solve those problems first. Of course, doing that is much, much harder than building Internet-connected balloons, which is why most technological “solutions” to the real problems consist of merely “raising awareness.” Turning all our Twitter avatars green really helped in Iran, huh?
Furthermore, we’ve seen in the last month or two just how much of a double-edged sword technology really is. The exact same tools we use to connect with each other, share knowledge, and watch cat videos can easily be used to create a contemporary version of Orwell’s Telescreen, only more dangerous. [2] We can’t see them, but they can see us, and everything we do. This isn’t freedom—it’s borderline fascism. But it’s also the nature of the beast we’ve created, by tying connectivity to corporations that are motivated not by the public good but by benefiting stockholders, and by allowing a government security apparatus to work unchecked because we’re afraid of what the alternative might be. The open source wonks will claim that it’s our own fault for trusting closed software and hardware, but trusting a group of well-intentioned programmers is just as dangerous. No matter what happens, the majority of people will not be writing code. They’ll be using technology as an appliance, and they won’t care who wrote the software or who audited the code for security backdoors.
Democracy is about an equal voice, and while technology can give more people a voice, it doesn’t make those voices equal in any sense. The users of technology are going to remain subservient to the people who actually make it, whether they’re using Linux or OS X. Technological literacy only goes so far because, as we’ve seen and as I’ve said, the majority of people don’t care much about how things work—only that they do. No matter what, someone is going to be outside the sphere of the technological elite, either because they chose to opt out or because they never chose to opt in. Democracy doesn’t work when a majority of people don’t even get involved. We see it in the current political environment of the United States, and we see it in the current technological environment. One of these is more likely to change than the other.
Not least because I don’t want to open the floodgates of e-mails calling me a closed-source-using fascist. ↩