The marquee feature of the new iPhone 5S is a fingerprint scanner that can be used not only to unlock the phone, but also to make purchases from iTunes, the iBooks Store, and the App Store. It’s not the first smartphone to include a fingerprint scanner, but the Motorola Atrix 4G didn’t exactly set the world on fire. Elsewhere, the Moto X may have an always-on microphone so the phone responds when you summon it, [1] but it’s not tuned specifically to your voice. The iPhone 5S is the first phone to make biometrics mainstream, and it has the potential to revolutionize security for the average user, in the mobile space and beyond.
The passcode on smartphones is an easy point of failure, and there have been plenty of alternative solutions. There’s the pattern-swipe lock screen that’s standard on Android phones, crackable by reading the smudges. Some phones use face detection to unlock, but that can be defeated simply by holding up a photograph. While thumbprint scanners can be exploited, it’s unlikely someone’s going to bother making a gummy thumbprint just to get into a phone. Short of a crook cutting off your thumb, fingerprint identification is much more secure than any other form of smartphone security. That is, if it works right, and that remains to be seen. However, since Steve’s return to Apple, they’ve tended not to release features that are half-baked. Three-quarters baked, sure, but not half-baked.
Biometric technology hasn’t been huge in the consumer space. You can find it on some business laptops, but most of us are still stuck memorizing strings to log into our computers, if we even bother with that. [2] Fingerprint readers are known to be somewhat finicky, and cheaper readers that use less sophisticated scanning mechanisms and identification algorithms will either frustrate users by not letting them in, or be laughably insecure. Judging from the Touch ID setup process, Apple is trying to capture as much information as possible about a fingerprint, which should reduce the chances of both a legitimate user being denied access and an illegitimate user gaining it, but only Apple knows how much leeway the algorithm has.
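To make that tradeoff concrete, here’s a toy sketch in Python of a threshold-based matcher. To be clear, this has nothing to do with Apple’s actual algorithm; the feature sets, similarity measure, and threshold values are all invented for illustration.

```python
# A purely illustrative threshold matcher, NOT Apple's algorithm.
# Feature sets, similarity measure, and thresholds are all made up.

def similarity(enrolled: set, scan: set) -> float:
    """Jaccard similarity between two sets of extracted fingerprint features."""
    if not enrolled or not scan:
        return 0.0
    return len(enrolled & scan) / len(enrolled | scan)

def matches(enrolled: set, scan: set, threshold: float) -> bool:
    # High threshold: fewer impostors accepted, but more legit scans rejected.
    # Low threshold: the reverse. The "leeway" lives in this one number.
    return similarity(enrolled, scan) >= threshold

enrolled = {"ridge1", "ridge2", "whorl3", "minutia4", "minutia5"}
smudged = {"ridge1", "whorl3", "minutia4"}  # a partial, off-angle press

print(matches(enrolled, smudged, threshold=0.9))  # False: the owner is locked out
print(matches(enrolled, smudged, threshold=0.5))  # True: forgiving, but riskier
```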
If Touch ID in the iPhone 5S works well enough, it gives Apple a new way to tie a user to a device—and to their other devices. I don’t think it’s a leap to imagine an upcoming version of OS X (or perhaps OS XI) that lets you use your iPhone’s fingerprint scanner to log into your Mac, perhaps over Bluetooth 4.0. Half of that setup is already used by the two-factor authentication app Authy. Fingerprint authentication also overcomes one of the inherent security issues with NFC-based payment methods like Google Wallet: instead of a guessable PIN, someone could use the fingerprint scanner to authorize a transaction, possibly with Bluetooth 4.0 or iBeacon as the transmission mechanism instead of NFC, reducing the steps in the process. Touch ID-based purchases from Apple could easily be a dry run for this.
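To sketch what I mean (and this is pure speculation, with every key, function name, and protocol detail invented for illustration), the flow could look like a standard challenge-response: the Mac sends a random challenge over Bluetooth, and the phone signs it only after a local fingerprint match, so the print itself never leaves the device.

```python
# Speculative sketch of a fingerprint-gated challenge-response, NOT any
# real Apple protocol. Assumes the phone and Mac shared a secret key at
# pairing time; HMAC stands in for whatever a real implementation would use.
import hashlib
import hmac
import os

PAIRED_KEY = os.urandom(32)  # exchanged once, at pairing time (assumption)

def mac_issue_challenge() -> bytes:
    """The Mac (or a payment terminal) sends a random challenge over Bluetooth."""
    return os.urandom(16)

def phone_respond(challenge: bytes, fingerprint_ok: bool):
    """Sign the challenge, but only after a *local* fingerprint match.
    The fingerprint itself never crosses the wire; only the signature does."""
    if not fingerprint_ok:
        return None
    return hmac.new(PAIRED_KEY, challenge, hashlib.sha256).digest()

def mac_verify(challenge: bytes, response) -> bool:
    if response is None:
        return False
    expected = hmac.new(PAIRED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = mac_issue_challenge()
print(mac_verify(challenge, phone_respond(challenge, fingerprint_ok=True)))   # True: unlock
print(mac_verify(challenge, phone_respond(challenge, fingerprint_ok=False)))  # False: stay locked
```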
The success of Touch ID all comes down to making it work well enough in these early days. If Apple’s support forums are flooded with users complaining that their iPhones won’t unlock no matter how they put their thumb on the darn thing, Touch ID will not take off. I’ll give Apple the benefit of the doubt: I don’t think they’re desperate enough to release a product-defining feature so early that problems like these wouldn’t have been caught in internal testing. Who knows how long Apple’s been working on making this work, after all? I’m also sure Apple will keep tweaking the fingerprint identification algorithms once people are busily rubbing their thumbs against the sensor. We’ll have to wait, first until the press embargo lifts and reviewers get to share their experiences, and then until the feedback comes in from ordinary users.
People who claim the iPhone 5S isn’t “revolutionary” enough aren’t thinking long term. Today, it’s unlocking our phones with a thumbprint. We won’t be unlocking our entire lives with a thumbprint tomorrow, but in a few years, we may look back on the humble idea of a fingerprint scanner in the button of an iPhone as a sublimely brilliant one that changed the face of security for the average user. Technological revolutions don’t happen in huge jumps, or by creating “new product categories.” They come from the adjacent possible: using the tools we’ve developed in new and exciting ways, and that’s what Apple excels at. I just wish I didn’t have to wait two years to try it out myself.
This is the sort of thing that should make people more paranoid than the idea of someone’s fingerprints being stored in a database, whether locally or networked. ↩
On September 10th, Apple will announce the iPhone 5S, a slightly updated iPhone 5 with the same basic form factor and one new hardware-specific bonus feature, possibly a fingerprint scanner. They will also announce a “low-cost” model, the iPhone 5C, with a plastic shell in multiple colors. They will announce iOS 7’s release date, and a small revision to the Apple TV and iPod lines, including (possibly) a new iPod nano design (again).
Apple’s stock price will go down on the news that Apple is not innovative enough with their latest updates to the iPhone line, despite the fact that they always release small updates in off-years, and have since the iPhone 3G[S]. The Apple TV will be criticized for not being an actual TV, despite there being no indication that Apple will ever manufacture a television set. Pundits will also decry the lack of an iWatch, again despite there being no indication Apple has such a product anywhere close to ready to announce. Some will also complain that Apple did not announce new iPads or Mac hardware, despite Apple typically holding stand-alone events for those announcements. More articles will appear complaining about Tim Cook’s lack of leadership and how miserable a CEO he is compared to Steve Jobs.
The iPhone 5S and iPhone 5C will go on to sell a stupid number of units the weekend of their release, and the stock price will rebound slightly. Meanwhile, the rumor mill will wind up for the iPad and Mac event likely to happen in October. Pundits will complain that the iMac does not have a Retina display, that the new Mac Pro is overpriced, and that Mavericks is not “revolutionary” enough. Then the hype cycle can begin in earnest for WWDC, where Apple is certain to announce the new Apple TV SDK, the xMac, and a Jony Ive-redesigned OS X 10.10.
Repeat ad nauseam, each year. Wake me when my AT&T contract’s up for renewal.
Why is programming so popular? Look at all the overnight success stories surrounding startup culture. Some clever guy learns to program in his spare time, creates an online service or iOS app, gets bought out by Google, Facebook, or Yahoo!, and makes multiple millions. A kid just out of college, or even out of high school, takes a programming job with a small company based purely on his GitHub contributions, and makes millions when that company gets bought out. Or someone’s iOS app becomes an overnight success, making millions of dollars a year. Of course you should learn to code—you can get rich! Sure, if you’re lucky enough, or get hired by the right company at the right time.
Of course, as technology infiltrates more of our lives, we’ll need more programmers to build, and to maintain, the apps and services we use. For every high-profile, six-figure job writing code for a would-be world-changing startup, there are plenty of thankless programming jobs keeping financial software up to date with the latest laws, or handling equally essential but banal tasks. Those typically pay well enough, but they lack the glamor of creating a web or iOS app. Make no mistake: there is a skill shortage in the United States when it comes to engineers and programmers—well, maybe. There are plenty of skilled programmers that companies can bring in through the H-1B program, or increasingly outsource to, for far less cost.
More importantly, taking an online class can teach you a programming language, but that’s not the same as knowing how to use it. The best programmers are those who aren’t just in it for a paycheck, but actively enjoy writing code and solving problems. They have experience, and an expertise that comes from spending more time than a weekend doing exercises from an online tutorial. If you’re hiring a developer, would you take the homeless guy who learned to write code six months ago and has barely any experience, or the five-year veteran who not only knows the language your product uses, but several others, and has a few open source apps of her own out there? I know who I’d hire.
None of this is to say programming isn’t a skill worth learning. There are plenty of other, equally useful skills people will need. Programmers are often great at making things that work, but making something useful for normal people requires design skills. It requires someone who can write intelligible documentation. Running a company requires people management skills, money management skills, and the ability to press the flesh and connect with the people who can give you seed capital and the like. It’s not as easy as “write code, make money,” which seems to be the pitch I’m hearing. It’s a myopic worldview that puts value only on people who know how to write code, and it smells of snake oil to boot.
Give Me Convenience or Give Me Death was the final official release by The Dead Kennedys, [1] collecting various loose tracks, B-sides, and live cuts. A fitting title for what is, arguably, a cash-grab compilation album, albeit an essential one. The Dead Kennedys were always critical of mindless consumerism, and “give me convenience or give me death” is a line of thought that seems to pervade our relationship with technology and how we consume it.
Case in point: the price of apps. It’s no secret the app store model has caused the price of software to drop overall. If a developer wants to make money on mobile platforms, they can write for iOS, where users are more likely to pay for apps—but that’s no guarantee they’ll make anything when there are so many free apps. Vesper may be a beautiful, well-designed app with high-profile names behind it, but Evernote does more for free, and it’s hard to compete with that. [2] Either way, it’s hard to make money selling mobile apps unless you reach critical mass—and with nearly a million apps in the store, and more added daily, it’s hard to get noticed. [3]
So developers are doubly screwed. An audience that leans towards free options, plus a store where it’s almost impossible to get discovered, means that most of the people getting rich are the ones selling how-to guides on making whatever type of app is trending on Apple’s App Store at the moment. We saw the same thing happen during the blogging boom, when people thought they could get rich by posting banal crap and covering it with Google Ads. [4] We can’t have it both ways, and the market now leans towards convenience for app “buyers” rather than developers. No skin off Apple’s back, or any other hardware manufacturer’s: a huge app store is just another feature they can use to sell the hardware.
All of this makes the backlash over freemium apps and in-app purchases very interesting. There’s no shortage of ethical issues around games implementing freemium models, but freemium works within the App Store model—if you’re lucky enough to get noticed. John Moltz’s “very mild defense” of freemium apps and games makes the point better than I could. Moltz singles out Jetpack Joyride in his piece, which made me smile as I dropped $1.99 on the “Double Coins” upgrade, though Jetpack Joyride is easily a game I’d have been satisfied to pay $4.99 for, with no in-app purchases. These apps are worth something, and if we’re paying what we think they’re worth, how is that bad? It’s certainly convenient.
But it isn’t just software where convenience trumps value. How much did you pay for your smartphone? In the US, at least, many people get subsidized phones at a deep discount, or even for free, up front. We “know” we’re paying off the price of the phone—and then some—over the two years of the contract, but few of us are able to drop $649 all at once on an unlocked iPhone 5. Even the sort of cheap Android phone you can find for free (with contract) in a cereal box costs $139.99 off-contract at Best Buy. It’s more convenient for us, and for the carriers, to maintain the impression that these devices are cheap, so that we’ll buy expensive plans and sign nearly iron-clad contracts. The alternative? Pay more up front, and be inconvenienced.
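Some rough back-of-the-envelope math shows how the “cheap” phone illusion works. The plan prices below are my own guesses, not any carrier’s actual rates, but the shape of the result holds: the subsidized phone can easily cost more over the life of the contract.

```python
# Back-of-the-envelope subsidy math. The plan prices are assumptions,
# loosely based on 2013 US pricing; only the $649 and $199 figures are real.
UNLOCKED_PRICE = 649.00     # unlocked iPhone 5, paid up front
SUBSIDIZED_PRICE = 199.00   # same phone with a two-year contract
POSTPAID_PLAN = 90.00       # assumed monthly rate on a subsidized contract
PREPAID_PLAN = 60.00        # assumed cheaper rate for a bring-your-own phone
MONTHS = 24

subsidized_total = SUBSIDIZED_PRICE + POSTPAID_PLAN * MONTHS
unlocked_total = UNLOCKED_PRICE + PREPAID_PLAN * MONTHS

print(f"Subsidized over two years: ${subsidized_total:,.2f}")  # $2,359.00
print(f"Unlocked over two years:   ${unlocked_total:,.2f}")    # $2,089.00
```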
I have no solutions to offer. Things are in upheaval, and it’s hard to judge how big the waves really are from our vantage point. All I know is that it was not always like this, and it will not always be like this. That, and if you’re a developer, it probably couldn’t hurt to keep your day job until you make an app that does well. If you’re coming from the consumer side, just keep in mind that these awesome apps and great games are made by real people who need to feed their families. They’re worth something, even if they have a big “FREE” next to them in the store.
I side with Jello Biafra, and refuse to acknowledge the cash-grab live albums and Jello-free tours as The Dead Kennedys. ↩
Then again, Vesper isn’t targeting the same market as Evernote, but they’re both fundamentally note-taking apps, so the comparison stands to a point. ↩
Full disclosure: I once ran Google Ads on Sanspoint, but that was a long time ago, and I didn’t expect to get rich from it. I did expect to make more than the pennies I did make… which I don’t think I ever actually saw. ↩
In the wake of the PRISM scandal, I’ve seen people I admire and follow online write about cutting Google out of their digital lives. Many of them were already unhappy with Google poking its nose into their personal data, and its collusion with (or capitulation to) the NSA is all the more reason for them to jump ship. They’re right to do so. NSA aside, I still use Google for a lot of things—including all of my email—and will continue to do so until the tradeoffs become unbearable.
I’m not worried about Google looking through my email to serve advertisements—especially since I don’t see them. [2] Even if I didn’t block the ads, gMail’s ads are unobtrusive by online-ad standards. Google is also fairly up front, as far as internet companies go, about scanning email contents to target ads. As long as that’s all they do, and there’s little reason to believe otherwise (again, ignoring PRISM), I’m okay with that. They have to pay for the service somehow. It’s a fair trade: Google lets someone pay them to show me an ad for something I won’t click on (or see), and I get a really great web-based email service. Roundcube looks nice, and may be a good solution if I ever end up hosting my own email, but in the browser and with Mailbox on my iOS devices, I can tear through my email with ease. [3] Switching to another service means giving that up, and I’m not prepared to do it.
Of course, I don’t use Google for everything. Now that iCloud mostly has its act together, I prefer to use it to keep my contacts and calendar in sync across my devices, rather than rely on Google. I’ve opted out of Google+ for good. My photos are on Flickr, my data on Dropbox, and I pay for App.Net. Google is just one part of a balanced ecosystem of services I use online. That’s safer than giving one company control of my digital life, even if Google is the company with the deepest integration into the heart of it. But it works, and that’s more important to me than control, or some notion of privacy from advertisers.
It costs $50/mo to host a Mac mini with Macminicolo.net, plus the cost of a Mac mini, or an additional $100/mo to rent one. Either way, that’s money that comes out of more pressing expenditures for me. (Damn student loans…) ↩
Yes, I use ad blocking software in all my browsers, though I’m willing to unblock sites that are unobtrusive, or ad networks that are ethical. In any case, it means I have a gMail experience, in the browser, that is ad-free and painless. ↩
I only signed up for Mailbox after the Dropbox acquisition. Dropbox is another company that I am willing to trust with my data (yet again, ignoring PRISM). They have a valid business model, and though they had a security issue last year, they seem to be on top of their game now. ↩
It helps that many of the alternatives to Reader are better than what they replace, in no small part because they’re actively developed. ↩