Microsoft just announced some neat new hardware, including the Surface Studio desktop computer, with its equally neat Surface Dial controller. Considering how weak-sauce Apple's computer offerings have been lately, I'm glad to see Microsoft pushing the desktop, laptop, and tablet to strange new places. There isn't much they showed off for the average user, but boy howdy are they making a push toward the creative market, a market that used to be the exclusive domain of Apple. Bare minimum, it's a way cooler demo than the Magic Toolbar's gonna be.
It used to be that Microsoft's biggest strength was its dominance in the workplace. Even when the creatives loved their Macs at home (when nobody else did), creative apps had to become Windows-first apps, because that's where the money was. People could get away with pirating Photoshop at home, because Adobe was making bank on site licenses to companies with thousands of Windows machines. Now, Macs are (slowly) becoming the new IT hotness, and Microsoft needs to get a thin edge of their wedge in somewhere. Why not Apple's old market?
One thing Microsoft is great at is coming up with neat ideas for hardware and software. Under Satya Nadella, they've also been great at shipping them. But there are two big problems Microsoft needs to solve if they want to coax the creative market away from Apple. Problem one is Windows. It's come a long way in the last few years, but Windows is still a clunky OS loaded with far too much legacy cruft. Microsoft needs to make a clean break with the past and pull a Mac OS X-style new operating system. Maybe even give up the Windows brand entirely.
That still leaves the other, bigger problem. How does Microsoft win over creatives without apps? This is the essential chicken-and-egg problem. Microsoft can't get users to switch without compelling apps for their platform. Microsoft went through this once before, with the Windows Phone saga. It got to the point where they were literally bribing companies to port their iOS and Android apps to Windows Phone. After all, nobody wants to switch to a phone platform that doesn't have the apps they use on their phone.
Whatever your complaints may be about the Mac and iOS ecosystem (and remember, Windows is also a tablet OS), apps probably aren't one of them. Well, maybe for the iPad, but that drum's being beaten a lot more quietly of late. Android and iOS are almost at parity when it comes to apps, at least on the phone, so switching between the two isn't that painful. Switching to Windows Phone, on the other hand? If Microsoft hasn't learned from that debacle, however cool the new Surface hardware platform is, it's not going to go very far.
Without users who will buy apps, app developers aren't going to make apps for Microsoft. A compelling new suite of hardware, a refreshed operating system, and cool new input methods will only get them so far. Microsoft's suite of first-party apps, like Paint 3D and that Minecraft thing, will help. It just remains to be seen how much. I hope they get some traction, if only because it might knock Apple out of their torpor and get us some cool, powerful, and feature-full hardware and software for the Mac and iOS. Without healthy competition, the tech industry goes nowhere. I'm excited to see things heat back up again.
Pity the poor personal computer. Its time is swiftly coming to a close, eclipsed by its progeny, the tablet. Hampered by legacy architecture and legacy operating systems, the PC will soon fade away from desks at home and in offices. Instead, PCs will live out of sight and out of mind, stacked to the rafters in server closets and data centers. A dignified end for a technology that changed the world.
Well, let's not be too hasty. Steve Jobs's famous metaphor about trucks and cars from the iPad introduction is an apt way of thinking about tablets and PCs, but only up to a point. In the intervening years, tablets have gained capabilities on par with (light) trucks, as in the case of the iPad Pro. During the same time, PCs have become more car-like, as with the single-port Retina MacBook. And that's muddied the waters plenty, without even getting into the world of convertible hybrid tablets, Windows 10, and Chromebooks.
I'll put myself in the camp that sees laptop/tablet hybrids as a short-term solution until the tablet and PC divide fully shakes out. The dividing line will be drawn when we establish the things a traditional personal computer can do that a tablet can't, and vice versa. There are some clear lines now: you can't develop applications on an iPad, for the most part, though this is certainly going to change. You can't do heavy graphics work, or anything more complicated than the most basic audio and video editing, but this is only for now.
More difficult to change is how tablets are locked into themselves. You can't use a tablet with an external display, except to give a presentation. There are limited options for input and output as well. Latency and transmission speeds for wireless connections will improve with time, but for now, anything that requires real-time input or output is hampered. What wireless solutions we have now for the tablet to escape itself are kludges and hacks.
As tablets get more powerful, they will expand to let you do what you can do on your traditional PC. In tandem, PCs will become more powerful too. A PC, even in a laptop form factor, has more headroom for computing power, both in chip size and in thermals. Nobody wants a tablet with a cooling fan, after all. That increased headroom opens the personal computer up to all kinds of new applications that will, in time, trickle down to the tablet, by which point the personal computer will have moved on to applications that demand still more power.
It’s better to think of the tablet as a sort of computing appliance. You buy it, and it serves its purpose of giving you the best of basic personal computing. When it gets old, and unsupported, you replace it. You could do this in the PC world, but it’s easier just to upgrade the components. That’s impossible on a tablet without a soldering iron, specialized components, and a lot of patience. Some bemoan the loss of upgradability, and we see it coming to the PC too, as they become more “car”-like. I’m not sure it’s such a bad thing.
How many of the problems that plagued PCs back in the day were a result of the componentized nature of the platform? Stick a bad RAM stick in your PC and watch things go sideways real fast. I know for a fact that part of why Windows is such a pain, even today, is that it needs to support a nearly infinite number of hardware configurations. Not everything will work well together. Upgradability of a PC’s components is nice and convenient, but opens up so much potential for hassle. Better to just plug stuff in without opening the case. At least there’s less possibility for things to go wrong. A point for the tablet, but also for the tablet-ified PC.
But why do you need more power right there at your fingertips, anyway? What about the Cloud? Who needs a computer at your desk, when the tablet (or ultra-portable PC) can offload its storage, processing power, and whatnot to some box somewhere in a data center? We're closer to making this a reality than we ever were when Larry Ellison proposed his Network Computer, but connectivity is the enemy again. American broadband is still crap, and it's crap in a lot of other countries too. Unless getting data to and from that remote machine is as fast as on a local one, this idea is stuck.
The biggest obstacle to tablets, though, is the entrenched culture of the personal computer in workplaces. Yes, there are some progressive companies integrating tablets into daily work life, but existing limitations mean that your average workplace isn't going to be able to swap out everyone's laptop for a tablet any time soon. This goes double for desktops. There are those who suggest that once the children of the tablet age, whose first computing experience was an iPad or iPhone, enter the workplace, the workplace will have to migrate to tablets. Not at all the case.
Kids growing up into a workplace IT culture built around PCs is not enough to shake things up. As anecdata: most kids in my age bracket, at least in the US, grew up with Macintoshes as the primary computers in their schools. (Hell, my middle school had Apple IIs in the computer lab until I was in 8th grade.) Macs are making inroads in the office, but of the six jobs I had after graduating college, only two were primarily Macintosh IT environments. Both were tech jobs. If a whole generation of kids weaned on the Macintosh couldn't get Macs on desks at your average workplace, what makes you think kids raised on tablets will?
None of this is to say that a tablet-first future isn’t coming. There needs to be something compelling enough to disrupt the entrenched legacy of the personal computer at home and at work. Tablets will get there first in the home. They already provide an easier way for people to do most of the ordinary computing tasks they would do on a PC. A few more iterations and OS upgrade cycles, and the tablet will be your average user’s primary computing device. The office, not so much.
For the time being, the PC will rule the desk. That is, unless you fit a specific niche where you can live within a tablet's limitations. Over time, yes, the tablet's limits will fall away, and tablets will let you do more, with more. We're not there yet, and I don't see it happening for at least a decade. The thin edge of the tablet wedge has gotten in, however. It's only a matter of time. Just don't assume your next traditional computer will be your last.
There's tension when it comes to technology companies and encrypted messaging. Snowden's revelations about PRISM and other NSA spying through tech companies have them promising more encryption to protect their public image. Yet, if they use good encryption that governments can't get their tendrils into, and if they do it by default, there are others who can't listen in on people's conversations: the companies themselves.
If Facebook is encrypting users' conversations, it can't mine that data for its own uses. That includes things like the News Feed algorithm, their digital assistant M, and, biggest of all, the data they sell to advertisers. That last one directly affects the company's bottom line. It's the same with Google, Microsoft, Snapchat, and any other advertising-supported company that isn't end-to-end encrypting messages by default. Whatever claims they want to make about valuing user privacy and all that jazz, as long as they're peeking into what you're saying and doing, your conversations aren't private. End of story.
Even with encrypted messaging, the provider does have to store something to make it work. Signal, which is end-to-end encrypted, revealed that the FBI subpoenaed their user data, of which they don't have much: “only account creation date & last login time,” according to Edward Snowden. Apple, too, logs some user data, such as who you messaged and when, but not the content.
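To make that split concrete, here is a minimal sketch of what an end-to-end encrypted service can and can't see. The record layout, field names, and values are all illustrative inventions, not Signal's or Apple's actual schema, and the encryption itself is elided:

```python
import json
from datetime import datetime, timezone

# Hypothetical server-side record for one message on an end-to-end
# encrypted service. Everything but the ciphertext is metadata the
# provider can read; the message body is opaque without the endpoints' keys.
server_side_record = {
    "account_created": "2015-06-01",  # the sort of data Signal could produce
    "last_login": "2016-10-01",
    "sender": "alice",                # who/when routing metadata,
    "recipient": "bob",               # like Apple's message logs
    "sent_at": datetime(2016, 10, 1, 12, 30, tzinfo=timezone.utc).isoformat(),
    "ciphertext": "8f3a9c...",        # unreadable by the provider
}

def visible_to_provider(record: dict) -> dict:
    """Everything except the message content is plaintext to the provider."""
    return {k: v for k, v in record.items() if k != "ciphertext"}

print(json.dumps(visible_to_provider(server_side_record), indent=2))
```

The point of the sketch: even a well-behaved end-to-end encrypted service still holds the who-talked-to-whom-and-when layer, which is exactly what a subpoena can reach.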
In Apple's case, this is the sort of metadata the NSA claims to have collected on phone calls. It's still dangerous if it gets out (or subpoenaed), but it's not great for marketing purposes. Advertisers are less interested in who you're talking to and more interested in what you're talking about. This is why chatbots are so sinister. By presenting a friendly, playful personality that promises to do whatever you ask, chatbots are excellent tools for extracting your personal data. And what better way to get a good deal on a partnership with a company integrating with your chatbot than promising to share valuable user data with them?
Messaging, even when you're not talking about anything “important,” is a gateway into our most intimate selves. That's why that data is so precious to the NSA, to other governments, and to advertisers. By presenting a messaging service as private and secure, even when it's not by default, a tech company can override yet another defense mechanism savvy users rely on to keep prying eyes out of their lives. Even worse, most ordinary users aren't even going to know or care, as long as the service does what they want, and does it well.
This is what everyone is banking on. Without education about the potential of mass data collection by private companies and government agencies alike, most people won’t be aware of the risks. Without a compelling narrative about why people should care, education about the risks will just be ignored. We all have something to hide, not necessarily illegal things, but aspects of ourselves we want to keep between us and the human being on the other end of the line. If we can’t keep people from prying into this most intimate space of our digital lives, what will convince them to butt out?
The dream of personal computing is unfettered access to and control of powerful hardware that you can make do anything your little heart desires. The reality of personal computing, at least in the internet age, is that you, and everyone else with a connection, have barely fettered access to and control of your hardware. I don't know if you can still plug a Windows XP machine into the Internet without a malware filter and have it turn into part of a botnet overnight, but it sure was that way for a while. I wouldn't dare stick even a modern Windows 10 machine on the open web without something to protect it.
When anything can be accessed, when countless people (individuals, businesses, and states) are all poking and prodding to find any possible weakness in the software and hardware we use, something has to give. In Apple's case, it's the freedom to run any random app on your iOS devices. Apple vets what is allowed to run on your iPhone and iPad, and it takes modifying the core software to change that. The big fear among some is that this will, eventually, come to the Mac, and that will be the end of days for free, open, personal computing. Hence the concern over recent changes to Gatekeeper, the macOS tool for ensuring software is safe, that make it harder to run unsigned apps.
What code signing does is allow a user to know the app in question is being created and distributed by a developer that has, bare minimum, coughed up $99 a year for an Apple Developer Account. The intention is not necessarily a money grab, but a security measure so that Apple, and the user, can know if an app has been modified and identify the author.
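Real code signing uses public-key signatures, which is what lets Apple tie an app to an identified developer; but the modification-detection half can be sketched with nothing more than a content digest. This is a simplified stand-in, not Apple's actual mechanism:

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """SHA-256 of the file's bytes; any modification changes this value."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Simulate a developer shipping a binary alongside its expected digest.
app = Path("app.bin")
app.write_bytes(b"original executable bytes")
published_digest = digest(app)

# An untouched download checks out...
assert digest(app) == published_digest

# ...while a tampered one (say, with an injected payload) does not.
app.write_bytes(b"original executable bytes\x00injected payload")
assert digest(app) != published_digest
print("tampering detected")
```

The signature layer adds the "who" on top of this "what changed": the digest is signed with the developer's private key, so a verifier with the matching certificate can check both integrity and authorship at once.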
It's not perfect, by any stretch. The BitTorrent client Transmission had malware injected into it, and its code signing key compromised. Twice. If the purpose of code signing is a trust measure confirming that an app you download and install off the open web is safe, this is a failure, though likely a failure on the Transmission team's part.
Code signing is also inconvenient. If you don’t have $99 a year to get your Apple Developer ID, you’re stuck up a creek with an unsigned app that is now harder for your potential audience to use. For open source apps, this is a huge pain in the behind, since someone now needs to serve as benefactor if they want a signed Mac version. Not every open source project will want or be able to afford that.
Yet, nearly five-hundred words in, I can’t say I’m terribly upset by this development for the reasons I brought up at the start. Free, unfettered, open computing is dangerous as hell. Being able to arbitrarily execute any piece of code that lands on your computer is a massive risk, and we’ve seen what happens when there’s no protection. If you need a reminder, go spin up an unpatched Windows XP VM or two.
Let’s face it. Users are idiots. All users. Even you and me. Even the top computer security experts, whether they’re working for Apple or the NSA, are liable to do something very stupid when they have to make a decision to keep their computers safe. Hell, case in point, some NSA operative left a powerful hacking tool on a server where it was compromised. What makes you think that you, smart and savvy computer user, won’t accidentally install a compromised executable and turn your machine into part of a botnet?
Restricting the average user's ability to run random software is, as painful as it might sound, for their safety. It's protection not just from malicious actors, of which there are many, but protection from their own idiocy. There's no way to allow a computer to run arbitrary code and protect a user from the consequences thereof. With so much of our lives entangled in a web that treats privacy as garbage, not preemptively locking things down is downright stupid.
Conversely, it should be possible for a user to open those locks. It shouldn't be easy, it shouldn't be obvious, and it should require them to absolve their OS provider of choice of responsibility if (or when) something blows up in their face. Digital security is doomed to be a cat-and-mouse game for eternity. That doesn't mean we should make it easier for the cats in the interest of… what, exactly? The average person does not want to think about how to protect themselves; they just want their stuff to work.
That needs to be the priority: keep everyone safe, and keep everyone running. You can do this without crushing the freedom to make software, even free software. It’s valid to quibble about Apple making their $99/year Developer Program mandatory for developers who just want to make and distribute an app, too. Despite that, a system that offers a signing certificate to anyone with a pulse isn’t going to be secure either. We’re doomed no matter what, but a solution that keeps the user as safe as possible for as long as possible is the best option we have.
So, the chip in the new iPhone is faster than any MacBook Air. The pure silicon power in the A10 is fueling another round of speculation about whether iOS will come to the desktop. It was something floated as an idea here and there for the past few years, but it's faded into the background as iOS gains more Mac features, and macOS gains more iOS features. Personally, I don't think iOS will come to the Mac any time soon. What I think will happen is something a bit more radical: a new version of macOS built with the underpinnings of iOS instead.
It's not as crazy as it sounds. If you recall the original iPhone announcement, Steve Jobs didn't say the iPhone ran iPhone OS; he said it ran OS X. It's not exactly OS X, but iOS and macOS share the same technology at the core. iOS also serves as the underpinnings of Apple's other two operating systems: watchOS and tvOS. Thought of this way, iOS began as a stripped-down, mobile- and touch-optimized version of macOS. In the past decade, it's been built back up, with modern mobile and touch computing as its focus.
Every version of iOS since the initial release has added new features and extensibility of the type we take for granted in modern desktop OSes. Yes, iOS isn't at the point where it has feature parity with the Mac on an OS level, by any measure. But it's not unreasonable to assume that with a few more years of development time, iOS will reach parity with modern macOS. One example of this future is the upcoming Apple File System, which is planned to run on all Apple devices. That's some serious unification.
If the pattern holds, I expect to see a new version of macOS built on the iOS code base, optimized for desktop features, and possibly even running on ARM chips. There's almost certainly a build of macOS as we know it today running on ARM, but if Apple can jettison the same legacy crap on the Mac that they let go for iOS, I don't see why they wouldn't. How many of the issues we deal with on the Mac (not related to outdated hardware) come from fifteen years of accumulated cruft? Or longer, if you count the baggage from the NeXTSTEP days.
None of this is going to happen for a while. We're only a decade out from the last processor transition in Macintosh hardware. While the Intel transition was pretty seamless, a transition to ARM Macs will bring one major hassle: the loss of Windows compatibility. Perhaps if an A15X chip of some sort is powerful enough to do real-time CPU emulation without a huge speed crunch, it won't be an issue. Or, maybe, Windows for ARM will become a thing again. Either way, it'll be a hell of a transition. Apple's done it twice before, though. I'm excited to see how a desktop OS built on iOS would work, even if it still looks like macOS. And I'd put safe money down that it will.