Sanspoint.

Essays on Technology and Culture

A Battlefield Promotion to Social Media General

Recently, I inherited the role of managing social media and community building for the startup I work for. It happened quite suddenly—my boss found another job, and I got a battlefield promotion. As I’ve tried to find my bearings in the short time I’ve had this role (about 48 hours as of this writing), I’ve had one main, nagging concern above all the others, even the “can I actually pull this off?” question. That concern is: how can I do social media and get people to use our product without being the canonical “Social Media Douchebag”? I’ve been paranoid enough about it that I even Asked MetaFilter.

The fears are twofold. First, a major part of our product’s target audience is not exactly tech-savvy. We’re not quite talking the “have my secretary print out my email” types, but that’s not far off. Social media and social networking are not things they’re terribly up on. If they have a profile anywhere, it’s probably on LinkedIn. And, to come clean here, I don’t know much of anything about how to reach people on LinkedIn. The second fear is that, if I push too hard, or do the wrong thing, I will destroy any good will my company and its product have earned in this community.

By way of example, a user of our product sent us a very angry email after we linked to him in a broadcast that we send out to all of our users each week. It was merely his name, a link to his profile, the firm he works for, and his title—all required, public information on our product that anyone could see. The pull quote from his email is “I do not do business that way.” If so, then he clearly signed up for the wrong service. This is comparatively minor, but it’s a good example of the sort of slightly paranoid, slightly technology-phobic audience I’m looking to woo.

There are plenty of people who love us and what we do, but more who have only heard of us and don’t know what we do. I’m making it my job to increase the number in the first column. However, I want to do it in an ethical, sane way that steps on as few toes as possible. This may be the initial problem: one way to know you’re succeeding is when you’re making people upset. It’s possible that my fears in this regard are an obstacle. Maybe I should ruffle a few feathers. No such thing as bad publicity, right? On the other hand, if I’m doing something I’ve explicitly railed against on this site, or elsewhere, [1] there’s an “it’s okay when I do it” moral relativism that chafes me when I see it in others.

I think it’s possible to do this. It’s not boiling the ocean. We can get more people actively using our product, win paying customers, and maintain a good reputation, but it’s going to take some thought, some time, and some patience. I have all the tools I need at my disposal, including a small number of people who love our company and product passionately. Reddit co-founder Alexis Ohanian made a brilliant observation about those passionate users of any sort of technology product in the modern era: “They are your website, not you.” The trick is to find them, and listen.


  1. Which I’m not, and haven’t. I’m just covering my bases here.  ↩

Merlin Revisited: Workflows with Merlin Mann

It’s Christmas time for Mac Nerds. The fourth annual Mac Power Users workflows show with Merlin Mann is live. [1] Though I’m not a regular listener to MPU these days, Merlin’s annual visit always provides me with a ton of inspiration to revisit, review, and improve my workflow in so many ways. In this particular installment, Merlin, Katie, and David talk up Evernote, text-based workflows on iOS, and some great GTD meta-stuff that really got me thinking about how I manage the things I have to do.

Take some time, today, and give it a listen. It’s always a good idea to find out how to do the things you do better. Thinking about the things we do over and over, the patterns we discover, and ways to break them and form better ones, is essential to how we use technology in our lives. And, when someone who’s great at that takes a couple of hours to share what they’ve learned with us, it’s worth the time to listen.


  1. If you have a lot of time to kill, it’s worth checking out the three previous episodes.  ↩

The Ratholes of Technology

For a brief, shining moment in the 90s, multimedia and CD-ROMs were the hot technology. They were going to change the world by putting knowledge in people’s hands and mixing sound and video with text in a new, interactive way. Everyone jumped on the bandwagon, from traditional software companies like Microsoft, to musicians as diverse as Laurie Anderson, Devo, The Residents, Queen, and Yes. A computer couldn’t be sold unless it had a 2x CD-ROM drive, then a 4x or 8x, along with a sound card and enough RAM to make the darn thing go. Then, suddenly, this Internet thing showed up in people’s homes, and the interactive, multimedia CD-ROM became a curious, forgotten relic.

Grolier’s Encyclopedia was my first exposure to the idea of multimedia. In my middle school’s computer laboratory, on a creaking old 486 that was, even for 1995, behind the times, I got my first taste of the future, watching a grainy, blocky video of the Hindenburg disaster. Around that time, my own aging 486 got an upgrade with a 2x CD-ROM drive and a SoundBlaster 16, followed by a hard drive and RAM upgrade to offset the increased load on the machine. Waking up one Christmas morning and unwrapping a five-CD-ROM Encarta set, with encyclopedia (on two discs!), atlas, dictionary, and timeline, was amazing. Now, I can get all of that through Wikipedia and its sister projects in less time than it would have taken to put in the CD, let alone run the software and search.

Technology is an iterative process, and when looked at with enough distance and hindsight, it seems like a fairly straight line of progress. We went from teletypes and punch cards to video terminals and command lines, then to GUIs. We had 8086 processors, then 8088s, then the 286, 386, 486, Pentium, Pentium II, III, 4… It’s easy to lose sight of what was proposed as the next big thing, then fell by the wayside: the Amiga, OS/2 Warp, CD-ROMs, VRML… If you zoom in closer, the history of technology is more of a finely combed branch, or feather, with a lot of terminating dead ends, where we thought something would be great, and it wasn’t, or something else came along and proved to be better. Technology is full of little ratholes like these.

The Internet predated CD-ROMs, in one form or another, but in the days when the best bandwidth you could hope for meant an hour to download a decent-quality image, CD-ROMs had the advantage of (near-)immediacy. Put the disc in, launch the app, wait for the drive to spin up and seek to the file, and suddenly, you had a video of a burning zeppelin on your screen. Unless you were on a college campus and lucky enough to have access to a T1, or rich enough to pay for ISDN, video over the Internet was a laughable concept.

But things got faster: 33.6k modems, 56k modems, home broadband that outstripped a standard T1 connection. Suddenly, it may not have been faster to look something up on the Internet, but it was fast enough that the flaws of a CD-ROM became apparent, not just in terms of seek time, but also clunky interfaces and high system requirements. The legacy of the multimedia CD-ROM now lies in landfills, artful mobiles of microwaved CDs, and every Flash-based website for a company or restaurant that insists on audio, video, and custom UI controls to display something that could be communicated faster and more easily with text and images.

It’s why I try to maintain a healthy skepticism when someone proposes a new technology that will change how we do things. Whether we talk of ubiquitous motion-based computing, vis-à-vis the Microsoft Kinect, or just NFC-based payment systems, I can’t help but think that these are ideas that look really cool in demo videos, but prove difficult, at best, to implement in the real world. Something already out there could suddenly catch up and surpass the newest, latest thing. Something else newer, later, cheaper, and faster could nail its pitch to a venture capital firm and get a ten-million-dollar investment, seemingly out of nowhere. It’s happened before.

Quicksilver 1.0.0, or: You Can Go Home Again, but Do You Want To?

Quicksilver made me switch to the Mac. It wasn’t the only thing. I’d grown frustrated with Linux, and saw people I respected jumping to the Mac with gusto, but it was Quicksilver—and Merlin Mann’s breathless exaltation of it—that got me to take the $500 I’d saved for a laptop, and buy a Mac mini instead. Quicksilver changed how I thought about using my computer, and I’d been using a computer since the days of MS-DOS 5.0 and Windows 3.1. It was a veritable Swiss Army Knife of functionality, and I could make my computer practically dance with a Command-Space and a few keystrokes.

But Alcor, the brilliant mind who gave us Quicksilver, had to put it aside, especially once he was snapped up into the bosom of Google. He at least thought to open-source it, and, in time, developers came to breathe new life into the app.

Just, not fast enough for some of us.

A couple of years ago, Merlin Mann saw the writing on the wall, and switched to LaunchBar. I saw the same thing, and followed suit. LaunchBar was soon joined by Alfred in dominating the space that Quicksilver had made.

It’s been ten years, and Quicksilver’s new developers have finally put enough into it that they’ve taken it out of beta. I decided to give it a try, wondering if I could, truly, go home again. A few minutes after downloading the app, configuring it, installing the assortment of plugins I needed to replicate my LaunchBar-based workflow, and adding my long-missed BezelHUD plugin, I got down to work.

Everything I loved about Quicksilver was there. So was everything I hated, as well as a few new things. I freely admit that most of those new things were related to plugins, such as the Disk Images Module, and the Things Module, which hadn’t been updated since 2009. These modules were 32-bit, and wouldn’t run on my 64-bit machine and 64-bit Quicksilver install. Disappointing, but understandable. The modules were third-party creations, and the new developers didn’t have access to the source code to recompile them. The onus would be on the unknown developer of the Disk Images Module, and on Cultured Code for the Things Module, to bring them back to life.

Another disappointment came when trying to use Quicksilver to control iTunes, something I use LaunchBar for a lot. In LaunchBar, I could just hit Command-Space, start typing a song, an album, an artist, or a playlist, and have it come up. In Quicksilver, I had to type “it” for iTunes, right-arrow into it, type or arrow to the appropriate “Browse” item, and then type the artist, album, song, or whatever I wanted. Attempting to play an album through Quicksilver resulted in a long beachball, rendering both Quicksilver and iTunes unresponsive for several minutes. [1]

In fact, most of my interactions with Quicksilver seemed more complicated and involved than in LaunchBar. This seemed quite odd, as, back in the day, Quicksilver made my computer dance. Perhaps in the years between using Quicksilver and LaunchBar, I adapted to LaunchBar’s own quirks, such as pulling items and using Command-key shortcuts to manipulate them, versus Quicksilver’s noun-and-verb command structure. More likely, I think, is that the LaunchBar way of doing things is actually faster. Fundamentally, it’s a religious difference. LaunchBar is about learning commands, and muscle memory. Quicksilver is about thinking of sentences, and machine memory. I could sit and use Quicksilver for another year, reteaching it how I think and work, but that’s time I can better spend… working.

Make no mistake, Quicksilver is a beautiful app launcher, and it works great. After clicking through a few options and installing a new interface plugin, I felt, for a few minutes, like I’d never switched to LaunchBar. I’ve never known nostalgia for a mere utility application like the nostalgia I have for Quicksilver. Perhaps running DOSSHELL would be as nostalgic, but in 2013, running DOSSHELL would accomplish nothing but that nostalgia. Somewhere out there is a user for whom Quicksilver 1.0.0 is exactly the app they need to do their work better. It used to be that way for me. It may be again, but for now, I’ll be staying with LaunchBar.


  1. The Quicksilver team tells me that this is a problem related to changes in OS X 10.8.3. I believe them, but for me, this was a showstopper.  ↩

Inertia, and Why We Hate Technological Change

Those of us who think a lot about the technology we use become very resistant to change. Call it inertia. We know our operating system of choice, our apps of choice, our hardware of choice, and damned if we’re going to try something else, because we know this and we don’t want to learn it all over again. People who don’t put a whole lot of thought into technology show the same symptoms. Why should they switch, if the thing they’re using works? All they care about is whether they can do what they want to do. These two groups exist on a spectrum that twists around and is joined at the ends like a Möbius strip—dyed-in-the-wool, neckbearded open source enthusiasts sitting back to back with grumpy corporate CTOs on decade-old Windows XP boxes, neither one knowing the other sits next to them. All they see is the shades of gray fading off in front of them.

A radical change in an established technology will be immediately polarizing to anyone deep in the technology world. The reactions to the original Mac and the reactions to Windows 8 are the same sentiment expressed by different populations, all boiling down to “It’s new, it’s different, and I’d need new software to use it.” However, if you’re wondering why sales of iPhones and iPads have gone through the roof, while the Mac’s market share has grown much more slowly, consider this: while the smartphone and the tablet were not new, per se, the iPhone and iPad were the first of these devices designed for the common user, not a geek or a sales guy who needed email on the road. For all intents and purposes, the iPhone and iPad invented the smartphone and tablet spaces, though “re-invented” is more accurate from a technology standpoint. They’re so far removed from the Palm Treo and Windows for Pen Computing tablets as to be a new thing entirely—and even then, they had grumpy detractors.

Microsoft’s use of a traditional desktop UI on smartphones and tablets in the early 2000s may well have been guided by the idea that a user would already be familiar with it from their PC. “A user already knows to click the Start button,” someone at Microsoft thought, “so they’ll know to do that on Windows CE.” This was true, but it ignored the possibility that there was a better way. I’d love to see the design process behind the Windows CE interface, and whether anyone suggested a PalmOS-like grid of icons, or some other, new UI convention. It wouldn’t necessarily have been better, either because new UIs aren’t always better, or because the hardware couldn’t make it work—though more likely through inertia—but it would be nice to know someone tried.

Institutional inertia is simply user inertia, writ large. In a technology company, you (with luck) have people using the product they create. There’s a term for this: “eating your own dog food.” The benefit of eating your own dog food is that you can find ways to improve it by scratching your own itches. [1] This, of course, assumes the organizational structure around the product actually allows for that sort of thing, something you’ll see more of in young, small companies than in big ones with layers of management, [2] but this is not always the case. It’s the ones furthest from the metal, as it were, who react to changes with rancor, and those people can be inside the company as well as outside.

People complain every time Facebook changes its interface. They complain when Apple changes the iTunes UI. Every change breaks somebody’s workflow, even if it’s a bugfix. It’s psychology, and the further you go towards the joint on the Möbius strip, the more adamant you’ll be that things stay the same. Meanwhile, in the creamy middle, there’s a bunch of people who, perhaps with skepticism, evaluate what’s new, and make the jump. Some even go back if they find they made the wrong choice. It may take more goading for some than for others, but everyone who isn’t on the extremes can at least adapt to something different when they need to, or find something that truly is better. Life in the middle is much more interesting than on the extremes of stubborn technological inertia, but for many people that’s exactly why they stick where they are.


  1. Yes, the metaphor goes completely bonkers here. Sorry.  ↩

  2. Cf. The Peter Principle and The Dilbert Principle.  ↩