Essays on Technology and Culture

Quantified Time and Knowing Thyself

Malcolm Gladwell’s Outliers popularized a figure for the time needed to become an expert at any task: 10,000 hours of deliberate practice. Not 10,000 hours of simply performing a task, but 10,000 hours of practice, which is work that tests skills, finds and breaks limits, and helps improve your craft. Don’t confuse the two. Gladwell’s 10,000-hour figure has been tossed around a bit, including in So Good They Can’t Ignore You, which is why it’s fresh in my mind.

Elsewhere in the book, Cal Newport writes about venture capitalist Mike Jackson, who tracks how he spends his day in an Excel spreadsheet, allocating the time he spends on the various aspects of his job. The point of the spreadsheet is to help Mike “become more ‘intentional’ about how his workday unfolds,” and it caught my attention. How many of us know exactly how much time we spend on our work each day? Freelancers with excellent time tracking and invoicing software need not answer. [1] I immediately thought of the Quantified Self movement, whose adherents wear sensors and use gizmos to track nearly everything about their bodies, and often beyond. While there’s plenty of valid criticism of what the Quantified Self movement will do to the lives of its adherents, there are plenty of practical applications for QS technologies to improve our work.

In my own life, working for a startup with no set hours and the ability to define my own workday to a large degree has left me more than a bit unanchored. Building some structural scaffolding into my day would help me greatly. At the very least, it would keep me from crawling out of bed at 10 AM and getting to work around “lunch time”. I’ve been increasingly eyeing Quantified Self technology, but what I want to quantify most is how much time I spend doing what I need to do, and how much time I spend doing… something else. There are plenty of tools out there, high-tech and low-tech, to assist, ranging from RescueTime to “The Unschedule,” which has you assign yourself defined periods of free time.

Data is relentless. The mantra at any web-based startup is “test, track, and quantify.” It’s important to see what works and what doesn’t, and to iterate mercilessly to improve what does. This takes time, data, and analysis. Can we not apply it to our lives too? The tools we use, with their endless potential to distract us with alerts, status updates, text messages, and more, also hold the potential to help us improve ourselves. Let’s quantify not ourselves, but our time, so we know what we should be focusing on and what we should not. The answers will vary for each of us, but if we know that, for example, we’re blowing an hour a day checking status updates on social networking services at work, we might be able to stop, and leave work an hour early without guilt. Alternatively, that’s an hour reclaimed to put towards becoming an expert. We have the tools. Let’s use them.
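The tallying itself is trivial. As a toy illustration (the categories and minutes below are invented; a tool like RescueTime would collect the raw log for you), a few lines of Python can total a day’s activity and show where the hours actually go:

```python
from collections import defaultdict

# Hypothetical log of one workday: (category, minutes) pairs.
day_log = [
    ("writing", 90),
    ("email", 25),
    ("social networking", 20),
    ("meetings", 60),
    ("social networking", 15),
    ("writing", 120),
    ("social networking", 25),
]

def tally(log):
    """Sum minutes spent per category."""
    totals = defaultdict(int)
    for category, minutes in log:
        totals[category] += minutes
    return dict(totals)

# Print categories from most to least time spent.
for category, minutes in sorted(tally(day_log).items(), key=lambda kv: -kv[1]):
    print(f"{category}: {minutes} min")
```

With this invented log, “social networking” totals a full hour, which is exactly the kind of number that makes it easy to decide what to cut.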

  1. Actually, if you have any suggestions for good time tracking software that works on the Mac and iOS, please e-mail me. I don’t need to write invoices.  ↩

The Un-Quantified Self

It’s possible now, for only a few hundred dollars, to learn nearly everything about your body. You can get a fitness tracking device to monitor how much you move, how well you sleep, and how many calories you burn. You can get a Wi-Fi-enabled scale to track and plot how much you weigh. You can get a Bluetooth monitor to measure your pulse and your blood pressure. You can get an app that pulls all of this data together. With all these tools in hand, you can generate the data you need to make a Nicholas Felton-style annual report on yourself.

Proponents of the Quantified Self movement suggest that measuring everything you do is a pathway to better health and a better life. I’m not going to say they’re wrong, either. The facts say they’re right. If you can see that every time you go to bed at three in the morning, because you were out at the bar the night before, you wake up late and feel terrible, you can decide not to go to bed at three in the morning. This is an extremely simple and reductive example, but the point remains: knowing facts about yourself and your body gives you an extra layer of perspective for making behavioral changes. And the technology is now reasonably priced, within reach of anyone who can afford a couple hundred dollars in gear.

This works well if you’re the sort of person gamification is made for. In the parlance of role-playing games of old—the sort with pen, paper, and character sheets full of numbers representing stats—someone who strove to max out their character in every aspect was known as a munchkin. There are plenty of Quantified Self adherents who are in it just to know more about themselves, but I worry that the Quantified Self movement may cause the same phenomenon to grow in the real world. I don’t think it’s a huge leap to imagine it, either. Go to the gym, look at the muscle-bound guys showing off for each other, and ask yourself whether the munchkin label might not apply.

Of course, there are also the parts of the self that aren’t so easily reduced to numbers. You can know the number of steps you walked, but plenty about us is more nebulous. To be blunt, you can’t quantify “happiness”. It lacks a scale, or even a baseline. What does it mean to really know oneself? Quantifying the quantifiable is a start, but the only measurement that counts in the end is how you feel. Hacking the body is one thing; hacking the mind and the soul is another. One day, perhaps, the data scientists will team up with the philosophers—there are enough of both looking for work—to build the tools needed for the Quantified Soul movement. Then, things will get interesting.

Learning to Write Code All Over Again

When I went off to college for the first time, I was set on studying Computer Science. The plan was to learn how to program, and maybe get a job creating video games or even start my own video game studio—a dream shared by almost every geeky teenager in the early 2000s, I suspect. Three semesters later, I decided English was more to my liking, and that the higher math required in a CompSci program was beyond my grasp. My grades in CompSci weren’t great either. I passed my first course by the skin of my teeth, and by begging my professor to regrade a misgraded assignment.

I got started with programming like many people my age, with QBASIC, though my first real exposure to programming as a concept came from Apple II LOGO in grade school. In QBASIC, the furthest I got was discovering how to write trippy graphical screen savers using random number generators and drawing primitives on screen. In 8th Grade, I got ahold of a copy of Microsoft Visual Studio 6 that, er, fell off the back of a truck. [1] In high school, I switched to Visual Basic, thinking it would be the best step up from QBASIC.

There were two pet projects I had in high school. One was a simple game, “Whac-A-Mac,” where the player had to click to smash Macintosh computers that popped up on screen, in the style of Whac-A-Mole. I completed this, but it was a bloated, ugly mess. The next project was “NerdQuest,” an RPG inspired heavily by System’s Twilight, a Mac-only puzzle game. This one never got off the ground. My understanding of how to do graphics in Visual Basic was non-existent, as illustrated when a friend offered me the code to an isometric graphics engine he had written in Visual Basic. One look at the code made my head hurt.

I had exactly two successes as a teenage programmer: a Visual Basic database app on the various animal kingdoms that I wrote (barely) for a high school biology class, and teaching myself TADS, the Text Adventure Development System. Ultimately, I focused on learning HTML and CSS, with a tiny bit of JavaScript, and enough PHP to design WordPress themes and be dangerous. I did, with help from a book, write a very basic PHP and MySQL blog and database for Booji Boy’s Basement, but after a previous webhost was hacked, I lost the source code.

Lately, however, two things have me thinking it might be a good idea to try to learn how to program again. The first is that I am now part of the technology startup economy, albeit in a non-programming role. [2] The other is listening to tech podcasts like the now-defunct Build and Analyze, Back to Work, and Quit!. A regular topic of discussion on all of these shows has been learning to program, working for yourself, or changing your role in a job to a technical one—and doing so without expensive education.

The barrier to entry to a programming job has been lowered, significantly. If you have the chops, and the code on GitHub to back them up, you can find work. Xcode, the tool used to create software for the Mac and iOS, is free. Registering as an iOS developer costs $99. There are free and paid classes on the languages and libraries that make our modern tech go, including JavaScript, jQuery, Python, Ruby, and Objective-C. It couldn’t hurt me to learn. So, the other day, I downloaded Xcode, installed a Python development environment, and reinstalled MAMP, with the full intention of getting back into programming, at least as a hobby. We’ll see how it goes.

  1. There was a minor underground economy of software trading in my middle school, which was also a high school. This was back in the days when the fastest home internet connection was a 128kbps ISDN line, and BitTorrent didn’t exist. It was a miracle we even had these CDs of Photoshop 7, Bryce 3D, and Visual Studio floating around.  ↩

  2. Creating e-mail newsletters involves no programming. It’s simply HTML and CSS, albeit HTML and CSS done with standards circa 2002. Thankfully, I haven’t needed to use spacer GIFs. Yet.  ↩

Fear and Consequences

I’ve spent the last few days living in fear. I’ve been afraid of the consequences of a mistake I made, and a very dumb mistake too. It has the potential to ruin my professional life, and more. What sets this mistake apart from the countless others in my life is that it could affect a lot of people, and their careers as well. Forgive the vagueness, but the nature of the mistake, its potential impact, and the audience of this site make me wary of going into details.

Fortunately, it seems that the worst has passed. I’ll find out for sure in a week or so. Talking with people aware of the issue has also helped me gain some perspective—it may not be as bad as I have made it out to be in my head. Still, worrying is what I do best, and it didn’t help that this whole thing coincided with a bout of illness. It’s affected my sleep, my appetite, and my ability to write. [1]

There’s talk about failing upwards, the idea that the people who make mistakes go on to better things because they become known. I’ve written about owning your own failures. Neither idea was any consolation in the late, worrisome hours, or during the hours of busywork fixing my mistake before the Sword of Damocles could fall. I’ll be satisfied when I know the end result for sure, but I’ve done all I can, and as far as I can see, it looks to be safe. What I’m not sure about is how far I’m able to see.

  1. Which is why I’m posting this late.  ↩

Give Yourself Over to the Failure Inside of You

Give yourself over to the failure inside of you, and let it envelop your soul.

Failure is not out to get you. Failure wants to be your friend, the one you can count on when success, that is ever elusive, eludes you.

King Missile III – “Failure”

Before you read this piece, take a few minutes, and listen to this song.

My father always told me that you only own two things in life: your failures and your word. You can’t even take credit for your successes, because someone else will try to hog the spotlight. Nobody wants to claim someone else’s failures, though. You’re stuck with them. That’s what makes us so terrified to fail. As for your word, you own that too, because if you fail to keep your word, well…

Problem is, being afraid of failure keeps us from doing pretty much anything. We only manage to do the things we’re confident we won’t fail at. I can be reasonably sure that if I walked to the kitchen and got a cookie, I could eat that cookie successfully. If I went into the kitchen and tried to make a batch of cookies, I’m not so sure. I don’t know if I have the ingredients, the tools, or the time. I don’t know if I can remember a good cookie recipe. So, if I were to go into the kitchen, blind, and try to make cookies, I would almost certainly fail.

This is a slightly absurd, comic example, but the same could apply to almost anything. Failure is a permanent state only in the sense that you can’t change the past, and that you have to deal with the consequences. Consequences may last a long time, or they may not. Dirty dishes and carbonized “cookie” on a baking sheet are temporary. Unless you’ve done something particularly terrible or outright criminal, there’s no “Permanent Record” to dog you. Only yourself.

Failure wants you to get over your fear of failure—and what better way to do that than to fail and fail again?

If at first you don’t succeed…

How many times have you tried to do something and failed? Anything. Small things. Big things.

It’s probably a very big number, unless you don’t do anything.

But, what do you take away from those failures?

First, you know what not to do next time, which will reduce your chances of future failures. After all, as Merlin Mann says, “The beginning of expertise is going ‘Oh! I’ve dealt with this problem before!’” Often this means “I’ve tried to do this before, and I know what doesn’t work.”

You have to own your failure, because only by doing so will you recognize the moments when you’re about to fail in exactly the same way you failed before. You can still fail in an entirely different way, but that’s okay. Pick yourself up, wipe yourself off, and fail again in another way.

Or, perhaps, succeed.

But most likely not. And that’s okay.

So befriend it. Make love to it. And believe in it with all your might. Because failure is all there is for you.

(With apologies to John S. Hall.)