After the hackathon was over, Maurice gathered every team member’s contact information and wrote it all down on the back of the business card I gave him. Later that night, I received a phone call from him, and he asked if I would be able to teach him how to code. I love teaching, and Maurice seemed like a nice kid, so I offered to help. He immediately said, “okay, go,” and awaited instruction. I explained that I wouldn’t be able to teach him over the phone, but then I found out that he didn’t have internet at home. If that surprises you, you should know that this is actually a bit more common than you may think.
This is why the so-called “meritocracy” of technology is bullshit. When someone is so poor that they can’t afford internet access, or even a computer, how can they learn to program in the first place? Tech is only a “meritocracy” if you are privileged enough to have the first rung of the ladder within reach. Few kids are as lucky as Maurice, who has people helping him.
People can say a thing on Twitter thinking they are being clever or funny, seeking attention and recognition for their clever funniness. But sometimes when seen from another perspective, the thing they have said makes them come across in a less than positive way—and on Twitter the leap from doing something mildly objectionable to being considered by many to be a colossal scumbag is very short. This in itself can create problems, as the rejection of one faction can shove people towards others. A person might feel like a bridge has been burned before they even got to cross it, so maybe they’ll just saunter off to hang out with some actual colossal scumbags. The process of groups aggressively rebuffing people who do not immediately measure up to their standards can be damaging in the longer term.
Anger, outrage, and hatred all existed long before Twitter. Nobody disputes this. The problem is that Twitter and most other social media, right down to website comments, are designed to promote quick bursts of emotionally charged responses. That’s what “wins.” Twitter doesn’t reward thoughtfulness, consideration, or even moderation.
Phil Hartup offers some thoughtful suggestions for direct action that can make the healthy debates Twitter is capable of hosting more likely to occur. We should all take them up.
I am good with computers, so I know these ways. I understand the conventions of computing. I understand them because that’s what I’m interested in and where I’ve spent my time. I enjoy computers and what I can do with them.
But not everybody does.
In fact, I’d wager most people don’t enjoy their interactions with computers. They’re confusing. Why? Because they’re unpredictable. Doing the same thing over and over doesn’t always produce the same result.
One of the reasons tablets and smartphones are becoming popular ways to use technology is that they’re far more predictable. Less cognitive overhead and a consistent user experience make these devices much easier for ordinary people to grasp.
Part of maturing, I think, is realizing that charges of acting in bad faith are often themselves made in bad faith, an attempt to explain away gaps in understanding between two people rather than trying to bridge them, or even make peace with them. That’s as true in politics and in relationships as it is in music, but in music—arguably the strangest and most subjective art form there is—the best option often is “make peace.” Not everything is for you, even you of eclectic tastes and voracious listening appetite.
As passionate as people can be about what they love, they can have equal passion about what they don’t. Back in the Crush On Radio days, we often talked about bands and music we should have liked “on paper” but that failed to click. There was never judgment—we were there to share what we love, but nobody was wrong for loving something that wasn’t for the rest of us.
Something we all should learn, and it extends to so many things in life.
Those who thrive off social contact will love a wearable, but those already overwhelmed by Facebook and texting will find it tears at their solitude and sense of self. Both will be, in part, right. The device may be new, but those hopes and fears are old.
The whole piece is an interesting read about the role watches have played in culture. (Funny how wristwatches were originally a “feminine” thing, considering how the smart watch is so masculinized through its size.) I highlight the end of it here, because it encapsulates a big chunk of my skepticism.
The Platonic ideal of wearable computing is omnipresence. At least I can put my phone in a drawer or on its dock on my desk. Once the battery problem is licked, and I have a communications device physically strapped to my body almost all the time, what happens then?