Essays on Technology and Culture

Couch Potatoes of the 21st Century?

Over at The Guardian, John Naughton bemoans the rise of the new, Internet-enabled Couch Potato:

What we failed to appreciate was the passivity of most of humanity and its inexhaustible appetite for consumption, entertainment and “infotainment”. The spread of high-speed broadband connections did not liberate human creativity but instead created Couch Potato 2.0, a creature that sees the internet mostly as zillion-channel TV. In that sense, it’s no accident that the corporations which now dominate network traffic are outfits like Google and Netflix, beaming YouTube and movies to you in the comfort of your own settee.

If the dream of the Internet was for everyone in the world to start making stuff, then the dream was far too big. Most people are passive consumers of media, and it’s been this way since the dawn of media. More people watched plays in Ancient Greece than wrote them. In the late 19th and early 20th centuries, musical instrument ownership was common, but how many instrument owners wrote music, and how many just learned to play popular songs? Instrument ownership is way down from that peak, but there are certainly more people making new music today. If there aren’t more people making it, there are at least more people putting their music out into the world.

True, a lot of what people are creating is distributed through the centralized networks of Facebook, Tumblr, YouTube, Soundcloud, DeviantArt, and Instagram. It’s not quite the same as the corporate domination of previous media revolutions. An ordinary person couldn’t expect to get their idea for a TV show on NBC without a lot of work and a lot more luck. Now, you can film a TV show pilot on your phone and post it on YouTube. If the stars align, you might not even need a network deal to get the audience, and the money, to make more of it. Most of these centralized clearinghouses for (ugh) “content” don’t exert more than the bare minimum of editorial oversight, so anything goes. It’s not the open, democratized, everyone-controlling-their-Internet-identity future that some technologists dreamed of in the late 90s, and perhaps we should bemoan that. Still, you can’t deny that these centralized services take a lot of the pain out of making new things.

There are more people making things than ever before. But they’re not the majority, and never will be. No matter how easy we make it to make things, put them on the Internet, and find them, it’ll always be something pursued in earnest by the sort of people who want to make things, the sort who always have. Beyond that, even creators take their time to be passive and watch Netflix, too. Naughton admits that “the internet of our (utopian) dreams hasn’t ceased to exist. It’s just that it’s becoming a minority sport.” The problem is, making stuff has always been a minority sport. The minority is getting larger, but it’s always going to be a minority. Even in the Star Trek future, not everybody’s writing Holonovels when there are planets to explore. To create isn’t divine, it’s just human, but it’s not the only thing that makes us so.

One Size Fits Some

Though Flappy Bird has passed from App Stores and into legend, debate still rages over what it means for gaming as a whole. In one corner is MG Siegler, speaking for legions of older gamers who see Flappy Bird as a siren call to return games to the simplicity they had in their heyday.

You didn’t need to read an instruction manual to play [games], you just needed to pick up the controller. Once you did this, a few taps and you were off.

-MG Siegler

On the other side is Matt Birchler, who thinks Flappy Bird is just part of the mix of simple and complex games being sold today. I’m taking Birchler’s side on this one, but the fact that we’re even having this debate is proof that gaming is maturing as an art form. Slowly.

Does a game need to be insanely complicated, or insanely simple? That’s more a question of the type of experience you want to have. If you’re looking for something to occupy your time while waiting in line for your latte, Flappy Bird would fit the bill. If you’re looking to occupy an evening, you want something with a little more meat and complexity. This could mean anything from Super Mario Bros. to Grand Theft Auto, to Civilization V. I’m the sort of gamer who prefers whiling his hours away on SimCity and Sid Meier’s Alpha Centauri over run-and-gun shooters and their ilk, but my tastes shouldn’t be everyone else’s.

MG has romantic notions of the dawn of the home video game era. Back then, home systems, in order to be cheap and mass-produced, imposed limits on the complexity of games. Computer games, with access to more computing power, even if over a time-sharing system, could be far more complex. Consider the 1971 Star Trek computer game. There’s nothing simple there. Even some early console games were complex enough that you couldn’t just dive in without reading a manual: Adventure, Metroid, The Legend of Zelda, and others practically demanded you familiarize yourself with the controls if you wanted to get anywhere, but they rewarded you with deep gameplay (for the time). Now the divide is much more distinct, but it’s been there since the days of Spacewar! and Pong.

There are more gamers, more games, and more things to play them on than we’ve had at any time in the history of interactive home entertainment. Gaming is experiencing a shift like the one TV experienced around the time cable got into people’s homes. It used to be that you were lucky to have three channels of programming to watch. To attract the most eyeballs, and therefore the most advertiser dollars, it was to these networks’ advantage to produce shows that appealed to as many people as possible. UHF stations added some competition for eyes, but with the growth of cable TV, broadcast networks found themselves increasingly competing for viewers. On the one end, this gave us miserable reality shows. On the other, it’s given us Mad Men, Breaking Bad, and Boardwalk Empire. Even with more people watching, no single network—broadcast or cable—pulls as many pairs of eyes as the broadcast networks did in their prime, but we now have more interesting programming to show for it.

Video games aren’t at that inflection point yet, but they will be soon. There are enough people getting their first fix of games outside the “traditional” gaming demographic of 18-to-29-year-old males to make it happen. There are audiences discovering gaming that are underserved, and they’re going to want more options. Addictive as Candy Crush Saga may be, eventually some people are going to want to do more than slide around brightly colored icons, but they might not want to shoot cops and steal cars. It’s the indie developers on new platforms like iOS who are going to serve them, much like it ended up being the cable networks that produced much of the great television of the last decade-plus. Their games won’t sell in Super Mario numbers, but they won’t have to. The kinds of games they’ll make are uncertain, but what is certain is that among the best of these new games will be ones where you’ll need to learn how to play. For the right person, the experience will be that much richer.

Popping The Geeky Masculinity Bubble

At the recent TechCrunch Disrupt conference, two teams of hackers presented sexist—to say the least—app ideas. TechCrunch was quick to deliver the typical non-apology apology that has become standard in the business. Their acceptance of blame extended only to admitting they didn’t scrutinize the apps properly before the conference. You can’t tell me that someone saw the name “TitStare” and didn’t think something was amiss. I doubt they assumed it was a birdwatching tool. “Circle Shake” had a more innocuous name, but its content was far from it.

Whenever sexist behavior pops up at a technology event, or whenever another woman in the technology world speaks up about the sexist bullshit she’s endured at conferences and elsewhere, the reaction is always the same. There’s a sheepish apology and a promise to deal with the issue that is never fulfilled. Elsewhere, anonymous crusaders hiding behind screen names take up arms and attack… the people who raised the troubling issue in the first place. Unlike cockroaches, which typically scatter when the light is cast upon them, the particularly insidious breed of sexist that lives in the technology community sees it only as a sign to attack further.

And in the same breath, we often wonder why women are so underrepresented in technology.

Never mind that women are often shunted from birth into
“traditionally female” pursuits, and often discouraged from exploring technology. Even the lucky few women who are able to escape the societal conditioning that technology isn’t for them have to put up with institutionalized sexism from both the Old Boys Club and Young Boys Club that is the tech community at large. [1] When you have a luminary like Dave Winer claiming “[T]here’s something about programming that makes many women not want to do it. Programming is a very modal activity. To be any good at it you have to focus. And be very patient.” something is amiss.

There are certain aspects of the type of people the tech world attracts that help make sexism so pervasive. First, there’s the societal bubble of technology as a primarily male pursuit. With few female voices, technology becomes an echo chamber of men. This becomes deadly when combined with the geek tendency to overgeneralize. [2] Geeky people think in terms of systems and tend to become dogmatic, finding variation difficult to handle to various degrees. (This is a problem I’ve struggled with, too.) So, when a geeky guy who has surrounded himself with fellow geeky guys suddenly sees someone who isn’t like him penetrate his bubble of geekery and masculinity, how does he react? All too often, with fear and hatred.

Even in their bubble, geeky males can still adopt a victim mentality—even when there are real victims being attacked by the same community and tools that could be used for constructive purposes. This mentality manifests itself in awful places like the Men’s Rights Movement, which drives a lot of the harassment of women who dare speak up about sexism online.

I wish I had an answer on how to fix this. Whenever an event like the TechCrunch debacle happens, or when someone like Melody Hensley dares to just be female in a male-dominated space, plenty of decent people step up to make noise and shout down the assholes from their public perch. It’s the private attacks and harassment that make things troublesome, and the fact that the allies of the harassed get more people listening to them than the people they support do makes it even worse. These are symptoms, not causes, however. Though there are more vocal defenders and allies than there were in the past, their support seems only to embolden the forces of sexism. All I think we can do is hope for a tipping point, and continue to force the discussion about sexism in technology—and elsewhere. Expect a long, drawn-out battle.

  1. Very specifically, it’s an Old and Young White Boys Club, but I’m only taking on one aspect at a time, lest this post become too long to expect anyone to read it.  ↩

  2. If I haven’t secured a batch of angry emails by this point, I will now.  ↩

“Better Ways of Living and Experiencing Music”

A few weeks ago, I attended a concert by Savages, a British post-punk revival band in the tradition of Joy Division and early Siouxsie & The Banshees. Along the entryway into the venue, and even in the Men’s room, the band had signs up making a request of the audience:

A Note from Savages

Nice typography. Awful photography.

Though not a requirement, the signs were effective enough that I saw far fewer phones and cameras out in the crowd at Terminal 5 than I had at most shows in the last few years. I kept my phone in my pocket for the entirety of Savages’ extremely intense set, though it took a lot of willpower at times.

A couple of weeks later, I was back at Terminal 5 for They Might Be Giants, who had less of an issue with photo takers and phone sharers. They even asked their Twitter followers to post pics of themselves in the crowd, pre-show, and re-tweeted them. (Though they didn’t retweet my picture…) At a previous TMBG show, John Linnell even snatched the camera from an audience member’s hand, recorded himself playing keyboard for a few seconds, and handed it back.

It was an interesting comparison. Looking back, I was more involved in the Savages performance, and not just because I got swept up to the barrier by the surging crowd during the song “Husbands”. Not taking and sharing pictures meant that I wasn’t looking down at my phone to see if a shot came out right, to set the right vintage-looking image filter, or to see if the upload failed due to spotty reception inside the venue. With nothing taking me away from the show, I was as present as I could be, something I’ve missed out on in the past.

“Our goal is to discover better ways of living and experiencing music.”

People have been taking photos and video at concerts as long as it was possible to bring cameras into venues, whether the venue allowed it or not. It’s just easier to do now. There’s always been a desire to immortalize a temporary experience in something that can last forever: a photo, a video, a souvenir. Those have always been for ourselves, however. Now, when we immortalize something temporary, it’s for the world, or at least all of the people we know.

That changes things. The experience is no longer personal. It becomes instant nostalgia, and that’s not a crack on image filters. We become concerned about how our friends (and “friends”) feel about the cool thing we’re doing, and not the actual cool thing we’re doing. That concern takes up a part of our attention that can’t go to the actual event unfolding in front of us. On attention alone, it diminishes our involvement in what we’re experiencing. We lose something this way.

I think at the next show I go to, I’ll take a couple of pictures early in the set, then shove my phone in my pocket until the house lights come back on. Those pictures won’t be shared on any social media service until I’m home, have looked them over, and have decided if any are worth even keeping, let alone sharing. I’d prefer to be more present for the concerts I attend from here on out—or any other experience I’m having. I don’t know if it’s a better way of living, but being fully there is certainly a better way of experiencing. And it goes way beyond just rock concerts.

For the curious, here’s the music video for Savages’ song “Husbands”. (Contains some disturbing imagery.)

Fear of a Beeping Watch

Thursday evening, I had an odd experience on my ride home on the subway. I was sitting on a crowded E train, next to a middle-aged white woman who was sound asleep. It’s always amazed me when someone manages to fall asleep on a crowded subway train during rush hour. Somewhere along the ride, the woman’s wristwatch alarm went off with the standard Casio digital beep. She didn’t wake. The alarm kept beeping, and for a moment, I was scared. I could feel the fear from the people around me on the train, as well. A simple, beeping wristwatch on a crowded train. Was this a portent of death?

If you’re reading this, you know the ending. Nothing happened. The watch stopped beeping, the woman kept sleeping, and everyone around her returned to their own world. Even I forgot about the incident until I got home and had a chance to breathe and think back on my day. I’ve lived in New York City for only nine months, but I’ve been a city dweller for the majority of my life. There are more likely threats than a subway bomb, and I know it from experience. One Christmas Eve, on the subway in Philadelphia, I was the victim of an attempted mugging for my phone. [1] Why, then, should a beeping wristwatch alarm scare me, or anyone else on the train?

The Casio F–91W is ubiquitous: an inexpensive digital watch with classic styling. I’ve owned three, at least once because buying a whole new watch was cheaper than the effort of replacing the band. The Casio F–91W is also easy to turn into a timer for an improvised explosive device. The bombs used in the London subway bombings in 2005 didn’t use watch-based detonators, but it’s not impossible for a future attacker to try it. I don’t know how many of my fellow passengers knew what I knew, but the shared moment of fear as the alarm beeped confirmed that my paranoia wasn’t exclusive.

The denizens of any dense, urban environment go about their day living in their own little bubbles. We’re all less worried about threats to our lives than we are about the little threats: whether we can make the rent, whether our boss is planning to fire us, whether tickets to that show are sold out. We don’t think about the possibility that something could happen to us, by accident or by malice. We’re inured to the idea of our own safety. Perhaps this is for the best. We could be afraid of anything, but we choose not to be. It takes something serious to pop that bubble. If something does, and it turns out to be a false threat, it hones our sense of what to really be aware of.

I choose not to live in fear. I don’t always succeed.

  1. The kid who tried to mug me didn’t get away with anything, but he did break my headphones and damage the jack on my phone.  ↩