Everything new has its detractors. It was Douglas Adams who said something along the lines of “Anything invented between when you were 15 and 35 is new and revolutionary and exciting, and you’ll probably get a career in it. Anything invented after you’re 35 is against the natural order of things.” Adams was right in sentiment, though the numbers are more rough guideline than rule. There are plenty of young detractors, and plenty of older early adopters. It’s really a question of mindset.
We see this at play whenever a new, “disruptive” technology hits the scene. It’s one thing to be a healthy skeptic, and if technology is going to be a major part of your life, it’s to your own benefit to maintain a certain skepticism before adopting anything new. It’s the only way to keep a level head, and not get suckered in by the pundits, paid shills, and hype any new thing brings. Ask anyone who has gone in blind on the latest new thing and gotten burned whether they wish they’d listened to someone with a different opinion.
Maintaining healthy skepticism, however, can easily get you lumped in with the naysayers and doomsayers, the sort who are in Douglas Adams’s second category. These are the sorts of people who say “the Internet is destroying human interaction,” or “eBooks are the first step to Orwell’s dystopia.” There is always a kernel of truth to these outlandish claims, going back to Plato’s claim that the written word would result in people becoming unable to remember things. It’s not that, with writing, we can’t remember things, it’s that we often don’t need to—a subtle, but important, distinction. Claims of information overload and decreased attention, maladies associated with the Internet, actually go back to the rise of the printed book, if not before.
Behind such claims are various motivations, ranging from sheer skepticism gone wild to fears of losing control over certain segments of the populace. Some are founded, some are not, and yet we live on. Separating the valid arguments from the invalid, or frequently the absurd, is a task that burdens all of us, and it is very easy to fall victim to skilled rhetoric. Speaking of the Internet and isolation, all one needs to do is point to your prototypical computer geek—an introverted, quiet man spending hours in front of a glowing display—and your point is made. You could just as easily use that same prototypical geek as an example of the opposite, pointing to his rich social life communicating with people all over the world, just from behind a keyboard.
I’ve heard this reply myself: “But, those people don’t exist!” If so, then who’s typing the messages? Even a Markov chain generator needs input. We often use technology to do things we’ve always done, just in new ways that seem alien by comparison. Writing with a pen and paper isn’t a massive leap from styluses and wax tablets. The principles behind a typewriter are not difficult to understand. Pushing buttons on a piece of metal and seeing glowing letters appear on a display with no physical connection to those buttons is.
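The quip about Markov chains can be made concrete with a toy sketch (the function names and corpus here are my own illustration, not anything from a real system): a Markov text generator is just a table of which words follow which, built entirely from human-written input. Starve it of input and it has nothing to say.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the source text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from a start word, picking successors at random."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no human-written input means nothing to emit
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "we still talk face to face and we still write by hand"
chain = build_chain(corpus)
print(generate(chain, "we"))            # babbles recombinations of the corpus
print(generate(build_chain(""), "we"))  # empty corpus: just the start word
```

Every word the generator can ever produce was typed by a person first; the machine only shuffles what people fed it.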
When you think about it that way, a great many of the doomsayers’ arguments fall flat. A new way to do something can displace the old way, but it doesn’t remove it completely. We still talk face-to-face, in the real world, and no clever application or piece of hardware will ever undo that. The simplest technologies and the most complex often exist side-by-side. It’s the solutions in between that get knocked aside by whatever’s new, and then only if the new thing is truly good enough. The difference between a skeptic and a naysayer is that a skeptic can be convinced by evidence. A naysayer can’t.