For a brief, shining moment in the 90s, multimedia and CD-ROMs were the hot technology. They were going to change the world by putting knowledge in the hands of ordinary people, mixing sound and video with text in a new, interactive way. Everyone jumped on the bandwagon, from traditional software companies like Microsoft to musicians as diverse as Laurie Anderson, Devo, The Residents, Queen, and Yes. A computer couldn’t be sold unless it had a 2x CD-ROM drive, then a 4x or 8x, along with a sound card and enough RAM to make the darn thing go. Then, suddenly, this Internet thing showed up in people’s homes, and the interactive, multimedia CD-ROM became a curious, forgotten relic.
Grolier’s Encyclopedia was my first exposure to the idea of multimedia. In my middle school’s computer lab, on a creaking old 486 that was behind the times even for 1995, I got my first taste of the future, watching a grainy, blocky video of the Hindenburg disaster. Around that time, my own aging 486 got an upgrade: a 2x CD-ROM drive and a SoundBlaster 16, followed by a hard drive and more RAM to offset the increased load on the machine. Waking up one Christmas morning and unwrapping a five-disc Encarta set, with an encyclopedia (on two discs!), an atlas, a dictionary, and a timeline, was amazing. Now I can get all of that through Wikipedia and its sister projects in less time than it would have taken to put in the CD, let alone run the software and search.
Technology is an iterative process, and when looked at with enough distance and hindsight, it seems like a fairly straight line of progress. We went from teletypes and punch cards to video terminals and command lines, then to GUIs. We had the 8086 and 8088, then the 286, 386, 486, Pentium, Pentium II, III, 4… It’s easy to lose sight of what was proposed as the next big thing and then fell by the wayside: the Amiga, OS/2 Warp, CD-ROMs, VRML… If you zoom in closer, the history of technology looks more like a finely combed branch, or a feather, full of terminating dead ends where we thought something would be great and it wasn’t, or where something else came along and proved to be better. Technology is full of little ratholes like these.
The Internet predated CD-ROMs, in one form or another, but in the days when the best bandwidth you could hope for meant an hour to download a decent-quality image, CD-ROMs had the advantage of (near-)immediacy. Put the disc in, launch the app, wait for the drive to spin up and seek to the file, and suddenly you had a video of a burning zeppelin on your screen. Unless you were on a college campus and lucky enough to have access to a T1, or rich enough to pay for ISDN, video over the Internet was a laughable concept.
But things got faster: 33.6 Kbps modems, then 56K modems, then home broadband that outstripped a standard T1 connection. Looking something up on the Internet may not have been faster yet, but it was fast enough that the flaws of the CD-ROM became apparent, not just its seek times but its clunky interfaces and high system requirements. The legacy of the multimedia CD-ROM now lies in landfills, in artful mobiles of microwaved CDs, and in every Flash-based website for a company or restaurant that insists on audio, video, and custom UI controls to display something that could be communicated faster and more easily with text and images.
It’s why I try to maintain a healthy skepticism when someone proposes a new technology that will change how we do things. Whether it’s ubiquitous motion-based computing, vis-à-vis the Microsoft Kinect, or NFC-based payment systems, I can’t help but think that these are ideas that look really cool in demo videos but prove difficult, at best, to implement in the real world. Something already out there could suddenly catch up and surpass the newest, latest thing. Something else, newer and cheaper and faster, could pitch a venture capital firm and land a ten-million-dollar investment, seemingly out of nowhere. It’s happened before.