About a week ago I wrote of predictive fiction and its ability to capture the minds of many interested in the future. In the intervening time, I read a very interesting book.
Toast is a collection of short stories by Charles Stross. The stories themselves were very good, but what caught my attention was what he wrote in the edition’s foreword. The stories were written between 1990 and 2000, and Stross’ newest edition and commentary is from 2005. In the foreword and some of the story commentaries, he writes about how outdated some of the stories seem, even five years later. Another five years on, the datedness is even more apparent. Fiction about Y2K elicits a chuckle (this is the biggest example Stross cites as evidence that he’s dated himself), and even the relatively far-out notion of nano-sized supercomputers woven into clothing is made somewhat quaint by our continued progression toward the “cloud” and distributed computing. However, like all good predictive fiction, certain concepts hold up well, such as the last story, in which the protagonist, Manfred Macx, gives away all of his services in exchange for favors and lives without money, a subtle intermediate step toward the reputation economy described in Doctorow’s Down and Out in the Magic Kingdom. Additionally, Stross revisits the concept of “information burn” several times, an interesting thought experiment in a world where the amount of information directed at a person in a day often far outstrips their ability to process it.
What made this interesting was a book I started to read today, called Republic.com 2.0. The “2.0” is for the book’s second edition, written because the original, out in 2001, had become heavily dated even in the five or six years before Sunstein, the author, decided to revise it. The book’s thesis, about the dangers of information personalization and the rise of the internet “echo chamber,” is still very relevant in 2010, but the arguments supporting it were made at the very least incomplete by the events of 9/11 and the rise of blogs and content-aggregation sites such as Digg, Reddit, and StumbleUpon, not all of which even existed in 2001 (Reddit launched in 2005, so sayeth Wikipedia).
So what does this mean for the analysis of technology? The social conventions surrounding the average internet user change at a frightening pace, and by the time someone sits down to write a book about them, it’s already out of date by the time it comes out. Twitter launched in 2006, meaning that I’ve spent more time in college than Twitter has spent existing. To put it another way, Facebook launched in 2004, MySpace in 2003. So if you were a scholar of human interaction and wanted to write a book on MySpace, you had a little under a year before Facebook emerged and changed the entire game. The gap between Facebook and Twitter, two services that also had profound effects on each other (if it weren’t for Twitter, would @tagging of Facebook users ever have been implemented? I doubt anyone would have known what it was), was a little longer, but still not really long enough to write and publish a book.
So is this the Singularity? The idea that technologies develop faster than their userbase can react to them is central to Singularity theory, since technology outpacing its users significantly changes the social dynamics surrounding it. This is truly the “information burn” concept as Stross envisioned it. The difference is that Stross, writing at least ten years ago, placed it off in the future. But now things are becoming dated that much more quickly.
It’s not a Singularity in the popular sense. There are no artificial intelligences pushing the actual development of technology past the bounds of human comprehension. However, we are probably less than a decade away from our communications frontier changing faster than users can transition from one domain to the next. And that is when things will get a little nuts.