Bot-mediated reality

Having just spent upwards of 25 hours in a car driving between Peterborough, Toronto, and Pukaskwa National Park, we passed much of the time listening to a variety of podcasts, including Philosophy Bites, CBC Ideas, and the Long Now Foundation’s Seminars About Long-term Thinking (SALT).

While SALT has hosted a bevy of fascinating and influential guests, including Craig Venter, Jimmy Wales, Francis Fukuyama, and Ray Kurzweil, Daemon: Bot-Mediated Reality, by author and software engineer Daniel Suarez, was one of the most interesting and thought-provoking talks (mp3 here).


Consequences of bot-mediated reality

I have a lot of catch-up listening to do with regard to The Long Now Foundation’s excellent Seminars About Long-term Thinking (SALT) lecture and podcast series. I’m a charter member of the Foundation, which gets you a sweet membership card and access to video of their lectures, among other less tangible benefits, like knowing you’re helping inject some much-needed awareness of long-term thinking and planning into public discourse.

One of the lectures I’m particularly looking forward to downloading is the recent Daemon: Bot-Mediated Reality by Daniel Suarez, which I think has particular relevance given the rather large f-up in which Google’s news crawler inadvertently “evaporated $1.14B USD”.

Unfortunately, I think that in the near future, as more and more processes are automated, we will see more screw-ups of this scale. I can’t help but think this one might have been avoidable, though, if the indexing engine had been able to take advantage of semantic data rather than relying on scraping and evaluating natural language.
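
To make that distinction a bit more concrete, here’s a minimal sketch in Python. The field names and logic are made up for illustration and don’t describe anything Google actually does; the point is just that a crawler reading an explicit, machine-readable publication date can tell an old story from breaking news, while one that infers dates from prose has to guess.

```python
from datetime import datetime, timezone
from typing import Optional

# Illustrative sketch only: field names and logic are hypothetical,
# not a description of any real news crawler.

def date_from_prose(text: str) -> datetime:
    """Guess an article's date by scraping its natural-language text.

    If the prose contains no usable date, a naive crawler may fall back
    to "now" -- which is how an old story can be re-indexed as fresh news.
    """
    return datetime.now(timezone.utc)

def date_from_metadata(metadata: dict) -> Optional[datetime]:
    """Read an explicit, machine-readable publication date from metadata."""
    published = metadata.get("datePublished")  # hypothetical structured field
    return datetime.fromisoformat(published) if published else None

article_text = "Company files for bankruptcy protection, shares tumble..."
article_metadata = {"datePublished": "2002-12-09T00:00:00+00:00"}

print("from metadata:", date_from_metadata(article_metadata))  # clearly old news
print("from prose:   ", date_from_prose(article_text))         # looks current
```

Of course, semantic markup only helps if publishers actually supply it and keep it honest, but it at least gives an indexer something better to go on than a guess.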