I agree with much of what Michael Gazzaniga says here, and with the overall thrust of the project exemplified most recently in his book Who’s in Charge?: Free Will and the Science of the Brain.
However, as much as we might like to make this project entirely palatable, there is an important sense in which it does seem to challenge some stubborn intuitions, and I think we need to recognize that such findings can and do influence our ideas about how and when to hold people responsible. The clearest evidence for this is found in the legal realm where, from the insanity defence to the Twinkie defence, we encounter a long history of attempts to grapple with the relation between the apparently mechanistic nature of the brain and our ability to hold people responsible for their actions. These attempts clearly show that a scientific-mechanistic understanding of the brain has an important bearing on our understanding of freedom and agency.
The last book I read was Joseph Schumpeter’s Capitalism, Socialism and Democracy, a book famous for coining the phrase “creative destruction” as a description of the process inherent to capitalism whereby old methods of production and commodities are incessantly obsolesced and replaced. This is an insight drawn, I believe, from Marx’s talk about capitalism constantly revolutionizing the means of production (compare also Deleuze and Guattari on de/reterritorialization).
Ray Kurzweil, author of (among other books) The Age of Spiritual Machines, expounds on the promises and pitfalls of the coming expansion of GNR (genetics, nanotech, and robotics) technology, claiming that by 2029 scientists will have effectively modelled the human mind, producing artificial intelligence fully capable of passing a Turing test.
At least for action movies, HD is a step backwards in the viewing experience.
Schlimmbesserung is a handy German word for an ‘improvement’ that makes things worse, that is actually a step backwards–applicable, I think, to at least certain aspects of HD television.
One of the most exciting things going on in webland today, I think, is the myriad of technologies, user experiences, and computer-to-computer interactions that typically pass under the monikers of “Web 3.0” or “the semantic web.” There isn’t a lot of general agreement on what precisely these terms mean (though I think the latter is more concrete), but what many people envision as the future of the web is an online environment in which data, text, and various forms of information and media are structured in ways that are machine-readable (if not machine-interpretable), leading to all sorts of new possibilities for interoperability between websites, new forms of user-agent interaction, and generally a web experience that is less characterized by “dumb” websites.
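To make the contrast concrete, here is a minimal sketch of what “machine-readable” means in practice. The book, its fields, and the schema are all hypothetical, loosely inspired by the kind of structured vocabularies the semantic web envisions; this is an illustration, not any particular standard.

```python
import json

# The same information, first as free text: easy for a human to read,
# but a machine would need natural-language parsing to extract anything.
free_text = "Daemon, a novel by Daniel Suarez, was published in 2006."

# ...and now as structured, machine-readable data (hypothetical schema).
structured = json.loads("""
{
  "type": "Book",
  "name": "Daemon",
  "author": {"type": "Person", "name": "Daniel Suarez"},
  "datePublished": "2006"
}
""")

# A machine can now answer questions directly, with no parsing or guessing:
print(structured["author"]["name"])   # the author's name
print(structured["datePublished"])    # the publication year
```

Once data is published in a form like this, interoperability between sites becomes a matter of agreeing on vocabularies rather than scraping and guessing.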
All of this, in addition to its manifest benefits, would of course also present new opportunities for abuse, invasive marketing techniques, and threats to users’ privacy.
A glimpse of this last concern was provided recently by a paper from some Google researchers (“Could your social networks spill your secrets?”) that details how data from two different social networking sites (e.g. LinkedIn and Myspace) could be linked together to reveal the single person behind two different public profiles, despite the profiles being relatively anonymous and not directly linked. From the New Scientist article:
That approach is dubbed “merging social graphs” by the researchers. In fact, it has already been used to identify some users of the DVD rental site Netflix, from a supposedly anonymised dataset released by the company. The identities were revealed by combining the Netflix data with user activity on movie database site IMDb.
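The basic idea behind merging social graphs can be sketched in a few lines: even when profiles are pseudonymous, the pattern of a profile’s connections can act as a fingerprint. The sites, usernames, and contact lists below are all invented for illustration, and the matching rule (Jaccard similarity of contact sets) is just one simple way such linking could work — not the researchers’ actual method.

```python
# Site A: pseudonymous profiles, each with its set of contacts
site_a = {
    "user123": {"alice", "bob", "carol", "dave"},
    "user456": {"erin", "frank"},
}

# Site B: real-name profiles on a different site, also with contact sets
site_b = {
    "J. Smith": {"alice", "bob", "carol", "mallory"},
    "K. Jones": {"peggy", "trent"},
}

def best_match(contacts, candidates):
    """Link a pseudonymous profile to the candidate whose contact set
    overlaps most, measured by Jaccard similarity |A∩B| / |A∪B|."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(candidates, key=lambda name: jaccard(contacts, candidates[name]))

# user123 shares three of four contacts with "J. Smith", so the
# pseudonymous account is linked to a real identity:
print(best_match(site_a["user123"], site_b))  # J. Smith
```

The unsettling point is that neither dataset needs to contain a name or email address for the link to be made; the structure of the graph itself is identifying.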
December 2009: As an addendum to this article, I direct your attention to “project gaydar”.
Kevin Kelly on the future of the web, which he sees basically in terms of a movement towards the semantic web, or a web of linked data.
Kelly unfortunately comes across as a bit naive, as he discusses our inevitable dependence upon, and surrendering to, the envisioned “web 10.0” without any critical hesitation or indication of cause for concern.
Having just spent upwards of 25 hours in a car driving between Peterborough, Toronto, and Pukaskwa National Park, we passed the time in part by listening to a variety of podcasts, including Philosophy Bites, CBC Ideas, and the Long Now Foundation’s Seminars About Long Term Thinking (SALT).
While SALT has hosted a bevy of fascinating and influential guests, including Craig Venter, Jimmy Wales, Francis Fukuyama, and Ray Kurzweil, the talk Daemon: Bot-Mediated Reality, by author and software engineer Daniel Suarez, was one of the most interesting and thought-provoking (mp3 here).