Canada: a brief history of failed GHG reduction policies

Attended a talk entitled "Getting Climate Policy Right" yesterday, presented by Mark Jaccard and co-sponsored by the University of Toronto's School of Public Policy & Governance and the Centre for Environment. Jaccard is a leading expert, not just in Canada but internationally, on climate change policy and economic modelling, and he delivered an informative, stimulating and engaging presentation.

Some of the key take-aways:

  • Energy efficiency is expensive – economists who model energy efficiency policies and programs often fail to account for a variety of factors that make investment in energy-efficient technologies much more costly than their models assume.
  • Information programs are not enough – governments have four (or five) policy levers to reduce greenhouse gas emissions: information campaigns (e.g. the Rick Mercer one-tonne challenge), subsidies, regulations, financial penalties (taxes), and cap and trade schemes (a combo of numbers 3 and 4). We need to see much more of numbers 3–5.
  • Offsets are not working the way they're supposed to – in the EU cap and trade scheme (or at least ETS1), companies can meet 15% of their targets via offsets purchased through the Clean Development Mechanism, which subsidizes developing countries' adoption of advanced, cleaner technologies from developed countries. Jaccard showed the audience a slide demonstrating how China is taking advantage of this as a "free-rider," using CDM funding for hydroelectric projects that would have been built anyway, and thus failing to have any mitigating impact on its GHG emissions from coal-fired plants.
  • Targets don’t matter – while I think the language used here is a bit too strong (of course targets matter), what Jaccard is saying is that we’ve been setting great targets for years, but have consistently failed to meet them. According to Jaccard, we need clear plans for meeting our targets, absolute caps and minimal or no offsets. Which brings me to…
  • Canada has been failing at greenhouse gas reduction policies since the late 80s – starting with the Mulroney government, Canada has gone through more than five policies to reduce GHGs, all of them failures. By the reckoning of Jaccard's team, the current plan under the Conservatives will have some effect (good news) but not nearly as much as is claimed or needed.

As Jaccard said, Canada has clearly demonstrated it is a follower and not a leader in this area. We should expect to see more action once the US has implemented some serious GHG reduction policies, which will hopefully happen soon.

The future of journalism

From one of my favourite e-newsletters, J-Source, comes a provocative article by Alan Bass, assistant prof at Thompson Rivers University School of Journalism, that takes journalists themselves to task for failing to preserve the vitality of journalism and to make the case for its relevance vis-à-vis the infotainment that constitutes the majority of "media."

"What's wrong with journalism? Look in the mirror" is worth reading for anyone interested in the challenges faced by major news outlets – particularly those posed by the new forms of media emerging on the web. Though I think Bass places too much blame on journalists themselves (and simultaneously assumes they have more power to reform "the press" than they do), such calls for serious self-reflection are an essential part of efforts to change large, public institutions like the media. If change can't (at least partially) come from within, it won't happen.

Scam ads on Facebook

Perhaps I'm the only one who would find this unexpected, but I was surprised to notice blatant scam ads on Facebook today. Maybe they're not a recent addition and I've simply never noticed them before. I would have thought such things wouldn't get clearance from Facebook's marketing department.

Who knows how useful the little “thumbs down” functionality is in getting the misleading ads removed.


As the web gets smarter, will our anonymity evaporate?

One of the most exciting things going on in webland today, I think, is the myriad of technologies, user experiences, and computer-to-computer interactions that typically pass under the monikers of "Web 3.0" or "the semantic web." There isn't a lot of general agreement on what precisely these terms mean (though I think the latter is more concrete). What many people envision as the future of the web, though, is an online environment in which data, text, and various forms of information and media are structured in ways that are machine-readable (if not machine-interpretable). That structure opens up all sorts of new possibilities for interoperability between websites, new forms of user-agent interaction, and generally a web experience that is less characterized by "dumb" websites.
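To make the "machine-readable" point a little more concrete, here is a minimal, purely hypothetical sketch (every name and value below is invented for illustration) of why structured data is easier for other programs to build on than free-form prose:

```python
import json

# Toy illustration only: the same (invented) event described twice -- once as
# free-form prose for a human reader, and once as a machine-readable block in
# the spirit of semantic-web markup. None of the names or values are real.

PROSE = "Join us next Thursday evening at the campus hall for a public lecture."

STRUCTURED = json.dumps({
    "@type": "Event",                  # schema-style type annotation
    "name": "Example Public Lecture",  # hypothetical event
    "location": "Campus Hall",
    "startDate": "2009-11-19T19:00",   # unambiguous, machine-readable value
})


def start_date_from_prose(text):
    # A "dumb" page forces every consumer to guess: "next Thursday evening"
    # is meaningful to a person but ambiguous to a program.
    return None


def start_date_from_structured(doc):
    # A structured page lets any program -- a calendar app, a search engine,
    # another website -- pull the same unambiguous value without scraping.
    data = json.loads(doc)
    if data.get("@type") == "Event":
        return data.get("startDate")
    return None


if __name__ == "__main__":
    print("from prose:     ", start_date_from_prose(PROSE))
    print("from structured:", start_date_from_structured(STRUCTURED))
```

The prose version is perfectly clear to a person, but only the structured version lets another site or application reuse the information without fragile guesswork.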

All of this, in addition to the manifest benefits, would of course also present new opportunities for abuse, invasive marketing techniques, and threats to users' privacy.

A glimpse of this last concern was provided recently by a paper from some Google researchers ("Could your social networks spill your secrets?") that details how data from two different social networking sites (e.g. LinkedIn and MySpace) could be linked together to reveal the single person behind two different public profiles, despite the profiles being relatively anonymous and not directly linked. From the New Scientist article:

That approach is dubbed “merging social graphs” by the researchers. In fact, it has already been used to identify some users of the DVD rental site Netflix, from a supposedly anonymised dataset released by the company. The identities were revealed by combining the Netflix data with user activity on movie database site IMDb.
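To illustrate the general idea only – this is not the method from the paper or the Netflix study, and every profile, username and weight below is invented – here is a toy sketch of how overlap in public attributes can link two pseudonymous accounts across sites:

```python
# Toy sketch of the "merging social graphs" idea -- NOT the researchers'
# actual method. It scores how likely two public profiles on different sites
# belong to the same person by comparing overlap in their (invented) public
# friend lists and listed interests. For simplicity it assumes friends appear
# under the same usernames on both sites.


def jaccard(a, b):
    """Overlap between two sets, from 0.0 (disjoint) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0


def match_score(profile_a, profile_b):
    # Weight friend-list overlap more heavily than interests: a shared social
    # graph is a much stronger fingerprint than shared tastes.
    return (0.7 * jaccard(profile_a["friends"], profile_b["friends"])
            + 0.3 * jaccard(profile_a["interests"], profile_b["interests"]))


# Hypothetical, pseudonymous profiles on two different sites.
site_one = {
    "cooljay82": {"friends": {"m_singh", "a.wong", "kdubois", "tlee"},
                  "interests": {"cycling", "jazz", "urban planning"}},
}
site_two = {
    "jsmith_to": {"friends": {"m_singh", "a.wong", "kdubois", "pgreen"},
                  "interests": {"cycling", "jazz"}},
    "anon_moviefan": {"friends": {"rdoe", "qchen"},
                      "interests": {"film noir", "baking"}},
}

if __name__ == "__main__":
    for name_a, prof_a in site_one.items():
        for name_b, prof_b in site_two.items():
            print(f"{name_a} <-> {name_b}: {match_score(prof_a, prof_b):.2f}")
```

Even this crude overlap score separates the matching pair from the unrelated one, which is the intuition behind why "relatively anonymous" profiles can still be joined together.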

December 2009: As an addendum to this article, I direct your attention to “project gaydar”.