On October 13, 1992, the United States became the world’s first industrialized nation to ratify a treaty on climate change. The treaty committed its parties to the important, if awkwardly worded, goal of preventing ‘dangerous anthropogenic interference with the climate system.’ In acknowledgment of the fact that America and its allies were largely responsible for the problem, the pact set a different standard for them: Europe, Japan, Australia, and the United States were supposed to ‘take the lead in combating climate change and the adverse effects thereof.’ Signing the instrument of ratification for the treaty, the United Nations Framework Convention on Climate Change, President George H. W. Bush noted the special responsibilities that the developed nations were taking on; they ‘must go further’ than the others, he said, and offer detailed ‘programs and measures they will undertake to limit greenhouse emissions.’ The convention remains in effect, and for the past seventeen years the United States has insisted that it is living up to its terms. Under Bill Clinton, this claim was implausible; the U.S. took no meaningful action to reduce its emissions. Under George W. Bush, it became a bad joke. (When the Bush Administration wasn’t handing out tax […]