Politicians and diplomats are heading to Katowice, Poland, in December for the next round of international climate negotiations. This summer’s heatwave — an “unambiguous” sign of global warming, according to a recent report — will be in their minds. A major cause of that warming is two centuries of rising consumption of fossil fuels (coal, oil and gas).
Large-scale coal consumption began during the Industrial Revolution of the late 18th century. Oil boomed from the start of the 20th. Use of all fossil fuels then expanded dramatically during the Second World War and the economic boom that followed.
This expansion was one of several impacts of economic activity on the natural world that Earth-system scientists define collectively as the “Great Acceleration.” This is seen as part of the Anthropocene, a new geological epoch in which the impacts of human activity work on the same scale as great natural forces.
The identification in the 1980s of global warming, perhaps the most dangerous such impact, did nothing to slow fossil fuel consumption growth. Today, the quantity of greenhouse gases pumped out by fossil fuel use in three years is about the same as that produced in the whole of the 19th century.
Who has been consuming all these fossil fuels? Rather than individual households, it is more accurate to think about technological systems — car-based transport and the urban infrastructure that supports it; electricity networks; industrial systems reliant on carbon-heavy materials such as steel; and agricultural methods that soak up gas-based fertilizers — and the social and economic systems in which they are embedded.
Take electric air-conditioning (AC) systems, used widely in hot summers. They first appeared in large numbers before the First World War, in the U.S., which had pioneered the installation of urban electricity networks from the 1880s and led the way in popularizing appliances such as refrigerators, washing machines and radios.
In the 1920s, AC was marketed aggressively, as were electric lighting and motor cars. Manufacturers, anxious to sell to schools, battled furiously against New York state regulators, who argued that fresh air from open windows would be better for pupils’ health than conditioned air.
Sales of AC units, like those of other consumer electric goods, soared again in the U.S. in the boom that followed the Second World War. The doubts of homeowners and office managers, many of whom regarded AC as a luxury, were overridden by the construction industry, which AC manufacturers had persuaded to install it as standard.
AC made urban growth and industrial production possible in the tropical climate of the U.S. southern states, as the historian Raymond Arsenault showed in his seminal article of 1984, “The End of the Long Hot Summer.” In the 1960s, AC was a major cause of the reversal of the trend of out-migration from the South to other parts of the U.S.
By the 1980s, AC, like other energy-guzzling household technologies, was already ubiquitous in U.S. cities and common in other rich countries.
Today, such technologies are condemned as wasteful of fossil fuels. An interpretive challenge in researching and discussing the global history of fossil fuel consumption is to explain why, even after the world’s governments acknowledged at the 1992 Rio summit the need to stop global warming, fossil fuel use continued to rise.
It is a common fallacy that soaring fossil fuel use in the last quarter of a century is largely about first-world consumer technologies, such as AC, being diffused across developing countries. There are small, relatively rich corners of the global south where these are used, but their contribution to total global fuel consumption is minor. And the fundamental inequalities remain. Canada’s per-head energy consumption is estimated at 450 times that of the poorest African countries. U.S. households use four times as much electricity for air conditioning alone as Nigerian households use for all purposes.
But this is not solely about first-world consumers. Final-user technologies develop and spread as part of larger systems. AC, for example, needs a reliable electricity supply, which is beyond the reach not only of 14% of the world’s population, who have no electricity, but of a further 25% or so who have limited or erratic supply. Furthermore, AC and other electrical appliances have become a part of life in energy-intensive cities largely built with energy-intensive materials, such as cement and steel. These cities are characterized by badly insulated buildings and energy-wasteful transport systems: these two inefficiencies consume far more fossil fuels than AC systems.
During the postwar boom, when fossil fuels were cheap, technologies that used them were scarcely regulated. The 1970s’ oil price shocks made for temporary improvement; much of that progress was reversed in the 1980s; the threat of global warming has stimulated little action. From plastics to construction, aviation and car travel, governments allowed (indeed, encouraged) crucial decisions to be made according to companies’ profit motives. Japan — where the energy efficiency of new AC systems rose by 68% in the decade to 2005, because regulators required improvements — is the exception that proves the rule, at least in regard to AC.
Weak regulation is part of a larger political problem: the ideological conviction that market mechanisms alone can and must reduce fossil fuel consumption. This belief underpinned both the 1997 Kyoto protocol, which failed to cut greenhouse gas emissions, and the 2015 Paris agreement, with its voluntary, and inadequate, targets. Governments that failed to put money into moving away from fossil fuels nevertheless oversaw hundreds of billions of dollars in subsidies for those fuels.
Technological systems have evolved the way they have because of the social and economic systems in which they are embedded. A move away from fossil fuels requires the wholesale transformation of all these systems.
Simon Pirani is the author of Burning Up: A Global History of Fossil Fuel Consumption (Pluto Press, 2018).