None of this is why Guy Callendar’s name will be boldfaced in tomorrow’s textbooks. Instead it will be because he was willing to delve into fields he knew nothing about, atmospheric science among them. Nobody knows why he got so interested in the air. Callendar himself attributed it to ordinary curiosity: “As man is now changing the composition of the atmosphere at a rate which must be very exceptional on the geological time-scale, it is natural to seek for the probable effects of such a change.”
In the early 1930s Callendar began collecting measurements of the properties of gases, the structure of the atmosphere, the sunlight at different latitudes, the use of fossil fuels, the action of ocean currents, the temperature and rainfall in weather stations across the world, and a host of other factors. It was a hobby, but a remarkably ambitious one: He was producing the first rough draft of the huge climate models familiar today. After years of calculation, in 1938 he came to a surprising conclusion: People were dumping enough carbon dioxide into the air to raise the world’s average temperature.
Callendar did not have a PhD, but he had enough academic status to be allowed to present his ideas that year in front of a panel of six professional climate scientists at the Royal Meteorological Society. The pros were familiar with the claim that carbon dioxide affected climate, which other researchers—notably Sweden’s Svante Arrhenius—had made in previous decades. But these ideas, in their view, had been thoroughly debunked. Years before Callendar’s presentation, British Meteorological Office head George Clarke Simpson had stressed the scientific consensus that carbon dioxide in the air had “no appreciable effect on the climate.” Now he was one of Callendar’s commentators. Callendar, unlike his predecessors, had a coherent model and decades of new data. Nonetheless Simpson was not kind. The problem with people like Callendar, he sniffed, was that “non-meteorologists” simply didn’t know enough about climate to be helpful. The other five commentators were no more appreciative. Although Callendar had spent years gathering evidence, they were “very doubtful” that his work meant anything.
Aside from snobbery, the biggest reason for skepticism was that there simply wasn’t—and isn’t—very much carbon dioxide in the air. When Callendar was scribbling away, carbon dioxide made up about 0.03 percent of the atmosphere by volume (the level has risen slightly since then). If somebody collected 10,000 scuba tanks of air, the carbon dioxide in them would be enough to fill up three tanks. How could anything so tiny be important to a huge, super-complex system like the atmosphere? It was like claiming that a toy bulldozer could level Manhattan. The idea seemed absurd on its face.
Undeterred, Callendar kept working on what came to be called the “Callendar effect.” This was not because he feared the impacts of rising carbon dioxide. In fact, Callendar believed that this warming business sounded like a good thing. “Small increases in mean temperature” would help farmers in cold places, he argued. Better yet, they would “indefinitely” postpone “the return of the [ice ages’] deadly glaciers.”
Callendar died in 1964. By that time, many climate scientists had reconsidered their opposition to his ridiculous-sounding belief that slightly increasing the small amounts of carbon dioxide in the air could affect global temperatures. A few were grappling with an even crazier idea: that people were pumping enough carbon dioxide into the air to reshape the face of the Earth and put human existence at risk. But nobody was imagining that the possible solution to our inadvertent transformation of the planet would be to transform the planet even more.
Stupid Bad Luck
Climate change entered the modern public arena on June 23, 1988, when NASA researcher James E. Hansen testified before the US Senate about its potential effects. Colorado Senator Tim Wirth, an avid environmentalist and unusually cerebral politician, had learned about the Callendar effect and wanted Hansen, a climate expert, to ring the bell and warn the nation. To bolster the narrative, Wirth deliberately scheduled the hearing for what historically was the summer’s hottest day and shut off the room’s air conditioning.
The scheme worked beyond his wildest dreams. Hansen sat down amid a wave of bad weather that covered the entire planet. Downpours inundated parts of Africa; unseasonable cold shriveled European harvests; droughts scorched crops in the US Middle West; forests were aflame across the West. That day Washington, DC, experienced a record 101ºF; perspiration glistened on Hansen’s temples as he spoke. He said, “The Earth is warmer in 1988 than in any time in the history of instrumental measurements.” He said, “With 99 percent confidence we can state that the warming during this time period is a real warming trend.” Carbon dioxide, he said, “is changing our climate now.”
Hansen’s stark words sparked headlines across the world. The New York Times put his charts on page one, and he appeared on a dozen television shows. Suddenly the parched fields, forest fires, and sweltering cities added up to a coherent pattern—harbingers of a dystopian future. Adding to the furor, journalist Bill McKibben published in 1989 the first popular account of climate change, The End of Nature, a worldwide best-seller despite its ominous title. More important, scientific research took off. Before 1988 peer-reviewed journals had never published more than a score of articles in a given year that contained the terms “climate change” or “global warming.” After 1988 the figure climbed: 55 in 1989; 138 in 1990; 348 in 1991. By 2000: 1,340. In 2015 it was 16,576.
Despite all the publications, few non-climatologists have even a rudimentary understanding of why airborne carbon dioxide raises temperatures—of what Callendar figured out in 1938. The Sun washes the Earth with every imaginable type of light: ultraviolet, microwaves, infrared, radio waves, visible light, you name it. Roughly half of this light is either reflected by clouds or absorbed by the atmosphere. The rest, mostly visible light, passes through the air and is soaked up by the land, oceans, and plants. Having taken in all of this energy, the ground, water, and vegetation naturally warm up, which makes them emit infrared light (the kind of light that we can see with night-vision goggles in old James Bond movies). This release of energy prevents the surface of the Earth from getting unbearably hot.
About 99 percent of the atmosphere consists of nitrogen and oxygen. A trivial-sounding but important fact about nitrogen and oxygen is that they cannot absorb infrared light. If our air consisted entirely of these two gases, the infrared from the surface would pass through the atmosphere into space like a shotgun blast through tissue paper and our planet would be unbearably cold. Happily, there is something else in the atmosphere: water vapor, which can and does absorb the majority of this outgoing infrared energy. If water vapor caught all of it, the air would become unbearably hot. Instead, a little bit of the infrared energy slips by water vapor—just enough to prevent the atmosphere from heating to intolerable levels.
Two mechanisms are responsible for the escape. The first is that the water vapor releases some of its absorbed energy, also as infrared light. It is re-absorbed and re-emitted by water vapor many times in our atmosphere, but eventually some of that light passes beyond the atmosphere, into outer space. The second is that water vapor doesn’t absorb all the Earth’s infrared radiation—the vapor is effectively transparent to certain wavelengths. Through these transparent “windows” some infrared passes into the vacuum.
So far, both Callendar and his critics would have been in agreement. But what Callendar realized—and his critics initially didn’t believe—is that carbon dioxide absorbs some of the wavelengths that water vapor lets through—it shuts the windows, so to speak. The more carbon dioxide, Callendar said, the more firmly shut the windows. With the escape route cut off, the atmosphere heats up.
In Callendar’s scenario, the atmosphere is like a bathtub. Water pours into the tub in the form of infrared radiation. In the tub are little holes—the “windows” through which water vapor allows light to pass. As a rule, the outflow from the holes is approximately equal to the inflow from the spigot, so the water level in the tub is constant. But block a hole or two with chewing gum—that is like adding carbon dioxide to the air. Now, slowly, inevitably, the water will rise.
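The bathtub can be sketched as a few lines of toy code. The numbers here are illustrative, not physical: the "inflow" stands for infrared energy from the surface, and the "openness" parameter is the fraction of the holes left unblocked.

```python
# Toy "bathtub" model of the atmosphere's energy balance, as described
# above. All numbers are illustrative: inflow is the infrared "water"
# from the surface; outflow shrinks as the holes are gummed up.

def simulate(openness, steps=50, inflow=1.0, drain_rate=0.02, level=50.0):
    """Track the water level when the holes pass a fraction `openness`
    of their normal outflow (1.0 = unblocked, 0.9 = some gum added)."""
    for _ in range(steps):
        outflow = drain_rate * level * openness
        level += inflow - outflow
    return level

steady = simulate(openness=1.0)   # unblocked: inflow equals outflow
blocked = simulate(openness=0.9)  # plug ~10% of the holes: level creeps up
print(round(steady, 1), round(blocked, 1))
```

With the holes unblocked, the level sits at its equilibrium; plug a tenth of them and the level rises slowly toward a new, higher equilibrium, just as the analogy says.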
From the human point of view, this is just stupid bad luck. If the physical properties of carbon dioxide and water vapor didn’t intersect in this way—if carbon dioxide didn’t happen to absorb the infrared radiation that water vapor lets through—then burning fossil fuels would be of little interest to climate researchers. The carbon-dioxide rise would be a dusty corner of atmospheric science, swept occasionally by pedantic graduate students. Coal and oil could be burned without worry (after removing pollutants). Industrial civilization would not be facing an existential challenge.
Note, though, what Callendar’s bathtub doesn’t tell us: how fast, after some holes are plugged, the water rises. Figuring this out has turned out to be vexingly difficult, because the basic physics and chemistry is overlaid by a morass of feedback mechanisms. If higher carbon dioxide levels warm the atmosphere, for example, it will become more humid. On the one hand, moister air absorbs more heat, further driving up temperatures: a positive feedback loop.
On the other hand, moister air leads to more cloud cover, which blocks the sun, lowering temperatures: a negative feedback loop. Similarly, higher temperatures could melt the ice in glaciers and at the poles, leaving bare rock. Because the rock is darker than the ice, it would soak up more of the sun’s heat, raising the temperature and melting more ice to expose more rock: positive feedback. But the cold meltwater from the glaciers would pour into the oceans, lowering their temperature, which would chill the air over the water: negative feedback. The complications are endless, and they have still not been wholly resolved.
Consider this question: What will happen to global average temperatures if atmospheric carbon dioxide levels double? First asked by Arrhenius, Callendar’s precursor, the question seeks to define what researchers today call “climate sensitivity.” Before the widespread use of fossil fuels—before 1880, more or less—the atmospheric carbon dioxide level was about 280 parts per million. Arrhenius in effect asked what would happen if that number went up to 560 parts per million (the level now is a bit above 400). In 1979 the US National Research Council asked the same question. Its report projected that doubling atmospheric carbon dioxide would raise global temperatures by between 2.7ºF and 8.1ºF (1.5º and 4.5ºC).
The National Research Council team had produced its estimate by averaging the results from two models and adding about 1ºF on either end to account for uncertainty. The procedure was crude, but the best available at the time. Since then, many other scientific groups with vastly more sophisticated methods have tried to improve climate-sensitivity estimates. Notable among them was the Intergovernmental Panel on Climate Change, which has produced five major reports on the state of climate science, the most recent in 2014. All five attempted to assess climate sensitivity.
Unfortunately, as the economists Gernot Wagner and Martin L. Weitzman lament in their book Climate Shock (2015), the likely range of increase from doubling carbon dioxide levels foreseen in the last IPCC report—2.7º to 8.1º F (1.5–4.5ºC)—was exactly the same range the scientists had guesstimated back in 1979. Nearly four decades of additional research have not brought us closer to predicting the precise impact of dumping carbon dioxide into the air. We know that temperatures will rise, but not how fast or how high.
This is not because the researchers are lazy or incompetent but because global climate change is phenomenally complex. Still, the uncertainty puts political leaders in a bind. A rise of 2.7ºF would be tolerable, it is generally believed, whereas a rise of 8.1ºF would be intolerable: enough to melt polar ice, inundating coastlines around the world. It is as if our species were running blindfolded toward a cliff. Nobody knows its precise location or height. There is a small chance the cliff is so low and far away as to be harmless—and a larger chance that it is high and rapidly approaching.
Lake Michigan at Your Disposal
Imagine a graph. On the vertical axis is some variable of human welfare: nutrition, income, mortality, life expectancy, literacy, overall population. On the horizontal axis is time. In almost every case the graph skitters along at a low level for thousands of years, then rises abruptly in the eighteenth and nineteenth centuries. The reason for the sudden, ubiquitous rise in well-being is the Industrial Revolution. It permitted people to produce enormous quantities of essential, civilization-sustaining substances like cement, steel, and fertilizer. To make those enormous quantities required equally enormous amounts of energy. That energy consisted of fossil fuels: coal, oil, natural gas, kerosene, and so on. Humankind went on an energy binge.
The results were staggering. Between 1900 and 2000, according to the University of Manitoba environmental scientist Vaclav Smil, global energy consumption rose roughly 17-fold. In that same time, economic output rose 16-fold—“as close a link as one may find in the unruly realm of economic affairs.” Chances are that you are reading this article in a comfortable chair in a comfortable, well-lighted room that is heated or cooled to your preference—a level of luxury that was almost unavailable anywhere two centuries ago. Fundamental to your ease are billions of tons of fossil fuels.
Before the Industrial Revolution, energy supplies were limited to what people had around them: wind (windmills), water (watermills), animals (horses, oxen, llamas), and biomass (wood, charcoal, dung). All of that changed in the 18th and 19th centuries. Suddenly people had access to energy that had been stored underground for millions of years. It was the difference between being restricted to a trickle of water flowing by in a brook and having Lake Michigan at your disposal. Fossil fuels let people tear apart nitrogen gas to make huge quantities of artificial fertilizer.
They let people bake giant ziggurats of calcium to create the 4.6 billion tons of cement humans use every year. They let people pile up coking coal in blast furnaces as part of the necessary physical infrastructure to transform pig iron into the steel that supports every structure on Earth that is taller than a hundred feet. They provide so much power that they rocket huge, bulbous metal tubes full of people 30,000 feet into the air hundreds of times a day—a dream of flight beyond the imagination of Leonardo.
Fossil fuels are utterly entangled with every aspect of modern life. The idea of doing away with such an essential prop to contemporary comfort and affluence seems absurd. But that is just what must be done to avoid the worst impacts of climate change. It is like being told you can never again eat your favorite foods but must instead adopt some new diet of substances you have never seen. Little wonder that politicians have shown no appetite for pushing this on the public!
The uncertainties about climate sensitivity make the question especially confounding. When the IPCC says that the likely consequence of doubling carbon dioxide is a temperature rise between 2.7ºF and 8.1ºF, Wagner and Weitzman point out, the scientists have a specific definition in mind for “likely.” Skipping the mathematical complexities, it boils down to saying there is about a two-thirds chance that the temperature rise will be between these two numbers. But that means there is a one-out-of-three chance that the effect will be outside this range. In rough terms, this translates into a one-out-of-six chance that nothing much will happen—and a one-out-of-six chance of complete disaster, with chunks of the planet becoming nearly uninhabitable. That small but real chance of catastrophe is the key, Wagner and Weitzman argue.
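Wagner and Weitzman's reading of the IPCC's "likely" reduces to simple arithmetic, sketched below (the even split of the remaining probability between the two tails is their rough simplification, as the text notes):

```python
# "Likely" in IPCC usage means roughly a two-thirds chance that warming
# from doubled CO2 falls inside the 2.7-8.1 F (1.5-4.5 C) range. Split
# the remaining third evenly between the two tails.
p_inside = 2 / 3
p_outside = 1 - p_inside
p_low_tail = p_outside / 2    # chance not much happens: ~1 in 6
p_high_tail = p_outside / 2   # chance of overshoot, i.e. disaster: ~1 in 6
print(f"outside the range: {p_outside:.2f}; each tail: {p_high_tail:.2f}")
```

That one-in-six chance of overshooting the range is the "small but real chance of catastrophe" the argument turns on.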
On a personal level, people deal with this sort of risk all the time. They know that they face a small but real chance of personal calamity: a home robbery, a car accident, a cancer diagnosis. To manage the risk, people buy insurance. Insurance mitigates the consequences of terrible but unlikely problems. Few people are upset if they pay for fire insurance and their house does not burn down, or if they buy life insurance and fail to die. They happily pay for security against the risk of disaster.
Hope for the best but plan for the worst—that is the philosophy.
For climate change, what would that involve? What does climate insurance against disaster look like?
When the next hurricane approaches New Orleans, every resident will know what to do: empty the fridge. Back in 2005, hardly anyone did that for Hurricane Katrina. Families in the city were accustomed to leaving for a couple of days during bad storms, then coming back to streets strewn with branches and trash and maybe a few shingles. When Katrina hit, the flooding was so bad that people couldn’t return for weeks. This was NOLA—New Orleans, Louisiana: The weather was sunny and hot. Because of the storm, the electricity was out. Across the metropolitan area, a quarter of a million refrigerators became inadvertent experiments in the biology of putrefaction.
Despite the warnings, many homeowners opened their refrigerators. Almost everyone who did realized instantly that it was beyond repair.
Throughout the fall and winter, returnees duct-taped their refrigerators shut and dragged them out to the curb. White metal boxes lined the streets like gravestones. Sometimes they were spray-painted with sardonic slogans. Feed my maggots. Caution: Breath of Satan inside. Ho ho ho NOLA—this one decorated like a Christmas tree. Occasionally people illicitly dumped their refrigerators in faraway neighborhoods and came back home to find people from those neighborhoods had dumped refrigerators on their street.
Katrina created about 35 million cubic yards of debris in southern Louisiana—an estimate that does not include, among other things, the area’s 250,000 destroyed automobiles. East of the city is the Old Gentilly landfill, shut down as a hazardous waste site. It quickly reopened and became Mount Katrina: a 200-foot-tall mass of soggy armchairs, ruined mattresses, busted concrete, and moldy plywood. The refrigerators had their own staging area, separate from the stoves and dishwashers, in the foothills of Mount Katrina.
Fridgelandia was an amazing sight. Battered white boxes, stacked up hundreds of feet in every direction. Teams of workers in gas masks and crinkly hazmat suits, scooping out the writhing contents with plastic snow shovels. If people didn’t shovel quickly, carnivorous dragonflies would descend on the maggots in such clouds that workers couldn’t see.
Until I visited post-Katrina New Orleans I did not realize that rebuilding a flooded modern city would involve disposing of several hundred thousand refrigerators. Nor had I realized that it would involve toxic blooms of fungi new to science. Or that cities would have a hard time functioning after the sudden collapse of all local insurance bureaus.
When Katrina hit New Orleans, it had become a relatively modest storm, but one that was still strong enough to overwhelm inadequate dikes and levees. Many climate scientists believe that in days to come governments will need to get better at shoreline defense. The world has 136 big, low-lying coastal cities with a total population of about 550 million people. All are threatened by the rising seas associated with climate change.
A study in Nature in 2013 estimated that, if no preventive actions are taken, annual flood costs in these cities could reach as much as a trillion dollars by 2050. Other research teams have arrived at similarly extreme estimates. Coastal flooding could wipe out up to 9.3 percent of the world’s annual output by 2100 (a Swedish-French-British team in 2015). It could create losses of up to $2.9 trillion in that year (a German-British-Dutch-Belgian team in 2014). It could put as many as one billion people at risk by 2050 (a Dutch team in 2012).
Some economists argue that these figures are exaggerated; indeed, I have cited the researchers’ worst-case scenarios, to emphasize the stakes. But the same economists also point out that some of the most threatened areas are irreplaceable parts of the world’s cultural and natural patrimony. Venice is an obvious example, but so are places like central London, New Orleans and the Mississippi Delta, the vast ancient complex of Chan Chan in coastal Peru, and the great Sundarbans mangrove forests in India and Bangladesh.
Suppose that the worst scenario has come to pass by 2050: Our civilization has cut the amount of carbon it sends into the air, but not by enough. Suppose further that climate sensitivity has turned out to be on the high side. In this scenario we are racing toward an increase in global temperatures of 7ºF, perhaps even 9ºF. (Nothing known today rules this out.) What to do? The waters are lapping higher.
Tomorrow’s leaders would face a dance of impossibilities: The extraordinary cost and effort of rapidly replacing the world’s energy infrastructure versus the extraordinary cost and effort of moving cities versus the extraordinary cost and effort of continuing on the same path. Facing this dizzying choice, who wouldn’t look for an escape hatch? To save the future, some would look at the past.
Two pasts, in fact. The first goes back to Mount Pinatubo, a volcanic eruption in the Philippines in 1991. The explosion shot gas, dust, and ash into the stratosphere. That plume of volcanic pollution contained at least 20 million tons of sulfur dioxide, a pungent, toxic gas. Water vapor in the stratosphere combined with it, producing shiny, microscopic droplets of sulfuric acid. Taken together, the journalist Oliver Morton has calculated, these aerosols had a surface area similar to that “of a large desert—definitely bigger than the Mojave, likely smaller than the Sahara.” Like an airborne desert of blazing white sand, the field of sulfuric acid droplets reflected sunlight into space. For two years the amount of sunlight that reached the surface dropped by more than 10 percent. Average global temperatures fell by about 1ºF.
Now perform some elementary arithmetic. The air today contains a bit more than 400 parts per million of carbon dioxide. Each part per million is equivalent to 7.8 billion tons of carbon dioxide. Four hundred parts per million times 7.8 billion comes to 3.1 trillion tons of airborne carbon dioxide. In 1880, before people began burning coal in large quantities, the carbon dioxide level was about 280 parts per million. Doing the same kind of multiplication, that is equivalent to 2.19 trillion tons of carbon dioxide in the air. Subtracting the preindustrial number from today’s number leads to the conclusion that our antic consumption of fossil fuels has added 0.91 trillion tons of carbon dioxide to the atmosphere. Round up to a trillion, for simplicity’s sake. The result of that trillion has been to raise global temperatures by about 1.4ºF (0.8ºC), with most of that warming since 1975. A bit more back-of-the-envelope arithmetic: 650 billion tons of carbon dioxide is, roughly speaking, equivalent to 1ºF of warming.
Pinatubo offset that one degree of warming with about 20 million tons of sulfur dioxide. Doing the arithmetic again, sulfur dioxide is, ton for ton, more than 30,000 times more effective at lowering temperatures than carbon dioxide is at raising them.
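The whole back-of-the-envelope chain can be checked in a few lines. The inputs are the article's own round numbers, so the outputs land within rounding of the figures quoted above:

```python
# The article's back-of-the-envelope CO2 bookkeeping, step by step.
TONS_PER_PPM = 7.8e9                  # tons of CO2 per part per million

now = 400 * TONS_PER_PPM              # ~3.1 trillion tons in the air today
then = 280 * TONS_PER_PPM             # ~2.2 trillion tons preindustrial
added = now - then                    # roughly a trillion tons added by us

warming_f = 1.4                       # observed temperature rise, degrees F
tons_per_degree = added / warming_f   # ~650 billion tons of CO2 per 1 F

pinatubo_so2 = 20e6                   # tons of SO2 that offset ~1 F
ratio = tons_per_degree / pinatubo_so2
print(f"{added / 1e12:.2f} trillion tons of CO2 added; "
      f"~{tons_per_degree / 1e9:.0f} billion tons per degree F; "
      f"SO2 is ~{ratio:,.0f}x more potent, ton for ton")
```

Run it and the ratio comes out above 30,000, which is all the "more than 30,000 times" claim requires.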
Actually, this understates the relationship. One ton of water in a single round blob has a surface area of roughly 50 square feet. Divide that same ton of water into droplets a few hundred-thousandths of an inch in diameter. The volume of water remains the same, but the surface area increases—to about two square miles. Shrink each droplet’s diameter by a further factor of five. Now the surface area is about 10 square miles—10 square miles of thinly spread mirror. (I have lifted this calculation from Morton’s fine book, The Planet Remade.) The smaller the droplets, the bigger the mirror; the bigger the mirror, the more the reflection.
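The geometry behind this is that N spherical droplets of diameter d carved from a volume V have total surface area 6V/d: halve the diameter and you double the mirror. A sketch, assuming one ton of water is about one cubic meter and taking a droplet diameter of about 1.2 microns (a few hundred-thousandths of an inch) as the starting size:

```python
import math

# Surface area of one ton (~1 cubic meter) of water: first as a single
# sphere, then dispersed into droplets. Total droplet area = 6 * V / d.
V = 1.0                                       # m^3, roughly one ton of water
r = (3 * V / (4 * math.pi)) ** (1 / 3)        # radius of the single blob
blob_area_ft2 = 4 * math.pi * r**2 * 10.764   # m^2 -> square feet

d = 1.2e-6                                    # droplet diameter, ~1.2 microns
droplet_area_mi2 = (6 * V / d) / 2.59e6       # m^2 -> square miles

print(f"single blob: ~{blob_area_ft2:.0f} sq ft")
print(f"as droplets: ~{droplet_area_mi2:.1f} sq mi")
print(f"diameter shrunk 5x: ~{droplet_area_mi2 * 5:.0f} sq mi")
```

The single blob comes out at roughly 50 square feet, the droplet cloud at roughly two square miles, and the five-times-smaller droplets at roughly 10 square miles, matching the chain above.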
The cooling from Pinatubo’s 20 million tons of sulfur dioxide was geophysical happenstance; the droplets it formed were not of the optimal size. By making smaller, more effective droplets, carbon-dioxide fighters could achieve the same reduction by spraying just a few million tons of sulfur dioxide into the air in a year. Actually, they would probably spray sulfuric acid directly, rather than having the atmosphere convert sulfur dioxide, but the principle is the same. The most direct method to accomplish this task would be to launch specialized delivery vehicles from the Earth, each with a payload of sulfuric acid.
Such vehicles are already aloft. They are called commercial jetliners. A new Boeing 747 carries as many as 600 passengers. The average weight of a US person is about 175 pounds. To make figuring easy, assume that each 175-pound passenger has 25 pounds of luggage and thus represents a unit of 200 pounds. Each 600-person flight on a 747 thus carries 120,000 pounds—60 tons—of human weight. To send aloft two or three million tons of sulfuric acid would require flying about 100 or so flights a day. That’s a rounding error in global aviation. Ryanair, an Irish budget airline, operates 1,800 flights a day; Alaska Air, a regional US airline, has almost 900. Recreating Pinatubo would require a service about a 10th the size of Ryanair or possibly a fifth of Alaska Air.
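The flight count above follows directly from the payload arithmetic; checking it with the article's own figures:

```python
# The jetliner arithmetic, checked: how many 60-ton payloads per day
# would loft 2-3 million tons of sulfuric acid in a year?
payload_tons = 600 * 200 / 2000       # 600 "passengers" at 200 lb = 60 tons
for annual_tons in (2e6, 3e6):
    flights_per_day = annual_tons / 365 / payload_tons
    print(f"{annual_tons / 1e6:.0f}M tons/yr -> "
          f"~{flights_per_day:.0f} flights/day")
```

The answer falls between about 90 and 140 flights a day, consistent with "about 100 or so."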
For better or worse, a fifth of an Alaska Air would not be expensive. One well-known estimate from 2012 suggested that just 14 big cargo aircraft—Boeing 747s, for example—could pull a Pinatubo for a little more than $1 billion a year. But commercial jets are not designed to fly into the stratosphere (the higher one places the sulfur, the longer it will stay aloft). Specially designed planes could be more effective and cost only two or three billion dollars a year to operate. The cost for a decade of counteracting most of the impact of carbon dioxide, the Harvard physicist David Keith has written, “could be less than the $6 billion the Italian government is spending on dikes and movable barriers to protect a single city, Venice, from climate-change-related sea level rise.”
To governments looking at rising seas, experiments with sulfur dioxide could seem worth the risk. Should carbon emissions not fall quickly enough, the idea goes, the world might dump sulfuric acid into the air for a couple of decades, buying enough time to finish the transition from fossil fuels. In theory, the injections could be focused on the skies above the poles, creating a reflective shield over the Arctic and Antarctic ice sheets. The goal would not be to eliminate all global warming, but to take the edge off, reducing it by a fourth or a third until it reaches the relatively safe level of 3º–4ºF.
Since the 1980s such plans to deliberately alter the Earth’s climate have been called “geoengineering.” Geoengineering fights climate change with more climate change. It is, in the jargon, a “technical fix.” It replaces the idea of staying within natural limits with the goal of creating a balance on terms set by humankind. It is an audacious promise to fix the sky.
Geoengineering is not a new idea. Ancient religions promised for millennia to control the weather by negotiating with heavenly powers. When the rise of science downgraded the role of priestly intercession, lunatics, impostors, and bunco artists filled the vacuum. Flotillas of phony rainmakers traveled through the 19th-century Middle West, taking advantage of drought fears to sell mysterious engines, bottles of vile, foamy liquids, and pamphlets filled with scientific-sounding gabble to credulous farmers.
Legitimate experiments in “cloud seeding”—sprinkling tiny crystals of dry ice in clouds, to stimulate raindrop formation—began in the 1940s. They effectively ended the reign of the con artists, but gave rise to another breed of fraud, the overly optimistic intellectual. Promising that “[w]e will change the Earth’s surface to suit us,” the physicist Edward Teller proposed that atom bombs be used to shake loose recalcitrant petroleum deposits, create a second Panama Canal, and manipulate weather patterns. The most wild-eyed schemes came from Moscow, where Soviet dreamers unfurled one grandiose, loopy stratagem after another. Melting Arctic ice by bombing it with soot. Building a causeway off Newfoundland to redirect the Gulf Stream. Damming the Congo River to irrigate the Sahara. Pumping warm water from the Japanese Current into the Arctic Ocean, shrinking the ice cap. Launching thousands of rockets full of potassium dust to create Saturn-like rings around the Earth that would somehow induce a “perpetual summer.”
From these brainstorms came proposals to offset climate change from carbon dioxide, the first in 2006, from the Nobel Prize–winning chemist Paul Crutzen. Gone were the days of Soviet-style gusto; Crutzen’s tone was anything but triumphant. “The very best would be if emissions of the [climate-changing] gases could be reduced so much the stratospheric sulfur release experiment would not need to take place,” he wrote at the end of his article. “Currently, this looks like a pious wish.” Others have echoed his tone. Geoengineering might be the culmination of technophile dreams of power and control, but its advocates have drawn back, chastened, from the implications; their support for geoengineering is mixed with regret and caution. Keith, the Harvard physicist, has likened it to chemotherapy for the planet—a treatment that nobody would want to have unless forced by circumstance, because it deliberately makes the patient sick to cure a greater disease. Tinkering with the atmosphere, in the phrase of writer Eli Kintisch, may be a bad idea whose time has come.
The potential pitfalls are many. Sulfur compounds would interact with stratospheric ozone, which protects surface-dwellers like us from the sun’s dangerous ultraviolet radiation. The sulfur would soon fall to the earth, contributing to lethal air pollution. (For this reason, some have suggested using particles of titanium, aluminum oxide, or calcite, which are more expensive but less likely to interact with ozone and unable to form acid.) Geoengineering may reduce temperatures globally, but there will still be local losers and winners—places that experience too much or too little rainfall, places subject to sudden temperature extremes. And no matter how much sulfur dioxide humankind throws into the heavens, the carbon dioxide will remain; indeed, to counteract the ever-increasing total, more sulfur must be launched into the air every year. Worse, stopping it suddenly would be disastrous; all the postponed warming would emerge in a few months.
The greatest danger posed by planet-hacking comes from its greatest virtue: its low cost. Wagner and Weitzman, the economists, call it a “free-driver” problem; driving the car is so cheap, anyone can take it for a spin. Spraying sulfur is cheap and easy enough that a single rogue nation could reengineer the planet by itself. Or two countries could separately decide to alter the climate in conflicting ways. In our time of economic inequality, the world seems aswarm with idle billionaires. “A lone Greenfinger, self-appointed protector of the planet and working with a small fraction of the Gates bank account, could force a lot of geoengineering on his own,” remarked the Stanford international-relations specialist David Victor.
Environmentalists go on tilt when they hear these ideas. Fighting pollution with pollution is, in their view, marching precisely in the wrong direction. Not only is it crazy to begin with, they argue, it’s a distraction from the urgent social reforms needed for the future. Worse, geoengineering forever desacralizes Nature; it puts the final seal on the replacement of the authentic, billion-year-old natural world by a new, artificial world whose every surface bears the greasy human fingerprint. But the specter of drought, flooded cities, and ruined ecosystems is so imminent that some have begun thinking, however anxiously, about their own form of geoengineering. Like the first kind of geoengineering, it is animated by a vision of the past. But this past is ancient: the end of the Carboniferous epoch.
Diverse plant and animal life has existed for about 550 million years. For almost all of that time, carbon dioxide levels were high—sometimes 20 times higher than they are now—and the world was, by today’s standards, unbearably hot. Only twice during this period has the world experienced prolonged periods of lower temperatures: our own epoch—more exactly, the last 50 million years—and the end of the Carboniferous.
The Carboniferous, one recalls, was the period in which large land plants emerged: lepidodendrons, horsetails, giant ferns, and a host of other now-vanished species. Forests grew in such proliferation that they sucked huge amounts of carbon from the air. Average temperatures fell from 75º–85ºF to something like 50ºF, lower than today’s average of 55º–60ºF—low enough to set off not one but two ice ages, killing huge numbers of plants and setting in motion the creation of coal.
Could natural systems be harnessed to suck carbon from the air? Why not create a new Carboniferous by covering the two biggest deserts in the world—the Sahara and the Australian outback—with trees? In 2009 three researchers—two at the NASA Goddard Institute for Space Studies, one at the Mount Sinai School of Medicine, all in New York—proposed just that. At bottom, the idea is easy to understand. Very roughly speaking, humankind emits 40 billion tons of carbon dioxide a year, mostly by burning fossil fuels. About 40 percent of the total is absorbed by plants, microorganisms, and the ocean.
Foresters have spent decades measuring the rates at which trees grow, which in turn is a measure of their capacity to take carbon dioxide out of the air. If one takes foresters’ measurements seriously, covering all 3.8 million square miles of the Sahara with drought-tolerant Eucalyptus grandis would suck roughly 20 billion tons of carbon dioxide from the atmosphere every year—enough to have a substantial impact on climate change, though its exact size would depend on the reaction of the oceans and land plants. Still more carbon dioxide could be tucked away by foresting the Australian outback, which is almost two-thirds the size of the Sahara.
Tree planting, advocates say, is simpler and less risky than high-tech schemes. Instead of flotillas of planes sprinkling poison into the sky, people should install cheap, natural carbon-eating mechanisms—trees—in equatorial deserts. Unlike carbon-capture plants, trees in carbon farms represent a direct solution to the problem of climate change. Adding sulfur to the air, geoengineer-style, would make the world less hospitable by harming the ozone layer. Planting trees in the Sahara, the Arabian desert, the Kalahari, or the Australian outback would make these parts of the Earth more habitable, even desirable. The trees would increase humidity, which in turn should increase rainfall. Land that is now sterile would become farmland for carbon, and then, possibly, just farmland.
All climate-change measures will involve people in developed nations paying a lot of money, Klaus Becker and Peter Lawrence of the University of Hohenheim, in Stuttgart, have contended. Now present those taxpayers with two alternatives, “one that requires the introduction of untried and potentially hazardous new technology on their own doorsteps, and one that involves the establishment of forests in underpopulated countries far away with possible related benefits for the local populations.”
To get an idea of what a massive reforestation project might be like, visit the Sahel. Technically, the name “Sahel” refers to the arid zone between the Sahara desert and the wet forests of central Africa—a broad east-west band that runs from Mauritania on the Atlantic through Burkina Faso, Niger, and Chad to Sudan on the Red Sea. Until the 1950s the Sahel was thinly settled. When the population boom began, people from the more crowded areas to its south shifted north, into the empty zone. City slickers moving into the sticks, they didn’t know how to work this dry land. In the 1960s problems were masked by unusually high rainfall. Then came two waves of drought, one in the early 1970s and a second, worse episode in the early 1980s. More than 100,000 men, women, and children died in the ensuing famine.
In Burkina Faso, an aid worker named Mathieu Ouédraogo assembled the farmers in his area to experiment with soil-restoration techniques, some of them traditions that Ouédraogo had read about in school. One of them was cordons pierreux: long lines of stones, each no bigger than a fist. Because the area’s rare rains wash over the crusty soil, it stores too little moisture for plants to survive. Snagged by the cordon, the water pauses long enough for seeds to sprout and grow in this slightly richer environment. The line of stones becomes a line of grass that slows the water further. Shrubs and then trees replace grasses, enriching the soil with falling leaves. In a few years, a minimal line of rocks can restore an entire field. As a rule, poor farmers are wary of new techniques—the penalty for failure is too high. But these people in Burkina were desperate and rocks were everywhere and cost nothing but labor. Hundreds of farmers put in cordons, bringing back thousands of acres of desertified land.
One of the farmers was Yacouba Sawadogo. Innovative and independent-minded, Sawadogo wanted to stay on his farm with his three wives and 31 children. “From my grandfather’s grandfather’s grandfather, we were always here,” he told me. Sawadogo laid cordons pierreux across his fields. He also hacked thousands of foot-deep holes in his fields—zaï, as they are called—a technique he had heard about from his parents. Sawadogo salted each pit with manure, which attracted termites. The insects dug channels in the soil. When rain came, water trickled through the termite holes into the ground, rather than washing away. In each hole Sawadogo planted trees. “Without trees, no soil,” he said. The trees thrived in the looser, wetter soil in each zaï. Stone by stone, hole by hole, Sawadogo turned 50 acres of desert waste into the biggest private forest for hundreds of miles.
To my untrained eye, his forest looked anything but miraculous: an undistinguished tangle of small trees and shrubs interspersed with waist-high grass. Then Sawadogo showed me a photograph of his land at the time of the drought: bare reddish soil, tufts of grass, a few dusty bushes. Not a tree in sight. For me to think his land looked undistinguished was like looking at a functioning automobile somebody built out of junk in the basement and sneering at the paint job.
At his home Sawadogo had a list of the tree species in his forest, compiled by a botanist in Ouagadougou, the capital. Atop the list was Jatropha curcas, a small, shrubby tree with nuts used to make fuel oil. In 2014 German researchers dug up jatropha trees from Luxor, Egypt, and measured their carbon content. They determined that an acre of desert jatropha warehouses the carbon from 209.5 tons of carbon dioxide every year. On average, each US citizen emits 18.7 tons of carbon dioxide per year; each German, 8.9; each Indian, 1.7. If jatropha carbon-storage values are typical, walking through Sawadogo’s 50-acre tree farm was like pushing through a crowd of 560 Americans, 1,175 Germans, or 6,160 Indians.
Unsurprisingly, the new techniques, uncomplicated and inexpensive, spread far and wide. The more people worked the soil, the richer it became, the more trees grew. Higher rainfall was responsible for part of the regrowth. Another contributing factor, possibly, was higher atmospheric carbon dioxide, which encourages plant growth, especially in dry areas. (Above a certain level, the benefits of higher carbon-dioxide levels to plant growth are outweighed by the negatives, including drought, heat stress, and enzyme failure.) But mostly the restoration of Burkina was due to the hands and hearts of thousands of men and women. Next door in Niger was even greater success, according to Mahamane Larwanou, a forester at Dioffo University in Niamey. With little or no support or direction from governments or aid agencies, farmers used picks and shovels to reforest more than 40,000 square miles, an area the size of Virginia.
The carbon farms envisioned by green geoengineers are much bigger and located in even drier areas. Initially, they would require irrigation. In many cases the water would have to come from desalination plants on the shore. At first the plants would probably run on solar energy; after about three years, they could be driven by trimmings, leaves, and nuts from the trees. Studies suggest that the trees could generate enough power to supply their own water—they would be, so to speak, sustainable. After several decades, carbon farmers would harvest the trees and replace them with new, fast-growing saplings. All of this would be expensive, but all carbon remediation schemes are expensive. It is not ridiculous to imagine that the economic activity from making the Sahara habitable would offset some of the costs.
The old trees could be “pyrolized”—burned in low-oxygen environments, which turns them into charcoal. Depending on how it is produced, charcoal typically retains about two-thirds of its original carbon. The charcoal can be ground and buried, enriching the soil. Desert soils tend not to hold nutrients and organic matter because they are made from types of dirt that don’t bind to them chemically; any precipitation washes them away. Over time, buried charcoal slowly oxidizes, providing the requisite binding sites. Nutrients and organic matter “stick” to it, providing food for the bacteria, fungi, and other microorganisms that make soil fertile. Charcoal, properly manufactured and deployed, can dramatically improve bad farmland. It also stores carbon: Johannes Lehmann, a charcoal-soil expert at Cornell University, has calculated that turning residues from agriculture and logging into charcoal could offset as much as an eighth of the world’s carbon dioxide output if the gases from charcoal-making were captured and turned into fuel. The figure is higher if the climate-changing gases methane and nitrous oxide, emitted by agricultural waste in rice paddies, are included. Presumably these techniques could be applied in carbon farms.
Naturally these ideas have attracted critics. The forests would destroy desert ecosystems, they say. Or they would require large numbers of people to radically change the way they live. Or they amount to green imperialism—forcing poor people in desert areas to offset the emissions of faraway rich people. Environmentalists say this kind of reforesting works best when it is bottom-up, harnessing the willing participation of people like Yacouba Sawadogo. The dilemma: either one must coordinate the actions of millions of people to have an impact, or one must create processes that involve so few people that they cannot be controlled.
What to do, in a world brimming with fossil fuels? In climate change, all choices involve leaps into the unknown. Claims that this or that measure cannot be economically viable generally amount to saying, I prefer the unknown risks associated with this course rather than the unknown risks associated with that course because the first leads to a future that I like better. At bottom, the choices stem from private images of the good life—a life in which people are tied to the land or free to roam the skies. Only individuals can choose. The important thing is that they have choices, and we are still at the stage where, however dimly, we can see them working.