TIME health

The Medieval Black Death Made You Healthier—If You Survived

The Black Death killed as much as half of Europe's population Photo by Science & Society Picture Library/SSPL/Getty Images

The plague was horrific, could strike without warning and killed tens of millions in 14th-century Europe. But paradoxically, the population that survived ended up better off, with higher wages, cleaner living conditions and healthier food

Game of Thrones doesn’t tell you the half of it. Life during the Middle Ages was nasty, brutish and short. That was especially true during what became known as the Black Death. The widespread outbreak of plague struck between 1347 and 1351, killing tens of millions of people and wiping out 30 to 50% of Europe’s population. The disease itself was horrific. “In men and women alive,” wrote the Italian poet Giovanni Boccaccio, “at the beginning of the malady, certain swellings, either on the groin or under the armpits…waxed to the bigness of a common apple, others to the size of an egg, some more and some less, and these the vulgar named plague-boils.” And it seemed to strike indiscriminately and without warning. People could be healthy in the morning and dead by evening.

The upside, if you can call it that, is that the plague left in its wake populations that were healthier and more robust than those who lived before it struck, according to a new study published today in PLOS ONE. “The Black Death was a selective killer,” says Sharon DeWitte, a biological anthropologist at the University of South Carolina and the author of the paper. “And after the Black Death ended, there was actually an improvement in the standard of living.” The plague was natural selection in action.

In a way, that’s a marker of how brutal the medieval era was. It took a serial killer of a plague to actually bring about an improvement in living conditions. If that sounds counterintuitive, think about how life might have changed after half of Europe’s population died off. Suddenly there was a dramatic drop in the number of able-bodied adults available to do work, which meant survivors could charge more for their labor. At the same time, fewer people meant a decreased demand for foods, goods and housing—and as a result, the prices for all three dropped. By the late 15th century, real wages were three times higher than they were at the beginning of the 14th century, before the plague struck. Diets improved as employers were forced to raise wages and offer extra food and clothing to attract workers. As a result, the money spent per capita on food in the wake of the Black Death actually increased. “People were able to eat more meat and high-quality bread, which in turn would have improved health,” says DeWitte.

But the clearest evidence that people were healthier after the Black Death than they were before it comes from the bodies themselves. DeWitte looked at skeletal samples taken from medieval cemeteries in London both before the plague and after it. She found that post-Black Death samples had a higher proportion of older adults, and that mortality risks were generally lower in the post-Black Death population than before the epidemic. In other words, if you were strong and lucky enough to survive one of the deadliest epidemics in human history, you were probably strong enough to live to a relatively ripe old age. And since the Black Death was so widespread, that was true for the surviving population as a whole.

Earlier studies looking at historical documents like diaries, letters and wills from the time period had shown conflicting results, but that kind of data only covers the very small part of the population that was literate, male and relatively well off. The advantage of DeWitte’s grave-combing bioarchaeological research methods is that they encompass a much more representative swath of the medieval population. “This provides information about the people who are missing from historical documents, including women and children,” says DeWitte. Not everyone in medieval London left a will behind—but everyone left a corpse.

So for survivors, life after the Black Death would have been at least a little less nasty, brutish and short than life before it. But that doesn’t mean the survivors were really the lucky ones. The Black Death was a period of unremitting horror and terror, the likes of which we can’t imagine. No one knew how the disease spread, or how to treat it. Popular but gruesome methods like blood-letting or boil-lancing would have been counterproductive at best, assuming victims could find anyone to treat them. Doctors abandoned their patients for fear of infection, and priests even refused to give last rites to the dying—an appalling dereliction given medieval fears of eternal damnation. Even animals like sheep, cows and pigs fell victim to the disease. “The people who survived the Black Death would have lost everyone they knew,” says DeWitte. “They’re the people I feel sorry for.” If the Black Death really was natural selection at work, it was the cruelest form imaginable.

TIME Environment

Carbon Pollution Could Make Your Sandwich Less Healthy

Nutrient levels in crops like wheat could fall as CO2 rises Shawn Baldwin/Bloomberg via Getty Images

Add one more thing to worry about with climate change: the more CO2 we put into the atmosphere, the fewer nutrients many crops will have

The massive National Climate Assessment (NCA) that came out yesterday was full of sobering lessons about the way that human-caused global warming is changing life around us. That includes human health: the report found that rising temperatures could exacerbate air pollution and allergies, including asthma, while worsening wildfires and killer heat waves. More extreme weather—including more frequent heavy downpours—can raise the risk of food- and waterborne illness, and allow disease-carrying pests like deer ticks and mosquitoes to expand their range.

Now a new study published today in the journal Nature offers the most direct evidence yet of a significant health threat associated with climate change: less-nutritious crops. Researchers led by Dr. Samuel Myers at the Harvard University School of Public Health looked at how rising levels of the greenhouse gas carbon dioxide will impact staple foods like wheat, maize and soy. They found that as CO2 increases, the levels of vital minerals like zinc and iron will decline. Some 2 billion people around the world already suffer from zinc and iron deficiencies—resulting in a loss of 63 million life years annually. Elevated levels of CO2 will make that malnutrition even worse. “These crops are an important source of dietary zinc and iron for people who are near the nutritional threshold,” says Myers. “From a human health standpoint this could be very important.”

Researchers grew crops at different test sites—some sites had CO2 levels close to the levels we see today while others had levels we’re likely to reach by mid-century if the world keeps burning fossil fuels at an unsustainable rate. Earlier studies have found that elevated levels of CO2 could depress nutrient levels, but they were carried out in greenhouses or closed chambers, which might have skewed results. The experiments Myers and his colleagues drew on used free air carbon dioxide enrichment (FACE) technology, which allowed the crops—dozens of different types of grains, rice, soybeans and field peas—to be grown with variable levels of CO2 in open fields, as they would in the real world. “These FACE experiments are the gold standard,” says Myers.

Elevated CO2 levels affected different crops in different ways. Zinc, iron and protein concentrations in wheat grown at high CO2 sites fell by 9.3%, 5.1% and 6.3%, respectively. Field peas and soybeans also lost zinc and iron as CO2 rose. Maize and sorghum plants, however, showed less sensitivity to changing levels of CO2. And even within a crop like rice, different cultivars or genotypes were more or less resistant to the effects of CO2. Nor is it exactly clear why rising concentrations of CO2—which on the whole help fuel plant growth—might result in lower concentrations of important nutrients, though it’s possible the increased CO2 might be washing out some of those elements.

What we do know is that malnutrition would worsen if the staple crops the world’s poorest people depend on become less nutritious as CO2 levels rise. One way to adapt to that threat, of course, would be to slow the increase of CO2 emissions, which would also have the useful side effect of slowing disastrous climate change. Farmers can also try to grow varieties of staple crops that have shown more resistance to high CO2 levels, while scientists can begin the work of breeding crops that are more resilient to carbon. Microsupplements of essential nutrients like zinc, which bolsters the immune system, could fill the gap as well. But there’s no getting around the fact that we already struggle to properly feed the 7 billion-plus people who live on the planet. Higher CO2 levels and warmer temperatures will just make that task tougher.

For the first time in at least 800,000 years—if not far longer—average carbon dioxide concentrations in April stayed above 400 ppm for the entire month. Barring some way of cheaply removing CO2 from the atmosphere, those levels will only go up for the foreseeable future. “We are radically changing the entire global environment,” warns Myers. “We are moving into a set of environmental conditions we haven’t adapted to, and there will be impacts that affect our well-being that will be hard to anticipate.” We are entering a new world, even if it is one of our own making. Expect a bumpy ride ahead.

TIME Environment

National Climate Report Is a Study in Extremes

A car sits in dried and cracked earth of what was the bottom of the Almaden Reservoir on Jan. 28, 2014 in San Jose, California. Justin Sullivan—Getty Images

The newly released National Climate Assessment grimly shows that warming is already upon us and extreme weather could become the norm

The White House pulled out all the stops for today’s rollout of the new National Climate Assessment (NCA), including making President Obama available to talk to local and national weather people about global warming. The report itself — download the whole 839-page paper here — is an incredibly impressive piece of work, detailing the current impacts and projected effects of global warming in the U.S. across a range of geographic regions and economic sectors. Even better is the government website dedicated to the NCA, which offers fascinating interactive and multimedia tools to help anyone see how climate change will affect their life, their community and their country. The entire document is much easier to understand — and much bolder — than the increasingly antiquated Intergovernmental Panel on Climate Change assessments. If the U.S. were as good at stopping climate change as we are at studying it, we’d have nothing to fear.

But we’re not—and we do. It’s worth exploring the NCA on your own — start with the highlights — but what struck me is this: to understand what climate change has done and will do to the U.S., you need to understand the extremes. There’s something about the very term “global warming” that makes it seem as if climate change is something that will happen gradually and uniformly, like boiling a pot of water. The NCA finds U.S. average temperatures are expected to rise 2°F (1.1°C) to 4°F (2.2°C) over the next few decades, which on the face of it can seem easy to adapt to. The difference between an 83°F (28.3°C) and an 87°F (30.6°C) summer day is barely noticeable.

But those averages can hide dramatic changes in extremes. Heat waves have become more frequent across the U.S. in recent decades, with western regions setting records in the 2000s, while the number of extreme cold waves has reached the lowest level on record. The number of record low monthly temperatures has declined to the lowest level since 1911, while the number of record high temperature days has increased to the highest level since the 1930s. And that’s expected to worsen — by the end of the century, what would have previously been once-in-20-year extreme hot days are projected to occur every two or three years across much of the country.

That’s true for precipitation as well. On average, precipitation is expected to increase across the country, which makes sense — warmer air can hold more water. But increasingly that rainfall is coming in very heavy precipitation events. (That’s a once-in-20-year day of rainfall.) In the Northeast, Midwest and upper Great Plains, the amount of rain falling in very heavy precipitation events is more than 30% above the 1901–60 average. If carbon emissions keep growing, those extreme precipitation events could occur up to five times more often. Even in regions where total precipitation is expected to decrease — like the parched Southwest — what rain does fall is more likely to fall in heavy events. “It’s not the average changes we’ll notice,” said Jerry Melillo, the chairman of the National Climate Assessment Committee, at the White House event this afternoon. “It’s the extremes.”

That’s because it’s extreme weather that really tests our resilience. A prolonged heat wave leads to a spike in electricity demand as people turn up their air conditioning, which in turn can stress out our vulnerable electrical grid, leading to brownouts and blackouts. Those who don’t have access to cooling — especially the elderly and the poor — are at direct risk for heat-related health conditions. Extreme precipitation events — like the one that struck much of the Southeast last week — can lead to devastating floods, which have been on the increase in the eastern Great Plains, parts of the Midwest and much of New England. The inland floods from Hurricane Irene were devastating for much of the Northeast, destroying farms and infrastructure. Those costs will compound over time as we keep adding greenhouse gases into the atmosphere.

The red-carpet rollout of the NCA wasn’t by accident — later this year Obama’s Environmental Protection Agency (EPA) will put forward regulations designed to curb carbon emissions from existing power plants. It’s in his interest to make the scientific threat of climate change crystal clear — and the NCA does that. But the science is the easy part. “We all have to come together and turn these words into actions,” said National Oceanic and Atmospheric Administration head Kathryn Sullivan at the White House event. That’s the tough part.

TIME climate change

Obama Administration Releases Major Climate Change Report

Superstorm Sandy showed the dangers of climate change Scott Eells—Bloomberg/Getty Images

A new report released by the Obama administration details the heavy toll of climate change on the U.S. and what may happen if it's not addressed. The findings are especially bad for California and Alaska, which will experience severe drought and melting

The Obama administration released a wide-ranging climate change report on Tuesday, laying out exactly what impact the changing climate is having on the U.S., and what could happen if it isn’t addressed.

The third National Climate Assessment (NCA), a kind of Intergovernmental Panel on Climate Change report focused on the U.S., is the product of years of work by over two hundred climate scientists. A review draft was released last year, but the report has now been signed off on by the federal National Climate Assessment and Development Advisory Committee.

The White House will roll out the report’s findings with a PR blitz today, the latest signal that President Barack Obama is placing a fresh emphasis on climate change during his second term in office.

Among the NCA’s findings, four critical conclusions stand out:

1. The Southwest will bake: California’s epic drought has done more than anything else this year to draw attention to the threat of global warming—even if the climate history of the West has shown evidence of decades-long droughts even before humans started pouring carbon dioxide into the atmosphere. But climate change will only make it worse—the report predicts that the entire region, including states like California and Arizona, will get hotter, and the southern half of the region will get much drier, prompting major wildfires. Given that population in the Southwest has been growing rapidly in recent years, warming will only increase the decades-old competition for water in the West.

2. Alaska will melt: The Arctic is the fastest warming part of the world, which is why Alaska—America’s Arctic state—has been heating up more than twice as rapidly as the rest of the country over the past 60 years. That trend will likely continue in the future, which might sound like good news in a place where winters can last for more than six months. But the retreating summer sea ice, shrinking glaciers and melting permafrost will radically change Alaska—and especially Inuit communities that have lived on the land for thousands of years.

3. Coastlines will be in danger: Superstorm Sandy provided a very expensive preview of what happens when a powerful storm and rising sea levels meet over some of the biggest and richest cities in the world. Even if climate change doesn’t end up making Atlantic hurricanes more powerful or more frequent—and the jury is still out on that question—sea levels will continue to rise. That will put the 164 million Americans who live in coastal counties—more than half the country, and growing by 1.2 million a year—at intensified risk from flooding.

4. Agriculture will be resilient… at first: Many farmers should actually be able to adapt relatively well to warming for the next 20 to 25 years, in part because increased CO2 concentrations and longer growing seasons can benefit some crops. But that won’t last for long—as warming intensifies, the negative impacts on crops and livestock will begin to outweigh the positive ones, especially in drought-stricken farming regions.

The release of the report comes as the Obama administration begins to renew its commitment to climate change. The White House released its Second National Climate Assessment (NCA) in 2009, only months after the president’s election. The document didn’t get a whole lot of attention, but that didn’t matter much to environmentalists. 2009 was the high-water mark for climate action in Washington. Obama had promised that his election would mark “the moment when the rise of the oceans began to slow,” and that June, national carbon cap-and-trade legislation had narrowly passed the House of Representatives. The wonky work of the NCA, a Congressionally mandated program that pulls together the latest science on how climate change will impact the U.S., seemed important, but also a bit beside the point. The scientific debate was over—and environmentalists believed the time was ripe for the U.S. to finally lead on climate change.

It didn’t work out that way. Cap-and-trade stalled and eventually died in the Senate in 2010, and when Republicans took back the House during midterm elections later that year, hopes for national climate legislation evaporated. While Obama could claim meaningful environmental wins—toughening fuel efficiency standards and channeling billions of dollars to renewable energy—many environmentalists believed he had turned his attention away from one of the most dire threats facing the country and the world.

But beginning with his second inaugural address in January 2013, marked by a promise to “respond to the threat of climate change,” Obama has renewed his focus on global warming. The big showdown will come later this year, when the Environmental Protection Agency (EPA) puts forward controversial rules that will curb greenhouse gas emissions from power plants, but this morning provides fresh evidence of the emphasis Obama is now putting on global warming. The release of the Third NCA will be marked by one-on-one Presidential interviews with local and national meteorologists—still a trusted source on climate science for many Americans, even if they shouldn’t always be—and the presence of top White House officials at the rollout later this afternoon. The NCA is no longer a sideshow—to a White House searching for a climate win, it’s the main event.

 

TIME health

MERS Shows That The Next Pandemic Is Only a Plane Flight Away

A single patient seeded Hong Kong's SARS outbreak in 2003 Peter Parks—AFP/Getty Images

On Feb. 21, 2003, a 64-year-old Chinese physician named Liu Jianlun traveled to Hong Kong to attend a wedding. He stayed in room 911 on the ninth floor of the Metropole Hotel. Liu, who had been treating cases of a mysterious respiratory disease in the neighboring Chinese province of Guangdong, was already sick when he arrived in Hong Kong, and the next day he checked into the city’s Kwong Wah Hospital. Liu died on Mar. 4 of the disease doctors soon named Severe Acute Respiratory Syndrome, or SARS. But before he died, he inadvertently infected at least 16 people who spent time on the ninth floor of that hotel.

Some of those people boarded international flights before they knew they were sick, seeding new outbreaks in places like Vietnam, Taiwan and Singapore. SARS had been confined to southern China for months, but once Liu checked into the Metropole Hotel, it was only a matter of time before the first new infectious disease of the 21st century went global. Before it was stamped out months later, SARS had infected 8,273 people and killed 775 in 37 countries.

It’s that chain of events that must have been on American officials’ minds last week when news broke that the U.S. had its first case of Middle East Respiratory Syndrome (MERS). A male health care provider had been in Saudi Arabia, the epicenter for the ongoing MERS outbreaks, before flying to Chicago via London on Apr. 24. After arriving in Chicago, he took a bus to the Indiana town of Munster, where on Apr. 28 he was admitted to the hospital and was eventually diagnosed with MERS. A deadly respiratory disease that has already infected hundreds, almost all in Saudi Arabia, and killed over 100 people had come to the U.S.

CDC officials played down the larger threat of the first U.S. MERS case. “In this interconnected world we live in, we expected MERS to make its way to the U.S.,” Dr. Anne Schuchat, director of the CDC’s National Center for Immunization and Respiratory Diseases, told reporters on May 2. “We have been preparing for this.” CDC officials will contact and track individuals who might have been close to the patient — including health workers who treated him and fellow travelers on his international flights and his bus ride to Munster — just in case any of them develop MERS symptoms. That’s not likely. So far MERS hasn’t shown much ability to spread easily from person to person, so the threat to the larger U.S. public is probably very small.

But if that Indiana case remains isolated — and MERS itself never becomes the global health threat that SARS was — it only means we were lucky.

As Schuchat put it, exotic, emerging diseases are now “just a plane’s ride away.” In the past, before international air travel became common, emerging pathogens could begin infecting people but remain geographically isolated for decades. Scientists now think that HIV was active among people in Central Africa for decades before it really began spreading globally in the 1970s, again thanks largely to international air travel. Today there’s almost no spot on the planet — from the rainforests of Cameroon to the hinterland of China — so remote that someone couldn’t make it to a heavily populated city like New York or Hong Kong in less than 24 hours, potentially carrying a new infectious disease with them.

The surest way to prevent the spread of new infectious disease would be to shut down international travel and trade, which is obviously not going to happen. The occasional pandemic might simply be one of the prices we pay for a globalized world. But we can do much more to try to detect and snuff out new pathogens before they endanger the health of the planet.

Because most new diseases emerge in animals before jumping to human beings (the virus that causes MERS seems to infect humans mostly via camels, though bats may be the original source), we need to police the porous boundary between animal health and human health. That work is being done by groups like Global Viral (whose founder I profiled in November 2011), which is creating an early warning system capable of forecasting and containing new pathogens before they fuel pandemics. But as the stubborn spread of MERS shows, that’s easier said than done — especially if diseases emerge in countries that have less-than-open political systems.

Because as it turns out, the driving factor behind the spread of new diseases isn’t just globalization. It’s also political denial. SARS was able to spread beyond China’s borders in part because the Chinese government initially covered up the outbreaks — at one point even driving SARS patients around Beijing in ambulances to hide them from an international health team. Meanwhile, the autocratic Saudi government has made life difficult for researchers studying MERS. Much the same thing happened when the avian flu virus H5N1 began spreading in Southeast Asia in 2004. In every case, a rapid and public response might have contained those viruses before they threatened the rest of the world.

Eleven years later, Hong Kong’s Metropole Hotel is now called the Metropark, and Liu Jianlun’s infamous room 911 doesn’t exist anymore. After SARS, hotel management changed the number to 913 in an attempt to scrub out the past. Denial is always so tempting. But in an interconnected world, where the travel plans of a single person can seed deadly outbreaks a continent away, it’s no longer an option.

TIME

Southern California Blaze Kicks Off What Could Be Especially Dangerous Wildfire Season

A fire crew uses their deck gun to cut down an aggressive branch of the Etiwanda Fire in Rancho Cucamonga, Calif., on April 30, 2014. David Bro—Zuma Press

Rising temperatures and a prolonged drought have prepped the Golden State for what could become one of the most severe and dangerous wildfire seasons on record, beginning with the Etiwanda Fire, which firefighters have 53% contained

As he looks ahead to summer, firefighter Steve Abbott is worried about the down and dead. The term refers to the dry, lifeless leaves and branches that are explosive fuel for wildfires and which are more abundant in California this year thanks to an unprecedented drought that has gripped the state. “The combination of temperatures and fuel adds to our concern,” says Abbott, one of more than 500 firefighters now battling what’s known as the Etiwanda Fire in San Bernardino County east of Los Angeles.

The fire, which started on April 30, has burned about 1,600 acres and was 53% contained by Thursday evening. In addition to the drought conditions and temperatures that climbed above 90°F (32°C) in Southern California this week, fierce Santa Ana winds helped propel the blaze and prevented fire crews from fighting it from the air. Although the fire has not yet destroyed any structures, Etiwanda is effectively opening night for a wildfire season that fire officials say could be one of the most severe and dangerous on record—and a preview of what life in a hotter and drier world could be like for Californians.

That’s because the Golden State is primed to burn. California is suffering through its most severe dry spell in decades, with the entire state now in some category of drought. At the beginning of May the snowpack level in the Sierra Nevada mountains—a key source of stored water—was just 18% of normal. This winter, meanwhile, was the warmest on record for the state. The drought and the heat mean that plants and trees haven’t grown as many green leaves as usual. Those leaves help trees maintain moisture—and without them, the plants are that much more likely to ignite in a blaze. And it might not even take a fire to kill some of these parched trees. “If you don’t have the vegetation receiving water, not only do you have lower humidity levels in the plants, but some of the trees will actually die,” says Carlos Guerrero, a Glendale, Calif., fire captain and a spokesman for the multi-agency unified command battling the Etiwanda Fire. Dead trees mean even more fuel on the ground as the height of the summer wildfire season approaches.

Guerrero and his fellow firefighters are getting the Etiwanda blaze under control—the mandatory evacuation orders announced after the fire began on Apr. 30 were lifted by the next day. But the changing climate means that the threat from wildfires is likely to only increase in the months and the years to come, in California and in much of the rest of the West. A study published last month in the journal Geophysical Research Letters found that the number of large wildfires in the West had increased by a rate of seven fires a year from 1984 to 2011, while the total area burned had increased at a rate of nearly 90,000 acres a year. Since 2000 more than 8 million acres have burned during six separate years. Before 2000, no year had seen 8 million acres burned. The authors connected the increase to climate change, as did the researchers behind a 2012 study in Ecosphere that predicted that global warming would likely cause more frequent wildfires in the Western U.S. within the next 30 years. Even the most recent report from the Intergovernmental Panel on Climate Change, considered the gold standard for climate science, concluded that there was high confidence that global warming was already intensifying wildfires in the West.

Climate change isn’t the only factor behind the increasing wildfires in California and the West. Successful firefighting in the past has allowed some forests to grow beyond their natural limits, ironically providing more fuel for megafires. And the number of people who have moved to areas that border wild land has increased as well. Given that most wildfires are started by human beings—either intentionally or by accident—more people near a forest means more chances for forest fires.

For people like Mia Hidayat, who lives in a housing development near the border of the Etiwanda Fire, that means the simple sight of dry brush and bushes in her neighborhood has taken on a new danger. “I’m afraid,” says Hidayat. As California’s wildfire season grows, many others are sure to feel the same.

TIME

The Seismic Link Between Fracking and Earthquakes

Environmentalists fear that fracking could cause more quakes if it expands to California Photo by David McNew/Getty Images

New research indicates that wastewater disposal wells—and sometimes fracking itself—can induce earthquakes

Ohio regulators did something last month that had never been done before: they drew a tentative link between shale gas fracking and an increase in local earthquakes. As fracking has grown in the U.S., so has the number of earthquakes—there were more than 100 recorded quakes of magnitude 3.0 or larger each year between 2010 and 2013, compared to an average of 21 per year over the preceding three decades. That includes a sudden increase in seismic activity in usually calm states like Kansas, Oklahoma and Ohio—states that have also seen a rapid increase in oil and gas development. Shale gas and oil development is still growing rapidly—it expanded more than eightfold between 2007 and 2012—but if fracking and drilling can lead to dangerous quakes, America’s homegrown energy revolution might be in for an early end.

But seismologists are only now beginning to grapple with the connection between oil and gas development and earthquakes. New research being presented at the annual meeting of the Seismological Society of America this week shows that wastewater disposal wells—deep holes drilled to hold hundreds of millions of gallons of fluid produced by oil and gas wells—may be changing the stress on existing faults, inducing earthquakes that wouldn’t have happened otherwise. Those quakes can occur tens of miles away from the wells themselves, further than scientists had previously believed. And they can be large as well—researchers have now linked two quakes in 2011 with a magnitude greater than 5.0 to wastewater wells.

“This demonstrates there is a significant hazard,” said Justin Rubinstein, a research geophysicist at the U.S. Geological Survey. “We need to address ongoing seismicity.”

Rubinstein was speaking on a teleconference call with three other seismologists who have been researching how oil and gas development might be able to induce quakes. All of them noted that the vast majority of wastewater disposal sites and oil and gas wells weren’t connected to increased quake activity—which is a good thing, since there are more than 30,000 disposal wells alone scattered around the country. But scientists are still trying to figure out which wells might be capable of inducing strong quakes, though the sheer volume of fluid injected into the ground seems to be the driving factor (that’s one reason why hydraulic fracturing itself rarely seems to induce quakes—around 5 million gallons, or 18.9 million L, of fluid is used in fracking, far less than the amount of fluid that ends up in a disposal well).

“There are so many injection operations throughout much of the U.S. now that even though a small fraction might induce quakes, those quakes have contributed dramatically to the seismic hazard, especially east of the Rockies,” said Arthur McGarr, a USGS scientist working on the subject.

What scientists need to do is understand that seismic hazard—especially if oil and gas development in one area might be capable of inducing quakes that could overwhelm structures that were built for a lower quake risk. That’s especially important given that fracking is taking place in many parts of the country—like Oklahoma or Ohio—that haven’t had much experience with earthquakes, and where both buildings and people likely have a low tolerance to temblors. Right now there’s very little regulation regarding how oil and gas development activities should be adjusted to reduce quake risk—and too little data on the danger altogether.

“There’s a very large gap on policy here,” says Gail Atkinson, a seismologist at the University of Western Ontario. “We need extensive databases on the wells that induce seismicity and the ones that don’t.”

So far the quakes that seem to have been induced by oil and gas activity have shaken up people who live near wells, but haven’t yet caused a lot of damage. But that could change if fracking and drilling move to a part of the country that already has clear existing seismic risks—like California, which has an estimated 15 billion barrels of oil in the Monterey Shale formation that could only be accessed through fracking (limited fracking has been done in California, but only in the lightly populated center of the state). Environmentalists who seek to block shale oil development in the Golden State have seized on fears of fracking-induced quakes, and a bill in the state legislature would establish a moratorium on fracking until research shows it can be done safely.

Regulation is slowly beginning to catch up. In Ohio, officials this month established new guidelines that would allow regulators to halt active hydraulic fracturing if seismic monitors detect a quake with a magnitude of 1.0 or higher. But it will ultimately be up to the oil and gas industry to figure out a way to carry out development without making the earth shake.

“I am confident that it is only a matter of time before we figure out how to exercise these technologies in a way that avoids significant quakes,” says Atkinson. Otherwise the fracking revolution may turn out to be short-lived.

TIME

It’s Time to Stop Ignoring the Bad Air We Breathe

Nearly half of Americans breathe unhealthy air Photo by David McNew/Getty Images

A survey shows nearly half of all Americans breathe unhealthy air — but air pollution doesn't get the attention it deserves

Take a look outside your window. Chances are the air you’ll see is far cleaner than it was decades ago. Since 1980 levels of ozone pollution — one of the main ingredients in smog — have fallen by 25% in the U.S., while nitrogen dioxide has fallen by 55% and sulfur dioxide by 78%. The change is visible too — the smog-obscured skies that were once a constant backdrop to cities like Los Angeles in the 1960s and ’70s are far less common. It’s easy to assume that America won the war on air pollution, and to look with pity on developing cities like Beijing and New Delhi where the air is still poisoned.

There’s just one problem with that sense of satisfaction: the data doesn’t back it up. According to a new report from the American Lung Association (ALA), nearly 148 million Americans live in areas where smog and soot particles have led to unhealthy levels of pollution. That means that for almost half of all Americans, simply breathing can be dangerous. Even worse, the report shows that some aspects of air quality have been deteriorating over the past few years in many cities — from 2010 to 2012, ozone worsened in 22 of the 25 biggest metropolitan areas, including cities like New York and Chicago. “Air pollution is not just a nuisance or the haze we see on the horizon; it’s literally putting our health in danger,” Bonnie Holmes-Gen, senior policy director of the ALA in California, told the Los Angeles Times. “We’ve come a long way, but the status quo is not acceptable.”

The news is far from all bad. Thanks in part to the retirement of a number of older coal-fired power plants, levels of particulate pollution — soot, in other words — have been dropping in recent years, with cities like Philadelphia and Indianapolis recording their lowest levels yet. And historically, we’re far better off — as Brad Plumer notes over at Vox, air pollutants as a whole have fallen 72% since the Clean Air Act was passed in 1970, even as the economy, population and energy use have all risen.

But as the ALA report makes clear, some of that progress is being lost, in part thanks to climate change — one environmental challenge we’re very much not meeting. Rising levels of ozone pollution have been linked to warmer temperatures, which will make it that much tougher to fight smog in the future. And the government could have done more — in 2011, President Obama went against the recommendations of the Environmental Protection Agency (EPA) and rejected a proposal that would have tightened the ozone standard to between 60 and 70 parts per billion. (The level is currently at 75 ppb, set by former President George W. Bush, who was not exactly known as an environmental paragon.)

Those regulatory battles matter because it’s becoming increasingly clear that healthy air is a moving target. The more scientists learn about the health impacts of air pollution, the more dangerous it appears — even at comparatively low levels. Last October the World Health Organization (WHO) officially declared air pollution to be a carcinogen, connecting it directly to lung cancer as well as bladder cancer. And bad air doesn’t just hurt the lungs — a raft of studies have connected air pollution, especially soot, to cardiovascular disease, even triggering heart attacks. Even autism has been linked to pollution. In March the WHO estimated that outdoor air pollution caused 3.7 million premature deaths globally in 2012 — nearly three times the number of people who die each year from tuberculosis.

Climate change gets most of the environmental attention, with reason — its effects are already being felt, and it has the potential to radically change our world for the worse. But air pollution is sickening and killing millions of people around the world right now. And unlike global warming, the technological and regulatory solutions to conventional air pollution already exist. That’s why it was good news yesterday when the U.S. Supreme Court upheld the EPA’s ability to control coal-fired power-plant emissions in 28 states. The decision excited greens because it indicates the court will eventually back even more controversial carbon regulations that the Obama White House is busy formulating now, but the regulation that was upheld — the Cross-State Air Pollution Rule — will prevent an estimated 45,000 deaths a year from conventional air pollution once it’s in place.

Air pollution remains stubbornly difficult to eliminate, in part because of the vagaries of the wind itself, which separates the victims of pollution from its source. As Justice Ruth Bader Ginsburg wrote in her decision yesterday, quoting from the Gospel of John: “The wind bloweth where it listeth, and thou hearest the sound thereof, but canst not tell whence it cometh, and whither it goeth.” But if we can’t control the air, we can control what we put into it — and protect ourselves.

 

TIME Environment

From ‘Gale’ to ‘Inconceivable,’ Ranking Tornado Strength

Deadly tornadoes devastated the town of Vilonia, Arkansas on Apr. 27 Mark Wilson/Getty Images

As tornadoes blast across the southeastern U.S., a look at how officials gauge just how powerful a killer twister is

Tornado season began with a crash in the southeastern U.S. this week, where dozens of twisters ripped across Mississippi, Arkansas and Alabama. At least 29 people have died in the storms — and with more tornadoes forecast as the weather system moves further east, that number will almost certainly rise.

It’s the suddenness of tornadoes, as much as their power, that accounts for the lives they take. Meteorologists can forecast when and where storms that can produce tornadoes will appear, but they can rarely give residents more than 15 minutes of warning before a twister touches down. Unlike hurricanes, which meteorologists can now track days in advance with increasing precision, tornadoes remain stubbornly unpredictable, although forecasters at the National Oceanic and Atmospheric Administration (NOAA) are working on ways to extend that warning time.

That unpredictability also makes it harder to assess the destructive power of a tornado in real time. Hurricane categories are based on sustained wind speeds in a storm—a Category 1 storm has sustained winds of 74 to 95 mph (119 to 153 km/h), while a Category 5 storm has sustained winds of over 157 mph (252 km/h). (“Sustained wind speed” means the wind speed averaged over one minute, rather than a momentary gust.) The damage a hurricane can cause doesn’t always conform completely to categories. Superstorm Sandy, for instance, wasn’t even a Category 1 hurricane by the time it made landfall in New Jersey, but still caused more than $60 billion in damage, largely due to the size of its storm surge. But more wind generally means more danger—just ask the people of New Orleans, hit in 2005 by Hurricane Katrina, which peaked as a Category 5 storm.

Tornado strength is assessed on a different and slower scale, after the twisters have struck. When tornadoes occur, National Weather Service (NWS) officials are dispatched to survey the damage. They also reconstruct tornadoes’ life cycles, where they touched down—and how strong they were. Tornadoes are ranked on the Enhanced Fujita (EF) scale, an updated version of the scale developed by Tetsuya “Ted” Fujita, a Japanese-American meteorologist who, not coincidentally, got his start studying the damage caused by the atomic bomb in Hiroshima. The original Fujita scale was based primarily on the damage a tornado did, with wind speed estimated after the fact. The scale ranked tornadoes from an F0 (Gale) to an F5 (Incredible), with an unofficial F6 category that would require winds in excess of 318 mph and which goes by the name Inconceivable—accurate, since no F6 tornado has ever been recorded.

The Enhanced Fujita scale was adopted in 2007. It was designed to more accurately reflect the actual damage a tornado had done on the ground. The EF scale uses 28 different damage indicators, ranging from small barns to hardwood trees to shopping malls—and each of those indicators is assessed based on several different points of possible damage. A shopping mall could range from damage that is just barely visible to complete destruction of some or all of the building. There’s a large database of how strong a tornado needs to be to cause certain kinds of structural damage, so meteorologists are able to use the final damage report to go back and estimate the tornado’s wind speed at the time of touchdown. The categories range from EF0—with three-second wind gusts of 65-85 mph (104-137 km/h)—to EF5, with three-second gusts over 200 mph (321 km/h).
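For readers who want to see that final mapping step spelled out, here is a minimal sketch in Python of a gust-speed-to-rating lookup. The EF0 and EF5 bounds come from the paragraph above; the EF1 through EF4 cutoffs are the standard National Weather Service values, which the article does not list, and the example speeds are illustrative only. In practice, surveyors work in the other direction, estimating those gusts from the damage indicators.

    # A rough sketch of the lookup described above: map an estimated three-second
    # gust speed (in mph) to an Enhanced Fujita rating. The EF0 and EF5 bounds are
    # from the article; the EF1-EF4 cutoffs are the standard NWS values and, like
    # the example speeds below, are added here for illustration.
    def ef_rating(gust_mph: float) -> str:
        """Return the EF rating for an estimated three-second gust speed."""
        if gust_mph < 65:
            return "below EF0"  # not strong enough to be rated on the EF scale
        for upper_bound, rating in [(85, "EF0"), (110, "EF1"), (135, "EF2"),
                                    (165, "EF3"), (200, "EF4")]:
            if gust_mph <= upper_bound:
                return rating
        return "EF5"  # three-second gusts over 200 mph

    print(ef_rating(150))  # EF3, the rating of the Apr. 27 twisters in Arkansas and North Carolina
    print(ef_rating(210))  # EF5, like the 2013 tornado that devastated Moore, Okla.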

We won’t know the full strength of this week’s multiple tornadoes until NWS surveyors have had a chance to measure the damage on-site. But there has already been a pair of EF3 twisters this year, striking Arkansas and North Carolina on Apr. 27, and those tornadoes may be upgraded as full damage assessments are carried out. 2014 had been shaping up to be a quiet year for tornadoes—Apr. 27 marked the end of a string of 159 days without an EF3 or above tornado, and there had been only 93 tornado reports this year through Apr. 24. That changed this week—there were 87 tornado reports on Apr. 28 alone. And while no tornado that’s hit yet looks to be as strong as the EF5 twister that devastated Moore, Oklahoma last year, the season is far from done.
