TIME Agriculture

New Report Says FDA Allowed ‘High Risk’ Antibiotics to Be Used on Farm Animals

Experts worry that the overuse of antibiotics on livestock is leading to resistant strains of bacteria Elyse Butler via Getty Images

Antibiotic resistance claims 23,000 lives a year in the U.S.—and the overuse of antibiotics in livestock plays a role. Is the FDA doing all it can to protect Americans?

A stark fact: around 80% of the antibiotics by weight used in the U.S. are given not to sick human beings, but to farm animals. And for the most part, these drugs aren’t prescribed by veterinarians to save ill pigs or chickens; instead they’re administered to animals in low doses in their food and water, both for growth promotion—the drugs seem to help livestock pack on weight—and prophylactically, to help them survive the packed conditions of a modern factory farm.

That the heavy use of antibiotics on farm animals in the U.S. can pose a real health threat to human beings—by inadvertently promoting the growth and spread of antibiotic-resistant strains of bacteria—is something that nearly every expert outside the food and drug industries agrees on. According to the Centers for Disease Control and Prevention (CDC), more than 2 million Americans are sickened and 23,000 die each year from antibiotic-resistant infections. Some of that toll stems from the overprescription of antibiotics to human beings, but the use and abuse of the drugs in meat production plays a significant role as well—one that the Food and Drug Administration (FDA) has long been reluctant to crack down on.

Now a new report by the Natural Resources Defense Council (NRDC) underscores just how lax the FDA’s regulation of antibiotics in farm animals has been. Using FDA documents acquired through the Freedom of Information Act (FOIA), the NRDC found that the agency allowed 30 potentially harmful antibiotics—18 rated as “high risk” by the FDA itself—to remain on the market for use as additives in livestock feed and water. Despite internal FDA reviews that raised questions about the risks posed by the drugs, the additives remain approved and many are still on the market for food production. “The FDA knew the risks, but they still haven’t done anything to revoke the approval of these drugs,” says Avinash Kar, an attorney for the NRDC and the co-author of the new report.

(MORE: Farm Drugs: The FDA Moves to Restrict (Somewhat) the Use of Antibiotics in Livestock)

The FDA has been looking at antibiotics in farm animals since 1970, when the agency convened a joint task force of experts that eventually found that the nontherapeutic use of antibiotics in livestock—meaning for growth promotion or for prophylactic use on healthy animals—could lead to resistant strains of bacteria that could threaten human health. In 1973, the FDA adopted regulations that required drug manufacturers to prove the safety of antibiotics used in animal feed and water. In 1977 the FDA found that the use of penicillin and tetracyclines—two classes of antibiotics that are widely used to treat humans—in animal feed was unsafe, and proposed to withdraw approval of the drug classes. But according to NRDC’s findings, the agency never followed through.

In 2001, prompted by legislation that set aside money for the agency to look at antibiotics, FDA experts began reviewing livestock feed additives already in use that contained penicillin or tetracyclines. The additives—30 altogether—were reviewed against two sets of criteria: the 1973 safety regulations, and 2003 guidelines meant to evaluate the safety of any new animal antibiotic. (The 2003 guidelines gauged the risk of antibiotics in feed leading to resistant strains of bacteria, as well as the chance those strains could reach people and damage human health; each antibiotic would then be classified as low, medium or high risk.) The internal FDA documents unearthed by the NRDC show that agency experts found that 26 of the 30 additives had never even met the initial 1973 safety criteria. The agency also found that 18 of the 30 additives posed a “high risk” of exposing human beings to antibiotic-resistant bacteria through the food chain, according to the criteria set out by the 2003 guidelines.
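To make that two-part evaluation concrete, here is a minimal toy sketch of how ratings along the two axes the guidelines describe might combine into an overall classification. The max-of-two-axes rule and the function names are illustrative assumptions, not the FDA’s actual 2003 methodology:

```python
# Toy sketch of a two-axis risk rating, loosely patterned on the two
# questions the 2003 guidelines ask. The combination rule (take the
# higher of the two axes) is an illustrative assumption, NOT the FDA's
# actual methodology.

LEVELS = ["low", "medium", "high"]

def overall_risk(resistance_risk: str, exposure_risk: str) -> str:
    """Combine the chance of resistance emerging in animals with the
    chance that resistant strains reach and harm people."""
    return LEVELS[max(LEVELS.index(resistance_risk), LEVELS.index(exposure_risk))]

# An additive likely to breed resistant bacteria that readily reach the
# food chain rates "high" -- the rating FDA reviewers gave 18 of the 30.
print(overall_risk("high", "medium"))  # -> high
```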

(MORE: Talking Meat and Antibiotics)

For the 12 remaining additives, manufacturers hadn’t even supplied the FDA with sufficient evidence for the agency to determine the health risk they might pose to human beings. According to the NRDC, none of the 30 antibiotic feed additives in question could be approved today under the current guidelines. Because the FDA does not disclose sales of specific animal drugs, it’s impossible to know how widely those additives are still being used in animal feed. But the NRDC found evidence that at least nine of the additives are still being marketed today, and 28 of the drugs apparently remain approved for use; the other two were withdrawn voluntarily from the market.

While the food industry says that restricting antibiotics in livestock would lead to sicker animals and more expensive meat, it is possible to have a major meat-producing industry without the dangerous use of antibiotics for growth promotion. The European Union has banned all antibiotic growth promoters in animal feed, and Denmark—which produces about as many hogs as Iowa even though the Scandinavian country is less than a third the size of the Hawkeye State—has banned all prophylactic uses of antibiotics in animals. But while a few food companies in the U.S. like Chipotle have touted their drug-free meat, millions of pounds of antibiotics are still being used on farms. There are a pair of bills in Congress that would curb antibiotic use in animals—the Preservation of Antibiotics for Medical Treatment Act (PAMTA) in the House and the Preventing Antibiotic Resistance Act (PARA) in the Senate—but neither is likely to pass.

That leaves the FDA, which has in recent years begun to move gently on antibiotics in animal feed. Last month the agency released guidelines asking drug manufacturers to voluntarily change their labels so that farmers could no longer use the drugs for growth promotion and would need a veterinarian’s prescription to use them for therapeutic purposes, rather than simply buying them over the counter. The FDA has said that voluntary guidelines will lead to faster changes in antibiotic use, largely because tougher rules could face time-consuming legal challenges from the food industry. And the agency says that once the labels on the drugs have been changed, it would be illegal for the additives to be used for growth promotion—and the FDA has claimed it would take action against companies that failed to comply.

In response to the NRDC report, Siobhan DeLancey of the FDA’s Veterinary Medicine team noted that two major drug companies have expressed support for the agency’s new guidelines, which she said are informed by the FDA’s earlier scientific review of those 30 additives. She added that the FDA expects to fully implement its strategy for phasing out the growth-promotion use of all medically important antimicrobials—including the penicillins and tetracyclines called out by the NRDC—within three years:

The FDA is confident that its current strategy to protect the effectiveness of medically important antimicrobials, including penicillins and tetracyclines, is the most efficient and effective way to change the use of these products in animal agriculture. We note that our strategy also does not limit our authority to take future regulatory action.

But consumer and environmental groups are doubtful that much will change without a legal mandate. “The FDA has the authority to move independently on this,” says Kar. “It seems to me the FDA is using the specter of time and resources to justify a voluntary approach.” Until that changes, neither will our other drug problem.

(MORE: Getting Real About the High Price of Cheap Food)

TIME

Hundred Years of Dry: How California’s Drought Could Get Much, Much Worse

California faces historic drought
California is the driest it has been on record, but its geologic history indicates the drought could get far worse David McNew / Getty Images

Scientists fear California's long-ago era of mega-droughts could be back

As he gave his State of the State speech yesterday, California Gov. Jerry Brown had reason to feel pretty good. The 75-year-old governor has helped rescue the state from fiscal insolvency and presided over the addition of 1 million new jobs since 2010. But as he spoke, Brown hit a darker note. Last week, amid the driest year for the state since record-keeping began in the 1840s, Brown declared a drought emergency for California, and in his speech he warned of harder times ahead:

Among all our uncertainties, weather is one of the most basic. We can’t control it. We can only live with it, and now we have to live with a very serious drought of uncertain duration…We do not know how much our current problem derives from the build-up of heat-trapping gasses, but we can take this drought as a stark warning of things to come.

(MORE: Can GM Crops Bust the Drought?)

Californians need to be ready, because if some scientists are right, this drought could be worse than anything the state has experienced in centuries. B. Lynn Ingram, a paleoclimatologist at the University of California, Berkeley, has looked at the rings of old trees in the state, which help scientists gauge precipitation levels going back hundreds of years. (Wide tree rings indicate years of substantial growth and therefore healthy rainfall, while narrow rings indicate years of little growth and very dry weather.) She believes that California hasn’t been this dry since 1580, around the time the English privateer Sir Francis Drake first visited the state’s coast:

If you go back thousands of years, you see that droughts can go on for years if not decades, and there were some dry periods that lasted over a century, like during the Medieval period and the middle Holocene [the current geological epoch, which began about 11,000 years ago]. The 20th century was unusually mild here, in the sense that the droughts weren’t as severe as in the past. It was a wetter century, and a lot of our development has been based on that.

Ingram is referring to paleoclimatic evidence that California, and much of the American Southwest, has a history of mega-droughts that could last for decades and even centuries. Scientists like Richard Seager of Columbia University’s Lamont-Doherty Earth Observatory have used tree-ring data to show that the Plains and the Southwest experienced multi-decadal droughts between 800 A.D. and 1500 A.D. Today dead tree stumps—carbon-dated to the Medieval period—can be seen in river valley bottoms in the Sierra Nevada mountains, and underwater in places like California’s Mono Lake, signs that these bodies of water once shrank enough for trees to grow where water now stands. Other researchers have looked at the remains of bison bones found in archaeological sites, and have deduced that a millennium ago, bison were far less numerous than they were several centuries later, when they blanketed the Plains—another sign of how arid the West once was. The indigenous Anasazi people of the Southwest built great cliff cities that can still be seen in places like Mesa Verde—yet their civilization collapsed, quite possibly because they couldn’t endure the mega-droughts.

(MORE: How the Drought of 2012 Will Make Your Food More Expensive)

In fact, those droughts lasted so long that it might be better to say that the Medieval West simply had a different climate—one fundamentally more arid—than it has had during most of American history. And there’s no reason to assume that the relatively wet climate we know is the rule and mega-drought the aberration. Ingram notes that the late 1930s to early 1950s—a time when much of the great water infrastructure of the West was built, including the Hoover Dam—may turn out to have been unusually wet and mild on a geologic time scale:

I think there’s an assumption that we’ll go back to that, and that’s not necessarily the case. We might be heading into a drier period now. It’s hard for us to predict, but that’s a possibility, especially with global warming. When the climate’s warmer, it tends to be drier in the West. The storms tend to hit further into the Pacific Northwest, like they are this year, and we don’t experience as many storms in the winter season. We get only about seven a year, and it can take the deficit of just a few to create a drought.

These mega-droughts aren’t predictions. They’re history, albeit from a time well before California was the land of Hollywood and Silicon Valley. And the thought that California and the rest of the modern West might have developed during what could turn out to be an unusually wet period is sobering. In 1930, a year before construction began on the Hoover Dam, just 5.6 million people lived in California. Today more than 38.2 million live in the most populous state in the U.S., all of whom need water. California’s 80,500 farms and ranches produced crops and livestock worth $44.7 billion in 2012, but arid farming districts like the Central and Imperial Valleys would wither without irrigation. (Altogether, agriculture uses around 80% of the state’s developed water supply.) More people and more crops have their straws in California’s water supply, straining it even in normal years. If we see a return to the bone-dry climate of the Medieval period, it’s hard to see how the state could survive as it is now. And that’s not even taking the effects of climate change into account—the most recent Intergovernmental Panel on Climate Change (IPCC) report found that it was likely that warming would lead to even drier conditions in the American Southwest.

In his speech, Brown told Californians “it is imperative that we do everything possible to mitigate the effects of the drought.” The good news is that the sheer amount of water we waste—in farms, in industry, even in our homes—means there’s plenty of room for conservation. The bad news is that if California lives up to its climatological history, there may not be much water left to conserve.

(MORE: Rising Temperatures and Drought Create Fears of a New Dust Bowl)

TIME ecocentric

How a Plant Virus May Help Cause the Beepocalypse

A new study says that a plant virus could be killing honeybees YunhyokChoi via Getty Images

A new study finds that a plant pathogen could play a role in honeybee colony collapse disorder

Honeybees are dying. In the winter of 2012-2013, one-third of U.S. honeybee colonies died or disappeared—a 42 percent increase from the year before, and well above the 10-15 percent losses beekeepers once considered normal. Many of them have been hit by colony collapse disorder (CCD), a mysterious and still unexplained malady that wipes out honeybee hives. Given that honeybees pollinate about one in every three mouthfuls of food you eat—adding some $15 billion worth of value to crops each year—this is a big deal. And we don’t know why they’re dying.

As I wrote in a cover story for TIME last summer, there’s no shortage of possible causes. Agricultural pesticides, Varroa destructor mites, the Israeli acute paralysis virus (IAPV), the loss of open wilderness—every one of those factors could play some role in the death of the bees. But there’s been no single smoking gun—which has made it that much tougher to save the bees.

(MORE: The Plight of the Honeybee)

A new study, though, may shed more light on the beepocalypse. Researchers at the USDA’s Agricultural Research Service, along with academics in the U.S. and China, have found evidence of a rapidly mutating plant pathogen—the tobacco ringspot virus (TRSV)—that seems to have jumped into honeybees via the pollen bees collect as they fly from flower to flower. The study, published in the journal mBio, found that the virus spread systemically through infected bees and hives, reaching every part of their bodies except the eyes.

While it’s not yet clear how TRSV spreads among honeybees, or what it does to the infected—researchers theorize it attacks the nervous system—the study found that the presence of TRSV, along with other bee viruses like IAPV, was correlated with lower rates of colony survival over the winter.

Part of what makes TRSV so worrying is that it’s an RNA virus, like HIV and the influenza virus in humans, which allows it to mutate rapidly and evade its hosts’ immune defenses. As a plant virus that has found a way to jump the species barrier, TRSV could be especially tricky: a pathogen that is new to its host species generally meets no evolved defenses against it.

Still, this virus isn’t acting alone. The researchers found that the virus was present in Varroa mites, blood-sucking parasites that have killed millions of bees in the U.S. since being introduced in the late 1980s. It’s possible that the mites could help spread the virus from bee to bee and colony to colony, or could weaken the honeybees enough to make them more susceptible to new pathogens like TRSV. The more we learn about CCD, the more it seems as if bees are suffering from a host of ills—pathogens and pesticides and nutritional problems—all interacting in ways we haven’t yet untangled. TRSV is far from a smoking gun, but it could be a very big bullet.

(PHOTO: The Bee, Magnified)

TIME climate science

Snowpocalypse or Not, 2013 Was One of the Warmest Years on Record

Winter Storm Climate Change
It may not feel like it to snowbound residents in New York, but the climate is still warming. TIMOTHY A. CLARY/AFP/Getty Images

Amid a polar winter in much of the U.S., a new report reinforces the long-term trend of global warming—and sets the stage for an even hotter 2014

As I write this, I can see snow falling heavier and heavier outside my office window in midtown Manhattan. Up to 14 inches (36 cm) are projected to accumulate by Wednesday morning, part of a major winter storm that’s spreading from South Carolina to Maine. Temperatures are predicted to stay well below normal for the rest of the week, as we all remember what winter used to be like. In short, it’s going to be cold, snowy and brutal, and Americans might feel as if warm weather will never return.

But don’t worry—on a global climatic scale, the heat is still on. That’s the takeaway from the National Oceanic and Atmospheric Administration’s (NOAA) annual analysis of global climate data, which was released Tuesday. The red-hot numbers:

  • 2013 ties with 2003 as the fourth-warmest year globally since records began in 1880.
  • The annual global combined land and ocean surface temperature was 58.12 degrees Fahrenheit (14.52 degrees Celsius), 1.12 degrees Fahrenheit (0.62 degrees Celsius) above the 20th century average. (The warmest year on record is 2010, which was 1.19 degrees Fahrenheit (0.66 degrees Celsius) above that average; see the conversion note after this list.)
  • 2013 was the 37th consecutive year that the annual global temperature was above the average, which means that if you were born any year after 1976, you’ve never experienced a year when the global climate was average, let alone cooler.
  • Including 2013, 9 of the 10 warmest years on record have occurred in the 21st century, and just one year in the 20th century—1998—was warmer than 2013.
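A note for readers double-checking those numbers: absolute temperatures convert between scales with the familiar offset formula, but anomalies—departures from an average—convert by the 1.8 scaling factor alone. A quick sketch using the figures above:

```python
def f_to_c(temp_f: float) -> float:
    """Convert an absolute temperature from Fahrenheit to Celsius."""
    return (temp_f - 32) / 1.8

def anomaly_f_to_c(delta_f: float) -> float:
    """Convert a temperature *difference* (anomaly): no 32-degree offset."""
    return delta_f / 1.8

print(f_to_c(58.12))         # ~14.51 C -- NOAA's 2013 global average (14.52 C after rounding)
print(anomaly_f_to_c(1.12))  # ~0.62 C  -- 2013's departure from the 20th century mean
print(anomaly_f_to_c(1.19))  # ~0.66 C  -- 2010's departure, the largest on record
```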

(MORE: Climate Change Might Just Be Driving the Historic Cold Snap)

The NOAA report, coming out in the middle of a major snowstorm and during a U.S. winter that’s been marked by the polar vortex, is a reminder that climate isn’t about the day-to-day changes in the weather. (Note: NASA came out with its own report on 2013, using a different calculation method than NOAA’s, and found 2013 to be slightly cooler—but still the seventh-warmest year on record.) It’s about the very long term, as Gavin Schmidt, the deputy director of NASA’s Goddard Institute for Space Studies in New York, said on a conference call with reporters Tuesday afternoon:

The long-term trends in climate are extremely robust. There is year-to-year variability. There is season-to-season variability. There are times such as today when we can have snow even in a globally warmed world but the long-term trends are very clear. They’re not going to disappear.

Not only is climate change a long-term phenomenon, it’s also a global one, though it’s easy to get lost in our own weather. Case in point: the average temperature in the continental U.S. in December was 30.9 F (-0.6 C), which is 2.0 F (1.1 C) below the 20th century average and makes it the 21st coldest December on record for the U.S. You weren’t just a wimp—December really was chilly for much of the U.S.

But globally the picture was very different. The worldwide average temperature in December was 55.15 F (12.84 C), which is 1.15 F (0.64 C) above the 20th century average. While the U.S. was shivering, on a global scale December 2013 was the third-warmest December on record. That’s global warming.

And 2014, despite the snowy and chilly start in the U.S., could be even hotter. Scientists now say that an El Niño seems likely to develop later this year, which would push temperatures up in 2014 and 2015, since El Niño years tend to be warmer. So enjoy the snow while you can—it will likely be a faint memory by the time Americans are sweating in July.

(MORE: Arctic Blast: The Northern Air Mass Bringing Record-Breaking Cold to the U.S.)

TIME Oil

U.S. Oil Demand Grew Faster Than China’s in 2013. That Won’t Last

Oil demand grows in the U.S.
Oil demand grew faster in the U.S. than anywhere else in 2013 Photo by Scott Olson / Getty Images

Production has been booming for a while, but last year American demand for the black stuff grew by 390,000 barrels a day

The oil production boom in the United States is old news, something we covered in a special section just a few months ago. Improved hydraulic fracturing and directional drilling have helped unlock vast new tight oil supplies, mostly in Texas and North Dakota. But I don’t think everyone has realized just how much boom is in this boom.

New numbers from the International Energy Agency (IEA) might change that. Crude oil production in the U.S. rose by 990,000 barrels a day (bbd) last year, an increase of 15% from the year before—the fastest absolute annual growth of any country in 20 years. And it’s not just production: the IEA reports that in 2013, U.S. demand for oil grew by 390,000 bbd, or about 2%, after years of decline. For the first time since 1999, U.S. demand for oil grew faster than China’s, which rose by 295,000 bbd, the weakest increase in six years. So not only is the U.S. producing a gusher of oil, but it’s also consuming more crude.
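As a back-of-envelope check, the absolute and percentage changes the IEA reports imply rough prior-year baselines (rough, because the quoted percentages are rounded):

```python
# Back-of-envelope arithmetic on the IEA figures above: an absolute
# change of `delta` barrels/day that equals `pct` of the prior year
# implies a prior-year baseline of delta / pct.

def implied_baseline(delta_bbd: float, pct_growth: float) -> float:
    """Prior-year level implied by an absolute change and a growth rate."""
    return delta_bbd / pct_growth

# U.S. crude production: +990,000 bbd was a 15% jump...
print(f"{implied_baseline(990_000, 0.15):,.0f}")  # ~6,600,000 bbd produced in 2012
# U.S. oil demand: +390,000 bbd was roughly a 2% rise...
print(f"{implied_baseline(390_000, 0.02):,.0f}")  # ~19,500,000 bbd base (the 2% is rounded)
```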

(MORE: North Dakota Derailment Shows Dark Side of America’s Oil Boom)

That increase in domestic demand could be a good sign for the economy, if not for the environment. Growth in oil demand was mostly steady in the U.S. from the early 1980s on, before plateauing a couple of years ahead of the financial crisis of 2008; since then it has mostly dropped. Average consumption in the U.S. was 18.8 million bbd between 2009 and 2012, compared to 20.5 million bbd between 2005 and 2008. Economic growth and energy demand have historically gone together—more businesses using more energy, more workers driving to the office—so last year’s unexpected increase in oil demand could mean the U.S. is rebounding, as Antoine Halff, head of oil market research at the IEA, told the Financial Times:

It is clear that the US economy is rebounding very strongly thanks to its energy supplies. Sometimes oil is a lagging indicator, but sometimes it is the opposite and shows that an economy is growing faster than thought.

According to the IEA, much of that growth has been in the petrochemical industry, which has taken advantage of burgeoning domestic oil supply. U.S. exports overall hit a record high in November, cutting the trade deficit to its lowest level since 2009. And much of that export growth came not from manufactured goods but from diesel and gasoline, with the U.S. exporting $13.3 billion worth of petrochemical products in November. With oil companies forbidden from exporting crude from the U.S.—though they’ve been lobbying lately to get that changed—refineries have taken up the slack, benefiting from the fact that domestic oil is often sold at a discount (they’ve also benefited from low natural gas prices, thanks to shale drilling). It’s not for nothing, as Mitchell Schnurman noted in the Dallas Morning News, that the oil capital of Houston led the nation in exports in 2012, ahead of the New York area.

(PHOTO: Black Rock Rush: Working the Oil Fields of North Dakota)

But even if the U.S. economy does rebound—and boom times in the petrochemical industry don’t necessarily translate to the rest of the country—don’t expect the U.S. to go all the way back to its gas guzzling days. There are other reasons besides a declining economy that explain why U.S. oil demand fell so much over the past several years. Cars are now more fuel-efficient than ever, thanks to tougher fuel economy standards and growing consumer preferences for lighter, smaller cars and hybrids. But we’re also driving less. An analysis by Michael Sivak at the University of Michigan Transportation Research Institute found that 9.2% of U.S. households in 2012 were without a vehicle, compared to 8.7% in 2007. Vehicle miles traveled has largely plateaued over the last several years, indicating the U.S.—like other developed countries—may have reached something like “peak car.”

That’s arguable—the drop in the percentage of households with cars could well have more to do with high unemployment and sluggish economic growth than anything else. But while the boom in domestic oil production has helped stabilize gas prices—gas cost an average of $3.32 a gallon in 2013, just a little more than in 2012—the days of cheap gas are almost certainly over. The future of oil demand is going to be in the developing world—especially China, where consumers bought over 20 million cars in 2013, compared to 15.6 million in the U.S. Last year’s U.S. demand growth will likely turn out to be a blip in that epochal shift.

(MORE: US Oil Dominance Will Be Short Lived)

TIME climate change

Why Some Mushrooms May Be Magic for Climate Change

Fungi growing in soil
EEM fungi like this Amanita mushroom help soils store more carbon Colin Averill

The soil contains more carbon than all living plants and the atmosphere combined. Now a new study says that a certain type of fungus can help soil hold up to 70% more carbon—with potentially big impacts for the climate

Fungi don’t get the respect they deserve. Maybe that’s because they do most of their work in the dark, beneath the ground or on dead matter, or because there’s something essentially alien and bacterial about their appearance and the way they grow. But fungi are so plentiful and so basic to life that they’re recognized as their own phylogenetic kingdom. There may be more than 5 million separate species of fungi, and the largest single organism on the planet is a fungus: the four sq. mi. (10 sq. km) Armillaria ostoyae fungus, which lives in the soil of Oregon’s Blue Mountains and which may be more than 8,000 years old. Without fungi we wouldn’t have antibiotics, blue cheese and, most importantly, beer. And we won’t even get into the magic kind.

Fungi also play an important role in the carbon cycle, the biogeochemical process by which carbon—the essential element of life on Earth—moves between the air, soils and water. Plants sequester carbon dioxide, and when they die, that carbon enters the soil—a lot of it. Globally, soil is the biggest single terrestrial reservoir of carbon, holding far more than living things and the atmosphere combined. (On a planetary scale, the oceans hold by far the most carbon.) As dead plant matter is broken down by microbes in the soil, that carbon is released back into the air. The rate at which carbon leaves the soil can thus have a major impact on the amount of carbon in the atmosphere, which in turn helps drive climate change.

(MORE: Climate Change Might Just Be Driving the Historic Cold Snap)

One of the limits on the growth of those decomposing microbes is the availability of nitrogen in the soil. Living plants and soil microbes compete for nitrogen, and the less nitrogen is available to the microbes, the slower decomposition goes—and the more carbon remains in the soil instead of outgassing into the atmosphere. This is where the fungi come in. Most plants have a symbiotic relationship with mycorrhizal fungi: the fungi extract nitrogen from the soil and make it available to the plants through their roots. But according to a new study in Nature, one major type of symbiotic fungus can extract nitrogen much more quickly than the others—and that in turn slows the growth of the competing microbes and leaves much more carbon locked away in the soil.

Researchers from the University of Texas, Boston University and the Smithsonian Tropical Research Institute ran computer models on data from more than 200 soil profiles from around the world. They found that soils dominated by ecto- and ericoid mycorrhizal (EEM) fungi contain as much as 70% more carbon than soils dominated by arbuscular mycorrhizal (AM) fungi. That’s because the EEM fungi produce more nitrogen-degrading enzymes, which allows them to extract more nitrogen from the soil. They essentially outcompete the soil microbes, slowing the microbes’ ability to decompose dead plant matter and return carbon from the soil to the atmosphere. “This analysis clearly establishes that the different types of symbiotic fungi that colonize plant roots exert major control on the global carbon cycle, which has not been fully appreciated or demonstrated until now,” said Colin Averill, a graduate student at the University of Texas and the lead author of the paper.

That relationship between the different types of fungi and plants matters so much for the carbon cycle because it is independent of temperature, precipitation, soil clay content and all the other variable factors that influence plant growth and soil content. Perhaps unfortunately for us, AM symbiosis is far more common, occurring in approximately 85% of plant families, while just a few plant families partner with EEM fungi. That could change as the composition of forests changes—but we won’t know the effects until scientists add the role of the different kinds of symbiotic fungi to global climate models, which they have yet to do.

“This study shows that trees and decomposers are really connected via these mycorrhizal fungi, and that you can’t accurately predict future carbon cycling without thinking about how these two groups interact,” said Averill. “We need to think of these systems holistically.” The humble fungus won’t be forgotten.

(MORE: A Newly Discovered Underground Lake in Greenland Will Help Us Understand Climate Change)

TIME endangered species

The Dingo Didn’t Eat Your Tasmanian Devil

Dingoes are blamed for driving the Tasmanian tiger to extinction
Dingoes are blamed for hunting the Tasmanian tiger to extinction. But humans bear the blame Jason Edwards via Getty Images

Dingoes were long blamed for hunting the Tasmanian devil off mainland Australia. But a new study shows that human beings deserve more of the blame

Dingoes get a bad rap. Free-ranging dogs found chiefly in Australia, dingoes have been blamed for killing sheep and for hunting the Tasmanian tiger and devil to death. Oh, and snatching the occasional baby, which you might remember from that classic Elaine Benes line on Seinfeld.

But there’s good news for Canis lupus dingo: it’s been cleared of one of those charges. Dingoes, which came to the Australian mainland from Southeast Asia over 4,000 years ago, were long believed to be primarily responsible for the extinction of the marsupial thylacine, also known as the Tasmanian tiger, as well as the disappearance of another marsupial, the Tasmanian devil, from the Australian mainland. Dingoes are voracious hunters, closer to a wolf than a dog, but while they established themselves through much of Australia, they never reached the island of Tasmania, which is now the only place the devils can be found. That fact was enough for many experts to put much of the blame for the extirpations on the dingo’s shoulders.

(MORE: How Human Activity — and Extinctions — Are Driving Evolution)

But a new study by Australian scientists has found that it’s humans, not dingoes, who are primarily at fault. The researchers created a mathematical model of the interactions among predators (including dingoes and human beings, who came to the Australian continent 50,000 years ago) and their prey in prehistoric Australia. The model included climate variables and possible changes in vegetation, both of which could affect animal populations. The researchers then experimented with the model to see which factor played the biggest role in the losses of the Tasmanian devil and tiger.
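The paper’s actual model is richer than anything that fits here—it folds in climate, vegetation and human populations—but the underlying machinery of such studies is classical predator-prey dynamics. A minimal generic sketch (a standard Lotka-Volterra system, not the authors’ code; every parameter value is an illustrative assumption):

```python
# A generic Lotka-Volterra predator-prey toy -- a stand-in for the kind
# of interaction model described above, NOT the study's actual model
# (which also included climate, vegetation and human populations).
# All parameter values below are illustrative assumptions.

def step(prey: float, predators: float, dt: float = 0.01,
         birth: float = 1.0, predation: float = 0.1,
         efficiency: float = 0.05, death: float = 0.4) -> tuple[float, float]:
    """Advance both populations by one Euler step."""
    d_prey = (birth * prey - predation * prey * predators) * dt
    d_pred = (efficiency * prey * predators - death * predators) * dt
    return prey + d_prey, predators + d_pred

prey, predators = 10.0, 10.0
for _ in range(5_000):  # 50 time units
    prey, predators = step(prey, predators)
print(round(prey, 2), round(predators, 2))  # the populations cycle rather than settle
```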

It shouldn’t be surprising that human beings took the rap. When prehistoric humans first came to Australia, the continent was full of unimaginably large animals: the rhino-sized Diprotodon, massive kangaroos, marsupial lions that weighed more than 200 lbs. (91 kg). And once they got there, humans proceeded to hunt those animals to death, part of a global event known as the Pleistocene megafaunal extinction. That “overkill,” as Jared Diamond has called it, likely wiped Australia clean of megafauna. And while humans—and dingoes—would have hunted the Tasmanian tiger and devil, the model found that rapid human population growth would have reduced the animals’ prey, essentially displacing them. The dingo, in turn, took over the Tasmanian tiger’s and devil’s ecological role as a top predator and scavenger.

Far from being a scourge, big carnivores like the dingo play a vital role in ecosystems—and their loss can have major impacts on animals and plants below them in the food chain, as a Science paper published this week showed. Here’s my colleague Veronique Greenwood:

In Australia, areas in which dingoes are suppressed experience increased predation by red foxes, which feast on endangered creatures like the dusky hopping mouse. One study surveyed in the new paper showed that the mouse’s numbers were 40 times higher in areas where dingoes roamed—or at least in the two dingo-rich areas surveyed by the researchers.

So give dingoes a break. They’re doing more good than harm ecologically—which is more than you can say about most human beings. Now, about that Seinfeld episode…

(MORE: Why it’s Good (For Someone Else) to Get Eaten By a Lion)
