TIME Environment

The Last Coral Reefs

The SVII camera can take hundreds of photos of coral reefs, turning them into 360-degree panoramas Jayne Jenkins—Catlin Seaview Survey

A new survey is documenting the rain forests of the ocean—before they’re gone

There’s only one way to lower a $20,000 custom-made underwater camera from a swaying fishing boat into the open sea: very, very carefully. And that’s exactly how Manuel Gonzalez-Rivero’s colleagues handled the SVII camera as they nudged it overboard, where the coral ecologist was bobbing in the bathtub-warm waters off the Central American country of Belize. Gonzalez-Rivero is based at the University of Queensland’s Global Change Institute in Australia, but he was in the Caribbean working with the Catlin Seaview Survey, a scientific expedition that is assessing threatened coral reefs around the world. Once in the water, the cumbersome SVII–a beach-ball-size camera head with three separate lenses at the end of a 7-ft. (2 m) pole–was easy for Gonzalez-Rivero to maneuver. The camera’s attached propeller sled saved the scientist the work of swimming as he covered more than a mile of Belize’s protected Glover’s Reef, part of the vast and endangered Mesoamerican Reef that stretches from southern Honduras to the eastern tip of Mexico.

Every three seconds, the lenses on the SVII–facing to the left, right and below the camera head–snapped pictures of the reef. Over the course of his 45-minute dive, Gonzalez-Rivero produced more than 900 detailed images of Glover’s Reef, each one rich with data about corals and sea life. Back on the catamaran that served as the expedition’s temporary base, those images would be processed to generate a precise three-dimensional image of the reef. Later, computers at the Scripps Institution of Oceanography would analyze the pictures, giving scientists a quick diagnosis of the health of one of the most valuable marine ecosystems in the Caribbean. What’s long been possible on land, thanks to satellites scanning jungles and deserts, is now feasible beneath the waves. “Every coral reef is different,” says Gonzalez-Rivero. “This will allow us to see the reef as it really is.”

And we have to see it today, because coral reefs may not be here tomorrow. It’s a cliché to call coral reefs the rain forests of the ocean, but if anything, that understates their ecological value. They occupy less than 0.1% of the sea area, yet “between one-fourth and one-third of everything that lives in the ocean lives in a coral reef,” says Nancy Knowlton, who holds the Smithsonian Institution’s Sant Chair in Marine Science. Coral reefs support more species per square kilometer than any other marine environment, providing habitat, food and spawning grounds. And fish are not the only beneficiaries. The net economic value of coral reefs globally is almost $30 billion a year, and some 500 million people around the world depend on coral reefs for food, coastal protection and tourism.

At a time when climate concerns continue to mount–a widely watched March 31 report from a United Nations panel warned of drastic effects across the globe–coral reefs are under intense threat. Overfishing, pollution and coastal development have left all but the most remote reefs a shadow of what they once were. By one estimate, the Caribbean has lost 80% of its coral cover over the past 50 years. And the future is even darker: the one-two punch of global warming and ocean acidification could make the seas essentially inhospitable to coral, with dire consequences for marine life. The U.N. report, from the Intergovernmental Panel on Climate Change (IPCC), warned that coral reefs are “the most vulnerable marine ecosystem on Earth” to the effects of global warming. “If we don’t dodge this bullet, the only coral reefs that our children’s grandchildren will see will be in picture books,” says Steve Palumbi, director of Stanford University’s Hopkins Marine Station.

That’s what makes the Catlin Seaview Survey so timely. The oceans in their full volume account for as much as 90% of the planet’s livable space, but humans have seen just 5% of the underwater world with their own eyes. Ocean exploration can be expensive, difficult and time-consuming, even in the relatively shallow coastal waters where most reefs are found. But Seaview, which aims to survey every major coral reef worldwide, is able to take advantage of new advances in video and computer analysis to produce a long, sustained look at the oceans, essentially digitizing the seas. The result will be the kind of data that marine scientists have long craved. “By creating a really large global baseline of coral health, we can identify the areas that really need protecting,” says Richard Vevers, project director of the Catlin Seaview Survey. “We want to reveal the oceans of the world.”

Disappearing Riches

While I was in Belize with the Seaview team, I had the chance to view a coral reef the old-fashioned way–I dived it. Glover’s Reef, which is about 28 miles (45 km) off the Belize coast, lies at the heart of the largest reef system in the western hemisphere. As I hovered lazily near the ocean floor–while Gonzalez-Rivero and his colleagues carried out actual science above me–I could pick out boulder-size brain coral, jagged fire coral and majestic elkhorn coral. Sea fans billowed like flags in the underwater current.

Reefs look like living rocks–and in a sense, that’s what they are. Corals are tiny invertebrates that exist in symbiosis with photosynthetic single-cell algae called zooxanthellae, which live inside the coral’s tissue. (The zooxanthellae provide food to the coral by converting sunlight into energy.) Corals build up hard exoskeletons made of layers of secreted calcium carbonate, which form the reef. In a healthy reef, you can see everything from tiny gobies to predatory sharks swimming amid a network of coral as intricate as a medieval cathedral. “Coral reefs are a magic ecosystem,” says Palumbi. “If you could make the deserts bloom on land, that’s what coral reefs do for the oceans.”

Glover’s Reef, a protected marine reserve, is one of the healthier coral ecosystems in the Caribbean. But even here the reef isn’t what it once was. Coral cover dropped from 80% in 1971 to 13% in 1999, although there has been some recovery since, thanks to the recent establishment of a no-fishing zone. Most other Caribbean reefs are in far worse shape. The heavily developed waters off the coasts of countries like Jamaica are now little more than coral graveyards. Veteran coral ecologists who began their careers diving the once verdant reefs of the Caribbean have witnessed the collapse firsthand. “I’m 64, and everyone of my generation who became a conservation biologist has seen this loss happen in real time,” says Knowlton.

While Caribbean reefs have been particularly hard hit, corals around the world face the same threats. Overfishing species at the top of the food chain can cause a chain reaction, leading to the loss of smaller herbivores that play an important role in controlling the growth of seaweed, which competes with corals for living space. Pollution from coastal areas can kill corals–especially fertilizer runoff from agriculture, which can promote the growth of algae species that crowd out corals. Humans can accidentally introduce invasive species like the lionfish, a voracious eater that has plundered the Caribbean like Blackbeard the pirate. At least a quarter of the world’s corals have been lost over the past 25 years.

What really frightens coral scientists are the threats that will arise in the future. “If we push this too far, corals won’t be able to bounce back,” says Peter Mumby, a coral ecologist at the University of Queensland. “The whole system will collapse over time.” Climate change poses an existential challenge. Corals don’t like it when the water around them suddenly heats up, which can trigger what’s known as bleaching. The coral organism reacts by ejecting the zooxanthella algae living inside its tissues, which robs the coral of both its color and its source of food. While bleaching doesn’t necessarily kill the coral outright, it leaves it extremely vulnerable to other stresses. (In 1998, El Niño–led warming sparked the worst bleaching event on record, with 16% of the world’s coral lost in a year.) Even as climate change warms the seas, the additional carbon dioxide absorbed by the oceans will turn the water more acidic, which will in turn interfere with corals’ ability to form reefs. A 2013 study by researchers at the Carnegie Institution projected that if carbon emissions are not brought under control, no part of the ocean will be able to support coral reefs by 2100, and the new IPCC report predicts that Australia’s Great Barrier Reef will continue to degrade even if warming is slower than projected. “You could lose the coral reefs altogether,” says Ken Caldeira, an atmospheric scientist at Carnegie and a co-author of the paper. Coral scientists are right to fear that they could spend the rest of their careers watching their subject die.
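
The chemistry behind that acidification threat is worth spelling out. Seawater absorbs carbon dioxide, forming carbonic acid, which releases hydrogen ions; those ions then tie up the carbonate that corals need for their skeletons. In standard notation:

    \mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}
    \mathrm{H^+ + CO_3^{2-} \rightarrow HCO_3^-}

The second reaction is the problem for reef builders: every added hydrogen ion removes a carbonate ion that could otherwise have gone into the calcium carbonate (CaCO3) of a coral skeleton, which is how more CO2 in the air translates into slower reef growth.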

Recording for Posterity

When Richard Vevers switched careers from advertising to underwater photography, he became friends with the great Australian underwater filmmaker and shark expert Ron Taylor, best known for his work on movies like Jaws. Vevers would dive along the Great Barrier Reef and bring back what he thought were images of a pristine marine ecosystem, bristling with coral and sea life. But when he showed his pictures to Taylor, the veteran photographer would just shake his head. “He’d say, ‘That’s great, but you don’t know how it used to be,'” says Vevers. “I didn’t believe it at first, but it began to sink in. I realized that there’s this decline that’s been happening almost too slow for people to notice.”

There’s a term for that decline: shifting baselines. Fisheries scientist Daniel Pauly coined it to describe how overfishing has changed the oceans so rapidly over the past several decades that what we think of as normal from recent experience–the baseline–has had to shift to keep up with what is actually a diminished reality. “We transform the world, but we don’t remember it,” Pauly said in a 2010 TED talk. “We adjust our baseline to the new level, and we don’t recall what was there.”

Shifting baselines can be seen in all environmental science, but they’re a particular problem in ocean research. Marine scientists have had to rely on quick hits–grabbing data from scuba surveys, competing for a spot on a submersible. Even those research trips are growing rarer in a budget-constrained age. Don Walsh and Jacques Piccard reached the bottom of the Mariana Trench, the deepest point on the planet, in 1960, but no one returned there until director James Cameron did so in 2012 in a submersible he designed and paid for himself. Our understanding of the oceans is “very data-poor,” says David Kline of the Scripps Institution of Oceanography. It’s as if we were trying to comprehend a movie by seeing a few random frames rather than the full, uncut film.

The Catlin Seaview Survey is working to create that complete film. The photographs taken by the SVII camera can be digitally combined to create panoramic images that reveal the underwater world with striking depth and clarity. Seaview has partnered with Google to put many of those images online as part of Google Ocean’s efforts to take its Street View program–which shows ground-level photographs from around the world–beneath the waves. (Seaview is primarily sponsored by the Bermuda-based reinsurance company Catlin Group, which has been funding climate-change research, knowing that global warming could hit the insurance industry hard.) Underwater images from Seaview’s first extended expedition–a four-month mission in 2013 that covered more than 90 miles (145 km) of the Great Barrier Reef–have already been viewed millions of times. With the help of time-lapse technology, the images can be stitched together to create what seems like a digital scuba dive through one of the best-preserved coral-reef systems in the world–albeit one that has lost more than 50% of its coral cover over the past 30 years. “People can see the beauty of this world for themselves,” says Jenifer Austin Foulkes, project manager of the Google Ocean Program. “It’s a powerful tool.”

The underwater world has suffered as an environmental cause because of its inaccessibility. Scuba diving, after all, became possible only in the postwar era. Vevers hopes the beauty and accessibility of the images that Seaview records will help motivate the public to care for the seas. “Ninety-nine percent of people don’t dive and probably never will,” he says. “We need to bring the oceans to the people.” If people can dial up a view of their closest reef the way they can zero in on their childhood home on Google Earth, they might begin to care about the 70% of the planet that is covered in water.

But the lasting value of Seaview will be in the science it supports. Underwater research has always been limited by two things: air and space. Humans–in scuba gear or in submersibles–can stay underwater for only so long and can bring only so much equipment with them. The standard method of surveying coral involved researchers diving a reef and taking photographs of the area they covered, square foot by square foot, then analyzing those images on a research boat or at a station. Each of those images could require 15 to 30 minutes of work by a trained observer. Scientists had to extrapolate the whole from a small data set, not least because there was no way to survey an entire coral reef. The Great Barrier Reef, for example, covers 134,364 sq. mi. (348,000 sq km).
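
The arithmetic behind that bottleneck is stark. A rough, illustrative calculation (the per-image time comes from the paragraph above; the image count is a hypothetical survey size, not a Seaview figure):

    # Rough cost of analyzing a coral survey by hand (illustrative numbers only).
    minutes_per_image = 22.5           # midpoint of the 15-30 min. per image cited above
    images = 100_000                   # hypothetical survey of one large reef system
    hours = images * minutes_per_image / 60
    person_years = hours / 2_000       # roughly 2,000 working hours per year
    print(f"{hours:,.0f} hours of expert time, about {person_years:.0f} person-years")

At that rate, an archive the size of the one Seaview is building would consume entire careers, which is the gap the automated analysis described below was built to close.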

A Gloomy Picture

Over the next several years, Seaview expects to cover the Caribbean, the Coral Triangle in Southeast Asia, the Indian Ocean, the Mediterranean and the Middle East, producing hundreds of thousands of underwater images along the way. Under the old methods, it would have taken years for scientists to analyze it all, and most of the pictures would likely have remained on a dusty hard drive somewhere in the back of a lab. But Scripps and the University of California at San Diego, using facial-recognition technology similar to what the CIA employs to analyze crowd photos, are running a computer program that scans each image from the expedition and spits out the pictured species and extent of coral growth–all more than a hundred times faster than such work could have been done by humans alone. The accuracy of the machine is already at 90%, and as the program analyzes more images, it will become more precise, learning along the way. “What used to take us years we can now do in weeks and months,” says Scripps’ Kline. “We’ll have large-scale, quality data about the health of the reefs, and that will let managers make much more informed decisions about protection policies.” This is a Big Data solution to a very big scientific challenge.
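
The report doesn’t publish the Scripps code, but the general approach–turn labeled reef photos into numeric features, train a classifier, then let it label new images–can be sketched briefly. Here is a minimal illustration in Python with scikit-learn, using random numbers as stand-ins for real image features (the feature set, class labels and sizes are hypothetical, not the actual Scripps pipeline):

    # Sketch of automated reef-image classification: features in, labels out.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_patches, n_features = 2_000, 64             # e.g. color/texture descriptors per photo patch
    X = rng.normal(size=(n_patches, n_features))  # stand-in for features extracted from images
    y = rng.integers(0, 3, size=n_patches)        # hypothetical classes: live coral, algae, sand

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    # On real labeled reef imagery this score is the "accuracy of the machine"
    # cited above; on random stand-in data it hovers near chance (~33%).
    print(f"held-out accuracy: {model.score(X_test, y_test):.0%}")

The key property of such a system is the one Kline describes: once trained, it labels new images in moments rather than half an hour, and it grows more accurate as more labeled images are fed in.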

There’s no time to waste: the picture is vanishing even as we take it. I loved diving in the aquamarine waters of Glover’s Reef, letting my fingers drift past the outstretched arms of elkhorn coral. It was one of the most beautiful places I’d ever been. Yet I could tell–or maybe just feel–that something had been lost. It seemed empty of all but the smallest species, the result of years of intense fishing that more recent protections have only begun to reverse. My guide saw a hammerhead shark circling in the blue, but I missed it. It’s easy to miss things underwater.

TIME weather

Satellite Photos Show How the Washington Landslide Area Changed Over Time

Over a decade of satellite imagery shows why Oso, Washington, was so susceptible to a fatal landslide

The youngest victim was four months old. The eldest was 71. Altogether at least 29 people were killed when the earth gave way above the small town of Oso in rural Washington on Mar. 22, making it one of the deadliest landslides in U.S. history. And the saddest thing of all is that the disaster was anything but unexpected. The hill that collapsed had been the site of a number of landslides in the past, most recently in 2006. In 1999, outside consultants filed a study with the U.S. Army Corps of Engineers warning of “the potential for a large catastrophic failure” on the very hill that gave way in March.

As aerial photos from Snohomish County GIS and satellite photos collected by TIME from DigitalGlobe show, the recent landslide was all but impossible to stop. The North Fork Stillaguamish River cuts away at the bottom of the hill that would eventually collapse, and the loose sediment—laid down by glaciers nearly 12,000 years ago—was inherently unstable. Landslides kill an average of 25 Americans and cause as much as $2 billion in damages each year, yet they’re too quickly forgotten. Hopefully the catastrophe in Oso will change that.

TIME Internet

Your Data Is Dirty: The Carbon Price of Cloud Computing

The computers behind the cloud are responsible for 2% of global carbon emissions Sean Gallup—Getty Images

The digital cloud that holds your data may seem invisible, but the electricity that powers it comes with a major carbon price and climate impact, according to a new report from environmental advocacy group Greenpeace

The digital cloud is built on invisibility. Instead of books, DVDs, CDs, newspapers or magazines, we have pure data, traveling back and forth between our web-connected devices. Everything we want is at our fingertips, and all we need to do is push a button.

But the digital cloud has a physical substance: thousands upon thousands of computer servers, which store the data that makes up the Internet. And those servers aren’t powered by magic, they’re powered by electricity. If that electricity is produced by fossil fuel sources like coal or natural gas—which together provide nearly three-quarters of U.S. power—our magical cloud may leave a very dirty footprint.

IT-related services now account for 2% of all global carbon emissions, according to a new Greenpeace report. That’s roughly the same as the aviation sector, meaning all the Netflix movies being streamed and Instagram photos being posted around the world are the energy equivalent of a fleet of 747s rumbling for takeoff. Unless something is done to green the cloud, we can expect those emissions to grow rapidly—the number of people online is expected to grow by 60% over the next five years, pushed in part by the efforts of companies like Facebook to expand Internet access by any means necessary. The amount of data we’ll be using will almost certainly increase too. Analysts project that data use will triple between 2012 and 2017 to an astounding 121 exabytes, or about 121 billion gigabytes.
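
Those projections imply steep compound growth. A quick check of the annual rates implied by the figures above:

    # Compound annual growth rates implied by the projections cited above.
    users_growth = 1.60 ** (1 / 5) - 1   # online population up 60% over five years
    data_growth = 3.00 ** (1 / 5) - 1    # data use tripling between 2012 and 2017
    print(f"users: ~{users_growth:.0%}/yr, data: ~{data_growth:.0%}/yr")  # ~10% and ~25%

Emissions tied to the cloud will compound accordingly unless the electricity behind it gets cleaner.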

“If you aggregated the electricity use by data centers and the networks that connect to our devices, it would rank sixth among all countries,” says Gary Cook, Greenpeace’s international IT analyst and the lead author on its report. “It’s not necessarily bad, but it’s significant, and it will grow.”

The good news is that a number of major Internet companies have begun taking big steps to green their cloud. Greenpeace points to Apple as an industry leader, as the company has committed to powering its iCloud exclusively through renewable energy. It’s backed that up by building the country’s largest privately owned solar farms at its North Carolina data centers and by powering its new Nevada data centers with geothermal and solar energy. Apple has also purchased wind energy for its Oregon and California data centers.

Facebook is another success story. The company came under criticism from Greenpeace and other environmental groups for depending on coal for more than half its energy, which prompted a global Unfriend Coal campaign. Those protests yielded results—Facebook now prefers renewable energy to power its growing fleet of data centers. Its newest center will be in Iowa, where it has agreed to purchase 100% wind power—a move that pushed the local energy utility to make the single largest purchase of wind turbines in the world.

Facebook’s evolution is a welcome sign that Internet companies are becoming more aware of their environmental footprint. It also underscores the fact that their decisions on how to power their data centers can influence utilities for the better. “Apple and Facebook show the power IT companies have on this stuff,” says Cook.

But there are still major laggards in the industry. Greenpeace points to Twitter, which just went public last year. Unlike Facebook or Apple, Twitter still hasn’t built any data centers of its own, instead renting server space from third-party companies. Twitter has remained silent about the kind of electricity that powers its services, while providing very little information in general about its energy use or its energy goals. While some of that silence can be explained by the fact that the company doesn’t own its data centers, the Greenpeace report points out that other companies that rent servers, like Salesforce and Box, have made commitments to 100% renewable energy.

In response to the report, a Twitter spokesperson said:

Twitter believes strongly in energy efficiency and optimization of resources for minimal environmental impact. As we build out our infrastructure, we continue to strive for even greater efficiency of operations.

As a relatively new public company, Twitter will likely come under more pressure to be transparent about its energy use and environmental goals. The fact that Twitter is so popular among journalists and activists will certainly increase that pressure over time, as happened with Facebook.

But a bigger problem for a green cloud is Amazon Web Services (AWS), Amazon’s highly popular cloud-computing platform. AWS hosts Amazon’s own cloud content, like the Amazon Prime streaming video service, but it also hosts data from countless other customers — including Netflix, which by itself accounts for nearly a third of Internet traffic in North America during peak evening hours. Amazon has been mostly silent about the environmental footprint of its cloud services, though the company claims to have very high utilization rates, which allow it to use cheaper off-peak electricity—but again, there’s little open data on this. The company’s data centers in northern Virginia are by far its largest, but just a tiny sliver of the electricity there is provided by renewable sources, with the bulk coming from coal. AWS does say that its data centers in Oregon (which include the company’s GovCloud platform) run on 100% carbon-free power, but it’s not clear how those power sources break down, and Amazon hasn’t publicly committed to using renewable energy.

When asked about Greenpeace’s report, an AWS spokesperson said the company agreed that efficiency and clean energy were important for cloud computing, but also said the report “misses the mark by using false assumptions on AWS operations and inaccurate data on AWS energy consumption. We provided this feedback to Greenpeace prior to publishing their report.” Greenpeace’s David Pomerantz, a co-author on the report, said that AWS declined to share data on energy consumption before the report was put together, unlike a number of other companies:

We did share our data with Amazon in advance of publishing the report. Amazon told us that our energy mix data for some of its AWS facilities was incorrect, but refused to offer alternative data for any of its facility other than Ireland, where it claimed a mix of 50% renewable energy and 22% coal. When asked, Amazon refused to provide data on how it is achieving that mix in Ireland, so Greenpeace has continued to use Irish national data for that facility. Using Amazon’s Ireland data would result in a company CEI [Clean Energy Index] that would be improved from 15 to 19%, still quite low.

Amazon has noted in the past that cloud computing is inherently more efficient than traditional computing, since companies are able to consolidate their data center use. And moving media and other services as data via the cloud is much more efficient than creating and shipping physical objects. But the cloud doesn’t come free. As more of our lives migrate to the digital ether, Internet companies—and their billions of customers—need to be more aware of the power behind the cloud.

TIME whaling

Japanese Whaling Ban Won’t End the Whale Wars

A photo released in 2008 shows a whale being dragged on board a Japanese ship after being harpooned in Antarctic waters AFP/Getty Images

The International Court of Justice has ruled that Japan will no longer be permitted to hunt whales in the Southern Ocean under the dubious pretense of scientific research. But the battle over whaling isn't over

The science in Japan’s “scientific” whaling program has always been a little, well, questionable. Commercial whaling is essentially illegal for all nations that remain part of the International Whaling Commission (IWC). Norway and Iceland, two countries that continue to whale, get around the IWC’s 1986 moratorium by simply rejecting it. Japan, which is still a member of the IWC, has sidestepped the moratorium for years through subtler means, establishing a research program under which it has killed some 3,600 minke whales since the studies began in 2005. Exactly what scientific information Japan’s whaling fleet is gathering through legal slaughter has never been clear — though what’s not in doubt is the destination of the whale meat taken in the hunt, most of which ends up in the handful of restaurants and markets in Japan that still serve whale.

If a scientific whaling program sounds like an oxymoron to you, the U.N.’s International Court of Justice (ICJ) apparently agrees. On Monday the ICJ ordered a temporary halt to Japan’s Antarctic whaling program, ruling that the country had failed to provide any scientific justification for its whaling. “The court concludes that the special permits granted Japan for the killing, taking and treating of whales … are not ‘for purposes of scientific research,'” presiding judge Peter Tomka said, reading the court’s ruling on a case originally brought in 2010 by the government of Australia. The program, he said, “cannot be justified.”

The Japanese government obviously disagrees with the decision, but Foreign Ministry spokesperson Noriyuki Shikata told reporters that Japan would “abide by the ruling of the court” — meaning that for now, at least, Japan’s annual Antarctic hunt is off. For environmentalists who have fought Japanese whaling for years in international courts, in the court of public opinion and sometimes on the ocean itself — as seen in the reality-TV show Whale Wars — Monday’s decision was a moment to celebrate. Former Australian Environment Minister Peter Garrett, who originally launched the suit when his government was still in office, told the Australian Broadcasting Corp. that Antarctic waters would become a true sanctuary for whales:

I’m absolutely over the moon, for all those people who wanted to see the charade of scientific whaling cease once and for all. I think [this] means without any shadow of a doubt that we won’t see the taking of whales in the Southern Ocean in the name of science.

The court’s ruling doesn’t mean that all Japanese whaling will immediately cease. The country has a smaller scientific program in the northern Pacific that will likely now be challenged on the same grounds. The court also left the door open for Japan to resume scientific whaling if it can redesign its program, as Tokyo has claimed it needs data to monitor the impact of whales on its fishing industry. And Japan has always held out the possibility that it could simply withdraw from the IWC altogether, so that it would no longer be bound by the commission’s decisions.

Whaling has never been just about whaling in Japan. Though some coastal towns in Japan have hunted whales for centuries — I visited one such village, Oshika, back in 2005 — Japan only became a whaling power in the wake of World War II, when some of its decommissioned naval vessels were converted into whaling ships and when U.S. occupation officials encouraged the harvesting of whales as a cheap form of protein. The drive to keep whaling today has much less to do with a taste for whale meat — which has long since waned — than it does with the government’s worry that any limit on whaling could set a precedent for Japan’s far more vital commercial fishing industry. Tokyo is right to worry — bluefin tuna, which can fetch tens of thousands of dollars at Tokyo’s Tsukiji fish market, are highly endangered as well.

There’s also the reality that hunting is just one of many threats that whales face today. Whales can be killed accidentally as bycatch, poisoned by pollution, even driven crazy by noise from ships. And like nearly every other species on the planet, whales are threatened by climate change — especially species like bowhead and beluga that live in the rapidly warming Arctic. But on a day when environmentalists are still reeling from the dire predictions in the latest U.N. climate change report, today’s ruling is a rare glimmer of good news.

TIME climate change

Warming World Threatens Us All, Warns U.N. Report

A polar bear scans the area from the top of a large piece of glacial ice in Svalbard, Norway Rebecca Jackrel—Barcroft Media/Getty Images

A new U.N. report illustrates the impact that rising temperatures will have on crop yields, water supplies and sea levels

There have been thousands and thousands and thousands of studies published on climate change since 2007, when the U.N.’s Intergovernmental Panel on Climate Change (IPCC) published its fourth major assessment on global warming. It has taken hundreds and hundreds of scientists to comb through all that research. But the broad, basic message of all those studies is clear enough: climate change is real, it is happening, and unless we’re very lucky, we’re not doing anywhere near enough to adapt to it.

That’s the underlying message of the second installment of the IPCC’s fifth assessment of climate-change science, which was released on Monday morning in the Japanese city of Yokohama. Focusing on the impacts of climate change — ranging from the effects on endangered species to changes in agriculture — the new report demonstrates just how wide-ranging the effects of a warming world will be. “We have assessed impacts as they are happening in natural and human systems on all continents and oceans,” said Rajendra Pachauri, the chair of the IPCC, which was jointly established by the U.N. and the World Meteorological Organization. “No one on this planet will be untouched by climate change.”

So the report predicts with high confidence that the negative impacts of warming on crop yields will outweigh any potential positive impacts; that violent conflict will exacerbate the effects of global warming; that glaciers will continue to shrink as the climate warms, which has major impacts for downstream water supplies; that species on land and in the sea are shifting their range in response to warming and that some will face an increased risk of extinction; that health impacts will be felt from heat waves and from floods in low-lying areas; that the seas will continue to acidify, destroying coral reefs.

But it matters — greatly — exactly what those effects will be. And in this way, at least, the newest IPCC report is marked by a sense of humility, as the world’s scientists come to grips with just how difficult it is to predict precisely how the planet will respond to rising carbon emissions and rising temperatures. Unlike the 2007 IPCC report — which was marred by a handful of errors, including one predicting that Himalayan glaciers would melt by 2035, centuries earlier than any such change is likely to unfold — this year the IPCC is much more conservative about what can and cannot be known about a changing climate.

That means language that might seem less precise. Gone are confident predictions that climate change will definitely make hurricanes in the Atlantic stronger and more intense, as are projections that warming will place 250 million Africans at greater risk from water insecurity. Instead, the IPCC admits that warming will increase water stress and impact crop productivity, noting that “the fraction of the global population experiencing water scarcity and the fraction affected by major river floods increase with the level of warming in the 21st century.”

The report notes there are major uncertainties about the vulnerability of the world to climate change and how both natural and human systems will respond to warming, in part because those systems are so complex. In particular the report admits that the economic effects from climate change are “difficult to estimate,” ranging from 0.2% to 2% of global income.

Does this mean we don’t have anything to worry about from global warming? Not in the least. The IPCC isn’t telling us that the danger posed by global warming has fallen in the seven years since its last assessment report. Rather, the scientific body is more realistically putting climate change in the context of the countless other risks humanity faces — which is important, because climate risks and social risks can interact and amplify each other. Take conflict: the IPCC report notes that a warming world may make violent conflict more likely, but it also makes the case that countries already struggling with conflict will be less able to respond to climate change. Global warming is likely to make poor parts of the world even poorer, but existing poverty will worsen other impacts of climate change. “Climate-related hazards constitute an additional burden to people living in poverty, acting as a threat multiplier,” the report’s authors write.

A planet with 7 billion people and change is already a place that’s on the edge — and unchecked warming could help push us over.

TIME natural disaster

Landslides May Be Inevitable, But They’re Not Yet Predictable

A massive landslide near Oso, Washington killed at least 16 people, with far more still missing Photo by David Ryder/Getty Images

There was plenty of warning before the deadly Washington landslide. Why didn't it help?

There was the rain. The tiny town of Oso in northwestern Washington state is used to wet weather—rain falls every other day on average—but the past few months have been positively biblical, with precipitation as much as 200% above normal. There was the geography: steep terrain composed of glacial sediment, which is a loose mix of sand, silt and boulders, the geological equivalent of a banana peel. And there was the history. Mudslides have hit the land around Oso numerous times over the past few decades, including as recently as 2006. There’s a reason that some residents used to call the area “Slide Hill.”

Yet when the earth gave way on the morning of Mar. 22, no one was ready for the scale of devastation. More than 15 million cu. yards (11.5 million cu. m) of debris, equivalent to three million dump-truck loads, came tumbling down, burying nearly 50 homes in a hilly area 60 miles (97 km) northeast of Seattle. At least 16 people have died in the landslide, which covered more than a square mile (2.6 sq. km), and more than 170 people are listed as missing, even as hope of finding survivors dwindles. Even if the number of missing comes down, as officials have predicted, this will go down as one of the deadliest landslides in U.S. history.

There was no shortage of warnings. As the Seattle Times reported earlier this week, a study by outside consultants had been filed with the U.S. Army Corps of Engineers in 1999 warning of “the potential for a large catastrophic failure” on the very hill that collapsed on Mar. 22. A 2000 study by the engineer and geomorphologist Tracy Drury warned that future landslides would take an increasing toll because “human development of the floodplain in this area has steadily increased.” Yet while local officials claimed that residents knew of the landslide risks, there’s little evidence that much was done to try to mitigate those risks. A 1,300-ft. (396 m) “crib wall” of boom logs anchored by 9,000-lb. (4,082 kg) concrete blocks every 50 ft. (15 m) was built after the 2006 landslide. But it was helpless against the slide. “The place was set up to be unstable,” says David Montgomery, a geomorphologist at the University of Washington.

But despite all that, it’s not surprising that Oso wasn’t ready when the earth collapsed. Even though they kill more than 25 Americans and cause more than $2 billion in damages each year on average, landslides are the “underappreciated natural hazard,” as Montgomery puts it. But as Andrew Freedman points out on Mashable, that’s in part because there’s no uniform, national monitoring system:

Instead, the USGS, working with the National Weather Service (NWS) and state and local agencies, has put together a “patchwork quilt” of monitoring and experimental warning programs, based upon rainfall and soil moisture and pressure measurements. One such program has been in place near Puget Sound, but did not cover the area where the March 22 landslide occurred.

This is despite the fact that landslides are the most geographically dispersed natural hazard—all 50 states face at least some mudslide risk. But the widespread nature of landslide risk is part of the reason why there is no uniform warning system, although the USGS has put together a national map that identifies high-risk zones. (Unsurprisingly, they tend to be mountainous regions like the Appalachians, the Rockies and the Pacific Coastal ranges.) While landslides as a whole are common, they occur only rarely at any given location—even places as inherently unstable as the hills above Oso can go decades between slides. And while decades of study—and a national network of radar stations—have enabled meteorologists to predict hurricanes, tornadoes and other extreme weather with increasing precision, it is still incredibly difficult to identify when a landslide-prone hill will finally crumble. Heavy rainfall obviously plays a role, allowing water to infiltrate and loosen soil, but slides can also be triggered by earthquakes or erosion. “We can identify hazard zones, the places where you can expect a high probability of failure,” says Montgomery. “But it’s hard to say this slope will go on this particular day. We just don’t have enough data about the internal plumbing of the hillside.”
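
The “patchwork quilt” of experimental warning programs mentioned above generally works by comparing observed rainfall against empirical intensity-duration thresholds. As a sketch of the idea, here is a check against one widely cited global threshold, published by Norman Caine in 1980 (the storm values are made up, and real programs layer soil-moisture and pore-pressure measurements on top of rainfall):

    # Sketch of a rainfall intensity-duration landslide warning check.
    # Caine (1980) global threshold: I = 14.82 * D**-0.39,
    # with intensity I in mm/hr and duration D in hours.
    def exceeds_threshold(intensity_mm_hr: float, duration_hr: float) -> bool:
        """True when sustained rainfall plots above the empirical threshold curve."""
        return intensity_mm_hr > 14.82 * duration_hr ** -0.39

    # Hypothetical storm: 4 mm/hr sustained for three days.
    print(exceeds_threshold(4.0, 72.0))  # True -- above the curve, elevated slide risk

A threshold like this can flag dangerous weather over a known hazard zone; as Montgomery notes, it still cannot say which slope will fail on which day.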

And it’s not just mountain towns that are at risk of landslides. Oregon state geologists have said that as much as 30% of metro Portland is in a high-risk zone for landslides, and a 2013 study by the University of Washington found that some 8,000 buildings in Seattle are at risk of an earthquake-induced landslide. Internationally, the danger is far greater: a 2012 study in Geology estimated that rainfall-induced landslides alone—like the one near Oso—killed more than 32,000 people between 2004 and 2010, a massive toll, even though mudslides tend to get far less attention than earthquakes, hurricanes or tornadoes. Homes with a view come with danger attached, even if it’s a danger most people don’t know about. Changing that might be the best way to ensure that the next major landslide is nowhere near as deadly.

TIME energy

The Afterlife of Oil Spills

Nearly 11 million gallons of oil spilled into Prince William Sound after the 1989 Exxon Valdez spill Chris Wilkins—AFP/Getty Images

Twenty-five years after the Exxon Valdez oil spill, scientists are still reckoning with the ecological cost

On a shelf at my home, I have a small jar that contains a smear of crude oil. I dug it up on the shore of a small island in Alaska’s Prince William Sound in May 2009, on a reporting trip for a story about the legacy of the Exxon Valdez oil spill. That crude is more than 25 years old now, and its existence is a reminder of just how long-lived the effects of a major oil accident can be. Years after the spill has been stopped, after the press has gone home, the crude oil released into a river or a sea will affect the biology of almost anything it touches—just as it continues to weigh on the people who live and work in the area fouled by crude.

That’s worth remembering as we observe the 25th anniversary of the Exxon Valdez spill today. On Mar. 25, 1989, a tanker captained by Joseph Hazelwood ran aground on Alaska’s Bligh Reef, spilling nearly 11 million gallons (42 million liters) of crude oil into Alaska’s near-pristine Prince William Sound. The oil spread out to more than 1,300 miles (2,100 km) of coastline, choking bird and sea life, and permanently damaging the region’s ecology. Even now, you can still find some of that oil on remote beaches in the Sound, preserved by the cold. As of 2010, just 12 of the 32 monitored wildlife populations, habitats and resource services affected by the spill were considered fully recovered or very likely recovered. The once-prosperous Pacific herring fishery still remains closed after the population of the fish crashed in the years following the spill. While much of the Sound has rebounded, it will never be the same—even a quarter century later.

The Exxon Valdez disaster was the biggest oil spill in U.S. history—until April 2010, when BP’s Deepwater Horizon drilling rig was destroyed in a well blowout, leading to an oil gusher that lasted 87 days and resulted in more than 200 million gallons (757 million liters) of crude flowing into the Gulf of Mexico. While much of the oil was either cleaned up in a response operation that cost billions of dollars or broken down by bacteria in the warm Gulf waters, the ecological damage from the spill was major, and almost four years later, scientists are only beginning to gauge the cost to marine life.

Here’s one example: in a new study published in the Proceedings of the National Academy of Sciences, researchers from the National Oceanic and Atmospheric Administration (NOAA) and several universities assessed the impact of Deepwater Horizon oil on developing embryos of bluefin tuna, yellowfin tuna and amberjack, all commercially important fish species that spawn near the site of the accident. The research team exposed embryos taken from breeding facilities to polycyclic aromatic hydrocarbons (PAHs), a toxic agent released by crude oil. In each tested species, PAH exposure—at levels the researchers said were realistic for the Gulf spill—was linked to abnormalities in heart function and defects in heart development. As the paper concluded:

Losses of early life stages were therefore likely for Gulf populations of tunas, amberjack, swordfish, billfish, and other large predators that spawned in oiled surface habitats.

The PNAS study isn’t the first to blame the BP oil spill for lingering problems with Gulf marine life; a study published earlier this month linked the spill to dwindling numbers of bottlenose dolphins in Louisiana’s Barataria Bay. Nor will it be the last. But that hasn’t slowed the rush to keep drilling going in the Gulf of Mexico, a rush that BP has now been allowed to rejoin after initially being barred from participation in lease sales in the region. The British company won 24 of 31 bids entered in an Interior Department offshore drilling lease sale held last week, paying more than $41 million for the right to explore for oil and gas in the region. Altogether 1.7 million acres (0.69 million hectares) off the coast of Louisiana, Mississippi and Alabama were opened up for new drilling. Despite evidence of the risks, nothing seems likely to stop operations in the Gulf.

As long as there is offshore drilling and marine transport of oil, the risk of accidents will exist. Just two days before the 25th anniversary of the Exxon Valdez spill, at least 168,000 gallons (636,000 liters) of oil spilled from a barge in Galveston Bay in Texas. The spill is blocking the bustling Houston Ship Channel, one of the busiest seaways in the U.S., and threatens an environmentally sensitive bird sanctuary nearby. Given the small size of the spill, it won’t have the kind of major aftereffects seen in the Valdez and BP disasters. But it’s one more reminder that as long as our economy remains so dependent on oil, there will always be the risk of another catastrophe that could linger on and on.

[Update: BP sent along a statement in response to the PNAS study—I'm including it below:

The paper provides no evidence to suggest a population-level impact on tuna, amberjack or other pelagic fish species in the Gulf of Mexico. The oil concentrations used in these lab experiments were rarely seen in the Gulf during or after the Deepwater Horizon accident. In addition, the authors themselves note that it is nearly impossible to determine the early life impact to these species. To overcome this challenge, it would take more information than what’s presented in this paper.

It's worth noting that the researchers mention in the paper how difficult it is to sample live but fragile yolksac larvae of big pelagic species like the bluefin tuna in the wild, which is why the embryos used in the study were collected from breeding stations on land, not the Gulf itself.]

TIME Human Body

Your Nose Can Smell at Least 1 Trillion Scents

A new study demonstrates that your sense of smell is far more sensitive than you think and the world of scents is infinitely more varied. Scientists previously thought humans could smell around 10,000 different odors

Human beings tend to think of themselves as visual first, auditory second, then touch and taste. Down at the bottom of the five senses is smell—at least when it comes to how often we’re aware of it. And while we all know how pungent a bad smell can be, and how memorable a good smell is, we probably don’t think our olfactory sense is all that sensitive, at least compared with the rest of our senses—or with the keen sense of smell exhibited in the animal world. (Sharks can’t literally smell fear, but they can detect the smell of fish even when it makes up just one part per 10 billion parts of water.)

While scientists estimate that human beings can discriminate between several million different colors and almost half a million different sounds, they have long assumed that we can distinguish perhaps 10,000 different odors. Most of the time humans are barely aware they’re smelling anything at all.

But in reality, our noses are incredibly sensitive—and a new study published in Science provides evidence of just how amazing our sniffers are. Researchers at Rockefeller University and the Howard Hughes Medical Institute (HHMI) tested volunteers’ sense of smell using precisely crafted mixtures of odor molecules. After extrapolating the results, the researchers estimated that the average human being can distinguish at least 1 trillion different odors, which would make smell by far the most discriminating of the human senses.

“The message here is that we have more sensitivity in our sense of smell than for which we give ourselves credit,” said Andreas Keller, a research associate at Rockefeller’s Laboratory of Neurogenetics and Behavior and the lead author on the Science study, in a statement. “We just don’t pay attention to it and we don’t use it in everyday life.”

The idea that human beings could only distinguish between 10,000 smells has been around since a 1927 study that posited four elementary odors that people are able to distinguish on a nine-point scale. Do the math (9^4) and you get 6,561 discernible olfactory sensations, a number that was later rounded up to 10,000. Although that value was widely cited, most scientists were skeptical—after all, the human eye uses just three light receptors to see millions of colors, while the typical nose has 400 different olfactory receptors. But as Leslie Vosshall of HHMI, another co-author of the study, noted: “For smell, nobody ever took the time to test.”

Obviously the researchers weren’t going to try to test each smell individually—that would take forever. Instead, they used 128 different odorant molecules to create smell mixtures, using 10, 20 and 30 different components. The molecules themselves evoked familiar smells like cut grass, but when combined in random mixtures of as many as 30 different types, the smells became unfamiliar. That didn’t matter—the study subjects weren’t supposed to identify the smells. Instead, the researchers would present them with three vials of scents—two that were identical, and one that was unique—and ask them to indicate which scent was different from the others. Each of the 26 subjects made 264 comparisons.

Keller and his colleagues found that their study subjects could generally tell the difference between mixtures containing as much as 51.17% of the same components. Much higher than that, and they were unable to distinguish the smells—though it’s worth noting that some subjects could distinguish between smell mixtures that were as much as 90% similar. The researchers then extrapolated the total number of mixtures possible in each of their three categories. Since the majority of their study subjects could distinguish between mixtures that were 51.17% similar or less, they estimate that the average human can discriminate more than 1 trillion separate smells.
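
The extrapolation rests on simple combinatorics. Even counting only the mixtures the team could have assembled from their 128 odorants, the space is astronomically large—which is why 1 trillion is best read as a floor. A quick count (this tallies possible mixtures only; the paper’s actual estimate also folds in the measured ~51% discrimination limit):

    # Distinct mixtures of 10, 20 or 30 components drawn from 128 odorants.
    import math
    total = sum(math.comb(128, k) for k in (10, 20, 30))
    print(f"{total:.3e} possible mixtures")  # on the order of 10**29

Against a space that large, 1 trillion distinguishable smells is a small, conservative corner.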

That is a vast number of scents, and it’s almost certainly too low, because there are many more odor molecules in the real world that could be mixed in nearly uncountable ways. So it’s not just that human beings have sensitive olfactory systems—though not that sensitive, otherwise more people would be able to distinguish smells that were more than 50% similar. It’s that the world offers a near infinite variety of smells. If human beings think their sense of smell isn’t that important, it has more to do with the fact that we’ve done our best to eliminate smells through refrigeration, air filtration, and yes, daily showers. As Vosshall put it:

The world is always changing. Plants are evolving new smells. Perfume companies are making new scents. You might move to some part of the world where you’ve never encountered the fruits and vegetables and flowers that grow there. But your nose is ready. With a sensory system that is that complex, we are fully ready for anything.

The nose, as it turns out, really does know.
