TIME animals

Nature’s Top 10 Cute Critters for 2014

A serious science journal allows itself some cuddles

If you read science journals (and really, who doesn’t?), you know that it’s not easy to top Nature—and Nature itself surely knows it. They’re the major leagues, the senior circuit, the place the serious stuff goes to get seen. Nature doesn’t do small—and it definitely doesn’t do cute.

At least, it didn’t.

But every now and then, even the folks on the peer review panels start to feel cuddly. Spend your days vetting new studies about the Dumbo octopus or the toupee monkey or the robot baby penguins that can fool real penguins, and you have to admit that sometimes nature can be pretty adorable—even if Nature can’t.

So in a nod to the sweetness that hides in the science, the journal just released an uncharacteristically precious video: the Top 10 Cutest Animals in 2014. You can go back to being Mr. Grumpypants tomorrow, Nature. But for now, give us a great big hug.

TIME health

For Once the Anti-Vaxxers Aren’t (Entirely) to Blame

Face of the enemy: A molecular model of the whooping cough toxin LAGUNA DESIGN; Getty Images/Science Photo Library RF

Jeffrey Kluger is Editor at Large for TIME.

California's whooping cough outbreak is largely the fault of a safe but imperfect vaccine

Anti-vaxxers are epidemiology’s repeat offenders—the first and sometimes only suspects you need to call in for questioning whenever there’s an outbreak of a vaccine-preventable disease. So on those occasions when their prints aren’t all over the crime scene, it’s worth giving them a nod. That’s the case—sort of, kind of—when it comes to the current whooping cough (or pertussis) epidemic that’s burning its way through California, with nearly 10,000 cases since the first of the year, making it the worst outbreak of the disease since the 1940s. So far, one infant has died.

Before we start giving out any laurels, let’s be clear on one point: the anti-vaxxers continue to be risibly wrong when they say that vaccines are dangerous (they aren’t), that they lead to autism, ADHD, learning disabilities and more (they don’t), and that you should take your public-health advice from the likes of Jenny McCarthy, Rob Schneider, and Donald Trump instead of virtually every medical and scientific authority on the planet (you shouldn’t). But a safe vaccine is not always the same as an entirely effective vaccine, and here the whooping cough shot is coming up a little short—with emphasis on the “little.”

According to the U.S. Centers for Disease Control, the pertussis vaccine starts off highly effective, with 90% of kids developing full immunity to the disease in their first year after inoculation. But that protection starts to fade in year two, and by the five-year point, only 70% of kids are still protected. Until the 1990s, a more effective formulation was available, but it was replaced due to side effects (pain, swelling and perhaps some fever—not autism, thank you very much). The newer version eliminates those problems, but at a cost to effectiveness.

The waning protection the vaccine affords helps explain the cyclical nature of whooping cough outbreaks, with cases usually beginning to rise every three to five years. Certainly, the anti-vax crowd has not helped matters any. When a vaccine offers only imperfect protection, it’s especially important that as many people as possible get it since this maximizes what’s known as herd immunity—the protection a community that’s largely immune can offer to the minority of people who aren’t.

Last spring’s mumps outbreak in Columbus, Ohio, was due in part to a combination of that vaccine’s relatively low 80-90% effectiveness rate and poor vaccine compliance. As I reported in TIME’s Oct. 6, 2014 issue, 80% of people who contracted the disease said they had been vaccinated in childhood, but only 42% of those cases could be confirmed. In the current whooping cough epidemic, California health authorities estimate that only 10% of all people who have come down with the disease were never vaccinated. That’s up to 10% of patients who never needed to get sick at all, but a far smaller share than in Columbus.

The heart of the anti-vaxxers’ argument is not, of course, that some vaccines offer incomplete protection. If it were, they wouldn’t find so many willing believers. For one thing, the large majority of vaccines achieve at least a 90% effectiveness level—and often much higher. For another, even if they didn’t, it would be hard to argue that imperfect protection isn’t better than none at all.

Seat belts, after all, aren’t 100% effective at preventing highway deaths either, and condoms don’t entirely eliminate the risk of pregnancy or STDs. But that doesn’t mean you stop using them, because your brain makes a rational risk calculation about the wisdom of taking cost-free precautions. You might not make such smart choices, however, if somebody muddied the equation by introducing the faux variable of imaginary risk—seat belts and condoms cause autism, say.

Persuading people to run that flawed calculus is where the anti-vaccine crowd does its real damage. A new—and scary—interactive map from the Council on Foreign Relations tracks the global rise or fall of vaccine-preventable diseases from 2008 to 2014. Over that same period, while most of the world saw a 57% decline in cases, North America—driven mostly by the U.S.—showed a stunning 600% increase.

It’s fitting somehow that the locations of the outbreaks show up on the map as a sort of pox—with the once-clear U.S. slowly becoming blighted from one coast to the other. Misinformation is its own kind of blight—one that’s every bit as deadly as the bacteria and viruses the vaccines were invented to prevent. And it’s the anti-vaxxers themselves who are the carriers of this particular epidemic.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME animals

There Was a Big Bang for Birds

An ex-crocodile. Clearly a step up Luis Costa—AFP/Getty Images

A sweeping new study tells a long genetic tale

If there’s a factory where birds are built, the workers were clearly smoking something the day they designed the hummingbird. And the ostrich. And the toucan. Imagine, too, the pitch meeting for the parrot (“Let’s make this one talk!”) or the peacock (“So we got this crate of feathers…”).

Of course, that’s not how it really happened. Birds came along without our help, with the class Aves diversifying into 23 orders, 142 families, 2,057 genera and finally 9,702 species—the most prolific speciation of all four-limbed vertebrates. Such prodigious divergence makes it hard to determine how the great bird explosion began in the first place. Now, however, in a pair of papers in Science, scientists report that they have an answer. Modern birds, they have learned, got their start like the universe itself—with something of a Big Bang, a burst of speciation that began 65 million years ago with the same asteroid hit that wiped out the dinosaurs and made room for mammals and other land animals.

This finding results from the work of hundreds of scientists at 80 labs and universities across 20 countries, done with the help of bird tissue collected from labs and museums around the world. Those specimens were sent to the Genome Tissue Institute in Beijing, where the basic sequencing was conducted. The first and most basic conclusion the investigators reached was a big one. “This confirms that there was a very rapid radiation and that major lineages of birds were in existence 5 to 6 million years after the extinction event,” says Joel Cracraft, an avian systematist at the American Museum of Natural History in New York and a contributor to the papers. “They were very widely distributed as well.”

But there was much more to be learned, and that required the hundreds of other scientists to get busy parsing the genomes. A lot of their results live down in the technical weeds, where geneticists speak of such things as total evidence nucleotide trees and GTR+GAMMA models. Among the plain-English findings, however, there were some important top-line results. The investigators identified a sort of progenitor bird, for example, a so-called apex predator that came along shortly after the asteroid hit and was the great-great-great granddaddy of all extant land birds. The descendants that that founding father left can be connected in unexpected ways.

The gaudy flamingo and the proletarian pigeon turn out to belong to sister clades—or groups descending from one common ancestor. Similarly, there is a three-way kinship among the cuckoos; the bustards (medium-size game birds that include the paauw and its larger cousin, the straightforwardly named great paauw); and the turacos. The last group is a brilliantly colored and plumed family of birds that includes the African banana eaters and the go-away birds, species that got their names because one of them, well, eats bananas and the other issues a warning call that sounds like it’s saying “go away,” which it sort of is.

Among the more granular discoveries, the investigators report that so-called vocal learners—birds with flexible repertoires of songs and mimicked speech—actually share some of their molecular brain structures with humans. And the very act of singing appears to change the birds’ epigenomes—the regulatory system that sits atop the genes and determines which ones are expressed—meaning that the more frequent the song, the more specialized the bird’s genetic wiring will become.

But just in case the big, fun, colorful Aves class gets above itself, the papers do stress that every extant bird can trace its line back even further than the apex predator, all the way to a small and rather vulgar group of ancestors that are actually alive today: the saltwater crocodile, the American alligator and the Indian gharial—which is sort of an alligator with an absurdly skinny snout. For birds as much as for humans, it seems, no matter how high you climb, there are always a few embarrassing family members to keep you humble.

TIME Science

Apollo 17 and the Case for Returning to the Moon

Apollo 17 astronaut Jack Schmitt standing on surface of moon while holding a rake full of rock samples, with Rover in distance Time Life Pictures / Getty Images

It's been two generations since the moon was eclipsed in NASA's priorities

Richard Nixon was a lunar buzzkill—but at least he was honest about it. During the early years of the space program, Nixon held no political office, which put him on the sidelines for all of the one-man Mercury flights and two-man Gemini flights, as well as the first two flights of the Apollo program. But he assumed the presidency in January of 1969 and was thus the one who got to spike the football in July of that year, phoning the moon from the Oval Office to congratulate the Apollo 11 crew on their historic lunar landing.

Not long afterward, the same President canceled the Apollo program—though he held off on making his announcement until after his reelection in 1972 was assured.

During the final lunar landing mission—Apollo 17, which left Earth on Dec. 7, 1972 and reached the moon on Dec. 11—Nixon was candid about what the future held for America’s exploratory ambitions, and it was not good. “This may be the last time in this century that men will walk on the moon,” he said in a formal pronouncement.

As it turned out, things have been even bleaker than that. It’s been 42 years since Apollo 17 commander Gene Cernan climbed up the ladder of his lunar module, leaving the final human footprint in a patch of lunar soil. TIME’s coverage of the mission provides not only an account of the events, but a sense—unintended at the time—of just how long ago they unfolded. There are the quotation marks that the editors thought should accompany the mention of a black hole, since really, how many people had actually heard of such a thing back then? There was, predictably, the gender bias in the language—with rhapsodic references to man’s urge to explore, man standing on the threshold of the universe. It may be silly to scold long-ago writers for such usage now—but that’s not to say that, two generations on, it doesn’t sound awfully odd.

Over the course of those generations, we’ve made at least one feint at going back to the moon. In 2004, then-President George W. Bush announced a new NASA initiative to return Americans to the lunar surface by 2020. But President Obama scrapped the plan and replaced it with, well, no one is quite certain. There’s a lot of talk about capturing a small asteroid and placing it in lunar orbit so that astronauts can visit it—a mission that is either intriguing, implausible or flat-out risible, depending on whom you talk to. And Mars is on the agenda too—sort of, kind of, sometime in the 2030s.

But the moon, for the moment, is off America’s radar—and we’re the poorer for it. There were nine manned lunar missions over the course of three and a half glorious years, and half a dozen of them landed. That makes six small sites on an alien world that bear human tracks and scratchings—and none at all on the far side of that world, a side that only the 24 men who have orbited the moon have ever seen with their own eyes.

We tell ourselves that we’ve explored the moon, and we have—after a fashion. But only in the sense that Columbus and Balboa explored the Americas when they trod a bit of continental soil. We went much further then; we could—and we should—go much further now. In the meantime, TIME’s coverage of the final time we reached for—and seized—the moon provides a reminder of how good such unashamed ambition feels.

Read a 1973 essay reflecting on the “last of the moon men,” here in the TIME Vault: God, Man and Apollo

TIME

One Great Act of Holiday Kindness

Manuel Sanchez-Paniagua learns to work the slate on the set of Criminal Minds Cliff Lipson; CBS

Jeffrey Kluger is Editor at Large for TIME.

A hit TV show does a very good thing for a very sick child

If you’re not fed up with the human species yet, it’s probably because you haven’t been paying attention. There are our wars, for one thing. According to the Institute for Economics and Peace, of 162 countries surveyed, only 11 are not currently involved in some kind of armed conflict or full-scale combat.

There are our sectarian messes, too. Enjoy the ugly racial tensions sparked by the Ferguson and Staten Island non-indictments of white police officers who killed unarmed black men? Then you’ll love the far less defensible nativist uprising in Dresden, where weekly demonstrations are being staged to protest the imagined “Islamization” of the country, despite the fact that only 2% of the population of Germany’s entire Saxony region is made up of immigrants and only a small minority of them are Muslims. Then there are our drug gangs and street gangs and corrupt politicians and crooked bankers and all of the manifold reprobates who work their manifold harms on everybody else.

And then, just when you’ve had it, just when you’re really, truly, ready to wash your hands of the whole savage lot of us, somebody does something sweet and compassionate and wonderfully caring, and you’re willing to give the species one more chance. Which brings me to Manuel Sanchez Paniagua, the cast of the show Criminal Minds, and—yes, damn it—Christmas.

Manuel deserves a good Christmas season more than most. He is only 15, lives in Mexico City with his family and has been battling cancer for close to two years now—which is an awfully big piece of your life when you’re so young. (He is also—full disclosure—a member of my wife’s family.) Manuel’s illness began in January 2013 with a liver tumor which required three separate surgeries at Boston Children’s Hospital, the last of which was described by the lead doctor as “one of the most difficult in the history of the hospital.”

That was followed by three rounds of chemotherapy and—as is often the case with cancer—a blissful remission, leading his family to hope that Manuel had been cured. As is often the case with cancer too, however, those hopes collapsed.

In September, he suffered a seizure in Mexico and was rushed back to Boston, where his doctors found a brain metastasis. This time there would be more-aggressive treatments, and this time his parents would hear what every parent of a sick boy or girl dreads hearing, which is that just in case, if things turn worse, it might be time to think about granting your child some long-held wishes. So Manuel’s parents asked him what his wish was and he said he wanted to visit the set of Criminal Minds.

There aren’t a whole lot of people who haven’t thought about what they’d choose in such a situation, and the folks who’d pick a Polynesian beach house or a tour of Machu Picchu indulge in more than a little elitist sniffing when they hear of people who’d pick the Grand Canyon or Yankees training camp. The cancer romance The Fault in Our Stars made much of this idea, with Augustus Waters affecting shock that Hazel Grace Lancaster chose a trip to Disneyworld with her parents. “I can’t believe I have a crush on a girl with such cliché wishes!” he says.

But a wish, of course, is a reflection of a moment—who you are when you must make the choice. And when the number of moments you have left to you is in question, you choose what will make you happy right now, today. So Manuel chose Criminal Minds—and the cast and crew and production office made it happen.

Just before Thanksgiving, he and his family flew to Los Angeles to be present for the shooting of the series’ Dec. 10 episode. Joe Mantegna, the show’s biggest name, was directing that episode and he kept Manuel busy, dispatching him onto the set to work the slate, explaining scenes as he directed them, eating lunch with him during a break. Manuel met the rest of the cast, posed for photos with them and visited the writers’ room—a pretty static place if you’re not one of the show’s many rabid fans; Xanadu if you are.

None of what Manuel experienced in the six hours he was on-set will make a lick of difference in his prognosis—unless, of course, it does. Scientists have never fully understood the multiple ways optimism and hope and just plain being happy can help humans battle disease—except to say with near-certainty that they can.

Just as important is what the small act of kindness that came Manuel’s way—and a million-million others like it that are performed around the world every day—say about the prognosis for the human condition. Evil is vulgar, broad-brush stuff—the dark, mindless business of burning things down or blowing them up. Kindness is pointillist—bright dots of good, dabbed and dabbed and dabbed again. No single one of them amounts to very much. But a million-million every day? That can create an awfully beautiful picture.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME climate change

Watch the Science Cop Take on Climate Change Denying Senator Jim Inhofe

The climate denier in charge of the Senate Climate Committee

You don’t have to be a general to be head of the Senate Armed Services Committee, but if you don’t at least believe in the existence of a military, we’ve got a problem on our hands.

The country is about to face something similar in January, when the GOP takes control of the U.S. Senate and Oklahoma’s James Inhofe — Congress’s most vocal global warming denier — becomes chair of the Senate Environment and Public Works Committee. Inhofe, who has called climate change “the greatest hoax ever perpetrated on the American people,” not only sniffs at what the overwhelming majority of climatologists know to be true, he actually tries to go toe to toe with them on the science. And that’s where he exposes how little he knows — and how wrong he is. The Science Cop explains.

TIME psychology

How Memory Links the Presidency, Ferguson and the Cosby Mess

Do you know me? Relax, you're not alone.

Jeffrey Kluger is Editor at Large for TIME.

The human brain forgets much more than it remembers, and that has an impact on history, criminal justice and more

Here’s a difficult one, history buffs: Who was Harry Truman? I know, I know, I told you it would be tough, but think hard: Some famous general? Maybe a physicist?

If you guessed U.S. president, good for you! And if you also knew that Truman was the one who came right after Roosevelt (Franklin, that is) and right before Eisenhower, go to the head of the class.

OK, so maybe remembering Truman isn’t such a big deal. But here’s the thing: By 2040, according to a new study just published in Science, only 26% of college students will remember to include his name if they are asked to make a list of all U.S. Presidents, regardless of order.

That finding, which is less a function of historical illiteracy than of the mysterious ways the human brain works, reveals a lot about the perishability of memory. And that, in turn, has implications for contemporary dramas like the Ferguson tragedy, the Bill Cosby mess and the very underpinnings of the criminal justice system.

The Science study, conducted by a pair of psychologists at Washington University in St. Louis, was actually four studies that took place over 40 years—in 1974, 1991, 2009 and 2014. In the first three, the investigators asked groups of then-college students to list all of the presidents in the order in which they served, and also to list as many of them as they could by name regardless of where they fell in history.

In all three groups over all three eras, the results were remarkably similar. As a rule, 100% of respondents knew the president currently serving, and virtually all knew the prior one or two. Performance then fell off with each previous presidency. Roughly 75% of students in 1974 placed FDR in the right spot, for example. Fewer than 20% of Millennials—born much later—could do that. In all groups, the historical trail would go effectively cold one or two presidents before the subjects’ birth—falling into single digits.

There were exceptions. The Founding Father presidents, particularly the first three—George Washington, John Adams and Thomas Jefferson—scored high in all groups. As did Abraham Lincoln and his two immediate successors, Andrew Johnson and Ulysses S. Grant. As for the Tylers and Taylors and Fillmores? Forget about them—which most people did. The pattern held again in a single larger survey conducted in 2014, with a mixed-age sample group that included Boomers, Gen X’ers and Millennials, all performing true to their own eras.

Almost none of this had to do with any one President’s historical relevance—apart from the Founding Fathers and Lincoln. James Polk’s enormously consequential, one-term presidency is far less recalled than, say, Jimmy Carter’s much less successful four-year stint. Instead, our memory is personal, a thing of the moment, and deeply fallible—and that means trouble.

One of the most disturbing aspects of the Ferguson drama is the mix of wildly different stories eyewitnesses presented to the grand jury, with Michael Brown portrayed as anything from anger-crazed aggressor to supine victim. Some witnesses may have been led by prosecutors, some may have simply been making things up, but at least some were surely doing their best, trying to remember the details of a lethal scene as it unfolded in a few vivid seconds.

If forensic psychology has shown anything, it’s that every single expectation or bias a witness brings to an experience—to say nothing of all of the noise and press and controversy that may follow—can contaminate recall until it’s little more reliable than that of someone who wasn’t there at all.

Something less deadly—if no less ugly—applies in the Bill Cosby case. In an otherwise reasonable piece in the Nov. 25 Washington Post, columnist Kathleen Parker cautions against a collective rush to judgment and reminds readers that under the American legal system, Cosby is not a rapist, but an alleged rapist; and his victims, similarly, are as yet only alleged victims. Fair enough; that’s what the criminal justice rules say. But then, there’s this:

“…we have formed our opinions… only on the memories of the women, most of whom say they were drugged at the time. Some of them have conceded that their recollections are foggy—which, of course they would be, after decades and under pharmaceutically induced circumstances, allegedly.”

In other words, if Cosby did drug them, then perhaps we must throw their testimony out of court because, um, Cosby drugged them. Talk about the (alleged) criminal making hay on his crime. And yet, when it comes to the science of memory, that’s an argument that could work before a judge.

Finally, too, there is the unseemly business of Ray Rice. Virtually nobody who knows what he did has forgotten it—which is what happens when you’re a massively strong athlete and you cold-cock a woman. But it was the complete elevator video actually showing the blow, as opposed to the earlier one in which Rice was seen merely dragging the unconscious body of his soon-to-be-wife out into a hotel hallway, that spelled his end—at least until his lifetime NFL ban was overturned on Nov. 28. Knowing what happened is very different from seeing what happened—and once you saw the savagery of Rice’s blow, you could never unsee it.

When it comes to presidents, the fallibility of memory can help. In the years immediately following Richard Nixon’s resignation, it was a lot harder to appreciate his manifest triumphs—the Clean Air Act, the opening to China—than it is now. George W. Bush is enjoying his own small historical rebound, with his AIDS in Africa initiative and his compassionate attempt at immigration reform looking better and better in the rear-view mirror—despite the still-recent debacles of his Presidency.

We do ourselves a disservice if we hold historical grudges against even our most flawed presidents; but we do just as much harm if we allow ourselves to forget why ill-planned land wars in countries like Iraq or cheap break-ins at places like the Watergate are so morally criminal. Forget the sequence of the Presidents if you must, but do remember their deeds.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME space

Watch Christopher Nolan and Kip Thorne Discuss the Physics of Interstellar

Thorne literally wrote the book on (much of) the movie's cosmology

There’s no arguing about the blockbuster status of Interstellar, director Chris Nolan’s latest box office phenomenon. But plenty of people are debating the science component of that sci-fi tale—which is how it always is when a movie based in something as non-negotiable as physics has to take just enough liberties to make the fiction part of the story fly.

Nolan was determined to keep his narrative scientifically honest, which is why he signed on celebrated Caltech physicist Kip Thorne as technical adviser—the man who literally wrote the book on (much of) the movie’s cosmology. TIME’s Jeffrey Kluger sat down with Nolan and Thorne to talk about human curiosity, the art of sci-fi filmmaking and the one time the two of them locked horns over a plot point.

TIME

On Evolution Day, Remember That Darwin Knew He’d Meet Resistance

A statue of Darwin in the Natural History Museum, London Philippe Lissac—Godong / Getty Images

Plus, TIME's original coverage of the anti-evolution arguments of the 1925 Scopes trial

Correction appended, Nov. 24, 2014, 5:49 p.m.

Time was, “Darwin” was just a guy’s name. It was not a noun (Darwinism) or an adjective (Darwinian). And it certainly wasn’t a flash point for debate between folks who prefer a Scriptural view of the history of life and those who take a more scientific approach. That started to change 155 years ago today, on Nov. 24, 1859, when Charles Darwin’s seminal work—On the Origin of Species—was published.

Darwin knew that by supporting an empirical theory of evolution as opposed to the Biblical account of Creation he was asking for trouble. Two weeks before the book’s publication, he sent letters to 11 prominent scientists of his day, asking for their support—or at least their forbearance—and acknowledging that for some of them, that would not be easy. To the celebrated French botanist Alphonse de Candolle he wrote:

Lord, how savage you will be, if you read it, and how you will long to crucify me alive! I fear it will produce no other effect on you; but if it should stagger you in ever so slight a degree, in this case, I am fully convinced that you will become, year after year, less fixed in your belief in the immutability of species.

And to American Asa Gray, another botanist, he conceded:

Let me add I fully admit that there are very many difficulties not satisfactorily explained by my theory of descent with modification, but I cannot possibly believe that a false theory would explain so many classes of facts as I think it certainly does explain.

But the whirlwind came anyway. Speaking of Darwin in 1860, the Bishop of Oxford asked: “Was it through his grandfather or his grandmother that he claimed his descent from a monkey?” The battle raged in the U.S. in the summer of 1925, with the trial of John Scopes, a substitute school teacher charged with violating a Tennessee statute forbidding the teaching of evolution in schools.

But Darwin and his theory of evolution endured, so much so that Nov. 24 is now recognized as Evolution Day. As if serendipity and circumstance were conspiring to validate that decision, it was on another Nov. 24, in 1974, that the fossilized remains of Lucy, the Australopithecus who did so much to fill in a major gap in human evolution, were found in Ethiopia.

In honor of Lucy and Evolution Day and Darwin himself, check out TIME’s coverage of the florid anti-evolution closing argument of prosecuting attorney and three-time presidential candidate William Jennings Bryan during the Scopes trial, as quoted in the magazine’s Aug. 10, 1925 issue:

“Darwin suggested two laws, sexual selection and natural selection. Sexual selection has been laughed out of the class room, and natural selection is being abandoned, and no new explanation is satisfactory even to scientists. Some of the more rash advocates of Evolution are wont to say that Evolution is as firmly established as the law of gravitation or the Copernican theory.

“The absurdity of such a claim is apparent when we remember that any one can prove the law of gravitation by throwing a weight into the air and that any one can prove the roundness of the earth by going around it, while no one can prove Evolution to be true in any way whatever.”

Bryan died mere days after the trial ended but, as the historical record shows, his strenuous efforts paid off—sort of. Scopes was duly convicted. His sentence for teaching what most of the world now accepts as science: a $100 fine.

Read the full text of that story, free of charge, here in the TIME archives, or in its original format, in the TIME Vault: Dixit

Correction: The original version of this article misstated the date of Darwin Day. Darwin Day is typically celebrated on February 12.

TIME space

New View of the Solar System’s Most Fascinating Moon

The newly released image of Jupiter's moon Europa. NASA/JPL-Caltech/SETI Institute

NASA's reprocessed picture of Jupiter's Europa gives us a fresh look at the likeliest place in the solar system for extraterrestrial life.

This is not the back of an eyeball—even though it looks like the back of an eyeball. It’s Jupiter’s frozen moon Europa—the sixth largest moon in the solar system, just behind Earth’s. But the organic appearance of Europa in this newly released, newly reprocessed image captured by the Galileo spacecraft in the late 1990s is apt all the same, because the moon may be the likeliest world in the solar system to harbor extraterrestrial life.

Europa is entirely covered by a shell of water ice, anywhere from 1.8 mi. to 62 mi. (3 to 100 km) thick, depending upon which astronomer’s estimates you’re using and where on the moon you’re measuring. But the existence of the ice is proven, and it all but certainly covers a deep, mineral-rich water ocean descending to a depth of another 62 mi. It is tidal flexing that keeps the ocean liquid—the steady gravitational plucking Europa experiences every time it passes or is passed by one of its three large sister moons, Io, Ganymede and Callisto.

In the same way a wire hanger bent rapidly back and forth can become too hot to touch at the point of flexing, so too does the center of Europa heat up. That causes the water to remain both relatively warm and constantly in motion. Keep that up for 4 billion years in an oceanic environment believed to contain hydrocarbons, and you may well cook up something living.

The most compelling evidence for Europa’s dynamic behavior was gathered by Voyager 2, when it flew by the moon in 1979, and Galileo, when it arrived in Jovian orbit in 1995. The cameras of both spacecraft captured the vascular-looking webwork of fractures in the moon’s surface ice, and close-up images revealed what looked like jagged icebergs that had broken free, tipped sideways and quickly frozen back in place in the paralyzing cold of deep space. All this suggested an ocean that was in constant motion.

The colors used in earlier versions of the image were based on knowledge of the moon’s chemistry and a bit of conjecture about exactly what shades it would produce. But the new version is based on both improved knowledge and improved image processing. The ruddy colors in the fractures are the products of the minerals that bubble up through the cracks. Green, violet and near-infrared filters were used to establish the proper palette.

A better, more accurate picture of Europa does nothing to change the facts on the ground there—or, more tantalizingly, below the ground. The moon remains the most fascinating non-Earthly object in our solar system. The new image, however, does serve as one more come-hither gesture from a world that’s been beckoning us to return for a long time.
