TIME Science

Apollo 17 and the Case for Returning to the Moon

Time Life Pictures / Getty Images Apollo 17 astronaut Jack Schmitt standing on the surface of the moon, holding a rake full of rock samples, with the Rover in the distance

It's been two generations since the moon was eclipsed in NASA's priorities

Richard Nixon was a lunar buzzkill—but at least he was honest about it. During the early years of the space program, Nixon held no political office, which put him on the sidelines for all of the one-man Mercury flights and two-man Gemini flights, as well as the first two flights of the Apollo program. But he assumed the presidency in January of 1969 and was thus the one who got to spike the football in July of that year, phoning the moon from the Oval Office to congratulate the Apollo 11 crew on their historic lunar landing.

Not long afterward, the same President canceled the Apollo program—though he held off on making his announcement until after his reelection in 1972 was assured.

During the final lunar landing mission—Apollo 17, which left Earth on Dec. 7, 1972 and reached the moon on Dec. 11—Nixon was candid about what the future held for America’s exploratory ambitions, and it was not good. “This may be the last time in this century that men will walk on the moon,” he said in a formal pronouncement.

As it turned out, things have been even bleaker than that. It’s been 42 years since Apollo 17 commander Gene Cernan climbed up the ladder of his lunar module, leaving the final human footprint in a patch of lunar soil. TIME’s coverage of the mission provides not only an account of the events, but a sense—unintended at the time—of just how long ago they unfolded. There are the quotation marks that the editors thought should accompany the mention of a black hole, since really, how many people had actually heard of such a thing back then? There is, predictably, the gender bias in the language—with rhapsodic references to man’s urge to explore, man standing on the threshold of the universe. It may be silly to scold long-ago writers for such usage now—but that’s not to say that, two generations on, it doesn’t sound awfully odd.

Over the course of those generations, we’ve made at least one feint at going back to the moon. In 2004, then-President George W. Bush announced a new NASA initiative to return Americans to the lunar surface by 2020. But President Obama scrapped the plan and replaced it with, well, no one is quite certain. There’s a lot of talk about capturing a small asteroid and placing it in lunar orbit so that astronauts can visit it—a mission that is either intriguing, implausible or flat-out risible, depending on whom you talk to. And Mars is on the agenda too—sort of, kind of, sometime in the 2030s.

But the moon, for the moment, is off America’s radar—and we’re the poorer for it. There were nine manned lunar missions over the course of three and a half glorious years, and half a dozen of them landed. That makes six small sites on an alien world that bear human tracks and scratchings—and none at all on the far side of that world, a side no humans but the 24 men who have orbited the moon have seen with their own eyes.

We tell ourselves that we’ve explored the moon, and we have—after a fashion. But only in the sense that Columbus and Balboa explored the Americas when they trod a bit of continental soil. We went much further then; we could—and we should—go much further now. In the meantime, TIME’s coverage of the final time we reached for—and seized—the moon provides a reminder of how good such unashamed ambition feels.

Read a 1973 essay reflecting on the “last of the moon men,” here in the TIME Vault: God, Man and Apollo

TIME

One Great Act of Holiday Kindness

Cliff Lipson—CBS Manuel Sanchez-Paniagua learns to work the slate on the set of Criminal Minds

Jeffrey Kluger is Editor at Large for TIME.

A hit TV show does a very good thing for a very sick child

If you’re not fed up with the human species yet, it’s probably because you haven’t been paying attention. There are our wars, for one thing. According to the Institute for Economics and Peace, of 162 countries surveyed, only 11 are not currently involved in some kind of armed conflict or full-scale combat.

There are our sectarian messes, too. Enjoy the ugly racial tensions sparked by the Ferguson and Staten Island non-indictments of white police officers who killed unarmed black men? Then you’ll love the far less defensible nativist uprising in Dresden, where weekly demonstrations are being staged to protest the imagined “Islamization” of the country, despite the fact that only 2% of the population of Germany’s entire Saxony region is made up of immigrants and only a small minority of them are Muslims. Then there are our drug gangs and street gangs and corrupt politicians and crooked bankers and all of the manifold reprobates who work their manifold harms on everybody else.

And then, just when you’ve had it, just when you’re really, truly, ready to wash your hands of the whole savage lot of us, somebody does something sweet and compassionate and wonderfully caring, and you’re willing to give the species one more chance. Which brings me to Manuel Sanchez Paniagua, the cast of the show Criminal Minds, and—yes, damn it—Christmas.

Manuel deserves a good Christmas season more than most. He is only 15, lives in Mexico City with his family and has been battling cancer for close to two years now—which is an awfully big piece of your life when you’re so young. (He is also—full disclosure—a member of my wife’s family.) Manuel’s illness began in January 2013 with a liver tumor which required three separate surgeries at Boston Children’s Hospital, the last of which was described by the lead doctor as “one of the most difficult in the history of the hospital.”

That was followed by three rounds of chemotherapy and—as is often the case with cancer—a blissful remission, leading his family to hope that Manuel had been cured. As is often the case with cancer too, however, those hopes collapsed.

In September, he suffered a seizure in Mexico and was rushed back to Boston, where his doctors found a brain metastasis. This time there would be more-aggressive treatments, and this time his parents would hear what every parent of a sick boy or girl dreads hearing: that if things take a turn for the worse, it might be time to think about granting your child some long-held wishes. So Manuel’s parents asked him what his wish was, and he said he wanted to visit the set of Criminal Minds.

There aren’t a whole lot of people who haven’t thought about what they’d choose in such a situation, and the folks who’d pick a Polynesian beach house or a tour of Machu Picchu indulge in more than a little elitist sniffing when they hear of people who’d pick the Grand Canyon or Yankees training camp. The cancer romance The Fault in Our Stars made much of this idea, with Augustus Waters affecting shock that Hazel Grace Lancaster chose a trip to Disney World with her parents. “I can’t believe I have a crush on a girl with such cliché wishes!” he says.

But a wish, of course, is a reflection of a moment—who you are when you must make the choice. And when the number of moments you have left to you is in question, you choose what will make you happy right now, today. So Manuel chose Criminal Minds—and the cast and crew and production office made it happen.

Just before Thanksgiving, he and his family flew to Los Angeles to be present for the shooting of the series’ Dec. 10 episode. Joe Mantegna, the show’s biggest name, was directing that episode and he kept Manuel busy, dispatching him onto the set to work the slate, explaining scenes as he directed them, eating lunch with him during a break. Manuel met the rest of the cast, posed for photos with them and visited the writers’ room—a pretty static place if you’re not one of the show’s many rabid fans; Xanadu if you are.

None of what Manuel experienced in the six hours he was on-set will make a lick of difference in his prognosis—unless, of course, it does. Scientists have never fully understood the multiple ways optimism and hope and just plain being happy can help humans battle disease—except to say with near-certainty that they can.

Just as important is what the small act of kindness that came Manuel’s way—and a million-million others like it that are performed around the world every day—says about the prognosis for the human condition. Evil is vulgar, broad-brush stuff—the dark, mindless business of burning things down or blowing them up. Kindness is pointillist—bright dots of good, dabbed and dabbed and dabbed again. No single one of them amounts to very much. But a million-million every day? That can create an awfully beautiful picture.


TIME climate change

Watch the Science Cop Take on Climate Change Denying Senator Jim Inhofe

The climate denier about to take charge of the Senate Environment and Public Works Committee

You don’t have to be a general to be head of the Senate Armed Services Committee, but if you don’t at least believe in the existence of a military, we’ve got a problem on our hands.

The country is about to face something similar in January, when the GOP takes control of the U.S. Senate and Oklahoma’s James Inhofe — Congress’s most vocal global warming denier — becomes chair of the Senate Environment and Public Works Committee. Inhofe, who has called climate change “the greatest hoax ever perpetrated on the American people,” not only sniffs at what the overwhelming majority of climatologists know to be true, he actually tries to go toe to toe with them on the science. And that’s where he exposes how little he knows — and how wrong he is. The Science Cop explains.

TIME psychology

How Memory Links the Presidency, Ferguson and the Cosby Mess

Do you know me? Relax, you're not alone.

Jeffrey Kluger is Editor at Large for TIME.

The human brain forgets much more than it remembers, and that has an impact on history, criminal justice and more

Here’s a difficult one, history buffs: Who was Harry Truman? I know, I know, I told you it would be tough, but think hard: Some famous general? Maybe a physicist?

If you guessed U.S. president, good for you! And if you also knew that Truman was the one who came right after Roosevelt (Franklin, that is) and right before Eisenhower, go to the head of the class.

OK, so maybe remembering Truman isn’t such a big deal. But here’s the thing: By 2040, according to a new study just published in Science, only 26% of college students will remember to include his name if they are asked to make a list of all U.S. Presidents, regardless of order.

That finding, which is less a function of historical illiteracy than of the mysterious ways the human brain works, reveals a lot about the perishability of memory. And that, in turn, has implications for contemporary dramas like the Ferguson tragedy, the Bill Cosby mess and the very underpinnings of the criminal justice system.

The Science study, conducted by a pair of psychologists at Washington University in St. Louis, was actually four studies that took place over 40 years—in 1974, 1991, 2009 and 2014. In the first three, the investigators asked groups of then-college students to list all of the presidents in the order in which they served, and also to list as many of them as they could by name regardless of where they fell in history.

In all three groups over all three eras, the results were remarkably similar. As a rule, 100% of respondents knew the president currently serving, and virtually all knew the prior one or two. Performance then fell off with each previous presidency. Roughly 75% of students in 1974 placed FDR in the right spot, for example. Fewer than 20% of Millennials—born much later—could do that. In all groups, the historical trail would go effectively cold one or two presidents before the subjects’ birth—falling into single digits.
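The shape of that falloff is steep enough to sketch with a toy model. The snippet below treats recall as simple exponential decay with each presidency further back from a respondent’s own era—the decay rate is invented for illustration and is not a parameter from the Science study:

import math

def recall_percent(presidencies_back, rate=0.45):
    """Toy model: share of respondents recalling a president who
    served a given number of presidencies before their own era.
    The decay rate is made up for illustration."""
    return 100 * math.exp(-rate * presidencies_back)

for n in range(7):
    print(f"{n} presidencies back: {recall_percent(n):.0f}%")
# 0 back: 100%, 2 back: 41%, 6 back: 7% — into the single digits,
# roughly the pattern the study describes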

There were exceptions. The Founding Father presidents, particularly the first three—George Washington, John Adams and Thomas Jefferson—scored high in all groups. As did Abraham Lincoln and his two immediate successors, Andrew Johnson and Ulysses S. Grant. As for the Tylers and Taylors and Fillmores? Forget about them—which most people did. The pattern held again in a single larger survey conducted in 2014, with a mixed-age sample group that included Boomers, Gen X’ers and Millennials, all performing true to their own eras.

Almost none of this had to do with any one President’s historical relevance—apart from the Founding Fathers and Lincoln. James Polk’s enormously consequential, one-term presidency is far less recalled than, say, Jimmy Carter’s much less successful four-year stint. Instead, our memory is personal, a thing of the moment, and deeply fallible—and that means trouble.

One of the most disturbing aspects of the Ferguson drama is the mix of wildly different stories eyewitnesses presented to the grand jury, with Michael Brown portrayed as anything from anger-crazed aggressor to supine victim. Some witnesses may have been led by prosecutors, some may have simply been making things up, but at least some were surely doing their best, trying to remember the details of a lethal scene as it unfolded in a few vivid seconds.

If forensic psychology has shown anything, it’s that every single expectation or bias a witness brings to an experience—to say nothing of all of the noise and press and controversy that may follow—can contaminate recall until it’s little more reliable than that of someone who wasn’t there at all.

Something less deadly—if no less ugly—applies in the Bill Cosby case. In an otherwise reasonable piece in the Nov. 25 Washington Post, columnist Kathleen Parker cautions against a collective rush to judgment and reminds readers that under the American legal system, Cosby is not a rapist, but an alleged rapist; and his victims, similarly, are as yet only alleged victims. Fair enough; that’s what the criminal justice rules say. But then, there’s this:

“…we have formed our opinions… only on the memories of the women, most of whom say they were drugged at the time. Some of them have conceded that their recollections are foggy—which, of course they would be, after decades and under pharmaceutically induced circumstances, allegedly.”

In other words, if Cosby did drug them, then perhaps we must throw their testimony out of court because, um, Cosby drugged them. Talk about the (alleged) criminal making hay on his crime. And yet, when it comes to the science of memory, that’s an argument that could work before a judge.

Finally, too, there is the unseemly business of Ray Rice. Virtually nobody who knows what he did has forgotten it—which is what happens when you’re a massively strong athlete and you cold-cock a woman. But it was the complete elevator video actually showing the blow, as opposed to the earlier one in which Rice was seen merely dragging the unconscious body of his soon-to-be-wife out into a hotel hallway, that spelled his end—at least until his indefinite NFL suspension was overturned on Nov. 28. Knowing what happened is very different from seeing what happened—and once you saw the savagery of Rice’s blow, you could never unsee it.

When it comes to presidents, the fallibility of memory can help. In the years immediately following Richard Nixon’s resignation, it was a lot harder to appreciate his manifest triumphs—the Clean Air Act, the opening to China—than it is now. George W. Bush is enjoying his own small historical rebound, with his AIDS in Africa initiative and his compassionate attempt at immigration reform looking better and better in the rear-view mirror—despite the still-recent debacles of his Presidency.

We do ourselves a disservice if we hold historical grudges against even our most flawed presidents; but we do just as much harm if we allow ourselves to forget why ill-planned land wars in countries like Iraq or cheap break-ins at places like the Watergate are so morally criminal. Forget the sequence of the Presidents if you must, but do remember their deeds.


TIME space

Watch Christopher Nolan and Kip Thorne Discuss the Physics of Interstellar

Thorne literally wrote the book on (much of) the movie's cosmology

There’s no arguing about the blockbuster status of Interstellar, director Chris Nolan’s latest box office phenomenon. But plenty of people are debating the science component of that sci-fi tale—which is how it always is when a movie grounded in something as non-negotiable as physics has to take just enough liberties to make the fiction part of the story fly.

Nolan was determined to keep his narrative scientifically honest, which is why he signed on celebrated Caltech physicist Kip Thorne—who literally wrote the book on (much of) the movie’s cosmology—as technical adviser. TIME’s Jeffrey Kluger sat down with Nolan and Thorne to talk about human curiosity, the art of sci-fi filmmaking and the one time the two of them locked horns over a plot point.

TIME

On Evolution Day, Remember That Darwin Knew He’d Meet Resistance

Philippe Lissac—Godong / Getty Images A statue of Darwin in the Natural History Museum, London

Plus, TIME's original coverage of the anti-evolution arguments of the 1925 Scopes trial

Correction appended, Nov. 24, 2014, 5:49 p.m.

Time was, “Darwin” was just a guy’s name. It was not a noun (Darwinism) or an adjective (Darwinian). And it certainly wasn’t a flash point for debate between folks who prefer a Scriptural view of the history of life and those who take a more scientific approach. That started to change 155 years ago today, on Nov. 24, 1859, when Charles Darwin’s seminal work—On the Origin of Species—was published.

Darwin knew that by supporting an empirical theory of evolution as opposed to the Biblical account of Creation he was asking for trouble. Two weeks before the book’s publication, he sent letters to 11 prominent scientists of his day, asking for their support—or at least their forbearance—and acknowledging that for some of them, that would not be easy. To the celebrated French botanist Alphonse de Candolle he wrote:

Lord, how savage you will be, if you read it, and how you will long to crucify me alive! I fear it will produce no other effect on you; but if it should stagger you in ever so slight a degree, in this case, I am fully convinced that you will become, year after year, less fixed in your belief in the immutability of species.

And to American Asa Gray, another botanist, he conceded:

Let me add I fully admit that there are very many difficulties not satisfactorily explained by my theory of descent with modification, but I cannot possibly believe that a false theory would explain so many classes of facts as I think it certainly does explain.

But the whirlwind came anyway. Speaking of Darwin in 1860, the Bishop of Oxford asked: “Was it through his grandfather or his grandmother that he claimed his descent from a monkey?” The battle raged in the U.S. in the summer of 1925, with the trial of John Scopes, a substitute school teacher charged with violating a Tennessee statute forbidding the teaching of evolution in schools.

But Darwin and his theory of evolution endured, so much so that Nov. 24 is now recognized as Evolution Day. As if serendipity and circumstance were conspiring to validate that decision, it was on another Nov. 24, in 1974, that the fossilized remains of Lucy, the Australopithecus who did so much to fill in a major gap in human evolution, were found in Ethiopia.

In honor of Lucy and Evolution Day and Darwin himself, check out TIME’s coverage of the florid anti-evolution closing argument of prosecuting attorney and three-time presidential candidate William Jennings Bryan during the Scopes trial, as quoted in the magazine’s Aug. 10, 1925 issue:

“Darwin suggested two laws, sexual selection and natural selection. Sexual selection has been laughed out of the class room, and natural selection is being abandoned, and no new explanation is satisfactory even to scientists. Some of the more rash advocates of Evolution are wont to say that Evolution is as firmly established as the law of gravitation or the Copernician theory.

“The absurdity of such a claim is apparent when we remember that any one can prove the law of gravitation by throwing a weight into the air and that any one can prove the roundness of the earth by going around it, while no one can prove Evolution to be true in any way whatever.”

Bryan died mere days after the trial ended but, as the historical record shows, his strenuous efforts paid off—sort of. Scopes was duly convicted. His sentence for teaching what most of the world now accepts as science: $100.

Read the full text of that story, free of charge, here in the TIME archives, or in its original format, in the TIME Vault: Dixit

Correction: The original version of this article misstated the date of Darwin Day. Darwin Day is typically celebrated on February 12.

TIME space

New View of the Solar System’s Most Fascinating Moon

NASA/JPL-Caltech/SETI Institute The newly released image of Jupiter's moon Europa.

NASA's reprocessed picture of Jupiter's Europa gives us a fresh look at the likeliest place in the solar system for extraterrestrial life.

This is not the back of an eyeball—even though it looks like the back of an eyeball. It’s Jupiter’s frozen moon Europa—the sixth largest moon in the solar system, just behind Earth’s. But the organic appearance of Europa in this newly released, newly reprocessed image captured by the Galileo spacecraft in the late 1990s is apt all the same, because the moon may be the likeliest world in the solar system to harbor extraterrestrial life.

Europa is entirely covered by a shell of water ice, anywhere from 1.8 mi. to 62 mi. (3 to 100 km) thick, depending upon which astronomer’s estimates you’re using and where on the moon you’re measuring. But the existence of the ice is proven, and it all but certainly covers a deep, mineral-rich water ocean descending to a depth of another 62 mi. It is tidal flexing that keeps the ocean liquid—the steady gravitational plucking Europa experiences every time it passes or is passed by one of its three large sister moons, Io, Ganymede and Callisto.

In the same way a wire hanger bent rapidly back and forth can become too hot to touch at the point of flexing, so too does the center of Europa heat up. That causes the water to remain both relatively warm and constantly in motion. Keep that up for 4 billion years in an oceanic environment believed to contain hydrocarbons, and you may well cook up something living.

The most compelling evidence for Europa’s dynamic behavior was gathered by Voyager 2, when it flew by the moon in 1979, and Galileo, when it arrived in Jovian orbit in 1995. The cameras of both spacecraft captured the vascular-looking webwork of fractures in the moon’s surface ice, and close-up images revealed what looked like jagged icebergs that had broken free, tipped sideways and quickly frozen back in place in the paralyzing cold of deep space. All this suggested an ocean that was in constant motion.

The colors used in earlier versions of the reprocessed image were based on knowledge of the moon’s chemistry and a bit of conjecture about exactly what shades that chemistry would produce. But the new version is based on both improved knowledge and improved image processing. The ruddy colors in the fractures are the products of the minerals that bubble up through the cracks. Green, violet and near-infrared filters were used to establish the proper palette.
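The principle behind that kind of compositing is simple enough to sketch: each filter’s grayscale frame is assigned to a color channel and rescaled to a common range. Here is a minimal illustration in Python—a toy version with made-up arrays standing in for real Galileo frames, not NASA’s actual processing pipeline:

import numpy as np

def stretch(frame):
    """Linearly rescale a grayscale frame to the 0..1 range."""
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo) if hi > lo else np.zeros_like(frame)

def composite_rgb(near_ir, green, violet):
    """Assign filter frames to color channels—near-infrared to red,
    green to green, violet to blue—echoing the palette described above."""
    return np.dstack([stretch(near_ir), stretch(green), stretch(violet)])

# Tiny made-up frames standing in for real single-filter exposures.
near_ir = np.array([[0.9, 0.4], [0.3, 0.8]])
green = np.array([[0.5, 0.5], [0.2, 0.7]])
violet = np.array([[0.1, 0.6], [0.4, 0.3]])

print(composite_rgb(near_ir, green, violet).shape)  # (2, 2, 3): an RGB image

The real pipeline involves calibration, geometric correction and far subtler color balancing, but the channel assignment above is the heart of how a three-filter image becomes the picture you see.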

A better, more accurate picture of Europa does nothing to change the facts on the ground there—or, more tantalizingly, below the ground. The moon remains the most fascinating non-Earthly object in our solar system. The new image, however, does serve as one more come-hither gesture from a world that’s been beckoning us to return for a long time.

TIME weather

What Is Lake-Effect Snow? (Hint: It Involves a Lake)

Wintry Weather New York
Gary Wiepert—AP A band of storm clouds moves across Lake Erie and into Buffalo, N.Y., on Nov. 18, 2014

Why Arctic air, a prevailing wind and a body of water can cause a blizzard

You don’t need a meteorologist to tell you what lake-effect snow is: it’s snow that’s, um, caused by a lake, right? As it turns out, things are a teensy bit more complicated than that, and if you live in one of the states bordering the Great Lakes that are forever getting clobbered by the stuff — or even if you just marvel at the footage of the latest white-out to hit those luckless places — it can help to know what’s actually going on.

Lake-effect snow starts the way so much other winter misery does, with a blast of Arctic air descending on us from the north. Water temperature, even in the Great Lakes in winter, is generally higher than air temperature, since water retains heat longer than air does, and the long, slow warming from the summer months tends to linger. Sometimes the difference in temperature — what’s known as the lapse rate — between the onrushing Arctic air and both the water and the thin layer of local air just above it can be as much as 25°F (14°C). That gets things churning in a lot of ways.

For one thing, the air draws moisture from the warmer lake in the same way a hurricane will as it passes over the Gulf of Mexico, gathering in fuel in the form of heat and water. The Great Lakes water warms the Arctic air too, causing it to rise; the act of rising, in turn, causes the air temperature to drop right back down. But that cold air is now carrying more moisture, which condenses into clouds — and those clouds produce snow.
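Just how much more moisture that warmed air can carry is worth a back-of-the-envelope check: saturation vapor pressure climbs roughly exponentially with temperature, which the standard Magnus approximation captures well. The sketch below is purely illustrative—the temperatures are plausible stand-ins, not measurements from any particular storm:

import math

def saturation_vapor_pressure(temp_c):
    """Magnus approximation: saturation vapor pressure (hPa) over
    liquid water, for a temperature in degrees Celsius."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

arctic_air_c = -15.0  # hypothetical incoming Arctic air
lake_air_c = 4.0      # hypothetical air just above the lake surface

ratio = saturation_vapor_pressure(lake_air_c) / saturation_vapor_pressure(arctic_air_c)
print(f"Air at {lake_air_c:.0f}C can hold about {ratio:.1f}x the moisture "
      f"of air at {arctic_air_c:.0f}C")

Run the numbers and the warmer air holds roughly four times as much water—which is why even a modest warm-up over the lake translates into so much cloud-building fuel.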

Cold air does not hold as much moisture as warmer air does, which means that lake-effect storms should be heavy but relatively brief. But a lot of things can change that. Air encounters greater friction as it moves over land than it does over water, which causes it to slow down and pile up as the higher-speed air streaming across the lake rear-ends the air that has made landfall, in the same way cars on a highway can collide when the driver in front hits the brakes too hard. That intensifies any snowfall.

Elevation can make a difference too. Relatively flat ground adjacent to the lake will have a higher air temperature than hilly land; the colder the air is over those elevated regions, the greater the cloud formation and resulting precipitation.

What’s more, not all Great Lakes are created equal. The distance the Arctic air has to travel over water — what’s known as the fetch — changes depending on how the lakes are oriented. Since cold air moves roughly from the northwest toward the southeast, Lakes Michigan and Huron and part of Superior — which are generally oriented north to south — require less of a watery crossing. Lakes Erie and Ontario and the eastern half of Superior are oriented more east to west, giving cold air more of an opportunity to pick up moisture. The direction of the air also means that cities that lie to the east of a lake get hit harder (we’re looking at you, Buffalo). But even a slight shift in winds means everyone takes the blast (hello, Chicago).

None of this makes a whit of difference when your city gets clobbered by a sudden blizzard. But if you can’t be a true New Yorker or Los Angeleno without knowing just which subway lines or highways to curse, you can’t really be a Midwesterner without understanding why you’re going to spend the next three hours of your life trying to dig your car out of 18 inches of snow.

TIME psychology

Extraterrestrials on a Comet Are Faking Climate Change. Or Something

ESA Just to be clear: This is a comet, not a spacecraft

Jeffrey Kluger is Editor at Large for TIME.

Conspiracy theories never die, but that doesn't mean we can't get smarter about dealing with them

You’ve surely heard the exciting news that the European Space Agency successfully landed a small spacecraft on the surface of Comet 67P—or perhaps we should say “Comet 67P.” Because what you probably haven’t heard is that the ostensible comet is actually a spacecraft, that it has a transmitting tower and other artificial structures on its surface, and that the mission was actually launched to respond to a radio greeting from aliens that NASA received 20 years ago.

Really, you can read it here in UFO Sightings Daily, and even watch a video that seals the deal if you have any doubt.

None of this should come as a surprise to you if you’ve been following the news. Area 51, for example? Crawling with extraterrestrials. The Apollo moon landings? Faked—because it makes so much more sense that aliens would travel millions of light years to visit New Mexico than that humans could go a couple hundred thousand miles to visit the moon. As for climate change, vaccines and the JFK assassination? Hoax, autism and grassy knoll—in that order.

Conspiracy theories are nothing new. If the Protocols of the Elders of Zion and the myriad libels hurled at myriad out-groups over the long course of history indicate anything, it’s that nonsense knows no era. The 21st century alone has seen the rise—but, alas, not the final fall—of the birthers and the truthers and pop-up groups that seize on any emerging disease (Bird flu! SARS! Ebola!) as an agent of destruction being sneaked across the border from, of course, Mexico, because… um, immigration.

The problem with conspiracy theories is not just that they’re often racist, foster cynicism and erode the collective intellect of any culture. It’s also that they can have real-world consequences. If you believe the fiction about vaccines causing autism, you will be less inclined to vaccinate your kids—exposing them and the community at large to disease. If you believe climate change is a hoax, you just might become the new chairman of the Senate’s Environment and Public Works Committee, as James Inhofe, Republican of Oklahoma, soon will be, thanks to the GOP’s big wins on Nov. 4.

That’s the same James Inhofe who once said, “It’s also important to question whether global warming is even a problem for human existence… In fact, it appears that just the opposite is true: that increases in global temperatures may have a beneficial effect on how we live our lives.” It’s the same James Inhofe too who wrote the 2012 book, The Greatest Hoax: How the Global Warming Conspiracy Threatens Your Future. So, not good.

Clinical studies of conspiracy theory psychology have proliferated along with the theories themselves, and the top-line conclusions the investigators have reached make intuitive sense: People who feel powerless are more inclined to believe in malevolent institutions manipulating the truth than people who feel more of what psychologists call “agency,” or a sense of control over their own affairs.

That’s why the CIA, the media, the government and the vaguely defined “elite” are so often pointed to as the source of all problems. That’s why the lone gunman is a far less satisfying explanation for a killing than a vast web of plotters weaving a vast web of lies. (The powerlessness explanation admittedly does not account for an Inhofe—though in his case, Oklahoma’s huge fossil fuel industry may be all the explanation you need.)

Psychologist Viren Swami of the University of Westminster in London is increasingly seen as the leader of the conspiracy psychology field, and he’s been at it for a while. As long ago as 2009, he published a study looking at the belief system of the self-styled truthers—the people who claim that the 9/11 attacks were carried out by the U.S. government as a casus belli for global war.

He found that people who subscribed to that idea also tested high for political cynicism, defiance of authority and agreeableness (one of the Big Five personality traits, which also include extraversion, openness, conscientiousness and neuroticism). Agreeableness sounds, well, pleasantly agreeable, but it can also be just a short hop to gullible.

In 2012, Swami conducted another study among Malaysians who believe in a popular national conspiracy theory about Jewish plans for world domination. Swami found that Malaysian conspiracists were likelier to hold anti-Israeli attitudes—which is no surprise—and to have racist feelings toward the Chinese, which is a little less expected, except that if there were ever a large, growing power around which to build conspiracy theories, it’s China, especially in the corner of the world in which Malaysia finds itself.

The antisemitic Malaysians also tended to score higher on measures of right-wing authoritarianism and social dominance—which is a feature of almost all persecution of out-groups. More important—as other studies have shown—they were likelier to believe in conspiracy theories in general, meaning that the cause-effect sequence here may be a particular temperament looking for any appealing conspiracy, as opposed to a particular conspiracy appealing to any old temperament. People who bought the fiction of Jewish domination were also likely to buy climate change hoaxes.

Finally, as with so many things, the Internet has been both potentiator and vector for conspiracy fictions. Time was, you needed a misinformed town crier or a person-to-person whispering campaign to get a good rumor started. Now the fabrications spread instantly, and your search engine lets you set your filter for your conspiracy of choice.

None of this excuses willful numbskullery. And none of it excuses our indulgence in the sugar buzz of a sensational fib over the extra few minutes it would take to find out the truth. If you don’t have those minutes, that’s why they invented Snopes.com. And if you don’t have time even for that? Well, maybe that should tell you something.


TIME Science

Neil deGrasse Tyson on Interstellar, Abe Lincoln and Respecting Science

Jeffrey Kluger is Editor at Large for TIME.

The host of the new Cosmos has some smart things to say about Hollywood and hard science—in 140 characters

You’ve probably never seen a movie with Neil deGrasse Tyson. That’s too bad, because now you have reason to wish you had. That, at least, is how I felt when I read the stream of tweets Tyson sent out after he watched Interstellar, the sci-fi blockbuster that traffics in a lot of the same cosmological physics he tackles in his career as director of New York’s Hayden Planetarium and host of the 21st-century update of Carl Sagan’s Cosmos. The experience was a little like binge-watching the Syfy Channel with your smartest nerd friend. There were these, for example:

Then there was this, because your smartest nerd friend is no fool.

And then there was this, because that same nerd friend knows it’s high time science was made cool, fashionable and entirely gender-nonspecific.

Tyson stresses that he is an educator first, and a celebrity, public figure and commentator second, third and fourth. For that reason, he does not pretend to be a film critic.

“If you tell people you liked or didn’t like a movie, a fight immediately breaks out,” he told TIME in a phone conversation today. “I’m not a fighter. If a movie makes no pretense of being scientifically accurate, I like to point out the things they got right—like when Star Wars showed a planet with a double sunset. If a movie does make pretenses of accuracy, I feel it’s my responsibility to point out what they got wrong.”

Tyson didn’t find a lot to quarrel with in Interstellar scientifically—which has been the consensus among critics, whether they gave the movie a rave or a pan. But he was more struck by what the movie offers not just cosmologically, but culturally—beginning with those lead characters.

“You don’t have to look too far back in history for the Godzilla-type film in which the scientist is the one responsible for the problem, and he usually dies with his creation,” he says. “In this movie, the characters are all scientists, they all have fully fleshed-out personalities—as parents and children and spouses. It’s an important part of the story and it bodes well for this kind of movie in the future.”

The appeal of Interstellar also speaks to our primal—and improbable—fascination with cosmology. As I wrote in last week’s TIME cover story, the number of people on the planet who actually understand the physics that govern the universe is tiny; what’s more, that science has no direct impact on our lives day-to-day. And yet we can’t get enough of it. Compare that to equally complex fields like biology and medicine. Our health and very lives turn on those areas of study, but if you try to start a technical conversation about them, people go blank.

“I don’t think the fascination with cosmology necessarily begins with the science itself,” Tyson says. “I think it’s a primal fascination with exploration. Not every person has that urge—and that’s good. If we all did, humanity wouldn’t survive because a lot of explorers die. But I’m proud to be a member of a species in which there are enough explorers to lay out the terrain so that the people who follow don’t risk as much.”

There is, as well, something significant in the timing of Interstellar—which opened in the same week as the equally science-wise The Theory of Everything, a biopic of Stephen Hawking. Both movies are thriving, but both are also being released into a culture that is increasingly awash in science denialism. We may gobble up what Hollywood offers us, but when it comes time to fund NASA or the National Academy of Sciences (NAS), or time to simply accept the truth of climate change, we turn into primitives, killing the funding we need and ignoring the dangers we face.

As I’ve written before, politicians are increasingly playing the “I’m not a scientist” card, wearing their lack of scientific credentials as both populist badge and political shield, allowing them to deny what the real scientists are telling them without actually saying so. But Tyson believes that this ruse may have run its course.

“There’s a limit to how much you can continue to deny science, and its close cousin technology, without feeling an impact on your wallet,” he says. “It’s enlightened investments in—and innovations derived from—science and technology that fuel economies. To cherry-pick science for your cultural, political or religious mores is to deny the role it’s played in enabling the wealth the country enjoys.”

And while expedience and cynicism are bipartisan scourges, there’s no denying that one party is guiltier of science denialism than the other—and, as Tyson points out in another Tweet, that party just got a whole lot stronger. There’s a certain irony in that.

“Go back 150 years or so,” he says. “That’s when Abe Lincoln established the NAS. It was meant to serve as a scientific advisory body for the betterment of the country—and it was created by a Republican. I don’t think any leader could get into trouble for saying ‘I may not be a scientist, but I’ve got advisers who are—and I’m going to listen to them.'”

Lincoln may have known nothing about the science in Interstellar or the problem of global warming, but he surely would have known enough to listen and learn. Let’s hope his 21st century heirs—of both parties—can learn to do the same.

