
The Evolution of Modern Satanism in the United States

Cover credit: Jack and Betty Cheetham. The June 19, 1972, cover of TIME.

The unveiling of a statue in Detroit has garnered fresh headlines for Satanism

This weekend, hundreds of adherents and observers flocked to a Detroit warehouse to witness the unveiling of a statue erected on behalf of the Satanic Temple. As organizer Jex Blackmore told TIME, the Satanic Temple isn’t quite a religious organization, but rather a group of people who prioritize human logic. One of the meanings of the monument, Blackmore added, is to celebrate “a reconciliation of opposites”—particularly in relation to the public display of monuments of other faiths.

But, though the new statue has earned the Satanic Temple a fresh round of attention, Satanism has a long tradition.

In the early 1970s, interest in the occult in American culture was so high that TIME devoted a cover story to the topic, and a large portion of it was focused on Satanism. As the story pointed out, the idea of “the Devil” is an ancient one, predating the Old Testament’s coinage of Satan. The early days of Christianity saw the development of a theology about Satan, and an increase of his agency and power in religious stories. Narratives outside the biblical canon expanded that characterization; by the 13th century, Satan was seen to be mighty (and popular) enough to be worthy of condemnation.

“Some of the confessions [in the Inquisition age] must have been sheer defiance: faced with a ruling establishment that was sanctified by the church, a resentful peasantry followed the only image of rebellion they knew—Satan,” TIME posited. “The satanic messiah became especially appealing in times of despair, such as the era of the plague known as the Black Death. Real or imagined, the pact with the Devil may have been the last bad hope for safety in a world fallen out of joint.”

Perhaps for that reason, the Christian Church’s efforts to root out Satanism were not entirely successful. The French aristocracy under Louis XIV was titillated by tales of nude demonic ritual, and the prim and proper Victorian period saw a spike in interest too.

But the existence of Satanists as an organized, public group in the United States is a much newer phenomenon, one that can largely be traced to a single man: Anton Szandor La Vey, author of 1969’s The Satanic Bible. La Vey founded the Church of Satan in 1966 in San Francisco. As TIME explained in 1972, La Vey’s organization was not the scary Satanism of religious imagination:

La Vey’s church and its branches might well be called the “unitarian” wing of the occult. The members invest themselves with some of the most flamboyant trappings of occultism, but magic for them is mostly psychodrama—or plain old carnival hokum. They invoke Satan not as a supernatural being, but as a symbol of man’s self-gratifying ego, which is what they really worship. They look down on those who actually believe in the supernatural, evil or otherwise.

La Vey’s church is organized, incorporated and protected under the laws of California. La Vey, 42, stopped giving out membership figures when his followers, who are grouped in local “grottoes,” reached a total of 10,000. The most striking thing about the members of the Church of Satan (one of whom is shown on TIME’S cover) is that instead of being exotic, they are almost banal in their normality. Their most insidious contribution to evil is their resolute commitment to man’s animal nature, stripped of any spiritual dimension or thought of self-sacrifice. There is no reach, in Browning’s famous terms—only grasp. Under the guise of eschewing hypocrisy, they actively pursue the materialistic values of the affluent society—without any twinge of conscience to suggest there might be something more.

Though the 1960s and ’70s saw the introduction of several other concepts called Satanism—from actual religious belief, to a credo used to justify criminality—the Church of Satan did not fade away. In 1978, the U.S. Army even included the group in the manual of “Religious Requirements and Practices” delivered to its hundreds of chaplains. (TIME mentioned that the manual explained that Church of Satan devotees might need “candles, a bell, a chalice, elixir, a sword, a gong, parchment and ‘a model phallus,’” but that chaplains would not be expected to supply those materials.) Though La Vey died in 1997, the organization he founded continues without him.

The brand of Satanism on display in Detroit was of a different sort: political Satanism, a more recent innovation. Those activists are associated with the Satanic Temple, a New York-based group that has spent the last few years publicly offering alternatives to more mainstream displays of religiosity. The Satanic Temple sees Satan as a Paradise Lost-inflected metaphor who represents skepticism and the ability to challenge authority. A spokesperson for the Church of Satan told TIME in 2013, for a story highlighting the differences between the two groups, that the newer organization was focused on “politically oriented stunts” that had “cribbed” their philosophy from the more established group. Meanwhile, the Satanic Temple said that its aim was, in cases where religion had been inserted into the public sphere, “to ensure that its view of the world is included.” If the Detroit attendance figures are any indication, they’ve succeeded.

The continued existence of two organizations that claim Satanism for two different functions highlights a point made by John M. Kincaid, the Church of Satan’s minister of information in the mid-1970s: though it may take a variety of forms, interest in mystery and rebellion is timeless. “The need to believe,” he wrote to TIME in 1974, “is as dominant a factor in this so-called enlightened age of ours as it has ever been”—which means those who are skeptical are present and accounted for too.

Read the full story from 1972, here in the TIME Vault: A Substitute Religion


See How Northern Ireland’s Bloody Sunday Turned Violent

An exclusive clip from the upcoming episode of CNN's 'The Seventies' shows how the historic clash began

In late January of 1972, the Prime Minister of Northern Ireland took a risk: he banned all protest demonstrations. Parades had been the starting points of several clashes during the long conflict over Britain’s role in the region, and it seemed like ending them couldn’t hurt—at first. “There were some suggestions that the I.R.A., for its part, might try a new tactic by organizing illegal parades of Catholics to test the ban and the government’s will,” TIME reported. “The result might well mean more bloody clashes between the warring sects, the need for still more British troops to maintain order, and more trouble for a land that has trouble enough.”

As shown in this exclusive clip from the next episode of CNN’s The Seventies, which airs on Thursday at 9 p.m. E.T., the prediction that the parade ban would not put an end to violence quickly proved correct. In fact, the violence that followed shortly after the ban was one of the best-known incidents of the period: Bloody Sunday.

On Jan. 30, 1972, a Catholic protest over the imprisonment of I.R.A. suspects turned violent, as TIME reported the following week:

On that bright, wintry afternoon, a march in the Catholic ghetto of Londonderry called the Bogside suddenly turned into a brief but violent battle between the marchers and British troops. When the shooting stopped, 13 people lay dead in one of the bloodiest disasters since the “troubles” between Ulster’s Protestant majority and Catholic minority began almost four years ago. The incident seemed to end almost all hope of a peaceful settlement in Northern Ireland. Not since the executions that followed Dublin’s 1916 Easter Rising have Catholic Irishmen, North and South, been so inflamed against Britain and so determined to see Ireland united in one republic at last.

Read more from 1972, here in the TIME Vault: The Bitter Road from Bloody Sunday


Bugs Bunny at 75: Watch the First-Ever ‘What’s Up, Doc?’ Moment

Bugs Bunny first appeared on July 27, 1940

The usual gestation period for a rabbit is a month. But Bugs Bunny, the iconic cartoon character who turns 75 on Monday, took a lot longer to come to life.

Scroll down to read that story–but not before watching a clip from his first official appearance, in the 1940 Tex Avery cartoon A Wild Hare. The Oscar-nominated cartoon has all the classic Bugs favorites: outwitting Elmer Fudd, the signature ears and tail, the “What’s up, Doc?”


Here’s how the world’s favorite cartoon rabbit came to be. Animator Chuck Jones gave credit to Tex Avery for the character, but Warner Bros. had made several rabbit cartoons in the studio’s earlier years. There were cutesy rabbits and wacky rabbits, but those rabbits aren’t Bugs. (One distinction, Jones explained, was that Bugs’ craziness always serves a purpose–in contrast to the unhinged Daffy Duck.)

The bunny in A Wild Hare went uncredited, though that changed before the year was up. Bugs was an instant star. By 1954, TIME noted that he was more popular than Mickey Mouse. (Mel Blanc, who voiced the character, later claimed that the name was his idea, saying that the studio was going to call the character Happy Rabbit until Blanc suggested naming him after animator Ben “Bugs” Hardaway. Alternatively, the name is sometimes traced to a sketch that designer Charles Thorson did at Hardaway’s request, with the caption “Bugs’ bunny”—as in, it was the bunny that Bugs had asked him to draw.)

Though Virgil Ross was the animator on A Wild Hare, Chuck Jones became one of the more famous hands behind the Bugs Bunny magic. In 1979, when The Bugs Bunny/Road Runner Movie came out, TIME critic Richard Schickel noted that “it is possible that some day Animator Chuck Jones may come to be regarded as the American Buñuel,” because Jones, like the groundbreaking surrealist filmmaker, understood so well the psychological underpinnings of comedy.

As these images from the late artist’s archives show, Bugs Bunny may have taken a long time to be born—but he sure has aged well.

Courtesy Chuck Jones Center for Creativity. “Picture the Future,” a hand-painted cel art edition by Chuck Jones.

Read TIME’s take on Warner cartoons’ 50th birthday, in 1985, here in the TIME Vault: For Heaven’s Sake! Grown Men!


What Hiram Bingham Got Wrong About Machu Picchu

Apic / Getty Images. Yale graduate and American explorer Hiram Bingham (1875–1956), who discovered Machu Picchu in Peru on July 24, 1911.

The explorer had first reached the ancient Incan city on July 24, 1911

Until the archeologist Hiram Bingham came across it on this day, July 24, in 1911, most of the world thought the ancient Incan city of Machu Picchu was lost, as was the Incan capital of Vilcabamba. As TIME reported in 1948, when Bingham returned to Peru to celebrate the opening of a road to the site that would bear his name, his search had begun with the study of old charts and texts, until he was sure that there was an Incan capital city somewhere in the Andes that had never been found by the Spanish invaders. He got a key tip from a local muleteer and, upon climbing Machu Picchu peak, found the lost city hidden under vines.

Of course, the very fact that the muleteer had the tip to offer means that Machu Picchu was never completely lost in the first place. It was just ignored by all but the locals who lived their lives around the site. Shortly after Bingham’s death, when a plaque was dedicated to him at the site, the magazine had cause to revisit the tale:

Some experts believe that parts of the city, which Bingham named Machu Picchu (Old Peak), are 60 centuries old, which would make it 1,000 years older than ancient Babylon. More recently, if its ruins are interpreted correctly, it was at once an impregnable fortress and a majestic royal capital of an exiled civilization.

Built on a saddle between two peaks, Machu Picchu is surrounded by a granite wall, can be entered only by one main gate. Inside is a maze of a thousand ruined houses, temples, palaces, and staircases, all hewn from white granite and dominated by a great granite sundial. In Quechua, language of the sun-worshipping Incas and their present-day descendants, the dial was known as Intihuatana—hitching post of the sun.

By Bingham’s own reckoning, the city was actually a pre-Incan fortress that eventually became a Quechua city, where the first Incan king was born. When the Spanish arrived, Bingham said, the Incas who could fled to Machu Picchu, but the empire lasted only a few more decades before the last of their kings was killed in the 16th century.

Though Machu Picchu never lost its appeal to tourists, it did turn out that Bingham’s account of what had happened there wasn’t exactly true. Modern experts argue that Machu Picchu was a mere country retreat for aristocracy—and not a major center of Incan life at all.

Read more from 1948, here in the TIME Vault: Explorer’s Return


This Movie Is Changing the Way Pregnancy Is Shown on Film

The Film Arcade

Unexpected, out July 24, stars Cobie Smulders as a teacher coming to terms with a surprise pregnancy

It wasn’t until it came time to figure out the labor scene that Kris Swanberg realized her movie about pregnancy was unusual. After all, there are lots of movies out there about people having babies. But, when she sat down with her director of photography to look at some examples of how delivery had been handled in those forerunners, she noticed something strange: almost every on-screen birth that she watched was portrayed from the point of view of a man in the room.

“I didn’t set out and say, ‘I’m going to make a movie from the female perspective, dammit!’” Swanberg says. “But because I’m a woman and I wrote it and a lot of it was based on my own personal experience, it just sort of happened that way. Not until after the fact did I realize that it’s actually very rare.”

The film, Unexpected (in theaters, on demand and on iTunes July 24), directed by Swanberg and co-written with Megan Mercier, is the story of a high-school teacher (played by How I Met Your Mother’s Cobie Smulders) facing a surprise pregnancy at the same time as one of her most promising students (impressive newcomer Gail Bean). As they both face variations on the same question—how a baby will affect their plans for the future, whether it’s a dream job or a college education—they form a friendship; unusually, it’s that relationship, rather than their romantic ones, that’s at the movie’s center. Smulders’ character’s mother is played by Elizabeth McGovern, who starred in She’s Having a Baby, perhaps the ultimate example of a movie about pregnancy and birth seen from a man’s perspective.

The personal experience on which Unexpected is based is a combination of Swanberg’s time spent as a teacher in Chicago’s west side and her experiences juggling work and motherhood. Swanberg’s husband is the filmmaker Joe Swanberg, so they’ve been able to alternate work and primary-parenting when it comes to their son, but the question took on an extra layer of meaning around Unexpected: their second child is due right around the same time that the movie is.

“Everyone asks, without ill intentions, what I’m going to do [about working]. Everyone asks every pregnant woman that,” she tells TIME. “Everyone expects a man to go back to work.”

That conundrum had an extra layer of meaning for Smulders too, who was already pregnant when she was offered the part. (Her baby was born in January, and she jokes that the best part of the coincidence was that the Unexpected production was able to save money on belly prosthetics.) It’s important to have movies about motherhood that focus on the identity-crisis aspect of being pregnant, she notes, because those pop-culture depictions determine many people’s ideas of what that life change will be like in reality.

“It was so unknown to me [before having children]. You have a general idea of what it’s going to be and that is formed by film or commercials or people you know,” she says. “I think if I were really fully informed by the things I watched on TV I would think that kids were really quippy and had amazing comedic timing and learned life lessons so fast and so easily, just with a conversation at the end of the bed.”

So balancing out Full House and Party of Five—a major source of information about families for a younger Cobie Smulders—is a big deal. And there’s already evidence that the message is getting through.

“Showing the film, I’ve gotten a lot of people come up and say, ‘This is exactly what I went through,’” Swanberg says. “And men coming up and saying, ‘Now I really understand.’”

Read next: How an Unplanned Pregnancy Ended Up Being the Right Choice for Me

Download TIME’s mobile app for iOS to have your world explained wherever you go


This Was the First Time a Sitting U.S. President Visited Africa

PhotoQuest / Getty Images. Franklin D. Roosevelt (in suit, seated in jeep at left) reviews U.S. troops with military commander Lieutenant General George S. Patton (right), Casablanca, Morocco, Jan. 17, 1943.

President Obama visits Kenya and Ethiopia this month. The circumstances were quite different in 1943

President Obama travels to Kenya on Thursday to attend the Global Entrepreneurship Summit, and then continues his African trip with a visit to Ethiopia, the first time a sitting U.S. president will visit that country. He’ll be focused on global business and peaceful diplomacy—a far cry from what happened with the first sitting president to visit Africa.

When Franklin D. Roosevelt landed on the continent 72 years ago amid World War II, it was the first time since the Civil War that a sitting president had visited an active war zone, as well as the first time ever that one had traveled by plane. The occasion was Roosevelt’s January 1943 visit to Casablanca to discuss the conflict with Winston Churchill.

As TIME reported shortly after, the trip was a fruitful one. The air-travel part of the plan was kept secret—an important concern given that the president’s plane was flying over an ocean patrolled by Axis planes and ships—but, once he arrived safely and the meetings got underway, the world was looped in on what had happened:

U.S. news correspondents in North Africa were flown secretly to Casablanca for a press conference on the tenth day. They found well-pleased Franklin Roosevelt in the garden of the villa where he had stayed: he was comfortable in a light grey suit, the angle of his long cigaret holder was even jauntier than usual.

This was the first press conference any American President had ever held beneath a protective umbrella of fighter planes. In the desert heat, beneath the roaring planes, General de Gaulle and General Giraud shook hands while photographers’ flash bulbs popped. The President said this was a momentous moment.

The two war leaders lived up to the moment. They explained that they had reached “complete agreement” on 1943 war plans, that the goal was “unconditional surrender” of the Axis nations. The President remarked that their meeting had been unprecedented in history; the Prime Minister added that it surpassed anything in his World War I experience. The President had some good morale-building words for American troops abroad: “I have seen the bulk of several divisions. I have eaten lunch in the field, and it was a darn good lunch, too. . . . Our soldiers are eager to carry on the fight and I want you to tell the folks back home that I am proud of them. …”

Read the full story from 1943, here in the TIME Vault: Appointment in Africa


Read TIME’s Original Reviews of E.L. Doctorow’s Books

His characters "discover the submerged foundations of the American psyche"

The death on Tuesday of E.L. Doctorow ended a decades-long career built on emphasizing the “story” in history.

As TIME described it in a 1975 bio that accompanied the review of his masterwork Ragtime, the story of how he became a writer was one built on belief in himself: “Not long after he got out of the Army in 1954, E.L. (Edgar Lawrence) Doctorow sat down on a wooden crate in front of his typewriter and told his wife Helen, ‘This is the way we are going to survive.’ He had $135 to his name. Forty-eight hours later, he had $50 left and a lot of blank paper. For the next 20 years, Doctorow fought the blank page—and won four times.” During those decades he had several other jobs (airline clerk, editor, teacher) but from that point on he was what he had intended to be: a writer.

Here’s what TIME said about several of his best-known works:

The Book of Daniel (1971): “The Book of Daniel, transparently based on the Rosenberg case, is a bold novel that, all things considered, is surprisingly successful. Doctorow’s biggest gamble was sinking his energies into the Rosenberg case in the first place. Not that successful fiction cannot spring from old newspapers, as Dostoevsky and Dreiser both demonstrated. But the Rosenberg trial was a kind of drawn-out, draining and rather grisly national ordeal.”

Read the full review

Ragtime (1975): “In Doctorow’s hands, the nation’s secular fall from grace is no catalogue of sin, no mere tour de force; the novelist has managed to seize the strands of actuality and transform them into a fabulous tale.”

Read the full review

Loon Lake (1980): “The written surface of Loon Lake is ruffled and choppy. Swatches of poetry are jumbled together with passages of computerese and snippets of mysteriously disembodied conversation. Narration switches suddenly from first to third person, or vice versa, and it is not always clear just who is telling what. Chronology is so scrambled that the aftereffects of certain key events are described before the events occur. Such dislocations are undeniably frustrating at first, but they gradually acquire hypnotic force. Reading the book finally seems like overhearing bits of an oddly familiar tune.”

Read the full review

World’s Fair (1985): “Doctorow calls it a novel. But the book reads like a memoir, and is unmistakably based on the author’s early boyhood in the Bronx. The account begins with a bed wetting in the middle of the Depression and ends on the eve of World War II with a nine-year-old Edgar Altschuler burying a cardboard time capsule containing a Tom Mix decoder badge, his school report on the life of F.D.R., a harmonica and a pair of Tootsy Toy lead rocket ships, ‘to show I had foreseen the future.’”

Read the full review

Billy Bathgate (1989): “[Doctorow] is mixing elements from his other novels in a manner that proves combustible and incandescent. Part of the allure springs from the subject, which plays upon the mysterious fascination that outlaws and gangsters have always held for law-abiding American citizens.”

Read the full review

The Waterworks (1994): “Even longtime readers, though, are likely to find The Waterworks Doctorow’s strangest and most problematic invention so far. The setting is New York City in 1871, although the story of what happened there and then is told at an indeterminate later date by a man named McIlvaine, who notes, at one point in his narrative, ‘I have to warn you, in all fairness, I’m reporting what are now the visions of an old man.’ A number of similar caveats are interspersed throughout the story, and taken together they add another level of mystery to the point he makes over and over again: he has been a witness to horror and lived to tell the tale.”

Read the full review

City of God (2000): “The true miracle of City of God is the way its disparate parts fuse into a consistently enthralling and suspenseful whole. In such novels as Ragtime (1975) and Billy Bathgate (1989), Doctorow mixed historical and fictional figures in ways that magically challenged ordinary notions of what is real. His new novel repeats this process, with even more intriguing and unsettling consequences.”

Read the full review

The March (2005): “History. James Joyce called it a nightmare from which he was trying to awake. But for E.L. Doctorow it’s more of an ill-defined dream state that he doggedly revisits, working all the while to get the thing decoded. In his best books, like Ragtime and Billy Bathgate, Doctorow mixes historical figures with fictional characters to discover the submerged foundations of the American psyche. His spellbinding new novel, The March (Random House; 363 pages), is one to put beside those, a ferocious reimagining of the past that returns it to us as something powerful and strange.”

Read the full review


See What Happened When Feminists Squared Off With Hugh Hefner in 1970

Watch an exclusive clip from the upcoming episode of CNN's 'The Seventies'

When Susan Brownmiller and Sally Kempton appeared as representatives of the women’s liberation movement alongside Hugh Hefner on The Dick Cavett Show in 1970, Cavett joked, “We really set you up tonight, didn’t we?”

Though Hefner’s Playboy was thriving, Cavett’s line really applied more to him. As seen in this exclusive clip from the upcoming episode of CNN’s The Seventies, airing on Thursday at 9:00 p.m., Hefner seemed to have no idea what was coming.

From the minute he referred to the activists as “girls,” he was put in his place. The women took full advantage of their public forum to express thoughts and feelings that had been bottled up for so long, and the nation took notice. When TIME’s Person of the Year honor for 1975 was given to 12 separate Women of the Year, Brownmiller was one of them.

The magazine dubbed her the “second-sex scholar” and explained why she deserved the recognition:

Four years ago Susan Brownmiller, one of feminism’s most articulate and visible activists, disappeared into the library stacks. She surfaced last fall with Against Our Will: Men, Women and Rape, the most rigorous and provocative piece of scholarship that has yet emerged from the feminist movement. Brownmiller’s meticulously researched book—a kind of Whole Earth Catalog of man’s inhumanity to woman or, as Novelist Lois Gould called it, “everything one never wanted to know about sex”—may significantly change the terms of the dialogue between and about men and women. Many shrink from her conclusions: that marriage as an institution has its historical roots in the fear of rape; that the rapist is the ultimate guardian of male privilege; that rape is “the conscious process by which all men keep all women in a state of fear.” But she persuasively argues that all forms of oppression have their origin in the often brutal reality of unequal physical power and that this primal fact of life continues to define and distort relationships between the sexes.

Read TIME’s 1972 special report on the state of feminism: The American Woman

Read the Women of the Year, 1975, issue, here in the TIME Vault: A Dozen Who Made a Difference


See Photos From the First Special Olympics

The first Special Olympics kicked off on July 20, 1968

It was on this day, July 20, in 1968, that the first International Special Olympics Summer Games were held in Chicago. The event, conceived by Eunice Kennedy Shriver, had been years in the making. As TIME explained in Shriver’s 2009 obituary:

The middle child of nine, Shriver grew up in the shadow of Rosemary, the “mildly retarded” sister who loved to play but couldn’t keep up. When Rosemary was 23, she had a prefrontal lobotomy; from that point on, she spent most of her life in an institution. Shriver deplored the practice of keeping people with mental disabilities sedentary lest they injure themselves; of keeping their very existence a secret, as her family had hidden Rosemary.

In 1962, Shriver used money from their parents’ foundation to fund her vision for empowering the mentally disabled. What began as a summer camp in her Maryland backyard evolved into the Special Olympics, a competition that now attracts 1 million athletes from 160 countries. “She set out to change the world and to change us,” her family said in a statement when she died, “and she did that and more.”

The first Special Olympics hosted 1,000 athletes from the U.S. and Canada. In the years that followed, the event achieved greater recognition from the Olympic Committee and from athletes and advocates around the world, and it expanded to include winter events as well. By 2006, by the sponsoring organization’s official count, there were 2.5 million participating Special Olympics athletes worldwide.

Read a 1987 report from the Special Olympics, here in the TIME Vault: Heroism, Hugs and Laughter


Remembering the Apollo 11 Moon Landing With the Woman Who Made It Happen

NASA. Margaret Hamilton standing next to listings of the actual Apollo Guidance Computer (AGC) source code.

Margaret Hamilton's software got man to the moon—but she didn't stop there

It was on this day, July 20, in 1969, that the Apollo 11 astronauts reached the moon and Neil Armstrong took his famous small step. People celebrated the world over, though few were more relieved than Margaret Hamilton.

“I remember thinking, Oh my God, it worked,” the pioneering software engineer tells TIME. “I was so happy. But I was more happy about it working than about the fact that we landed.”

The “it” that worked was Apollo 11’s on-board flight software, which Hamilton, as part of the MIT team working with NASA, led the effort to build. There was no guarantee things would play out so smoothly. In fact, just before the lunar landing was supposed to happen, alarms went off indicating that there wasn’t enough room on the computer for the landing software to work effectively. It turned out that a radar was sending unnecessary data to the computer, overloading it with superfluous information.

The work that Hamilton had done helped enable the computer to figure out which of the multiple processes it had to do was most important. “It got rid of the lesser priority jobs and kept the higher priority jobs, which included the landing functions,” she explains.
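Hamilton describes the mechanism only in general terms; as a rough illustration of the idea (a minimal Python sketch with invented job names, not the actual Apollo Guidance Computer code), shedding lower-priority work under overload amounts to keeping only the most important jobs that fit:

```python
import heapq

def shed_overload(jobs, capacity):
    """Keep only the `capacity` most important jobs.

    jobs: list of (priority, name) pairs, where a LOWER number
    means MORE important (priority 1 beats priority 4).
    """
    # nsmallest returns the jobs with the smallest priority values,
    # i.e. the most important ones; everything else is dropped.
    return heapq.nsmallest(capacity, jobs)

# Invented example jobs -- not real AGC task names.
jobs = [(3, "radar telemetry"), (1, "landing guidance"),
        (4, "diagnostics"), (2, "display update")]
print(shed_overload(jobs, 2))  # [(1, 'landing guidance'), (2, 'display update')]
```

With room for only two jobs, the landing functions survive and the superfluous telemetry is discarded, which is the behavior the alarms triggered during the descent.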

That fix gave NASA the confidence to go ahead with the historic moon landing.

Hamilton was later given NASA’s Exceptional Space Act Award for her work on those Apollo systems. (She’s also credited with coining the term “software engineering.”) That she was successful in the pre-women’s lib era in a field that remains tough for women to crack has helped revive interest in her career: Hamilton achieved a sort of Internet fame recently when the photo above made the rounds.

Hamilton says that she was so wrapped up in her work that she didn’t notice the gender problems of the time until Mad Men came around and seemed a little too familiar. (Even if gender wasn’t uppermost in her mind, she did advance that cause too: Hamilton recalls that a woman on her team was told by the MIT credit union that she couldn’t get a loan without her husband’s signature, though male applicants didn’t need spousal approval. Hamilton complained about the policy and had it changed. “It was the culture, but I won, and I was so happy,” she recalls. “I didn’t do it because of male versus female; I was very conscious of what was fair and what wasn’t fair.”)

Part of what had made Hamilton’s work so effective was that she tested everything so rigorously, in a simulator that could demonstrate the “system of systems” at work, and the relationship between the software, the hardware and the astronaut. “We couldn’t run something up to the moon,” she says. But they could run lots of tests on the ground. Analyzing the errors that came up during testing, Hamilton’s team found that nearly three-quarters of them were interface errors, like conflicts in timing or priority. Since the computer code was on cards, a software engineer might write code that told the computer how many cards to advance; if someone later added a card in the middle while working on the code, that number would be wrong. Hamilton realized that those problems were avoidable.
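The card-count pitfall can be mimicked with a toy interpreter (a hypothetical sketch, not the Apollo software itself): a “jump ahead n cards” instruction silently breaks when a card is inserted between the jump and its target.

```python
def run(deck):
    """Toy card interpreter: a card is ("OP", name), executed in order,
    or ("ADVANCE", n), which skips the next n cards.
    Returns the names of the ops actually executed."""
    executed, i = [], 0
    while i < len(deck):
        kind, arg = deck[i]
        if kind == "ADVANCE":
            i += arg + 1  # step past this card plus `arg` more
        else:
            executed.append(arg)
            i += 1
    return executed

deck = [("OP", "read radar"),
        ("ADVANCE", 1),        # meant to skip the debug-only card
        ("OP", "debug only"),
        ("OP", "update display")]
print(run(deck))  # ['read radar', 'update display']

# Insert a new card between the jump and its target: the hard-coded
# count of 1 now skips the wrong card, and "debug only" runs anyway.
deck.insert(2, ("OP", "new step"))
print(run(deck))  # ['read radar', 'debug only', 'update display']
```

The hard-coded count encodes an assumption about the deck's layout, which is exactly the kind of interface error Hamilton's team found dominating their test results.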

“We’ve been working on this ever since Apollo—or starting with Apollo, I should say,” she says. “I’ve been on a mission in its own right, working with this Universal Systems Language, which allows you to get things up front. It’s kind of like a root canal: you waited till the end, [but] there are things you could have done beforehand. It’s like preventative healthcare, but it’s preventative software.”

She founded Hamilton Technologies Inc. in 1986, where she has continued her work with Universal Systems Language.

In 1969, TIME’s special report on the moon landing included the optimistic prediction that Mars would be up next, and soon (“as early as 1982”). Software reliability may not have much romantic appeal, but Hamilton believes it is key to future exploration of the universe, including Mars. Which makes sense, since reliable software saves money. Testing is expensive, so it’s cheaper for NASA (and private space programs) if problems can be caught in advance, thus requiring fewer tests. And cutting down on costly tests means smaller budgets that are more likely to get approved.

Which is something Hamilton is keen to see happen. “I hope,” she says, “that we continue with exploration.”

Read TIME’s special issue from 1969 about the Apollo 11 moon landing: Man on the Moon

Read next: The Smithsonian Needs Your Help to Display Neil Armstrong’s Spacesuit

