TIME Innovation

Terrorism Isn’t Madness

Liberty Leading the People, by Eugène Delacroix
Getty Images

Conflating terrorism and madness is a very old mistake, with a special history in France

Each time a terrorist act occurs in the world, the specter of madness looms on the horizon.

On Oct. 22, 2014, Michael Zehaf-Bibeau fatally wounded a soldier on Parliament Hill in Ottawa before being shot by the police. A Muslim convert and a drug addict, he didn’t have any psychiatric record, but his mother confirmed he was mentally deranged. Two days later, Zale Thompson, a Muslim convert described as a “recluse” with mental problems, attacked four policemen in New York City with a hatchet, a “terrorist act” according to the NYPD commissioner. On Dec. 15, 2014, Man Haron Monis, a self-proclaimed Iranian sheikh who was suspected of murdering his wife and had been charged with 40 sexual offenses dating back a decade, took hostages in a café in Sydney for 16 hours before being shot dead by the police – two hostages died in the raid. Australian Prime Minister Tony Abbott said the gunman had “a long history of violent crime, infatuation with extremism and mental instability.”

This may sound like a modern epidemic, but, as I know from my experience studying French history, connecting terror and madness is a very old story.

In 19th-century France, psychiatrists and politicians were particularly quick to accept the analogy between revolutionary terror and madness, leading the psychoanalyst Sigmund Freud to say later that the French were a “people of psychical epidemics, of historical mass convulsion.” Psychiatrists coined new diseases such as “political monomania,” “revolutionary neurosis,” “paranoia reformatoria,” and even “morbus democraticus” (democratic disease). Theorists and writers concurred. Addressing readers potentially nostalgic for the revolutionary spirit, the diplomat and historian Chateaubriand wrote that the Reign of Terror (1793 to 1794), a policy of political repression, “was not the invention of a few giants; it was quite simply a mental illness, a plague.”

But what does it mean to systematically combine political violence and madness? Not much, since it takes two complex terms and, by joining them, offers a deceptively simple explanation.

Scientists can fall into the same tempting trap. Théroigne de Méricourt, a feminist who supposedly led a group of armed Amazons during the Revolution, ended her life in a lunatic asylum, where she was diagnosed with dementia due to her political convictions. This clinical demonstration was full of factual errors and approximations, and based on plagiarism of a sort, as a sick condition was portrayed as the result of a sick ideology. Of course, Théroigne may have been insane. But was her madness necessarily related to her beliefs, or did the doctor’s opposing (royalist) political beliefs orient the diagnosis?

Besides politics, religion (and the acceptable “limits” of its practice) often interferes with diagnosis. On February 14, 1810, Jacob Dupont, a famous thinker who had advocated atheism, was institutionalized at Charenton, a lunatic asylum founded in the 17th century. Dupont’s medical file reads:

“Former Doctrinaire [i.e., former member of the Confraternity of Christian Doctrine], former representative in the Legislative Assembly and the Convention; withdrew to a small village near Loches, where he lived for eight years with a sister who died six months ago. Metaphysical and revolutionary reveries, notorious advocacy of atheism in the Convention; publicly gave a course on that subject on Place Louis XVI seven years ago. Many writings full of the same madness. No violence, no delusions on other subjects.”

Here it is spelled out: atheism is madness. The assertion itself is not surprising in a society that shared Louis Sébastien Mercier’s opinion that atheism was “the sum total of all the monstrosities of the human mind” and “a destructive mania … that is very close to dementia.” This time, however, the judgment served as a diagnosis penned by a physician who, even though he was using the term “madness” in a colloquial sense, admitted that Dupont had “no delusions on other subjects.”

This point is crucial, because it proves, in black and white, that religious beliefs constituted a sufficient basis for confinement. If the doctor, Antoine-Athanase Royer-Collard, had known that Dupont had been forced to resign his seat in the Convention in 1794 due to his mental state, and was arrested the following year for raping a blind old woman, he would have felt even more justified in his diagnosis. Though Royer-Collard had looked only at Dupont’s openly declared atheism to make his decision, the background information would have underscored how it was only part of a larger pathology.

What do we learn from history? That a plausible conflation of terms, if not carefully scrutinized and documented, often turns out to be a very harmful confusion.

If we go back to our contemporary examples, it appears that the three men (at least according to what newspapers tell us) share some common traits: Islam, violence and hypothetical madness. In other words: religion, political extremism, and medical condition. The three men are considered lone-wolf jihadists, who live “on the fringe of the fringe,” as the Sydney hostage-taker’s attorney characterized his client.

Isolated, frustrated, unable to join any terrorist organization, these so-called jihadists are first and foremost social misfits, galvanized by causes that get daily media attention. No anti-terrorist law could ever apply to them, unless you could put the entire population of the world under continuous surveillance. Recent studies from Indiana State University and University College London have found that 32 to 40 percent of lone-wolf attackers suffered from mental problems, whereas “group-based terrorists are psychologically quite normal.”

What can we take away from this? We must be more careful about differentiating solo attackers from organized political forces – just as we must be more careful about using the word “madness.” In other words, let’s restore the full meaning of complicated concepts. And let’s remind ourselves that terrorism is a real political threat, that religion is not fanaticism, and that madness is a very serious social issue that deserves more attention in countries that have failed to create effective mental health policies.

Laure Murat, a historian, is a professor in the Department of French and Francophone Studies at UCLA. Her latest book is The Man Who Thought He Was Napoleon: Toward a Political History of Madness (Chicago: University of Chicago Press, 2014). She wrote this for Thinking L.A., a project of UCLA and Zocalo Public Square. Zocalo Public Square is a not-for-profit ideas exchange that blends live events and humanities journalism.

TIME society

Eva Kor: What It Was Like to Be Experimented on During the Holocaust

Getty Images

At Auschwitz, Nazi doctors and researchers faced no limits in experimenting on human bodies

Answer by Eva Kor, Holocaust survivor and forgiveness advocate, on Quora.

My twin sister Miriam and I were used in Josef Mengele’s experiments at Auschwitz as ten-year-old girls. We were taken six days a week for the experiments. On Mondays, Wednesdays, and Fridays, we would be taken to the observation lab where we would sit for hours—naked—up to eight hours. They kept measuring most of my body parts, comparing them to my twin sister, then comparing them to charts. They were trying to design a new Aryan race, so they were interested in all these measurements. These experiments were not dangerous, but they were unbelievably demeaning and even in Auschwitz I had difficulty coping with the fact that I was a nobody and a nothing – just a mass of cells to be studied. On alternate days we would be taken to another lab that I call the “blood lab.” This is where they would take a lot of blood from my left arm and give me several injections in my right arm. Those were the deadly ones. We didn’t know the contents then and we don’t know today. After one of those injections, I became very ill with a very high fever. I also had tremendous swelling in my arms and legs as well as red spots throughout my body. Maybe it was spotted fever, I don’t know. Nobody ever diagnosed it.

As guinea pigs in Auschwitz, we had to realize that they could do to our bodies whatever they wanted. We had no control over what they put into us, what they removed, or how they treated us, and there was no place for us to go.

People often ask me, “Why didn’t you run away?” I am convinced those people know very little about Auschwitz. The barbed wire would electrocute you if you touched it. The whole camp was surrounded by it. Before you got to the high-voltage fence, there was a ditch filled with water. So as you approached that fence, your hands were damp and you would be immediately electrocuted. At age ten, even if I had succeeded in getting out, where would I go?

Maybe I could have succeeded in running away when we were marched from Birkenau to Auschwitz I for some of the experiments. But as far as I could see when we were marching, it was all a military zone. Where on earth would I have gone if I escaped? I didn’t know how far I would even need to run. And of course, most of the time when someone escaped and they turned on the sirens, we would have to stand for roll call for two to four hours until the escapee was found, dead or alive. If the escapee was found alive, they would be hanged in front of us. If found dead, the body would be brought in front of the group so we would know: nobody escapes from Auschwitz. The lessons were very clear.

At age ten, I would not have dared to escape and I did not even think about it. That was so far from my mind. What I was thinking about every day was how to live one more day, how to survive one more experiment. I knew as the air raids were increasing, that this could not last for much longer. On the days when they would keep us for hours at roll call until the escapees could be found, I would often think, “Good luck – I hope you make it.” I never thought anyone did. I was lecturing in San Francisco about 15 years ago. They had about ten survivors who were introduced. One of them said, “I escaped from Auschwitz.” I was so excited! I went up to him and said, “Finally I know why I stood at roll call for so many hours – I am glad to know somebody made it.”

As twins, I knew that my sister and I were unique because we were never permitted to interact with anybody in other parts of the camp. But I didn’t know I was being used in genetic experiments.

I began lecturing about my own experiences in 1978. As I was telling my story, people would come up to me later on and ask about the experiments. Well, I remembered some details of my own experience, but I knew nothing about the bigger scope of the experiments. So I decided to read books about Josef Mengele, hoping to get more of an insight. But all these books had only one or two sentences about the experiments.

I was trying to figure out how I could get more information, and I was looking at the famous photo that was taken by the Soviets at liberation. In it I could see maybe 100 liberated children marching between those barbed wire fences.

Here is a picture of me and Miriam, holding hands in the front row. I thought if I could somehow locate those other twins, we could have a meeting and share those memories.

It took me six years, but in 1984, with the help of my late twin sister Miriam, we found 122 “Mengele Twins” living in ten countries on four continents.

We had a meeting in Jerusalem in February of 1985.

We talked to many of them. What I found out was that there were many, many other experiments. For instance, the twins who were older than 16 or were of reproductive age would be put in a lab and used in cross-gender blood transfusions. So blood was going from the male to the female and vice versa. Sadly, they did not check—of course—to see if the blood was compatible, and most of these twins died. There are twins in Australia who survived—Stephanie and Annette Heller—and there is a twin in Israel who was a fraternal twin—Judit Malick—and her twin brother’s name was Sullivan. I heard Judit testify in Jerusalem that she was used in this experiment with a male twin of reproductive age. She remembered being on a table during the experiment when the other twin’s body was turning cold. He died. She survived but had a lot of health problems.

The question is how many of these twins survived. Most of them obviously died. I also know for a fact that Mengele did strange experiments on kidneys. Mengele himself suffered from renal problems in 1927, when he was 16. He was out of school three or four months, according to his SS file. He was deeply interested in the way the kidneys worked. I know of three cases where twins developed severe kidney infections that did not respond to antibiotics.

One of them is Frank Klein, who lived in El Paso, Texas, after the war. He very much wanted to attend the gathering in Jerusalem, but he was on dialysis. He actually came with his nurse and very much hoped he would get a kidney so he could live like a normal person. He did get a transplant in 1986. I talked to him after the surgery and he said he was doing pretty good, but then three days later he died. The other twin, whose name I don’t remember off the top of my head, also died of kidney failure.

Then, of course, my twin sister developed kidney problems with her first pregnancy in 1960. The problems did not respond to antibiotics. In 1963, when she was expecting her second baby, the infection got worse. This is when the doctors studied her and found out her kidneys had never grown larger than the size of a ten-year-old’s kidneys. When I refused to die in the experiment where Mengele thought I would die (read about it here: What gives you hope during tough times?), Miriam was taken back to the lab and was injected with something that stunted the growth of her kidneys. After her third baby was born, her kidneys failed. In 1987, I donated my left kidney to her. We were a perfect match. At that hospital in Tel Aviv, they had been doing kidney transplants for ten years, and none of their patients developed cancerous polyps except for my twin sister Miriam, who developed them in her bladder. All the doctors kept saying was that there had to be something in Miriam’s body, something that was injected into her, that combined with the anti-rejection medication to create the cancerous polyps.

Other experiments I have heard of from survivors: many twins who did not have blue eyes were injected with something into their eyes. Luckily Miriam and I had blue eyes. Mengele did some other strange experiments. Most of them were very much in the line of trying to understand how to produce blue-eyed blondes in large numbers, the germ warfare experiments, etc. If one twin died, Mengele would have the other killed and then do the comparative autopsies. According to the Auschwitz Museum, Mengele had 1,500 sets of twins in Auschwitz. There were an estimated 200 individual survivors. Everybody who has been researching that, including the Auschwitz Museum, says most died in the experiments, and I agree. Dying in Mengele’s lab was very easy. I am one of the few I have heard about to be in the “barrack of the living dead” and get out of there alive.

I learned a great deal after the war by attending conferences, including one at the Kaiser Wilhelm Institute. This is where Mengele studied, and today it is called the Max Planck Society. They were trying to collect information about Mengele’s experiments. They invited several twins and a few other people used in experiments by Mengele. Here is a photo of me studying some of the vials used in experiments at Auschwitz.

Auschwitz was the laboratory for any experiments any Nazi scientists wanted to do. There was no limit on what doctors and researchers could do at these camps. So it was open season on twins and other human guinea pigs like us.

This question originally appeared on Quora: What was it like to be part of the genetic experiments on twins during the Holocaust?

TIME Civil Rights

What the International Response to the Civil Rights Movement Tells Us About Ferguson

Little Rock, Ark., 1957: National Guardsmen, having admitted white children to a school, bar the way to a black student Paul Popper—Popperfoto / Getty Images

International criticism during the Civil Rights Movement helped bring about new legislation

Images of armed soldiers blocking nine African-American high school students from integrating a public high school in Little Rock, Ark., shocked the world nearly 60 years ago. Organs of Soviet propaganda, determined to disrupt perceptions of a tranquil American democracy, wrote in the newspaper Izvestia of American police “who abuse human dignity and stoop to the level of animals.” In the midst of stiff Cold War competition for hearts and minds around the world, the prospect of controlling international perceptions motivated officials at the highest levels of the U.S. government to support new civil rights measures.

The U.S. representative to the United Nations warned President Dwight Eisenhower that the incident had damaged American influence, and the President listened.

“Before Eisenhower sent in the troops, there were mobs around the school for weeks, keeping these high school students from going to school,” says Mary Dudziak, a professor at Emory whose book Cold War Civil Rights tells the Little Rock story above and argues that international pressures encouraged the federal government to work to improve civil rights. “The issue caused people from other countries to wonder whether the U.S. had a commitment to human rights.”

Today, the highly-publicized killings of unarmed black men like Michael Brown and Eric Garner have attracted similar international condemnation, and some historians wonder whether concerns about U.S. appearances around the world could once again influence the federal government.

During the Cold War, the Soviet Union, the sworn enemy of the U.S., had a lot to gain by showing that American democracy wasn’t all it was cracked up to be. Today’s adversaries are less influential, but they appear equally eager to highlight American dysfunction. Iran’s Supreme Leader Ali Khamenei used the attention surrounding Michael Brown to remind his Twitter followers of America’s history on race issues. One tweet from the leader featured images of police dogs in Birmingham during the Civil Rights Movement alongside an image of Michael Brown.

North Korea may have one of the world’s worst human rights records, but that didn’t stop the country from criticizing the U.S. for “wantonly violating the human rights where people are subject to discrimination and humiliation due to their races.”

It’s not surprising that North Korea and Iran would criticize the U.S., but the reprimands haven’t been limited to adversaries. The U.N. High Commissioner for Human Rights said he is unsure whether the decision not to indict the police officer who shot Michael Brown “conforms with international human rights law.”

“It is clear that, at least among some sectors of the population, there is a deep and festering lack of confidence in the fairness of the justice and law enforcement systems,” Zeid Ra’ad Al Hussein, U.N. high commissioner for human rights, said in a statement.

Criticism from the U.N. is significant, but the international body’s desires, much less those of North Korea or Iran, have never driven U.S. policy — and the fact is, while there are many links between the Cold War era and today, times have changed. Thus far, the President has walked a fine line in his response. He has proposed some measures, including encouraging police to wear body cameras, but it seems unlikely that he’ll propose any game-changing legislation like the Civil Rights Act of 1964.

It’s not surprising that Obama has hesitated to involve the federal government in what is typically a local or state issue. The international mandate simply isn’t as strong as it was during the Cold War. There’s no equivalent to the Soviet Union offering a credible alternative to America’s system of governance.

But that may not be the case forever: historians and political scientists say that a growing movement against police brutality has the potential to increase international pressure and, perhaps, force change.

“I’m sure the Obama administration and the State Department are concerned about [international perceptions],” says Rick Valelly, a political science professor at Swarthmore. “Right now it’s embarrassing, but I don’t think it’s internationally consequential.”

Movements to end police brutality don’t yet have the “same kind of legs” that the Civil Rights Movement had, Valelly says. This year’s demonstrations have been attention-grabbing — and the “Justice for All” March planned for Washington, D.C., this Saturday is certain to make headlines — but it may take many more years of sustained protest before the movement is noticed internationally on a much larger scale, as the Civil Rights Movement of the 1960s was. For now, experts in the field remain optimistic about the benefits of international attention, relatively minor though it may be.

“Public diplomacy begins with listening,” says Nick Cull, a professor at the University of Southern California, “and this would be a really good time to listen.”

TIME Family

How to Acknowledge Native Americans This Thanksgiving

Burke/Triolo Productions—Getty Images

Thanksgiving, you might vaguely remember from elementary school, celebrates a feast shared by the Wampanoag tribe and European settlers the tribe had saved from starvation. It turned out, of course, that the presence of Europeans was tragic for the Native Americans who had welcomed them.
With such a troubled history, how can we talk with our kids about Thanksgiving in a way that recognizes both sides of the tradition? Here are some tips from Dr. Randy Woodley, a Keetoowah Cherokee descendant, Director of Intercultural and Indigenous Studies at George Fox University, and author of Shalom and the Community of Creation: An Indigenous Vision.
For young kids, “It’s important to understand that the ‘first Thanksgiving’ was not really the first,” Woodley says. “Native Americans were celebrating Thanksgiving feasts for thousands of years prior to the European arrival. And those celebrations took place many times throughout the year.”
Middle-school-aged kids can understand their role in the occasion a bit more clearly. “Native Americans were the hosts of Thanksgiving,” says Woodley. “It’s part of our values, to welcome people.” Thanksgiving is still a celebration of hospitality. But Woodley also believes it’s a good time to think about what kind of guests we want to be, whether at a feast or as visitors to a new country.
By high school, the lens can be widened. “Feasts, and the hospitality of the Native Americans, can serve as a lesson for inter-cultural hospitality in America,” says Woodley. To him, it’s a natural time “to encourage reconciliation between your family and those who share a different history.” What does it mean to be a host, to extend yourself? This also might be the time to talk about how many Native Americans do not celebrate the holiday because of the painful history that followed. “Eventually the story did not end well for the Native Americans,” Woodley says. “We are still waiting for justice and reconciliation to take place. Perhaps over another feast in the future.”

TIME Culture

Celebrating Thanksgiving With America’s First Rock Star

A sign at Plymouth Rock in Massachusetts
Getty Images

Plymouth Rock has been the subject of history lessons, songs, and speeches for 400 years. Why do we love it?

“Plymouth Rock is a glacial erratic at rest in exotic terrane.” So begins John McPhee’s classic 1990 New Yorker article, the best short piece ever written about the great American relic, pointing out how geological forces carried this rock far from its original home — Africa. It is an iconic mass of granite geologically formed by fire, but it certainly also qualifies as a sedimentary and metamorphic chunk of American political culture. Plymouth Rock has long been a symbol of America’s beginnings, the country’s bedrock, its very foundation. And in the Rock’s surprising travels, during its original journey to Plymouth Harbor and its subsequent wanderings and memorialization, it has embodied authentic Americanism, on the move.

As a historian of early America, I’ve long been fascinated by how the people, places, and things of the colonial era have been remembered in American popular culture — that is, in the sort of history that “we carry around in our heads,” not the history that history professors profess. And the Pilgrims and Plymouth Rock do feature prominently in our collective heads, particularly as November arrives each year and we turn our attention to Thanksgiving. Why make such a big deal of a small band of Puritan separatists who were not the first European colonists of America, and not even the most prominent Puritan colonists of Massachusetts? And did they actually ever step on Plymouth Rock, or treat it as anything other than an erratic? All this seems not to matter; I long ago resolved that history is and should be much more than mere debunking.

In the early 1990s, while on a ferry in New York Harbor, I overheard a conversation that demonstrated the ongoing significance of those Plymouth “forefathers.” Anticipating their visit to the historic Ellis Island and eyeing the Statue of Liberty in the distance, one passenger said to another, “These immigrants were just doing what the very first Americans, the Pilgrims, were doing.” Her companion nodded toward the island and agreed: “That was their Plymouth Rock.” Fractured history, yet rock-solid nonetheless.

McPhee reconstructed the Rock’s migration, via the mechanisms of plate tectonics, as part of a large slice of the earth’s crust called Atlantica, which joined North America some 580 million years ago from a distant locale — Africa, most likely. Then, approximately 20,000 years ago, the boulder was scooped up by moving ice and ultimately deposited at Plymouth Harbor when glaciers retreated at the end of the Ice Age. After the Pilgrims showed up in 1620, the Rock’s migrations only accelerated, as people circulated its shards as relics, just as believers disseminated the bones of saints and pieces of the true cross in medieval Europe to help sanctify a holy narrative. In this case, they were shoring up a New World narrative about the glorious rise of the American republic.

Until late in the 18th century, the Puritan Separatists who founded Plymouth colony in 1620 were but one marginal group of predecessors, not yet Capital-P “Pilgrims,” not yet the nation’s forefathers. Plymouth Rock belatedly received its first public recognition as the Pilgrims’ alleged landing place when church elder Thomas Faunce assembled his children and grandchildren on the spot and recited the tale in 1742, a legend later fossilized in print by Dr. James Thacher, who provided no corroborating evidence. The Separatists’ narrative and their Rock acquired symbolic power during the American Revolution, as the new United States sought independence.

By the late 1700s, Thanksgiving had become a well-established regional festival in New England, tracing its roots to a feast in the autumn of 1621. Plymouth Rock, however, was at first more directly tied to a different occasion: Forefathers’ Day, or Landing Day, on December 22, commemorating the debarkation of the Mayflower passengers in 1620. Popular among fraternal ancestor organizations, Landing Day was a civic event that expressed exclusivity, expansionism, and stubborn mission, while Thanksgiving was a domestic and community fete celebrating bounty, charity, and inclusion. And while both holidays spread beyond New England, carried by migrating Yankees who shared the Rock as a touchstone, only Thanksgiving’s spirit captured the hearts and festive calendars of Americans.

Toasting the Pilgrims at a New England Society fete near the end of the 19th century, Frederic Taylor of New York proclaimed:

It is our habit to think of Plymouth Rock always as being at Plymouth, and nowhere else. Well it was there once [but] when the Pilgrims’ feet pressed that boulder at Plymouth it became instinct with life and began to broaden at its base; and its base has ever since been spreading out, till now Plymouth Rock underlies the continent.

Using the Rock as potent metaphor, Taylor imagined it as a stepping-stone and foundation for Union generals Grant and Sherman, as ballast for the Monitor as it defeated the Merrimac, and as Abraham Lincoln’s hammer of freedom.

Given the seismic tumult of American history, it’s perhaps no surprise that the gravitas surrounding the Pilgrims would be shaken. The 19th-century New York lawyer and politico Chauncey Depew, a frequent after-dinner speaker at New England Society banquets, once tweaked his patrician audience with this cheeky comment: “What a pity instead of the Pilgrim Fathers landing on Plymouth Rock, Plymouth Rock had not landed on the Pilgrim Fathers,” which set off a chain of similar jokes over the years, including in the opening lines of the Cole Porter standard “Anything Goes.”

Few relished deflating the puffery of Pilgrim celebrations as much as Mark Twain. He once facetiously urged attendees at a New England Society dinner to sell their chief asset: “Opulent New England [is] overflowing with rocks,” and “this one isn’t worth, at the outside, more than 35 cents.” As Twain knew, this was no ordinary rock, not even merely a brand name, but a priceless relic, providing a more visceral link to the past, what one scholar has called a “zero-hand” account. Relics — including Plymouth Rock — are “things that speak.”

But the Rock could sometimes speak in controversial and unpredictable ways. At the bicentenary celebration at Plymouth in 1820, for example, the orator Daniel Webster linked it (somewhat creatively) not only to religious freedom but also to the anti-slavery cause. (Many Southerners initially resisted the celebration of Thanksgiving itself because it seemed to carry the taint of abolitionism.) One fragment later helped raise money for the Union effort in the Civil War at a Boston Sanitary Fair in 1863.

Disturbingly for some descendants of “First Comers,” the Rock could sometimes be claimed by other latter-day “pilgrims” — from Ireland or Italy, Poland or Puerto Rico. Its message has been pluralism as well as nationalism and chauvinism. And if the Rock powerfully symbolizes the United States as a nation of immigrants, ironically this granite boulder from Africa might better represent those immigrants who came against their will, in chains, in the largest forced migration in history. For non-immigrants — Native Americans — it sometimes served as a potent prop in their struggle for recognition and rights.

Sometimes relics fall mute. In the 1920s, the Plymouth Antiquarian Society (and later the Smithsonian’s National Museum of American History) acquired a large piece that had been cut from the doorstep of a house in Plymouth. A fragment of another rock at the Smithsonian — a craggy nugget chipped from the Mother Rock by Lewis Bradford in 1830, inscribed with the precise details of its collection and transformation into a circulating talisman — sat silently for years among a collection that had been donated in 1911. It only regained a “speaking part” in the museum’s displays and publications in the late 20th century.

Plymouth Rock is an emblem defined by its solidity. And yet it’s been anything but solid, in form or meaning. It’s been split, carted about, fragmented, broken and rejoined, reinstalled at the Plymouth waterfront, canopied, repaired. It has endured the changing forces of nature and American political culture. Its precious pieces have spread far and wide, across the Atlantic and to the shores of the Pacific, and they’ve been further disseminated through print and pixels, extending the Mother Rock’s power and reach. It’s clear that Plymouth Rock and its mobile relics still speak to us, in a voice that is both constant and changing.

Matthew Dennis is professor of history and environmental studies at the University of Oregon. He is the author or editor of five books, including Red, White, and Blue Letter Days: An American Calendar, and is currently at work on American Relics and the Politics of Public Memory. He wrote this article for Zocalo Public Square.

TIME Culture

The Long American Tradition of Not Feeling Particularly Thankful for Thanksgiving

A clay model of a Pilgrim figure with a turkey and an axe
Getty Images

If you’re someone who feels a sense of angst, foreboding, or misery about this time of year, take heart: American history is on your side

Do you have complicated feelings about Thanksgiving? Maybe your ancestors were among this continent’s indigenous peoples, and you have good reason to be rankled by thoughts of newly arrived English colonists feasting on Wampanoag-procured venison, roasted wild turkey, and stores of indigenous corn. Or maybe Thanksgiving marks the beginning of a holiday season that brings with it the intricate emotional challenges of memory, home, and family.

The truth of our history is that only a small minority of the early English emigrants to this country would have been celebrating as the New England Puritans did at the first Thanksgiving feast in 1621.

A thousand miles south, in Virginia and the Carolinas, the mood and the menu would have been drastically different — had there ever been a Thanksgiving there. Richard Frethorne, an indentured servant in the Virginia colony during the 1620s, wrote in a letter: “Since I came out of the ship, I never ate anything but peas, and loblollie (that is, water gruel).”

And don’t imagine for a second that those peas Frethorne was gobbling down were of the lovely, tender green garden variety dotted with butter. No, in the 1620s, Frethorne and his friends would have subsisted on a grey field pea resembling a lentil.

“As for deer or venison,” Frethorne wrote, “I never saw any since I came into this land. There is indeed some fowl, but we are not allowed to go and get it, but must work hard both early and late for a mess of water gruel and a mouthful of bread and beef.”

Frethorne’s letter is a rare surviving document reflecting the circumstances of the majority of English colonists who came to North America in the 17th century. The New England Puritans, after all, comprised only 15 to 20 percent of early English colonial migration.

Not only did the majority of English colonial migrants eat worse than the Puritans, but also their prayers (had they said any) would have sounded decidedly less thankful.

“People cry out day and night,” Frethorne wrote, “Oh! That they were in England without their limbs — and would not care to lose any limb to be in England again, yea though they beg from door to door.”

English migrants in Virginia had good reason not to feel grateful. Most came unfree, pushed out of England by big economic forces that privatized shared pastures and farmlands and pushed up the prices of basic necessities. By the 17th century, more than half of the English peasantry was landless. The price of food shot up 600 percent, and the price of firewood by 1,500 percent.

Many peasants who were pushed off their homelands built makeshift settlements in the forests, earning reputations as criminals and thieves. Others moved to the cities, and when the cities proved no kinder, they signed contracts promising seven years of hard labor in exchange for the price of passage to the Americas, and were boarded onto boats.

A trip to Virginia cost Frethorne and others like him six months’ salary and took about 10 weeks. One quarter to one half of new arrivals to Virginia and the Carolinas died within one year due to diseases like dysentery, typhoid, and malaria. Others succumbed to the strain of hard labor in a new climate and a strange place — an adjustment process the English described as “seasoning.” Only 7% of indentured servants ever claimed the land they had been promised.

Most of these common English migrants did not read or write, so vivid and revealing letters like Frethorne’s are rare. But in the research for my book Why We Left: Untold Stories and Songs of America’s First Immigrants, I learned how English migrants viewed their situation through the songs they sang about the voyage across the Atlantic Ocean. Those songs survived hundreds of years by word of mouth before they were written down in the 20th century.

These were not songs of thankfulness — not by a long shot. They were ballads full of ghastly scenes of the rejection, betrayal, cruelty, murder, and environmental ruin that had driven them out of England — and of the seductive but false promises that drew them to America. These 17th century songs planted the seeds for a new American genre of murder and hard luck ballads that was later picked up and advanced by singers like Johnny Cash, whose ancestors, like mine, were among those early hard luck migrants from England to America.

So if you find yourself a little blue this holiday season, take your marshmallow-topped sweet potatoes with a liberal dose of the Man In Black, and reassure yourself that you are a part of a long, long American tradition.

Joanna Brooks is Associate Dean of Graduate and Research Affairs at San Diego State University and author of Why We Left: Untold Stories and Songs of America’s First Immigrants (Minnesota, 2013). She wrote this for Zocalo Public Square.

TIME Opinion

The Reason Every One of Us Should Be Thankful

Thanksgiving Preparations
Illustration of preparing the Thanksgiving meal circa 1882. Kean Collection / Getty Images

As Thanksgiving approaches, a little bit of historical context goes a long way

Astronomy is a historical science because the distance scales involved are so immense that to look out into space is to look back into time. Even at the almost unfathomable speed of light — 300,000 kilometers per second — the sun is eight light minutes away, the nearest star is 4.3 light years away, the nearest galaxy, Andromeda, is about 2.5 million light years away and the farthest object ever observed is about 13.8 billion light years away. Astronomers call this way of describing such distances “lookback time.”

The concept is not limited to astronomy: current events also have their own lookback times, accounting for what gave rise to them. Just as looking at a star now actually involves seeing light from the past, looking at the world today actually involves looking at the reverberations of history. We have to think about the past in order to put current events into proper context, because that’s the only way to track human progress.

Consider the longing many people have for the peaceful past, filled with bucolic scenes of pastoral bliss, that existed before overpopulation and pollution, mass hunger and starvation, world wars and civil wars, riots and revolutions, genocides and ethnic cleansing, rape and murder, disease and plagues, and the existential angst that comes from mass consumerism and empty materialism. Given so much bad news, surely things were better then than they are now, yes?

No.

Overall, there has never been a better time to be alive than today. As I document in my 2008 book The Mind of the Market and in my forthcoming book The Moral Arc, if you lived 10,000 years ago you would have been a hunter-gatherer who earned the equivalent of about $100 a year — extreme poverty is defined by the United Nations as less than $1.25 a day, or $456 a year — and the material belongings of your tiny band would have consisted of about 300 different items, such as stone tools, woven baskets and articles of clothing made from animal hides. Today, the average annual income in the Western world — the U.S. and Canada, the countries of the European Union, and other developed industrial nations — is about $40,000 per person per year, and the number of available products is over 10 billion, with the universal product code (barcode) system having surpassed that number in 2008.

Poverty itself may be going extinct, and not just in the West. According to UN data, in 1820 some 85 to 95% of the world’s people lived in poverty; by the 1980s that figure was below 50%, and today it is under 20%. Yes, 1 in 5 people living in poverty is too many, but if the trends continue, then by 2100, and possibly even by 2050, no one in the world will be poor, including in Africa.

Jesus said that one cannot live on bread alone, but our medieval ancestors did nearly that. Over 80% of their daily calories came from the nine loaves a typical family of five consumed each day. Also devoured was the 60 to 80% of a family’s income that went to food alone, leaving next to nothing for discretionary spending or retirement after housing and clothing expenses. Most prosperity has happened over the two centuries since the Industrial Revolution, and even more dramatic gains have been enjoyed over the last half-century. From 1950 to 2000, for example, the per capita real Gross Domestic Product of the United States went from $11,087 (adjusted for inflation and computed in 1996 dollars) to $34,365, more than a threefold increase in comparable dollars! This has allowed more people to own their own homes, and for those homes to double in size even as family size declined.

For centuries human life expectancy bounced around between 30 and 40 years, until the average went from 41 in 1900 to the high 70s and low 80s in the Western world in 2000. Today, no country has a lower life expectancy than the country with the highest life expectancy did 200 years ago. Looking back a little further, around the time of the Black Death in the 14th century, even if you escaped one of the countless diseases and plagues that were wont to strike people down, young men were 500 times more likely to die violently than they are today.

Despite the news stories about murder in cities like Ferguson and rape on college campuses, crime is down. Way down. After the crime wave of the 1970s and 1980s, homicides plummeted by 50 to 75% in such major cities as New York, Los Angeles, Boston, Baltimore and San Diego. Teen criminal acts fell by over 66%. Domestic violence against women dropped 21%. According to the U.S. Department of Justice, the overall rate of rape declined 58% between 1995 and 2010, from 5.0 per 1,000 women age 12 or older to 2.1. And on Nov. 10, 2014, the FBI reported that in 2013, across more than 18,400 city, county, state, and federal law enforcement agencies that report crime data to the FBI, every crime category saw declines.

What about the amount of work we have today compared with that of our ancestors? Didn’t they have more free and family time than we do? Don’t we spend endless hours commuting to work and toiling in the office until late into the neon-lit night? Actually, the total hours of life spent working has been steadily declining over the decades. In 1850, for example, the average person invested 50% of his or her waking hours in the year working, compared to only 20% today. Fewer working hours means more time for doing other things, including doing nothing. In 1880, the average American enjoyed just 11 hours per week in leisure time, compared to today’s 40 hours per week.

That leisure time can be spent in cleaner environments. In my own city of Los Angeles, for example, in the 1980s I had to put up with an average of 150 “health advisory” days per year and 50 “stage one” ozone alerts caused by all the fine particulate matter in the air—dirt, dust, pollens, molds, ashes, soot, aerosols, carbon dioxide, sulfur dioxide and nitrogen oxides—a.k.a. smog. Thanks to the Clean Air Act and improved engine and fuel technologies, in 2013 there was only one health advisory day and zero stage-one ozone alerts. Across the country, even with the doubling of the number of automobiles and an increase of 150% in the number of vehicle-miles driven, smog has diminished by a third, acid rain by two-thirds, airborne lead by 97%, and CFCs are a thing of the past.

Today’s world has its problems — many of them serious ones — but, while we work to fix them, it’s important to see them with astronomers’ lookback-time eyes. With their historical context, even our worst problems show that we have made progress.

Rewind the tape to the Middle Ages, the Early Modern Period or the Industrial Revolution and play it back to see what life was really like in a world lit only by fire. Only the tiniest fraction of the population lived in comfort, while the vast majority toiled in squalor, lived in poverty and expected half their children would die before adulthood. Very few people ever traveled beyond the horizon of their landscape, and if they did it was either on horseback or, more likely, on foot. No Egyptian pharaoh, Greek king, Roman ruler, Chinese emperor or Ottoman sultan had anything like the most quotidian technologies and public-health benefits that ordinary people take for granted today. Advances in dentistry alone should encourage us all to stay away from time machines.

As it turns out, these are the good old days, and we should all be thankful for that.

Michael Shermer is the Publisher of Skeptic magazine, a monthly columnist for Scientific American, and a Presidential Fellow at Chapman University. He is the author of a dozen books, including Why People Believe Weird Things and The Believing Brain. His next book, to be published in January, is entitled The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom.

TIME photography

The Fire Last Time: LIFE in Watts, 1966

A year after the Watts Riots in 1965, LIFE magazine revisited the neighborhood through a series of color pictures by photographer Bill Ray.

The August 1965 Watts Riots (or Watts Rebellion, depending on one’s perspective and politics), were among the bloodiest, costliest and — in the five decades since they erupted — most analyzed uprisings of the notoriously unsettled mid-1960s. Ostensibly sparked by an aggressive traffic stop of a black motorist by white cops — but, in fact, the combustive result of decades of institutional racism and municipal neglect — the six-day upheaval resulted in 34 deaths, more than 3,400 arrests and tens of millions of dollars in property damage (back when a million bucks still meant something).

The Fire Last Time: Life in Watts, 1966
Bill Ray—Time & Life Pictures/Getty Images

A year after the flames were put out and the smoke cleared from the southern California sky, LIFE revisited the scene of the devastation for a “special section” in its July 15, 1966, issue that the magazine called “Watts: Still Seething.” A good part of that special section featured a series of color photos made by Bill Ray on the streets of Watts: pictures of stylish, even dapper, young men making and hurling Molotov cocktails; of children at play in torched streets and rubble-strewn lots; of wary police and warier residents; of a community struggling to save itself from drugs, gangs, guns, idleness and an enduring, corrosive despair.

In that July 1966 issue, LIFE introduced Ray’s photographs, and Watts itself, in a tone that left no doubt that, whatever else might have happened in the months since the streets were on fire, the future of the district was hardly certain, and the rage that fueled the conflagration had hardly abated:

Before last August the rest of Los Angeles had never heard of Watts. Today, a rock thrown through a Los Angeles store window brings the fearful question: “Is this the start of the next one?” It brings the three armed camps in Los Angeles — the police, white civilians, the Negroes — face to face for a tense flickering moment. . . .

Whites still rush to gun stores each time a new incident hits the papers. A Beverly Hills sporting goods shop has been sold out of 9mm automatics for months, and the waiting list for pistols runs several pages.

Last week a Negro showed a reporter a .45 caliber submachine gun. “There were 99 more in this shipment,” he said, “and they’re spread around to 99 guys with cars.”

“We know it don’t do no good to burn Watts again,” a young Negro says. “Maybe next time we go up to Beverly Hills.”

Watts seethes with resentments. There is anger toward the paternalism of many job programs and the neglect of Watts needs. There is no public hospital within eight miles and last month Los Angeles voters rejected a proposed $12.3 million bond issue to construct one. When a 6-month-old baby died not long ago because of inadequate medical facilities, the mother’s grief was echoed by a crowd’s outrage. “If it was your baby,” said a Negro confronting a white, “you’d have an ambulance in five minutes.”

Unemployment and public assistance figures invite disbelief in prosperous California. In Watts 24% of the residents were on some form of relief a year ago — and that percentage still stands. In Los Angeles the figure is 5%.

[It] takes longer to build a society than to burn one, and fear will be a companion along the way to improvements. “I had started to say it is a beautiful day,” Police Inspector John Powers said, looking out a window, “but beautiful days bring people out and that makes me wish we had rain and winter year-round.”

For his part, Bill Ray recalls the Watts assignment clearly, and fondly:

In the mid-nineteen-sixties [Ray recently told LIFE.com], I shot two major assignments for LIFE in southern California, one after the other, that involved working with young men who were volatile and dangerous. One group was the Hells Angels of San Bernardino — the early, hard-core San Berdoo chapter of the gang — and the other were the young men who had taken part in the Watts riots the year before.

I did not try to dress like them, act like them or pretend to be tough. I showed great interest in them, and treated them with respect. The main thing was to convince them that I had no connection with the police. The thing that surprised me the most was that, in both cases, as I spent more time with them and got to know them better, I got to like and respect many of them quite a lot. There was a humanity there that we all have inside us. Meeting and photographing different kinds of people has always been the most exciting part of my job. I still love it.

Two big differences in the assignments, though, were that I shot the Hells Angels in black and white — which was perfect for their gritty world — and “Watts: A Year Later” was in color. Also perfect, because Watts had a lot of color, on the walls, the graffiti, the way people dressed — and, of course, my group of bombers who liked to practice making and throwing Molotov cocktails [see slides 17, 18 and 19 in gallery].

Those two assignments documented two utterly marginalized worlds that few people ever get to see up close. There was no job on earth as good as being a LIFE photographer.


Ben Cosgrove is the Editor of LIFE.com


© Bill Ray

Bill Ray (at right, on assignment in Sikkim in the Himalayas in 1965) was a staff photographer for LIFE from the mid-1960s until the magazine’s demise in the early 1970s.

Based in New York, Beverly Hills and Paris, he traveled the world covering major events, wars and great personalities, from Elvis Presley and Audrey Hepburn to JFK, Marilyn Monroe, the Beatles, Ray Charles, Frank Lloyd Wright, Brigitte Bardot and many more. See the LIFE.com gallery, “LIFE Rides With Hells Angels.”

[See more of Bill Ray’s work at BillRay.com]


TIME Civil Rights

FBI Letter to Martin Luther King Jr. Reveals Ugly Truths From Hoover’s Era

Martin Luther King, Jr., 1964
"First person in the Western world to have shown us that a struggle can be waged without violence"
Gamma-Keystone/Getty Images

MLK is depicted as evil and a fraud in the letter that urges the civil rights icon to commit suicide

A scathing letter sent by the Federal Bureau of Investigation to Civil Rights icon Rev. Martin Luther King Jr. has been uncovered, pulling back the curtain on J. Edgar Hoover’s efforts to discredit the leader as his popularity grew.

In the anonymous letter, published for the first time in the New York Times Wednesday, the author refers to the Nobel Peace Prize recipient as “evil,” a “fraud,” and a “dissolute, abnormal moral imbecile.” The author threatens to expose King as an adulterer and in the end flat-out suggests that the leader commit suicide.

One passage reads: “No person can overcome facts, not even a fraud like yourself. Lend your sexually psychotic ear to the enclosure. You will find yourself in all your dirt, filth, evil, and moronic talk exposed on the record for all time. I repeat—no person can argue successfully against facts. You are finished.”

The FBI under Hoover devoted a great deal of attention to Dr. King, whom Hoover considered a threat to national security, Vox reports. The letter reportedly came to be after Hoover failed to prove that King was a Communist, a charge that could have been used to disgrace him. As Beverly Gage, a professor of American history at Yale, wrote in the New York Times, the letter is “the most notorious and embarrassing example of Hoover’s F.B.I. run amok.”

Read the full letter at the New York Times.

TIME Veterans Day

Sen. John McCain Remembers the Female Vets of the Gulf War

McCain is a U.S. Senator and the author, with Mark Salter, of Thirteen Soldiers: A Personal History of Americans at War.

Among the subjects profiled in Thirteen Soldiers: an army reservist whose life was forever changed by an Iraqi Scud missile attack in the 1990–91 Gulf War

Military service was a tradition in the families who joined the Army Reserve’s 14th Quartermaster Detachment. They came from communities and circumstances that yield more volunteers for the military than do other parts of our society. They lived in a part of Pennsylvania where so many young people were in the military that “whenever a disaster happens anywhere in the world,” a local reporter observed, “people around here hold their breath.” They were likely to know some of the casualties in the February 25, 1991, Scud missile attack in Saudi Arabia that killed 28 reservists.

Specialist Beverly Sue Clark, 23, was from Indiana County, Pennsylvania. She had joined the Reserves out of high school. She worked as a security guard and as a secretary at a local window and door manufacturer. She wanted to be a teacher. She was popular and athletic and loved to ski. Her best friend in the 14th Quartermaster Detachment, headquartered in Greensburg, Pennsylvania, was Mary Rhoads, a meter maid in California Borough, Pennsylvania. Mary joined the Army Reserve in 1974, during the summer between her junior and senior years at Canon-McMillan Senior High, south of Pittsburgh. She didn’t have clear plans for her life after graduation, and she thought a part-time job in the army would let her follow in the family tradition and bring home much needed extra income.

In 1979 she transferred from the engineering company to the 1004th General Supply Company, also based at the Army Reserve Center in Greensburg. Mary and Beverly became friends when Beverly joined the 1004th in 1985. They hit it off right away. Mary, ten years in the Reserves by then, took the younger woman under her wing. When Mary transferred to the 14th Quartermaster Detachment at Greensburg in 1988, Bev followed her. They were close, and thought they always would be. They would watch each other’s kids grow up. Mary’s daughter, Samantha, called Beverly “Aunt Bev” and always pestered Mary to pass the phone to Beverly when she called home.

Predictions varied about how many dead and wounded the United States would suffer in the war. Most were wildly off the mark. The U.S. Armed Forces were immeasurably better war fighters, better armed and equipped, and better led than the armed forces of the Republic of Iraq. None of the prognosticators realized just how much of a war you could fight from the air over a desert battleground where the enemy parked his tanks and artillery in the glaring sun and sheltered his soldiers in sand berms. Nor did they appreciate just how determined Desert Storm’s commander, General H. Norman Schwarzkopf Jr., was to use the immense force he assembled to keep casualties low.

Given the nature of the war—a long air campaign followed by a short ground war and Iraq’s quick capitulation—casualties were far fewer than even the most optimistic analyst had expected. But there were casualties: 149 killed in action, a comparable number of noncombat deaths, and eight hundred or so wounded. Three hundred graves over which three hundred families wept and prayed. Many thousands of survivors wept too and bore their own wounds, seen and unseen. It helps none of them to know it could have been worse.

***

In January President Bush authorized the call-up of one million reservists and national guardsmen for up to two years. The sixty-nine soldiers of the 14th Quartermaster Detachment had started hearing scuttlebutt back in November that they would eventually deploy to the Gulf. Their order to mobilize came on January 15, 1991, the day before Desert Storm commenced. They left for Saudi Arabia on February 18 and arrived at the air base the next day. They were quartered temporarily in a large corrugated metal warehouse in Al Khobar, a suburb several miles from Dhahran.

Of course, they wouldn’t be on the front lines, although to do their jobs they would eventually have to move closer to the front than Al Khobar, two hundred miles to the rear. Some soldiers had premonitions, as soldiers off to war often do. Beverly Clark told her friend Mary Rhoads she had a bad feeling about the whole thing. She also mentioned her apprehension in the journal she kept. Soldiers’ families have premonitions too, especially the mothers. Just before she passed away from pancreatic cancer in November, Rhoads’s mother had told her that something terrible would happen but that Rhoads would be okay. Whatever fears disturbed them, none of the reservists resented their call-up.

Eleven of the reservists in the 14th who deployed to Saudi Arabia were women. The Persian Gulf War occasioned the largest single deployment of women to a combat zone in American military history. Forty-one thousand officers and enlisted—one out of every five women in uniform—deployed. They were pilots, aircrew, doctors, dentists, nurses, military police, truck drivers, communications technicians, intelligence analysts, security experts, administrative clerks, and water purification specialists deployed to a society built on tribalism, Islamic fundamentalism, and primitive notions of gender inequality. Thirteen of them would be killed, four from enemy fire. Twenty-one were wounded in action and two taken prisoner. They did just about everything the men did, including flying missions and accepting other assignments that blurred the lines separating women from combat roles. But this was a war where lines were readily blurred. Even the idea of a front line seemed an anachronism in a war where so much of the fighting was in the air and where missiles were fired at targets located far to the rear, even at a country that wasn’t a belligerent. The metaphor “a line in the sand” has come to mean a statement of resolve, but it originally indicated something impermanent, something that disappears in the first breeze. That is an apt metaphor for the Persian Gulf War, where the front was, literally and figuratively, a line in the sand. Even two hundred miles in the rear, the front could suddenly encompass you.

For soldiers of an active disposition who weren’t involved in the fighting, the Gulf War, irrespective of its high-tech thrills, stunning successes, and surprising brevity, could be stultifying. Mary Rhoads was bored to tears sitting in that big warehouse, and she hated being bored. She had spent seventeen years in the Army Reserve, half her life. She looked at the kids in the unit as her kids, saw herself as the mother hen. She picked up stuff they liked to eat, things to read, games to play, anything that might shorten the days until they were sent forward to do the job they had come to do. She had purchased a Trivial Pursuit game, among other diversions, and it was instantly a favorite entertainment in the barracks. She still felt closest to Clark. They both brought teddy bears with them to war; Clark’s was white and Rhoads’s brown. One night they were both on guard duty on the warehouse roof when Bev noticed a mist forming in the desert. “Look,” she pointed, “the angel of death.” Rhoads would remember that through all the years that followed, wondering if her friend had had another premonition.

***

The Iraqis fired four Scuds the night of February 25. Three of them appeared to break up in the atmosphere. The missile fired at 8:32 p.m. was detected by satellite and its position relayed to Patriot crews in Saudi Arabia. Three batteries tracked it on their radarscopes but didn’t launch their missiles because the Scud was outside their respective sectors. Two batteries, Alpha and Bravo, protected the air base at Dhahran. Bravo was shut down for maintenance that night. Alpha’s crew had been alerted to the Scud traveling in their direction, but their screen was blank. They checked to make sure their equipment was operating properly and were satisfied that it was. Still they saw nothing. They didn’t know their range gate had miscalculated the missile’s whereabouts. No one knew a Scud was plunging to earth at five times the speed of sound above the big metal warehouse where 127 reservists were living.

Ten minutes later, driving down the highway toward Dhahran, Rhoads heard the siren. She and her companions pulled off the road and watched as the Scud slammed into the barracks and detonated, creating a red and orange inferno that engulfed twisted beams, flying shrapnel, the modest possessions and mementos of the dead, and their charred bodies. Twenty-eight people were killed and ninety-nine wounded, grievously wounded in many cases. Among the dead were thirteen reservists in the 14th Quartermaster Detachment, including Clark. Forty-three of the reservists wounded in the attack were from the 14th, which meant the detachment had suffered in a single attack a casualty rate higher than 80 percent, about as high a rate as any recorded. They had been in Saudi Arabia only six days.

Rhoads and her companions raced back to the base. They had to climb a fence to get into the compound, where all was bedlam. Fire trucks and ambulances had raced to the scene, sirens wailing. Blackhawks descended from the dark heavens to airlift the most seriously wounded. Rhoads tried to enter the burning building, but one of the rescuers stopped her. “My friends are in there,” she repeated over and over again. “You don’t want to go in there,” he warned her. When the ambulances pulled away, she ran to the other side and entered the building there. The smell of burned flesh, of death, filled her nostrils. She thought they were all dead. A moment later she tripped over a girder, wrenching her knee. A soldier in a transportation unit pulled her back outside and told her to stay there. That was where she saw the bodies. The Vietnam veterans in the unit who survived the attack had retrieved them and lined them up side by side. She recognized Clark right away. She limped over to her friend, embraced her lifeless form, and shrieked at the treacherous night, while a news camera recorded her agony.

Everyone who wasn’t badly hurt was quartered that night in a large, convention center–like meeting space, where television sets replayed the disaster on what seemed a continuous loop. Rhoads called her husband to let him know she was alive and reported to a sergeant back at the Reserve center in Greensburg. Then she and a few others, impatient and wanting to help, commandeered a van and drove first to the warehouse, then to different hospitals to locate the wounded, and then to the morgue to identify the dead. Rhoads identified the bodies of Tony Madison, Frank Keough, and Beverly Clark.

***

Rhoads eventually returned to her job with her leg in a big white brace. She was eager to get going; she wanted her life back. Something was wrong, though. She had frequent nightmares; she lost her temper. She used to shrug off the kids who hassled her and called her names for giving them a parking ticket; now she got into it with them, right in their faces, daring them. She wasn’t herself. She froze once while directing traffic when she heard an emergency vehicle’s siren. Then she started getting really sick.

Chronic vaginal bleeding resulted in a hysterectomy. She had her gallbladder and her appendix removed. Stomach ailments, headaches, sinus troubles, and serious difficulty breathing brought her to Walter Reed Army Medical Center in Washington, D.C., then the hospital in Brownsville, Pennsylvania, then the VA hospital in Pittsburgh, then back to Walter Reed and again to Pittsburgh. Doctors discovered precancerous cells in her esophagus. She developed liver disease.

These and other ailments were attributed to the mysterious malady that afflicted many Desert Storm veterans, called Persian Gulf War syndrome. None of the doctors Rhoads saw in Washington or Pittsburgh could figure out what was making her so sick. She was becoming almost completely incapacitated. Scott Beveridge and another local reporter, Connie Gore, took a genuine interest in her case and wrote about her often. Her local congressman, Frank Mascara, and his aide, Pam Snyder, got involved and pushed the VA to recognize that, whatever its cause, Gulf War syndrome was real and was destroying the lives of people who had risked everything to serve their country, people who deserved their government’s attention to their service-related illnesses. Their persistent appeals on her behalf resulted in a full disability pension, one of the first awarded to a sufferer of Gulf War syndrome. She gave testimony to the Senate Veterans Affairs Committee in 1991 and traveled to Washington in 1995, while very ill, to testify to President Bill Clinton’s Advisory Commission on Gulf War Illnesses. Congressman Mascara began his statement in a House Veterans Affairs Committee hearing by invoking her as the poster child for Gulf War syndrome.

When word got around about his successful intervention on Rhoads’s behalf, Mascara’s office was swamped with calls from veterans around the country who, like Rhoads, had been plagued by numerous illnesses since coming home from the Gulf. No one has yet established a cause or causes of the disorder, which appears to weaken the immune system, making its victims susceptible to multiple illnesses. There are many theories—fumes from the oil well fires, reactions to inoculations, Iraq’s undetected use of chemical weapons, Scud warheads carrying biological agents, combat stress—but none has been proven. Whatever its cause, thousands of Gulf War veterans suffer chronic and multiple illnesses attributed to it.

After her testimony to President Clinton’s advisory commission, Rhoads dropped out of public view. Beveridge wrote that he had received “anonymous hate mail” attacking Rhoads for publicizing her suffering and condemning the deployment of women to war theaters. It appears she heard some of the same criticism. She might have been estranged, for a brief time anyway, from a few others in her unit. When asked, she said the 14th was like a family, and like all families, they have their squabbles and then make up. “We love each other,” she maintains.

THIRTEEN SOLDIERS

John McCain is a United States Senator and the author, with Mark Salter, of Thirteen Soldiers: A Personal History of Americans at War, out today. He served in the U.S. Navy from 1954 until 1981.

Mark Salter is the author, with John McCain, of several books, including Faith of My Fathers. He served on Sen. McCain’s staff for 18 years.

From Thirteen Soldiers: A Personal History of Americans at War, by John McCain and Mark Salter. Copyright © 2014 by John McCain and Mark Salter. Reprinted by permission of Simon & Schuster, Inc.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.
