TIME conflict

Why Mar. 7, 1965, Was the End of an Era

LBJ Signs Civil Rights Bill
Warren Leffler / Getty Images. President Lyndon Johnson signs the Civil Rights bill on April 11, 1968

The events of exactly 50 years ago marked both the climax and the beginning of the end of the last heroic age in American history

Seldom had the United States seemed to be more blessed than in early 1965. The country’s economy had been expanding steadily for four years, and unemployment was about 4%. Inflation, a serious problem in the late 1940s and early 1950s, was now negligible. Meanwhile, thanks to President Johnson’s predecessor, John F. Kennedy, the nightmare of nuclear war had retreated during the previous two years. The nation was freer to take care of its own needs, and it was building many thousands of miles of interstate highways and dozens of new schools and colleges to meet the needs of the Baby Boom generation, now between 5 and 22 years old. At the same time, Johnson was determined to push Medicare through Congress, making it possible for seniors to enjoy old age without the threat of an illness that might take away not only their own savings, but those of their grown children.

One problem, however, still deeply troubled the nation: civil rights. For some weeks, Martin Luther King, Jr. had been leading a protest for voting rights in Selma, Ala., designed to open up voter registration rolls to the black citizens of that community and, more broadly, of the whole Deep South. Johnson had been preparing for months for possible additional legislative action to solve this problem, and the Justice Department had prepared a bill to allow the federal government to send registrars into the most affected states to sign up voters where local officials refused to do so. Johnson had been promising King such a bill for weeks, but he was waiting for an overwhelming northern consensus before putting the legislation before Congress.

A dramatic event was necessary to get the attention of the whole nation, and on Sunday, Mar. 7, King’s Southern Christian Leadership Conference and the Student Non-violent Coordinating Committee (SNCC), together with Sheriff Jim Clark of Selma and the Alabama state troopers who opposed them, provided it. Defying Alabama authorities, marchers were savagely beaten when they tried to cross the Edmund Pettus Bridge. National television cameras captured the spectacle of troopers beating Americans who simply wanted to vote, and the momentum for action became irresistible. A week later, LBJ called for the Voting Rights Act before a joint session of Congress, and concluded, “We Shall Overcome.”

The premise of the civil rights bills, in those days, was simple. The United States was on the right track. The nation’s extraordinary economic growth had given the mass of the American people an unprecedented standard of living. It had defeated totalitarian states in the Second World War and held the line against Communism in the 20 years that followed. It stood for freedom and opportunity. The exclusion of Negro Americans (as they were still called) from the rights, privileges and benefits of American life was a blot upon the nation and a contradiction to its values, one that had to be removed were the United States to live up to its promises. This belief, in the wake of the 1964 election, clearly enjoyed overwhelming support. Johnson had won 60% of the popular vote and 44 states against Barry Goldwater, who had opposed the Civil Rights Act of 1964. LBJ could get virtually any proposal he wanted through Congress, and that was what he was about to do.

Yet on that very same day, Mar. 7, Johnson had already set in motion another series of events that would destroy not only his Presidency, but the entire postwar consensus which he seemed at that moment to embody: U.S. Marines had landed at Da Nang—the first American ground forces to enter South Vietnam.

Several weeks earlier, in mid-February, the sustained bombing of North Vietnam had begun, in response to a Viet Cong attack on a U.S. air base at Pleiku. Johnson calculatedly refused to acknowledge that the nation’s policy had changed, and the Administration portrayed the initial Marine deployment as a defensive measure. But that was false. The deployment was the first step in the execution of a massive war plan, one that would bring close to 200,000 young Americans into South Vietnam by the end of the year, and more than half a million by 1968.

Like the Voting Rights Act, the landing at Da Nang had a long history, reflecting widely held beliefs in postwar America. It was the Eisenhower Administration, as I discovered twenty years ago when writing my book, American Tragedy, that had committed to defending Vietnam and neighboring Laos against Communist aggression. Eisenhower had been on the verge of implementing that commitment, but Kennedy had immediately backed away from it. During the first year of his presidency, Kennedy received a long series of specific proposals from his cabinet and bureaucracy to send combat troops into Laos, Vietnam or both, but he rejected them all and refused to declare South Vietnam a vital interest of the United States. Instead, he increased the American presence with 3,000 advisers and about 14,000 support troops, and focused on other foreign policy issues.

The public did not know this, but Johnson had announced that Vietnam was his Administration’s foreign policy priority within weeks of taking office. By March 1964 he had expressed willingness to go to war to save South Vietnam as soon as he had been elected in his own right. A planning exercise for American action began within days of his victory in November 1964, and it was finished by the first week of December. Sometime in the near future, the final paper stated, the United States would begin a sustained bombing campaign against North Vietnam, accompanied by “appropriate US deployments to handle any contingency.” An appendix to the approved document spelled out what that meant. The Marine battalion that had been floating in ships off Da Nang for months would land three days after the bombing began, and an entire Marine division would embark to join it. An Army brigade would follow at once, and several more divisions would arrive by the summer.

During the second half of February 1965, while the Selma protests grew and Great Society legislation began moving through Congress, Johnson refused to approve the war plan in writing, much less to announce it to the nation, but he evidently told Robert McNamara to go ahead. The other planned deployments followed the Marine landing at Da Nang almost at once. Johnson and his whole generation—the GI or “greatest” generation, as it has come to be known—believed in great enterprises. Civil rights was one such enterprise; defending threatened nations from Communism was another. With interstate highways spreading around the country, the Apollo program well underway and a booming economy, they—and most of their countrymen—thought they could do anything. More than a few prominent voices were already expressing skepticism about the Vietnam war, but they were drowned out.

Still, within two more years, the war had alienated much of the younger generation, substantially lowered Johnson’s popularity and split the Democratic Party. It also split the civil rights movement, and eventually broke the alliance between Johnson and Martin Luther King, Jr. American participation continued until 1973, on a scale that would be unimaginable today. The entire roster of those killed in action in the Iraq and Afghanistan wars combined is still smaller than the toll of six months of combat during 1968. But the war could not be won.

Tragic heroes, Aristotle argued, are brought down not by evil, but by a combination of their own ambition for greatness and an ignorance of the facts. So it was for Lyndon Johnson, the “best and the brightest” who served him and, indeed, the whole leadership class of the postwar United States. The biggest casualty of the war, one from which we still suffer, was the consensus, the optimistic spirit and the belief in our national purpose. The landing at Da Nang was the beginning of the end of one of history’s greatest eras of civic achievement. No American under 55 has ever experienced such a spirit, and it is not clear that any American now living will experience it again. That is the tragic legacy of the events of March 7, 1965.

The Long View

Historians explain how the past informs the present

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME language

Winston Churchill Did Not Coin the Phrase ‘Iron Curtain’

Churchill's 'Iron Curtain' Speech
George Skadding—The LIFE Picture Collection/Getty. British Prime Minister Winston Churchill delivers a speech at Westminster College that addressed the Communist threat, and in which he uttered the now-famous phrase ‘Iron Curtain,’ Fulton, Mo., Mar. 5, 1946.

On the anniversary of his famous speech, TIME takes a look at why people misattribute quotes and just plumb make them up

Exactly 69 years ago, on Mar. 5, 1946, Winston Churchill stood in a college gymnasium in Fulton, Mo., at the beginning of the Cold War, while President Harry S. Truman sat behind him in a gown and mortarboard. Speaking to students gathered at Westminster College, he accepted an honorary degree and famously condemned the Soviet Union’s ways: “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the Continent.”

The actual title of Churchill’s speech was “Sinews of Peace,” though most people know it as the “Iron Curtain speech.” Over the years there has been another twist of the record. Churchill often gets credit for coining that metallic metaphor—on that stage—for the figurative barrier drawn across Europe between the capitalist West and the communist East. But he did not. In fact, there’s evidence of the phrase being used to mean exactly that a good 26 years earlier, when an E. Snowden (seriously) published a travelogue about her adventures in Bolshevik Russia.

So why do quotes get false histories? Lots of reasons.

Misattribution can be convenient. It’s easy not to question a coinage that seems plausible—especially when it just so happens to lend gravitas by association.

“You reach for a famous name to give authority,” says Elizabeth Knowles, editor of the Oxford Dictionary of Quotations. “You want to say Churchill said it. Because you have associated what you’re saying with that particular person, it gives the saying a bit more oomph.” Iron curtain just feels like something a dulcet, witty orator like Churchill would come up with, right? And it’s a much clearer signal that you’re educated and that your words have heft if you attribute a quote to Winston Churchill than to Snowden, an unremembered member of a trade-union delegation.

Many times people invoke quotations that were never said at all.

“Play it again, Sam.” Neither Bogart nor Bergman said these words.

“Elementary, my dear Watson.” Doyle wrote no such thing.

“Beam me up, Scotty.” Sorry, nope.

These get passed on because we wish people had uttered them. “A misquotation of that kind can be, almost, what you feel somebody ought to have said,” says Knowles. “It summarizes for somebody something very important about a particular film, a particular relationship, a particular event.” Even if it’s made up and especially if it’s close to things people really did say, we embrace it as gospel. After all, Bergman did utter, “Play it, Sam,” in Casablanca. And Bogart did say, “If she can stand it, I can! Play it!”

Sometimes misquotations get handed down because they convey the right idea and sound better to us than what the person actually said.

In 1858, Abraham Lincoln gave a speech in which he said, “To give victory to the right, not bloody bullets, but peaceful ballots only, are necessary.” Over time, that sentiment has been recrafted as, “The ballot is stronger than the bullet.” The latter version is snazzier. Even when sources know this precise phrasing was probably never really used by Lincoln, they continue to pass it on. Take Dictionary.com’s quotes site, where the well-sourced quote from 1858 is in the fine print. Also in fine print is the admission that the quote in giant font across the top of the page was “reconstructed” 40 years after Lincoln was supposed to have said it. Which, as far as editors at Oxford are concerned, he did not.

Screen shot taken from Dictionary.com

“It’s a very natural thing, that we edit as we remember,” Knowles says. “So when we quote something we very often have in mind the gist of what’s being said. So we may alter it slightly and we may just make it slightly pithier or simpler for someone else to remember. And that’s the form that gets passed on.”

While it may be easier to remember that Churchill invented the iron curtain, here’s the real history:

In its earliest use, circa 1794, an “iron curtain” was a literal iron screen that would be lowered in a theater to protect the audience and auditorium from any fire occurring backstage. From there, it became a general metaphor for an impenetrable barrier. In 1819, the Earl of Munster described the Indian river Betwah as an iron curtain that protected his group of travelers from an “avenging angel” of death that had been on their heels in that foreign land. Then, in 1920, Ethel Snowden made it specifically about the East and West in Throughout Bolshevik Russia:

At last we were to enter the country where the Red Flag had become a national emblem, and was flying over every public building in the cities of Russia. The thought thrilled like new wine … We were behind the ‘iron curtain’ at last!

Read TIME’s original coverage of the Mar. 5, 1946, speech, here in the TIME Vault: This Sad & Breathless Moment



TIME politics

See Photos From the Speech that Made ‘Iron Curtain’ a Household Term

On March 5, 1946, Winston Churchill warned that without a "special relationship" between Britain and the United States, World War III could be imminent

“The world last week listened to some blunt words from Winston Churchill.” When LIFE reported on the former (and future) prime minister’s now-famous Iron Curtain speech delivered on March 5, 1946, the magazine suggested that its readers would do well to listen. Churchill had somberly told a small audience at Westminster College in Fulton, Missouri, that the time had come to take action to prevent World War III. “Coming from a man who warned of World War II before the world was ready to listen, his words commanded sober thought.”

Churchill’s speech, which he titled “The Sinews of Peace,” came at a time when the Allies’ coalition with the Soviet Union had morphed into distrust. “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent,” he warned. The Soviet Union, he said, was exerting unchecked control over an expanding geographic area. The only way to keep it in check was through a fortified alliance between the U.S. and Britain.

Though the speech is famous for introducing “iron curtain” into the Cold War vernacular, it was not the first time the phrase had been uttered. It was used in the Babylonian Talmud (compiled between the 3rd and 5th centuries) and by Queen Elisabeth of Belgium to describe her country’s stance against Germany as World War I began. It appeared several times in literature, and Churchill himself used the term in telegrams to President Truman in 1945.

But from that day forward, no other phrase would better encapsulate the line—both geographic and ideological—that separated the Eastern Bloc from the West. One month later, LIFE reported on the world behind the iron curtain, where inflation, food scarcity and political tumult were rapidly becoming the norm.

Churchill amped up his rhetoric to emphasize the importance of cementing his country’s partnership with the U.S. “If all British moral and material forces and convictions are joined with your own in fraternal association,” he concluded, “the highroads of the future will be clear, not only for our time, but for a century to come.”

Truman, for his part, was present at the speech, sitting onstage in a show of solidarity. He knew that Britain, which was nearly bankrupt and still enforcing food rations, was on the wane as a world power, and needed the U.S. more than the other way around. Apart from the speech that day, the two men stuck mainly to light conversation. They “talked weather, family, countryside,” LIFE reported—“little politics.”

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter at @LizabethRonk.

TIME politics

51 TIME Magazine Covers Featuring a Bush

From a father's run for Senate to a second son's possible run for President

Correction appended, March 12, 2015

Stick around American politics long enough, and your story becomes the country’s. That’s one lesson from the longevity of the Bush family.

Though not every Bush who ventured into politics made the cover of TIME — sorry, Prescott Bush — the clan, including George H.W. Bush, George W. Bush and some Barbara Bush along the way, has garnered dozens of covers over the years. (For the record, a complete gallery of the Clinton clan’s covers would be 54 slides long.) Flip through the images and you’ll notice a lot more than the years passing. The Bush story traces the fall of Nixon, the end of the Cold War, the contested 2000 election, the tragedy of Sept. 11, the war in Iraq. It is, essentially, a record of a few decades of American history.

George Bush Presidential Library and Museum/Corbis. George H. W. Bush, Jeb and George W. in 1970.

Now Jeb Bush, who has so far made only one cover cameo (Aug. 7, 2000), may be on the verge of adding another chapter to political history. No matter what happens in 2016, he’s got a good start: this week, he’s on the cover of TIME.

Correction: The original version of this story misstated the number of times a Clinton had been featured on the cover of TIME. It was 54, as of March 5, 2015.

TIME politics

The Art of the Deal

Joe Klein is TIME's political columnist and author of six books, most recently Politics Lost. His weekly TIME column, "In the Arena," covers national and international affairs.

Benjamin Netanyahu offers stark warnings (and perhaps an assist) on a pact with Iran

“Is there anybody here from Texas?” the Prime Minister of Israel asked the 16,000 assembled for the American Israel Public Affairs Committee’s annual policy conference. Of course there were. Whoops and cheers erupted. It is one of Benjamin Netanyahu’s conceits that he knows how to do American politics, how to both present himself in a user-friendly way to the American public and play the back alleys of power in Washington. He has had some success with this, but not always. His attempt to intervene in the 2012 presidential campaign on Mitt Romney’s behalf was disastrous. His strong speech on March 3 to members of Congress, assailing the ongoing nuclear negotiations with Iran, may be better received, both in America and, more to the point, in Israel, where he faces a difficult re-election campaign. “People are tired of Bibi. I’m tired of Bibi,” said an Israeli attending the AIPAC meeting. “But I have two sons in the military, and I have confidence that Netanyahu will make decisions that will keep them as safe as possible. I don’t feel the same about any of the opposition leaders.” Certainly no other potential Israeli leader could have made so powerful an appeal to Congress.

And despite the cheesy political context of the moment, there are aspects of Netanyahu’s speech that should be cheered even by those of us who believe that President Obama is pursuing the right course in seeking a nuclear deal with Iran. Netanyahu’s bluster and bombing threats have been invaluable to the negotiating process. He’s been a great scary-tough cop to President Obama’s sorta-tough constable. And Obama has needed all the help he can get. “The Persians believe that the time to get really tough is just before a deal is cut,” an Israeli intelligence expert who favors the deal told me in December. “So tell me why your President is sending nice personal letters to the Supreme Leader at exactly the wrong time?”

On the very day that Netanyahu spoke, Iranian Foreign Minister Mohammad Javad Zarif “rejected” the 10-year restrictions on Iran’s nuclear-energy program that he’d spent the past few months negotiating. If the haggle were taking place in the bazaar in Tehran, this would be the time for the U.S. to “call their bluff,” as Netanyahu said, and perhaps even counter with a 15-year deal. There would be danger in hanging tough; the Iranians could easily walk away, even though this is a deal they desperately need. The Iranian people, not just the Ayatullah’s regime, are extremely sensitive to perceived humiliation by the West; a certain, often justified, paranoia is part of the Persian DNA. “They think they invented bargaining,” a South Asian diplomat told me. “They push it too far.”

So Netanyahu’s speech was, at least, a useful reminder about the art of the deal in the Middle East. It was also a useful reminder that Iran’s extremist Shi’ite leaders are no picnic, though nowhere near the threat to American security that Sunni radicals like ISIS are. It is easy, in the midst of the current near embrace, to overstate the case for Iran. It is the most middle-class, best-educated country in the region, aside from Israel and Turkey, with the best-educated and most professional women; it also has a cheerily pro-American populace. But it is, along with Cuba, the greatest mismatch between a people and a government of any country in the world. The regime’s support for Hizballah, the Houthis in Yemen and other Shi’ite militant organizations is indefensible. A nuclear deal with Iran might grease the way for the diminution, through democracy, of the Supreme Leader’s regime–or it might further empower the Iranian Revolutionary Guards Corps, which controls at least 20% of the economy and would be enriched by the lifting of sanctions.

But here is what Netanyahu cannot argue: that his position represents a step forward. Indeed, it is the exact opposite. Right now, under the interim agreement negotiated by the U.S. and its partners, Iran has stopped–in fact, reversed–its enrichment of uranium to the 20% level. It has allowed extensive inspections of all its facilities. It has agreed to stop plans for a plutonium reactor. There is a good chance, if the deal is made, that it will continue in this mode, in compliance with the Nuclear Non-Proliferation Treaty. Netanyahu’s rhetoric that a deal would “pave” the way toward an Iranian bomb is a ridiculous overstatement; his “plan” would guarantee an Iranian rush to arms.

Revolutions grow old. It is difficult to sustain fanaticism. The Iranian people are tired of their global isolation. It may be that their semi-democratically elected leaders, as opposed to the theocratic military regime, are ready to rejoin the world. There is nothing to lose by testing that proposition–if the Iranians stop playing around and make the deal.



TIME politics

Why March 4 is a Great Day for Women in Politics

Jeannette Rankin and Frances Perkins
FPG / Getty Images (L) and Gamma-Keystone / Getty Images (R). Jeannette Rankin pictured in 1916 (L) and Frances Perkins in 1928 (R)

Two women took historic public offices on this day

On the same date, March 4, many years apart, two women made history in American politics. In 1917, Jeannette Rankin took office as a representative of Montana, the first woman ever in Congress. The first female member of a president’s cabinet, Frances Perkins, took her post on Mar. 4, 1933. Both were pioneers, carving out their roles as they went along. But their accomplishments extended beyond just showing up as the first women in what had previously been male political spheres. Rankin took an influential stand for women’s suffrage and pacifism, and Perkins’ ideas laid the groundwork for the New Deal and social security.

“They both had to invent the roles for themselves in public life,” says Kirstin Downey, author of The Woman Behind the New Deal, a biography of Perkins. “And they had to do it by being strong and independent-minded.”

The two women were born the same year, 1880 (although Perkins, who notoriously lied about her age, always claimed 1882). Rankin was raised in Montana by a rancher and a schoolteacher. After working for years in the woman’s suffrage movement, in 1914 she succeeded in winning women the right to vote in her state. Two years later she was running for one of the state’s two at-large Congressional seats, which she secured by 6,000 votes with the help of Montana’s newly enfranchised women.

Pacifism played a major role in the women’s suffrage movement, and Rankin was one of its staunchest champions. She took office amid intense debate over whether or not the U.S. should enter World War I. In the final vote, she was one of 50 dissenters, a decision that generated controversy in the media and among her fellow suffragists. Among the citizens of Montana, her move wasn’t as unpopular, but new state legislation changed the rules about Congressional elections, placing the Republican Rankin in an overwhelmingly Democratic district. Rankin decided to gamble on a Senate race in 1918, which she lost.

Although she wanted to be remembered as the congresswoman who voted for women’s suffrage nationwide, the 19th Amendment passed in 1919, when she was out of office. Instead, she immortalized her reputation as a pacifist when she was elected to a House seat a second time in 1940. After the attack on Pearl Harbor, she stood firm as the only vote against the United States entering World War II. Responding to nearly universal pleas to change her vote to a “yes,” or at least to abstain, she responded, “As a woman I can’t go to war, and I refuse to send anyone else.” The ethical stand she took drew a major backlash, making it difficult for her to do much with the rest of her term and obliterating her chances at reelection.

Frances Perkins, for her part, got her political start in 1911, when she witnessed the horrific Triangle Shirtwaist Factory fire in New York City. She made a name for herself on work-safety commissions in the city, helping to conceive and draft many fire regulations that still exist today. When Franklin Delano Roosevelt was elected governor of New York, he appointed her as his industrial commissioner. According to Downey, she was one of his most trusted, lifelong advisers. Yet her appointment as Secretary of Labor was still uncertain after he won the presidency, as her gender made her a highly controversial choice. When Roosevelt offered her the job, Downey says, she accepted with conditions—that he let her pursue the policy goals that would eventually make up the New Deal.

“She’s really the creator of Social Security. She’s the driving force behind the Fair Labor Standards Act. It led to the 40-hour work week. It banned child labor,” Downey tells TIME. “And it changed America.”

To be taken seriously, Perkins dressed very modestly, rarely wearing any makeup. She noted that men tended to take more seriously women who reminded them of their mothers—one of many observations she recorded in a journal she’d been keeping throughout her professional career titled, “Notes on the Male Mind.” She was quiet in meetings, lest she interrupt and be shouted down, or worse, bruise the ego of the man she sought to contradict. She later wrote:

“I tried to have as much of a mask as possible. I wanted to give the impression of being a quiet, orderly woman who didn’t buzz-buzz all the time. … I knew that a lady interposing an idea into men’s conversation is very unwelcome. I just proceeded on the theory that this was a gentleman’s conversation on the porch of a golf club perhaps. You didn’t butt in with bright ideas.”

Rankin and Perkins confronted obstacles that went beyond under-representation, from repressive stereotyping to an absence of bathroom facilities. When Rankin first took office, American women were three years away from having the vote guaranteed. Perkins herself was unable to vote for much of her early political career, until New York state approved suffrage in 1917.

“There was a lot of stigma attached to them. There was an unsavory association to unattended women,” Downey says. “Back then there was a saying about women, ‘You only want to be in the newspaper twice—when you’re married and when you die.'”

Perkins hoped her unquestionable success would earn her a comfortable professorship after her retirement from public life. Unfortunately, this wasn’t the case. She had trouble securing offers, moving from one university to another when she wasn’t granted tenure. Eventually, she ended up at Cornell, where she stayed until her death in 1965.

Meanwhile, though Rankin never again held public office, she used her notoriety as a pacifist to continue lobbying against war. She led a peace march in Washington in 1968 to protest U.S. involvement in Vietnam, and when she died at age 91 in 1972, she was considering yet another run for Congress.

TIME politics

Hillary Clinton’s E-Mail Trouble Started in 1997

Oct. 20, 1997, cover of TIME
Cover Credit: Patrick Demarchelier. The Oct. 20, 1997, cover of TIME

The former Secretary of State is in hot water over her e-mail usage while in that office. While First Lady, she resolved to overcome a fear of computers

Possible Presidential contender Hillary Clinton may have broken the e-mail rules during her time as Secretary of State, according to a new story in the New York Times. Clinton used her own personal e-mail account to conduct government business, the Times reports.

It’s not the first time Clinton’s e-mail has given her trouble — her use of personal e-mail accounts had been made public at least two years ago. And almost two decades ago, she made no secret of the fact that she was, as a TIME cover story about the then-First Lady put it, “computer illiterate.”

That particular story used the First Lady’s 50th birthday as a way to discuss the Baby Boom generation’s maturation: Clinton, newly an empty-nester, was re-examining her life and deciding where to go from there. One possible direction was online:

With Chelsea’s departure, the First Lady who mastered Game Boy has resolved to overcome her phobia of computers. Her chief of staff, Melanne Verveer, lately caught her thumbing through a book called Internet E-Mail for Dummies.

At the time, President Clinton said he imagined the couple retiring one day to sit on a beach as “old people laughing about our lives”; TIME commented that such a future was unlikely to satisfy his wife, who said that she would instead “go on to do something else that I find challenging and interesting.” Years later, there’s no doubt that she made good on that prediction. She may have even overcome her fear of computers. After all, by today’s standards when it comes to “Internet E-Mail,” most people in 1997 were pretty much dummies.

Read the 1997 cover story here, in the TIME Vault: Turning Fifty

TIME politics

Netanyahu Will Be Speaking in Winston Churchill’s Shadow

Netanyahu is only the second foreign leader to address Congress three times

A leader of a close U.S. ally arrives in Washington to speak before Congress for his third time, as relations between the two countries begin to fray.

That was British Prime Minister Winston Churchill in January 1952, making what TIME then called a “cautiously billed” visit to the United States to attempt to restore the close ties that had carried the U.S. and Britain through World War II.

The same description might also work for Israeli Prime Minister Benjamin Netanyahu, who addresses Congress on Tuesday, becoming only the second foreign leader to do so three times. The close relationship between Israel and the U.S. has been buffeted by Israeli policies in the West Bank (opposed by the White House) and by U.S.-led negotiations with Iran over its nuclear program (opposed by Netanyahu). Now, Netanyahu is hoping to convince Washington to see eye-to-eye with him on the Iranian nuclear threat.

Netanyahu has already been compared to Churchill by Republicans in Congress. “There is a reason that the adjective most often applied to Prime Minister Netanyahu with respect to Iran is Churchillian,” said Senator Ted Cruz on Monday. House Speaker John Boehner said he plans to give Netanyahu a bust of Churchill.

Here’s how Churchill handled the situation:

In 1952, the post-war state of affairs had brought with it a new set of grievances between Washington and London. What approach should be taken toward Communist China? Would the U.S. support British influence in the Middle East? Would Britain allow the U.S. to use bases in England for nuclear-armed flights against Russia? “But above all else was the fact that, in the time of her own financial and foreign-affairs crises, Britain had somehow lost touch with the U.S.,” TIME wrote in the Jan. 14, 1952 issue.

Still, Churchill faced a friendlier environment than Netanyahu might on Tuesday. While the Prime Minister did not share the same bond with President Truman that he had with Truman’s predecessor, Franklin D. Roosevelt, he was warmly received in Congress and he met personally with Truman. (Obama has declined to meet with Netanyahu, citing concern about influencing upcoming elections in Israel.)

In an article in the Jan. 28, 1952 issue, TIME reported on his entrance into the chamber: “The great man, bearing his 77 history-laden years with impassive dignity, walked slowly through the standing, clapping U.S. Congressmen. He had aged, of course, but Winston Churchill seemed hardly a shade less pink-cheeked, rocklike and John Bullish than when he spoke before the House and Senate during World War II.”

One of those speeches had been given nine years earlier, on May 19, 1943, when Churchill had spoken to Congress to provide a confident report on wartime progress and to pledge Britain’s support in the fight against Japan. It was “not one of Churchill’s greatest speeches,” TIME reported, “though any other orator might well have envied it.” The bar had been set high by his first appearance, on Dec. 26, 1941, when Churchill arrived in Washington to rally a disheartened nation that was still reeling from the Pearl Harbor attack three weeks earlier.

Wrote TIME:

Churchill arrived like a breath of fresh air, giving Washington new vigor, for he came as a new hero. Churchill—like Franklin Roosevelt, not above criticism at home —is, like Franklin Roosevelt in Britain, a man of unsullied popularity in his ally’s country…. There were tears in Winnie Churchill’s eyes at the ovation which greeted him, from isolationist and interventionist Congressmen alike. He shoved his thick, hornrimmed glasses over his nose, blinked, balanced himself like an old sailor. With a sly grin, he made his joke, established himself as one of the boys.

Then he let go: eloquence, blunt, polished and effective as an old knobkerrie, the growling, galling scorn for his enemies, the passages of noble purple for his friends. Between bursts of applause in which Supreme Court Justices and diplomats joined as lustily as doormen, the galleries wondered whether ever before had such a moving and eloquent speech been made on the Senate floor. Actually it was not so much the speech as the personality that put it over.

Though Churchill’s third speech was received less “lustily,” Netanyahu, who previously spoke to Congress in 1996 and 2011, might learn from the British Prime Minister’s performance that day. Despite the circumstances, and despite not accomplishing all his aims, Churchill’s visit in 1952 ultimately proved helpful.

“In spite of the very serious failure to make progress on Middle East policy,” TIME observed, “the Churchill visit was a success; it reversed the Anglo-American drift away from unity.”

Read TIME’s story about Churchill’s first speech to Congress: The U.S. at War; Great Decisions

TIME photography

Meet the Man Who Has Photographed Mount Rushmore for Eight Decades

The monument turns 90 years old on March 3. 'People change...but the mountain stays the same,' says Bill Groethe

Bill Groethe was only a baby when Congress first passed legislation authorizing the establishment of a monument to “America’s founders and builders” at Mount Rushmore in the Black Hills of South Dakota, on Mar. 3, 1925. When the work of carving began — an event celebrated by President Coolidge, who wore a cowboy outfit to the ceremony in 1927 — Groethe was too young to care very much.

But that didn’t last long. Groethe, who is now 91, grew up and still lives and works in Rapid City, S.D. He has seen the monument evolve over the years, and not just with his eyes: Groethe has been photographing Mount Rushmore since 1936.

“The first time I went up to the mountain as an assistant was in 1936 when Franklin Roosevelt was here to dedicate the Jefferson figure,” Groethe tells TIME. “I carried the film bag for my boss. I was 13 years old and I have pictures of me standing by the [president’s] limousine.”

Groethe, who grew up next door to the man who owned what was then his town’s only camera shop, got his first camera at age 10 and ended up working for the photographer Bert Bell by trading his labor for photo supplies. Bell had been sent to photograph South Dakota by the Chicago and Northwestern Railroad in order to drum up interest in tourism and ended up settling in Rapid City.

Courtesy of Bill Groethe Bill Groethe holds a camera during his time as a photographer for the Army Air Corps in WWII.

Groethe apprenticed for Bell beginning in 1935 and began to take his own photos with a folding Kodak in 1937. Groethe worked for Bell for another two decades (with the exception of three years during World War II, when he was a photographer for the Army Air Corps). In 1957, he opened up his own photography business. Groethe also ended up inheriting files from before his own time, of early Mount Rushmore construction; he has thousands of those negatives, from which he still makes prints.

All these years later, Roosevelt’s visit to Rapid City — the occasion for Groethe’s first trip up Mount Rushmore — ranks among his favorite memories of the monument. He remembers that people came from several nearby states to attend. TIME noted the following week that the crowd nearly doubled the town’s population. “At a signal from Sculptor Borglum’s daughter, his son, across the valley, dropped the flag, revealing an heroic head of Jefferson, 60 feet from crown to chin,” the magazine reported. “Simultaneously five dynamite blasts sent rock clattering down from the space where Lincoln’s face is to be carved.”

Courtesy of Bill Groethe Bill Groethe with his 8×10 camera in front of Mount Rushmore, c. 1990s.

What Groethe remembers of that day is a little different, though no less exciting. “When you’re 13 years old you’re thinking mostly of being lucky to have a job and get to go along and go up in the cable car,” Groethe says. “I continue to have that interest in the mountain, of course. It means a lot to me. I still get a good thrill out of seeing the mountain. It hasn’t changed much. People change and facilities change, but the mountain stays the same.”

Mount Rushmore has not been without its detractors. Some consider the mountain defaced, for reasons relating to the environment or to Native American traditions. But Groethe says that, in his experience, the arguments against the monument don’t take away from its grandeur.

“I can attest to the fact that when I sit at a table [at Mount Rushmore], as I have for the last almost 20 years every week for a day or two in the summer, I have people from Europe and all over Asia come and tell me that all their lives they’ve wanted to come and see Mount Rushmore,” he says. “It’s an international symbol of freedom.”

Read TIME’s original story about FDR’s trip to Rapid City, here in the TIME Vault: Roosevelt & Rain
