TIME remembrance

How Leonard Nimoy Almost Wasn’t Spock

Gene Roddenberry prevented what would have been a casting catastrophe

News of the death of actor Leonard Nimoy will invariably mention the role for which he was most famous, that of Spock on Star Trek. Nimoy and Spock have been mentioned in the same breath for almost exactly 50 years now, and for just as long he was loved for the role, even when he wasn’t actively involved in a Star Trek project (and even though he titled his first autobiography I Am Not Spock). In fact, the actor’s very first mention in the pages of TIME was in a 1975 article about how the show’s fan culture had picked up after the cancellation of the original series.

But that pairing of actor and role almost didn’t happen.

As TIME recounted in a 1994 cover story about Star Trek (around the time of Star Trek: Generations, the franchise’s seventh feature film, in which Nimoy did not appear), a lot of the original series’ DNA was added after the 1964 pilot displeased executives at NBC, who asked for casting changes before the show went to production—including the role of Spock, which Nimoy had played from the start. Thankfully, Gene Roddenberry stepped in to plead Nimoy’s case, and the network was convinced to keep him on.

In a tragic twist, the network also requested that Spock smoke a “space cigarette” in order to please a tobacco company that was one of the show’s sponsors. Roddenberry was able to intervene on that point as well, and surely Spock would approve: Nimoy, who died as a result of lung disease, urged his fans last year to quit smoking.

Read the full 1994 story, here in the TIME Vault: Trekking Onward

TIME movies

Why Hollywood’s Diversity Problem Can’t Just Be Solved with Fancy Award Ceremonies and Gold Statues

John D. Kisch—Separate Cinema Archive/Getty Images Publicity still of American actor Noble Johnson, 1920

For most of its history, Hollywood has worked hard to identify—and undermine—the work of black actors and filmmakers

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Last Sunday’s Oscars have once again renewed debates over Hollywood’s diversity problem. “Not surprising that an organization who’s 94% White & 77% Male doesn’t recognize diverse talent,” one critic tweeted before the ceremony, using the #OscarsSoWhite hashtag that first trended last month, after the Academy announced its all-white list of nominees for best actor and actress, and snubbed director Ava DuVernay. Meanwhile, supporters of Mexican filmmaker Alejandro González Iñárritu, who won for Best Director and Best Picture, argued that Hollywood was at least making progress. Iñárritu’s awards proved “compelling stories can be told by diverse talent,” Jack Rico wrote on NBC’s website the following day.

But recognizing black, Latino, and Asian talent has never been Hollywood’s problem. Hollywood has seldom overlooked the abilities of promising non-white filmmakers. In fact, for most of its history, Hollywood has worked hard to identify—and undermine—their work, which has been more detrimental to African American film than any Oscar snub. Keen to maintain its control over global film production, Hollywood wielded its political connections and economic might to establish systems that prevented independent black filmmakers from distributing their movies. When black filmmakers overcame these challenges, Hollywood responded by co-opting black cinema’s most marketable genres and directly competing with independent black film producers.

This history reaches back more than a century. When members of the first cohort of powerful American film producers, the Motion Picture Patents Company (MPPC), built up a national film market, they avoided offending their white audiences and censors in the South. That meant blacks wouldn’t be treated as equals either behind the camera or onscreen. Hollywood’s early producers were not members of the MPPC, but they gladly embraced and eventually strengthened these business policies as they battled their way to the top. When the first Hollywood blockbuster, Birth of a Nation, debuted—a hundred years ago this month—Hollywood was already unmistakably invested in pleasing its white audiences at the expense of African Americans.

Fortunately, African Americans had their own cinema. It’s a little-known fact, but long before the rise of Hollywood or better-known black filmmakers like Oscar Micheaux, black men and women began producing their own films. They developed sophisticated editing techniques, and invented new technologies for exhibiting motion pictures. In my book Envisioning Freedom: Cinema and the Building of Modern Black Life, I describe how African Americans such as Harry A. Royston toured the country in the 1890s with film exhibitions “put together to please a colored audience.” Just a few years later, filmmakers like Mr. and Mrs. Conley, and William G. Hynes produced motion pictures about black progress. These pioneers of black cinema were the children of former slaves, or were born into slavery themselves. Their motion pictures broadcast ideas about black progress and raised money for black churches and other institutions dedicated to the mission of “racial uplift.” By the early 1900s, African American film could be found throughout the country.

Hollywood studios were suspicious of any threat to their markets. With few exceptions, early Hollywood producers were unwilling to invest in black film, but they still wanted to lock out any competition. To do so, Hollywood played dirty. Hollywood studios forced theaters that wanted to screen their films into “block booking,” which meant the theater could only screen films by their production houses. Later, the big players, including Paramount, Universal, and Fox, directly purchased their own theaters and conspired to corner the market by marginalizing the opportunities of independent producers to distribute their pictures, and by squeezing the profits of “second run” theaters—the only places that exhibited independent black films.

Independent black filmmakers continued to produce movies, but found themselves boxed in. To grow into an industry that could produce big-budget feature films, black filmmakers would need bigger distribution markets. But as Hollywood tightened its grip on the channels of film distribution, filmmakers like Oscar Micheaux found it impossible to place their movies in enough theaters to earn back their money. The Supreme Court eventually ruled that Hollywood’s monopolistic practices violated US antitrust laws, but not before hundreds of independent black film companies had been destroyed.

In other cases, Hollywood muscled out black independents by making their most bankable actors sign non-competition agreements. In 1917, Noble Johnson, an African American actor who played Native American, Latino, and Asian characters in Hollywood movies, co-founded the Lincoln Motion Picture Company. He produced and starred in three films before Universal demanded he disassociate himself from Lincoln Pictures or never work for Universal again. Johnson, who relied on his earnings from Universal to help finance his venture with Lincoln, had little choice but to resign. Johnson had been Lincoln Pictures’ main draw, and his departure sounded the death knell for the company.

Despite the challenges that independent black producers faced, they proved there was a market for “race films.” Hollywood producers, having established a national (white) market for their films, began paying attention to the audiences they had ignored for decades. In the late 1920s, a growing number of Hollywood studios began producing “race films”; others toned down the virulent racism in their own films, and replaced white actors in blackface makeup with more African American performers. When the Great Depression hit, Hollywood, strapped for profits, doubled down on its efforts to win over black audiences. The industry was still unwilling to offend the South, but after decades of excluding African American actors, Hollywood producers could pitch featured roles as maids and butlers as “progress.” The 1939 film Gone with the Wind, and black actress Hattie McDaniel’s Academy Award for best supporting actress, exemplified Hollywood’s new inclusivity.

Hollywood’s strategies in Mexico haven’t been all that different from its efforts to squelch independent black film in the US. From World War I, when US films first came to dominate Mexico’s film markets, to NAFTA, the industry has relied on its powerful lobbies, tactics like block booking, and the recruitment of talented Mexican actors and filmmakers to work on Hollywood films. None of this, of course, is any secret. “Freed of fences and trade spikes, more folks in foreign countries will want to buy what Americans make and market,” Jack Valenti, former president and CEO of the Motion Picture Association of America (MPAA), wrote in support of NAFTA in 1993. Today, Hollywood controls about 90% of Mexico’s box office.

Without a doubt, Hollywood has a diversity problem, but one that can’t just be solved with fancy award ceremonies and gold statues. Above all, Hollywood is an industry motivated by profits, with a century-long history of aggressive and monopolistic business practices. So next time the Academy hands out its awards, we should remember to ask ourselves–who’s really winning the prize?

Cara Caddoo is the author of Envisioning Freedom: Cinema and the Building of Modern Black Life (Harvard University Press, 2014). She teaches at Indiana University, Bloomington.

TIME Music

Jewel’s Pieces of You at 20: What TIME Said About the Album

Cover Credit: HERB RITTS The July 21, 1997, cover of TIME

The record was released on Feb. 28, 1995

Twenty years after its release, Jewel’s album Pieces of You—featuring hits like “Who Will Save Your Soul” and “Foolish Games”—seems like a key piece of 1995-iana.

But, in 1995, when the album dropped, what TIME had to say about it was…nothing.

As the magazine noted when the singer-songwriter made the cover a few years later, the record that had by then sold more than 5 million copies had at first gone nowhere. It took years of touring for word of mouth to make the difference. By that time, she wasn’t just a star in her own right. She was the face of a trend, as female singers caught the attention of the country. As TIME put it:

This summer female pop stars are clearing out space for themselves, and the season’s usual sea of masculinity is parting. The debut CD by Alaskan pop-folkie Jewel, Pieces of You (Atlantic), has sold more than 5 million copies and is still riding high on the charts. Erykah Badu, with her poetry-slam soulfulness, has sold more than 1 million copies of her brilliant new CD Baduizm (Kedar Entertainment/Universal) and is a headliner on this summer’s neo-soul Smokin’ Grooves Tour. And Canadian singer-songwriter Sarah McLachlan has masterminded the summer’s most talked-about musical event: Lilith Fair, a traveling show featuring a rotating lineup of 61 female singer-songwriters, including Cassandra Wilson, Tracy Chapman, Fiona Apple, Paula Cole, Jewel and McLachlan herself. There’s a different melody in the air: macho is out; empathy is in. “People want to be given hope,” says Atlantic Records senior vice president Ron Shapiro, “and these female artists are giving young people a life preserver.”

Read the full 1997 Jewel cover story, here in the TIME Vault: Jewel and the Gang

TIME conflict

Who Started the Reichstag Fire?

FPG / Getty Images Firemen surveying the ruins following the Reichstag fire in Germany, 1933.

On Feb. 27, 1933, the building was destroyed — and no matter who did it, the Nazis got what they wanted

It’s a semi-mystery that’s over eight decades long: who set fire to the Reichstag, the German parliament, on Feb. 27, 1933?

As described in the Mar. 6, 1933, issue of TIME, the arson came amid “a campaign of unparalleled violence and bitterness” by then-Chancellor Adolf Hitler, in advance of an approaching German election, and it turned a building that was “as famous through Germany as is the dome of the Capitol in Washington among U. S. citizens” into “a glowing hodge-podge of incandescent girders.”

Marinus van der Lubbe, an unemployed Dutch bricklayer linked to the Communist party, was tried and executed for the crime the following year, but even then TIME questioned whether the Nazis who held him responsible were also the ones who had paid him to set the fire, “promising to save his neck by a Presidential reprieve and to reward him handsomely for hiding their identity and taking the whole blame in court.”

In 1981, a West Berlin court declared that the trial had been “a miscarriage of justice,” though it stopped short of saying that he had been innocent. In 2001, evidence emerged that the conspiracy theory had been right all along, with historians announcing that the Nazis had been the ones responsible for the fire, though even then others disagreed — and, as recently as 2014, the United States Holocaust Memorial Museum noted that “the origins of the fire are still unclear.”

But, while van der Lubbe’s life still hung in the balance, reporting on the aftermath of the fire made clear that, whoever set the spark, the outcome had already been determined by Nazi powers, in their own favor. Here’s how TIME summed it up just a week after the original report on the fire:

Before German Democracy could thus be downed this week, the Hitler Cabinet had to launch last week a juggernaut of super-suppressive measures & decrees for which they needed an excuse. What excuse could be better than the colossal act of arson which had just sent a $1,500,000 fire roaring through the Reichstag Building […] gutting completely the brown oak Reichstag Chamber and ruining its great dome of gilded copper and glass.

The Reichstag fire was set by Communists, police promptly charged. Over a nationwide radio hookup the Minister of Interior for Prussia, blustering Nazi Captain Hermann Wilhelm Göring, cried: “The Reichstag fire was to have been the signal for the outbreak of civil war! … The Communists had in readiness ‘terror squads’ of 200 each … These were to commit their dastardly acts disguised as units of our own Nazi Storm Troops and the Stahlhelm … The women and children of high Government officials were to have been kidnapped as hostages and used in the civil war as ‘living shields’!…

“The Communists had organized to poison food … and burn down granaries throughout the Reich … They planned to use every kind of weapon—even hot water, knives and forks and boiling oil!…

“From all these horrors we have saved the Fatherland! We want to state clearly that the measures taken are not a mere defense against Communism. Ours is a fight to the finish until Communism has been absolutely uprooted in Germany!”

The “juggernaut” of new decrees included increasing the weaponry provided to Nazi troops (in violation of the Treaty of Versailles) and the transfer of the majority of state powers from President Paul von Hindenburg to Hitler and his cabinet. Rights ensured by the German constitution were suspended, and a gag rule was placed on foreign journalists within the country, with severe punishments for violation. The German government was moved from Berlin to Potsdam. Within the month, TIME reported that nearly all of the country’s leading Communists and Socialists were in jail. By April, Nazis were using the threat of another fire to ensure the passage of the Enabling Act, which solidified Hitler’s place as dictatorial leader for years to come.

Whether Nazi involvement in the Reichstag fire was direct or indirect or, improbably, nonexistent, the result was the same.

TIME Music

The Canadian Heartthrob Who Had Girls Screaming Long Before Justin Bieber

As Justin Bieber turns 21 on March 1, a look back at the career of Paul Anka, the Canadian export who stopped young girls’ hearts half a century ago

Comparing anyone to Justin Bieber is a risky endeavor. It exposes one to the wrath of the Beliebers, who clutch their Bieber-emblazoned iPhone cases ready to Tweet angrily at those who dare suggest their idol is anything but peerless. It also risks offending the subject of comparison, who may not deem the singer’s company desirable.

But we feel we are on safe ground in this comparison. Bieber is not the first Canadian teen idol to make teenage hearts the world over skip a beat. Anyone would do well to learn the name Paul Anka.

Though he rose to fame half a century before Bieber’s shaggy mop-top cropped up on YouTube, Anka was, like Bieber, a young talent who came of age amidst the admiration of adoring fans. As a 1960 LIFE profile titled “Paul Anka, Kids’ Wonder Singer” put it, the 19-year-old had “the look of a small boy trying to become a grownup.” To demonstrate this point, the article described a shopping trip in New York during which Anka purchased a platinum watch from Tiffany’s before buying a haul of toys from F.A.O. Schwarz.

Anka was born in Ottawa in 1941 to parents of Syrian and Lebanese descent. He told LIFE that the Arabic music that flowed through his home was one of his earliest musical influences. After performing with friends around town, he began playing country clubs at 14, recorded a few singles and got increasing play on Canadian radio. When a local disc jockey called his father to tell him, “your boy Paul is too big for Canada,” they went to New York to launch a full-scale operation on the American front.

Anka’s music—which featured his smooth voice, backed by violins, singing about puppy love and goodnight kisses—was “admired by teen-agers but almost unknown to oldtimers over 20.” And his young fans’ appetite for it was voracious. “The girls threw their panties on the stage,” Anka told LIFE of one 1958 show. For a show in Japan, 2,000 fans spent a day standing outside in a typhoon waiting to buy standing room only tickets.

But the life of a teen idol can be surprisingly lonely, and Anka, who wrote his own music, occasionally worked these darker feelings into his songs. He later explained that his hit “Lonely Boy” stemmed from the isolation of traveling amidst a sea of adoring fans whom he never got close to. Similarly, “Put Your Head On My Shoulder” was inspired by looking out at a sea of teens canoodling at his concerts and then going to his hotel room to eat dinner alone.

The transition from teen heartthrob to adult entertainer is rarely seamless, and it proved difficult for Anka. Though he began performing in adult clubs like the Copacabana, this led to a drop-off in interest from teens, which was not fully replaced by adult record sales. A writer and composer above all, Anka was able to fall back on these skills as his bread and butter when the girls stopped throwing their panties. He penned tunes for Buddy Holly and Connie Francis, and later co-wrote Michael Jackson’s “This Is It,” released after Jackson’s death.

Anka’s career—his most recent album was released in 2013 and he’s currently on tour at age 73—proves that there is life after teen idoldom, even if it takes a slightly different form. Beliebers, take note.

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter at @LizabethRonk.

TIME politics

How a Little-Known Supreme Court Case Got Women the Right to Vote

MPI / Getty Images A poster, published by the League of Women Voters, urging women to use the vote which the 19th amendment gave them, from circa 1920

Happy birthday, Leser v. Garnett

Pop quiz: when did women in the United States get the right to vote?

If you answered June 4, 1919, or Aug. 18, 1920 — the dates on which the 19th Amendment was passed and ratified — then you’re almost right. Yes, the Amendment guaranteed that the right to vote could not be denied on account of sex. But the right wasn’t fully secured until this day, Feb. 27, in 1922. That’s when the Supreme Court decided Leser v. Garnett.

Here’s what the case was about: Two Maryland women registered to vote a few months after the 19th Amendment passed. Oscar Leser, a judge, sued to have their names removed from the voting rolls, on the grounds that the Maryland constitution said only men could vote, and that Maryland had not ratified the new amendment to the federal constitution — and in fact, Leser argued, the new amendment wasn’t even part of the constitution at all. For one thing, he said, something that adds so many people to the electorate would have to be approved by the state; plus, some of the state legislatures that had ratified the amendment didn’t have the right to do so or had done so incorrectly.

The Supreme Court found that both arguments flopped: the 15th Amendment, which had granted suffrage to male citizens regardless of race, had held up despite a comparable change to the electorate, and the ratification powers Leser questioned had in fact been granted by the Constitution. (And in a few states where things were iffy, it didn’t matter because enough other states had ratified.)

So, while the 19th Amendment granted women the right to vote, Leser made sure that the right could actually be used, even where the state constitution said otherwise. It’s not one of the more famous Supreme Court decisions in American history, but without it the electorate would be, well, lesser.

TIME Transportation

Why a JetBlue Tweet About ‘Bluemanity’ Was Controversial

LZ-129 Disaster
Sam Shere—Getty Images The Hindenburg disaster at Lakehurst, New Jersey, in 1937

A tongue-in-cheek tweet from the airline didn't go over well. The reason goes back to 1937

JetBlue apologized and deleted a “not well thought-out” tweet on Thursday, after some of the airline’s followers noted that the pun in the tweet—“Oh, the Bluemanity!”—was a reference to one of the 20th century’s worst air-travel disasters. But, though you might expect better from someone who works in the air-travel industry, it’s also easy to see how a social-media writer on the lookout for words that rhyme with blue might not have thought about the implications of this particular pun. After all, the disaster to which it refers took place long before commercial aviation in planes was a common travel option.

Here’s the reason why “bluemanity” caused a controversy:

In 1937, travel by dirigible—a means of transportation that uses lighter-than-air gas to stay up—was thought to be the safest way to go. The Hindenburg, one such dirigible, had been making the voyage back and forth between Frankfurt and New Jersey for months already. Its captain had made nearly 200 transatlantic flights. It was, according to a TIME report that year, Nazi Germany’s “greatest transport pride.” Everything was going fine during its first 1937 trip to the U.S., until it came time to land on May 6.

For reasons that were not immediately clear—some suggested sabotage, though static electricity has proven more likely—the hydrogen with which its balloon was inflated caught fire. All 803 ft. of it burned up in about 32 seconds, killing dozens of passengers and crew members. It was, at that point, the worst accident in the history of commercial aviation.

Meanwhile, New Jersey radioman Herbert Morrison was recording a transcription of the landing that would be broadcast the next day; because it wasn’t the first such landing, it wasn’t a big enough deal to cover live. As he narrated what he saw, the poetic tone (it was “like a great feather” at first) turned panicked. TIME reported his words the following week:

“It is practically standing still now. The ropes have been dropped and they have been taken hold of by a number of men on the field. It is starting to rain again. The rain had slacked up a little bit. The back motors of the ship are holding it just enough to keep it—

“Get out of the way! Get this—Charley, get out of the way please! It is bursting into flames. This is terrible! This is one of the worst catastrophes in the world! The flames are 500 ft. into the sky. It is a terrific crash, ladies and gentlemen. It is in smoke and flames now. Oh, the humanity! Those passengers! I can’t talk, ladies and gentlemen! Honest, it is a mass of smoking wreckage. Lady, I am sorry. Honestly, I can hardly—I am going to step inside where I can’t see it. Charley, that is terrible! Listen, folks, I am going to have to stop for a minute because I have lost my voice.”

Morrison’s exclamation—”Oh, the humanity!”—became famous, and hydrogen-filled airships became a matter of history too, as that was pretty much the end of their commercial use. The only upside was that it was also the end of Nazi transportation pride, as Germany did not have any helium, which would have been a safer gas to use for flight.

“The Hindenburg represented the world and for that reason our eyes lighted when we saw its silver grandeur in the sky,” wrote columnist Dorothy Thompson, according to TIME. “It contended with another world which might make it at any moment an object of terror and of hatred.”

Read the full story here in the TIME Vault: “Oh, the Humanity!”

TIME politics

The Conservative Case for Legalizing Marijuana

William F. Buckley Jr.
Truman Moore—The LIFE Images Collection/Getty William F. Buckley Jr., riding in airplane en route to Washington DC, in 1965

American conservatives haven't always opposed legalizing pot

The United States’ latest skirmish in the battle over marijuana laws is still ongoing and, for lawmakers, it hits close to home. On Thursday, possession of a limited amount of the drug became legal for adult residents of Washington, D.C. — but, thanks to the intervention of a group of Congressmen, there’s still no way to legally buy it or sell it there, which may lead to the development of a “free weed economy.”

The legislative action taken to stop the District from developing a monetary economy for pot has broken down along party lines, with Republican lawmakers against the change in stance toward the drug and Democrats urging the city to go ahead.

It may seem like a natural thing for conservatives to be, well, conservative about changing drug laws — polls have shown that Republicans are much less likely than Democrats to support legalization — but that wasn’t always the case. In fact, there was a time during the 1970s when the nation’s leading conservative voices spoke out on behalf of legalizing marijuana, for many of the same reasons that advocates of legalization cite today.

At that time, in late 1972, a large study from the nonpartisan Consumers Union had just come out, urging legalization, as well as government-supported treatment for addictions to other substances. The report found that it was too late for law enforcement to keep pot from becoming part of American culture — and, surprisingly, its authors weren’t the only ones to think so, as TIME reported that December:

…American conservatives may have arched their eyebrows well above the hairline when they glimpsed the latest issue of William F. Buckley Jr.’s staunchly nonpermissive National Review. There on the cover was the headline: THE TIME HAS COME: ABOLISH THE POT LAWS. Inside, Richard C. Cowan, a charter member of the conservative Young Americans for Freedom, sets forth his arguments that the criminal penalties for marijuana possession and use should be stricken from the books. Cowan contends that pot is comparatively harmless, demonstrably ubiquitous and that the laws against it only alienate the young and breed disrespect for American justice.

The attitude was a shift for Buckley, who in 1971 testified against loosening penalties but wrote in 1972 that he agreed with Cowan. “It seems, in fact, that Buckley has smoked grass himself—but only on his sailboat, outside the three-mile limit,” TIME noted. “His verdict: ‘To tell the truth, marijuana didn’t do a thing for me.'”

See the full story, here in the TIME Vault: Concerning Pot and Man at The National Review

TIME People

Why Napoleon Probably Should Have Just Stayed in Exile the First Time

Print Collector/Getty Images An illustration of Napoleon I, Emperor of France, in exile.

Feb. 26, 1815: Napoleon escapes from Elba to begin his second conquest of France

For the man with history’s first recorded Napoleon complex, it must have been the consummate insult. After Napoleon Bonaparte’s disastrous campaign in Russia ended in defeat, he was forced into exile on Elba. He retained the title of emperor — but of the Mediterranean island’s 12,000 inhabitants, not the 70 million Europeans over whom he’d once had dominion.

Two hundred years ago today, on Feb. 26, 1815, just short of a year after his exile began, Napoleon left the tiny island behind and returned to France to reclaim his larger empire. It was an impressive effort, but one that ended in a second defeat, at Waterloo, and a second exile to an even more remote island — Saint Helena, in the South Atlantic, where escape proved impossible. And he didn’t even get to call himself emperor.

From this new prison perspective, he may have missed Elba. After all, as much as he hated the idea of his reduced empire, he didn’t seem to dislike the island itself. His mother and sister had moved there with him, and they occupied lavish mansions. According to a travel writer for the Telegraph, “Though his wife kept away, his Polish mistress visited. He apparently also found comfort in the company of a local girl, Sbarra. According to a contemporary chronicler, he ‘spent many happy hours eating cherries with her.’”

It was easy to believe — until he fled — that he meant what he said when he first arrived: “I want to live from now on like a justice of the peace.” He tended to his empire with apparent gusto, albeit on a smaller scale than he was used to. In his 300 days as Elba’s ruler, Napoleon ordered and oversaw massive infrastructure improvements: building roads and draining marshes, boosting agriculture and developing mines, as well as overhauling the island’s schools and its entire legal system.

The size of the island, it seemed, did not weaken Napoleon’s impulse to shape it in his own image. The title of emperor brought out the unrepentant dictator in him, so confident in his own vision that, as TIME once attested, he “never doubted that [he] was wise enough to teach law to lawyers, science to scientists, and religion to Popes.”

When a collection of Napoleon’s letters was published in 1954, TIME noted that his “prodigious” vanity was most apparent in the letters he’d written from Elba, in which “he referred to his 18 marines as ‘My Guard’ and to his small boats as ‘the Navy.’ ”

The Elbans seemed to think as highly of their short-lived emperor as he did of himself. They still have a parade every year to mark the anniversary of his death (on May 5, 1821, while imprisoned on his other exile island). And, as TIME has pointed out, “not every place that the old Emperor conquered is so fond of his memory that they annually dress a short man in a big hat and parade him around…”

Read TIME’s review of a collection of Napoleon’s letters, here in the archives: From the Pen of N

TIME curiosities

How Sword Swallowing Contributed to Modern Medicine

On World Sword Swallower's Day, practitioners of the ancient art raise awareness that their tradition is more than a circus sideshow

This weekend, spectators will gather at a dozen Ripley’s Believe It or Not! Odditoriums across America to watch performers stick swords down their throats, through their esophageal sphincters and into their stomachs. According to the Sword Swallowers Association International, World Sword Swallower’s Day exists to celebrate the ancient art, dispel myths and “raise awareness of the contributions sword swallowers have made in the fields of science and medicine.”

If that last bit is a little hard to swallow, chew on this historical nugget: The first endoscopy of the upper gastrointestinal tract, or esophagoscopy, was performed on a sword swallower in 1868 by the German physician Adolph Kussmaul. Frustrated that he could not see far enough into the esophagus of a patient with a tumor, Kussmaul turned to a sword swallower, whose practiced gullet let him see all the way into the stomach. The subject swallowed a 47-centimeter tube, which Kussmaul looked through using a laryngeal mirror and gasoline lamp.

Electrocardiography also owes a debt to the sword swallowing community, as the first electrocardiogram of the esophagus used a sword swallower as a test subject in 1906. The physician, M. Cremer, also a German, inserted an electrode into the sword swallower’s esophagus in order to record his heart activity.

The nineteenth- and twentieth-century medical contributions of sword swallowers are a fortuitous byproduct of the practice, which dates back to 2000 B.C.E. It began in ancient India, where it was performed, like firewalking, as a test of courage and a demonstration of faith. The practice gradually spread across Asia and Europe, morphing over the course of centuries from religious rite to street entertainment.

What was once a widespread global phenomenon is now a dwindling profession, with the SSAI estimating that no more than a few dozen professional sword swallowers are still performing. But those active in the small community insist that anyone inclined to write them off as a circus sideshow acknowledge their contributions to the annals of medicine. So the next time a doctor looks inside your body through a tube, thank a sword swallower.
