TIME Holidays

The Dark History of Fireworks

Fourth Of July
Hulton Archive / Getty Images circa 1960: An American flag flies on a flagpole while fireworks explode in the background

Explosions are a great way to celebrate Independence Day—and also a great way to get hurt

A spectacular pyrotechnics display almost never disappoints. So, the week before Independence Day in 1964, New Yorkers with a view of the Hudson River were delighted by a preview of the fireworks they weren’t expecting until days later. Assuming it was a planned lead-up to the Macy’s Fourth of July show, they clapped and cheered, per TIME — and only later learned that they’d witnessed the accidental eruption of a barge full of fireworks, which killed two crewmembers and injured four others.

“So thoroughly institutional have fireworks become these days that the postwar generations hardly think of them as dangerous,” TIME lamented.

But Americans had been downplaying their dangers for more than a century by the time of the barge explosion. Fireworks first became a Fourth of July fixture in the mid-1800s, according to Fireworks, Picnics, and Flags: The Story of the Fourth of July Symbols. These early incendiaries were unrestricted and widely available to the general public, who came up with inventive and sometimes inhumane ways to use them: throwing them at horses, for example, or putting them under milk bottles and flowerpots to create explosive bursts of dangerous shrapnel.

Not everyone saw the patriotism in blowing things up, of course. The book excerpts a Pennsylvania man’s diary entry from Independence Day, 1866:

July 4th is the most hateful day of the year, when the birth of democracy is celebrated by license and noise. All last night and all of today, the sound of guns and firecrackers around us never stopped. It is difficult to feel patriotic on the Fourth of July.

Statistics offer an even grimmer snapshot of the harm done by an unregulated fireworks industry: Over the course of five consecutive Fourths, from 1903 to 1907, 1,153 people were killed and 21,520 more were injured, per the Fireworks authors.

Those numbers have declined over the years, although they’re still high enough for alarm. In 2013, the worst Fourth for fireworks casualties in over a decade, more than 11,000 people were injured and eight were killed, either from head and chest trauma or in house fires resulting from the blasts, per the Washington Post.

In recent years, the most devastating explosions have occurred where fireworks are manufactured and stored. In 2000, for example, a Dutch fireworks factory blew up with such force that it leveled 400 houses, killing 17 people and injuring more than 900, according to TIME. Still, it’s worth remembering that a more complete Fourth of July wish might be for the holiday to be not just happy, but also safe.

Read the full account of the 1964 fireworks accident, here in the TIME archives: Safe & Sane

TIME Movies

Read TIME’s Prescient Review of Back to the Future

Michael J Fox In 'Back To The Future'
Universal Pictures/Getty Images Michael J Fox walking across the street in a scene from the film 'Back To The Future', 1985.

The movie was released on July 3, 1985

Turning the big 3-0 is always a big deal, but for Back to the Future it’s particularly so. After all, 30 years is the time span that sets the whole movie in motion: Marty McFly travels three decades back in time from 1985 to 1955. Now, on July 3, 2015, he’ll have made it just that many years into the future. (Or, rather, the movie will have made it: the 2015 date to which Marty zooms in the movie’s sequel won’t roll around until October.)

Looking back at TIME’s original review of the movie classic (a two-fer that paired BttF with The Goonies), it’s clear that the story’s charm registered immediately, and that critic Richard Corliss had his finger on the pulse, or at least his foot on the gas of the film-criticism DeLorean. Surveying the plot, he ventured a guess at how the movie might fare by 2015:

The choice of year is canny, for 1955 is close to the historical moment when television, rock ‘n’ roll and kids mounted their takeover of American culture. By now, the revolution is complete. So the child of 1985 must teach his parents (the children of 1955) how to be cool, successful and loved. When they learn it — when the Earth Angel meets Johnny Do-Gooder — the picture packs a wonderful wallop. But Back to the Future goes further: this white ’80s teenager must teach black ’50s musicians the finer points of rock ‘n’ roll. Out-rageous! After a thunderous heavy-metal riff, Marty stares at his dumbfounded audience and shrugs, “I guess you guys aren’t ready for that yet. But your kids are gonna love it.” You bet, Marty. You and your whole movie. Now and for 30 years to come.

At this point, we don’t need a flux capacitor to guess that, another 30 years from now, that prediction will still hold true.

Read the full review, here in the TIME Vault: This Way to the Children’s Crusade

TIME Hong Kong

The British Once Considered Moving the Entire Population of Hong Kong to Northern Ireland

Getty Images

An official at the Northern Ireland office was inspired by a university lecturer's proposal to "transplant" Hong Kong to Northern Ireland

(LONDON) — A bizarre plan to relocate the entire population of Hong Kong to Northern Ireland was considered in the uncertain years before Britain handed the colony back to Chinese rule, formerly classified government files showed.

Britain’s National Archives on Friday released a 1983 government file called “Replantation of Northern Ireland from Hong Kong,” which showed British officials discussing a far-fetched proposal to settle 5.5 million Hong Kong people in a newly built “city state” between Coleraine and Londonderry.

George Fergusson, an official at the Northern Ireland office, was inspired by a university lecturer’s proposal to “transplant” Hong Kong to Northern Ireland — a move that would supposedly revitalize the local economy as well as save Hong Kong, which the lecturer believed had “no future on its present site.”

“At this stage we see real advantages in taking the proposal seriously,” Fergusson wrote in a memo to a colleague in the Foreign Office.

While it wasn’t clear whether Fergusson was writing tongue-in-cheek, the droll reply he received showed that the proposal was not taken seriously.

“My initial reaction … is that the proposal could be useful to the extent that the arrival of 5.5 million Chinese in Northern Ireland may induce the indigenous peoples to forsake their homeland for a future elsewhere,” quipped David Snoxell at the Republic of Ireland Department. “We should not underestimate the danger of this taking the form of a mass exodus of boat refugees in the direction of South East Asia.”

An official scribbled in the margins: “My mind will be boggling for the rest of the day.”

Though outlandish, the idea illustrated anxieties at the time about the future of Hong Kong. Prime Minister Margaret Thatcher began talks with China on the topic in 1982. Two years later, the two sides agreed that the city would return to Chinese rule in 1997.

TIME Opinion

How the Declaration of Independence Can Still Change the World

Declaration of independence 1776 from the Congress of Representatives. Signed by John Hancock, President of the Congress
Universal Images Group / Getty Images Declaration of independence

The key is that its language is inclusive

Three weeks ago Britain observed the 800th anniversary of the Magna Carta, the charter of liberties King John was forced to issue to his barons in 1215. Most contemporary commentaries took the opportunity to point out how far short that document fell of modern principles of justice. It benefited only the great nobles, not the common people; it was not, in any case, fully put into effect for a long time; and it contained some provisions, such as those relating to Jews, reflecting medieval prejudices. As the Fourth of July rolls around once again, some commentators will undoubtedly make similar points about the Declaration of Independence. Yes, the Declaration declared that “all men are created equal,” but it thereby left the female half of humanity out of account. It said nothing about slavery, which then existed in every colony and obviously contradicted its principles. It referred to “merciless Indian savages” whom the King had incited against the colonists. In short, the authors and signatories of the Declaration did not use the language that is fashionable in the 21st century, and thus it is a relic from another time that is irrelevant to our world today.

That view misses two very important points. Earlier generations have revered both Magna Carta and the Declaration because they were critical milestones in the development of modern ideas of liberty and government—milestones that can only be understood in the context of their own times, not according to 21st-century views. More importantly, the authors of the Declaration used universal language which has inevitably led to the extension of the rights and freedoms they championed to more and more of humanity. That language is why the Declaration of Independence still has the power to inspire progress.

Because we have taken the principles of the declaration for granted for so long, we must remind ourselves of how revolutionary they were in 1776. It was “necessary,” Thomas Jefferson and the others wrote, “to dissolve the political bonds” which had connected the Americans and the British, because the royal government no longer met the standards for just and effective government that they themselves were defining. The colonists were acting, they wrote, in the face of “a long train of abuses and usurpations,” acts by the King that in their opinion violated the long-standing principles of British law that had developed over the centuries, and especially since the Glorious Revolution of 1688, during which parliamentary control over the Throne was solidified. The King had refused to allow colonial governments to function properly. He had sent troops to the colonies to enforce his will, and quartered those troops among the population. He had tried to deprive large numbers of people of the right to elect legislators, and much more. But his government, like all governments, did not exercise power by divine right; it ruled legitimately only insofar as it respected established principles and traditions of liberty. That idea was shortly to set not only the colonies, but much of the western world, aflame.

In its most famous passage, the declaration asserted the ultimate authority of human reason. “We hold these truths to be self-evident,” it said: “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.” Yes, it is true that the colonies had not extended, and for many years would not extend, all those rights to poorer men, or indentured servants, or slaves, but their language made no exception for any of those categories. Thus, the Declaration established a contradiction between its principles and existing conditions in the 18th-century world. That contradiction was bound to lead to further political struggles. So, although the Founding Fathers referred to “all men” (the Constitution, written 11 years later, generally referred more broadly to “persons”), it was equally inevitable that women would claim their rights as well, and that the logic of the founders’ language would allow that progress, too.

No one understood this better than Jefferson himself. Fifty years later, in the spring of 1826, he was invited, along with the few other surviving signatories, to attend a celebration of the signing in Washington. He began his reply by regretting that illness would not permit him to attend. (Indeed, his remaining ambition was simply to survive until July 4, which is exactly what he and his fellow signatory John Adams managed to do.) Yet he proclaimed the enduring significance of the declaration he had drafted:

“May it [the declaration] be to the world, what I believe it will be, (to some parts sooner, to others later, but finally to all), the signal of arousing men to burst the chains under which monkish ignorance and superstition had persuaded them to bind themselves, and to assume the blessings and security of self-government. That form which we have substituted, restores the free right to the unbounded exercise of reason and freedom of opinion. All eyes are opened, or opening, to the rights of man. The general spread of the light of science has already laid open to every view the palpable truth, that the mass of mankind has not been born with saddles on their backs, nor a favored few booted and spurred, ready to ride them legitimately, by the grace of God.”

And so it was, through most of the rest of the 19th and 20th centuries, on every continent.

The struggle for these principles, however, has proven to be an enduring one. In much of the world reason is once again fighting with superstition, and finds itself in retreat. In our own nation, inequality threatens to create a new aristocracy that will ride upon the backs of the masses. The principles and language of the declaration remain by far the best defense against oppression and superstition. Most importantly of all, it is only upon the basis of impartial principles that new coalitions for justice can form. The Declaration of Independence remains a precious part of our heritage—one which we simply cannot do without.

The Long View: Historians explain how the past informs the present

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

 

TIME Food & Drink

This Graphic Shows How Many Hot Dogs It Takes to Win the Nathan’s Eating Contest

Nathan's Hotdogs
George Heyer—Getty Images Crowds outside Nathan's Famous hot dog stand on Coney Island, New York City, circa 1955

They scarf down a lot more than they used to

According to Nathan’s Famous lore, the first Fourth of July hot-dog-eating contest took place in 1916, the very year the hot dog stand opened on New York’s Coney Island. The story goes that it began when four immigrants were trying to determine who was the most patriotic by scarfing the dogs. But there’s no proof that there was an organized contest until the 1970s, as the press agent Mortimer Matz told the New York Times and Nathan’s then acknowledged. So our tally of how many hot dogs it took to win the contest begins in 1972, when Nathan’s started keeping records. That year’s winner, Jason Schechter, ate 14 wieners—a number that’s puny by today’s standards. Current record-holder Joey Chestnut won his title by noshing a whopping 69 in 2013.

For your awe-filled—or vomit-tinged—enjoyment, scroll down to see how many frankfurters have been consumed by the winners of every Nathan’s Famous Hot Dog Eating Contest between 1972 and 2014. As for why the thing is at Nathan’s at all, here’s how TIME explained the importance of Nathan Handwerker’s beachside joint in 1960:

The spiritual home of the U.S. hot dog—and the world’s largest hot dog stand—is Nathan’s Famous on Brooklyn’s Coney Island. To Nathan’s gaudy green and white stands each summer flock many of the millions of visitors to Coney, gobbling up more than 200,000 hot dogs (at 20¢ each) on a weekend. Summer or winter, Nathan’s never closes. Its customers have braved blizzards just to reach a Nathan’s hot dog: it is a regular last stop for many early-morning survivors of Manhattan’s cafe society.

TIME Civil Rights

The Meaning Behind the Civil Rights Act’s Signing Date

Johnson Signs Civil Rights Act
PhotoQuest / Getty Images President Lyndon B. Johnson signs the Civil Rights Act in a ceremony at the White House, Washington, D.C., July 2, 1964.

President Johnson signed the bill into law on July 2, 1964

For President Johnson to sign the Civil Rights Act into law on July 2, 1964, was a no-brainer: the date fell on a Thursday, just as it does this year, and the symbolism of marking the hard-fought victory just before Independence Day was too good to waste.

But, as TIME noted in its original 1964 coverage of the landmark legislation, the Fourth of July wasn’t the only significant date in play. The date on which the Senate passed the bill was June 19, 1964—precisely one year after “President John Kennedy sent to Congress a civil rights bill, [and] urged its speedy passage ‘not merely for reasons of economic efficiency, world diplomacy or domestic tranquility, but above all because it is right.'” Though Kennedy had been assassinated the previous fall, the law he had advocated for had actually grown in strength and scope.

After the House also passed the bill and it went on to the President, the season of its signing—and not just the calendar date—would also prove significant.

The bill included many obviously important provisions affecting matters of great weight, like voting rights and equal employment. But, as TIME pointed out, it would take months to see the voting rules take effect, and the labor matters included a period during which businesses could adjust. On the other hand, one of the parts of the law—a part that may seem today to be far less important—was, as TIME put it, “effective immediately, and likely to cause the fastest fireworks.”

The law entitled all persons to equal use of public accommodations, from hotels and movie theaters to soda fountains and public swimming pools. In the run-up to the final vote, St. Augustine, Fla., proved why pools, long a contentious point because of the closeness that comes with sharing the water with other people, would be a hot topic:

There, five Negroes and two white fellow demonstrators dived into the swimming pool at the segregated Monson Motor Lodge. The motel manager, furious, grabbed two jugs of muriatic acid, a cleansing agent, tried unsuccessfully to splash the stuff on the swimmers. Cops moved in, one of them stripped off his shoes and socks, leaped gracelessly into the water and pummeled the swimmers with his fists. When the fracas was over, 34 people, including the swimmers and other civil righters who kept dry, were hauled off to jail.

Due to the time of year, the new law’s effects would be immediately visible at swimming pools around the country.

TIME Photography

See the Best Vintage Photos of People Stuffing Their Faces With Hot Dogs

On July 4, the hot dog reigns supreme

On the 4th of July, Americans will eat approximately 150 million hot dogs. According to the National Hot Dog & Sausage Council, that’s enough to stretch across the continental U.S. more than five times. A disproportionate number of those wieners will be consumed on Coney Island in New York, where competitive eaters at Nathan’s International Hot Dog Eating Contest will attempt to best Joey “Jaws” Chestnut’s record of 69 hot dogs on the men’s side and Sonya “The Black Widow” Thomas’ feat of 45 on the women’s side.

Here, in honor of America’s favorite tubes of miscellaneous meat and the eaters who enjoy them—competitively and recreationally—is a collection of photos of Americans stuffing their faces with hot dogs.

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter @lizabethronk.

TIME Theater

See Photos From One of Alice in Wonderland’s Most Memorable Adaptations

On the 150th anniversary of the classic novel, a look at the Broadway rendition that drew rave reviews

July 4, 2015, is not only the 239th anniversary of the signing of the Declaration of Independence. It’s also the 150th anniversary of the introduction of one of literature’s most memorable characters: Lewis Carroll’s Alice, of Alice’s Adventures in Wonderland, often shortened to Alice in Wonderland.

Carroll’s 1865 novel has been adapted for stage and screen dozens of times, but it was the 1947 theater adaptation at New York’s American Repertory Theater that drew high praise from LIFE Magazine for the way its lead actress, Broadway veteran Bambi Linn, embodied the ideals of Alice. LIFE’s editors explained a common misconception about the golden-haired heroine:

Carroll was a professor of mathematics and this is reflected in the character he created. For what makes Alice one of the great heroines of fiction is not that she is whimsical or imaginative but that she is a realistic person who remains superbly logical even in a land of fantastic nonsense.

All these years later, it’s a trait that serves well far beyond the outer limits of Wonderland.

April 28, 1947 cover of LIFE magazine
Philippe Halsman—LIFE Magazine

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter @lizabethronk.

TIME Gender

See 9 Striking Historical Photos of African American Women

From the collection of the National Museum of African American History and Culture

The history of what it has meant to be black and female in the United States is not easily summed up—a point that the upcoming Smithsonian photo book African American Women makes plain. As Kinshasha Holman Conwill, deputy director of the National Museum of African American History and Culture, points out in an introductory essay, the images in the book “[illuminate] a narrative that reflects large and small moments in U.S. history and culture.”

Famous faces like Lena Horne are presented alongside those whose personal stories are far less well known. Leona Dean, for example, lived a relatively prosperous life in the Midwest in the early 20th century—a place and time that has been largely eclipsed in the national memory. “We made a point of choosing images of people who aren’t famous,” says Michèle Gates Moresi, the museum’s supervisory curator of collections. “They aren’t known as leaders, but they were to their communities.”

The book is part of the Double Exposure series from the National Museum of African American History and Culture; the first installment in the series was released earlier this year, and both African American Women and Civil Rights and the Promise of Equality will be released on July 7.

TIME Books

These Books Boosted Troop Morale During World War II

Armed Services Edition
Harry Ransom Center

The Armed Services Editions also fostered a new generation of readers

This post is in partnership with the Harry Ransom Center at The University of Texas at Austin. A version of the article below was originally published on the Ransom Center’s Cultural Compass blog.

The book When Books Went to War: The Stories That Helped Us Win World War II by Molly Guptill Manning celebrates the importance of the Armed Services Editions. Published between 1943 and 1947, these inexpensive paperback editions were given to servicemen on the front lines. As Manning points out, the editions not only achieved their principal purpose of raising morale, they also encouraged a whole generation of readers who retained their appetite for reading when they returned home; a few copies may even have stopped bullets or shrapnel. It’s worth remembering that the cheap paperback edition was still a novelty at the beginning of the war, having been pioneered by Penguin Books in England and Albatross Books in Germany during the 1930s.

Armed Services Editions were made possible by a group of publishers called the Council on Books in Wartime. This group collaborated by eliminating royalty payments and arranging for the production and distribution of paperbacks in the most inexpensive possible formats.

The Ransom Center has a couple of connections with these books. Although there are larger collections at the University of Virginia and the Library of Congress, we own more than 1,400 of the books, most of them shelved together as a discrete collection in the stacks, while some are kept with other editions of our major authors, such as John Steinbeck. Because they were printed on poor-quality wartime paper that is now brittle and brown, each is protected in a simple acid-free enclosure, invented by the Center’s Conservation department in the 1980s, and called a “tuxedo case.” Students of publishing history can use the collection to study which books were most successful (Manning concludes that books with a touch of nostalgia or sex were particularly popular with soldiers, and F. Scott Fitzgerald’s The Great Gatsby was one of the best-selling titles, even though it was considered a flop when first published in hardback during the 1920s). The books were generally published in an oblong format, with the cover notation “This is the complete book—not a digest.” In all, some 125 million copies were produced.

Among the founding members of the Council on Books in Wartime was Alfred A. Knopf, the eminent literary publisher (the massive Knopf, Inc. archive is here at the Center). Ironically, Knopf was famous for encouraging high production values in his own trade books, but he immediately recognized the importance of encouraging reading and raising morale and contributed a number of series titles by familiar authors in the Knopf stable, including thrillers by James M. Cain and Raymond Chandler and more literary works by Thomas Mann and Sigrid Undset.

In the postwar era, a number of paperback reprint publishers capitalized on increased demand for books, the availability of new outlets for cheap editions, such as chain department stores and drugstores, and Americans’ newly enhanced disposable income. Pocket Books debuted in 1939 and became well known after the war for its lurid covers, which, as Louis Menand points out in a recent illustrated New Yorker piece, graced not only the unabashed pulp of Mickey Spillane but also higher-toned works by William Faulkner and James Joyce. Ballantine and Bantam editions flourished, and the era of the mass-market paperback had arrived. Nearly every prominent American hardback publisher developed a line of paperback books. Oddly, Knopf, Inc. was a holdout, arriving late to the game with Vintage Books in 1956. But it was the Armed Services Editions that gave the American paperback its big push.

See more photos of the book here at the Harry Ransom Center blog
