TIME Courts

How Attica’s Ugly Past Is Still Protected

Santi Visalli Inc. / Getty Images The vacant prison yard strewn with debris at the Attica State Correctional Facility in Attica, New York, on Sept. 14, 1971

Why advocates and survivors are disappointed by a new release of records from the 1971 retaking of Attica Correctional Facility

On the morning of Sept. 13, 1971, New York State troopers, along with other members of law enforcement, shot 39 men to death. They wounded scores of others so severely that many suffered permanent disabilities. Not only did these men shoot countless rounds of ammunition at people who had no guns, but they also decided not to log the serial numbers of the deadly weapons they wielded. Some of them had even removed their badges so they couldn’t be identified.

This carnage took place at one of America’s most forbidding penal institutions—the Attica State Correctional Facility in upstate New York—where, four days earlier, over 1,200 prisoners had begun a historic protest against abysmal conditions and abuses. To ensure that the State of New York would consider their demands and not harm them, these men had taken hostages and also asked that outside observers come to the prison to oversee their negotiations with state officials. On day five, however—despite hostages, observers, and prisoners alike begging them to continue talks—state officials decided to retake the prison with force.

The results were disastrous. Within 15 minutes of hundreds of troopers, as well as some sheriffs, park police and corrections officers, storming the prison, Attica’s D Yard was soaked in blood. Hundreds of men lay dead, dying or descending into shock from their gunshot wounds.

Despite this horrific scene, however, no member of law enforcement was ever held responsible for what he did that day. Indeed, even though 39 people had been felled by bullets, including the state’s own employees, no one ever stood trial for these killings. Instead, the State of New York spent the next 40 years resisting any attempt to make public its extensive records regarding who exactly did what to cause so much trauma at Attica.

A few days ago, however, Attica’s survivors and families of the deceased had some measure of hope that this might finally change. On Thursday morning, they waited with bated breath because finally, it seemed, the State of New York was going to release some particularly key records related to Attica, thanks to their relentless pressure and a judge’s ruling back in April. For decades these survivors have wanted the records opened. They believe that, if nothing else, they have the right to know who was responsible for their catastrophic losses. And at 2:00 p.m. those records were officially released. At least, sort of.

What survivors hoped to see was insider information about the retaking of Attica, information that is contained in two previously sealed volumes of the so-called Meyer Report—a report commissioned back in 1976 when the state was accused by one of its own of, among other things, covering up law-enforcement crimes committed at Attica. Finally getting to read volumes two and three of this report might mean, for example, that surviving hostage Michael Smith could finally find out who had riddled his lower abdomen with bullets, almost killing him and causing him years of agony. These volumes might also allow Traycee Barkley, the younger sister of slain 21-year-old prisoner L.D. Barkley, to learn who had shot her brother to death and perhaps also to learn whether in fact, as various prisoners as well as respected state assemblyman observer Arthur Eve kept insisting back in 1971, he was shot to death after the retaking was complete. Or possibly the family of Attica prisoner Kenny Malloy might finally learn the name of the trooper who not only shot him to death, but also then proceeded to shoot out his eyes. Maybe the family of corrections officer John Monteleone could finally learn how he ended up bleeding to death from a bullet wound to his chest.

Instead they learned little more than many of them already knew firsthand.

Prisoners’ decades-long insistence that police officers had brutalized them long after the state had regained control of the prison was, for example, confirmed in this release of records. This is no small thing since, for decades, state officials firmly denied that such abuses ever took place. Notably, though, the prisoners who endured the retaking of Attica already knew that they had suffered mightily at the hands of law enforcement and, thanks to their tenacity in the courts over the last 40 years, their trauma had been acknowledged legally; juries awarded them damages for that trauma back in 1997, and in 2000 the State of New York was forced to pay them a settlement. That troopers had used excessive force when they retook Attica had also been confirmed and legally censured in 1982, when a judge issued his scathing ruling in a case brought against the State of New York by the widow of a slain hostage.

But, remarkably, of the 340 pages of the Meyer Report that the judge reviewed for release, only 46 pages were given to the public, and even these pages were partially redacted. It was no surprise that there would be redactions: To the satisfaction of the state’s Police Benevolent Association, the April ruling that led to the release of the papers had specified that no grand-jury testimony would be disclosed, out of respect for the traditional secrecy of such proceedings. Still, the extent of the omissions came as a shock. The crucial names and the hoped-for details contained even in non-grand-jury materials also remain hidden.

And so, despite the fanfare that accompanied the recent release of records related to the Attica uprising, Attica’s victims are left disappointed. Still they remain in the dark as to who state officials knew, or believed strongly, had killed 39 people during the retaking, or who had worked so hard to make sure those shooters would never be held accountable. Still those who retook Attica with such ugliness are protected: those individuals who unloaded round after round of buckshot and countless explode-on-impact bullets into the backs and bellies of hundreds of unarmed men. Indeed, in this disclosure of the Attica records there is no mention of these acts or these men, let alone any information revealed about them that might allow Attica’s victims some sense of closure.

Perhaps the State of New York is right to be concerned about disclosing all it knows about what exactly happened at Attica back in 1971. After all, there is no statute of limitations on murder in this country, and not only has the public been glad to see quite a few decades-old cases reopened in recent years, but men have gone to prison for crimes they committed long ago. And, what is more, for many months now Americans in cities across the country have been taking to the streets to demand that law enforcement be held accountable for citizen deaths. But Attica’s survivors don’t seem to be out for that kind of justice—or at least they hold no hope that it might be attained. Yet individuals, along with organizations like the Forgotten Victims of Attica, continue to press for an opening of the records. They just want closure. One day, they hope, state officials will finally care more about Attica’s victims than about protecting those who caused the chaos.

The Long View: Historians explain how the past informs the present

Heather Ann Thompson is a professor of history at the University of Michigan who writes regularly on contemporary issues of incarceration and policing and soon will be publishing the first comprehensive history of the Attica Prison Uprising of 1971 and its legacy, for Pantheon Books.

Read TIME’s original coverage of the events at Attica here in the TIME Vault: The Bitter Lessons of Attica

TIME Opinion

How School Dress Codes Shame Girls and Perpetuate Rape Culture

Laura Bates is the co-founder of The Everyday Sexism Project, which collects stories of sexual harassment and gender discrimination ranging from minor incidents to more severe situations.

When teachers punish girls for wearing clothes deemed 'too distracting' for boys to handle, it teaches a damaging lesson

Some of our most powerful and lasting ideas about the world around us are learned at school. Hard work pays off. Success comes from working together. Girls’ bodies are dangerous and harassment is inevitable.

This might sound inflammatory, but it is not an exaggeration. It is the overriding message being sent to thousands of students around the world by sexist school dress codes and the way in which they are enforced.

In the past month alone, a Canadian teen says she was given detention for wearing a full-length maxi dress because it violated her school dress code by showing her shoulders and back, and a UK school announced plans to ban skirts altogether.

These are just the most recent cases in an ever-growing list that has seen shoulders and knees become a battleground, leggings and yoga pants banned, and girls in some cases reportedly told to flap their arms up and down while their attire was inspected, or asked to leave their proms because chaperones considered their dresses too ‘sexual’ or ‘provocative’.

Many schools respond to criticism of dress codes by citing the importance of maintaining a ‘distraction free’ learning environment, or of teaching young people about the importance of dressing appropriately for different occasions.

But at the Everyday Sexism Project, where people from around the world share their experiences of gender inequality, we have received over a hundred testimonies from girls and young women who are affected by the dress codes and feel a strong sense of injustice.

One such project entry read:

“I got dress coded at my school for wearing shorts. After I left the principal’s office with a detention I walked past another student wearing a shirt depicting two stick figures: the male holding down the females head in his crotch and saying ‘good girls swallow’. Teachers walked right past him and didn’t say a thing.”

Girls are repeatedly told that the reason they have to cover up is to avoid ‘distracting’ their male peers, or making male teachers ‘uncomfortable’…

“At my school our dress code dictates everything about a girls outfit: knee length shorts or skirts only, no cleavage, no bra straps, no tank tops. We can’t even wear flip flops, and girls will be given detentions and sent home for breaking any one of these rules. There’s no dress code for men, and the reasoning? Girls can’t dress “provacatively” [sic] because it could distract and excite the boys.”

I can’t help feeling there is a powerful irony in accusing a girl of being ‘provocative’ – in projecting that societal assumption onto her adolescent body – before she is even old enough to have learned how to correctly spell the word.

One student says she was given three specific reasons for the school dress code:

“1) There are male teachers and male sixth formers [high school seniors]
2) Teachers feel uncomfortable around bras etc.
3) Don’t want the boys to target you or intimidate you”.

This sends an incredibly powerful message. It teaches our children that girls’ bodies are dangerous, powerful and sexualised, and that boys are biologically programmed to objectify and harass them. It prepares them for college life, where as many as one in five women is sexually assaulted, but society will blame and question and silence them, while perpetrators are rarely disciplined.

The problem is often compounded by a lack of any attempt to discipline boys for harassing behavior, which drives home the message that it is the victim’s responsibility to prevent it. We have received thousands of testimonies from girls who have complained about being verbally harassed, touched, groped, chased, followed, licked, and assaulted at school, only to be told: “he just likes you”, or: “boys will be boys”. The hypocrisy is breathtaking.

Meanwhile, the very act of teachers calling young girls out for their attire projects an adult sexual perception onto an outfit or body part that may not have been intended or perceived as such by the student herself. It can be disturbing and distressing for students to be perceived in this way and there is often a strong element of shame involved.

“I’ve been told by a teacher that the way I was wearing my socks made me look like a prostitute in my first year of school, making me 13, and I’ve been asked whether I’m ashamed of myself because I rolled my skirt up,” wrote one young woman.

The codes aren’t just problematic for sexist reasons. One project entry reads:

“At age 10 I was pulled out of my fifth grade class for a few minutes for a ‘special health lesson’. As an early bloomer, I already had obvious breasts and was the tallest in my class. I thought they were giving me a paper about reproductive health that’s normally given to the 12 year old girls. Instead I was told to cover my body more because I was different.”

Other incidents have seen boys banned from school for having hair ‘too long’ or wearing traditionally ‘feminine’ fashion, from skinny jeans to skirts. A transgender student said he was threatened with having his photo barred from the school yearbook simply because he chose to wear a tuxedo to prom. Black girls are more likely to be targeted for ‘unacceptable’ hairstyles. The parents of a 12-year-old African American student said she was threatened with expulsion for refusing to cut her naturally styled hair. Her mother was told the hairstyle violated school dress codes for being “a distraction”.

At this point it starts to feel like such ‘codes’ are less about protecting children and more about protecting strict social norms and hierarchies that refuse to tolerate difference or diversity.

This is a critical moment. The school dress code debate will be dismissed by many for being minor or unimportant, but it is not.

When a girl is taken out of class on a hot day for wearing a strappy top, because she is ‘distracting’ her male classmates, their education is prioritized over hers. When a school takes the decision to police female students’ bodies while turning a blind eye to boys’ behavior, it sets up a lifelong assumption that sexual violence is inevitable and victims are partially responsible. Students are being groomed to perpetuate the rape culture narrative that sits at the very heart of our society’s sexual violence crisis. It matters very much indeed.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Television

Why Did The Bachelorette Let Rape Jokes Air?

A drunk contestant's bad behavior proves the show has a real problem

On Monday’s premiere of The Bachelorette, the reality show debuted its big twist. Instead of letting the woman choose her suitors as per the show’s usual formula, two women, Britt Nilsson and Kaitlyn Bristowe, would compete for the men’s attention. At the end of a cocktail party, 25 potential suitors would vote for whoever they would prefer to woo over the course of the season. It was a controversial move that turns the ostensibly woman-powered Bachelor spin-off on its head, giving all the control back to the scrum of men. But we were willing to give it a chance. Last night, though, The Bachelorette went too far, allowing misogynistic comments, threats of rape and wholly inappropriate behavior to slide in the name of a game and supposedly “good” television.

The premiere began with a cavalcade of handsome suitors heading into a cocktail party after staking their claim on either Britt or Kaitlyn. These parties are usually where contestants tend to overindulge, and last night was no exception. The main culprit was a man named Ryan M., a 28-year-old “junkyard specialist” hailing from Kansas City, Mo. Ryan got wasted in primetime and sat in the shrubbery, heckling suitors who arrived, yelling, “You suck!” from the bushes. He then drunkenly escorted Britt outside for some one-on-one time, proceeding to hug her and touch her face until one of the other suitors distracted him with the promise of another drink. Then Ryan slapped Kaitlyn’s rear as she walked by. She scowled at him, but presumably because of the show’s set-up, she couldn’t just take the power into her own hands and kick him off. Instead, she just had to take it. Both women were rendered defenseless by the fact that they needed to earn as many votes as possible to stay on the show. As he stripped down for the pool, Ryan declared that he was “totally horned up.” And that’s not even the worst of it.

Finally another contestant confronted Ryan about his behavior, and Ryan replied, “Why am I not raping you right now? That’s my whole thing.” The other suitor looked horrified and called him out on it, causing Ryan to simply yell, “You suck!” and stumble onward. It was a shocking moment of literal — not implied — rape culture come to life on the small screen, with one drunken loser spewing horrible things on television. Still, the cameras just rolled and a producer in an editing booth somewhere decided to leave that clip in.

When producers finally had enough of Ryan’s antics, a giant Bachelorette bouncer sent him to speak to host Chris Harrison outside of the house. “I hate to do this, but you’re clearly not here for either one of these girls or for sincere reasons,” Harrison told the disgraced contestant. “I really think it’s best if you go home. There’s a car waiting for you.” And with that, Ryan was finally sent packing.

Good riddance, but let’s back up a second. While sending Ryan off, Harrison said, “I hate to do this.” Why would he hate to do that? A man who got drunk, stripped, inappropriately grabbed one of the show’s stars and then threatened to rape another contestant seems like exactly the sort of person you should enjoy sending home. It would set a good precedent not only for future contestants, but also for the at-home audience. Instead, Harrison made it clear that they weren’t sending Ryan home for his inappropriate conduct, or for getting black-out drunk, but because “he wasn’t there for the right reasons.” That reasoning seems to imply that the show, and its mission of following one of these women on her journey to find love, was more important than a man acting wildly inappropriately toward the women and threatening violence on national television. It makes it seem like the producers don’t care about his conduct as long as he was “there for the right reasons.”

While The Bachelorette is not the most feminist of franchises — there’s an ongoing discussion about the show’s perceived slut shaming — there was something refreshing about a woman choosing a partner on her own terms (well, terms worked out in advance with the producers). This season was already shaking things up by removing any sense of female empowerment from the show, by letting the women compete for the men’s votes. But after last night, it seems clear that the show has lost its way. Whether viewers are Team Britt or Team Kaitlyn, it’s unlikely that they are Team Misogyny.

TIME Fine Art

Why Aren’t American Museums Doing More to Return Nazi-Looted Art?

Imagno / Getty Images Portrait of Adele Bloch-Bauer I, oil on canvas by Gustav Klimt, 1907, 138 × 138 cm

Seventeen years after the US hosted the Washington conference on Nazi-confiscated art and pledged to facilitate “just and fair” solutions, a lack of transparency in American museums remains

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Helen Mirren’s latest film, Woman in Gold, tells a true story of an art battle.

Mirren stars as Maria Altmann, a naturalized US citizen who sues the Austrian government to recover a glittering, golden portrait of her aunt, Adele Bloch-Bauer, painted by Viennese art nouveau master Gustav Klimt and looted from her family’s home by the Nazis.

Justice prevails through pressure imposed by US courts: the portrait of Adele finds a welcoming new home in America.

While Woman in Gold is a feel-good, triumphant tale for American audiences, it’s important to note that the country’s own art museums still have work to do to ensure justice for Holocaust victims. The story of this one painting doesn’t mitigate the fact that at least 100,000 works of art confiscated by the Nazis haven’t been returned to rightful owners.

In museums across the US there are paintings, sculptures and other works of art with provenance gaps from the Nazi era, signaling a need for ongoing research into rightful ownership.

An international court battle

The Austrian government believed the painting had been willed to it in 1925 by its subject, Adele Bloch-Bauer. For this reason, it argued the painting had nothing to do with restitution – the return of works to the victims of Nazi theft or their heirs.

However, in 1998 Austrian journalist Hubertus Czernin unearthed documents in Austrian archives indicating that Adele’s husband Ferdinand had been the rightful owner when the Nazis seized the collection in 1938.

Ferdinand had died nearly penniless in Zurich in 1945, leaving all his assets to his niece Maria Altmann, along with her brother and sister.

Czernin’s findings boosted Altmann’s claims to the portrait and four other Klimt paintings still held by the Austrian government.

Ultimately, Altmann – a naturalized US citizen – decided to file her claim in a US court by invoking the Foreign Sovereign Immunities Act. The law provides exceptions to sovereign immunity when a country violates international law with US commercial interests at stake.

The case went all the way to the Supreme Court, which ruled in Altmann’s favor in 2004.

Rather than face a protracted legal battle, Austria offered to try the case via an arbitration panel of Austrian experts. The panel awarded the painting to Altmann.

Altmann would eventually sell the portrait to cosmetics heir and World Jewish Congress leader Ronald Lauder for a then-record sum of $135 million.

Austria was already seeking to make amends

Altmann’s ultimate victory allowed Americans to relish the US role in the restitution of Nazi-looted art.

In this way, Woman in Gold continues a heroic art recovery narrative also reflected in The Monuments Men, the 2014 George Clooney film that extols the bravery of American men who risked their lives to recover thousands of works from Nazi repositories in castles, churches and salt mines.

But Woman in Gold glosses over the broader historical context of art restitution.

In fact, after decades of thwarting restitution claims, the Austrian government – like other European countries – had already been actively taking steps to compensate victims of Nazi persecution.

In 1996, it auctioned looted artworks still held by the government, giving $14.6 million in proceeds to Jewish organizations.

Two years later, Austria was among 44 countries that signed the Washington Conference Principles on Nazi-Confiscated Art, a non-binding agreement to pursue just and fair solutions in restitution cases.

And the same year, an Austrian Federal Art Restitution Law provided for restitution of works held by state museums that had been donated under duress or looted in the Nazi era.

Americans aren’t always heroes in the story

Art enthusiasts in America should ask themselves whether more could be done to ensure US museums are not holding Nazi-looted art.

After World War II, some American museums expanded their collections by purchasing art or accepting donations without researching the objects’ ownership history in the Nazi era.

In Manhattan not far from the Neue Galerie, where the Adele Bloch-Bauer portrait now hangs, the Museum of Modern Art (MoMA) holds one of the greatest modern art collections in the world.

But according to historian Jonathan Petropoulos, author of The Faustian Bargain: The Art World in Nazi Germany, the museum’s founding director, Alfred Barr, acquired pieces confiscated or stolen by the Third Reich.

And MoMA recently defended its ownership of three paintings by German artist George Grosz that it purchased in 1952 from Curt Valentin, a New York dealer who had funneled art from Nazi Germany to the US.

In 2009, heirs of the artist filed a restitution lawsuit in US District Court, which found that the statute of limitations invalidated the heirs’ claim, a decision affirmed on appeal in 2010.

MoMA won a legal victory but the ethical implications are less clear.

Uncertainty remains about the origin of other works in the collection. The MoMA Provenance Research Project provides a list of 800 works under investigation. But ownership gaps abound. In each case, there’s no indication of ongoing research.

Meanwhile, in Pasadena, California, the Norton Simon Museum is currently embroiled in a lawsuit over two Cranach paintings claimed by Marei von Saher, heir of the Jewish Dutch dealer Jacques Goudstikker, whose collection was seized by Nazi leader Hermann Göring.

The Norton Simon website does have a general statement on the importance of provenance research and “filling gaps” in Nazi-era ownership. But it provides no list of relevant works in the collection.

And while most US art museums agree to abide by ethical standards in acquisitions and provenance research established by the American Alliance of Museums, there’s no government-mandated enforcement mechanism.

So here we are today: 17 years after the US hosted the Washington conference on Nazi-confiscated art and pledged to facilitate “just and fair” solutions, a lack of transparency in American museums remains.

Yes, the “Woman in Gold” was returned to its proper owner. But how many Nazi-era portraits, landscapes and still-lifes painted in countless colors remain in America’s museums – havens that are not their rightful homes?

Elizabeth Campbell Karlsgodt is an Associate Professor of History at the University of Denver.

TIME Opinion

In Defense of Flip-Flopping

Archive Photos/Getty Images President Abraham Lincoln, circa 1855

What’s so wrong about admitting you were wrong?

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

“The matter does not appear to me now as it appears to have appeared to me then.” – Baron Bramwell, Andrews v. Styrap (1872)

The presidential campaign of 2016 has barely begun. The Iowa caucuses are nine months away. Yet the hounds, baying and snapping, have been loosed against the political crime of flip-flopping, charging after any suggestion that a candidate has reconsidered an earlier opinion.

On the Democratic side, former Secretary of State Hillary Clinton is being pilloried (hard to resist the rhyme) for changing her views on a foreign trade deal, not to mention immigration, and gay marriage, and government support for ethanol.

Then there’s former Florida Governor Jeb Bush, as close to a front-runner as the Republicans have. Could he have flip-flopped? You betcha. How about Indiana’s “religious freedom/anti-gay” law, plus immigration, or even his wavering commitment to the Paleo diet?

What about fellow Floridian Senator Marco Rubio? Surely he’s too fresh-faced and principled to fall into the flip-flop trap? Ah, but he did. That fiendish immigration issue snagged him squarely in the flip-flop.

Governor Chris Christie of New Jersey? Flip-flopper on the Common Core education program. And immigration. And the War on Drugs.

What about Wisconsin Governor Scott Walker, a straight-shooting Midwesterner, not him too? Immigration again, this time an “Olympics-quality” flip-flop according to a former Walker aide, plus that pesky ethanol issue, and anti-gay discrimination.

Whoa, whoa, whoa. To paraphrase the Bob Dylan lyric, something is happening here, and we do know what it is. Presidential campaigns are devolving into a circular game of “gotcha.” Have you changed your position, rethought an idea? Then you are presumptively unworthy, unreliable, and downright scummy. To the political guillotine with you!

But don’t we want our leaders to revise their views and understanding of the world based on new information or new ideas? Indeed, don’t we want them to be wise and candid enough to figure out when they’re just plain wrong?

Today’s demands for unyielding political consistency would disqualify a number of former presidents from office. In 1861, Abraham Lincoln supported a constitutional amendment to preserve slavery in those Southern states where it was legal. Eighteen months later he issued the Emancipation Proclamation. Flip. Flop. Or maybe it was the sharpening of moral sensibility and strategic sense in the face of a horrible civil war.

Woodrow Wilson won re-election in 1916 by boasting that he “kept us out of war,” shielding American boys from the slaughter of The Great War. Less than five weeks after he took the oath of office for his second term, Wilson responded to German submarine attacks by asking Congress to declare war.

James Madison so disliked Alexander Hamilton’s Bank of the United States – fearing it as a centralizing force – that he created a formal opposition to the administration of President George Washington. When the bank’s charter expired twenty years later, Madison decided the bank was useful after all and called for its renewal. After Congress created the second Bank of the United States, Madison signed the enacting legislation into law.

Even the great Washington, pillar of American rectitude and integrity, was forced by British military successes to abandon his strategy of confronting the British Army in major battles. He adopted instead a “Fabian” strategy of avoiding pitched fights while harassing the British and draining its soldiers and its citizens of commitment for a foreign war.

We’ve all flip-flopped on something. Between 2003 and 2013, an opinion survey found that support for gay marriage rose from 32 percent to 53 percent. That’s millions and millions of flip-floppers.

Presidents sometimes need to stand up against public opinion and political opposition. Sometimes they need to accommodate both. A candidate who changes position frequently may lack the ability to stand firm. One who rarely changes position may be impervious to reason. What really matters isn’t whether the candidate has changed his or her view, but whether the candidate makes good decisions, whether he or she makes sense. Let’s concentrate on that.

David O. Stewart is the author of “Madison’s Gift: Five Partnerships That Built America” and other works of history and fiction.

TIME Opinion

The Tom Brady Suspension Shows the NFL Plays in Its Own Warped Moral Universe

Charlotte Alter covers women, culture, politics and breaking news for TIME in New York City.

Where crimes against women require more evidence than crimes against footballs

The thing about footballs is that footballs don’t talk. Footballs can’t accuse, footballs have no motives, footballs have no credibility to lose. Footballs do not dream and do not fear. And yet, in the National Football League in 2015, suspected abuse of a football merits roughly the same punishment as suspected abuse of a woman.

In the moral universe of the sane, there is a clear pecking order of sins. Crimes against people are worse than crimes against things. Hurting a person’s body is worse than hurting a person’s feelings (or wallet). Beating is worse than cheating. But the NFL operates in a moral universe all of its own.

New England Patriots quarterback Tom Brady was suspended Monday for four games without pay after the NFL concluded that he was probably “at least generally aware” that footballs had been intentionally deflated to give him an edge. The team was fined a million bucks, and forfeited its first-round draft pick in 2016 and fourth-round draft pick in 2017. Immediately afterwards, the Twittersphere erupted in outrage that Brady had been suspended for four games for Deflategate, while former Ravens running back Ray Rice was originally suspended for two games after he beat up his girlfriend.

Then again, it’s not that simple. We can’t make a direct comparison between Tom Brady’s four-game suspension for his alleged involvement in Deflategate and Ray Rice’s two-game suspension for punching a woman in the face. For one thing, the NFL later admitted it had been too lenient with Rice, and he was ultimately suspended indefinitely, although that suspension was overturned after Rice won an appeal. When Adrian Peterson was indicted for child abuse after beating his 4-year-old with a switch, the NFL suspended him for the season, but only after loud public outcry, and that suspension was also overturned on appeal. And earlier this year, Greg Hardy was suspended for 10 games for four domestic violence incidents against his ex-girlfriend, in accordance with the NFL’s new Personal Conduct Policy (although it’s worth noting that a 10-game suspension for four incidents is a punishment of just over two games per incident).

So it’s not as simple as “two games for beating a woman, four games for deflating a football.” The two incidents came at different points during the NFL’s long, slow process of growing a conscience. Instead, it’s more telling to look at what kind of evidence was used to come to these conclusions.

The NFL did not conclude beyond a reasonable doubt that Brady was the mastermind of Deflategate. Instead, the league decided that Brady was “at least generally aware” that the balls were intentionally deflated, citing a “preponderance of the evidence, meaning that ‘as a whole, the fact sought to be proved is more probable than not.'” That evidence is mostly a flurry of phone calls between Brady and equipment assistant John Jastremski, and the fact that Brady would not hand over his text and email records.

After two women filed police reports last year saying Dallas Cowboys player C.J. Spillman sexually assaulted them in 2013, he’s still playing— Cowboys head coach Jason Garrett said he’ll continue to play until an arrest is made and charges are filed. Two women accused Ray McDonald last year — one of domestic violence, one of sexual assault— but when he was cut by the 49ers because of a “pattern of poor decision making,” officials said it was a team decision, not a league decision. That means he’s free to keep playing in the NFL, and he just signed with the Bears. And despite Erica Kinsman‘s extensive, detailed account of the night she was allegedly raped by Jameis Winston, a rape which was investigated by the New York Times but mostly ignored by the police, he was selected as the top NFL draft pick. (Winston has filed a counter-suit to Kinsman’s civil lawsuit, claiming Kinsman is lying, and that she’s “0 for 6” in her claims against him.)

So when it’s a question of footballs, a series of phone records that indicate Brady was probably “generally” aware is enough to merit a tough punishment. But when it’s a woman accusing a football player of abuse, police reports and rape kits are not.

This isn’t really about Tom Brady or Ray Rice. This is about what the NFL thinks counts as “proof.” Clearly, women’s stories don’t make the cut.


TIME Opinion

How Our War on Drugs Undermines Mexico

Co Rentmeester—The LIFE Picture Collection/Getty Images Suspects being searched for narcotics at the US-Mexico border customs post in 1969

The continued dominance of multi-billion dollar Mexican drug cartels is linked to aggressive drug policies of the U.S. in the 1960s and '70s

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

This June marks 44 years since President Richard Nixon declared a war on drugs, marking an unprecedented mobilization of U.S. resources to combat the drug trade domestically and abroad. The reason? A fear that narcotics use, especially during the Vietnam War era, posed a grave threat to society. Or perhaps, more accurately, drugs threatened authority figures during a period of large-scale revolution against social norms across much of the western world.

With a perceived increase in U.S. drug use, Nixon deployed the now ubiquitous war on drugs metaphor to generate increased funding for drug control. “Public enemy number one in the U.S. is drug abuse,” Nixon said to Congress in 1971. “In order to fight and defeat this enemy, it is necessary to wage a new, all-out offensive.” With this address came a complete discarding of any sort of prevention and treatment solutions to drug abuse, and the wholehearted embrace of a “war” within our own borders.

Enter Mexico, 1969. For years, Mexico’s domestic supply of psychoactive raw materials (cannabis, peyote, opium poppies, hallucinogenic mushrooms, etc.) concerned authorities on both sides of the border. Jack Kerouac, Allen Ginsberg, and a host of other popular figures had trekked to Mexico in the 1950s for the specific purpose of acquiring psychoactive substances. Throughout the 1960s, many others—hippies, students, and sympathizers of the antiestablishment culture—followed suit.

Months after coming into office, Nixon ordered the border shut down in an operation known as “Intercept” to cut off the flow of Mexican marijuana coming into the U.S. But as the complete shutdown of border commerce debilitated the Mexican economy, it became clear that Intercept was also intended to force Mexico into complying with newly established U.S. drug policies such as supply-focused initiatives and more intense marijuana policing.

With the Drug Enforcement Administration’s (DEA) formation in 1973, the agency and other U.S. institutions would use the war on drugs as grounds for increased involvement overseas. The abduction, torture, and murder of undercover DEA agent Enrique “Kiki” Camarena in 1985 resulted in another U.S. clampdown on Mexico when Mexican authorities failed to locate Camarena or any suspects in the crime.

Believed by the reigning Guadalajara cartel to be the source of a leak that led to a major drug bust by the Mexican military, Camarena was kidnapped by corrupt police officers in broad daylight. The DEA eventually forced the Mexican government to apprehend suspects. Interestingly, a number of 2013 reports alleged CIA involvement in Camarena’s murder, as he was purportedly a threat to U.S. drug operations.

As the focus of the drug trade expanded to the Caribbean in the 1980s, the U.S. government’s strict targeting of the drug supply over air and water ultimately facilitated Mexican cartels’ control of land routes.

So did the North American Free Trade Agreement (NAFTA), established in 1994, which opened the U.S.-Mexico border for free trade. As cheap U.S. agricultural goods flooded Mexican markets, the Mexican farming industry was severely hit, prompting the cultivation of drugs. In due time, the increased flow of north-south commerce also included illegal substances from Mexico and other parts of the Americas.

Much more is known about Mexico’s drug violence since the early 2000s. Places like Ciudad Juárez epitomize the contemporary Mexican drug war. Between 2006 and 2012, Mexican president Felipe Calderon presided over an unparalleled mobilization of Mexican resources to fight off drug cartels.

The resulting death toll: more than 70,000.

Calderon’s counterpart in the U.S., George W. Bush, responded to calls for aid by establishing the Merida Initiative in 2008. A security cooperation agreement between the U.S., Mexico, and Central American countries, the Merida Initiative was intended to combat drug trafficking and crime.

The more liberal, forgiving attitudes toward drug control in the U.S. in recent years might suggest a break from the long, at times violent, history of U.S.-influenced drug enforcement. There is, for example, widespread acknowledgement that the war on drugs has been a policy failure. The Obama administration has distanced itself from the war on drugs, at least rhetorically. And yet last year’s federal drug war budget — topping $25 billion — and the continued efforts of U.S. institutions abroad in the name of drug control remind us that a war on drugs is still alive and well.

The current system is propped up by many different U.S. and Mexican institutions—police forces, the military, the CIA, the State Department, etc.—each with its own set of interests. Methodical funding cuts would have to be made alongside fundamental revisions of the roles these institutions play for real change to take place. For all of the talk of marijuana legalization and an end to the war on drugs, policies along these lines have yet to be established, let alone brought more fully into the global drug debate.

There are a number of unrecognized dimensions to the way the war on drugs has played out in Mexico over the last four decades. During the 1970s and 1980s, for example, Mexican police forces and the military conflated drug policing with targeting political subversives and campesino farmers. In other words, drug policing became a convenient justification for monitoring perceived enemies of the state.

It is baffling how the U.S. remains very much involved in Mexican drug affairs at the same time it is trying to distance itself from the impact of the drug trade in Mexico. This tension played out last fall in the rather silent U.S. response to the drug-related murders of 43 student teachers in the Mexican state of Guerrero and the government’s attempts at a cover-up. Mexicans know the U.S. is embedded in their country’s drug control efforts. One can only imagine their sense of disappointment when the U.S. failed to speak up against a major incident of government corruption.

If the U.S. has chosen to break with the war on drugs, then it should start making noticeable funding cuts in drug control. Sounds simple enough, but this should be done in conjunction with a reduction in the size of drug enforcement missions at home and abroad. Finally, moving past a war on drugs in the long term must involve nuanced, multilateral reforms with foreign counterparts. Otherwise, what the U.S. facilitates in Mexico would not be the reduction of the drug trade, but increasing divisions between north and south of the border, divisions largely of its own creation, as the longer history of the war on drugs reminds us.

Aileen Teague is a Ph.D. Candidate in History at Vanderbilt University and formerly a U.S. Marine Corps Officer. She is currently a Researcher with the Fulbright Program in Mexico City.


TIME Opinion

Lessons of the Fall of Saigon

Francoise De Mulder—Roger Viollet/Getty Images Saigon's fall and the taking of the presidential palace, on April 30, 1975

The Vietnam War changed the United States as much as it changed South Vietnam

Forty years ago today, on April 30, 1975, helicopters carried away the last Americans in Saigon as North Vietnamese troops entered the city. What followed showed that the war had changed the United States as much as it had changed South Vietnam.

Only 28 months before the end, President Nixon had announced that the war’s end would come with “peace with honor,” and promised to respond vigorously to any North Vietnamese violations of the peace agreement. But Congress had insisted upon a final end to military action in Southeast Asia in the summer of 1973, and Watergate had driven Nixon out of office a year later. Neither the US government nor its South Vietnamese ally, President Thieu, had shown any interest in implementing the provisions of the peace agreement designed to lead to genuine peace.

The millions of young Americans who had served in South Vietnam from 1962 through 1972, and the thousands of planes that had flown bombing missions from carriers and airfields in the region, had proven time and time again that they could hold on to most of the country as long as they were there. But the Americans could do nothing about the political weakness of the South Vietnamese government. The communists still effectively ruled much of the countryside and had infiltrated every level of the South Vietnamese government from the presidential palace on down. American money, not loyalty, had driven the South Vietnamese war effort. With no prospect of American help, the South Vietnamese Army simply collapsed in the spring of 1975 after Thieu ordered a precipitous withdrawal from the Central Highlands. The North Vietnamese won their final victory almost without fighting.

A variant of this sad story has already been replayed in Iraq, where tens of thousands of supposedly American-trained Iraqi Army troops melted away in 2014 when faced with ISIS. There, too, the American-backed government had totally failed to secure the allegiance of the population in Sunni areas. The same thing may well happen in Afghanistan, where a new President has already persuaded the Obama Administration to delay a final withdrawal. That was the overwhelming lesson of Vietnam: that American forces, no matter how large, cannot create a strong allied government where the local will is lacking.

Like most historical lessons, that one lasted for as long as men and women who were at least 40 years old in 1975 held power. Army officers like Colin Powell were determined never to see anything similar happen on their watch, and they kept the military out of similar situations in El Salvador and Lebanon during the Reagan years. Instead, the Soviet Union found its own Vietnam in Afghanistan, and that last foreign policy adventure helped bring Communism down. In 1990-91, George H. W. Bush decided to expel Saddam Hussein from Kuwait, but Powell and others made sure that operation would be carried out quickly, with overwhelming force, and with no long-term occupation of enemy territory. Bill Clinton, who had opposed the Vietnam War, kept the United States out of any ground wars as well.

The neoconservatives who took over policy and strategy under George W. Bush were either too young to have fought in Vietnam, or, like Bush (and, for that matter, myself), had served in non-combatant roles. Some of them had persuaded themselves that Vietnam would have been successful if the United States had sent South Vietnam more aid, and all of them were certain they could topple the Iraqi government without serious repercussions. Iraq in 2003 was about twice as populous as South Vietnam in 1962 and much larger in area, but they were certain that less than a third of the troops eventually needed in South Vietnam would do the job. They were wrong on all counts. Late in Bush’s second term, American troops showed once again that they could quiet an uprising as long as they remained in the country. But the Iraqi government was determined to see them leave, and last year it seemed that that government might go the way of President Thieu. That has not happened, but Baghdad seems to have lost control of much of the Sunni region for a long time to come.

President Gerald Ford was the American hero of the last phase of the Vietnam War. Although Congress had refused his requests for additional aid to the South in those last desperate weeks, he refused to blame Congress, war protesters, or the media for the fall of Saigon. On April 23, with the complete collapse of South Vietnam only days away, the President gave a major speech in New Orleans. “Today,” he said, “America can regain the sense of pride that existed before Vietnam. But it cannot be achieved by refighting a war that is finished as far as America is concerned. As I see it, the time has come to look forward to an agenda for the future, to unify, to bind up the Nation’s wounds, and to restore its health and its optimistic self-confidence.” This much-underrated President, who was destined to lose a close election in another 18 months, had caught the mood of the American people. Henry Kissinger, who had explained to Nixon in the fall of 1972 that the US could survive the eventual fall of South Vietnam if the South Vietnamese could clearly be held responsible, immediately began blaming the Soviet Union on the one hand, and the Congress on the other, for the debacle. But Ford gave the American people permission to feel that they had given far more than anyone could ever have expected to this hopeless cause.

It seems today as if another frustrating series of interventions has temporarily vaccinated the US against any such large-scale deployments. Neither politicians nor military leaders will be eager to repeat the Iraq experience for a long time, and the Obama Administration has moved from “counterinsurgency” to “counterterror,” relying on drone strikes. But since the interventions in Iraq and Afghanistan seem likely to lead to endless chaos rather than to the symbolic fall of a capital, it seems unlikely that Obama or any future President will manage to put our Middle East adventure behind us in the way that Ford did for Vietnam. That is unfortunate, because great powers need to be able to come to grips with the limits of their power, especially in highly troubled times like our own.

The Long View: Historians explain how the past informs the present

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Lessons for Baltimore From 1968

Picasa / Getty Images A man carried away by police during riots, Baltimore, Maryland, 1968.

How history can heal a harmed city

In the 20 years that I have lived in Baltimore City, I have seen guns fired only twice; in each instance the targets were black men and the shooters were police. In one case the officer was trying to stop a group of men who had apparently stolen a car. They bailed out in front of my house, and as they were running away, the officer fired, but missed. In the second case the officer’s aim was better; an assailant held up a medical student on a bicycle, then ran through traffic right in front of our car. An off-duty cop saw the scuffle and fired. The assailant turned out to be a 14-year-old with a BB gun. The boy lay in the street, shot in the stomach; my 12-year-old son and I waited until the police told us to move on. I called my district and set up an appointment with a detective. No one ever came to question me.

Those incidents came back to me this week when the death of Freddie Gray triggered days of peaceful protests that splintered into something uglier on Saturday, and anti-police violence erupted on Monday. But those weren’t the only moments from the past that seemed worth thinking about. The looting and arson led to comparisons to the unrest that followed the assassination of the Rev. Dr. Martin Luther King, Jr.—and, as an assistant professor of history at the University of Baltimore who has studied Baltimore in 1968, I can see a number of similarities. After several days of peaceful commemoration of Dr. King’s death, disenfranchised youth instigated disturbances in fifteen neighborhood commercial districts. Curfews were imposed, just as they were in Baltimore this week, and hundreds of citizens were eventually swept into custody. During both of the crises, members of the clergy of all faiths walked the streets in attempts to restore order.

But the real link between the two moments, 1968 and today, runs deeper than that. It’s not about the appearance of similarity, but rather the causes and effects.

As UB discovered in a community-based, multi-disciplinary examination of the riots 40 years later, the causes and consequences of urban unrest are complex and multifaceted. As part of our project, our diverse student body interviewed their friends and family, and we heard stories that illustrated deep systemic trends that led to generations of anger and frustration: practices in the private sector like residential covenants that forbade sales to black and Jewish buyers, federal policies like redlining that discouraged bank loans to poor and aging neighborhoods, urban renewal policies that used federal funds to build highways that cut neighborhoods off from the rest of Baltimore, and limited job opportunities as Baltimore’s blue-collar jobs began to evaporate. All of those forces had been at work long before Dr. King’s assassination, and, as we see violence along the same streets almost five decades later, Baltimoreans still feel their effects today.

We also heard stories about businesses that were destroyed after families had poured years of effort and capital into them. In 1968 the Pats family lost its pharmacy on West North Avenue, just a few blocks from the CVS that burned this Monday evening. Their business was looted, then their entire block was burned, including their apartment. Their neighbors, who lost their jewelry store, had been relocated to Baltimore after surviving the Holocaust. Baltimore’s retail sector has still not recovered in many areas of the city. A number of neighborhoods have been declared food deserts, and no department store exists within the city limits. When a Target arrived at Mondawmin Mall and hired city residents, Baltimoreans welcomed it. But on Monday night we watched with dismay as looters ran out of Mondawmin, their arms full of merchandise.

In 1968, the governor of Maryland called out the National Guard, just as Governor Larry Hogan did on Monday night, and soon tanks patrolled the city streets. The unrest quieted, and by the end of the week the Orioles held opening day on schedule.

Here’s where the stories diverge. Maryland’s then-governor, Spiro Agnew, rode the wake of Baltimore’s disturbances right into the White House, using his tough-on-crime reputation to become Richard Nixon’s vice-presidential running mate. It is too simplistic to say that the policing approach Agnew advocated led directly to the kind of practices that killed Freddie Gray, Michael Brown, and Eric Garner. We cannot exclude from the list of causes Nixon’s War on Drugs, the crack epidemic of the 1980s and ‘90s, the growth of the prison-industrial complex, and the continuing hemorrhaging of blue-collar jobs from America’s aging industrial cities—but the reaction to the urban riots of the 1960s certainly started us down this path.

The similarities can stop. Knowledge of the aftermath of 1968 can help prevent its repetition. In the early 1970s, law-and-order policing reinforced divisions around race, class, and geography in an attempt to lock up the problems instead of addressing them. We can learn from those mistakes. On Tuesday morning the NAACP announced that it would open a satellite office in Sandtown-Winchester, Freddie Gray’s neighborhood, to provide counsel to residents on a host of legal issues, including police misconduct. An external oversight board to monitor reports of police violence would serve as a powerful partner in this effort. Out on the streets on Tuesday morning, Baltimoreans worked together to clean up the debris from the night. I hope that as we work we will find a chance to tell each other our stories, and that this time we will listen.

The Long View: Historians explain how the past informs the present

Elizabeth M. Nix is a professor of legal, ethical and historical studies at the University of Baltimore, and co-editor with Jessica Elfenbein and Thomas Hollowak of Baltimore ’68: Riots and Rebirth in An American City.

 

TIME Opinion

Exclusive: Dr. Oz Says ‘We’re Not Going Anywhere’

The physician and TV personality slams his critics and responds to their critiques

I started my show to give TV audiences advice on how to find a good life, not to practice medicine on air. This means celebrating them wherever they are in their search for health, and offering tools to nudge them along in the right direction. In the same hour-long show, a board-certified doctor will discuss cancer, followed by a celebrity sharing their personal weight-loss story, and concluding with an audience member learning to manage their money better. I don’t expect all of my colleagues to understand this marriage between conventional medicine and the broader definition of wellness that the show pursues. I expect and respect the criticism of colleagues who struggle with my approach and I try to improve the show accordingly.

But I was surprised by a brazen note waiting for me as I entered the operating room at New York Presbyterian/Columbia University this week. A small group of physicians unknown to me was asking my dean to revoke my faculty position for manifesting “an egregious lack of integrity by promoting quack treatments and cures in the interest of personal financial gain.”

The dean politely reaffirmed that academic tradition at all institutions protects freedom of speech for faculty, and I assumed the matter was over. The surgery went far more smoothly than the media fury around the letter: within 12 hours, most major outlets had published articles on the note, many mistakenly reporting that Columbia faculty were trying to oust me. Who were these authors, and why were they attacking now?

With a few clicks and some simple searches, a remarkable web of intrigue emerged, one the mainstream media had completely missed. The lead author, Henry I. Miller, appears to have a history as a pro-biotech scientist and was mentioned in early tobacco-industry litigation as a potential ally to the industry. He also furthered the battle in California to block GMO labeling, the very labeling I have been vocal in supporting. Another of the letter’s signees, Gilbert Ross, was found guilty, after a trial, of 13 counts of fraud related to Medicaid. He is now executive director of the American Council on Science and Health, a group that has reportedly received donations from big tobacco and from food and agribusiness companies, among others. Another four of the 10 authors are also linked to this organization.

I have spent my entire career searching for ways to lessen my patients’ suffering. The best and safest paths have generally been the traditions of conventional medicine: tried and true, well funded, and fast. But other routes to healing offer wisdom as well, so I have been willing to explore them and share whatever can be gathered. I have done this throughout my career as a surgeon, professor, author and, of late, talk-show host. Despite the criticism, I want to continue exploring for myself and my audience. Why?

Because in some instances, unconventional approaches do appear to work in people’s lives. They are often based on long-standing traditions from cultures that visualize the healing process in very different ways from our Western traditions. They are aimed at chronic complaints like lack of energy, fogginess, or moodiness, which conventional practitioners frequently overlook or under-treat. They are also often inexpensive. With limited profit motive, companies understandably do not wish to invest significant resources in proving benefit, so these unconventional remedies do not undergo rigorous clinical studies. We are left instead with practitioners recommending therapies they have found effective in their own practices. When I interview an unusual or interesting person on my show, the segment is often expository, born of fascination, not an instruction that my audience should see a psychic instead of their primary care physician.

It’s vital that I drive the following point home: my exploration of alternative medicine has never been intended to take the place of conventional medicine, but rather to complement it. Critics often imply that any exploration of alternative methods means abandoning conventional approaches. It does not. In fact, many institutions like mine use the names “complementary” or “integrative” medicine, which fit this approach well.

Even so, this blending can cause confusion and irritation when analyzed by conventional physicians. For example, my show and another daytime TV program were recently cited in a BMJ article for having evidence behind only about half of what we shared with our audiences. A similar figure is often used to approximate how much of the conversation in physicians’ offices across America rests on randomized clinical trial data. This reflects the natural gap between what has been proven in clinical trials and the needs of our patients.

The BMJ authors were correct in reporting that advising people with the flu to rest or cough into the crook of their arms is completely unproven. But major organizations like the Centers for Disease Control and Prevention (CDC) give rational advice of this nature that isn’t directly linked to a research paper. When there isn’t data, we rely on the non-literature-based guidance of the CDC, the National Institutes of Health, the Food and Drug Administration, the World Health Organization (WHO), as well as specialty professional organizations and experts. (The authors of the BMJ piece later acknowledged being “disappointed that the overwhelming commentary seems to be that our study somehow proves that Dr. Oz or The Doctors are quacks or charlatans or worse. Our data in no way supports these conclusions.”) The reality of being a healer is that we won’t ever know everything about our chosen field, which is what attracts many of us to medicine in the first place.

So I have traveled off the beaten path in search of tools and tips that might help heal. These explorations carry their own peril. My voyage into the land of weight-loss supplements, for example, left me in a very unsavory place. I wish I could take back the enthusiastic words I used to support those products years ago, and I understand the criticism I’ve received as a result.

I discovered problems in the promising research papers that supported some products; the products themselves were often of poor quality; and scammers stole my image to promote fake pills. So I have not mentioned weight-loss supplements for a year and have no plans to return to that neighborhood.

Other times the topics are controversial but still worthwhile, like our campaign supporting GMO labeling. And this brings me back to a motive for the letter. These doctors criticized my “baseless and relentless opposition to the genetic engineering of food crops,” which is another false accusation. Whether you support genetically engineered crops or not, the freedom to make an informed choice should belong to consumers. The bill in Congress this month that would block states from independently requiring labels would hand a major victory to pro-GMO groups.

As a scientist, I am not that concerned about GMOs themselves; I am worried about why they were created. Highly toxic herbicides would normally kill crops along with weeds, but with the genetic upgrade, the plants survive and can be doused with much higher doses, with potential complications for the environment. The WHO believes that glyphosate, one such herbicide, is “probably a human carcinogen.” Perhaps we are all showing “disdain for science and evidence-based medicine,” but I would argue that unleashing these products amounts to a real-time experiment on the human species. Sure, we will eventually learn whether these chemicals are a problem, but at the expense of pain, suffering, and disease in real people. I owe my kids more. And so do you.

I know I have irritated some potential allies. No matter our disagreements, freedom of speech is the most fundamental right we have as Americans. We will not be silenced. We’re not going anywhere.
