TIME Business

Labor Day: Raising the Minimum Wage Stiffs the Poor

Demonstrators take part in a protest to demand higher wages for fast-food workers outside McDonald's in Los Angeles on May 15, 2014. Lucy Nicholson—Reuters

There are at least three better ways to help low-income workers — and few ways that are worse

Another Labor Day, another bold plan to increase the minimum wage to help the working men and women of America!

On Monday, Los Angeles Mayor Eric Garcetti will announce a proposal to jack his city’s minimum wage from $9.00 all the way up to $13.25 over three years. That puts him ahead of President Obama, who has called for goosing the federal minimum wage from $7.25 to $10.10.

Increasing the minimum wage is typically sold as a way of aiding poor people — LA business magnate and philanthropist Eli Broad says Garcetti’s plan “would help lift people out of poverty.” But it’s actually a pretty rotten way to achieve that for a number of reasons.

For starters, minimum-wage workers represent a shrinking share of the U.S. workforce. According to the Bureau of Labor Statistics (BLS), the percentage of folks who earn the federal minimum wage or less (which is legal under certain circumstances) comes to just 4.3 percent of hourly employees and just 3 percent of all workers. That’s down from an early 1980s high of 15 percent of hourly workers, which is good news — even as it means minimum wage increases will reach fewer people.

What’s more, contrary to popular belief, minimum-wage workers are not clustered at the low end of the income spectrum. About 50 percent of all people earning the federal minimum wage live in households where total income is $40,000 or more. In fact, about 14 percent of minimum wage earners live in households that bring in six figures or more a year. When you raise the minimum wage, it goes to those folks too.

Also, most minimum-wage earners tend to be younger and are not the primary breadwinners in their households. So it’s not clear they’re the ones needing help. “Although workers under age 25 represented only about one-fifth of hourly paid workers,” says BLS, “they made up about half of those paid the federal minimum wage or less.” Unemployment rates are already substantially higher for younger workers — 20 percent for 16-to-19-year-olds and 11.3 percent for 20-to-24-year-olds, compared to just 5 percent for workers 25 and older — and would almost certainly be made worse by raising the cost of their labor by government diktat. While a number of high-profile economists such as Paul Krugman have lately taken to arguing that minimum-wage increases have no effect on employment, the matter is far from settled, and basic economic logic suggests that increases in prices reduce demand, whether you’re talking about widgets or labor.

Finally, there’s no reason to believe that people making the minimum wage are stuck at the bottom end of the pay scale for very long. According to one study that looked at earning patterns between 1977 and 1997, about two-thirds of workers moved above the minimum wage within their first year on the job. Having a job, even one that pays poorly, starts workers on the road to increased earnings.

If we want to actually raise the standard of living for the working poor via government intervention, the best way to do it is via transfer payments — food stamps, housing subsidies, or even plain cash — that directly target individuals and families at or below the poverty line.

University of California sociologist Lane Kenworthy, a progressive who has called for a more generous social safety net, argues that virtually all increases in income for poor families in the U.S. and other wealthy countries since the late 1970s have been a function of “increases in net government transfers — transfers received minus taxes paid.” That’s partly because workers in poor households often have “psychological, cognitive, or physical conditions that limit their earnings capability” and partly because today’s “companies have more options for replacing workers, whether with machines or with low-cost laborers abroad.”

To be sure, arguing that you want to increase direct aid to poor families doesn’t give a politician the same sort of photo-op as standing with a bunch of union leaders on Labor Day and speechifying about the urgent need to make sure an honest day’s work is rewarded with a living wage.

But making just such a case could have the benefit of actually helping poor people in the here and now. Certainly a savvy politician could sell that to voters who know the value of hard work — and the limits of economic intervention.

 

TIME Parenting

Millennials Are Selfish and Entitled, and Helicopter Parents Are to Blame

Peter Lourenco—Flickr RF/Getty Images

There are more overprotective moms and dads at a time when children are actually safer than ever

It’s natural to resent younger Americans — they’re younger! — but we’re on the verge of a new generation gap that may make the nasty old fights between baby boomers and their “Greatest Generation” parents look like something out of a Norman Rockwell painting.

Seventy-one percent of American adults think of 18-to-29-year-olds — millennials, basically — as “selfish,” and 65% of us think of them as “entitled.” That’s according to the latest Reason-Rupe Poll, a quarterly survey of 1,000 representative adult Americans.

If millennials are self-absorbed little monsters who expect the world to come to them and for their parents to clean up their rooms well into their 20s, we’ve got no one to blame but ourselves — especially the moms and dads among us.

Indeed, the same poll documents the ridiculous level of kid-coddling that has now become the new normal. More than two-thirds of us think there ought to be a law that kids as old as 9 should be supervised while playing at a public park, which helps explain (though not justify) the arrest of a South Carolina mother who let her phone-enabled daughter play in a busy park while she worked at a nearby McDonald’s. We think on average that kids should be 10 years old before they “are allowed to play in the front yard unsupervised.” Unless you live on a traffic island or a war zone, that’s just nuts.

It gets worse: We think that our precious bundles of joy should be 12 before they can wait alone in a car for five minutes on a cool day or walk to school without an adult, and that they should be 13 before they can be trusted to stay home alone. You’d think that kids raised on Baby Einstein DVDs should be a little more advanced than that.

Curiously, this sort of ridiculous hyperprotectiveness is playing out against a backdrop in which children are safer than ever. The share of students reporting being bullied is one-third of what it was 20 years ago, and according to a study in JAMA Pediatrics, the past decade has seen massive declines in children’s exposure to violence. Out of 50 trends studied, summarize the authors, “there were 27 significant declines and no significant increases between 2003 and 2011. Declines were particularly large for assault victimization, bullying, and sexual victimization. There were also significant declines in the perpetration of violence and property crime.”

There are surely many causes for the mainstreaming of helicopter parenting. Kids cost a hell of a lot to raise. The U.S. Department of Agriculture figures a child born in 2013 will set back middle-income parents about $245,000 up to age 17 (and that’s before college bills kick in). We’re having fewer children, so we’re putting fewer eggs in a smaller basket, so to speak. According to the Reason-Rupe poll, only 27% of adults thought the media were overestimating threats to the day-to-day safety of children, suggesting that 73% of us are suckers for sensationalistic news coverage that distorts reality (62% of us erroneously think that today’s youth face greater dangers than previous generations). More kids are in institutional settings — whether preschool or school itself — at earlier ages, so maybe parents just assume someone will always be on call.

But whatever the reasons for our insistence that we childproof the world around us, this way madness lies. From King Lear to Mildred Pierce, classic literature (and basic common sense) suggests that coddling kids is no way to raise thriving, much less grateful, offspring. Indeed, quite the opposite. And with 58% of millennials calling themselves “entitled” and more than 70% saying they are “selfish,” older Americans may soon be learning that lesson the hard way.

TIME U.S.

Make Cops Wear Cameras

A police officer standing watch as demonstrators protest the shooting death of teenager Michael Brown conceals his/her identity on August 13, 2014 in Ferguson, Missouri. Scott Olson—Getty Images

“Everyone behaves better when they’re on video”

Michael Brown, an unarmed 18-year-old, shot to death in Ferguson, Missouri, by police. Eric Garner, a 43-year-old New Yorker, dead from a police chokehold. John Crawford III, 22, shot and killed by police in a Walmart outside of Dayton, Ohio.

Enough is enough. Each of these incidents has an unmistakable racial dimension—all of the victims were black and all or most of the arresting officers were white—that threatens the always tense relationships between law enforcement and African Americans. As important, the circumstances of each death are hotly contested, with the police telling one story and witnesses (if any) offering up very different narratives.

Brown’s death in particular has sparked major ongoing protests precisely because, contrary to police accounts, witnesses claim that he had his hands up in surrender when he was shot. The result is less trust in police, a situation that raises tensions across the board.

While there is no simple fix to race relations in any part of American life, there is an obvious way to reduce violent law enforcement confrontations while also building trust in cops: Police should be required to use wearable cameras and record their interactions with citizens. These cameras—various models are already on the market—are small and unobtrusive and include safeguards against subsequent manipulation of any recordings.

“Everyone behaves better when they’re on video,” Steve Ward, the president of Vievu, a company that makes wearable cameras for police, told ReasonTV earlier this year. Given that many departments already employ dashboard cameras in police cruisers, this would be a shift in degree, not kind.

“Dash cams only capture about 5% of what a cop does. And I wanted to catch 100% of what a cop does,” explains Ward, who speaks from experience. He used to be a Seattle police officer and his company’s slogan is “Made for cops by cops. Prove the truth.”

According to a year-long study of the Rialto, Calif., police department, the use of “officer worn cameras reduced the rate of use-of-force incidents by 59 percent” and “utilization of the cameras led to an 87.5 percent reduction in complaints” by citizens against cops.

Such results are the reason that the ACLU is in favor of “police body-mounted cameras,” as long as various privacy protections and other concerns are addressed. And it also explains growing support for the policy among elected officials. In the wake of Eric Garner’s chokehold death in July, New York City’s public advocate is pushing a $5 million pilot program in the city’s “most crime-plagued neighborhoods” as a means of restoring trust in the police.

Since 1991, when the beating of Rodney King by the Los Angeles Police Department was captured on tape by an amateur videographer, small, cheap recording devices have become a ubiquitous and effective means by which citizens are able to watch the watchers. In some cases, crowd-sourced footage exonerates the police, while in others it undermines the official narrative.

Over the same period, as the Washington Post’s Radley Balko has documented in Rise of the Warrior Cop, even small-town police departments have become “militarized” in terms of the training they receive and the hardware they carry. Even when the results aren’t tragic, they increase tensions between police and the people they serve and protect.

Mandating that cops wear cameras wouldn’t prevent every tragedy, but it would certainly make deaths like those of Brown, Garner and Crawford less likely. And in difficult cases, body cams would help provide crucial perspective that would build trust in law enforcement across the board.

TIME Media

Why I’m Actually Pretty Psyched for the New Sarah Palin Channel

It adds to the incredible variety of media sources but will flourish only if it actually contributes to ongoing conversations about news, politics, culture and ideas.


Former governor, vice-presidential candidate and reality-TV star Sarah Palin has started her own subscription-only web-based news channel. That’s good news for people who want to follow her – and for people who want to ignore her, too (she’ll be showing up far less often on cable news channels). “I want to talk directly to you on our channels, on my terms, and no need to please the powers that be,” she explains in a (free!) intro video.

Palin’s new project is the latest sign that we live in a world of gloriously fragmented media and culture, one that allows just about anyone to express themselves more fully than at any time in human history. That’s a great thing, even if it means trouble for long-established media companies and empowers conspiracy ranters such as Alex Jones.

Twenty years ago, just as the Internet was developing into a mass medium that catered to individuals’ unique tastes and interests in unprecedented ways, critics were foolishly flipping out about “media consolidation” and how a few companies such as AOL Time Warner would control all our news and information (as if!). Now, they are more likely to worry over the loss of a common news culture and the seeming ability of people to consume only self-confirming points of view. That may seem plausible on the face of things, but it’s equally wrong.

Palin is hardly a trailblazer in launching her own channel. Her ideological confrere Glenn Beck launched The Blaze network on the web in 2011. It spread to satellite TV a year later, and claims north of 300,000 subscribers paying $9.95 for full access to tons of print, video and audio content. Elsewhere on the political spectrum, pioneering blogger Andrew Sullivan sells access to The Dish (which touts itself as “biased and balanced”) for $1.99 a month and The Young Turks offer free, basic ($10) and premium ($25) access to a wide variety of text and video. RedState, The Daily Kos, Huffington Post, PJ Media and others all offer unlimited amounts of news, commentary and community for free. Everywhere you look, there are not just more ways to access the news, but more voices entering the marketplace of ideas.

The Sarah Palin Channel will flourish only if it brings something truly different and substantial to the table. The eponymous host promises her service will be “a community” and that she’s most excited about hearing directly from her audience. That’s a start (and a shift from old-style news broadcasting), but only time will tell whether that’s enough to keep folks shelling out $10 a month for the long haul.

What is clear is that even with the proliferation of news sources with distinct points of view, Americans are reading deeply and widely. Earlier this year, the American Press Institute released a study called “The Personal News Cycle: How Americans choose to get their news.” Among the key findings: 75% of us consume news every day and increasingly we pay attention throughout our waking hours, checking in across different platforms, media and sources.

Far from walling ourselves off in ideological gardens that tell us just what we want to hear, “the majority of Americans across generations now combine a mix of sources and technologies to get their news each week.” We go deep on stories that interest us, reading multiple accounts from multiple places to get more information—something that wasn’t possible back in the days of three broadcast channels and one or two hometown newspapers. Perhaps most interestingly, we apply a sliding scale of credibility based on sources, with 43% having high trust levels in reports from well-established news organizations, 21% from “word of mouth” ones, and even less from unsubstantiated social media sources.

So welcome to the 21st Century media world, Sarah Palin. New voices and platforms are always welcome, but it’s a jungle out here. You don’t have to “please the powers that be,” but you do have to bring real value to your readers and viewers – and that’s no walk in the park in the mediascape of endlessly fascinating and proliferating choices.

TIME foreign affairs

Malaysia Airlines Ukraine Crash: Tragedy Fuels the U.S. Intervention Machine

U.S. Sen. John McCain, R-Ariz., criticizes the Obama administration during a Jackson, Miss., runoff rally in support of Republican U.S. Sen. Thad Cochran at the Mississippi War Memorial in Jackson, Miss., June 23, 2014. Rogelio V. Solis—AP

Whatever happened in Ukrainian airspace doesn’t immediately or obviously involve the United States.

Apart from the probable cause of its destruction, we know almost nothing about Malaysia Airlines Flight MH17, which was “blown out of the sky” yesterday over eastern Ukraine, according to Vice President Joe Biden. President Obama confirmed today that one American was among the dead and that separatists with ties to Russia are allowing inspectors to search the wreckage area. In today’s press conference, Obama stressed the need to get real facts — as opposed to misinformed speculation — before deciding on next steps.

Yet even with little in the way of concrete knowledge — much less clear, direct ties to American lives and interests — what might be called the Great U.S. Intervention Machine is already kicking into high gear. This is unfortunate, to say the least.

After a decade-plus of disastrous wars in Afghanistan and Iraq that resulted in the deaths of hundreds of thousands of people (including almost 7,000 American soldiers) and constitutionally dubious and strategically vague interventions in places such as Libya, it is well past time for American politicians, policymakers, and voters to stage a national conversation about U.S. foreign policy. Instead, elected officials and their advisers are always looking for the next crisis over which to puff up their chests and beat war drums.

Which is one of the reasons why Gallup and others report that record-low numbers of people think the government is up to handling global challenges. Last fall, just 49 percent of Americans had a “great deal” or “fair amount” of trust and confidence in Washington’s ability to handle international problems. That’s down from a high of 83 percent in 2002, before the Iraq invasion.

In today’s comments, President Obama said that he currently doesn’t “see a U.S. military role beyond what we’ve already been doing in working with our NATO partners and some of the Baltic states.” Such caution is not only wise, it’s uncharacteristic for a commander-in-chief who tripled troop strength in Afghanistan (to absolutely no positive effect), added U.S. planes to NATO’s action on Libya without consulting Congress, and was just last year agitating to bomb Syria.

Despite his immediate comments, there’s no question that the downing of the Malaysian plane “will intensify pressure on President Obama to send military help,” observes Jim Warren in The Daily News. Russia expert Damon Wilson, who worked for both the Clinton and George W. Bush administrations, says that no matter what else we learn, it’s time to beef up “sanctions that bite, along with military assistance, including lethal military assistance to Ukraine.” “Whoever did it should pay full price,” Sen. Carl Levin (D-Mich.), the head of the Senate’s Armed Services Committee, says. “If it’s by a country, whether directly or indirectly, it could be considered an act of war.”

The immediate response of Arizona Sen. John McCain, the 2008 Republican presidential nominee, was to appear on Fox News’ Hannity and fulminate that America appears “weak” under the leadership of President Obama and to imply that’s why this sort of thing happens. If the Russian government run by Vladimir Putin or Russian separatists in Ukraine are in any way behind the crash — even “indirectly” — said McCain, there will be “incredible repercussions.”

Exactly what those repercussions might be is anybody’s guess, but McCain’s literal and figurative belligerence is both legendary and representative of a bipartisan Washington consensus that the United States is the world’s policeman. For virtually his entire time in office, McCain has been up for some sort of military response, from creating no-fly zones to strategic bombing runs to boots on the ground to supplying arms and training to insurgents wherever he may find them. He was a huge supporter not just of going into Afghanistan to chase down Osama bin Laden and the terrorists behind the 9/11 attacks but of staying in the “graveyard of empires” and trying to create a liberal Western-style democracy in Kabul and beyond.

Similarly, he pushed loudly not simply for toppling Saddam Hussein but also talked up America’s ability to nation-build, not just in Iraq but across the larger Middle East, sculpting the region into something approaching what we have in the United States. Over the past dozen-plus years, he has called for large and small interventions in the former Soviet state of Georgia, Libya, and Syria. He was ready to commit American soldiers to hunting down Boko Haram in Nigeria and to capturing African warlord Joseph Kony. In the 1990s, he wanted Bill Clinton to enter the Balkan civil wars early and often.

In all this, McCain resembles no other politician more than the presumptive Democratic presidential nominee, Hillary Clinton, whose hawkishness is undisputed. Like McCain, Clinton has long been an aggressive interventionist, both as a senator from New York and as secretary of state (where her famous attempt to “reset” relations with Russia failed spectacularly when it turned out that the “Reset” button she gave her Russian counterpart meant “overcharged” rather than the intended conciliatory term). In the wake of Flight MH17 being shot down, Clinton has already said that the act of violence is a sign that Russian leader Vladimir Putin “has gone too far and we are not going to stand idly by.”

For most Americans, the failed wars in Iraq and Afghanistan underscore the folly of unrestrained interventionism. So too do the attempts to arm rebels in Syria who may actually have ties to al Qaeda or other terrorist outfits. Barack Obama’s unilateral and constitutionally dubious deployment of American planes and then forces into Libya under NATO command turned tragic with the death of Amb. Chris Stevens and other Americans, and we still don’t really have any idea of what we were trying to accomplish there.

No one can doubt John McCain’s — or Hillary Clinton’s — patriotism and earnestness when it comes to foreign policy. But in the 21st century, America has little to show for its willingness to inject itself into all the corners of the globe. Neither do many of the nations that we have bombed and invaded and occupied.

Americans overwhelmingly support protecting the country from terrorism and stopping the spread of nuclear weapons. They recognize, however, that the U.S. cannot spread democracy or preserve human rights through militarism.

When the United States uses its unrivaled military power everywhere and all the time, we end up accomplishing far less than hawks desire. Being everywhere and threatening action all the time dissipates American power rather than concentrates it. Contra John McCain and Hillary Clinton, whatever happened in Ukrainian airspace doesn’t immediately or obviously involve the United States, even with the loss of an American citizen. The reflexive call for action is symptomatic of exactly what we need to stop doing, at least if we want to learn from the past dozen-plus years of our own failures.

President Obama is right to move cautiously regarding a U.S. response. He would be wiser still to use the last years of his presidency to begin the hard work of forging a foreign-policy consensus that all Americans can actually get behind, not just in this situation but in all the others we will surely encounter.

TIME politics

The Secret Language of Millennials

Concert crowd Getty Images/Vetta

Boomers just don’t understand what younger people are saying about politics and culture.

Fifty years ago, baby boomers and their parents suffered through what was ubiquitously understood as “the generation gap,” or the inability of different generations to speak clearly with one another.

A new national poll of Americans ages 18 to 29 — the millennial generation — provides strong evidence of a new generation gap, this time with the boomers (born from 1946 to 1964) playing the role of uncomprehending parents. When millennials say they are liberal, it means something very different than it did when Barack Obama was coming of age. When millennials say they are socialists, they’re not participating in ostalgie for the old German Democratic Republic. And their strong belief in economic fairness shouldn’t be confused with the attitudes of the Occupy movement.

The poll of millennials was conducted by the Reason Foundation (the nonprofit publisher of Reason.com, the website and video platform I edit) and the Rupe Foundation earlier this spring. It engaged nearly 2,400 representative 18-to-29-year-olds on a wide variety of topics.

This new generation gap certainly helps explain why millennials are far less partisan than folks 30 and older. Just 22% of millennials identify as Republican or Republican-leaning, compared with 40% of older voters. After splitting their votes for George W. Bush and Al Gore in 2000 (each candidate got about 48%), millennials have voted overwhelmingly for Democratic candidates in the 2004, 2008 and 2012 elections. Forty-three percent of millennials call themselves Democrats or lean that way. Yet that’s still a smaller percentage than it is for older Americans, 49% of whom are Democrats or lean Democrat. Most strikingly, 34% of millennials call themselves true independents, meaning they don’t lean toward either party. For older Americans, it’s just 10%.

Millennials use language differently than boomers and Gen X-ers (those born from 1965 to 1980). In the Reason-Rupe poll, about 62% of millennials call themselves liberal. By that, they mean they favor gay marriage and pot legalization, but those views hold little or no implication for their views on government spending. To millennials, being socially liberal is being liberal, period. For most older Americans, calling yourself a liberal means you want to increase the size, scope and spending of the government. (It may not even mean you support legal pot and marriage equality.) Despite the strong liberal tilt among millennials, 53% say they would support a candidate who was socially liberal and fiscally conservative. (Are you listening, major parties?)

There are other areas in which language doesn’t track neatly with boomer and Gen X definitions. Millennials have no firsthand memories of the Soviet Union or the Cold War. Forty-two percent say they prefer socialism as a means of organizing society, but only 16% can define the term properly as government ownership of the means of production. In fact, when asked whether they want an economy managed by the free market or by the government, 64% want the former and just 32% want the latter. Scratch a millennial “socialist” and you are likely to find a budding entrepreneur (55% say they want to start their own business someday). Although they support a government-provided social safety net, two-thirds of millennials agree that “government is usually inefficient and wasteful,” and they are highly skeptical of government when it comes to privacy and nanny-state regulations of e-cigarettes, soda sizes and the like.

For all the attention lavished on the youthful, anticapitalist Occupy movement a few years ago, it turns out that millennials have strongly positive attitudes toward free markets. (Just don’t call it capitalism.) Not surprisingly, they define fairness in a way that is less about income disparity and more about getting your due. Almost 6 in 10 believe you can get ahead with hard work, and a similar number want a society in which wealth is parceled out according to your achievement, not via the tax code or government redistribution of income. Even though 70% favor guaranteed health care, housing and income, millennials have no problem with unequal outcomes.

Like most older Americans, too, millennials are deeply worried about massive and growing federal budgets and debt, with 78% calling such things a major problem.

It would be a real shame if we can’t have the sorts of conversations we need to address and remedy such issues because different generations are talking past each other. Millennials are different from boomers or Gen X-ers: culture comes first and politics second for them. They are less partisan, and they are less hung up about things such as pot use, gay marriage and immigration. But in many ways, they agree with older generations when it comes to the value and legitimacy of work, the role of government in helping the poor and government’s inefficiency at doing so.

Everyone agrees that there are crises everywhere: Social Security and Medicare are going bust, and the economy has been on life support for years. The best solutions will engage and involve Americans of all ages. The Reason-Rupe poll points to some places where generations are talking past each other and others where there is wide agreement. Given its findings, a close read might just help narrow today’s generation gap so we can get on with improving all generations’ prospects.

TIME politics

Politics Make Us Petty—But Americans Actually Agree on More Than Ever

Karen Dimon walks to her car after voting at the Meadowthorpe precinct at Meadowthorpe Elementary School in Lexington, Ky., May 20, 2014. Lexington Herald-Leader—MCT/Getty Images

An increasing number of Americans are calling themselves independents, and there are huge and growing areas of consensus developing not just on once-controversial social issues but also on the proper role of government.

A new national survey of 10,000 Americans tells you what you probably already thought: “Republicans and Democrats are more divided along ideological lines — and partisan antipathy is deeper and more extensive — than at any point in the last two decades.”

Pollsters at Pew Research report that “92% of Republicans are to the right of the median Democrat, and 94% of Democrats are to the left of the median Republican.” And the percentage of people who hold “consistently” conservative or liberal opinions has doubled over the past two decades, to 21 percent.

Yet such rank partisanship and ideological extremism tell an incomplete and ultimately misleading story of contemporary America. Yes, those who identify themselves as members of Team Red or Team Blue are more at one another’s throats than ever (just check out C-SPAN if you don’t believe me). But an increasing number of Americans are calling themselves independents, and there are huge and growing areas of consensus developing not just on once-controversial social issues but also on the proper role of government in everyday life.

Gallup reports that in 2013, 42 percent of Americans identified themselves as politically independent, up 10 points from 1988. Over the same period, those willing to call themselves Democrats dropped from 36 percent to 31 percent and those calling themselves Republican fell from 30 percent to 25 percent. While it’s true that self-described independents often lean toward either the Democrats or Republicans, the number of “pure” independents has been growing for more than a decade and stands above 10 percent.

And there’s no question that people are leaving the major parties in droves. Between the 2008 and 2012 elections, USA Today reports, more than 2.5 million voters left the Democrats and the Republicans. “Registered Democrats declined in 25 of the 28 states that register voters by party,” according to USA Today’s tally. “Republicans dipped in 21 states, while independents increased in 18 states.” As politics gets more viciously partisan, more Americans are saying no thanks.

Then there are the areas in which consensus already exists or is growing rapidly. As political scientist Morris Fiorina explains in his book Culture War? The Myth of a Polarized America, Americans actually generally agree on many topics that inflame political partisans. Consider abortion, gay marriage, gun control, and pot legalization. Research from Pew itself shows only “modest generational differences in views of abortion and gun control.” Fifty-five percent of Americans now support same-sex marriage (up from just 42 percent in 2004) and 58 percent support legalizing pot (up from 34 percent a decade ago). When it comes to Congress, few topics seem to engender more rage than immigration, but it turns out that 71 percent of voters — including 64 percent of Republicans — support comprehensive immigration reform.

When it comes to larger questions of the role of government in everyday life, for the past four years about 55 percent of Americans have believed that “government is doing too much” and only 38 percent believe it should be doing more. That generally skeptical view of government is borne out in the record-high share of people — a whopping 72 percent — who agree that government poses a bigger threat to our future than big business (21 percent) or big labor (5 percent).

Fiorina’s Culture War? helps to explain how politics can be getting more polarized even as Americans seem to agree on many, maybe most, big issues. “The answer,” he writes, “is that while voter positions have not polarized, their choices have.” The ways that the Democratic and Republican parties select their representatives and build their platforms are more fully in the hands of partisans who push more extreme candidates and policies. That means the typical voter is faced not just with the lesser of two evils but with two major-party choices that don’t really represent her beliefs.

Partisan and ideological polarization is a sour note in an America that is increasingly singing in harmony about things such as immigration, the drug war, marriage equality, and more. No wonder, then, that more and more of us refuse to say we’re Republican or Democrat, or to trust Washington, D.C. — that hotbed of the worst sort of to-the-death politics — with our lives and our futures.

TIME Media

Don’t Accept Jonah Hill’s Apology!

Jonah Hill during an interview with host Jimmy Fallon on June 3, 2014. NBC—NBCU Photo Bank via Getty Images

And don’t take celebrities seriously in the first place.

Jonah Hill, the Oscar-nominated actor, just did two things for which celebrities are famous. First, he called a paparazzo a homophobic slur (“Suck my dick, you faggot!” as captured by TMZ). Second, he appeared on The Tonight Show with Jimmy Fallon to apologize for what he’d said.

Hill says he’s always been in favor of equality for gays, so his “heart’s broken” that he said what he said. He’s also got a big movie coming out next week (22 Jump Street, a sequel to the very funny and very successful 21 Jump Street), which surely adds to his case of the sads and helps explain the urgency with which he’s trying to create a teachable moment from his unseemly outburst.

Addressing not simply The Tonight Show’s studio audience but all humanity, Hill counseled: “If someone says something that hurt you or angers you, use me as an example of what not to do. Don’t respond with hatred or anger, because you are just adding more ugliness to the world.”

If only he’d taken that advice before filming Get Him to the Greek.

I assume Hill’s act of contrition is sincere, but who cares about his hard-won, PR-friendly insight into the human condition? Why should we take his apology any more seriously than we take his ugly outburst?

That goes for any celebrity, whether it’s Alec Baldwin, whose chronic and highly entertaining rage-aholism gushes more regularly and spectacularly than Old Faithful, or Gwyneth Paltrow, with her natterings about how being famous is like surviving war and her paradigm-shattering discovery that emotional “negativity changes the structure of water.”

Celebrities don’t seem to understand their role in the modern world. They are not here to educate us or to make us aspire to a higher level of consciousness or dream of a world where sharks are treated with kindness rather than fear.

They’re here to entertain us, both on- and offscreen. That’s what they get paid so well for, and they can stop apologizing for it. As economist Tyler Cowen wrote in What Price Fame?, today’s renown is nothing like its premodern antecedent. Centuries ago, fame stemmed from martial prowess and involved brutish, dictatorial leaders forcing subjects to worship them as demigods and to mindlessly follow their directions.

With the exception of a handful of former Soviet republics and certain precincts of Hollywood, that’s thankfully all behind us. In modern societies, we get to make our own decisions about work, love and the meaning of life. “Contemporary stars are well-paid but impotent puppets,” wrote Cowen; they “serve their fans rather than making their fans serve them.”

Celebs like Hill, Justin Bieber (recently caught hurling racial epithets), James Franco (who called a theater critic a “little bitch”) and others should learn to take themselves as seriously as their fans do. Which is not very.

That’s not to say celebrities aren’t occasionally capable of genuine insights into the stuff of life. Back in 1993, the chronically controversial basketball great Charles Barkley created a scandal that was unrelated to his various drunken-driving incidents and episodes of unsportsmanlike conduct. The Round Mound of Rebound got in trouble for stating a self-evident truth. “I am not a role model,” he announced in a Nike ad.

That wasn’t his job, he explained. He was just a guy who was good at putting a ball through a hoop. Barkley also never hesitated to point to his own role models: “My mother and grandmother were two of the hardest-working ladies in the world, and they raised me to work hard.”

Truer words were never spoken by a celebrity. Or forgotten as quickly.

TIME health

You Say Potato, Mrs. Obama. I Say, Please Stop Micromanaging Our Diets and Our Schools

If you’ve ever wondered just where the role of government ends and where the ability of adults to choose things for themselves and their children begins, don’t bother. The answer, at least according to First Lady Michelle Obama, is nowhere.

Marching under the banner of Science with a capital S, Obama believes the federal government should be able to tell you what to eat. Or, more precisely, not eat. At least if you’re poor enough to be on relief or if you’re remanded to the custody of a K-12 public school.

Writing in the New York Times, Obama warns that “right now, the House of Representatives is considering a bill to override science by mandating that white potatoes be included on the list of foods that women can purchase using WIC dollars.”

Don’t get the wrong idea, though. Obama agrees that “there is nothing wrong with potatoes.” It’s just that according to the Institute of Medicine (a.k.a. “science”), the “low-income women and their babies and toddlers” served by the WIC program would be better off if they chowed down on “nutrient-dense fruits and vegetables.”

When it comes to schoolkids, Obama is just as emphatic that decisions are best made in Washington, rather than in the countless cafeterias of the nation’s 100,000 public schools. Some House members, she writes, “want to make it optional, not mandatory, for schools to serve fruits and vegetables to our kids. They also want to allow more sodium and fewer whole grains than recommended into school lunches.”

The First Lady believes that the various programs she’s championed over the past few years (like Let’s Move!, which hectors kids to exercise) are producing “glimmers of progress” in the War on Fat People, especially among children ages 2 to 5. The fact is, however, that there is no clear link between any of the programs she promotes and the trends she applauds.

According to a new Centers for Disease Control study, the obesity rate among kids that age is 8%, down from 14% in 2003. That’s all well and good, but the authors caution that one year doesn’t make a trend, especially since that group makes up “a tiny fraction” of the population. Indeed, the same report also notes that obesity rates among Americans 19 years old and younger had already stopped climbing by 2003 and have been flat ever since, at about 17%. Other accounts suggest that youth obesity rates peaked even earlier, in 1999. Over the same general time frame, adult obesity rates have stayed steady, at around 30%. This all came after a tripling of rates between the 1970s and 1990s.

Obama is welcome to take credit for a general flattening of trends that began years before her husband became President. However, when she starts urging the federal government to limit individual choices and centralize control in Washington, attention should be paid. “As parents, we always put our children’s interests first,” she writes. “We wake up every morning and go to bed every night worrying about their well-being and their futures.”

If she really believes that, then why not treat poor people with the same respect that we treat middle-class and upper-middle-class folks? If we’re going to supplement their incomes, why not give them a cash payment and let them figure out how to make the best use of it?

Similarly, if we can’t trust our schools to figure out how best to fill their students’ stomachs, why the hell are we forcing our children to attend such institutions in the first place? When is the last time you heard kids who attend schools of choice—whether private, religious or public charters (which enroll disproportionately high numbers of low-income students)—even mention food?

During the debate over Obamacare’s individual mandate, we had a fiery national conversation over whether the government could force you to buy broccoli. But even when the Supreme Court effectively said it could, nobody believed it could make you eat the stuff. That debate, it seems, took place in a simpler time.

TIME politics

Where the War on Pot Will Go to Die

Some pot and a pipe from one of the first medical marijuana stores in California. The Washington Post—The Washington Post/Getty Images

In some states, there's an untenable mismatch between the crime and the time, but does anyone think that pot—medical or recreational—will still be illegal in 10 years?

Now that a majority of Americans—54% and climbing, according to Pew Research—believe that marijuana should be treated like beer, wine and liquor, it’s time to ask: Where does the war on pot go to die?

What episode will trigger that final skirmish that kicks over the hollowed-out edifice of marijuana prohibition like the Berlin Wall? What will be the final outrage against common sense and common decency that triggers an Arab Spring for weed in these United States? Twenty-one states and the District of Columbia already have medical marijuana (with more to come), and support for full legalization has gained 13 percentage points in just the past five years.

Ironically, whatever ends the war on pot won’t happen in Colorado or Washington, which have already legalized recreational pot and have received vague promises from Attorney General Eric Holder that the feds won’t bust people and businesses who comply with state laws. Colorado is further along in the retail process than Washington (where pot shops won’t open until mid-July), and so far the only problem of note is that the state is raking in 40% more tax revenue than originally projected.

Look instead to places such as Round Rock, Texas, where 19-year-old Jacob Lavoro faces a sentence of five to 99 years for allegedly selling a 1.5-lb. slab of hash brownies. Under state law, selling up to five pounds of plain old pot is punishable by no more than two years in the clink and a $10,000 fine. But hash, a concentrated form of pot, is considered a controlled substance, and even the tiny amount in Lavoro’s brownies qualifies him for what amounts to a potential life sentence. Through a convoluted rationale, you see, the law can count all the brownie ingredients—the eggs, butter, flour, cocoa—as hash.

Oh well, everything’s bigger in Texas, including the unconscionable mismatch between the crime and the time. If he were only a couple of states away, Lavoro wouldn’t be facing jail, he’d be a successful entrepreneur. That sort of mind-blowing disjuncture is exactly the sort of thing that takes the fight out of the war on pot.

Or look to recent comments made by FBI director James Comey, who admitted that he can’t hire the 2,000 cybercrime fighters the bureau needs to protect America because of workplace drug tests. “I have to hire a great workforce to compete with those cybercriminals, and some of those kids want to smoke weed on the way to the interview,” Comey said. He was upbraided by Senator Jeff Sessions (R., Ala.) for providing yet “one more example of leadership in America dismissing the seriousness of marijuana use.” Whatever you can say about Comey, he’s in good company in acknowledging the ubiquity of pot smoking in today’s America. According to the latest government data, 43% of Americans—including the three most recent Presidents—have tried pot at least once. And when asked whether alcohol or marijuana is more harmful to society, fully 63% say booze and just 23% say pot. How much longer can the Jeff Sessionses of the world hold back the tide of public opinion?

And, finally, look to California, which passed the nation’s first medical-marijuana ballot initiative way back in 1996 and saw 46.5% vote in favor of recreational pot in a 2010 proposition. In 2011, federal agents raided the business of dispensary owner and medical grower Aaron Sandusky. This came after repeated promises by the Obama Administration that it wouldn’t go after medical-pot providers who were operating within state law — and even though officials from the city of Upland, which had tipped off the feds, later admitted in court that Sandusky was operating properly within state law.

Sandusky refused on principle to cop a plea because he thought he was in the right. Tried in federal court, he was unable to offer a defense based on California state law. Sandusky ended up pulling a 10-year sentence. In March of this year, he lost his final appeal. If he’s lucky and stays on good behavior, he’ll be out in 2021. Does anyone think that pot—medical or recreational—will still be illegal by then?

As it happens, Sandusky is doing time in Texas’ Big Spring Federal Correctional Institution, which is only a four-hour drive from Jacob Lavoro’s hometown of Round Rock. As Lavoro ponders whatever deal prosecutors might offer him, he’d be smart to visit Sandusky and ask what life behind bars is like. Because while the war on pot is surely in its final stage, there will still be plenty of casualties before peace is declared.

