TIME Culture

Pot Is the New Normal

Demand for marijuana edibles is pushing several Colorado manufacturers to expand their facilities or move to larger quarters.
Steve Herin, Master Grower at Incredibles, works on repotting marijuana plants in the grow facility on Wednesday, August 13, 2014 in Denver, Colorado. Kent Nishimura—Denver Post via Getty Images

Nick Gillespie is the editor in chief of Reason.com and Reason.tv.

Face it: marijuana is legal, crime is down, traffic fatalities are declining and fewer teens are lighting up

If you want to know just how crazy marijuana makes some people, look no further than the race for governor of Colorado, where Democratic incumbent John Hickenlooper is neck and neck with Republican challenger Bob Beauprez. They’re high-profile examples of a growing backlash against pot, even as none of the scare stories about legal weed are coming true. Drug-addled addicts embarking on crime sprees? Not in Denver. Stupefied teens flunking tests in record numbers? Uh-uh. Highway fatalities soaring? Nope.

About the worst you can say so far is that New York Times columnist Maureen Dowd wigged out while high. But she does that from time to time when she’s sober as a judge, too.

Neither Hickenlooper nor Beauprez has cracked 50% with voters, which makes sense since neither candidate can stomach the fact that 55% of Coloradans voted to legalize recreational pot in 2012. “I’ll say it was reckless” to legalize pot, averred Hickenlooper at a recent debate. Beauprez goes further still. When asked if it’s time to recriminalize marijuana, he said, “Yes, I think we’re at that point…where the consequences that we’ve already discovered from this may be far greater than the liberty…citizens thought they were embracing.”

In fact, sales and tax revenues from legal pot continue to climb and more people now buy recreational pot than medical marijuana, even though the former is taxed at much higher rates. Pot has kicked about $45 million into tax coffers since it became legal this year and is projected to come in between $60 million and $70 million by year’s end. Murders in the Denver area, where most pot sales take place, are down 42% (so is violent crime overall, though at a lower rate) and property crime is down 11.5%.

There’s more bad news for alarmists: Pot use by teenagers in Colorado declined between 2001, when the state legalized medical marijuana, and 2013, the last full year for which data are available. When medical marijuana was introduced, critics worried that any form of legalized pot would increase usage among kids, but the reverse happened. It remains to be seen if that trend continues in the face of legal recreational pot, but Colorado teens already use dope at lower rates than the national average. So much for the Rocky Mountain High state.

Yet Colorado pols are in good company in harshing on legal weed. The recovering addict and former congressman Patrick Kennedy heads up SAM (which stands for “Smart Approaches to Marijuana”) and categorically argues, “we cannot promote a comprehensive system of mental health treatment and marijuana legalization.”

Researchers who find that regular marijuana use among teenagers correlates with mental problems, academic failure and other bad outcomes get plenty of ink, even though such studies fail to show causation. Underperforming students and kids with problems abuse alcohol and smoke cigarettes at higher rates, after all. In any case, even advocates of legalization argue that teens shouldn’t be smoking pot any more than they should be drinking. Given its pariah status for decades, it’s not surprising that the science is both unsettled and highly politicized.

Will legalizing pot increase access to a drug law enforcement officials concede has long been readily available to high schoolers? “Criminalizing cannabis for adults has little if any impact on reducing teens’ access or consumption of the plant,” argues the pro-legalization group NORML, a claim supported by declining teen use rates during Colorado’s experience with medical marijuana. Certainly, pot merchants who are registered with and regulated by the state are more likely to check IDs than your friendly neighborhood black-market dealer.

At least this much seems certain: In a world where adults can openly buy real pot, you’re also less likely to read stories headlined “More People Hospitalized by Bad Batch of Synthetic Marijuana.” And support for legalization isn’t fading. The market research firm Civic Science finds that 58% of Americans support laws that “would legalize, tax, and regulate marijuana like alcohol.”

That figure obviously doesn’t include either candidate for governor of Colorado. But just like the rest of the country, whoever wins that race will have to learn to live with pot being legal, crime being down, traffic fatalities declining and fewer teens lighting up.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Media

Let’s Give America’s Royal Baby a Time Out

Former President Bill Clinton, right, and former Secretary of State Hillary Rodham Clinton, second from right, wave to the media as Marc Mezvinsky and Chelsea Clinton pose for photographers with their newborn baby, Charlotte, after the family leaves Manhattan's Lenox Hill hospital in New York City on Sept. 29, 2014. William Regan—AP


Is there anything more distasteful than the obviously strategic use of babies by the rich and powerful to gild their images?

As the father of two, I can personally attest to the power of babies and toddlers to melt the coldest heart (mine) and coax a smile from the stoniest of faces (also mine). Well, my kids’ power, anyway. They were (and are) supernaturally, objectively, transcendently beautiful creatures. About yours, I couldn’t honestly say (though I kind of doubt it).

But really, is there anything more distasteful than the obviously strategic use of babies by the rich and powerful to gild their images—and the media’s feckless complicity in the spectacle? Whether it’s the British royal family constantly pushing the toddler Prince George toward the camera or breathless reports of Hillary Clinton’s newfound “grandmother glow,” can we just change its diaper, give it a pacifier, and put it to sleep already?

It’s easy to understand why Brits might take an interest in George and his parents. Not only will the 14-month-old one day rule over them, they’ve really got nothing better to do. Doctor Who is only on so many hours a day, after all. Watching the boy-king pad around on his hands and knees is a welcome diversion from contemplating a century-long slide from world domination, his father Prince William’s advancing baldness, and his grandfather Prince Charles’ continuing existence. And now that the Duchess of Cambridge Kate Middleton has taken care not to flash the commoners anymore, watching George hold court at “low-key tea parties” is about as diverting as it’s going to get in old Blighty.

But why do Americans care about this kid (or, same thing, why does the press assume we care)? There’s nothing more genuinely antithetical to American values than the wealth, titles, and leisure of the British royal family. If memory serves, we even fought a war over it. Inherited privilege brings with it an empire-sized sense of entitlement, which is on brilliant display in William and Kate’s new legal action against photographers who they claim are “stalking” the very baby they trot out daily like an exotic monkey.

William and Kate, a royal spokesman explained to CNN, want their son to have “as normal a childhood as possible” and demand that the press not publish unauthorized shots of George. Please. If you want the kid to have anything approaching a normal childhood, put him up for adoption or have him work his way up from the royal stables (sort of like how the Duchess of Cornwall did). Even America’s raggedy version of royalty—that would be the Kardashians or maybe the Duggars—understands that unearned wealth comes at the cost of your privacy and control over your image.

Controlling your image, of course, is something that Hillary Clinton knows a thing or two about. The former first lady, senator, secretary of state, and presumptive 2016 presidential candidate is a master of adaptation and continued success. Indeed, it’s tempting to say at this point that her husband is slowly being revealed as a bit player in her personal and professional epic, rather than vice versa.

During Bill Clinton’s presidential years, Hillary Clinton readily morphed from a feminist icon who would serve as co-president to a long-suffering, stand-by-your-man, cookie-baking spouse and back again. As a senator from New York, she earned high praise for pragmatism, coalition building, and bipartisan binge-drinking. Despite a patently disastrous term as secretary of state (exemplified by the violent death of the American ambassador to Libya and the failure to “reset” relations with Russia), she’s nimbly laid all the blame for U.S. foreign policy in President Obama’s lap and has emerged as an unreconstructed war hawk at a moment when Americans are calling for blood again.

If Clinton has had one blind spot in her image, it’s that she’s often perceived as less than fully human. If Bill felt our pain, Hillary either kind of enjoyed it or couldn’t be bothered to deal with it. Now that daughter Chelsea—whose undistinguished professional life is reminiscent of British royalty—has produced baby Charlotte, that’s all changing.

Hillary is missing no opportunity to publicly play at grandmother, a role that can only soften and round out her image as the presidential campaign season swings into high gear. “I highly recommend it,” she told CBS News about becoming a grandparent. At a recent speech to a group of women real estate agents, a member of the audience told Clinton that she looked “beautiful.” To which Clinton responded, “I think it’s a grandmother’s glow.”

Or maybe it’s the fire of political ambition lighting up her cheeks. As far back as June, she was systematically linking her grandchild to world events, telling People, “I’m about to become a grandmother… I want to live in the moment. At the same time I am concerned about what I see happening in the country and in the world.”

OK, we get it. The kid is a prop in a political play. The baby doesn’t just soften you up, Mrs. Clinton, it softens us up, too. Which may actually be excellent public relations but is also deeply disturbing.


TIME politics

Eric Holder’s Legacy: Duplicity, Incompetence, and Obliviousness

Attorney General Addresses Ferguson Police Shooting, Day After Visiting The City
Attorney General Eric Holder makes a statement during a major financial fraud announcement press conference on Aug. 21, 2014 at the Justice Department in Washington, DC. Alex Wong—Getty Images


Holder exists to protect the president and his policies. Worse, his successor will almost certainly take up exactly where he leaves off.

So Eric Holder is stepping down as attorney general of the United States, reportedly just as soon as a successor is named and confirmed.

It’s a shame that it can’t happen sooner.

Despite some positive actions — refusing to enforce the Defense of Marriage Act, a federal law that is plainly discriminatory, and calling for long-overdue sentencing reform, for instance — Holder’s tenure has been marked by a disturbing mix of duplicity, incompetence, and obliviousness.

Which is another way of saying that he was a thoroughly typical attorney general, a cabinet position that has long been held by individuals whose first loyalty is to the president that appointed them rather than to the Constitution they swear to defend.

From A. Mitchell Palmer (who rounded up and deported real and imagined Communists) to John Mitchell (convicted on perjury charges related to Watergate) to Janet Reno (who ordered the disastrous assault on the Branch Davidians and spent years threatening to censor cable TV), the position has long been a holding tank for low-performing miscreants.

Early in his tenure, Holder told Congress that federal agents wouldn’t raid and arrest proprietors of medical marijuana dispensaries that were complying with state laws (all pot is illegal under federal law). Yet through 2013, the Obama administration was averaging 36 medical marijuana prosecutions a year, compared to 20 a year for the George W. Bush administration. Either Holder was directing the Department of Justice and lying to Congress, or he was an administrator whose subordinates routinely disobeyed him. Neither possibility is comforting.

In 2012, he was held in contempt by a bipartisan vote in the House of Representatives for refusing to testify about the “Fast and Furious” scandal emanating from the Phoenix office of the Bureau of Alcohol, Tobacco, Firearms, and Explosives (BATF). Fast and Furious involved government agents allowing illegal sales of guns that were later found at the scene of the murder of a Border Patrol agent. The BATF is part of the Department of Justice and Holder has given conflicting statements about the operation.

Holder managed to earn the ire of progressive politicians such as Sen. Elizabeth Warren (D-Mass.) and Sen. Bernie Sanders (I-Vt.) when he admitted that some Wall Street banks were not only too big to fail but too big to jail. The sheer size of some institutions, he told Congress, “has an inhibiting influence — impact on our ability to bring resolutions that I think would be more appropriate.”

Arguably more disturbing was Holder’s central role in signing off on the secret monitoring of Fox News’ James Rosen and other journalists and his staunch defense of National Security Agency surveillance programs (even when federal oversight boards decreed them unconstitutional and ineffective). It took a 13-hour filibuster by Sen. Rand Paul (R-Ky.) to get Holder to acknowledge in plain language that there were in fact limits to the president’s secret kill list (the existence of which is itself deeply disturbing).

That Holder has moderated on some of these issues — just a couple of weeks ago, Holder voiced support for NSA reforms that would “provide the public greater confidence in our programs and the checks and balances in the system” — only drives home just how situational his ethics and actions always have been as attorney general.

Back in 2007, then-Sen. Barack Obama rightly attacked the risible performance of Attorney General Alberto Gonzales, whom he said conceived of his job as being “the president’s attorney” rather than “the people’s attorney.”

Yet that’s exactly how Eric Holder has behaved during his time in office. Holder exists to protect the president and his policies. Worse still, his successor will almost certainly take up exactly where he leaves off.


TIME technology

iOS 8: The Operating System That Would Be King

Apple Inc.'s iPhone 6 and iPhone 6 Plus Go On Sale
An Apple Inc. iPhone 6 stands on display at the company's Causeway Bay store during the sales launch of the iPhone 6 and iPhone 6 Plus in Hong Kong, China, on Friday, Sept. 19, 2014. Bloomberg—Getty Images


No company, even one as worshiped by its fans as Apple, is ever more than a couple of flops away from being cast into the furnace of hell.

Why are we so obsessed with the release of Apple’s new mobile operating system, iOS 8? The election of a new Pope barely generated as much anticipation and coverage. Sure, Apple is touting it as the “biggest iOS release ever,” and if there’s anything Steve Jobs’ heirs know better than sleek design and high profit margins, it’s how to hype something P.T. Barnum–style.

But we’re right to care. Our gadgets—phones, tablets, PCs, wrist monitors, you name it—are nothing less than the magic that we use to generate the illusion (and sometimes the reality) that we can actually control our lives. “Any sufficiently advanced technology is indistinguishable from magic,” quipped the science-fiction legend Arthur C. Clarke, whose dark vision of a supercomputer that bends humankind to its will animated 2001: A Space Odyssey. A similar question haunts us, especially whenever our OS fails and we find unexpected, unscheduled, un-busy time on our hands: Are we running our machines or are they running us?

Your smartphone not only lets you talk with anyone in the world wherever you are, it also acts as a portable GPS that lets you find the most off-the-beaten-track address (unless, that is, you’re stuck using the original version of Apple Maps). What was once science fiction in Dick Tracy comics—video calls—is now known as FaceTime or Skype. “Wearables” like Fitbit allow us to know how well we slept, how many steps we’ve taken in a day and how many calories we need to burn before eating another doughnut.

In the new world of technology über alles, the operating system is the god of the machine. If it functions well, all is smooth sailing in our lives and we can pretend to be wizards and witches from storybooks, able to traverse time and space and make miracles happen. We can arrive on time via uncongested routes and we can bank from the coffee shop and invest while riding the subway. We can set the home alarm long after we’ve left for work, track the kids’ homework and use a flight delay as a way to catch up with our friends, lovers and co-workers.

As we voluntarily become cyborgs and wire ourselves up and express ourselves instantaneously through an ever growing array of social media, we need a perfect operating system that allows us to multitask with ever greater flexibility and ever greater ease across computers, phones, tablets and more. Who has time to reboot any of our peripheral devices? Indeed, what machines even count as peripheral anymore? They are all central to us, and more so with each passing day.

In its heart of hearts, Apple knows just how high the stakes are. No company, even one as worshiped by its fans as Apple, is ever more than a couple of flops away from being cast into the furnace of hell. Its previous mobile operating system, iOS 7, was a disappointment, if not an epic fail like Windows 8 or Vista, to name two of Microsoft’s ill-conceived and poorly executed operating systems. iOS 8 must be better, smoother, faster, or else even true believers may revolt and slay their god. (Alas, the early reviews for iOS 8 are not heartening.)

In Rudyard Kipling’s The Man Who Would Be King, a common Englishman is mistaken for a god by natives. When they realize he is not the divine entity they took him for, they cut off his head with the crown still on it. That is the irony that Apple seeks to avoid: Nobody causes more disappointment than a god that fails.

Gillespie is the editor in chief of Reason.com and Reason.tv and the co-author with Matt Welch of The Declaration of Independents: How Libertarian Politics Can Fix What’s Wrong with America.


TIME Education

Welcome to College—Now Please Stop Thinking

Getty Images

A scandal at Canada’s University of Western Ontario shows just how censored higher education has become

When freshmen first arrived at Canada’s University of Western Ontario a few weeks ago, they were introduced not to cutting-edge research or “the best which has been thought and said” (in Matthew Arnold’s magisterial phrasing), but to a brazen, petty, and all-too-common act of censorship that infantilizes young adults even as it chills free speech and open communication among students and faculty alike.

Welcome to college, kids! Now stop thinking. And for god’s sake, don’t make jokes, talk freely, or even compliment your fellow students.

A student publication at the university, The Gazette, published an irreverent special issue for incoming freshmen. Among the articles was a clearly satirical piece titled, “So you want to date a teaching assistant?” It included such tips as, “Do your research. Facebook stalk and get to know your TA. Drop in on his or her tutorials, and if you’re not in that class — make it happen…. Ask your own smart questions, answer others’ dumb questions, and make yourself known in the class. Better yet, stand out as a pupil of interest.”

If any hard-of-humor students didn’t understand the ironic nature of the advice, there was this: “Know when to give up. At the end of the day, TAs are there to guide you through the curriculum – so there’s a good chance you have to be okay with that and only that. They may not be giving you head, but at least they’re giving you brain.”

The piece immediately set off “a furor,” with the union representing TAs calling for the piece to be taken down for promoting sexual harassment and the university provost publicly castigating the paper for being “disrespectful.” The offending material was quickly pulled from the paper’s website and the editors wrote a groveling, ritualistic apology, promising to report “on these issues in a more serious manner in the future.”

This episode represents what pedagogues like to call a “teachable moment,” but the lesson being learned has nothing to do with the higher-level thinking or analysis you’re supposed to learn at college. It has to do with straitjacketing students (and faculty, too) into a rigid, narrow, and altogether inhuman mode of expression in which the overriding principle is to never give offense, real or imagined.

The Western Ontario case might have happened anywhere. Indeed, to get a sense of how thin-skinned colleges have become, check out the long and always-growing case list of The Foundation for Individual Rights in Education (FIRE), which pushes for freedom of expression on campuses.

One of FIRE’s recent cases involved a female University of Oregon student who was initially disciplined for yelling the sexual innuendo “I hit it first!” at a couple she didn’t know (the school backed down after FIRE intervened). Not all cases involve sexually suggestive language: FIRE is also suing a number of schools for unconstitutionally restricting specifically political speech. Among the cases: California’s Citrus College threatened to remove a student who was gathering signatures on a petition critical of the National Security Agency’s surveillance of Americans.

Why are we treating the next generation of leaders, entrepreneurs, and citizens as hot-house flowers that cannot for one second be discomfited by what they see, hear, or read? Isn’t one of the main reasons to go to college precisely to be pulled out of the world in which you grew up? It is not particularly difficult to espouse free expression for all without endorsing everything that gets said in the marketplace of ideas. It’s exactly in the conversations among those with whom we disagree that old ideas get made better and new ideas flourish. But suppression of speech, whether done by the medieval Church, anti-sex crusaders in the 19th century, or contemporary campus commissars, leads nowhere good.

Yet last year saw the mainstreaming of so-called microaggressions, or “quiet, unintended slights” that perpetuate racism, sexism, and classism. According to popularizers of the concept, microaggressions often masquerade as compliments, such as when a man tells a woman she did well in math. Churlish, yes, but actionable speech?

The same sort of hyper-sensitivity is apparent with the rise of “trigger warnings,” in which professors are asked or mandated to give advance notice when engaging course materials that might offend students who have experienced traumas in the past. As a student at Rutgers put it, undergraduates shouldn’t be forced to encounter The Great Gatsby without first being told that the novel “possesses a variety of scenes that reference gory, abusive and misogynistic violence.” Suggested language for professors in a trigger-warning guide at Oberlin runs like this: “We are reading this work in spite of the author’s racist frameworks because his work was foundational to establishing the field of anthropology.”

We’re told that college is an absolute necessity in today’s advanced society. Higher education alone can cultivate the critical thinking skills and independence of thought that drive not just economic innovation but social progress too. Yet over the past 30 or so years, college has become an irony-free zone, one in which every utterance is subjected to withering cross-examinations for any possibility of offense across a multitude of race, class, gender and other dimensions.

As the Western Ontario case demonstrates, when offense is taken, open discussion and debate is no longer the preferred method for dealing with disagreements. No, the bad words must be disappeared and the malefactors forced not simply to apologize but to admit their errors in thinking and promise not to do it again. That’s the way a cult operates, not a culture. And it’s certainly no way to help young adults learn how to engage the world that awaits them after graduation.

TIME Media

Hacked Celebrity Pics Should Not Be an Excuse To Throttle Our Free and Open Internet

Christian Dior : Outside Arrivals - Paris Fashion Week : Haute-Couture Fall/Winter 2014-2015
Actress Jennifer Lawrence attends the Christian Dior show as part of Paris Fashion Week - Haute Couture Fall/Winter 2014-2015. Rindoff/Dufour

For better or worse, the Internet is the greatest free-speech forum ever imagined

In the wake of the nude-picture-hacking scandal involving images of Jennifer Lawrence, Kate Upton, and dozens of other mostly (but not exclusively) female celebrities, calls to shut down or legally punish the sites at which they were posted—such as Reddit and 4chan—are flying fast and furious. So are calls to increase the scope and penalties for “involuntary porn” and “revenge porn,” in which intimate photos and videos are shared without the consent of all involved parties.

Such reactions are as understandable as they are ultimately misguided. There’s something deeply disturbing about people’s most intimate information being hacked and distributed across the globe. But most remedies threaten not bad behavior as much as the very openness of expression the Internet makes possible.

It’s already a criminal act to hack into private online accounts, so it’s not exactly clear how new laws will change bad actors’ behavior. Under the best of circumstances, it’s notoriously difficult to prove exactly who uploaded what where, and the types of people who are likely to commit such acts tend to have an overriding disregard not just for common decency but for legal sanctions. Indeed, the hacker believed to be responsible for the posting of the celebrity nudes is reportedly both on the run from the FBI and still threatening to release yet more photos. Similarly, attempts to shut down the so-called Darknet, on which illegal drugs and other illicit goods and services are traded, have proven ineffective. Last year, federal agents arrested the alleged mastermind of the biggest such site, Silk Road, only to see Darknet activity increase by nearly 60% since then.

Under current federal law, Internet Service Providers (ISPs) and websites enjoy broad legal immunity from the actions of people who use online services. That’s as it should be and the main reason the Internet evolved into the greatest free-speech forum ever imagined. Yet recent laws designed to criminalize revenge porn effectively nullify such protections.

Earlier this year, for instance, Virginia passed a law that makes it illegal for “any person…with the intent to coerce, harass, or intimidate” to “disseminate or sell” images of someone “in a state of undress” where “such person knows or has reason to know that he is not licensed or authorized” to disseminate. Violations are Class 1 misdemeanors and carry monetary fines and up to a year in prison. The first case brought under the new law was filed in July and the defendant is currently out on bond. Members of Congress such as Rep. Jackie Speier (D-Calif.) are pushing federal versions of such laws, which would strip ISPs and websites of their immunity.

The problem with such legislation is that it doesn’t just criminalize the posting of images whose meanings and intentions are rarely as clear-cut as prosecutors want to believe. It also has the potential to massively chill free speech by scaring ISPs and websites into not only pulling down totally legal material when faced with any sort of complaint, but also proactively policing free expression. Individuals, too, will feel the chill as they wonder exactly what sort of material may land them in court.

As Lee Rowland of the ACLU told one of my colleagues at Reason TV earlier this year, “Criminal law is such a blunt instrument that we have real doubts that it’s possible to draft these laws in a way that won’t end up criminalizing pure speech.”

It’s been little more than a year since Edward Snowden’s revelations detailed just how much of all of our on- and off-line communications are being monitored by any number of government agencies and programs. While the Internet has exponentially increased the possibilities of human rudeness, crudeness and rotten behavior, it has also similarly exploded our ability to communicate openly and to speak truth to power—even as that power is trying harder than ever to keep track of every random thought we have.

The celebrities affected by this latest online scandal will survive with their careers intact. They have every right to be aggrieved and to pursue legal claims that exist against hacking and invasion of privacy. But all of us deserve a free and open Internet, too. Anything we do to tamp down the free flow of information on the Internet will ultimately come at a price that is steeper than advertised.

Nick Gillespie is the editor in chief of Reason.com and Reason.tv and the co-author with Matt Welch of The Declaration of Independents: How Libertarian Politics Can Fix What’s Wrong with America.

TIME Business

Labor Day: Raising the Minimum Wage Stiffs the Poor

Demonstrators take part in a protest to demand higher wages for fast-food workers outside McDonald's in Los Angeles on May 15, 2014. Lucy Nicholson—Reuters

There are at least three better ways to help low-income workers — and few ways that are worse

Another Labor Day, another bold plan to increase the minimum wage to help the working men and women of America!

On Monday, Los Angeles Mayor Eric Garcetti will announce a proposal to jack his city’s minimum wage from $9.00 all the way up to $13.25 over three years. That puts him ahead of President Obama, who has called for goosing the federal minimum wage from $7.25 to $10.10.

Increasing the minimum wage is typically sold as a way of aiding poor people — LA business magnate and philanthropist Eli Broad says Garcetti’s plan “would help lift people out of poverty.” But it’s actually a pretty rotten way to achieve that for a number of reasons.

For starters, minimum-wage workers represent a shrinking share of the U.S. workforce. According to the Bureau of Labor Statistics (BLS), the percentage of folks who earn the federal minimum wage or less (which is legal under certain circumstances) comes to just 4.3 percent of hourly employees and just 3 percent of all workers. That’s down from an early 1980s high of 15 percent of hourly workers, which is good news — even as it means minimum wage increases will reach fewer people.

What’s more, contrary to popular belief, minimum-wage workers are not clustered at the low end of the income spectrum. About 50 percent of all people earning the federal minimum wage live in households where total income is $40,000 or more. In fact, about 14 percent of minimum wage earners live in households that bring in six figures or more a year. When you raise the minimum wage, it goes to those folks too.

Also, most minimum-wage earners tend to be younger and are not the primary breadwinner in their households. So it’s not clear they’re the ones needing help. “Although workers under age 25 represented only about one-fifth of hourly paid workers,” says BLS, “they made up about half of those paid the federal minimum wage or less.” Unemployment rates are already substantially higher for younger workers — 20 percent for 16 to 19 year olds and 11.3 percent for 20 to 24 year olds, compared to just 5 percent for workers 25 years and older — and would almost certainly be made worse by raising the cost of their labor by government diktat. While a number of high-profile economists such as Paul Krugman have lately taken to arguing that minimum wage increases have no effect on employment, the matter is far from settled and basic economic logic suggests that increases in prices reduce demand, whether you’re talking about widgets or labor.

Finally, there’s no reason to believe that people making the minimum wage are stuck at the bottom end of the pay scale for very long. According to one study that looked at earning patterns between 1977 and 1997, about two-thirds of workers moved above the minimum wage within their first year on the job. Having a job, even one that pays poorly, starts workers on the road to increased earnings.

If we want to actually raise the standard of living for the working poor via government intervention, the best way to do it is via transfer payments — food stamps, housing subsidies, or even plain cash — that directly target individuals and families at or below the poverty line.

University of California sociologist Lane Kenworthy, a progressive who has called for a more generous social safety net, argues that virtually all increases in income for poor families in the U.S. and other wealthy countries since the late 1970s have been a function of “increases in net government transfers — transfers received minus taxes paid.” That’s partly because workers in poor households often have “psychological, cognitive, or physical conditions that limit their earnings capability” and partly because today’s “companies have more options for replacing workers, whether with machines or with low-cost laborers abroad.”

To be sure, arguing that you want to increase direct aid to poor families doesn’t give a politician the same sort of photo-op as standing with a bunch of union leaders on Labor Day and speechifying about the urgent need to make sure an honest day’s work is rewarded with a living wage.

But making just such a case could have the benefit of actually helping poor people in the here and now. Certainly a savvy politician could sell that to voters who know the value of hard work — and the limits of economic intervention.

TIME Parenting

Millennials Are Selfish and Entitled, and Helicopter Parents Are to Blame

Parent Child Climbing
Peter Lourenco—Flickr RF/Getty Images

There are more overprotective moms and dads at a time when children are actually safer than ever

It’s natural to resent younger Americans — they’re younger! — but we’re on the verge of a new generation gap that may make the nasty old fights between baby boomers and their “Greatest Generation” parents look like something out of a Norman Rockwell painting.

Seventy-one percent of American adults think of 18-to-29-year-olds — millennials, basically — as “selfish,” and 65% of us think of them as “entitled.” That’s according to the latest Reason-Rupe Poll, a quarterly survey of 1,000 representative adult Americans.

If millennials are self-absorbed little monsters who expect the world to come to them and for their parents to clean up their rooms well into their 20s, we’ve got no one to blame but ourselves — especially the moms and dads among us.

Indeed, the same poll documents the ridiculous level of kid-coddling that has now become the new normal. More than two-thirds of us think there ought to be a law that kids as old as 9 should be supervised while playing at a public park, which helps explain (though hardly justifies) the arrest of a South Carolina mother who let her phone-equipped daughter play in a busy park while she worked at a nearby McDonald’s. We think, on average, that kids should be 10 years old before they “are allowed to play in the front yard unsupervised.” Unless you live on a traffic island or in a war zone, that’s just nuts.

It gets worse: We think that our precious bundles of joy should be 12 before they can wait alone in a car for five minutes on a cool day or walk to school without an adult, and that they should be 13 before they can be trusted to stay home alone. You’d think that kids raised on Baby Einstein DVDs should be a little more advanced than that.

Curiously, this sort of ridiculous hyperprotectiveness is playing out against a backdrop in which children are safer than ever. The share of students reporting bullying is one-third what it was 20 years ago, and according to a study in JAMA Pediatrics, the past decade has seen massive declines in children’s exposure to violence. Out of 50 trends studied, summarize the authors, “there were 27 significant declines and no significant increases between 2003 and 2011. Declines were particularly large for assault victimization, bullying, and sexual victimization. There were also significant declines in the perpetration of violence and property crime.”

There are surely many causes for the mainstreaming of helicopter parenting. Kids cost a hell of a lot to raise. The U.S. Department of Agriculture figures a child born in 2013 will set back middle-income parents about $245,000 up to age 17 (and that’s before college bills kick in). We’re having fewer children, so we’re putting fewer eggs in a smaller basket, so to speak. According to the Reason-Rupe poll, only 27% of adults thought the media were overestimating threats to the day-to-day safety of children, suggesting that 73% of us are suckers for sensationalistic news coverage that distorts reality (62% of us erroneously think that today’s youth face greater dangers than previous generations). More kids are in institutional settings — whether preschool or school itself — at earlier ages, so maybe parents just assume someone will always be on call.

But whatever the reasons for our insistence that we childproof the world around us, this way madness lies. From King Lear to Mildred Pierce, classic literature (and basic common sense) suggests that coddling kids is no way to raise thriving, much less grateful, offspring. Indeed, quite the opposite. And with 58% of millennials calling themselves “entitled” and more than 70% saying they are “selfish,” older Americans may soon be learning that lesson the hard way.

TIME U.S.

Make Cops Wear Cameras

Outrage In Missouri Town After Police Shooting Of 18-Yr-Old Man
A police officer standing watch as demonstrators protest the shooting death of teenager Michael Brown conceals his/her identity on August 13, 2014 in Ferguson, Missouri. Scott Olson—Getty Images

“Everyone behaves better when they’re on video”

Michael Brown, an unarmed 18-year-old, shot to death in Ferguson, Missouri, by police. Eric Garner, a 43-year-old New Yorker, dies from a police chokehold. John Crawford III, 22, shot and killed by police in a Walmart outside of Dayton, Ohio.

Enough is enough. Each of these incidents has an unmistakable racial dimension—all of the victims were black and all or most of the arresting officers were white—that threatens the always tense relationship between law enforcement and African Americans. As important, the circumstances of each death are hotly contested, with the police telling one story and witnesses (if any) offering up very different narratives.

Brown’s death in particular has sparked major ongoing protests precisely because, contrary to police accounts, witnesses claim that he had his hands up in surrender when he was shot. The result is less trust in police, which raises tensions across the board.

While there is no simple fix to race relations in any part of American life, there is an obvious way to reduce violent law enforcement confrontations while also building trust in cops: Police should be required to use wearable cameras and record their interactions with citizens. These cameras—various models are already on the market—are small and unobtrusive and include safeguards against subsequent manipulation of any recordings.

“Everyone behaves better when they’re on video,” Steve Ward, the president of Vievu, a company that makes wearable gear, told ReasonTV earlier this year. Given that many departments already employ dashboard cameras in police cruisers, this would be a shift in degree, not kind.

“Dash cams only capture about 5% of what a cop does. And I wanted to catch 100% of what a cop does,” explains Ward, who speaks from experience. He used to be a Seattle police officer and his company’s slogan is “Made for cops by cops. Prove the truth.”

According to a year-long study of the Rialto, Calif., police department, the use of “officer worn cameras reduced the rate of use-of-force incidents by 59 percent” and “utilization of the cameras led to an 87.5 percent reduction in complaints” by citizens against cops.

Such results are the reason that the ACLU is in favor of “police body-mounted cameras,” as long as various privacy protections and other concerns are addressed. And it also explains growing support for the policy among elected officials. In the wake of Eric Garner’s chokehold death in July, New York City’s public advocate is pushing a $5 million pilot program in the city’s “most crime-plagued neighborhoods” as a means of restoring trust in the police.

Since 1991, when the beating of Rodney King by the Los Angeles Police Department was captured on tape by an amateur videographer, small, cheap recording devices have become a ubiquitous and effective means by which citizens are able to watch the watchers. In some cases, crowd-sourced footage exonerates the police, while in others it undermines the official narrative.

Over the same period, as the Washington Post’s Radley Balko has documented in Rise of the Warrior Cop, even small-town police departments have become “militarized” in terms of the training they receive and the hardware they carry. Even when the results aren’t tragic, such tactics increase tensions between police and the people they serve and protect.

Mandating that cops wear cameras wouldn’t prevent every tragedy, but it would certainly make deaths like those of Brown, Garner and Crawford less likely. And in difficult cases, body cams would help provide crucial perspective that builds trust in law enforcement across the board.

TIME Media

Why I’m Actually Pretty Psyched for the New Sarah Palin Channel

It adds to the incredible variety of media sources but will flourish only if it actually contributes to ongoing conversations about news, politics, culture and ideas.

Former governor, vice-presidential candidate and reality-TV star Sarah Palin has started her own subscription-only web-based news channel. That’s good news for people who want to follow her – and for people who want to ignore her, too (she’ll be showing up far less often on cable news channels). “I want to talk directly to you on our channels, on my terms, and no need to please the powers that be,” she explains in a (free!) intro video.

Palin’s new project is the latest sign that we live in a world of gloriously fragmented media and culture that allows just about anyone to express themselves more fully than at any time in human history. That’s a great thing, even if it means trouble for long-established media companies and empowers conspiracy ranters such as Alex Jones.

Twenty years ago, just as the Internet was developing into a mass medium that catered to individuals’ unique tastes and interests in unprecedented ways, critics were foolishly flipping out about “media consolidation” and how a few companies such as AOL Time Warner would control all our news and information (as if!). Now, they are more likely to worry over the loss of a common news culture and the seeming ability of people to consume only self-confirming points of view. That may seem plausible on the face of things, but it’s equally wrong.

Palin is hardly a trailblazer in launching her own channel. Her ideological confrere Glenn Beck launched The Blaze network on the web in 2011. It spread to satellite TV a year later, and claims north of 300,000 subscribers paying $9.95 for full access to tons of print, video and audio content. Elsewhere on the political spectrum, pioneering blogger Andrew Sullivan sells access to The Dish (which touts itself as “biased and balanced”) for $1.99 a month and The Young Turks offer free, basic ($10) and premium ($25) access to a wide variety of text and video. RedState, The Daily Kos, Huffington Post, PJ Media and others all offer unlimited amounts of news, commentary and community for free. Everywhere you look, there are not just more ways to access the news, but more voices entering the marketplace of ideas.

The Sarah Palin Channel will flourish only if it brings something truly different and substantial to the table. The eponymous host promises her service will be “a community” and that she’s most excited about hearing directly from her audience. That’s a start (and a shift from old-style news broadcasting), but only time will tell whether that’s enough to keep folks shelling out $10 a month for the long haul.

What is clear is that even with the proliferation of news sources with distinct points of view, Americans are reading deeply and widely. Earlier this year, the American Press Institute released a study called “The Personal News Cycle: How Americans choose to get their news.” Among the key findings: 75% of us consume news every day and increasingly we pay attention throughout our waking hours, checking in across different platforms, media and sources.

Far from walling ourselves off in ideological gardens that tell us just what we want to hear, “the majority of Americans across generations now combine a mix of sources and technologies to get their news each week.” We go deep on stories that interest us, reading multiple accounts from multiple places to get more information—something that wasn’t possible back in the days of three broadcast channels and one or two hometown newspapers. Perhaps most interestingly, we apply a sliding scale of credibility based on sources, with 43% having high trust levels in reports from well-established news organizations, 21% from “word of mouth” ones, and even less from unsubstantiated social media sources.

So welcome to the 21st-century media world, Sarah Palin. New voices and platforms are always welcome, but it’s a jungle out here. You don’t have to “please the powers that be,” but you do have to bring real value to your readers and viewers – and that’s no walk in the park in a mediascape of endlessly fascinating and proliferating choices.
