TIME society

What’s More American Than Skydiving?

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Encounters with freedom, optimism, and exploration at 10,000 feet

When I quit my first real job, I didn’t have a plan. I just walked out with the recklessness of a Harvard graduate who had come of age during the Clinton-era Internet bubble. I was barely out the door when reality set in, and elation gave way to doubts about the wobbling post-Y2K economy. What if I had doomed myself to poverty? I wanted catharsis. That’s when I got the idea to jump out of an airplane.

Soon after, in a boozy haze at a San Francisco loft party, I recruited friends to skydive with me over the Russian River. Everyone sounded brave, but the next morning I was the only one who showed up. Instead of bowing out, I signed the paperwork. My senses felt dulled by a vortex of never-ending work and play, and I wondered what my inner voice would tell me about the path ahead if I could actually hear it.

When we opened the door at 10,000 feet, the only thing I saw was blue. It was a threshold to air, to nothingness. I am scared of heights, but the blue was more abstract: the terror of the unknown. I hadn’t even told my parents I was going to jump. I dug in for a moment, heartbeat in my throat, reconsidering.

The tandem instructor nudged me toward the edge like a reluctant sheep while telling me to pull back my head. I breathed deep, looked up, and, much to my surprise, found calm. Safety was supposed to be inside the plane, with a seatbelt on. But a deeper voice stirred, and it said: Maybe the places most enclosed—by walls, by rules—are the ones that pose the greatest danger. After all, isn’t that why I had quit my job? Outside was an uninhibited place, full of possibility.

“Ready, set …” And we launched into the wind.

My senses were overwhelmed by the relative wind at terminal velocity, a feeling not of falling but of flying. The parachute deployed with a big, decelerating tug. In the quiet peacefulness under the nylon canopy, floating thousands of feet above the sparkling river and green hills, I came home to myself.

We reached the ground softly. My instructor high-fived me and said, “You could be good at this!” I was adrenalized to the gills, driving away well over the speed limit with the windows down and the radio blasting, dancing like a maniac. The following week, I started training for my first skydiving license. Sometimes I was so scared to jump that I prayed for high winds to keep me on the ground. Still, I kept showing up.

Exiting through that door became a passion, an addiction, a ritual. I woke up early to go skydiving at tiny airstrips surrounded by artichoke fields. People I would never have encountered in the Harvard bubble changed the way I thought about friendship. The drop zone was a magical equalizer, where trust fund kids with BMWs hung out with elevator technicians. Parachute packers living on ramen noodles schooled emergency room doctors in flying skills.

There are skydivers all over the world, but the United States has the greatest numbers: the United States Parachute Association counts around 35,000 members. We are a big country with a relatively free market for personal risk-taking and a high enough average income to put extreme sports within popular reach. The early history of sport skydiving is filled with innovations by both members of the military and pot-smoking, barefoot hippies, reflecting a cultural and socio-economic diversity that is rare in places where skydiving is more expensive and therefore more exclusive.

True, the sport’s pioneers were largely white and male, and skydiving remains demographically skewed that way. The culture is evolving to be more inclusive and welcoming to minorities (for whom daily life may seem to carry enough risk). No matter what they look like, the skydivers I’ve encountered in this country seem to share the core values of freedom, optimism, and exploration, all essential elements of the American character.

About a year after I started jumping, I embraced my own desire for new frontiers. I sold most of my belongings and moved to South Africa to pursue my dream of a meaningful career researching the impacts of war and violence on marginalized communities. Taking my skydiving rig with me, I fell in love with the man who first drove me to the Johannesburg Skydiving Club. Freefall became an emotional choice.

Eric, who became my life partner, was the chief instructor at the club and an early adopter of the new discipline of wingsuit flying. A wingsuit is a jumpsuit with nylon fabric spanning the space between the arms and legs, transforming the body into a human glider (think: flying squirrel). Eric taught me how to use one, igniting a shared passion.

We spent weekends at the drop zone chasing clouds and holding hands. Sometimes at the end of a day we would sit at the end of the runway, tracing its cracks, philosophizing as we took the world apart and put it back together. We knew what we did carried risk, and we talked about what would happen if one of us died.

It was a Sunday morning when I got the call. Eric had made a small mistake on a high-speed landing and the error had, as he had once phrased it, “cascaded into eternity.” All of the matter in the universe is sucked into the moment when the consequences of risk become real. The impossible density of it squeezed everything alive inside of me into pulpy deadness.

As a skydiver, I had learned to handle situations most people can’t deal with. Even beyond the sport we both loved, Eric had never shied away from bearing responsibility for others, even when doing so was painful. And so I wrapped his strength and conviction around my own and refused to give up on our—now my—life.

Four months passed before I was ready to try skydiving again. I didn’t want to let fear of the unknown—how would it feel to fly again without him?—dictate whether I quit. On my first jump back, I wept in the plane and performed the ritual of exiting into the blue. When the time came, it took everything I had to pull my parachute and choose life. I saw him next to me, flying on, and understood that I could not follow. Yet there was so much joy in sharing the flight.

Eight months later, I took some of his ashes up on a wingsuit jump and set them free. Achingly, I dismantled the dream life I had built and returned to the United States, where I felt I had the greatest chance at finding another open door. I spend a lot of my life in the air now, teaching people to fly and organizing world record wingsuit formations. I survived the transitions from sensory-overloaded novice to lifelong student to teacher and leader. On this path, Eric became part of me.

I continue to bear witness to small human errors that take my friends away. But as with any other risk-embracing journey, there are trade-offs that make the seemingly perpetual loss worth it. I have become part of a family made up of people from all walks of life. We are joined by our desire to experience the space between sky and ground, using the very force that pulls us down to help us fly. My hope is that our resilience, and the triumphs of our explorations, will inspire all who dream of freedom in any form to take the first step.

Taya Weiss is a professional skydiver, wingsuit flyer, and chronicler of gravity-powered adventures. More of her writing can be found at http://www.tayaweiss.com. She wrote this for Zocalo Public Square.

TIME society

Is America Still the Home of the Brave?

Tracing a national tradition from the American revolutionaries and Amelia Earhart to graffiti artists and venture capitalists

On January 14, 2015, the world waited with bated breath as Tommy Caldwell and Kevin Jorgeson came over the rim of a notoriously steep section of the rock known as El Capitan, the largest single block of granite in the world. Over the course of 19 days, the pair had climbed the Dawn Wall, the most difficult part of the famous rock formation at Yosemite National Park, with just their hands and feet; rope and harnesses were used only to break deadly falls. Caldwell and Jorgeson became the first people to “free climb” the Dawn Wall, a feat many thought could never be accomplished.

The pair had trained for more than five years and encountered serious injuries on previous attempts. In recognition of their arduous and potentially fatal quest, one of them even called this climb his Moby-Dick, after the white whale that taunted—and destroyed—Captain Ahab. When Caldwell and Jorgeson made it to the top with their bloodied, bandaged, and superglued fingers, it was such a quintessential moment of American optimism that even President Obama sent his congratulations, tweeting, “You remind us that anything is possible.”

Could anyone other than Americans have scaled this incredibly difficult granite face with so scant a safety net? Of course—but it was Americans who made the seemingly impossible climb. And many of the world’s elite rock climbers—including the one considered the world’s best, Alex Honnold—are Americans. In advance of the “What It Means to Be American” event “Are Americans Risk-Takers?” we asked scholars and people who dabble in risk for a living: What is it about American culture that encourages risk-taking?

A young mindset for a young country — Joyce Appleby

Risk-taking appeals to the young; it’s only as we grow older that we’re cautioned by the downsides. One of the remarkable aspects of the American Revolution is the freedom won by young people. Both girls and boys escaped the drudgery of farming by becoming schoolteachers, an occupation greatly expanded after the Revolution. Similarly, the union of the states made it possible for boys to become peddlers, carrying goods from the Northern states to Southern plantations. These experiences made youth a time for experimenting in new careers.

Crucial for risk-takers in the early national period was the fact that old colonial wealth withdrew from speculative economic ventures, leaving many opportunities open to ordinary men and women. Old wealth stayed in the city and benefited from the rise in the prices of urban real estate. Early manufacturing centered in rural areas because of the available water power, and a new enterprise could begin with sweat equity and borrowed seed money from family and friends.

The opening up of the lands in the national domain west of the Appalachian Mountains also enticed many—mostly young people—to pull up stakes and move west where they might acquire land and the respect land ownership bestowed. First comers had an unusual chance to capitalize on their labor, clearing land and selling it to those in the second wave of westward adventurers.

Unlike European societies, American society freed its youth to create their own careers. Giving natural risk-takers such free scope soon embedded an admiration for risk-taking in American culture. That admiration has continued to prevail as a distinctive feature of the culture of the United States.

Joyce Appleby is professor emeritus of history at UCLA who has studied England, France, and America in the 17th and 18th centuries, focusing on how economic developments changed people’s perceptions of politics, society, and human nature. Her recent publications are The Relentless Revolution: A History of Capitalism (2010) and Shores of Knowledge: New World Discoveries and the Scientific Imagination (2013).

You can’t pass up an opportunity — Sket-One

Living in the Land of the Free and Home of the Brave gives us a kind of “I can do that” attitude. No wonder we’re the birthplace of Nike’s “Just do it” campaign. Immigrants have brought their hopes and dreams here for a long time.

Growing up a graffiti artist in America isn’t all that free, and you have to be at least a little brave. To hone your skills, you mostly have to make art illegally—most people don’t have large blank walls sitting around that they’re allowed to practice on. I remember being a kid and writing all my plans on pieces of paper—what colors to use, how to execute the sketch of the work I planned to do that night. And then I’d have to sneak out and hike to abandoned places like train yards or bridges. I could’ve been caught, electrocuted, or hit by a train or car. I could’ve been fined, had my artwork removed, or gone to jail. But I did it because I couldn’t find any other outlet to express myself. I wanted to create, and be seen creating.

If I imagine embarking on the same journey in a different country, I’m not sure it would’ve been worth the risk. Friends, for example, have told me about punishment for graffiti in Singapore—from huge fines to caning. If I were hit with a punishment of this caliber, I wouldn’t have continued or received any support for my artistic endeavors. Instead, I have a career doing what I love, and a comfortable home for my family.

Sket-One, also known as Andrew Yasgar, is a painter, illustrator, and designer who began as a graffiti artist in the 1980s. His studio is in Long Beach, California.

The ‘self-made man’ is an entrenched story, but a fable to many — Zulema Valdez

With entrepreneurs making up 13 percent of the working-age population, the United States boasts the highest rate of entrepreneurship among 25 industrialized economies. Robert Fairlie, an economist who compiles entrepreneurship data for the Kauffman Foundation, noted that in 2013, the U.S. economy added 476,000 new business owners each month.

These numbers are consistent with the strongly held belief among most Americans that the United States is the land of opportunity, where anyone with a good idea, a positive attitude, and a willingness to work hard can own a business and succeed. A higher percentage of Americans express this ideology than do people in other nations. In a 2013 report published by the Global Entrepreneurship Monitor, fully 47 percent of Americans agreed that good opportunities for new businesses exist, and 56 percent “believed they had the capabilities to launch a business.”

Yet the economic reality for American entrepreneurs is that most businesses fail. Regardless of personal drive, hard work, and a willingness to risk it all, successful businesses are generally owned by older, white, middle-class men, who, yes, possess a propensity toward risk.

The idea of America as the land of opportunity—which we can even call the American Creed—sparks risk-taking among a large and diverse population willing to take a leap of faith and start a business. What the ideology fails to reveal, however, is that the U.S. remains a highly stratified society where successful entrepreneurs are rarely “self-made.” The Horatio Alger “rags-to-riches” fable is just that, a fable that exists to reinforce the possibility of the American dream. The reality, however, is that risk-taking, while perhaps a necessary ingredient for entrepreneurship, is not sufficient in the absence of human capital (education and work experience), social capital (business networks), and financial capital (personal savings, wealth, access to credit or loans). Ultimately, these factors trump risk-taking, or perhaps diminish the “risk” entirely.

Zulema Valdez is associate professor of sociology at the University of California, Merced and the author of The New Entrepreneurs: How Race, Class, and Gender Shape American Enterprise.

Challenging authority and taking chances is in America’s DNA — Susan Wels

From the beginning, independence and self-determination have been essential elements of the American character. As a culture, we tend to challenge authority and take risks to pursue our own convictions and interests.

That history helps shape who we are, and the scope of our expectations. Amelia Earhart—the first woman and second person to fly solo across the Atlantic—is one example. She grew up in a family of risk-takers. Her grandfather, Alfred Otis, moved to Kansas in 1855 to help escaping slaves—hiding them in trunks and covering them with grain in the back of wagons. He raised his daughter, Amy Otis, to embrace risk, travel, and adventure. She raised her daughter, Amelia Earhart, to have boundless potential. She began Amelia’s baby book with a quote from Ruskin: “Shakespeare has no heroes; he has only heroines.” And Amelia was determined to push every limit and break every boundary. Risk was never a hurdle—it was an attraction.

When I was researching my book on Amelia Earhart, I came across an essay she wrote called “Thrill” that was never published. “When I undertake a task,” she confided in the piece, “over all protest, in spite of all adversity, I sometimes thrill, not with the task, but with the realization that I am doing what I want to do.”

Amelia Earhart was an extraordinary risk-taker. But her insistence on self-determination was quintessentially American, and it was worth everything—even the risk of death on her final, would-be record-setting flight around the world.

Susan Wels is the author of Amelia Earhart: The Thrill of It.

American life is so predictable we can make ‘educated’ guesses — Alfonso Morales

Deciding whether to take a risk involves thinking about a number of variables—weighing the consequences of a bad decision, figuring out your choices, and understanding how much effort you are willing to exert to gather knowledge about a business opportunity or to continue your education, for example.

The relative predictability and stability of American life help Americans take risks. While corruption scandals do erupt from time to time, the local, state, and federal governments do function, as a general rule. Supermarket shelves are always well stocked unless there is some kind of superstorm. Traffic is always bad at 5 p.m. on a weekday. This is largely the same in “Western” societies where the rule of law provides predictability, but the U.S. combines an ease of entry with an institutional transparency that encourages new immigrants (and others) to funnel their energy into entrepreneurship.

Our society is stable enough that we can imagine the circumstances that have enabled another person to succeed and then take our own risk to do what he or she has done. Our society is also diverse enough that an interest in business found in one generation might get replicated in a very different way in the next generation. For instance, each adult and child in a family I knew at Chicago’s Maxwell Street Market had his or her own business: The adults sold clothing and recorded music; the children had their own line of toy cars, Rubik’s Cubes, and other novelties. They made choices built on previous experiences, and these experiences and incomes led to new, risky choices. Those risks were moderated by investing in the children’s education. (All four children earned post-secondary degrees, including a Ph.D. and a law degree.) This balancing act is not easy, nor are people always successful, but in the U.S. people can see themselves navigating risky situations successfully, even if it means exerting the effort for years before succeeding or even if their efforts might not bear fruit for a generation.

In short, when we take risks, we make “educated” guesses about what we’re going to do.

Alfonso Morales is an associate professor in the department of urban and regional planning at the University of Wisconsin-Madison. He co-edited the books Street Entrepreneurs and An American Story: Mexican American Entrepreneurship and Wealth Creation.

Our cultural idols are people who are willing to take enormous risk — Peter Sims

I’m not one for peddling the notion of “American exceptionalism” the way politicians do. But after working as a venture capital investor in the United States, then in Europe, I realized one day—while riding on a train through the English countryside—that when it came to risk-taking, there really isn’t anything like the culture of entrepreneurship in America.

In England, you’re considered an entrepreneur if you buy a small company and try to grow it. In Germany, most of the economy is driven by the Mittelstand, the small and midsize privately held companies that grow 5 to 10 percent a year. In France, Italy, and Spain, government regulations and high capital costs hamper start-ups.

Yet in many parts of America, especially its tech valleys and university towns, almost everyone is an entrepreneur, willing to tinker, toil, and enthuse about ideas late into the night, perfectly aware that failure is probable, even likely. Our cultural idols are the people who are willing to take enormous personal risk and toil through troughs of defeat. They emerge somehow as stronger human beings, perhaps wildly wealthy, or at the very least wiser and more original versions of themselves.

Some call that the American dream. And the challenge in America today is to ensure that entrepreneurial capitalism doesn’t take a back seat to a kind of crony capitalism that excessively enriches executives while cutting back innovation budgets. We don’t want the kind of capitalism that depends on a cozy relationship with government, where contributions flow from corporate pockets to Washington and back, eroding our faith in both government and the functioning of our market institutions. In my opinion, the danger in America today is that we forget the courageous, risk-taking, entrepreneurial spirit that got us here, and replace it with a corruption of the ideals that built our country.

Peter Sims is co-founder of The Silicon Guild and founder of The BLK SHP (“black sheep”) Foundation. His latest book is Little Bets: How Breakthrough Ideas Emerge from Small Discoveries, which grew out of a long collaboration with faculty at Stanford University’s Institute of Design, as well as his previous work in venture capital.

This article was written for Zocalo Public Square.

TIME Education

How ‘Reform’ Hurts My Teaching

National education 'reform' programs are getting in the way of improvement efforts by local teachers for local schools

Being a teacher has always been a challenge. It is a much greater challenge in the era of “reform.” It has been infuriating to read all the silliness, even worse to have to comply with the misguided mandates. But we teachers are ourselves partly at fault — for reacting rather than acting.

The “reform” movement has been an obstacle to our own organic reform efforts at Hillsdale High School in San Mateo, California, where I have been teaching since 1985. Although the impact is hard to quantify, “reform” has too often distracted and diverted us from reform.

By “reform,” I mean the primacy of standardized testing; the concomitant elements of merit pay; a focus on individual teachers and classrooms, on heroes and entrepreneurs; and attacks on teacher seniority and due process rights and the unions that support them. Teachers react against “reform.”

By reform, I mean a staff working together at a site over years to create a collaborative working environment for teachers and establish better structures and instruction for students. The goal is to build a culture in which we know students well, hold them and teachers to high standards, and take collective responsibility for student achievement and wellbeing. Reform means recognizing that there are no silver bullets, no quick fixes, no heroes, except everyday ones. Teachers act on reform.

At my school, tenure, seniority, and the union have been key to effective reform. We are a school where culture — informed by a combination of talented newcomers and stable, secure, senior teachers who can speak their minds, lead by example, drive initiatives, and provide mentorship — is more important than the quality of any individual teacher. We are a school where it is nearly impossible for one teacher’s efforts to be measured without taking into account the work of others.

The uncomfortable juxtaposition between our reform and “reform” came into all-too-clear focus over the last two days of January. On a Thursday, after many months and long hours of preparation, our assessment committee, of which I am a part, guided the staff through the process of coaching and evaluating students for their Senior Defenses in March.

We conceived of the Senior Defense as a kind of graduate interview in which our students show us who they are, academically and personally, what they know, and what they can do. In the fall, the whole staff participated in auditing how students had progressed on our Graduate Profile, a definition of achievement we established in 2009 that is similar to, though broader than, the Common Core State Standards. We have five main categories: Read, Communicate, Think, Understand and Apply, and Respect. Within each category we have a number of subcategories, which, for the defense, we winnowed down to those we believed are the most essential.

In February the staff would coach students in presenting and defending three parts in front of a panel of teachers: 1) a summary of previously written reflections on three subcategories of the Graduate Profile; 2) one piece of work, usually a major essay, class project, or research paper, in a particular subject; and 3) an application task, for which they are given, an hour before their defense, four prose and visual sources on a controversial subject; they have to evaluate the sources, take and support a position, make connections between the topic and previous coursework, and describe the reading strategies they used to make sense of the sources. After the presentation, the majority of the defense consists of students answering questions from the panel.

We have chosen this path because we expect a lot of our students and a lot of ourselves. We readily admit that plenty of work remains for students and teachers to meet these expectations.

On the Friday morning after that preparation for the Senior Defenses, I had to deal with the California High School Exit Exam (CAHSEE), one of the first mandates that appeared in California in the name of “reform.” In my advisory student meeting, I went over the rules and guidelines with the sophomores whom I, along with several other teachers, am responsible for mentoring and coaching. In the advisory teacher meeting, we went over logistics and talked about snacks, one of the bribes we have developed to motivate students and mitigate the sheer drudgery of the six-hour test.

From Senior Defenses to CAHSEE in less than 24 hours is a microcosm of a teacher’s pendulum. They say that in education reform the pendulum swings back and forth every ten years or so. But in actuality, teachers swing back and forth all the time. To reach one side of the pendulum’s arc, you fight against gravity to help human beings reach their potential. Then gravity pulls you back and you focus on the mandates that generate numbers that mean little to you or your students, say little to nothing about what you and they actually do, yet upon which you and they are judged.

The CAHSEE was first instituted in 2001, with the claim that it would be an accountability measure of basic competency. But for the vast majority of students, the only challenge is how to interpret poorly constructed questions unrelated to what they have been doing in class, and how to occupy themselves once they’ve finished. For those who struggle with the test, primarily special education students and English language learners, the standardized format is half of the problem. Those who support testing say that it’s the only way parents can know how their kids are doing in school. However, it is hard to point to a single thing we learn about our students from the exam results that we didn’t already know. And if parents don’t know what their kids are doing in school, the solution is in better communication, not worse assessment.

The California Standards Tests that we administered for so many years were a similar time drain. The instructional and administrative time involved — adding special prep classes, administering the tests and retests, gaming the test to avoid the phony judgments that come from published ratings, etc. — robbed us of hours and resources we could have used on smarter things.

The tests also dehumanize the education process with reductionist thinking. Tests define achievement in a narrow, inhuman, and therefore unproductive way that produces vast amounts of useless data and sloppy reports. Words like achievement, performance, success, and proficiency all collapse into “good test scores.”

This trend has made me doubt most educational research. We used to say anecdotes were not useful; instead you have to look at “the research.” Any teacher will be familiar with the phrase “the research shows…” But dig into most of “the research” of the last five years and you will find standardized testing data at the root of almost every conclusion. If we can’t trust that data, why should we trust the conclusions?

One final negative of the “reform” mandate has been the de-professionalization it demands. Because it is a mandate, and because teachers and students are judged on the mandates, teachers lose agency: they lose the ability to make professional judgments, and they fall in line as school district officials chase after “reforms” by adopting programs that promise to deliver results based on “research.” When one “reform” decade peters out and is replaced by the next one, teachers must chase again, as we see now with the Common Core standards.

The elements of “reform” came together in the 2014 Vergara case, in which a wealthy Atherton businessman brought a lawsuit supposedly on behalf of poor school children, claiming that teacher seniority deprived them of their civil rights. The judge generally agreed, and U.S. Secretary of Education Arne Duncan, who has called for “hero teachers” and been a cheerleader for “reform,” lauded the decision.

But there’s a whole lot of experience that shows poverty plays a more significant role in the problems identified in the suit. So why don’t these “reformers” attack poverty with the same passion they attack teachers’ rights? And what plan do they have to find better teachers? Where is the pool of more talented replacements? What kind of talented hero would want to become a teacher in this climate of “reform”?

The only answer is the tired business analogy: competition, differential salaries, managerial control. I can’t contemplate this ill-conceived nexus without wringing my hands in indignation. For it is the loss of dignity that teachers feel most acutely when it comes to “reform.”

But what has bothered me most about “reform” has been the fact that we teachers and our unions have spent too much time reacting to “reform” rather than acting against it. We’ve been playing defense when we need to play offense.

We need to craft a message about real reform and act on it. Learning is a complex human endeavor and hard to measure. Teaching is difficult. Our education system faces some overwhelming challenges, for which there are no simple solutions, partly because of resources, partly because of training, partly because of differing values. But there is also no excuse for not trying to confront the challenges.

If teachers act as professionals, collaborate, and make decisions in the best interests of students, they can gradually enact real reform. They will have a better shot at success if they don’t have to comply with misguided “reform” mandates, and they will be better prepared to stand up to those mandates if they act rather than react.

Greg Jouriles has been teaching social science at Hillsdale High School in San Mateo since 1985. He is presently working on schoolwide performance assessment. He wrote this for Zocalo Public Square.

TIME Money

Our March Madness Office Pools Should All Be Legal

The U.S. government doesn’t see it that way, but everyone could benefit from the $9 billion Americans wager on college hoops

Roughly 40 million Americans are expected to fill out a total of 70 million brackets and bet $9 billion on March Madness this month, according to data from the American Gaming Association.

And all of them will be criminals in violation of up to three federal laws.

The most commonly violated law will be the Professional and Amateur Sports Protection Act (PASPA), which passed through Congress in 1992 at the behest of the four major pro sports leagues (NBA, MLB, NFL, and NHL) and the NCAA. It effectively prohibits sports gambling outside of the four states that had previously allowed it.

Yet it should surprise no one to hear that betting on the Super Bowl or March Madness doesn’t take place only in Nevada (the only state where it’s legal to make a bet on a single game), and that federal agents don’t bother to descend on your workplace to arrest everyone who entered the office pool.

Gambling—largely legal and heavily regulated throughout Europe—has swept American sports culture. Lines and spreads are impossible to miss in media coverage. The most popular sportswriter in America, Bill Simmons, hosts a weekly NFL gambling podcast where he discusses the bets he’d “place if gambling were legal” (wink, wink). Sports betting is no longer some underground habit. It’s mainstream.

Gallup polls say 17 percent of Americans have wagered on sports in the last 12 months. In 2010, the FBI spent taxpayer resources to arrest 10,000 of those gambling Americans. Instead of spending resources rounding up a few of us gamblers, imagine how much money Washington could collect for more worthwhile pursuits—like healthcare and education, and even gambling addiction treatment—if it legalized and taxed our sports betting.

Let’s be clear: the debate over gambling legalization is not a debate over whether or not to allow gambling. That’s already happening, whether you like it or not. It’s estimated that the amount of money wagered illegally on this year’s Super Bowl was 38 times greater than the amount wagered legally in Vegas casinos.

With the explosion in fantasy sports—many of which are played for money—over the last decade, more and more Americans are coming face to face with the folly of gambling “prohibition.” Even a strong supporter of the initial PASPA bill, Arizona Senator John McCain, is now calling on Congress to rethink the law.

The discussion has become so animated that NBA commissioner Adam Silver penned a November New York Times op-ed arguing for a Congressional framework that would let states legalize and regulate sports betting, taking illegal gambling “out of the underground and into the sunlight.” Silver doubled down on these words in a February ESPN The Magazine story, calling himself not necessarily a gambling advocate but merely a “realist.”

Opponents of legalization will argue that legalized gambling leads to more match-fixing and point-shaving, akin to what happened with the University of San Diego basketball team during the 2009-10 season. This could not be further from the truth.

Toreros guard Brandon Johnson was able to take money under the table from bettors to deliberately not cover point spreads, and to go undetected for so long, precisely because gambling activity in California is kept in the dark. Had such odd fluctuations in San Diego lines been in the view of the public and regulatory officials, the shaving would have been nipped in the bud much sooner. In fact, there’s a good chance the greater risk of detection would have prevented any game manipulation attempts in the first place.

And in the pro ranks, athletes making millions simply aren’t going to risk their already lucrative careers for a tiny cut in match-fixing bribes. In any case, the general principle applies that it is easier to police and regulate activity happening in the open than what takes place in the shadows.

Some opponents of legalization are also concerned about the potential for increases in problem gambling. Given the extremely easy access to sports books and bookies, not to mention office pools and fantasy leagues, most experts are unconvinced legalization would actually increase gambling much, if at all. If you have a gambling problem, you’re already susceptible.

If anything, legalized gambling would remove many of the stigmas that prevent problem gamblers from speaking out about their issues and getting the help they need sooner. Also, legal sports books do not allow wagers on credit, the kind of betting that can leave bettors heavily indebted to bookies.

Making bets on the Super Bowl or March Madness is a form of entertainment that is far more social and interesting than buying a state-sanctioned lottery ticket. Millions of Americans engage in this entertainment without harming anyone, including themselves. And as a society, haven’t we reached a consensus that we don’t ban things just because a few of us will become addicted to them? Or would you also embrace prohibition for fast food, Netflix, and Candy Crush?

Gambling is here to stay, and it is deeply rooted in our sports fan culture. Acknowledging this reality once and for all would be a smart bet.

Jim Pagels is a regular contributor to Forbes whose work has appeared in Reason, FiveThirtyEight, ESPN, and The Atlantic. He wrote this for Zocalo Public Square.

TIME Education

Will Technology Kill Universities?

Free online courses, crowdsourcing, and big data are transforming the university from a gatekeeper to a public resource

In 2001, the Massachusetts Institute of Technology announced it was going to put the university’s entire body of course materials online, for free. That meant syllabuses, as well as problem sets and exams—and their solutions. There were even going to be some video lectures online. In 2002, the MIT OpenCourseWare pilot project debuted with 32 courses. Today, according to MIT, 125 million visitors have accessed material from 2,150 classes, including the very popular “Introduction to Computer Science and Programming,” which helps students feel confident about “writing small programs that allow them to accomplish useful goals.”

MIT’s creation of OpenCourseWare is credited with sparking a global movement to make educational resources free to access, adapt, and redistribute. More than a decade later, hundreds of universities offer open course material online. The Internet has expanded its reach, computers have gone through several generations, and mobile phones are nearly ubiquitous. In this new environment, it’s clear that sitting down in front of a chalkboard with a spiral notebook and pen is an anachronism—but what else will be? In advance of the Zócalo/Arizona State University event, “Will Technology Kill Universities?” we asked experts: How will technology—from massive open online courses and web-based textbooks to big data collection—change universities?

Students will be in the driver’s seat — Andy Miah

Technology will force universities to re-define their role within 21st century life, and this has a lot to do with the DIY generation, who figure out what they need to know via Google and Wikipedia. These platforms are the equivalent of the single-celled organisms from which humanity eventually evolved.

In a world where learning experiences are ubiquitous and we rely less and less on institutions to deliver them, technology forces universities to re-think what they offer in the 21st century. Universities are no longer the gatekeepers of new knowledge, even less so with the rise of citizen science experiments, where non-experts can gather important data, and alternative qualification options, such as Mozilla Open Badges.

Students of tomorrow will want flexible, mobile-enabled learning experiences that are as compelling as film or theatre. The success of TED talks is indicative of the changing demands on teachers today and the changing attention economy of the new generation. Universities need to think carefully about how to curate learning experiences, making each lecture truly memorable and life-changing. The classroom now has to empower students to set the agenda and drive their own learning.

As we move into an era of sentient computing, universities need also to see technology not just as a vehicle for communicating ideas or enriching learning, but as a co-collaborator. Computers will become entities onto which students will project learning expectations. The machines will teach us; they will also learn; and they will spend more time with students than a lecturer ever can. If we want humans to remain at the heart of that interaction, then we need to reconsider what we offer that machines can’t.

Andy Miah is a professor and chair in science communication and future media at the University of Salford in Manchester, England. Follow him on Twitter @Andymiah.

Teachers and physical classrooms won’t go away — Kui Xie

Every generation of new technology brings excitement because it changes and improves human experience. Think of how excited people were when the book, the radio, and the television first entered their lives. These tools significantly changed the way we taught and learned—and the Internet, the personal computer, and today’s participatory cyber-infrastructure are carrying on that tradition. The creation of massive open online courses (MOOCs), for example, extends flexible and free educational opportunities to hundreds of thousands of learners around the world.

MOOCs have sparked debates about whether they will replace teachers and physical schools, but this is an alarm that has been sounded before with other new technologies. From a historical perspective, the answer has been a clear and consistent no. The reason: the human element is indispensable for educational systems. Student experiences in schools and universities aren’t only about mastering a particular body of domain knowledge and acquiring cognitive skills such as problem solving. They are also—and more importantly—about interpersonal social experiences, such as collaboration, leadership, friendship, and apprenticeship. MOOCs simply cannot offer such immersive and comprehensive educational experiences. And let’s not forget that current MOOCs have limitations (for instance: credibility, accessibility, and the high demands they place on motivation and self-regulation).

Having said that, I still recognize that technologies have had far-reaching impacts on teachers and physical schools. The voluminous amount of MOOC content enables “blended learning” where a student learns partly in traditional classrooms and partly through online learning activities, and “flipped classes” where students watch lectures at home and take on “homework” in class with the teacher as a guide. Virtual games have revolutionized students’ perception of education: learning can be motivating, engaging, and effective. Big data analytics can provide new insights to inform teaching practice, diagnose when student interest flags, and help administrators make decisions.

Future learning technologies will continue to bring exciting changes to educational systems, improving the functions of universities and making them more efficient, more effective, and able to make an impact on a broader swath of human society.

Kui Xie is an associate professor in learning technologies and he directs the Research Laboratory for Digital Learning at The Ohio State University. His research interests include computer-supported collaborative learning, engagement, motivation and human cognition, learning analytics, and instructional design.

Education will spread to every corner of the world — Shai Reshef

The field of higher education is undergoing rapid and profound transformation: Demand is surging, providers are increasingly diverse, and students are more mobile than ever. However, the accessibility and quality of education is vastly unequal. Huge populations remain underserved. With the number of college-age and college-eager students rapidly outpacing both material and human resources, there is a critical need for smarter online resources.

Technology will transform higher education from being a privilege of the few to being a right for all. With increasing scale and spread has come a decreasing cost for Internet and wireless technologies. This has resulted in three important realizations: (1) access to education is a human right, (2) freedom of information is a universal freedom, and (3) people are naturally willing to help one another, as shown through social networking.

The spread of technology will ultimately bring education and academic studies to every corner of the world. But many of those who have Internet access still don’t have broadband, limiting the educational benefits of technology. This is a challenge we have thought about at University of the People, the world’s first non-profit, tuition-free, accredited American online university, and so we have designed our courses to use the technology that is available to our students, not the broadband we might wish they had. This has greatly increased our outreach across the world (to over 150 countries).

While technology has the power to spread knowledge everywhere, in order for people to study most effectively, they need personalized attention. (This is even truer for those who live at the margins—away from cities or barred historically from higher education.) For this reason, at University of the People, we put our students in small virtual classes (of 20-30 students) to ensure that those who need personalized attention in order to succeed get it.

The combination of technology, open access, and personalized attention, in a tuition-free, accredited online university, is the education of the future. University of the People runs a virtual university in English, but its model could very well be one that translates.

Shai Reshef is the president and founder of University of the People. He has appeared in WIRED magazine’s list of 50 people changing the world, and was selected as a top global thinker by Foreign Policy magazine. Previously, Reshef chaired KIT e-learning, the first online university in Europe.

The public will help classify galaxies and tag paintings, but universities will survive — Kathryn Eccles

Technology is transforming universities, right across the sciences, arts and humanities. Just look at my own career: I’ve metamorphosed from a traditional historian to a “digital humanities scholar,” examining the impacts of technology on humanities scholarship.

Many of the technological shifts we have witnessed have enhanced and improved access to learning for everyone, providing formal and informal routes into education that were previously unheard of. Through citizen science and crowdsourcing, we’re solving some big data problems by inviting the public in, asking for help and giving people the chance to participate in research problems: cataloguing, transcribing, archiving, and all the time seeing the workings of collections, libraries, and research projects that would previously have been off limits. In crowdsourcing projects such as Galaxy Zoo and Your Paintings Tagger, the public was asked to classify galaxies and provide information about things and ideas in paintings, tasks that are difficult to ask a computer to do because of the subjective nature of information. Some tasks, decisions, and interpretations can’t be done by machines. They go to the heart of what makes us human.

Are these changes killing universities? Not from where I’m sitting. I work in a deeply traditional university, with an ancient, unrivalled mechanism for teaching and an excellent, rich, and diverse research culture. This traditional university houses a multi-disciplinary department devoted to understanding life online, and one of the largest communities of digital humanities scholars and projects in the U.K. Universities are places of research, reflection, and understanding, as well as of teaching and learning. If there is anything certain to ensure their survival, it is the continued need to research, understand and reflect upon the things that shape our world. Douglas Adams once said that books would never die, because “books are really good at being books and no matter what books will survive.” Technology won’t kill universities, because they’re too good at what they do.

Kathryn Eccles is a research fellow at the Oxford Internet Institute and digital humanities champion at the University of Oxford. Her primary research interests are in the impacts of new technologies on public access to and understanding of cultural heritage resources, and on scholarly behavior and research, particularly in the humanities.

Community colleges will show the way — Hans Johnson

A popular prediction is that new technology will revolutionize higher education, making traditional brick-and-mortar colleges obsolete. Certainly, new technology offers tremendous potential—democratizing access to college, enhancing instruction, and improving graduation rates, to name a few possibilities. But before we jump on the bandwagon of declaring a new era in higher education, we should assess the degree to which new technology can address fundamental challenges in higher education.

Perhaps the greatest challenge of all is to ensure that higher education serves as a ladder for economic and social mobility rather than simply reinforcing economic and class divides. By that standard, we can dismiss most Massive Open Online Courses offered in conjunction with the nation’s elite universities. Most of those courses are taken by people who already have a college degree, and the vast majority of students who enroll in such courses never finish them.

A different experiment in online learning, and one that serves hundreds of thousands of students who come from disadvantaged backgrounds, is taking place at California’s community colleges. With over one million course enrollments, California’s community colleges are the largest public provider of online education in the country. They are the gateways to higher education for low-income and nontraditional students—those with jobs and family obligations.

At the Public Policy Institute of California, we examined student success in online courses in the state’s community colleges. In our study, we found that course completion and passage rates are substantially lower in online courses than in traditional ones, even though students in online courses tend to be more advantaged and academically prepared. Moreover, gaps in academic performance that we see among demographic groups in real-life classrooms are exacerbated in the online setting.

What these early findings demonstrate is not failure, but the need to improve both technology and the way it is used in instruction. If we can get it right at the community colleges, we can deliver on the promise of online education.

Hans Johnson is a senior fellow and Bren policy fellow at the Public Policy Institute of California.

This article was written for Zocalo Public Square.

TIME Culture

How America Invented St. Patrick’s Day

Immigration and nativism transformed a quiet religious celebration into a day of raucous parades and shamrock shakes

When I was growing up in Britain in the 1970s, St. Patrick’s Day didn’t exist. The conflict in Northern Ireland was at its bloodiest, and it was not a time when British cities would open their civic spaces for a celebration of things Irish. My sense of what St. Patrick’s Day looked like was informed by the odd news story about celebrations in the U.S. The day appeared as something that was more about Irish America than it was about Ireland.

Years later, I was in a bar in Dublin with a friend discussing Irish history topics that needed to be written about. We agreed that the most obvious Irish date in the calendar, March 17, had never been touched by scholars, and a journey thus began. For the pair of us, the following years were all about understanding parades, Irishness, green beer, and corned beef and cabbage. We looked at a number of countries to try to comprehend why the Irish, perhaps above any other national group, have so successfully exported their national day that it is now a global phenomenon, celebrated in the form of parades, parties, and festivals on every continent.

The modest observance of St. Patrick’s Day in Ireland dates back to the 17th century, as a religious feast day that commemorates the death of St. Patrick in the fifth century. Patrick is credited with having brought Christianity to Ireland, and as such became a figure of national devotion and, in due course, the nation’s patron saint. The day’s importance was confirmed in 1631 when it was recognized by the Vatican.

For most Irish people at home, the day remained primarily religious into the 20th century. The elite of Irish society did mark the day with a grand ball in Dublin Castle each year in the second half of the 19th century. But for the public at large, it was a quiet day with no parades or public events. The day wasn’t even a public holiday in Ireland until 1904.

In the 20th century, the day became a public spectacle, with a military parade running through Dublin’s streets from the 1920s to the 1950s. Right through this period, the day was rather somber: mass in the morning, the military parade at noon and—this will shock American readers—the bars across the country closed for the day. (Irish bars didn’t begin opening on March 17 until the mid-1960s.) The military parade was replaced by a more general parade of floats and entertainment beginning in the 1960s, which in turn was transformed, in 1996, into the St. Patrick’s Festival, which still runs to this day. It’s a four-day event of music, treasure hunts, performances, and of course, on the day itself, a two-hour parade that draws up to half a million people onto the streets of Dublin.

But to understand the day and its significance is to tell an American rather than an Irish story.

The shift in the 1960s, after all, to a parade in Dublin (and many other Irish towns and cities) that was celebratory and fun was directly inspired by what was happening in the real home of St. Patrick’s Day, the U.S. The first recorded celebrations of March 17 took place in Boston in 1737, when a group of elite Irish men came together to celebrate over dinner what they referred to as “the Irish saint.” The tradition of parading began amongst Irish Catholic members of the British Army in New York in 1766 when the day “of St. Patrick, Saint of Ireland, was ushered in with Fifes and Drums,” as described in J.T. Ridge’s 1988 history of the New York parade.

The day grew in significance following the end of the Civil War and the arrival, across the 19th century, of ever-increasing numbers of Irish immigrants. Facing nativist detractors who characterized them as drunken, violent, criminalized, and diseased, Irish-Americans were looking for ways to display their civic pride and the strength of their identity. St. Patrick’s Day celebrations were originally focused on districts where the Irish lived and were highly localized. Through the use of symbols and speeches, Irish-Americans celebrated their Catholicism and patron saint and praised the spirit of Irish nationalism in the old country, but they also stressed their patriotic belief in their new home. In essence, St. Patrick’s Day was a public declaration of a hybrid identity—a belief in the future of Ireland as a nation free from British rule, and a strict adherence to the values and liberties that the U.S. offered them.

By the end of the 19th century, St. Patrick’s Day was being observed on the streets of cities with major Irish populations, such as Boston, Chicago, and New York, as well as in other cities such as New Orleans, San Francisco, and Savannah. The evolution from highly localized Irish celebrations to broader public events and parades tracked the rise of Irish-Americans in local government. In the face of growing nativist opposition, parading down major avenues in city after city announced that Irish-Americans were numerous, powerful, and not going anywhere.

The tradition of celebrating St. Patrick’s Day grew across the U.S., and the day came to be celebrated even by people with no Irish heritage. By the 20th century, it was so ubiquitous that St. Patrick’s Day became a marketing bonanza: greeting cards filled drugstores, imported Irish shamrocks (indeed, anything green) showed up on T-shirts, and the food and drink associated with the day turned into bar promotions. Corned beef and cabbage—rarely eaten in Ireland but commonplace in American cities as a springtime dish—became the meal for March 17. Dietary innovations for the day have grown over the years to include all types of green food: milkshakes, beers, and candy. Once a food giant like McDonald’s latched onto the marketing potential of St. Patrick’s Day, it was clear that the celebration had jumped from a solely Irish observance into the American mainstream.

The power of St. Patrick’s Day in the U.S. was its ability to survive and then spread. It survived over the decades because generations of Irish immigrants were eager to celebrate their origins. The sheer number of those claiming Irish descent in the U.S., coupled with their mobility and assisted by a network of Irish societies and the forces of Irish commerce (namely Guinness and the ubiquitous Irish bar in every town), has meant that St. Patrick’s Day celebrations have spread across the country.

The holiday also spread by becoming a means for all Americans to become Irish for the day. The shared sense of being Irish, of wearing green and in some way marking March 17, has resulted in St. Patrick’s Day being observed in a similar fashion to July Fourth or Halloween. It’s the closest thing in America to National Immigrant Day, a tribute not only to the Irish, but to the idea that Americans are all part “other.” That may be why the holiday was slower to take off among the Irish diaspora in other nations around the world, where people are less comfortable with hyphenated identities.

Only more recently, once it was established as a bona fide American cultural phenomenon, and again aided by such Irish cultural ambassadors as U2, Guinness, and those ubiquitous pubs, did St. Patrick’s Day become a full-fledged global celebration—one whose spirit was re-imported, in its Americanized form, back to Ireland itself.

So, wherever you may be on this day, raise a glass to toast not only good old Ireland, but America’s interpretation of it as well.

Mike Cronin is a professor at Boston College and the academic director of its program in Dublin. He is the author, with Daryl Adair, of The Wearing of the Green: A History of St. Patrick’s Day. He wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.



TIME Education

How Would Students Spend the Principal’s Money?


Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

A Phoenix high school’s experiment shows that kids can prioritize and collaborate when their education is at stake

During the 2013-14 school year, Quintin Boyce, the principal of Bioscience, a public high school in Phoenix, took a portion of his discretionary budget and told students they could decide how it was spent. He set no rules, except that the projects should benefit the school community. He knew many things could go wrong, but he trusted the students to allocate the money responsibly and fairly.

This was a historic experiment – to the best of our knowledge, it was the first time that American high school students had used participatory budgeting, a process that we, as scholars of participatory democracy, have studied. But the Bioscience budgeting was more than a historical footnote – it was an answer to the broader problem of participation.

Americans are often told to participate in local democratic decision-making, but we are rarely told how. And so most of us don’t know how. Effective participation requires knowledge of local laws, regulations, and processes. It also demands skills like active listening, public speaking, negotiation, conflict resolution, open-mindedness, compromise, and a willingness to move from self-interest to the common good. Since none of these are innate, these civic and democratic competencies must be learned somewhere. Like schools.

The idea of schools as a place to produce democratic citizens is an old one, but it has been little practiced. Schools seldom nurture the capacity to participate in local democracy, and civics curricula have eroded over the past two generations.

To counter this trend, some educational institutions have begun trying to teach democracy through democratic processes. Bioscience is an interesting case. It is a STEM (science, technology, engineering, and math) public high school with approximately 300 students of diverse socioeconomic backgrounds. The student population is more than 60 percent Hispanic, and roughly two-thirds of students qualify for the Free and Reduced Meals program. Bioscience teachers emphasize project-based, student-centered learning through exploration and inquiry.

This atmosphere and comfort with experiential learning made Bioscience a favorable environment for launching a school participatory budgeting project. Participatory budgeting, or PB, is a democratic process of deliberation and decision-making over budget allocations that began in 1989 in Porto Alegre, Brazil, and is currently implemented in more than 1,500 cities around the world. Participatory budgeting provides not only a more transparent and accountable way of managing public money but also a means for participants to learn more about their community.

In the U.S., the first participatory budgeting process took place in Chicago in 2009; PB has since been used in New York, Greensboro, N.C., San Francisco, Long Beach, Boston, and post-bankruptcy Vallejo, Calif. Although each process is different, participatory budgeting tends to follow a similar structure. In the first phase, residents identify local needs, brainstorm potential ideas to address those needs, and elect delegates to represent individual communities in deliberations. In the second phase, delegates discuss their communities’ priorities and formulate project proposals. Then, delegates bring these proposals to the community for a vote. The most popular projects are funded and implemented, and the process begins again in the next year or budget cycle.

Although participatory budgeting is normally implemented at the municipal level, there have been participatory budgeting experiments in other settings (like public housing) and with specific age groups. For instance, Boston is in the second year of a participatory budgeting project that allows teens and young adults to decide together how to allocate $1 million of the city’s budget.

At Bioscience in Phoenix, which broke ground as the first school to implement participatory budgeting with students, the democratic work began with the design of the process itself. First, each grade level elected four student representatives to a steering committee. This committee of 16 students created the rules for the PB process and invited all students to submit proposals. In the first year, a total of 45 students collaborated on preparing 30 proposals.

In the next phase, the steering committee refined the final pool of projects to 18 by eliminating incomplete proposals and designed promotional materials to inform all students about the competing projects. Students hung posters in the school cafeteria that described the 18 projects and their total costs, everything from $157 for volleyball equipment to $1,000 to fund a music club, and from $217 for a school garden to $740 for shade umbrellas in the school courtyard.

More detailed project descriptions were shared in class at each grade level, and a slide show presentation of the projects was posted on the school’s internal social media site. At forums divided by grade level, steering committee members led discussions about each project so that students could ask questions and debate the merits of each option. Finally, the steering committee distributed ballots in each class to their peers. All students were given the option to vote on their three favorite projects. The steering committee tallied the votes and submitted the results to their principal. Throughout the process, nearly the entire student body participated in some form.
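To make the mechanics concrete, here is a minimal sketch, in Python, of how a three-choice ballot tally and a budget-capped selection might work. The four project costs come from the article; the sample ballots, the budget figure, and the greedy selection rule are invented for illustration (at Bioscience, the committee tallied paper ballots by hand, and the principal ultimately chose to fund all three winning projects even though they exceeded the budget).

```python
from collections import Counter

# Four of the 18 proposals and their costs, as described in the article.
PROJECT_COSTS = {
    "volleyball equipment": 157,
    "school garden": 217,
    "shade umbrellas": 740,
    "music club": 1000,
}

# Each ballot names a student's three favorite projects.
# These ballots are invented for illustration only.
ballots = [
    ("school garden", "volleyball equipment", "shade umbrellas"),
    ("music club", "school garden", "shade umbrellas"),
    ("school garden", "music club", "volleyball equipment"),
]

def tally(ballots):
    """Count one vote for every project named on any ballot."""
    counts = Counter()
    for ballot in ballots:
        counts.update(ballot)
    return counts

def select_projects(counts, costs, budget):
    """Fund projects in descending order of votes until the budget runs out."""
    funded, remaining = [], budget
    for project, votes in counts.most_common():
        if costs[project] <= remaining:
            funded.append((project, votes, costs[project]))
            remaining -= costs[project]
    return funded, remaining

if __name__ == "__main__":
    counts = tally(ballots)
    funded, unspent = select_projects(counts, PROJECT_COSTS, budget=1200)
    for project, votes, cost in funded:
        print(f"{project}: {votes} votes, ${cost}")
    print(f"unspent: ${unspent}")
```

The interesting design choice is the selection rule itself: whether to enforce a strict cap, fund strictly by popularity, or leave the final call to the principal is exactly the kind of question a steering committee has to deliberate.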

The three projects that received the most votes were educational in nature. The first was a sustainability education display for the school’s courtyard, the second was color ink for a student-built 3-D printer, and the third funded camera adapters for laboratory microscopes. The three most popular projects exceeded the PB budget by a few hundred dollars, but Principal Boyce was so enthusiastic about the way the process unfolded that he agreed to fund all three options. Reflecting on the experience, he told us he felt honored to provide an opportunity for students to participate in the improvement of the school community, and he hopes this experience will inspire students to engage in—and change—their communities.

At the end of that school year, Dr. Boyce was transferred to another school, and there was some concern that his successor would cancel the experiment. But the incoming principal, DeeDee Falls, decided to continue the process. The second cycle of participatory budgeting is currently underway. At this early stage, students are brainstorming ideas like soccer goals, a bike-share program, an aquaponic system, and an algae reactor to produce biofuel. Principal Falls told us that this process makes sense for a school like Bioscience, which tries to involve students in many aspects of their education. For her, participatory budgeting is valuable because it encourages students to collaborate with their peers and make meaningful decisions together.

What better way to learn democracy than by doing it?

Matt Cohen is a Ph.D. candidate in the School of Sustainability at Arizona State University. Daniel Schugurensky is a Professor in the School of Public Affairs and the School of Social Transformation at Arizona State University.


TIME society

Are Cars Driving Into the Sunset?


Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

How our love affair with automobiles is changing in the face of climate change and denser urban living

On a typical Saturday night in the 1970s, Whittier Boulevard in East L.A. would have been thumping with lowriders—those lacquered, richly colored sedans with chassis that could bounce up and down with the flip of a switch. Slow cruising in a Chevy Impala was perfect for people watching and showing off your glorious Frankenstein handiwork.

Cars have long defined who Americans are, how we socialize, where we live, and where we work. They still have a hold over us—just look at how many Fast and Furious movies keep coming at us—but the world we drive in is changing. It’s now been about a century since we were introduced to cars. Gas prices are on the rise while wages stay flat. We’re increasingly aware of how burning fossil fuels harms the environment. And commutes into downtown from the ever-expanding suburbs can take two hours or longer. In advance of the Zócalo/Metro event, “Is Car Culture Dead?” we asked experts to weigh in on the question: In an age of climate change and dense urban living, what role will cars play in our lives?

Who says ‘mass transit’ can’t include cars? — Geoff Wardle

This may be shocking coming from someone who supports cycling for mobility—but I would argue that cars could become the mass transit of the future.

As we contemplate future cars and other road vehicles that drive themselves, there is an opportunity for huge paradigm shifts in the way that we as individuals access cars, which will radically alter the nature of the automobile industry. Indeed, if automated road vehicles can fulfill their promise of creating an efficient, self-organizing streaming of vehicles along our infrastructure with a significant reduction in vehicular, pedestrian, and other road-related accidents; and if those vehicles can become highly energy efficient and matched precisely to our individual journey needs, then cars could provide much more efficient, convenient, and sustainable mobility than buses, trains, and subways.

Geoff Wardle is executive director of the graduate Transportation Systems and Design program at Art Center College of Design in Pasadena.

Living with less ‘stuff,’ including cars — Victoria Namkung

I think driving your own car is becoming less important to people living in dense urban areas where public transportation, walkability, and rideshare apps such as Uber and Lyft are readily available. From increased awareness of climate change and dependence on foreign oil to the expense of car insurance, parking, the soul-sucking time spent in traffic, and road-rage incidents, driving comes at a high cost these days.

When I first moved to L.A. 17 years ago, blinged-out Hummers were a major status symbol. Today, most people would look down upon you for driving a gas-guzzling eyesore. What was “cool” 10 or 20 years ago simply doesn’t fly today, especially in our post-recession economy where people’s credit and finances have been drastically cut. Today, it’s all about personal responsibility, living with less “stuff,” including cars, and caring about the environment and future generations.

I’m particularly excited about the forthcoming Expo Line train between downtown and Santa Monica, since there’s a stop just a couple of blocks from my house (which means I can easily meet friends downtown for dinner or hit the galleries in Culver City). That commute has trapped drivers in their cars for years, and soon people will be able to save money, get more exercise, and talk to fellow commuters for a change. For those living and working near the Expo Line, I think we will see numerous two-car households go down to one-car households.

American car culture will not go away anytime soon, particularly in suburbs and rural areas where there is no other real option for transportation, but it’s hard to believe we’ll see another renaissance of car culture in the tradition of cruising, hot rodding, low riding, or import car racing. Well, maybe not until Tesla’s mass-market Model 3 comes out.

Victoria Namkung is a Los Angeles-based writer and cultural commentator. She received her master’s degree from UCLA and wrote a thesis on import car racing and Asian-American youth in Southern California.

Free bus passes won’t make cities like Albuquerque stop worshipping the Ford F-150 — Virginia Scharff

Let’s start with more questions. How many places do you need to go every day? And how can you get where you need to go?

The answer to all these questions depends a whole lot on whether you live in New York City or Los Angeles, Portland or Albuquerque. Everybody in New York takes the subway—check out recent Instagram pictures of Dame Helen Mirren and Keanu Reeves on the trains. Everybody in Portland (Oregon), a city that embraced multimodal public transport, brags about the light rail, streetcars, and buses. Every Portland hipster owns a really cool bike, and many of them actually ride.

I live in Albuquerque, a car city like Los Angeles. It’s hard to get where you need to go without a car. Urbanists and environmentalists here would love to get drivers to use the buses (free passes for university students, staff, and faculty!), bike routes, and services like Uber. Twenty-somethings like my own kids do take the bus and ride bikes. People who live in Albuquerque and work in Santa Fe (or the reverse) can commute daily via the Rail Runner train.

But we are at a disadvantage. Cities that invested in mass transportation and encouraged density already possess assets that car-culture cities will envy as the planet heats up. We’re seeing many more hybrids, electric cars, and Smart cars in Albuquerque, where we worship the Ford F-150 and the 1970s Oldsmobile. But in cities where most of us have to be many places every day, and we measure the distance between home and work and school and groceries in multiples of miles and chains of destinations, people will cling to their steering wheels for dear life.

Virginia Scharff is associate provost for faculty development and distinguished professor of history at the University of New Mexico. She is the author of Taking the Wheel: Women and the Coming of the Motor Age (1991), The Women Jefferson Loved (2010), and novels under the name of Virginia Swift.

Millennials actually like cars, and they’re here to stay — James E. Moore, II

Let’s assume for the sake of discussion that climate change is occurring and that greenhouse gases from human activity are the culprit. If you analyze greenhouse gas emissions per passenger mile, public transit and automobiles have very similar numbers outside the New York metropolitan area. As hybrids penetrate the market and fleets shift to take advantage of cleaner and cheaper natural gas (yes, lower prices are here to stay), automobiles emerge as part of the solution to reducing greenhouse gas emissions.

When it comes to density and transit, what people actually do runs contrary to what many pundits expect and many urban planners hope for. Cities continue to decentralize, and grow most quickly when they do. The 2013 American Community Survey of work trips reports that 80 percent of the small national increase in transit ridership was in only six metropolitan markets, and 40 percent was in New York. Los Angeles has lost transit riders. Now the share of L.A. commutes on mass transit is at 1980 levels.

The media drumbeat that the Millennial generation is rejecting automobiles and suburban living is fanciful, not factual. I often rely on Wendell Cox’s Demographia.com for U.S. trends in housing, population, transportation, employment, and underlying economic forces. These data show that, when it comes to housing, Millennials tend to prefer more rather than less. The fraction of Millennials living in traditional urban cores dropped between 2000 and 2010, and the trend for all age groups is toward detached homes in suburban locations with bigger houses and lots. These changes were most pronounced at the urban fringe and outer suburbs, where delivering transit service is a challenge. Millennials prefer the personal and scheduling freedom provided by the automobile, just like almost everybody else.

So cars will continue to play many roles in our lives, getting most of us to work, and enabling the consumption of goods, education, entertainment, and leisure, even if someone or something else is driving them. Now if you will pardon me, I have a ride to catch on Uber.

James E. Moore, II, is vice dean of USC’s Viterbi School of Engineering and director of the transportation engineering program.

This article was written for Zocalo Public Square.


TIME society

I’m Ashamed of Who I Am on Twitter


Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

The popular social media network is fueled by exposure and fear of being out of touch

I am ashamed of the way that I am on Twitter. I am ashamed of the things that I write, those wan attempts at wit and weak gestures toward wisdom. I am ashamed that the things I write go unread, less cultural signal than digital noise. I am ashamed of my need for approval, of the way I yap like a puppy at my more famous friends. And, if I’m being honest, I’m ashamed that my mother, whose last two tweets were about me, has four times as many followers as I do.

I am ashamed of the way that I am on Twitter, and my clearest consolation is that I am far from alone. Indeed, there are others, so many others, whose shame is surely greater than my own. Justine Sacco, a corporate communications executive, faced public derision after an ill-advised, and seemingly racist, tweet spread across the site in 2013. Sacco has recently returned to the public eye thanks to a widely circulated New York Times Magazine story by the journalist Jon Ronson, offering a powerful reminder of just how easy it is to get into trouble online, and just how long the effects can last. Her experiences might have taught a thing or two to Keith Olbermann, who was forced into a leave of absence last week after he took to Twitter to spar abusively with Penn State students and alumni over their charity efforts.

When we revisit such self-destructive lapses, we almost always talk about the spectacle of public shaming rather than the experience of being ashamed. Unnoticed in these scandals is the fact that Twitter evokes some of shame’s fundamental characteristics long before anyone actually gets into trouble. On Twitter, we tango with shame from the moment we first log on to the day we delete our accounts. If Twitter has become a powerful tool for shaming others, it may be because it recreates the basic shape of shame so well.

Shame, it often seems, has more to do with proximity than isolation. The psychologist Silvan Tomkins, for example, positioned shame in opposition to excitement and interest, suggesting that we encounter it when we get too caught up and need to pull away. We feel ashamed when we worry that others are too close to us, too close for us to hide the parts of ourselves that we dislike. Worse still, they are too close to see the whole person: Peering at us through magnifying glasses, they observe only our ugliness.

To put it another way, then, shame is all about exposure, and on Twitter exposure is the fundamental currency. Some of my friends speak of deleting tweets that failed to garner attention. I confess that I’ve done the same, eager as I am to curate an image of myself at my cleverest and most popular. I, at least, worry that my unnoticed tweets (and there are so many of them!) show me to be out of sync with the world. When I write something—whether clever or heartfelt—and get no response, my first feeling is not one of loneliness. Instead, I worry that I’ve actively shown myself to be out of touch with those around me. Tell an unfunny joke at a party, and only a few people will roll their eyes; make the same mistake online and the eye rolling is potentially unlimited.

At its core, shame involves a feeling of misattunement, the lingering sensation that we’re up to one thing while the rest of the world is doing something else altogether. Precisely because it promises to connect us with everyone at once, Twitter almost inevitably exposes us to this exact sensation of misattunement. On Twitter, we always teeter on the brink of shame—both because no one sees us and because too many do. Thinking we have the world’s pulse, we speak up, only to realize we’re drumming in an altogether different rhythm. What could be more shameful than that?

Perhaps because they always tarry with shame, Twitter’s users carefully police the act of shaming, treating it as a privileged tool and punishing those who employ it incorrectly. Keith Olbermann, for example, arguably ran into trouble for trying to make others feel bad about their conduct. Some have offered psychologizing explanations for Olbermann’s own behavior, but his true mistake may have been his decision to go after the Penn State collective as a whole, even when he was addressing individual students. As Jennifer Jacquet notes in her new book Is Shame Necessary?, “For the reason of increased anonymity in larger groups, shame can be weaker in big groups.” Because shame has everything to do with isolation, it is difficult, if not impossible, to use it against a group that sees itself as a coherent whole. Unable to isolate his targets, Olbermann effectively ostracized himself.

We never feel smaller than when we stand alone, never more so than when the world arrays itself against us. By insisting on pithiness, Twitter only amplifies this sensation of smallness. Simultaneously, it proves that nothing looms as large as a few short characters. This is, of course, an old idea. Long ago, the German Romantic philosopher Friedrich Schlegel proposed that fragments are interesting because they point toward absent wholes, thereby activating the imagination. Paradoxically, this allows fragments to promise more than any complete object can. An intact ancient vase teaches us about the hand that shaped it, while a broken one can tell the story of an entire civilization.

A tweet, likewise, threatens to describe the person who composed it rather than the conditions of its composition. Speaking to Vulture, Jon Ronson observes, “We’re creating a hard, frightening world where somebody can get defined by their stupidity, as opposed to their stupidity being put into a sort of wider human context.” Ronson’s forthcoming book So You’ve Been Publicly Shamed seeks to counteract this trend by examining the experiences of the shamed, not just the things that made them targets. Nevertheless, shame may be embedded within the very tools we use to communicate today, especially those that privilege speed and brevity.

I am ashamed of the way that I am on Twitter, and you probably are too. I am ashamed of all those little fragments of me, all those splinters of a self. And yet there is a consolation in this. When we’re in the thick of it, shame blots out everything else. But Twitter reminds us just how small most of the things that shame us really are.

Jacob Brogan is a writer based in Washington, D.C., currently working on a book about the cultural history of lovesickness. He is too ashamed to ask you to follow him on Twitter. He wrote this for Zocalo Public Square.


TIME society

When Homework Is a Matter of Life and Death

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

My parents fled Iran because they were forbidden from getting an education there. I've spent over one-third of my life on a university campus

The first hint of sunlight glows off the horizon as I rush toward Stanford Hospital from the parking garage, white coat in hand, stethoscope bouncing against my chest. Every few steps, the diaphragm of my stethoscope ricochets off the silver pendant my mother gave me—a nine-pointed star etched with a symbol of my Bahá’í faith. My mother escaped Iran at age 17 as the country was on the cusp of revolution—a revolution that would create a society where, to this day, Bahá’ís like myself are barred from obtaining a university education. But here, in the United States, I’ve spent more than a third of my life on a university campus.

The Bahá’í faith was founded in 19th-century Persia, and is now the largest non-Islamic minority religion in Iran. Persecution of our religion has helped it expand around the world—my own family’s escape to the United States in 1979 guaranteed that I would be born to the freedom and opportunities denied to Bahá’ís back home.

Back in Iran, the state bans Bahá’ís from studying at universities, just one of many forms of persecution that have included desecration of cemeteries, confiscation of property, and wrongful imprisonment. However, because education is such a fundamental principle of our faith, Bahá’í students there have to learn in secret—usually through the Bahá’í Institute of Higher Education (BIHE), whose volunteers quietly teach classes in homes or via online portals. The threat of arrest is constant; the government recently imprisoned both BIHE students and professors, some at the notorious Evin Prison, which has held many prisoners of conscience. I, on the other hand, had the freedom to receive a bachelor’s degree in bioengineering from Rice University and am now in an M.D.-Ph.D. program at Stanford University, filling my brain with pathophysiology and methods of statistical analysis, which I hope to use to serve the community.

Sometimes I find the sheer volume of learning to be overwhelming, but then I take a deep breath and remind myself how fortunate I am to be able to acquire knowledge freely. Inside the hospital, it’s all bustle. I’m greeted by beeping pagers, an antiquity forgotten by the rest of the outside world, as I make my way to my morning clinic. As soon as I arrive, I glue myself to the computer and begin mentally dissecting patient charts. My first appointment of the day is a lovely woman with Type 2 diabetes who is just beginning to get her blood sugar under control. Between patients, I pore over the medical literature, making sure I understand each patient’s problems.

In my afternoon clinic, one of the residents excitedly approaches me. “You speak Farsi, right?” I nod. “I have a patient who would be really happy to meet you.” She gives me the room number, and I walk gingerly toward the room, already feeling self-conscious about my accent. I walk in and greet Mrs. H. in Farsi; her face instantly glows with a smile. I ask about the course of her cancer, how she’s feeling, and if she has any questions. She tells me she’s doing well and that the therapy has put her in remission. Then, she asks me where my parents live (Dallas), whether I’m married (I have been for three years, to a fellow Bahá’í I met at Stanford), and if I cook Persian food (I wish). At the end, she tells me how proud she is to see a young Iranian woman becoming a physician.

That evening, as I enter my house, I’m surprised to hear voices coming from my living room. But then I remember that my husband, a volunteer BIHE professor of engineering, was scheduled to give a lecture. I peek into the living room, where he is lecturing into his laptop on how circuits work. The information is over my head, but the students halfway around the world are excitedly asking questions. They are huddled on a beautiful scarlet-colored Persian carpet and are dressed like typical American college students—jeans and comfortable sweaters.

I quietly walk in, take off my stethoscope, and sit on the couch across from my husband. I close my eyes and touch the pendant around my neck, trying to imagine, just for a moment, what it would feel like to be on the other side. When I open my eyes, I feel an overwhelming mix of emotions. I’m incensed that rulers anywhere would deprive individuals eager to learn of the chance to contribute to society, and deprive society of their contributions. And yet I can’t help but feel hope for a generation of Iranian Bahá’ís who are so motivated that not even the threat of arrest can extinguish their passion for knowledge.

And, with a feeling of gratitude, I crack open my 500-page textbook on internal medicine and pore over medications for treating Type 2 diabetes.

Roxana Daneshjou is an M.D.-Ph.D. candidate at the Stanford School of Medicine and a recipient of a Paul and Daisy Soros Fellowship for New Americans. She wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.

