TIME Science

What It’s Like To Work in the Body Donation Industry

Zócalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

I had to ask next-of-kin uncomfortable questions about the deceased

For over three years, I thought about death every day. This wasn’t some morbid obsession. It was my job.

A growing number of senior citizens—both permanent residents and part-time “snowbirds”—have settled in neighborhoods and mobile home parks across Phoenix, Arizona. These out-of-state transplants and retirees have created rising demand for alternatives to traditional funerals.

When I was 23, I started working for one of several whole-body donation organizations that serve the region. When someone dies, his or her family has several options for the body: a viewing and burial, cremation, embalming, or donation. Families can also donate specific tissues, such as organs, corneas, and skin.

My time in the whole body donation industry began on Craigslist. The help wanted ad didn’t list the organization’s name. When my future boss called to set up an interview, I thought she was a telemarketer. I could’ve hung up then, missed the opportunity, and been none the wiser about the death industry.

Instead, I stayed on the phone, and she ended up hiring me for the receptionist position.

I answered phones, typed up letters, and ran to the post office. It was a normal office job, except for the cadavers less than 100 feet from my desk, sealed away in the laboratory.

Soon I was promoted and began taking “notification of death” calls. Some people signed up to donate their bodies to science. Family members or a hospice nurse called to inform me of the donor’s passing, and I arranged for mortuary transportation. Other times, people called in dazed. Their father or mother or sister or spouse had died, they told me, and they didn’t know what to do.

I couldn’t do much. I couldn’t undo their loved one’s death or say something wise to make everything better. But I could give them a to-do list: call this person, sign this, fax this, and answer these questions.

It was an industry I hardly knew existed until I was part of it. Soon, I started picking up on references to donation on television shows like Bones and Law & Order: SVU. The storylines usually involved a nefarious character in a suit, stolen body parts, or a funeral home with a hidden autopsy suite.

These plotlines are not baseless. The reputation of the body- and tissue-donation industry as a whole has been damaged by the actions of dishonest hospitals, funeral homes, and doctors. Less than a decade ago, the CEO of a tissue recovery firm was arrested for selling illegally harvested body parts. Funeral homes, donation companies, and hospitals have also been exposed for forging consent forms and unlawfully obtaining tissue. In 2013, I watched as FBI vans and news helicopters swarmed a nearby tissue donation firm. Our phones began ringing off the hook. I assured panicked callers that no, that was not our organization on the television, and that yes, we could help with arrangements for their loved ones. I feel lucky to have worked for an ethical business in such a loosely regulated industry.

My supervisor required us to inform people fully about the nature of whole body donation, answer all questions, and proceed only with witnessed, written consent. Not all religions and communities support donation. I made it clear that people should donate only if they were 100 percent comfortable with the process.

Whole body or anatomical donation places organs, body parts, and other tissues with medical facilities. The tissue is then used for training doctors, developing new treatments and medications, and researching diseases, from breast cancer to dementia. What is not used for research is cremated and returned to the family.

Death is expensive. Traditional funerals cost upwards of $6,000, and even simple services can force families to choose between paying rent and paying for a cremation. The organization where I worked covered the cost of mortuary transportation, cremation, and the return of ashes to the family. We took care of all the necessary paperwork and tried to whittle down any other costs. Usually families were left paying only the county for death certificates, which run about $20 apiece.

Some people decided to donate because of financial hardship, but for most, the decision was not about money. I saw people who had suffered for years from cancer sign up for donation because they thought it was one final way to fight the disease.

Sometimes, especially when the person died in hospice, a nurse or social worker would step up and help the family plan for a funeral home or alternative service. Other times, it was a family member who called. They found us in a Google search or were referred by a friend and were fumbling through a bewildering situation.

If the deceased was registered to donate, I started the transportation and donation process immediately. If they were not, I ran through a list of questions. Depending on the answers, I either carefully told the next-of-kin that their loved one did not qualify for whole body donation or informed them that they were accepted. Hepatitis C or HIV/AIDS rules out donation of any kind.

The majority of my workday consisted of filling out medical questionnaires. I called the family, sometimes within hours of the passing, and asked questions ranging from routine medical history to intimate social details. As an introverted child and teen, I had been too shy to call in a pizza delivery order. In this job, I was on the phone calmly inquiring about drug use, tattoos, and sexually transmitted diseases.

With the phone cradled on my shoulder, I frantically typed and Googled unfamiliar medical conditions. I memorized the correct spellings of aneurysm, levothyroxine, and myelodysplastic syndrome.

The most surreal part of the job came after a donation was complete, when I personally delivered the ashes back to the family. I logged thousands of miles on my VW station wagon. The other coordinator and I drove so much that the owners eventually bought a company car that was much nicer than either of our vehicles. I traveled to every corner of the valley, to multi-million dollar homes, gated suburbs, and rusted trailer parks.

Some days were good. The families were in mourning but thanked me profusely. I could tell they were content with their decision. People invited me into their living rooms. They showed me photos of their loved ones and detailed their plans to spread the ashes on the beach or in the forest or in the Colorado River.

Other days, not so much. I visited hoarders with houses so crammed with card tables, boxes, and cat food that I had to come in through the garage. I delivered to homes in the empty stretches of Apache Junction that gave off a distinct meth house vibe, complete with cardboard jammed over the windowpanes and a television set smashed on the lawn. After delivering the ashes of a 20-year-old to his grieving mother, I sat in the car for 20 minutes and tried to shake off a wave of sadness.

People expressed shock to see me, a person in her 20s, on their doorstep, holding their loved one’s ashes. They remarked on how young I was. (I think they expected a gaunt-faced, middle-aged man in a gloomy suit.)

“How’d you end up with this kind of job?” they’d ask.

The death industry is not an easy business to work in. You carry the sadness and anger of your work day home with you. Sometimes you have nightmares. After reading too many medical examiner’s reports, you create a mental list of ways you do not want to die.

I still didn’t come any closer to understanding death. I couldn’t ever define what I wanted after my death, but I realized that talking about it was the best way to prepare for it. Ignoring death just leaves empty spaces and gaps for the survivors to guess their way through.

One thing that struck me was how the reports always describe the state of cleanliness of a residence where someone dies. Now I find myself making a point of keeping my home neat, because you never know when death might visit. And I’d like to avoid the judgment of the medical examiner.

My time in the industry gave me endless anecdotes and cautionary tales. With my boss’s encouragement, I re-enrolled in school and accepted a journalism internship. I left the job after three years, but felt a decade older. I knew my time in the death industry was over.

Whitney M. Woodworth, an Arizona State University Cronkite Sustainability Fellow at Zócalo Public Square, is a senior at ASU.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Culture

How Americans Fell in Love With Taking Road Trips

As the automobile industry took off, drivers discovered the romance and freedom of long-distance travel

Tens of millions of Americans have hit the road this summer. The all-American road trip has long been a signature adventure, but once upon a time the notion of your own motorized excursion of any length would have seemed impossible.

In 1900, Americans were hampered by wretched roads and limited by the speed and endurance of the horses that powered buckboards, coaches, and wagons. If they had an urge to travel long distances, they had to rely upon the steam locomotive.

As fantastic as it might have seemed at the turn of the 20th century, the idea of supplanting the iron horse with the horseless carriage did catch the fancy of some intrepid men and women. Eager to test the technological limits of their new contraptions, a few hardy souls set off upon far-reaching expeditions between 1900 and 1910.

Colorado attorney Philip Delany, recounting his 1903 excursion from Colorado Springs to Santa Fe, observed: “and so the machine is conquering the old frontier, carrying the thudding of modern mechanics into the land of romance. . . .” Such travel meant seeing “the wildest and most natural places on the continent,” encountering more than a few hints of danger on steep and rocky mountain roads, and reliving the exploits of American pioneers. “The trails of Kit Carson and Boone and Crockett, and the rest of the early frontiersmen,” he declared, “stretch out before the adventurous automobilist.”

At the same time, some city dwellers simply sought an escape. Early 20th century urban environments had their drawbacks: sidewalks overflowing with scurrying pedestrians; streets crowded with unending waves of trolleys, delivery wagons, carriages, and pushcarts; the persistent stench rising from mounds of horse manure; raw sewage emptying into open gutters; rotting piles of uncollected garbage; and dense clouds belching from factory smokestacks.

Upper-middle-class tourists motored through the countryside and then camped by the side of the road, finding the sentimentalized image of the gypsy or the tramp quite a compelling identity to assume. They reveled in their sense of independence from stodgy summer resorts and the tyranny of inflexible timetables set by railroads or steamship lines. They delighted in the beauty and serenity of unspoiled countryside. In the same article quoted above, Philip Delany observed that “when [the automobilist] is tired of the old, there are new paths to be made. He has no beaten track to follow, no schedule to meet, no other train to consider; but he can go with the speed of an express straight into the heart of an unknown land.”

In its infancy, however, an automobile could not deliver most Americans from their urban frustrations—for most Americans could not afford to own and operate one. At a time when average annual salaries might not reach $500, many automobiles might cost between $650 and $1,300, securely beyond the grasp of all but the wealthiest. Moreover, with few garages, filling stations, and dealerships outside of city limits, even the infrastructure required for the care and feeding of the automobile could be difficult to locate and could drain the motorist’s wallet. During their earliest years, neither automobiles nor auto touring could be considered within the reach of the masses. Automobility would only become pervasive over time, thanks to rising wages, falling prices for used cars, expanding opportunities to buy these machines on credit, and, especially, the introduction of Henry Ford’s revolutionary Model T in 1908.

Even for those Americans who could afford the first horseless carriages, to go off the few familiar paths in most parts of the country, especially in the great distances of the trans-Mississippi West, required a large measure of self-reliance. One motor traveler characterized the roads of his native Wyoming in 1909 as “deep ruts, high centers, rocks, loose and solid; steep grades, washouts, or gullies . . . ” He went on to note that, “unbridged streams; sand, alkali dust; gumbo; and plain mud, were some of the more common abominations.” Between the obstacles presented by such abysmal road conditions, the likelihood of frequent mechanical breakdowns, and the rarity of supplies to sustain driver and vehicle, these early outings always required an audacious spirit.

Aspiring long-distance auto tourists back then were counseled by self-proclaimed experts to carry abundant quantities of supplies. Those who made the first transcontinental drives between 1901 and 1908 hauled along ropes, blocks and tackle, axes, sleeping bags, water bags, spades, camp stoves, compasses, barometers, thermometers, cyclometers, first aid kits, rubber ponchos, tire chains, pith helmets, assorted spare parts, and sufficient firearms to launch a small insurrection. Mary C. Bedell’s impressive list of gear, published in her entertaining 1924 account of auto touring, Modern Gypsies, typifies what was carried by the most dedicated motor campers both in scale and variety: “tent, duffle bags, gasoline stove, Adirondack grate and a kit of aluminum kettles, with coffee pot and enamel cups and saucers inside”—an array of equipment that added “four or five hundred pounds” alone to the weight of the fully loaded automobile. A car so laden, puffing along western trails, bears a striking resemblance in the mind’s eye to a hermit crab staggering across the ocean floor burdened with its house on its back.

Even as motoring Americans loaded up their cars with the contents of their local hardware stores, however, the growth in their numbers year by year provided alluring prospects to entrepreneurs in small towns and great cities throughout the West. Garages, gas stations, roadside cafés, and diners began to pop up along more frequently traveled routes while hotels, restaurants, and general stores started to advertise in the earliest guidebooks produced by organizations such as AAA and the Automobile Club of America. Following the lead of Gulf Oil in 1914, gasoline retailers commissioned maps branded liberally with their logos for free distribution at their service stations. Motorists once left entirely to their own devices now encountered a rapidly evolving infrastructure of goods and services.

Meanwhile, governments at the local, state, and federal levels began to invest increased engineering skill, construction efforts, and tax dollars in road improvements. While motor tourists by the end of World War I might still encounter 10,000 miles of battered gravel trails littered with potholes for every 10 miles of carefully surfaced and maintained roads throughout the country, the increasing pace of improvements made it far easier to drive through the West than it had been for those who had attempted such a journey only a decade before.

Although still new to the American scene by 1920, the road trip thus had begun to take on a shape familiar to modern eyes. Above all, the automobile was assuming a dominant role in popular recreation as more and more Americans incorporated it into their visions of recreation and leisure. As costs fell and reliability increased, as the successful outings of the few began to inspire the many, and as the thrill of this new technology spread through an ever-wider range of the populace, motoring for pleasure insinuated itself as a notion in the minds of many Americans. Indeed, less than a decade after the turn of the 20th century, author William F. Dix could assert that the automobile had become nothing less than a “vacation agent” for motor-savvy Americans as it “opens up the countryside to the city dweller, [and held out the promise of] great national highways stretching from ocean to ocean and from North to South.” Over those highways, he continued, “would sweep endless processions of light, graceful, and inexpensive vehicles . . . carrying rich and poor alike into a better understanding of nature and teaching them the pure and refreshing beauties of the country.”

While Dix fell far short as a prophet of social or technological developments, his sense of how inextricably linked the automobile would become in the leisure pursuits of Americans has been thoroughly borne out by the evolution of the American road trip.

Peter J. Blodgett is H. Russell Smith Foundation curator of western historical manuscripts at the Huntington Library and editor of Motoring West, Volume 1: Automobile Pioneers, 1900-1909. He wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zócalo Public Square.

TIME Society

Policing My Hometown Is a Labor of Love

A California deputy chief explains the challenges of protecting the city he grew up in

We hear a lot these days about rifts between police and the communities they serve, especially in communities of color. I come from both worlds: I am a deputy police chief in my hometown of Salinas.

You read a lot about crime in East Salinas, but I’m proud to have grown up there. The city was different and smaller (about 80,000 vs. 155,000 today) when I was a kid. My family was typical; on both sides, I had grandparents from Mexico who came here to work in fields and packing sheds.

My schools—Fremont Elementary and El Sausal Middle—had an assigned police officer (an early version of today’s school resource officers). I made friends with those officers. Those relationships—and the fact I had strict parents whose rules kept me away from drugs and gangs—led me to consider law enforcement as a career.

After high school, I enrolled at Hartnell Junior College. The college had a campus safety program run by the Salinas Police Department; I enrolled in classes and patrolled the college, providing security services and parking enforcement. Before I graduated, I became a reserve deputy with the Monterey County Sheriff’s Department.

After Hartnell, I was accepted to Fresno State but put my education on hold to pursue my career. I applied for jobs with the sheriff’s department and the Salinas police. The sheriff’s department offered me a job first, so I took it. After the academy and training, I worked in the King City substation, but I wanted to come home and work out of the Salinas station, which serves unincorporated areas like Castroville and Prunedale.

After time on patrol, I took a job as an investigator in the sheriff’s coroner division. I felt the pull of Salinas. I applied with the police department and was hired. I’ve done a number of different jobs—training officer, SWAT, detective, the gang unit—and found time to complete bachelor’s and master’s degrees. Over the next 20-plus years, I was promoted to sergeant, lieutenant, commander, and then, in 2013, I became one of two deputy police chiefs.

I work at staying grounded in the community. I’ve been on the board of directors of three local nonprofits—our own Police Activities League, Second Chance Youth & Family Services, and Sun Street Centers, a drug and alcohol abuse program.

The Salinas crime picture is complicated. Start with socioeconomic disadvantages, absentee parents, educational challenges, then add in the glamorization of gang lifestyles in music and movies, and the intrinsic desire of kids to belong to something. The gangs are better armed than they used to be, and the prison gang and drug cartel influences have made the problem even more challenging.

While the problem is real, the perception of violent crime in East Salinas has been overstated. Gangsters are a small minority here. Most people are honest and hardworking and want a safe place to live, work, and raise a family.

In the police department, we are trying to use technology to work smarter and target the most violent individuals. The days of just driving around and trying to arrest gangsters are over. We rely on intelligence and statistical analysis to identify troublemakers. We combine that with place-based policing, as part of our overall violence reduction strategy.

This worked particularly well in the neighborhood of Hebbron, where two officers took care of quality-of-life issues and really got to know the people there.

Unfortunately, we had to temporarily shut down our place-based policing program—and all our other special units. We are more than 30 officers below our budgeted force. The attrition and retirement rates have been outpacing our hiring efforts.

Police recruitment and staffing are problems everywhere—and the recent media coverage of alleged police misconduct is not helping matters—but Salinas has special challenges. We’re just an hour’s drive from Silicon Valley, where police recruits can make three times the salary and not have to put their lives on the line.

I see hope in the improvements people are making in neighborhoods, the work that planners are doing to attract small businesses, the investment by Taylor Farms in its new headquarters downtown. If we can make Salinas peaceful and safer, the future could be very bright.

Right now, we’re focused on the fundamentals—and on building our ranks. We have 13 people in the academy and need more. Our goal is to better reflect the community we serve. We even have a grant for eight school-based police officers, and I’d love to have the people to staff it. I know firsthand the impact even one officer in one school can make on a kid in Salinas.

Dan Perez is a 30-year veteran of local law enforcement. He is the married father of three who enjoys photography and travel in his spare time. This essay is part of Salinas: California’s Richest Poor City, a special project of Zócalo Public Square and The California Wellness Foundation.

TIME Politics

Let California Pick the Next President

Forget the caucus in Iowa—let's host one on the Golden State's Central Coast

Correction appended, Aug. 14, 2015.

At the risk of sounding like Donald Trump, let me say it’s just stupid that California won’t play a significant role in picking the next president. It’s even dumber that the small state of Iowa, with its first-in-the-nation caucuses and swing status in general elections, is a presidential kingmaker.

So why don’t we do something about it? Yes, California has previously moved its primaries up in the calendar to try to make itself important. But that won’t work because the small-fry states of Iowa and New Hampshire have hoodwinked the country into believing that small, rural places are better presidential proving grounds and give a chance to lesser-known candidates. No matter where California shows up on the calendar, we are easily dismissed for our size—too big to be more than a test of money and name recognition.

If we’re going to take our proper place in picking presidents, we’ll need an entirely new strategy. We have to out-Iowa Iowa. We must make ourselves smaller.

How? California is a collection of regions the size of normal states. Our new strategy: pick one region that offers all the things Iowa offers—small population, a rural character, no big cities, an engaged political culture—and hold an early presidential contest in just that region.

Which region? My fellow Californians, let me introduce you to the Central Coast Caucus.

Offered to the nation as a single political entity, the six Central Coast counties—Ventura, Santa Barbara, San Luis Obispo, Monterey, San Benito, and Santa Cruz—could answer every argument that’s ever been made for Iowa’s primacy.

You want a small population? The Central Coast has just 2.3 million people—800,000 fewer than the 3.1 million who crowd Iowa. You hate big cities? The Central Coast’s most populous municipality, Oxnard, has fewer people than the metropolis of Des Moines. You want rural voters who know their agriculture? Iowa has the corn, sure, but the Central Coast has Ventura berries, Salinas lettuce, and all the wineries in between. (Heavy drinking may be necessary for today’s presidential politics to make any sense.)

Iowa and the Central Coast are middling places, between larger, more important regions. Both have relatively competitive politics that incorporate extremes, but tend to the moderate. While Iowans boast that they pick winners, the Central Coast includes the state’s most reliable political bellwether, San Benito County.

But the Central Coast offers much more than Iowa—more diversity, stronger universities (UC Santa Cruz, Cal Poly San Luis Obispo), better scenery (Big Sur), and more striking venues for campaign events (from Pebble Beach to the Neverland Ranch).

Instead of pursuing endorsements of obscure Iowa county party chairs, the Democrats would have an Oprah primary (she has a place in Montecito) and the Republicans would stage a Clint primary (Eastwood lives in Carmel). If you give the media and political professionals who dominate presidential politics a choice between the weather of Sioux City and Santa Maria, California would take Iowa’s crown.

The Central Coast Caucus would be good for California, too. The national attention would force our weak county parties to raise their games. With the new caucus’s central location, young people from all over the state would work on campaigns, and learn skills and make connections that can change their lives.

Overlooked issues would also get attention. Candidates would confront homelessness in Santa Barbara, the perils of offshore oil drilling, and drought. Public health might get a boost from candidates doing photo ops at hot yoga classes in Seaside instead of greasy spoons in Cedar Rapids. (I’d pay good money to see Ted Cruz try kite surfing.) And who knows? Maybe the heavy reliance on migrant labor in Central Coast agriculture would force candidates to speak in more human ways about immigration.

The Central Coast Caucus would make sure that no one gets to be president without the sign-off of some Californians. And it would demonstrate California’s greatness; it only takes a small piece of this state to conquer the world.

Joe Mathews is California & innovation editor for Zócalo Public Square, for which he writes the Connecting California column.

Correction: The original version of this article incorrectly identified the Central Coast’s most populous municipality. It is Oxnard.

TIME Education

How I Built a Place to Keep Kids Out of Prison

I was tough on crime for years as a prosecutor, but then I realized that the kids who weren't in trouble needed help, too

I recently sat in an amphitheater in Salinas and watched students receive their high school diplomas and training certificates from the Drummond Culinary Academy. A few hours later, I attended the graduation ceremony for the Construction Academy. I spoke earlier that month at the ceremony for students graduating from the Silver Star youth program; many of these 15- to 18-year-olds were on probation when they started it. In total, 43 high school diplomas were issued on the Rancho Cielo campus just this year, adding to the ranks of 200 graduates over the last decade who have received hands-on vocational training, college credits, and leadership training opportunities.

As I sat through these ceremonies, I recalled when I was a county prosecutor and the Rancho Cielo Youth Campus consisted of nothing but an unsightly 100-acre dumpsite in the foothills of Salinas. Today, Rancho Cielo is a comprehensive program to educate and train young people in Monterey County for job opportunities—and keep them out of incarceration facilities like the Natividad Boys Ranch that once occupied the site.

The former Natividad buildings have been renovated into classrooms and administration offices, the gym has been completely remodeled, and the old cafeteria now stands as a state-of-the-art culinary school and restaurant. We currently have a 600-person amphitheater, a transitional housing village constructed with the help of our construction academy students, two lakes, and miles of running and biking tracks all across campus. The transformation of our 100-acre facility occurred in less than 12 years.

The creation of Rancho Cielo is an unlikely story; my own participation in it was even more unlikely.

Salinas is primarily known as a rich agricultural region, the nation’s salad bowl. But the city also has gained notoriety for its gang violence and high youth homicide rate. I gained firsthand knowledge of the cycle of violence here—first during a long tenure as a Monterey County prosecutor and later as a Superior Court judge. I devoted most of my 21 years on the bench to criminal cases. During my career, I was responsible for sending a lot of young people to prison. That was my job.

By the mid-1990s, California had gotten tough on crime (“Use a gun and go to prison” and the three strikes law) and the legislature was severely restricting judicial discretion. At work, I found myself having to decide if an 18-year-old kid would be sentenced to 46 years to life—or 52 years to life. Most of the young people who stood before me were men of color who, because of multiple factors, had never had the opportunities that are supposed to be afforded to all our kids in this great nation.

There was also a bit of economic irony. Very few services were provided for young people involved in criminal activity before they got in trouble. But once the trigger was pulled, all sorts of resources were directed to them—police, prosecutors, a defense attorney, the judge, the judicial system, probation officers, and of course, prison incarceration. After a while, I didn’t feel as good as I once did about my job; I didn’t feel as if I was making things better. So I decided to do something about it.

I had learned there was one strategy that actually worked to engage disenfranchised young people—the combination of education, job training, and eventually, employment. These three critical experiences allow youth to reconnect with communities from which they feel alienated and help build the self-esteem and self-confidence that many lack.

I also knew of a county-owned, 100-acre, abandoned facility in Salinas—Natividad Boys Camp—and felt it would be the perfect location (beautiful land and just far enough from the streets of Salinas) for programs to help struggling youth regain trust in themselves and in our community. I tried to convince our county to restore the facility as a site for youth programs, but was told it would take $20 to $30 million to reopen the doors. It took the help of some friends in the legal community to form a nonprofit and convince the county to lease me the property.

Initially, my board of directors consisted of mainly elected officials. Frankly, we didn’t accomplish much. I was able to raise enough grant money to fund a feasibility study of my idea, but that $26,000 study concluded that the Rancho Cielo project was totally impossible. I decided to change direction and replaced my board of directors with people in the business community—construction industry leaders, in particular, since they were willing to get to work revamping the old building along with the kids.

I had no money, but we moved forward anyway, commencing work on the property in 2003. When I arrived at 7 a.m. on that first Saturday morning, 75 pickup trucks already covered the hills; 22 dump trucks from various trucking companies lined the road. It was a beautiful sight to see. We never looked back.

We worked every Saturday and Sunday for the next six months until we could relocate an established probation and education program from the old hospital building. We did all this with a little bit of money and a lot of donated labor and materials. It was a true community project. One of the first out-of-pocket expenses was an electrically controlled gate at the entrance to the ranch—since providing a safe and peaceful campus for those who chose to turn their life around was essential to the success of the program.

I also insisted that no kids be sentenced to Rancho Cielo. Judges could recommend Rancho Cielo, but we wanted this ranch to be considered an opportunity for success rather than a punishment.

In 2004, I decided to retire from the bench and took over the running of the ranch as president and unpaid executive director. I also convinced my wife to come out of retirement to run the office. We did that for about four years until we could afford a full-time staff.

Today, there are nearly 150 students between the ages of 16 and 24 on campus every day. Forty percent of students are female, and nearly 70 percent of our students are on probation. Although Rancho Cielo is open to all of Monterey County, nearly 70 percent of our youth are residents of Salinas.

There are multiple programs on campus. The Silver Star youth program is a public-private partnership with Monterey County’s offices of probation, behavioral health, and education. We also run the Rancho Cielo Youth Corps, the Drummond Culinary Academy, the Rancho Cielo Construction Academy, and our newest program, the Independent Living Village—a transitional living program for homeless and at-risk Monterey County men between the ages of 18 and 24.

Our latest project, and by far the largest, is the Ted Taylor Vocational Center, a four-wing, 28,000-square-foot facility that will nearly double the number of students that we can serve. We are in the midst of our first-ever capital campaign in order to fund that project.

I am continually amazed at how the people in our vast county, from Salinas to the Monterey Peninsula, from King City to Pebble Beach, want to help. Many individuals in our community volunteer their time to mentor and educate our students, exposing them to the arts, music, camping, and recreation.

When you provide young people with an encouraging environment and the opportunity to rediscover themselves, they begin to hold their heads up high and start thinking, often for the first time, about their future. The model works; we’ve reduced recidivism by 80 percent among students in the program (the rate of our students staying out of trouble is twice that of young people exiting incarceration without the benefit of our program). And the costs of our prevention and intervention programs amount to approximately 10 percent of the cost of incarceration.

We’ve found that 83 percent of our young people are still working or in college one year after they finish their program. Some 200 have graduated so far. As they enter, they pass a sign that reads, “You have just passed through the gates of opportunity—welcome to Rancho Cielo.”

John Phillips, a former Monterey County prosecutor and superior court judge, is founder of Rancho Cielo, and a newly elected county supervisor. This essay is part of Salinas: California’s Richest Poor City, a special project of Zócalo Public Square and the California Wellness Foundation.

TIME Culture

Why and When Did Americans Begin To Dress So Casually?

It's all about freedom

I study one of the most profound cultural changes of the 20th century: the rise of casual dress. I study casual dress as it evolved on the beaches of Miami. I study casual dress as worn by the Black Panthers and by Princeton undergraduates. As a professor, I teach seminars on material culture and direct graduate students as they research and curate costume exhibitions, but my bread-and-butter as a scholar is the “why” and “when” of how our sartorial standards went from collared to comfortable.

I happen to own 17 pairs of sweatpants, but I am a convert to casual. As a teen, I scoffed at the wrinkled khakis of my high-school colleagues and scoured the thrift stores of central Pennsylvania in search of the most non-casual clothes I could find—wasp-waist wool dresses, opera gloves, and evening bags. By my mid-20s, I realized I no longer wanted to pry my 6-foot-tall body into uncomfortable clothes and stay in them for hours. While my Clergerie-clad best friend chased down taxis and potential husbands in 3-inch heels, I chose cowboy boots and a pair of overalls that same friend said made me look like an oversized baby. For me, casual is not the opposite of formal. It is the opposite of confined.

Our American casual style uniformly stresses comfort and practicality—two words that have gotten little attention in the history of fashion but have transformed how we live. A hundred years ago, the closest thing to casual was sportswear—knitted golf dresses, tweed blazers, and oxford shoes. But as the century progressed, casual came to encompass everything from worker’s garb (jeans and lumberman jackets) to army uniforms (again with the khakis). Americans’ quest for a low-key style has stomped on entire industries: millinery, hosiery, eveningwear, fur, and the list goes on. It has infiltrated every hour of the day and every space from the boardroom to the classroom to the courtroom.

Americans dress casual. Why? Because clothes are freedom—freedom to choose how we present ourselves to the world; freedom to blur the lines between man and woman, old and young, rich and poor. The rise of casual style directly undermined millennia-old rules that dictated conspicuous luxury for the rich and functional work clothes for the poor. Until a little more than a century ago, there were very few ways to disguise your social class. You wore it—literally—on your sleeve. Today, CEOs wear sandals to work and white suburban kids tweak their L.A. Raiders hat a little too far to the side. Compliments of global capitalism, the clothing market is flooded with options to mix-and-match to create a personal style.

Despite the diversity of choice, so many of us tend towards the middle—that vast, beige zone between Jamie Foxx and the girl who wears pajama bottoms on the plane. Casual clothes are the uniform of the American middle class. Just go to Old Navy. There—and at The Gap, Eddie Bauer, Lands’ End, T.J. Maxx, and countless others—t-shirts, sweaters, jeans, sports shoes, and wrinkle-free shirts make “middle classness” available to anyone who chooses to put it on. And in America, nearly everyone wants to put it on because nearly everyone considers himself or herself to be middle class.

The “why” behind casual dress is a hand-clappingly perfect demonstration of fashion theorist Malcolm Barnard’s idea that clothing does not reflect personal identity but actually constitutes it. As one of my students put it, “So, it’s not like ‘Hey, I’m a hipster and then I buy skinny jeans and get a haphazard haircut,’ but more like in becoming a hipster, I get the jeans and the haircut.” Yes.

In wearing cargo shorts, polo shirts, New Balance sneakers, and baseball hats, we are “living out” our personal identification as middle-class Americans. Our country’s casual style is America’s calling card around the world—where people then make it their own. It is witnessed by the young boy on the Ivory Coast wearing a Steelers jersey and in the price of Levi’s on the black market in Russia. Street styles in Tokyo harken back to the campuses of Harvard and Yale in the 1950s—tweed sports coats paired with t-shirts and saddle shoes. Casual is diverse and casual is ever-changing, but casual was made in America.

As far as the “when” of our turn to casual, three major milestones mark the path. First, the introduction of sportswear into the American wardrobe in the late 1910s and early 1920s redefined when and where certain clothes could be worn. The tweed, belted Norfolk suits (complete with knickers and two-tone brogues) of the Jazz Age seem so formal by our “flip-flops-can-be-worn-everyday” mentality, but these garments were truly revolutionary in their time. As were the sweater sets and gored skirts worn by women. The trend towards casual flowed in one direction, as one period observer noted in a 1922 article in the San Francisco Call and Post: “Once a woman has known the joys and comfort of unrestricted movement, she will be very loath to go back to trailing cumbersome skirts.” The mass acceptance of sportswear coincided with the consolidation of the American fashion industry, which had previously been disjointed and highly inefficient. By the end of the 1920s, centralized firms produced designs, worked with manufacturers across the country, and marketed specific kinds of garments to specific demographics.

A second milestone towards casual was the introduction of shorts into the American wardrobe. A flare-up in the popularity of bicycling in the late 1920s brought about a need for culottes (which look like a skirt but are actually shorts) and actual shorts—usually to the top of the knee and made of cotton or rayon. Shorts remained time-and-place specific for women (gardening, exercising, and hiking) until the Bermuda shorts craze of the late 1940s, when women turned plaid wool shorts into legit fashion and began experimenting with length.

At all-male Dartmouth College in May 1930, the editors of the student paper challenged their readers to “bring forth your treasured possession—be it tailored to fit or old flannels delegged” so that the men could “lounge forth to the supreme pleasure of complete leg freedom.” The students listened. The Shorts Protest of 1930 brought out more than 600 students in old basketball uniforms, tweed walking shorts, and newly minted cutoffs, and introduced shorts into the American man’s wardrobe.

With a higher tolerance for different genres of dress and a newfound appreciation for non-constraining garments, Americans moved into the 1950s with more options to self-create than ever before. Fundamental to this freedom—apart from the suburban department store boom and the onslaught of media (magazines, television, film)—was a “unisexing” of our wardrobe, the third milestone on our quest to go casual. While bohemian types wore pants in the 1910s and 1920s, most women really didn’t wear them until the 1930s, and it was not until the early 1950s that pants went mainstream. There were still discussions and regulations about women in pants well into the 1960s.

That decade saw seismic shifts in “unisexing.” Women adopted t-shirts, jeans, cardigans, button-down collared shirts, and for the first time in nearly 200 years, it was fashionable for men to have long hair. James Laver, a renowned historian of dress, told a group of fashion industry executives in 1966, “Clothes of the sexes are beginning to overlap and coincide.” He recounted a recent experience walking through his town “behind a young couple” who “were the same height, both with long hair, both with jeans, both with pullovers, and I couldn’t tell them apart, until I looked at them from the side.”

To dress casual is quintessentially to dress as an American and to live, or to dream of living, fast and loose and carefree. I’ve devoted the past decade of my life to trying to understand “why” and “when” we started dressing this way—and I’ve come to many conclusions. But for all the hours and articles, I’ve long known why I dress casual. It feels good.

Deirdre Clemente is a scholar, public historian, and teacher. She is the author of Dress Casual: How College Kids Redefined American Style (UNC Press, 2014) and has published articles in The Atlantic and Harper’s Bazaar, among other publications. She served as a historical consultant for the Baz Luhrmann film The Great Gatsby (2013). For more information, visit www.deirdreclemente.com. She wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zócalo Public Square.

TIME World Affairs

Don’t Blame Germany for Greece’s Debt Crisis

German Chancellor Angela Merkel at a press conference after meeting with Albanian Prime Minister Edi Rama in Tirana on July 8, 2015. (Gent Shkullaku—AFP/Getty Images)

No country has done more to democratize and raise Europe's living standards

Germany knows a thing or two about being punished for bad deeds, but in recent weeks the country has been the poster child for the old adage that no good deed goes unpunished.

There is no other way to describe the reputational black eye Germans received as a result of the drawn-out Greek bailout negotiations that culminated last week in the approval of a deal struck with other eurozone countries.

No country has done more than Germany in recent times to raise living standards and democratic norms across Europe, which is one reason the country that once bedeviled the continent emerged as the world’s most admired nation in a 2013 BBC poll conducted in 25 countries. But over this summer, it’s been stunning to see how easily we can be lured back into embracing darker stereotypes of those bullying, inflexible Germans.

The prevailing narrative of the Greek crisis in U.S. media, and on social media, was that the poor Greek people and their idealistic young Prime Minister Alexis Tsipras were being driven to the brink by heartless creditors, led by Germany’s Chancellor Angela Merkel and her Finance Minister, Wolfgang Schäuble. The Germans and their “troika” of servants (the European Central Bank, the European Commission, and the International Monetary Fund) had saddled the Greeks with too much debt, imposed counterproductive austerity policies on their government, and demanded humiliating reforms that violated Greek sovereignty.

It’s hard not to empathize with the people of Greece: the country’s GDP shrank by a staggering 25 percent in just five years. But the prevailing narrative of a morally tidy showdown between stingy, stubborn Germanic creditors and their victimized Greek debtors overlooked a number of inconvenient truths.

For starters, the Greek debt crisis was triggered in no small part by the 2009 revelation that the Greek government had falsified its economic data to make the country appear a member in good standing of the eurozone. A second fact often overlooked: This is actually the third bailout in the last five years, and in 2012, the Greeks did benefit from a $117 billion write-off of debt owed to private banks. Third, much of the roughly $380 billion in remaining debt is owed to sovereign nations, meaning that the true creditors in the story are German, Dutch, French, and other European taxpayers, not greedy banks or faceless international bureaucracies. Fourth, while Greece did adopt painful fiscal austerity in recent years, it has been slow to carry out many of the needed structural reforms (such as privatizing state-owned enterprises) it agreed to under the previous bailouts. This would be akin to a U.S. company gaining protection from creditors in a bankruptcy proceeding, without engaging in a difficult reorganization to make it a more viable enterprise going forward. The bloated and inefficient Greek state accounted for a stunning 59 percent of the nation’s GDP in 2013.

Furthermore, the issue of whether to provide more aid to Greece this summer was never solely a bilateral German-Greek issue. The countries most adamant about being tough on the Greeks were not the Germans, but poorer eastern European Union member nations. But the only “real people” in too much of the coverage of the drama were Greeks, at the expense of people elsewhere in Europe who are understandably frustrated at bailing out a nation where many express their European solidarity by not paying their own taxes.

Among pundits eager to portray Berlin as the villain of the saga, a favorite charge (pushed by Thomas Piketty, among others) has been that the Germans are ungrateful hypocrites. Their own national debt, after all, was substantially reduced by international creditors in 1953. The extent to which this analogy has been repeated without any curiosity as to the underlying facts, or context, is a distressing case study in uncritical groupthink. A good portion of the debt at issue in 1953 dated back to the vanquished Nazi regime and its predecessors; after World War II, the Federal Republic had assumed responsibility for it (at old exchange rates favorable to creditors) as an act of good faith, as part of Germans’ larger sense of atonement. Germany literally lay in ruins, and was divided, and there was no doubt about whether the German people were doing their part to dig out of the rubble and make amends. That debt hadn’t been accumulated in just a few years while the country was simultaneously violating the terms of its obligations, as in the case of Greece.

It’s especially galling to hear Americans chide the Germans for their supposed lack of generosity, when you consider how differently Germany and the U.S. have reacted to the distress of less fortunate regional economic partners. German taxpayers invested for decades in the development of peripheral European Union members, spent trillions to develop Eastern Germany, and will now pay a good chunk of a third Greek bailout that will be somewhere in the neighborhood of $100 billion.

In contrast, the United States has refused to incorporate into the North American Free Trade Agreement the type of regional development funding that Europeans deemed essential to a functioning common market. And when Mexico faced an existential debt crisis in the 1980s that was far more severe than the one Greece is experiencing, Mexicans could only have wished that Washington had reacted to their plight as Berlin has reacted to the plight of Greeks.

After walking up to the cliff and contemplating a break-up of their currency union, the leaders of Greece and their European counterparts in Berlin, Paris, and Brussels all blinked, and worked out a deal to keep Greece in the eurozone, at a painful cost to both sides (more money coming out of the pockets of other European citizens, more painful austerity for Greeks). This was a case of political and historical imperatives trumping economics, at least for now.

The European Union is a monument to Germany’s atonement for its past sins. From its very inception in the 1950s, when it was born as a coal- and steel-producing union, what eventually came to be known as the European Union was considered by its French architects as a means to subvert German nationalism, and to make its repentant people pay more than their fair share for a common project largely directed by the French.

When Germany suddenly had the opportunity to end its postwar partition a quarter century ago with the collapse of the Soviet bloc, French and other European leaders pressed the Germans to abandon their cherished currency and symbol of hard-won stability, the deutsche mark, in favor of a shared European currency. The more intensely Germany was bound to a broader European Union, the theory went, the less likely a reawakening of a troublesome German nationalism. And so, Germany agreed to the euro once Europeans went along with German reunification.

I remember visiting Germany’s Foreign Minister Joschka Fischer in 2000, in the immediate aftermath of this transition. I asked Fischer if Germany could ever become a normal country again, fully off probation, fully atoned, fully entitled to wave its own flag, at least alongside that of the EU. Not really, he said, and then he talked passionately about Germany’s need to always act within the Atlantic and European communities.

But the ensuing years proved bullish ones for German nationalism. Berlin surprisingly refused to go along with the United States in its showdown with Saddam Hussein. World Cup successes (as a host and contender) emboldened Germans to pull out their flags and cheer on their country as if it were no longer on probation.

But maybe the most shocking development has been the degree to which the European Union has come to be perceived as a project inspired by Germany, as opposed to one imposed on it. Henry Kissinger once said that Germany was to be pitied because it was too big for Europe but too small for the world. So it was perhaps inevitable that the EU would come to be seen as a projection of German influence. The euro, which Germans were initially so reluctant to adopt, proved a competitive boon to German exports, by raising costs in the rest of Europe.

This monetary straitjacket of 19 very different economies sharing the same currency has had its economic pluses and minuses, but the initial impetus to take this plunge into the economic unknown in the 1990s was all political: the goal of reinforcing a broader European identity across the continent. And whatever its economic merits, the political endeavor is backfiring: The shared currency has only strengthened nationalist sentiments. Try sharing a credit card, and thus your credit rating, with a very diverse group of friends, and sooner or later you’ll also find that it’s not easy to cheerily embrace the “we’re all in this together” plan.

Europe’s integration over time, and Germany’s role in it, is a nuanced, complicated tale, and Americans have a vested interest in its success. Lazy caricatures of Germany that harken back to World War stereotypes may make a dry economic tale more entertaining, and seduce us into thinking we’re rooting for the supposed underdog. But they are a disservice to the truth and our national interest. The Germany of Angela Merkel, not a socialist Greece, is our indispensable ally, the democracy that shares our values and can still teach us a thing or two about improving the lives of people beyond its borders.

Andrés Martinez is the editorial director of Zócalo Public Square, for which he writes the Trade Winds column, and a professor at the Walter Cronkite School of Journalism at Arizona State University.

TIME Health

Vaccinations Have Always Been Controversial in America

While creating the polio vaccine, Jonas Salk had to deal with critics like Walter Winchell, who warned, "It may be a killer"

In 1952, Americans suffered the worst polio epidemic in our nation’s history. As in prior outbreaks, the disease spread during the summer, mainly attacking children who had been exposed to contaminated water at public pools or contaminated objects in other communal places. The poliovirus entered the body through the mouth and multiplied in the gastrointestinal tract. Symptoms started innocently enough—a sore throat, a runny nose. As the virus moved throughout its victims’ bloodstreams, the pains soon began—electric shocks darting from the neck to the legs, muscle spasms. Within a day or two, paralysis set in. If the virus made it to the nervous system in the base of the brain, death came quickly. By the outbreak’s end, 58,000 people had been stricken. More than a third were paralyzed, many of whom spent the rest of their lives in a wheelchair or bed.

Most Americans today have no concept of the terror generated by polio throughout the first half of the 20th century. During epidemics, newspapers and magazines displayed adorable children struggling to walk in braces or entombed in iron lungs, but the disease mostly fell off the national radar after it was eliminated from the country in 1979. In the past few years, however, polio has begun creeping back into headlines, for two opposite reasons. On the one hand, thanks to the Global Polio Eradication Initiative, the world is closer than ever to wiping out the virus completely; widespread vaccination efforts reduced the number of cases to 414 in 2014, mostly in Pakistan and Afghanistan. On the other hand, given recent anti-vaccination trends, it’s not unreasonable to worry that polio might afflict Americans once again.

The person responsible for easing our minds over the past half century was Jonas Salk, a physician-scientist who was born in a New York tenement and driven by a passion to aid mankind. During the 1952 outbreak, with funds from the March of Dimes, he rushed to develop the earliest polio vaccine to use a killed, or “inactivated,” form of the virus. In doing so, he met resistance from more senior scientists who believed that only a vaccine made from a live virus could provide lifelong protection.

The public was desperate for a vaccine, yet Salk was afraid these scientists would try to derail his efforts. Objections from one even prompted the famed newscaster Walter Winchell to warn his radio audience not to take the vaccine, because “it may be a killer.” So Salk initially made and tested his vaccine in secret. Thankfully, his promising preliminary results led the March of Dimes to launch the biggest clinical trial in the history of medicine. Beginning on April 26, 1954, with a six-year-old named Randy Kerr from McLean, Virginia, the trial eventually involved 1.5 million children, and the results were remarkable: Salk’s vaccine was 80 to 90 percent effective in preventing paralytic polio. It was mass-produced and distributed around the country, and by the end of the decade, it had reduced the incidence of paralytic polio in the United States by 90 percent.

When the success of the vaccine trial was first announced, the public crowned Jonas Salk a national hero. He experienced a celebrity accorded few scientists in the history of medicine. Yet his rejection by the scientific community had only just begun. As heads of state around the world rushed to honor him, scientists—the one group whose adulation he craved—remained ominously silent. Basil O’Connor, director of the National Foundation for Infantile Paralysis/March of Dimes, said they acted as if Salk had committed a felony. They accused Salk of failing to give proper credit to other researchers whose work had laid the foundation for his own. In fact, Salk had tried to give them credit, but the media had made him the icon for polio, ignoring other scientists’ contributions. This set the stage for difficulties throughout Salk’s career, in which politics in and beyond the scientific community seemed to override good science.

In 1961, a public health decision was made to replace Salk’s vaccine with one developed by Albert Sabin, a virologist who constantly tried to discredit him. Sabin’s oral vaccine, made with a live virus, was cheaper and more convenient, but also much riskier; it actually caused polio in some cases. Salk spent the rest of his life trying to reverse the decision—a lone warrior fighting what he considered an entirely politically driven change. (In 1999, four years after his death, the Sabin vaccine was replaced with a new version of Salk’s vaccine, which is still used today.)

Salk also campaigned vigorously for mandatory vaccination, putting the health of the public foremost. He went as far as calling the immunization of all the world’s children a “moral commitment.” Thanks to his efforts—along with those of other researchers—we’re able to enjoy our summers without the fear of a crippling disease.

America has now been polio-free for more than 35 years, and children are supposed to be vaccinated as babies. We’ve reached the point, however, where it seems many people can’t believe an epidemic could really occur. Some parents refuse vaccination, arguing that a healthy lifestyle is enough to protect their children from potentially lethal infections. But studies have shown that improved sanitation actually enhances the circulation of poliovirus, because babies are no longer exposed to the virus in the very small amounts that once produced lifelong immunity. Poliovirus can spread relentlessly once it gets a foothold in an unvaccinated community.

Such was the case shortly after Salk’s vaccine was released in 1955. Massachusetts closed its vaccination program because a manufacturing error led to some contaminated shots. Even though the mishap was quickly corrected, the state did not reopen its program. That summer, Massachusetts suffered one of its largest epidemics. Four thousand people contracted polio, and 1,700 were paralyzed—mostly children.

Does the public want to repeat history? I think Jonas Salk would plead with us to learn the lessons of our past. Californians did after the recent measles outbreak, which affected more than 130 people, the majority of them unvaccinated. The outbreak helped spur the state to join Mississippi and West Virginia in mandating childhood vaccination, despite an outcry from several groups. Now if only the other 47 states would follow suit.

Charlotte DeCroes Jacobs is a professor emerita at the Stanford University School of Medicine and the author of Jonas Salk: A Life. She wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zócalo Public Square.


TIME health

We Can’t All Go to Fancy Yoga Classes


Economic and cultural disparities have stacked the deck against healthy choices in America—it doesn't have to be that way

Americans have more ways to be healthy than ever before. Organic vegetables line grocery store shelves. Yoga studios crowd city streets. Fitbits remind us to skip the elevator and Rocky Balboa it up the stairs.

But just because all these waist-thinning options exist doesn’t mean everyone has access to them. Fresh produce, fancy gyms, and habit-tracking technology are expensive, and cost is one of many prohibitive factors that keep millions of people from taking good care of themselves.

Obesity rates in America are staggering: More than a third of adults deal with the condition, and its demographics reflect a complicated web of cultural and socioeconomic influences. So how do we level the playing field? In advance of the Zócalo/The California Wellness Foundation event “Is Healthy Living Only for the Rich?” we asked medical experts and others invested in the public’s well-being: Given the structure and demands of everyday life in America, what can be done to make healthy living more accessible across classes?

Partner up to change people’s environments — Leonard Jack, Jr.

The burden of chronic disease in the United States is substantial. Chronic diseases—such as heart disease, stroke, diabetes, obesity, and cancer—are among the most common, costly, and preventable of all health problems. More than half of all adults in the U.S. have at least one of them.

Improving the overall health of Americans will involve reducing risk factors such as physical inactivity, poor nutrition, and limited or no access to quality healthcare. Beyond these factors, environment (communities, schools, workplaces, restaurants, parks) plays a large role in helping Americans achieve optimal health. The importance of creating safe places to exercise, increasing access to healthier foods and beverages, and reducing exposure to secondhand smoke in public places cannot be overlooked or minimized.

Bettering people’s environments requires strong partnerships and a commitment to making healthy choices easier. The best successes come where schools, local parks, businesses, faith-based organizations, clinical settings, and restaurants work together through a combination of approaches: establishing school policies that make salad bars and healthy vending machines available; adopting joint-use agreements with schools, churches, and businesses to increase access to safe places to exercise; working closely with corner grocery stores to increase access to fruits and vegetables; and changing the prices of healthier foods and beverages relative to less-healthy ones.

Leonard Jack, Jr. is the director of the division of community health within the National Center for Chronic Disease Prevention and Health Promotion at the Centers for Disease Control and Prevention.

First, address a community’s fundamental needs — Paul Simon

There is unfortunately no single, easy answer. We certainly need to encourage healthy eating and regular physical activity, and discourage tobacco use and unhealthful patterns of drug and alcohol use. However, these efforts are unlikely to make a dent in the glaring health inequities we see across socioeconomic and racial and ethnic groups unless we also create environments that promote healthy living.

Sadly, in most communities, and particularly in low-income communities, the deck is stacked against healthy choices. We need to shift the balance in favor of healthy options in landscapes largely dominated by junk food, unhealthy food and beverage marketing, tobacco shops, and liquor stores, with few if any options for recreation. Even these measures, though, are unlikely to achieve the desired results unless we address even more fundamental needs, including safe and affordable housing, high-quality education, and meaningful employment opportunities that offer a living wage.

Though these issues may seem far removed from healthy living, they are in fact among the most important factors that influence health across the lifespan. In addition, the chronic stress of living in communities rife with violence, disinvestment, and degradation exacts an enormous toll on one’s health. We need to work with and support communities to build resilience, strengthen social networks, and create opportunity. Anything less is like putting a Band-Aid on a festering wound.

Paul Simon is the director of the division of chronic disease and injury prevention at the Los Angeles County Department of Public Health. He is also a pediatrician, and teaches at the UCLA Fielding School of Public Health.

Think macro — Cherise Charleswell

Healthy living is truly a privilege in the United States, because many families are focused on survival. I will forever remember the words of a mother who spoke up at a grassroots community-organizing event I attended: She said even though she knows it’s not nutritious, she still gives her sons honey buns in the morning, because it’s quick, cheap, and at least they will have something to eat.

Simply stating that this woman—and others like her who don’t have access to healthy living—need more health education classes is not enough. There is arrogance in that notion. Surely, this marginalized population knows that they should be eating better, getting more exercise, and having more routine physicals. However, the barriers in their way seem to be growing—barriers that were among the key reasons protestors took to the streets during the Occupy campaigns. Where class is concerned, the United States is simply becoming the “haves” and the “have-nots,” and public health is suffering for it, as healthy living options become unattainable.

The problem is macroeconomic, and should be approached at that macro level, with urgently needed social policies and legislation that will make healthy living accessible. Speaking about the poor is taboo in American politics; politicians prefer to talk about the “shrinking middle class,” without providing any context as to where it is shrinking away to. For that reason, we need to begin to speak about privilege and inequity, and to consider diverse public health interventions, such as creating and maintaining community gardens, imposing higher taxes on sugar-laden foods, and extending physical education requirements in schools.

Cherise Charleswell holds a dual position as diversity officer and clinical researcher at Huntington Medical Research Institutes in Pasadena, California. She is the president of the Southern California Public Health Association.

Take care of the easy stuff — Jonathan M. Samet

This is a critical question with an easy answer: Much can be done, and the imperative to do a lot is strong.

In the United States, there are tremendous gradients in health and longevity by indicators of socioeconomic status. Underlying these gradients are patterns of harmful substance use (tobacco, excessive alcohol, and illegal drugs), availability and affordability of healthy food, psychosocial stress, and access to high-quality preventive and medical care.

What can be done in the short term?

  • Continue to drive down rates of tobacco use, while taking on the new challenge of various electronic nicotine delivery systems. In California, for instance, cigarettes are too cheap, so there’s an opportunity to reduce smoking through a tax increase.
  • Invest in research to find better interventions for alcoholism, as the problem of excess alcohol consumption remains deeply rooted.
  • Assess, address, and monitor the availability and consumption of healthy foods. We have the tools to do so, and already know the solutions: local growing and selling, and pushing for healthier options from the food industry.
  • Promote physical activity by identifying and addressing barriers to it (for instance, a lack of walkable routes, or a lack of education on the risks of a sedentary lifestyle).
  • Teach communities to advance their own health. Strategies could include engaging community leaders, providing model initiatives, and offering funding to foster innovation.

For some of the most critical factors—healthcare access and quality, and a strong base of jobs—solutions are long-term and lie outside the domain of local communities, but they should not be forgotten.

Jonathan M. Samet is the chair of the department of preventive medicine at University of Southern California’s Keck School of Medicine. He is also the director of USC’s Institute for Global Health.

Get people exercising — Siddhartha Angadi

Lower socioeconomic status is associated with higher rates of obesity and lifestyle diseases. However, it’s often incorrectly assumed that access to recreational facilities is critical to maintaining a good workout routine.

Exercise is potent and virtually “free” medicine that delivers health benefits independent of education level, financial status, or body size. Cardiovascular disease is the leading cause of death in the U.S., and exercise is extremely efficacious cardiovascular medicine. Current American Heart Association guidelines recommend that adults engage in 150 minutes of moderate-intensity aerobic activity or 75 minutes of vigorous-intensity activity per week.

While these goals may sound unattainable to a time-crunched person or busy family, it is important to note that accumulating 30 minutes of exercise in short bouts is just as effective at lowering cardiovascular risk. Short exercise sessions—as little as 10 minutes at a time—have been shown to be just as effective at reducing blood pressure as a single, longer session. A short walk or bike ride to work, the park, or a store can enhance heart health without the cost of a gym membership.

The salutary effects of exercise occur independently of body weight or body fat. In a series of studies carried out at the Cooper Clinic in Texas, researchers examined the effects of fitness on the risk of death from all causes and from cardiovascular disease. The key finding was that obese but fit individuals had the same risk of adverse outcomes as normal-weight, fit individuals, and half the risk of unfit individuals, whether normal-weight or obese.

In short, good heart health is possible at any size with a cheap pill called exercise.

Siddhartha Angadi is an assistant professor at Arizona State University. His research focuses on the effects of exercise and diet on cardiac and arterial function in patients with serious cardiovascular conditions. His doctoral student Jennifer Herbold assisted him in writing this response.

This article is supported by a grant from The California Wellness Foundation.


TIME society

Katy Perry Isn’t the Only One Who Wants to Live in a Convent

The Sisters of the Immaculate Heart of Mary property in the Los Feliz area of Los Angeles. Singer Katy Perry sought to purchase the property. Nick Ut—AP


Repurposing religious buildings should be done with sensitivity and purpose

I moved into a convent 10 years ago this summer.

My roommates were not Catholic sisters, but other recent college graduates, who sometimes acted a little too much as if we were still living in a college dorm. But most of our time was dedicated to serving our community—teaching, leading afterschool programs, counseling pregnant teens and gang members, working with the elderly—just as the sisters who preceded us in the convent had once done.

The news that pop star Katy Perry wants to buy a former convent in Los Feliz has me thinking about my days at Amate House, a full-time Catholic volunteer program in Chicago. The Los Angeles Times broke the story that two nuns are blocking the archdiocese from selling the estate to Perry, who wants to live there. Early coverage of the story centered on the sisters’ disapproval of the “I Kissed A Girl” singer.

My fond memories of convent living, though, make me wonder if the question of whether Perry is a suitable successor to the sisters misses the point. As our society becomes less connected to religious institutions, it may be more important than ever for communities to think creatively and sensitively about how to make use of formerly religious spaces.

I had never imagined that I would live in a convent. Amate House operates three houses, two of which were convents, with both male and female volunteers, and it is part of the Chicago archdiocese. But I approached it more like the Peace Corps or Teach for America: an opportunity to do something special, learn about life in the inner city, and give back—not to live out my faith. I identified myself as a “practicing-but-not-believing Catholic.” I had volunteered with my high school youth group through college, but I was more interested in Buddhism than Christianity.

Though I defied typical categories—neither fully Catholic nor a religious “none”—my experience reflects the trend of young Americans disaffiliating from institutional religion and forming their own religious identities and understandings.

My grandmother, in contrast, grew up wanting to be a Catholic sister. Unfortunately for her (but thankfully for me), she lacked the education to join a religious congregation. Instead, she got married and raised my father and his four brothers.

Seeking to understand my recently deceased grandmother’s devotion—why would a woman voluntarily commit her life to a patriarchal church?—I wrote about Catholic sisters for a class in college. The nun in her nineties that I profiled couldn’t explain her vocation other than as a call from God.

Her order had once occupied a huge motherhouse in my hometown and sent teachers to schools throughout the Midwest. In northwest Iowa, she had taught art to a budding cartoonist who would go on to work for Disney and draw the genie in Aladdin. But by the time I visited, they had moved to a smaller house, essentially a nursing home for sisters.

Their grand old motherhouse became Loyola University Chicago’s education school. The sisters were happy that a Catholic institution was continuing their legacy, but then Loyola moved to sell the property to a developer that planned to raze the convent and put in single-family homes.

The city intervened, and the building still stands as senior housing. But the sale of convents and churches to developers is not unusual. Around the same time, my parents moved into a development in a neighboring suburb that had been built on the grounds of a former convent. And when I lived in a convent, my window looked out on a Protestant church that had been converted to condos.

Such examples will become more common as people move away from institutional religion. Places that once brought together a community become individual units, our architecture seeming to reflect our spiritual trends.

Yet, many still long for a sense of togetherness, even if in untraditional ways. My convent roommates and I were not all regular churchgoers, despite living above a chapel where daily mass was held. Our “church” came in the form of meals, reflection nights, and service to the broader community.

But buildings can’t be preserved for the sake of community alone; their upkeep has to be paid for. In exchange for our service, our work sites paid Amate House small fees to cover our living expenses, including our convent housing. Another solution is to turn churches into community arts centers, renting space to nonprofits during the week. Both arrangements are a win-win for religious institutions and nonprofit organizations.

A year or so ago, I met with two sisters in Chicago who were in the process of opening a migrant shelter in an old convent, supported by an interfaith organization. They told me what Pope Francis had recently said at Centro Astalli, a refugee center in Rome: “Empty convents are not for the church to transform into hotels and make money from them. Empty convents are not ours, they are for the flesh of Christ: refugees.”

Intrigued by this tension between money and mission, I applied to and received an International Reporting Project fellowship to find out if Pope Francis had affected Italy’s welcome of migrants. Visiting Centro Astalli and other refugee centers around Rome, I met many migrants living on the street or in abandoned buildings, unable to find work or housing in their new country. Two men showed me how they survived while homeless in Rome, sleeping at Termini train station, passing their days in a park behind the Colosseum and seeking services at churches and convents.

For my last few days in Rome, I checked into a convent hotel along their daily path, a few blocks from Termini. Once again, I found myself in a spartan single.

My convent hotel was clean and comfortable, European beds being what they are. And for not much more than the price of a hostel, I had a private, quiet space.

Four sisters lived on the top floor, and one of them told me that they make themselves available to travelers with either logistical or spiritual concerns. Many orders consider hospitality to pilgrims part of their mission. In addition to tourists, they host student groups and families of patients from a nearby hospital. And the hotel helps fund their work in the missions.

Yet, when I saw the generous breakfast spread for what seemed like a handful of guests, I couldn’t help but think of the homeless migrants I had met on the streets of Rome. If the government, churches, or nonprofits paid for even a few migrants’ room at this convent, I wondered, how would the tourists staying there react?

Some argue that the pope’s statement against convent hotels reflects the male hierarchy’s desire to control the hard-earned assets of women in religious orders. In Los Angeles, the Katy Perry story is more about who manages the proceeds of the sale—the nuns or the archbishop—than about whether Perry or someone else is the next owner of the convent.

I, for one, would trust a group of sisters more than the archdiocese to put the millions earned from the sale to good use. Yet the sisters’ buyer, a driver of gentrification who is also currently refurbishing the former Pilgrim Church into a hotel and restaurant, is no more likely than Perry to transform the convent into a homeless shelter.

As religious institutions decline, not all religious buildings will survive. But as someone who enjoyed living in a convent—temporarily—I would hope that some could be transformed into shelters, art centers, homes for nonprofit or volunteer organizations, or other projects that benefit the whole community.

With a little creativity, Catholic sisters’ spirit can live on in a very concrete way.

Megan Sweas is the editor at the USC Center for Religion and Civic Culture, and a freelance journalist based in Los Angeles. She is the author of Putting Education to Work: How Cristo Rey High Schools are Transforming Urban Education. Reporting for this story was supported in part by the International Reporting Project.

