TIME Sports

How Watching Every Single World Cup Game Changed My Life

Football fans react during the FIFA World Cup 2014 football match between France and Ecuador on June 25, 2014 at a bar in Paris. STEPHANE DE SAKUTIN—AFP/Getty Images

Viewing all 64 matches in 2010 showed me true, unadulterated passion, and led me to quit my job.

Maybe I did expect to be in a New York City bar below 14th Street at 7 in the morning at some point. Doesn’t every newly minted college graduate hoping to make it in the Big Apple’s art and greed industries have that expectation? I didn’t, however, expect it to be mid-week, on my way to work, and involving zero women.

Nonetheless, I found myself at Dempsey’s in the East Village at dawn, watching the least anticipated game of the 2010 World Cup: New Zealand vs. Slovakia. I had embarked on a self-appointed quest to watch all 64 games taking place in South Africa over the course of a month. I had done this for the 1998 World Cup in France, but back then I was in high school, on summer vacation. Now I held a 9-5 job in corporate publishing.

I bounced around New York watching games before work and taking long lunches in the middle of the day in dark bars. I watched games surrounded by foreign nationals and the ever-growing USMNT fan base (that’s the U.S. Men’s National Team, for the uninitiated). I watched games on a bus to Boston and at my niece’s second birthday party. I kept a blog that I treated as a personal journal, and to my surprise about 150 people seemed to follow it. I even got some fairly technical questions sent in by readers, including “If Carlos Tevez is the second ugliest man in the tournament, who is the ugliest?” Some other small-time blogs linked to my own, lauding my ethnic ribbing of the gelled-hair, headband-wearing Italian national team. A Greek chap got angry at me for scorning his team’s goading tactics against an undisciplined Nigerian squad. I was having fun.

Drinking breakfast pints and waving the flag of nations I’d adopt for 90 minutes at a time was great, but of course I couldn’t sneak away from my job for every game. There were many I had to watch under fluorescents in my cubicle. This, though not ideal, greatly improved my workday. But when my boss interrupted the Ivory Coast match I was watching at my desk and sincerely apologized for doing so, I began to realize the low stakes of my work. It crossed into the absurd when a vice president from our corporate headquarters in Germany came into my cubicle to watch Die Mannschaft. She didn’t know my name, but she did know that my job was unimportant enough that no one would mind the interruption.

I hated my career. This was not a big reveal. It wasn’t Portugal’s thrashing of North Korea or being cramped on a bus trying to watch two semi-tedious games at the same time that made me realize that punching in numbers on a spreadsheet and passing along UPS orders was not the ultimate career for me. I had half-heartedly searched for new jobs, but I was comfortable where I was. I liked my colleagues, had a lot of friends at the company and captained the softball team. No, the 2010 World Cup did not shine a light on the fact that I didn’t care much about my job. But it did illuminate that no one else cared about it either.

Seven months later, I quit that job. Before the end of the next year, I’d left New York City for the Bay Area. As the World Cup in Brazil opens, I’m finishing my first semester of medical school in Australia (where I will be watching games in the middle of the night) and am set to have a wedding stateside in January. Watch all 64 games, learn about life, become a doctor, get married. FIFA should hire me for an ad campaign.

I love being an American fan of international soccer. There are very few times we as Americans get to be the clear underdog, and the World Cup is one of them. “300 million people,” a Scottish bartender at Luca Lounge in Alphabet City said to me with pity during the thrilling third-place match between Uruguay and Germany, “and you can’t find 11 who can beat Ghana at a football match.”

It took four years, but we proved him wrong. And even in 2010, the USMNT’s comebacks from 0-2 down, last-second game-winners, and a final gut-wrenching defeat in extra time made it clear that the U.S. could compete on the world stage. While I was standing on a chair tossing piles of napkins in the air, celebrating Landon Donovan’s dramatic goal against Algeria, Brooklyn’s Black Horse Pub burst into “The Star-Spangled Banner.” On the pub’s giant LCD screens, bars around the country were being broadcast, doing the same.

Anyone who has ever identified with Joan Didion’s “Goodbye to All That” or The Onion’s oft-shared article on realizing that New York is a horrible place to live—and perhaps anyone who’s thought to themselves that New York’s a nice place to visit but not to stay—should try to bounce around the city for the 2014 World Cup. I’ll never forget leaving the orange-clad Dutch fans at a Midtown bar blaring their euro-techno to celebrate a win over Brazil to join the African nationals rallying around Ghana at a South African restaurant in Fort Greene. They watched their continent’s last hope in the tournament dissolve twice: once when Luis Suarez’s handball denied them a victory, and again when the ensuing penalty kick rattled the crossbar. The crowd that had drummed and danced in the street at halftime fell silent, with tears in their eyes, until the quiet was punctuated by Ghanaians wailing in agony.

The 2010 World Cup showed me a city that forces you to interact with the rest of the world’s population. When I think back to the Chileans wrapped in flags and caked in face paint or the Korean grocer who left his shop to sneak a peek at the neighboring sports bar, there is no doubt I miss New York. The 2010 World Cup also brought me face to face with true, unadulterated passion, and when it was over, the void was intolerable. Unfortunately I had to leave a place I loved in order to do something I loved. But when the 2018 World Cup rolls around, I’m counting on returning to a bar below 14th Street—even if it’s at 5 a.m. to watch the match from Moscow. There are many reasons to call New York the city that never sleeps. My favorite is that somewhere around the world, a soccer match is being played.

Saba Afshar is a medical student at the University of Queensland Ochsner School of Medicine in Queensland, Australia. A version of this piece originally appeared at Zocalo Public Square.

TIME Media

The F-Word: Let’s Just Call It What It Is… [Bleep!]

Los Angeles Mayor Eric Garcetti raises a beer and swears during the Los Angeles Kings Victory Parade and Rally on June 16, 2014 in Los Angeles, California. Harry How—Getty Images

(But if you don't like hearing it, or saying it, or reading it, you should probably stop right here.)

When Los Angeles Mayor Eric Garcetti made headlines by declaring at a hockey rally, “This is a big fuckin’ day,” was he having a big fuckin’ day himself? Or rather, one for the f-word?

There are real data now to help answer such a question. Relatively recent technologies — cable television, satellite radio, and social media — provide us with a not-too-unrealistic picture of how often people swear in public and what they say when they do. Before these new forms of reporting, the media provided a fairly sanitized view of spoken English. Newspapers today still swerve to avoid swearing, opting for euphemisms like “_____,” “PG-rated expletive,” or “an eight-letter word for animal excrement,” instead of telling us what was really said. Fortunately, YouTube now offers people like me, who study language and profanity, a more accurate picture.

Are widely reported acts of swearing by public figures like Garcetti’s typical or not? And are the rest of us any different? How frequently do regular people swear and what do we say?

We language scientists attempt to answer these questions. In one study reported in the journal Science, less than one percent of the words used by participants (who were outfitted with voice recorders over a period of time) were swear words. That doesn’t sound like very much, but if a person says 15,000 words per day, that’s about 80 to 90 “fucks” and such during that time. (Of course, there’s variability: some people don’t say any swear words, while other people rival David Mamet.) More recently, my research team reported in The American Journal of Psychology that “fuck” and “shit” appeared consistently in the vocabularies of children between 1 and 12 years of age. And you shouldn’t worry — there is no evidence to suggest a swear word would harm a youngster physically or psychologically.

So please, let’s not be shocked by swear word statistics, or by politicians swearing in public. Politicians get caught swearing all the time. In 2000, George W. Bush referred to a New York Times reporter as a “major league a–hole.” In 2004, Vice President Dick Cheney told Vermont Senator Pat Leahy to go [bleep!] himself on the floor of the U.S. Senate. In 2010, Vice President Joe Biden called the passage of President Obama’s health care legislation “a big fucking deal.” (Granted, it was meant to be said more privately than the mic conveyed.) I place Mayor Garcetti’s profane celebration of the Kings’ Stanley Cup in the Biden category of Happiness-Induced Cussing.

But what happens when the viewer at home encounters these expletive-laced speeches on their TVs or the Internet? Some viewers take it personally, seeing it as classless or as a sign of moral degradation; I would argue they’re thinking only of the historically sexual meaning of the word “fuck.” But both Garcetti and Biden (along with Bono at the Golden Globes) used “fucking” as an intensifier, not as a sexual obscenity. Indeed, most swear words are used connotatively (to convey emotion).

The Federal Communications Commission waffles on what to do about Garcetti-style “fleeting expletives.” Fox Sports apologized for Garcetti’s “inappropriate” speech but it’s not clear if Fox will be fined by the FCC. (My best guess: probably not, since Obama’s commissioners are dovish on profanity.) The FCC ruled less liberally during the Bush years when conservatives had more sway. It’s interesting that people don’t complain as much about alcohol ads in professional sports. Alcohol can kill you, but swearing won’t; swearing might even help you cope with life’s stressors, according to recent research.

Older generations who are less understanding of technology may perceive that profanity represents a change in language or societal habits, even when that is not the case. Swearing by people in positions of power has always been there; it just used to be better hidden. We have to learn to accept that we are now going to hear more Garcettis.

And there’s something else you might have noticed. The day after any swearing incident nothing happens. No one has to be hospitalized or medicated. Yes, sensibilities may get jangled, but coping with slight deviations from the expected is part of life. No one, not even your mother, dies from hearing “fuck.”

Timothy Jay is a professor of psychology at the Massachusetts College of Liberal Arts. He has published numerous books and chapters on cursing, and a textbook for Prentice Hall on The Psychology of Language. This piece originally appeared at Zocalo Public Square.


How to Deal With Doctors Who Get Drunk and High on the Job

A group of surgeons work in an operating theatre. Jochen Sands—Getty Images

Doctors might be under the influence more often than you think. A California physician explains why mandatory drug and alcohol testing could be the solution

Larry was a doctor trainee at a hospital where I taught in Burbank. I recommended that he not pass due to his poor preparation and work habits. But he did, and set up a general practice nearby. He had trouble with it, though, and drifted into addiction medicine over time, helping patients overcome their problems (he was said to have had a cocaine problem in his past). He later moved outside the immediate area, and word got around that he was a go-to local for scoring prescription narcotics. People who encountered him thought he might be high. Eventually, the DEA entered his life, and he put a gun into his mouth and pulled the trigger.

An upcoming ballot initiative on malpractice caps in California includes a provision that would require physicians to be drug- and alcohol-tested prior to practicing at any hospital, and require other health care practitioners to report any suspected abusers. This is packaged with other measures that appear punitive towards all physicians. But the drug-testing provision deserves scrutiny because, while drug testing is widespread in American business, and required of nurses and many medical workers, private doctors have not been routinely tested.

I’d like to tell you substance abuse isn’t a problem for doctors, but unfortunately, I’ve seen firsthand that there are physicians who practice while they are under the influence. And we physicians often find it hard to speak up when we see something. The attitude is all too often, If it isn’t my patient, it isn’t my problem.

I personally made it my practice never to have a drink at lunch or in the evening when I was on call. And because I was on call for most weekdays for 30 years, I never felt free to drink during my career. Sadly, that was not always what I encountered from my fellow physicians.

When I was a young ER physician new to a small community hospital in California, I called in a prominent surgeon to perform an emergency appendectomy. He arrived, reeking of alcohol. There was no other available surgeon, and a delay exposed the patient to significant risk. The surgery went ahead, and the patient did fine. But I asked around, and the surgeon turned out to be known as a boozer, frequently coming to the hospital drunk. This still haunts me, and I left that hospital rather quickly.

That was my first experience with the difficulty of dealing with physicians who abuse mind-altering substances. I didn’t make any sort of formal report on the surgeon; I would have felt intimidated. I passed the word along to colleagues, but that was all I did. Today, as a senior physician in the latter part of my career, I would hope that I’d do more.

But that was a case when I recognized a problem. It can be hard to recognize that a colleague has a substance abuse problem, even if you’re a trained observer of addicts. Among my professional pursuits, I was the director of a drug/alcohol program for a large medical group, and personally saw every patient who entered the program for several years.

In 1994, I hired an associate, Cindy, a graduate of a famous cancer center, looking for temp work. She was young, attractive, and very smart. But I was surprised by her poor work habits, and my staff reported strange behavior. Drugs from the office started disappearing. I just couldn’t believe Cindy was abusing drugs, until it became undeniable. (Although she denied it.) A year later she had her license revoked for drug use, unrelated to my experience with her.

Until a few years ago, the licensing board for physicians in California had a diversion program for those who were identified as having an abuse problem, which allowed such physicians to keep their licenses if they sought adequate rehabilitation. It had a 75 percent long-term success rate and allowed for anonymous reporting of suspected abuse. However, the licensing board, in its wisdom, recently discontinued this program as they felt that the board’s primary mission was patient protection, not physician rehabilitation. Funding should not have been an issue: the program was paid for by physician licensing fees, not by taxpayers. Nothing has appeared to take its place, and so California is without a confidential reporting system for doctors.

I’ve spoken with a number of practicing physicians recently, and surprisingly, I hear a lot of support for mandatory testing. This support may have less to do with protecting patients than with a feeling of impotence in dealing with colleagues who abuse drugs and alcohol.

Mandatory testing will cost a lot of money, and it is intrusive to the daily practice of medicine. But patient safety concerns justify such testing for physicians, just as air safety concerns justify testing for pilots. And even with testing in place, doctors should not be excused from their obligation to report colleagues, and the government should provide a way to make such reports confidentially.

None of this should be done by a deeply flawed ballot initiative; instead, the Legislature should craft a careful law that will work in practice.

Ken Murray, MD, is a retired Clinical Assistant Professor of Medicine from USC, and is a frequent contributor to the Southern California Bioethics Committee Consortium. He writes on topics of end-of-life care, ethics, and water. His writings have been published in media worldwide in many languages, and he speaks frequently on these subjects.


TIME Parenting

What Single Policy Could Ease Americans’ Time Crunch?

Work-life balance is at the core of why we all feel so overwhelmed. Here are some solutions from thought leaders and experts for how to remedy that.

You’ve probably seen that “Poolside” Cadillac commercial, which debuted during the Sochi Olympics, where a dad looks over his infinity pool and notes, “Other countries – they work, stroll home, stop by the café, take August off.” High-fiving his kid and handing a newspaper to his wife, he tells us why “we” aren’t like that: “Because we’re crazy-driven, hard-working believers, that’s why.” The ad was meant to provoke, but it also illustrates how Americans work hard, play hard, and still expect a warm family and manicured yard as part of living the American Dream.

And yet, 53 percent of working parents in a study published by the Pew Research Center last year said they found it very or somewhat difficult to balance their work and family life. Thirty-four percent of those parents say they always feel rushed, even to do the things they have to do. This is only one of a slew of studies that illustrate how overwhelmed many Americans feel trying to “have it all.” In advance of the Zócalo event “Why Can’t Americans Balance Love, Work, and Play?”, we asked experts what single cultural or policy change could ease Americans’ time crunch.

1. Retool school schedules and expectations

It seems to me that one cultural shift that has gone way too far is the expectation that parents will be intimately involved in the workings of schools and the goings-on of classrooms.

I realized it had escalated way beyond normality when the parent group at my kid’s elementary school organized not teacher appreciation day but teacher appreciation week. Each day, kids needed to remember to bring in something different: a rose, say, on Monday, a card on Tuesday, and on Wednesday an item for the teacher’s breakfast–this, on top of endless committees having to do with art contests, silent auctions, book fairs, etc.

Most of it seems to fall on mothers. It just adds to the overlong to-do list. Parents need to say no–and I did, much of the time–but schools, and parent committees, should also ask themselves whether this or that event or request for classroom involvement is necessary. Related to this, of course, is the culture of extracurricular events, which is also its own kind of arms race: travel soccer, camps, teams, fees. There needs to be some sort of cultural pushback, some sort of ratcheting down of the number of things that parents have to do with regard to schooling.

Even more important is a wholesale re-envisioning of the school day and a culture-wide effort to have school sync up better with parents’ work schedules. More school aftercare would help. Also, it would help to have the above-mentioned extracurriculars incorporated into the afterschool day, so that they can happen on school grounds and parents don’t have to do all that driving and organizing. We need a Steve Jobs–somebody obsessed with simplicity and ease of use–to tackle and vanquish the level of complexity that has come to define the raising and education of children.

Liza Mundy is director of the Breadwinning and Caregiving Program at the New America Foundation. A journalist and book author, Liza most recently wrote The Richer Sex: How the New Majority of Female Breadwinners is Transforming Sex, Love and Family.

2. Hold employees accountable for results

Americans need to work less. As I chronicled in my book, Maxed Out: American Moms on the Brink, overwork is not only diminishing Americans’ quality of life outside the office, it’s making us less effective inside the office, too.

It may sound counterintuitive, but study after study has shown that when we work more than 40 hours per week, we actually become less productive. Knowledge workers have four to six hours of solid productivity in a day. After that, productivity starts to decline until eventually we enter a negative progress cycle, which means we’re creating more problems than we’re solving.

Many of us know we should work less. But that’s hard to do in a culture where “full time” often means 50-plus hours a week (not including the commute), and part-timers are treated as slackers (even if, hour for hour, they are in fact the most productive people on the payroll). Roughly half of all jobs in America are compatible with working from home part-time, yet many companies still frown on this practice. Commitment to one’s job is still measured not by effectiveness, but by how many nights and weekends one works.

A simple but powerful change businesses can make is to hold employees accountable for results, rather than fixating on how many hours or days they spend at a desk. One exciting trend management experts talk about is “results-only work environments,” where managers stop acting like babysitters and instead empower employees to decide when, where, and how to best get their work done. Businesses reap the benefits in increased productivity and morale, and decreased turnover.

Our state of overwork is bad for our health and bad for business. If companies want a competitive edge, they must create environments where employees can thrive—even if that means, for many of us, working less.

Katrina Alcorn is a writer, consultant, and public speaker. Her first book, Maxed Out: American Moms on the Brink, tells a deeply personal story about “having it all,” failing miserably, and what comes after.

3. Make work schedules flexible – and don’t ding workers for taking advantage of that

Simply put, today’s workplace is not designed around today’s worker. Instead, it clings to the 1960s notion of an “ideal worker” – someone who is available to work whenever needed while someone else holds down the fort at home, and who takes little or no time off for childbearing or child rearing. Structuring work in this fashion marginalizes caregivers, men and women alike.

Women who take family leave or adopt flexible work schedules to have more time with their children often encounter “maternal wall” bias, which is by far the strongest form of gender bias today. A well-known experimental study found that mothers were 79 percent less likely to be hired, half as likely to be promoted, offered an average of $11,000 less in salary, and held to higher performance and punctuality standards than identical women without children. Mothers face assumptions that being committed to work makes them bad mothers, and that being committed to motherhood makes them bad workers.

Meanwhile, men face a different type of “flexibility stigma” because childcare, fairly or unfairly, is still seen as being a feminine role. Men seeking to take family leave, for instance, are not only seen as bad workers, but also as bad (i.e., less manly) men. In other words, the flexibility stigma is a femininity stigma.

This is a sobering message for employers: creating flexible work policies is only half the battle. The next step is to eliminate the stigma that all too often accompanies such arrangements. Happily, change may be on the horizon. Many, if not most, talented young men and women want to combine meaningful work with a fulfilling personal life. As the Millennial generation gains influence in the workforce, we can only hope that their values will lead to a change in workplace culture.

Joan C. Williams is Hastings Foundation Chair and Director of the Center for WorkLife Law at the University of California (Hastings). She has authored or co-authored over 90 academic articles and book chapters, as well as authored or co-authored 8 books, the most recent being What Works for Women at Work: Four Patterns Working Women Need to Know. You can also follow her work on Twitter @JoanCWilliams and her Huffington Post blog.

4. Teach employees how to be their best on and off the job

We need to recognize, as a culture, that we have to train people to fit work and the other parts of their life together. It’s a modern skill we all need to succeed, and one most of us don’t have.

According to our research, most of us are flying by the seat of our pants trying to get everything done even though the boundaries that used to tell us where work ended and the rest of life began have all but disappeared.

The good news is we have more flexibility in how, when, and where we can get our jobs done. The bad news is that no one is showing us how to capture that work-life flexibility, intentionally, and use it to be our best, on and off the job.

According to the results of our recent national survey of full-time employed U.S. adults, 97 percent of respondents reported having some form of work-life flexibility in 2013 when compared to the previous year; however, only 40 percent said they received training or guidance on how to manage it. Not surprisingly, 62 percent of respondents reported obstacles to using or improving their work-life flexibility such as increased workload or having no time, and fears of job and income loss.

Teaching people the basics of how to manage the way their work and life fit together makes a difference. For example, we showed a group of 40 employees in a large medical testing lab how to choose small, but meaningful work, career, and personal priorities and focus on these actions for the next seven days, a technique in my book, Tweak It. They planned when, where, how, and with whom they would accomplish those “tweaks.” At the end of six weeks, 92 percent of participants said they were better able to prioritize all of their responsibilities and goals, and 88 percent felt they more actively managed what they had to get done at work and in their personal lives.

Cali Williams Yost is a flexible workplace strategist and author who has spent two decades helping organizations and individuals partner for award-winning flexible work success. Her “how to” work+life fit advice for individuals can be found in her new book Tweak It: Make What Matters to You Happen Every Day (Center Street, 2013).

5. Five mindset changes that leaders should adopt

The pressure to work more hours and to work faster is real. Over 70 percent of both men and women say that they have to work very fast, and roughly 90 percent say that they have to work very hard, according to our research.

But it doesn’t have to be this way. If we’re going to help ease Americans’ time crunch among the rank and file, the leaders at organizations will need a major mindset overhaul when it comes to how they think about work for themselves and for their employees.

Mindset #1: Priorities, not balance. Balance is static, but life is not, so accept that every day is different, and anchor your day-to-day in your overall priorities.

Mindset #2: Dual centric, not work centric. Don’t put work before everything else all the time. Our research shows that executives who prioritize work some of the time and prioritize personal life some of the time – what we call being dual centric – are less stressed, have an easier time managing work and personal demands, have advanced as high or at higher levels than those executives who were work-centric, and feel more successful in their home lives.

Mindset #3: Better, not perfect. Expecting perfection limits your ability to ask for help, so set expectations that allow for getting better and you will grow.

Mindset #4: Team, not individual. Going it alone limits your options, so get the whole team to work it out together. That means the team at home as well as the team at work.

Mindset #5: Rest and recover, not flat-out. Making decisions in a constant time bind affects performance, so step away before diving in.

Leaders and managers at all levels who adopt these mindsets for themselves will both ease their own time crunch and improve their performance – and change the culture at work for everyone.

Anne Weisberg is senior vice president of the Families and Work Institute and an executive who has designed innovative practices to build effective, inclusive work environments. She co-authored the best-selling book Mass Career Customization: Aligning the Workplace With Today’s Nontraditional Workforce and directed the report on women in the legal profession Women in Law: Making the Case.

TIME Africa

Africa Is Not a Country

Getty Images

And four other shocking facts Americans should know about the continent

Not to pick on Sarah Palin, but it’s troubling for all Americans when there are rumors that a former Vice Presidential candidate thinks Africa is a country, not a continent. Africa is a blurry image in the mind of many Americans–warring, impoverished, unfixable. But then there are the stories of Africa the West doesn’t hear about: urban farmers feeding their families on unclaimed plots of land, a nonprofit building mapping apps to combat election fraud and violence, the booming Nigerian “Nollywood” producing blockbuster movies. In advance of the Zócalo event “Can Homegrown Innovation Change Africa?”, we asked observers of Africa to tell us what they think Americans would be most surprised to know about the continent today.

1. Moria in Lord of the Rings isn’t just a fantastical place

My undergraduate students are always shocked to find out that the indie rock musician Dave Matthews was born and raised in South Africa. They have no idea that ESPN reporter Sal Masekela is the son of famed South African trumpeter Hugh Masekela. While these students are media-savvy, tech-savvy and by no means isolated, they, like much of America, are surprised to learn about the African roots of many American cultural phenomena.

Let’s take the case of J.R.R. Tolkien, born in Bloemfontein, South Africa. Some of the famous symbolism in The Lord of the Rings book and movie trilogy comes straight out of South African history. Fans of the dwarves will know, for example, that the sun shone down through Moria onto Balin’s tomb – like the sun streams through the Voortrekker Monument in Pretoria, South Africa, every Dec. 16 – onto a slab dedicated to the remembrance of the “day of the vow,” when Afrikaners slaughtered Zulu fighters in 1838 at the Battle of Blood River. It’s a centerpiece of Afrikaner history.

In The Return of the King, signal beacons, calling for aid, are lit along the mountain tops between Rohan and Gondor – as “freedom fires” were also lit along the hilltops of the Orange Free State by Afrikaner adherents of the Ossewa Brandwag, a fascist organization in the 1940s.

Teresa Barnes is an associate professor of History and Gender/Women’s Studies at the University of Illinois Urbana-Champaign.

2. A Particularly African Christianity, Coming to a City Near You

The past three decades have been marked by a remarkable surge in the popularity of Pentecostal Christianity across many parts of sub-Saharan Africa. This surge is part of a larger trend in which the majority of the world’s Christians now live in the global south. The Pew Forum on Religion and Public Life reports that, compared to the 9 million Christians in Africa in 1910, there were 516 million Christians by 2010, nearly a 60-fold increase!

Today, there are more Anglicans in Nigeria than there are in England, and the United States is the only country with more Protestants than Nigeria. Historically, much of the evangelizing that took place in 19th- and 20th-century sub-Saharan Africa was conducted by countless unknown Africans, rather than their more famous European counterparts. And many Africans who encountered missionary Christianity sought to make it theirs by replacing European liturgy, language, and practice with African alternatives. Which brings us back to the ongoing Pentecostal wave in sub-Saharan Africa.

Africans’ interest in Pentecostalism was fueled by literature emanating from North America in the 1970s and 80s. But, as one example of how they re-shaped Pentecostal theology to be more responsive to local practitioners’ material conditions, they presented a God who is deeply invested in believers’ fiscal and physical well-being in the present, not just the fate of their souls in the after-life. Amidst the swirling political and economic crises of the postcolonial state, this was an immensely attractive proposition.

Conceiving of themselves as part of a global religious community, they began to export their brand of Christianity around the globe. As a result, we find that the largest single congregation in Europe, the 25,000-member Embassy of God, is a Pentecostal church founded by a Nigerian man. Today the largest African Pentecostal organizations are sending so-called “reverse missionaries” to North America and Europe. One of those groups, the Redeemed Christian Church of God, founded in Nigeria, has 15,000 parishes around the globe, including at least one in every major North American city. (Google your city and RCCG.) This shift inevitably demands a change in how we, in both secular and religious America, understand our relationship to African Christians and Christianity as a whole.

Adedamola Osinulu is an assistant professor of Afroamerican and African Studies at the University of Michigan and a postdoctoral scholar with the Michigan Society of Fellows. He holds a doctorate in Culture and Performance from the UCLA Department of World Arts and Cultures.

3. Claiming an African Version of the American Dream

When the British were booted out of Nigeria in 1960, they left behind a fat inheritance of social and psychological trauma. In the years that followed, my mother’s generation – the ambitious, the frustrated, the desperate – left Nigeria by the millions to work and to go to school in the West, believing that their own country and continent had nothing to offer, that the future in Africa was born dead, that the West had a patent on progress.

What’s so exciting about Africa’s homegrown innovation is that it reveals a generation of young Africans who don’t see the West as a unique symbol of authority, power, and progress. (And, honestly, what’s the alternative to cultivating homegrown African innovation? Outside innovation? Five hundred years of data tell us that hasn’t really worked out well.) This means something. If we limit the conversation about Africa’s development to governments and NGOs, we miss a bigger point: that a new generation of Africans is shrugging off the psychological humiliation of colonialism and leading the charge to positively change their own communities.

There’s a real connection between national psyche and national progress. Our national fairytale, the American Dream, the belief that anyone can be and do anything, fuels our drive to innovate. This country is home to millions of people who believe they can do something no one else has done, or that they can do the things others have done but do them better; this phenomenon requires a confidence that is irrational, artificial, and national. The idea of the American Dream supplies that confidence.

What Americans need to understand is that colonialism systematically stole that kind of confidence from multiple generations of Africans. Today’s innovators show us that young Africans from all over the continent are reclaiming it.

Lanre Akinsiku is a Nigerian-American traveler and writer. He is currently pursuing a master’s in fine arts in fiction at Cornell University.

4. There is no time warp

Americans often think Africans are always trying to catch up to the West. This couldn’t be further from the truth. When I went to do dissertation research in Luanda, Angola, in 1999, I bought my first cell phone there. Cell phones existed in the U.S., but I barely knew anyone who had one. In Luanda, cell phones were already commonly available and used among a broad sector of Angolan society.

Similar dynamics exist in music and fashion. Men and women living in Luanda and other cities in Angola in the 1930s and 1940s listened to music from Latin America – plenas, rumbas, merengues – what they referred to as “GVs” (because the records came from the British company HMV, with each musical number printed and preceded by the letters GV). People all over the world – in the neighboring Belgian Congo, in Senegal, in England, and in the U.S. (including my grandfather in the little town of Dundee, Ill.) – listened and danced to this music in those same decades.

In the 1970s, when low-waisted bell bottoms hit the runways, and then the streets, of the U.S. and Europe, they caught on elsewhere too. Luandan musicians, keen to outdress their frumpy colonial rulers, commissioned their tailors to copy these styles. Record albums from the period attest to their sartorial panache, but the sounds on them deny any simple reading of this as mimicry of the West. Instead, these musicians folded the styles into their own way of doing and being.

So it doesn’t surprise me to find videos of Pharrell Williams’s “Happy” uploaded from Benin and Madagascar. And I am always shocked, though delighted, when I hear Angolan kuduro on my university’s radio station, because what young Angolans produce now, young Americans ought to listen to too.

Marissa Moorman is associate professor of African history at Indiana University and an affiliate in the School for Global and International Studies and the Program for African Studies. She is the author of Intonations: A Social History of Music and Nation in Late Colonial Luanda, Angola, 1945–Recent Times. She blogs at Africasacountry.com and tweets @mjmoorman.
