Ever since Attention Deficit Hyperactivity Disorder (A.D.H.D., né A.D.D.) became the icon-disability of the digital age, articles have been regularly tracking the spectacular increase in the diagnosis, the tragedy of addiction, doctors having second thoughts, the occasional rogue who never had first thoughts. Yet instead of reconsidering our industrial-strength pharmacological panacea, the attention-deficit generation, now in its 20s and 30s, has been taking its stimulants to the office—even when there are no meaningful symptoms to treat. Non-disordered go-getters, who maybe bummed their friends’ prescription pills in college, are simply doing what’s imperative to make it in the real world.
As stimulant meds expand from the classroom to become part of corporate hygiene—vitamin P for productivity!—personhood itself seems at risk of being a psychotropic delusion. Though the normalizing of pharmaceutical aids may have muted any scientific discussion of whether attention deficit is a real medical epidemic, why hasn’t the quickening drift toward mass medication triggered some kind of public moral debate? Shouldn’t we as a society finally begin asking ourselves not just “Is this necessary/wise/safe?” but “Is it right?”
Presumably the reaction to all the self-medicating, college-educated strivers would not have been quite so resigned if the workers in question were taking Adderall to keep pace with the robots on a General Motors assembly line, or were chemically adapting to longer shifts in a Wal-Mart stockroom. But where was the soul-searching, let alone outrage, when an even more defenseless population—that stressed-out cohort of young achievers back during their childhood—was being diagnosed and dosed in the millions, all the way down to preschoolers?
I doubted A.D.D. was a definable biological disorder the first (but not last) time I heard an expert compare a child prescribed Ritalin to a diabetic requiring insulin. That was more than a dozen years ago, and the professionals I encountered in the ensuing decade stood staunchly behind whatever new theory cycled through, putting me in mind of Churchill’s line about how the only statistics he trusted were the ones he had doctored himself. Though I don’t deny that hyperactivity and distraction may be on the rise, behaviors that common could be explained by many social and environmental (as well as biological/evolutionary) variables—let’s put them under the umbrella of humanness.
Indeed, a lot of the “maladjusted” conduct I saw being tagged “A.D.D.” struck me as the artistic temperament, since artists are stimulated by creating stuff, even if it’s trouble. It seemed clear that these drugs were catering to a Zeitgeist that transcended mental health—and pharma profits—when the diagnosing of students spiked with the implementation of No Child Left Behind and other test-driven “accountability” laws. Assessment is the handmaiden of standardized performance, that increasingly holy sacrament of the marriage between human ideals and the bottom line, in which “good” and “profit” become one.
A zeal for self-improvement is, of course, in the DNA of our more perfect union. Betterment was the original goal of the once-respectable eugenics movement in the early 20th century, but that natural impulse eventually converted into an equally natural desire to purge the civic corpus of the less-good. (“We have no business to permit the perpetuation of citizens of the wrong type,” as former President Theodore Roosevelt put it.) It may seem sensational to compare present-day chemical “improvement” of our young to a “scientific” program that led to Supreme Court-approved forced sterilization here and, in Germany … worse. But what they share is the reduction of humans to things, and a thing either serves a function or is in the way. “Focus!” “Find your passion!” we tell our kids, though we really mean: “Measure up.” “Get into Harvard.”
Whenever I speak to young people about my own social-disaster expertise—the segregated South—the point I stress is that practices we look upon in retrospect as wrong and perhaps weird feel normal to those living under them, and are sustained by an intricate nexus of economic, political, emotional and even spiritual forces that in their time seem impossible to change. When will a human rights movement emerge to take on today’s version of “just the way things are”?
“A snake is not going to commit suicide,” Fred Shuttlesworth, the civil rights trailblazer in Birmingham, Ala., used to say to explain why segregationists would never reform themselves, and by that logic we can’t expect the drug companies to stop selling drugs. But couldn’t we begin, say, by outlawing TV ads for all prescription medicine, including the psychotropics? Every time I see a commercial pushing some pharmaceutical miracle (never mind that insane list of side effects), I think about my pack-a-day grandmother, who was counseled by her doctor in the early 1950s to take up smoking in order to settle her nerves. It’s now the pill-makers who see a boundless market of customers, and by presenting a dazzling menu of treatable conditions, smartly packaged in acronyms, they hope to convince the public that there’s a brand out there for everybody.
Reform is not likely to come from parents either—their time and their allies inside attention-deficit city hall are finite, and their motive is love, with its boon companion, worry. Nor is the school system about to throw away a crutch that not only boosts those test scores but also mitigates the classroom-management travails aggravated by budget cuts.
Barring an intervention by Pope Francis, the logical starting place for organized dissent would be the medical profession, especially the branch with the highest reputation stakes. The psychiatric establishment might well worry about having some explaining to do down the road, as the specialty that once deemed the lobotomy Hippocratically sound. (Philosophical qualms are being voiced in the field about the new normal of cosmetic psychopharmacology.)
The philosopher-historian Achille Mbembe has described, in a different context, an insidious spiritual impediment to reframing our national drug habit as a moral crisis. Emancipation, he notes, used to be defined as “the refusal of a human to be turned into a thing.” But currently “humans are trying to turn themselves into things,” he says. Instead of rejecting “thingness,” we yearn toward “thinghood,” reflected in our luminous gadgets and the ideology of our era, the quantification mania that might be called “datalitarianism.” A Marxist would probably say thinghood is an example of “false consciousness.” Indeed, as one commenter on The New York Times’ latest report from the attention-deficit front summed it up: “This is your brain on capitalism.” Then he concluded in so many words: That’s just the way things are.