The disease killed my father. At 39, I had to choose how much I wanted to know about my own fate
People ask me all the time if I want to find out how and when I’m going to die. But that’s not exactly how they ask it. What they ask is whether I’m going to get tested for the gene associated with early-onset Alzheimer’s disease. It’s hard, though, to miss the subtext in the question: How morbidly curious are you? How much terror can you withstand?
I don’t blame them. These friends know I’m 39 and that my father started showing symptoms of Alzheimer’s in his early fifties (and possibly earlier). They know that after a handful of difficult years my father was diagnosed when I was a freshman in college and that he died less than a decade later. They wonder if I’m going to take advantage of the remarkable opportunity science affords us to uncover our genetic destinies and plan accordingly.
Modern life is all about making us forget we’re capable of dying. We love to feel in control of our mortality, even if we understand that that control is only an illusion. Alzheimer’s disease is the opposite of modern life. It’s the ascendancy of entropy and chaos.
My father’s disease had a devastating effect on our family. It didn’t just take away our time with him and his with us. It also took away his time with the not yet conceived children who would populate the family in his absence. He would have been in his 70s now, surrounded by three grandchildren through my sister and two through my wife and me. It’s painful to know what a resource he would have been for them and how much they’ve lost. He will live, faintly grasped, if at all, only in stories.
When he was still living, we tried to make the best of the situation. When my sister got married, my mother brought my father’s tux to the nursing home and had the staff dress him in it. After the ceremony, while everyone else headed to the reception, two limos carrying my immediate family took a detour to the nursing home for photos.
When I look at the framed shot of us huddled around my father in his wheelchair, I see how hard my sister is trying to keep her emotions in. She’s smiling big, but tears are streaming down her face. We are all smiling hard, though there’s no driving off the pain and awkwardness of the moment. Everyone’s looking at the camera except my father, who is gazing vacantly the other way, his mouth hanging open. Moments later we drove to the reception, leaving him behind, feeling terrible for doing so. I wanted him not to understand a thing that was happening in that scene, but you never knew what he knew.
For most of my youth, my father seemed to know everything. A universe of information swirled around in his brain. I could hardly put a question to him that he couldn’t answer. The rare times he came up short, he pulled me into his study, took a book off the shelf, laid it on the desk and stood flipping through it with me. I think sometimes he pretended not to know things just so that we could look them up together.
Once, when I was about 10 and my sister about 14, we were walking with my father on the outskirts of his old neighborhood. He stopped in front of a town house and told us Winston Churchill’s mother was born there.
“The iconic English statesman of the century!” he said. “A mother from Brooklyn!” He gave us a look almost wild with the significance of what he was about to say. “The wit!” he said. “The chutzpah! That was the Brooklyn in him!”
Three decades later, I can still remember the moment, bathed in that ethereal light that we reserve for our happiest memories. Why do I remember it, though? How did such a quotidian moment burrow its way into my consciousness and survive? Was it the juxtaposition of incongruous worlds, England and Brooklyn? I don’t think so. I think it was the joy my father took in sharing his knowledge with us.
My father would have loved my twin children. They’re 3 years old and full of vitality and personality. My son is unusually strong for such a skinny kid, and remarkably agile. He climbs whatever is available, with a monkey’s speed. When he sits at the piano and pounds the keys, it sounds as if he’s playing a real song. My daughter is a sensitive cuddler who remembers everything. “Daddy, is this from the hotel we stayed at?” she asked the other day, handing me a pad from a Marriott where we stayed six months ago.
Recently my daughter came into our bed in the early morning, lying between my wife and me, and started in on iguanas. “Iguanas are baby alligators,” she said, and I chuckled at the powers of observation of a developing mind. “Can iguanas learn to open doors?” she asked, and after I offered the opinion that they couldn’t, I pulled her close, gave her kisses and began to choke up.
Maybe when my twins are older, science will have caught up to this disease. We have the best scientific minds working on the problem of Alzheimer’s. Much like the search for the cure for cancer, there is a massive payout at the end of the rainbow for anyone who comes up with a solution. If there’s anything to put one’s faith in in the health care system, it’s that the confluence of genius and capital will, in this case, produce the outcome if the outcome is producible. And I do believe it’s producible. But if it isn’t produced in time, no amount of awareness of my fate, if it is to be my fate, is going to forestall its unfolding on me.
My wife and I have little battles over my forgetfulness. She asked me to fix the kink in the hose that runs from the humidifier in our basement to the French drain. A few days later, she gave up and fixed it herself. We had a grill delivered for our backyard, and the flame kept going out on it as soon as we lit it. I was supposed to call about it the next morning, but I’d more or less forgotten that we’d bought a grill in the first place when I heard my wife on the phone with the store. These aren’t terrifying signs in themselves — everyone is a little forgetful occasionally — but they make me pause enough to wonder if the worst is coming.
I’m built like my father, I sound like him, and if I carry a mutation in one of the three genes linked to familial early-onset Alzheimer’s (APP, PSEN1 or PSEN2), then I will likely develop the disease like him. These mutations are rare, accounting for only 1% to 5% of all Alzheimer’s cases. But if I inherited one from my father, then I will probably get the disease.
My grandfather — my father’s father — died relatively young of other causes, so there’s no saying whether he would have gotten early-onset Alzheimer’s. No one else in the family had it that we know of. I have as good a chance of getting familial Alzheimer’s as I have of avoiding it. Genetic testing would settle the question for good.
But what would I gain by knowing I was getting Alzheimer’s? I wouldn’t gain another day with my family. I wouldn’t gain a leg up on planning. My wife and I have taken care of practical considerations. We have wills. My wife has a durable power of attorney that enables her to make decisions on my behalf. Every policy, every asset, is in both our names. We opened college savings accounts for the kids. I’m working hard on my next book. How much more could I prepare?
After some deliberation, I’ve decided not to get genetic testing done. Instead, I’m going to try to live every day as if I know that I’m dying. The fact is, we are all dying. If I try to wring the most I can out of every moment, if I set aside time every day that my wife and I keep as inviolate as possible, if I give my wife and children quality interactions whenever we’re in the same room, if I leave the smartphone on the counter and realize there is no information more important than the information I get in my interactions with my loved ones, then how different is any of that from what I’d do if I knew I was getting Alzheimer’s?
Scientific studies suggest that my children are at just the age when they can begin to form lasting memories of their experiences. If I’m aware that I’m going to be gone someday and I consider it possible that that day will come far sooner than I’d like, then I want them to grow up not only knowing their father well but also knowing that they are well loved. I want to get in better shape for them, because I’d like them to see what a truly vital father looks like. And I’ve decided to read to them whenever they ask, if I possibly can. I don’t have any memory of my father telling me, “No more books” at bedtime. I will forever picture him with an arm around me, holding a book out before me, showing me the world.
As physicians become more specialized, our health care system becomes increasingly costly, sloppy and disorganized
Not long ago, a primary-care physician called me about a patient with a right-lung “consolidation” — probably pneumonia, though a tumor could not be excluded — that a lung specialist had decided to biopsy. My colleague wanted me to provide “cardiac clearance” for the procedure.
“Sure, I’ll see him,” I said, sitting in my office. “How old is he?”
I stopped what I was doing. “Ninety-two? And they want to do a biopsy?”
My colleague, who is from Nigeria, started laughing. “What can I tell you? In my country we would leave him alone, but this is America, my friend.”
Though accurate data is lacking, the overuse of health care services in this country probably costs hundreds of billions of dollars each year out of the $3 trillion that Americans spend on health. This overuse is driven by many forces: “defensive” medicine by doctors trying to avoid lawsuits, a reluctance on the part of doctors and patients to accept diagnostic uncertainty (thus leading to more tests), lack of consensus about which treatments are effective, and the pervasive belief that newer, more expensive drugs and technology are better. However, perhaps the most important factor is the overspecialization of the American physician workforce and the high frequency with which these specialists are called by primary-care physicians for help.
The past half-century has witnessed great changes in American medicine. One of the biggest shifts is the rise of specialists. In 1940, three-quarters of America’s physicians were general practitioners. By 1960 specialists outnumbered generalists, and by 1970 only a quarter of doctors counted themselves general practitioners. This increase paralleled an equally dramatic rise in medical expenses, from $3 billion in 1940 to $75 billion in 1970.
Specialist-driven care has now become a fact of medical practice. In the past decade, the probability that a visit to a physician resulted in a referral to a specialist has nearly doubled, from 5% to more than 9%. Referral rates to specialists are estimated to be at least twice as high in the U.S. as in Britain.
The consequences for patients are troubling. Besides high costs, having too many consultants leads to sloppiness and disorganization. As Drs. Donald Berwick and Allan Detsky recently wrote in the Journal of the American Medical Association, inpatient care at hospitals has become a relay race for physicians and consultants, and patients are the batons.
I remember a 50-year-old patient of my Nigerian colleague who was admitted to the hospital with shortness of breath. During his monthlong stay, which probably cost upward of $100,000, he was seen by a hematologist; an endocrinologist; a kidney specialist; a podiatrist; two cardiologists; a cardiac electrophysiologist; an infectious-disease specialist; a pulmonologist; an ear, nose and throat specialist; a urologist; a gastroenterologist; a neurologist; a nutritionist; a general surgeon; a thoracic surgeon; and a pain specialist. The man underwent 12 procedures, including cardiac catheterization, a pacemaker implant and a bone-marrow biopsy (to investigate only mild anemia). Every day he was in the hospital, his insurance company probably got billed nearly $1,000 for doctor visits alone. When he was discharged (with only minimal improvement in his shortness of breath), follow-up visits were scheduled for him with seven specialists.
This case — in which expert consultations sprouted with little rhyme, reason or coordination — reinforced a lesson I learned many times in my first year as an attending physician: in our health care system, if you have a slew of specialists and a willing patient, almost any sort of terrible excess can occur.
What to do about this overspecialization? One option is accountable-care organizations, an idea put forward by the Affordable Care Act, in which teams of doctors would be responsible (and paid accordingly) for their patients’ clinical outcomes. This would force specialists to coordinate care. Unfortunately, most doctors, notoriously independent and already smothered in paperwork, have generally performed poorly in this regard.
Reforms will also have to focus on patient education. Medical specialty societies recently released lists of tests and procedures that are not beneficial to patients. By using these lists, cardiologists have been able to decrease their use of imaging tests by 20%. Better-informed patients might be the most potent restraint on overspecialized care. A large percentage of health care costs is a consequence of induced demand — that is, physicians persuading patients to consume services they would not have chosen had they been better educated. If patients were more involved in medical decision-making, there would be more constraints on doctors’ behavior, decreasing the possibility of unnecessary testing. This could serve as a potent check on what the doctor ordered.
Today roughly $1 of every $6 spent in America goes toward health care. If we do not succeed in controlling these costs, they will gradually crowd out other necessary societal expenditures. Improving health literacy will be critical to these efforts. Without a better understanding of what doctors are actually doing, one may end up like the patient who had 17 consultants and 12 procedures and who reinforced a further lesson I have learned many times since entering practice: when too many specialists are involved in a case, the result too often is waste, disorganization and overload.
Jauhar is a cardiologist and the author of Intern: A Doctor’s Initiation and the new memoir, out today, Doctored: The Disillusionment of an American Physician
Early results from a clinical trial show promise in reversing hair loss
A drug normally used to treat bone marrow disorders may help patients suffering from alopecia, according to a new study.
Alopecia areata is an autoimmune disorder in which the immune system’s T cells attack hair follicles, causing the hair to fall out and the follicles to become dormant. For some sufferers, this means the loss of small patches of hair, known as “spot baldness”, but a small percentage suffer complete hair loss. No current treatment for alopecia completely restores lost hair.
For a long time, medical researchers were uncertain what was causing the immune system cells to attack hair follicles, until Columbia University Medical Center (CUMC) researchers were able to determine a “danger signal” in the follicles of alopecia patients that was signaling the T-cells’ attack.
Now, they’re moving on to search for a cure. In the paper, published in Nature Medicine, the researchers report their early results in both mice and human patients. The researchers tested two FDA-approved JAK inhibitors (drugs that block Janus kinase enzymes involved in the immune response), ruxolitinib and tofacitinib. In mice with severe hair loss, they found that both of the drugs completely restored the animals’ hair. In the three human patients in the researchers’ clinical trial of ruxolitinib, hair regrew fully within four to five months. Ruxolitinib was already approved by the FDA as a treatment for myelofibrosis.
One of the lead doctors on the clinical trial is Angela M. Christiano, a professor in the Departments of Dermatology and of Genetics and Development at CUMC, who is herself an alopecia sufferer. “Patients with alopecia areata are suffering profoundly, and these findings mark a significant step forward for them,” she said in a statement. “The team is fully committed to advancing new therapies for patients with a vast unmet need.”
The findings are preliminary, but given the limited research and resources dedicated to the study of alopecia, they will come as a great relief to sufferers — especially since the drug in question has already been given the thumbs-up by the FDA.
Children treated with growth hormone are more likely to experience strokes decades later
Since the Food and Drug Administration approved a synthetic form of growth hormone (GH) in 2003 to treat short stature in kids, it’s become a popular medication not just among parents who want their children to grow but also in locker rooms of professional athletes who believe the collagen-building features of the drug can both protect and improve recovery from injury.
Now the latest study shows that children treated with GH are at higher risk of bleeding in the brain nearly 20 years later. French researchers report Wednesday in the journal Neurology that children treated for short stature or low levels of growth hormone had a 1.5 to 5.3 times higher risk of having a stroke during the follow-up period than the general population.
“Subjects on or previously treated with growth hormones should not panic on reading these results,” the authors said in a written statement. “The results of this study highlight the importance of studies of this kind for the evaluation of the long-term effects of treatment.”
While the researchers can’t explain why the hormone treatments, which are usually given in daily injections over four to five years, led to the strokes, earlier studies of animals with a metabolic disorder that causes them to produce excessive amounts of the hormone showed that such animals tend to have more bleeding events. The scientists admit, however, that it’s also possible that short stature itself may have some connection to stroke risk, since other disorders in which people don’t grow properly are also linked to abnormal blood flow to the brain.
The study, which involved nearly 7,000 participants, provides good reason for people taking growth hormone to discuss the potential risk of stroke with their doctors, say the authors. Whether the findings apply to others who take growth hormone – athletes who use it for performance enhancement, or those affected by other diseases such as kidney disorders – isn’t clear yet.
But you need to take it for at least five years, and probably 10, for the benefits to be seen
Researchers have found that taking aspirin over a period of several years in late middle age can reduce deaths from bowel, esophageal and stomach cancer by 40%, 35% and 50%, respectively, Reuters reports.
The claim is based on a sweeping review of all available research into the harms and benefits of aspirin.
However, researchers stress that the benefits are only apparent if the drug is taken for up to 10 years between the ages of 50 and 65.
The study’s lead author, Professor Jack Cuzick, head of the center for cancer prevention at Queen Mary University of London, said that benefits were only seen after at least five years of low daily doses (about 75 to 100 mg).
Researchers also warned that 60-year-olds who take the drug for 10 years could slightly increase their chances of stomach bleeding, which could prove fatal for a small number of people. Aspirin can also increase the chances of a hemorrhagic stroke, which is caused by the rupture of a blood vessel in the brain.
Cuzick concluded that taking the drug did not relieve users of the obligation to live healthily. Although a daily dose of aspirin can help reduce the risk of some cancers, the drug “should not be seen as a reason for not improving your lifestyle,” Cuzick told the Guardian.
Dr. Emmanuel Farber's research contributed to a paradigm shift in American attitudes to tobacco
Emmanuel Farber, the Canadian-American doctor whose medical research contributed to groundbreaking discoveries in the study of cancer-causing chemicals, died on Sunday. He was 95.
“He represents a guiding example of a life devoted to serving his fellow man and scientific colleagues with unmatched qualities of integrity, humbleness, deep reasoning, and an exquisite no-nonsense … approach to science,” the Society of Toxicologic Pathology wrote in 1985, when inducting him as an honorary member.
Farber was born in 1918 in Toronto, where he would first study medicine. After graduating from the University of Toronto with an M.D. in 1942 and serving in the Royal Canadian Medical Corps during World War II, he earned a Ph.D. in biochemistry from the University of California, Berkeley.
His career was long and his legacy is vast, but perhaps his most enduring accomplishment came at the nexus of medicine and public policy, when, in the early 1960s, he sat on the Surgeon General’s Advisory Committee on Smoking and Health, which produced some of the earliest conclusive evidence that cigarettes could cause cancer. The committee’s report, according to Harvard Medical School, caused a paradigm shift in American culture, which until then largely dismissed concerns surrounding smoking’s health risks.
Over the course of his career, Farber held positions on the faculties of Tulane University, the University of Pittsburgh, and his alma mater in Toronto; he also served as president of both the American Association for Cancer Research and the American Society of Experimental Pathology. He received numerous awards for his scientific research.
He spent the last years of his life in Columbia, S.C., where he met his second wife, Henrietta Keller Farber. She died in 2011. He was also preceded in death by his first wife, Ruth Farber, and two siblings, Lionel Farber and Sophie Goldblatt. He leaves behind a daughter, a son-in-law, and one grandson.
Researchers published promising findings, while a pharmaceutical company applied for the first-ever regulatory approval of malaria vaccine
The world’s first malaria vaccine may just be a year away, after a thorough trial of a new drug showed promising results.
PLOS Medicine on Tuesday published a study in which researchers found that for every 1,000 children who received the vaccine, 800 cases of illness could be prevented. The children also retained protection 18 months after being injected.
Now, pharmaceutical manufacturer GlaxoSmithKline (GSK) has submitted the drug for regulatory approval — the first time a malaria vaccine has reached this stage.
“This is a milestone,” Sanjeev Krishna, professor of molecular parasitology and medicine at St. George’s, University of London, who reviewed the paper for the journal, told the BBC. “The landscape of malaria-vaccine development is littered with carcasses, with vaccines dying left, right and center. We need to keep a watchful eye for adverse events, but everything appears on track for the vaccine to be approved as early as next year.”
Around 800,000 people die from malaria every year, most of them children under 5 in sub-Saharan Africa. Several African countries were involved in the trial of the new vaccine, which was developed by GSK in cooperation with the nonprofit PATH Malaria Vaccine Initiative, with funding from the Bill & Melinda Gates Foundation.
A number of complicating factors and delayed data make conclusions difficult to draw
The New York Post reported Sunday that the number of cancer cases among 9/11 first responders had more than doubled in the past year, from 1,140 to over 2,500. However, to scientists who specialize in analyzing such data, the raw number of cases can never tell the full story.
Dr. Roberto Lucchini is an epidemiologist and director of the World Trade Center Health Program Data Center at Mount Sinai Hospital, which treats and researches the police officers, construction workers, sanitation workers and iron workers who were among the first responders on 9/11. To Lucchini, the number of observed cancer cases among these patients cannot be significant until compared with the number of expected cancer cases.
“I don’t think there’s a double of cases one year to the other,” Lucchini told TIME. “When you compare one year to the other, you have to be careful and try to understand what you are comparing. If you don’t compare correctly, you can come up with information that is not exactly true.”
“I don’t think they compared like-with-like which is what you normally do in epidemiology,” adds Dr. Billy Holden, a deputy director of the data center. “I don’t know how they came to the conclusion that there was a doubling.”
Mount Sinai has a record of 1,646 confirmed cancers from 2002 to the present among the more than 30,000 first responders it oversees. The hospital’s cases are reviewed and certified by the National Institute for Occupational Safety and Health (NIOSH). Meanwhile, the public registry — which also collects data on these cases — has confirmed 1,172 cancers among Mount Sinai patients, but the registry’s number only represents data through 2010, which may account for the difference.
“That’s the latest that we have in reliable data that we can use,” Holden says. “The delay is coming from the registries themselves. It takes them a long time to get the data.”
According to a press release from Mount Sinai, “analysis of available data through 2010 shows that there is an approximately 20% increase in cancer incidence in 9/11 rescue and recovery workers compared to the general population, with a particular increase in thyroid cancer, prostate cancer, myeloma, and leukemia.”
This elevated incidence rate could result from the high exposure to carcinogens that many first responders endured. However, even this number is subject to question because of a number of complicating factors, including over-diagnosis of certain cancers — such as thyroid and prostate — and questionably reliable data for the general population.
“Over-diagnosis means you’re just screening for cancers, and you pick up cancers that in the normal course of things would never cause symptoms and would never cause death,” Holden says. “The screening for thyroid and prostate cancer is picking up these really non-malignant cancers that don’t do anything.”
Another complicating factor is the continued aging of the first respondents. Epidemiologists would expect the number of observed cancer cases among this population to increase over the coming years regardless because everyone’s risk of cancer rises with time. “Numbers are interesting, but they’re not revealing because we have to look at the rates,” Holden says. “Looking at numbers themselves doesn’t mean anything. You have to put them in a certain context.”
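The comparison Holden describes — observed cases weighed against the cases an age-matched general population would be expected to produce — is typically summarized as a standardized incidence ratio (SIR). A minimal sketch of that arithmetic, using entirely hypothetical numbers rather than the Mount Sinai cohort’s actual data:

```python
# Illustrative standardized-incidence-ratio (SIR) calculation.
# All figures below are hypothetical, chosen only to show the method;
# they are not the real cohort or registry numbers.

# Hypothetical person-years of follow-up in the cohort, by age band.
person_years = {"40-49": 120_000, "50-59": 90_000, "60-69": 40_000}

# Hypothetical general-population cancer rates per 100,000 person-years.
reference_rate = {"40-49": 150.0, "50-59": 450.0, "60-69": 1_000.0}

# Expected cases: sum over age bands of (reference rate x person-years).
expected = sum(reference_rate[band] / 100_000 * person_years[band]
               for band in person_years)

observed = 1_050  # hypothetical observed case count in the cohort

sir = observed / expected  # >1 means more cases than the general population predicts
print(f"expected cases: {expected:.0f}")
print(f"SIR: {sir:.2f}")
```

The point of the age bands is exactly the one Holden makes: an aging cohort accrues more person-years in the high-rate bands over time, so the expected count rises too, and only the ratio — not the raw count — says whether anything unusual is happening.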
The search for a similar context alone can result in frustration for researchers. As so many residents of New York need not be reminded, 9/11 is an event that stands alone in our history.
“There’s nothing like this in the whole history of the world,” Lucchini says. “We can think about Chernobyl or Fukushima, but this is a totally different situation here… So for us to compare this to other studies and other experiences is quite difficult.”
Lucchini adds, “We are doing as much as we can.”
When it comes to the men and women who first responded on that fateful day, the question remains of how much can ever be enough.