TIME Research

Health Problems Linger for 9/11 First Responders

A firefighter walks through the rubble of the World Trade Center after it was struck by commercial airliners in a terrorist attack, Sept. 12, 2001.
Todd Maisel—New York Daily News/Getty Images

Emergency medical services workers who arrived at the World Trade Center site were twice as likely to show signs of depression as those who didn't

It’s been more than a decade since the attacks of 9/11, but many of the first emergency workers to arrive at the World Trade Center site continue to feel health effects. Nearly 17% of emergency medical service (EMS) workers who responded to the 9/11 terrorist attacks display symptoms of depression and 7% show signs of Post-Traumatic Stress Disorder (PTSD), according to a new study in the journal Occupational & Environmental Medicine.

“Our findings are part of a pattern of adverse health outcomes found among those who were exposed to the disaster,” said study author Mayris Webber, a health official at the New York City Fire Department, in an email. “We highlight the importance of continued medical monitoring and treatment of FDNY EMS workers, and indeed, of other responders and individuals who were affected by the [World Trade Center] disaster.”

Read More: Why 40% of Americans Misremember Their 9/11 Experience

The study, conducted by researchers at the New York City Fire Department, evaluated the health of nearly 2,300 of the department’s EMS workers over a 12-year period. In addition to depression and PTSD, EMS workers experienced a number of conditions that affected their physical health, including acid reflux disease (12%) and cancer (3%). And the earlier a medical worker responded, the greater his or her risk for medical conditions.

The study adds to a body of research that has found long-term health effects on police officers and firefighters who responded to 9/11, but it’s the first research to look specifically at EMS workers, whose primary responsibility is to provide medical care. Because of the difference between their roles, EMS workers tend to be at lower risk of health conditions than police officers and firefighters, researchers said.

The findings suggest that 9/11 responders still need to be monitored to protect their health, researchers said. The New York City Fire Department plans to do just that.

“At the Fire Department, in addition to providing treatment, we will continue our efforts to identify emerging health conditions and to identify individuals who are at high risk for developing these conditions,” said Webber.

TIME Research

Here’s Why Our Knuckles Crack

Scientists say they've finally figured out what happens when we pop our fingers

Scientists have answered the puzzling question of why our knuckles make that “pop” sound when we crack them.

A team of University of Alberta researchers had a volunteer crack his knuckles inside an MRI scanner so the researchers could figure out what was going on. They published their findings on Wednesday in the journal PLOS ONE.

The researchers concluded that the crack comes from a gas-filled cavity, or “bubble,” that forms in the fluid within the joint.

“It’s a little bit like forming a vacuum,” said lead study author Greg Kawchuk, a professor in the Faculty of Rehabilitation Medicine, in a statement. “As the joint surfaces suddenly separate, there is no more fluid available to fill the increasing joint volume, so a cavity is created and that event is what’s associated with the sound.”

MORE: You Asked: Is Cracking Your Knuckles Bad?

The study supports an original theory from the 1940s, the researchers say. But in the 1970s, other researchers believed that the sound came from a bubble collapsing in the joint instead.

But is cracking your knuckles bad for you? That little bubble appears to be benign; there’s no evidence to suggest that people who crack their knuckles are more likely to suffer from arthritis than those who don’t.

TIME Addiction

E-Cig Use Triples Among Middle and High Schoolers in One Year

Elizabeth Renstrom for TIME

More students use e-cigarettes than the conventional kind

E-cigarette use among middle school and high school students tripled in one year, reported the U.S. Centers for Disease Control and Prevention on Thursday. The new data shows that e-cigarette use has surpassed the use of all tobacco products, including regular cigarettes, among young people.

The data, published in the CDC’s Morbidity and Mortality Weekly Report, shows that e-cigarette use among high schoolers increased from 4.5% in 2013 to 13.4% in 2014. That’s a rise from about 660,000 students to 2 million, the CDC says. Use among middle schoolers rose from 1.1% to 3.9% in the same time period.
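The CDC’s percentages and head counts imply a high-school population of roughly 15 million, which is a useful sanity check on the figures. A quick back-of-envelope sketch (the implied population is derived here from the article’s own numbers, not stated in it):

```python
# Check that the CDC's reported percentages and student counts are
# mutually consistent by backing out the implied high-school population.
users_2013 = 660_000      # high schoolers using e-cigs in 2013 (article)
rate_2013 = 0.045         # 4.5%
users_2014 = 2_000_000    # high schoolers using e-cigs in 2014 (article)
rate_2014 = 0.134         # 13.4%

pop_2013 = users_2013 / rate_2013   # implied population, 2013
pop_2014 = users_2014 / rate_2014   # implied population, 2014

print(f"implied high-school population: {pop_2013/1e6:.1f}M (2013), {pop_2014/1e6:.1f}M (2014)")
print(f"one-year growth in use: {rate_2014 / rate_2013:.1f}x")
```

Both years back out to roughly the same population (about 14.7 to 14.9 million), and the growth factor comes out to about 3x, matching the headline.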

MORE: E-Cig Flavors May Be Dangerous, Study Says

The study looked at all forms of tobacco use and found that hookah use doubled among middle and high schoolers, while use of other methods like cigarettes and cigars declined.

“We want parents to know that nicotine is dangerous for kids at any age, whether it’s an e-cigarette, hookah, cigarette or cigar,” said CDC director Dr. Tom Frieden in a statement. “Adolescence is a critical time for brain development. Nicotine exposure at a young age may cause lasting harm to brain development, promote addiction and lead to sustained tobacco use.”

Tobacco products like cigarettes and smokeless tobacco are regulated by the FDA, and the agency is currently finalizing rules that would give it jurisdiction over other products, including e-cigarettes. In hopes of discouraging use among kids and teens, several states have passed laws that set a minimum age for purchasing e-cigarettes, and many states have extended traditional cigarette bans to include e-cigarettes.

TIME medicine

The Strange Way a Diabetes Drug May Help Skin Scars

BIOPHOTO ASSOCIATES—Getty Images/Photo Researchers RM

We all form scars, but most of us don’t want them. There may soon be a way to make them disappear

We all have them — scars that won’t let us forget the spill we took off a bike, the burn we got from a hot stove, or even the legacy of radiation therapy. Scarring is a good thing in some ways — it’s the body’s quick response to a deep injury, its way of protecting and sealing up the wound to keep infections and other noxious agents away.

Now scientists led by Dr. Michael Longaker, co-director of the Institute for Stem Cell Biology and Regenerative Medicine at Stanford University, report in the journal Science that they have teased apart the molecular steps behind scarring, and also discovered a way to inhibit them from forming.

While training to become a plastic surgeon, Longaker operated on fetuses still in the womb and became intrigued by the fact that they did not scar; any incisions surgeons made disappeared practically without a trace. If genes do not change between the womb and birth, why do infants form scars when fetuses do not?

Working with mice, the team focused on two kinds of fibroblasts, which are cells responsible for maintaining the structural integrity of organs, tissues and more. One is primarily responsible for wound healing, and the formation of tumors like melanoma. “This type of fibroblast starts out as less than 1% of the developing skin, but by the time an animal is a month old, it’s 80% of the fibroblasts in skin on the back of the animals,” he says.

When he treated the cells with diphtheria toxin, which destroys the fibroblasts, the animals scarred less. It turns out that these fibroblasts carry a marker on their surface that helps scientists pick them out. Even more fortuitously, there is a drug, already approved for treating type 2 diabetes, that inhibits the work of this marker.

In the mice, the drug reduced scarring but did not compromise the integrity of the wounded skin at all, making it a promising potential treatment for scarring in people. Each year, people in the U.S. undergo 80 million operations, the bulk of which require incisions that leave a mark, not to mention the millions more who get cuts or scrapes in accidents or who develop fibrous tissue after radiation to treat cancer. If the scar-inhibiting drug is used on those wounds before they begin to heal, says Longaker, it’s possible they won’t leave a scar.

Whether the same could be true of existing scars isn’t clear yet. But he says that doctors may be more eager to do revision surgery to minimize scars if such a compound exists. And, if the results are repeated and confirmed, doctors may be able to reduce scars not just for cosmetic purposes but for medical ones as well, such as in the heart after a heart attack, following spinal cord injury and in deep tissues treated with cancer-fighting radiation.

TIME Diet/Nutrition

Dining Out May Be Linked To High Blood Pressure in Young People

Getty Images

One additional meal eaten away from home correlates with a more than 5% increase in the chance of having prehypertension

Young people who eat out often are more likely to have high blood pressure than their counterparts who cook their own food, according to a new study in the American Journal of Hypertension.

The study, which looked at more than 500 university students in Singapore, found that one additional meal eaten away from home correlates with a more than 5% increase in the chance of having prehypertension, or slightly elevated blood pressure. Researchers explained that meals eaten out typically contain more calories, saturated fat and salt than homemade meals. All three of those factors have been shown to cause high blood pressure.

Overall, 38% of students ate at least 12 meals away from home each week, or around half of their weekly meals. Half of the male participants had prehypertension, compared with only 10% of the women.

The study authors note that the research doesn’t identify cause and effect and is likely more applicable to residents of South East Asia than elsewhere.

TIME health

How Doctors of the Past Blamed Women for Breast Cancer

‘Motherhood’ by Hector Caffieri, circa 1910
Fine Art Photographic/Getty Images

Breastfeeding, corsets and aging: the mysterious dangers of womanhood


This post is in partnership with History Today. The article below was originally published at HistoryToday.com.

‘I congratulate … [my fair countrywomen] on their present easy and elegant mode of dress’, wrote the surgeon James Nooth, in 1804, ‘free from the unnatural and dangerous pressure of stays.’ Nooth’s concern was not aesthetic. The danger he saw in restrictive bodices was cancer: ‘I have extirpated [removed] a great number of … tumours which originated from that absurdity.’

Breast cancer in the 19th century was a consistent, if mysterious, killer. It preoccupied many doctors, unable to state with any confidence the disease’s causes, characteristics or cures. While the orthodox medical profession in Britain were broadly agreed on cancer’s ultimate incurability, they were less uniform in their understanding of its origin. The disease was thought to develop from a range of harmful tendencies and events acting together. Both the essential biology of being female, as well as typically ‘feminine’ behaviors, were understood as causes of breast cancer.

Breastfeeding was a contentious topic at the end of the 18th century. An image of idealised motherhood emerged that infiltrated concepts of femininity: women were by nature loving, maternal and self-sacrificing. This ideology was expressed through changing social and political attitudes to breastfeeding and an outcry against wet-nursing across western Europe. In 1789 only 10 per cent of babies born in Paris were nursed by their own mothers; by 1801, this number had increased to half of all Parisian infants and two thirds of English babies.

Late 18th-century medical men were explicit about the associations between breastfeeding and breast cancer. In 1772, man-midwife William Rowley wrote: ‘When the vessels of the breasts are over-filled and the natural discharge through the nipple not encouraged … it lays the foundation of the cancer.’ Frances Burney – an aristocratic novelist who underwent a mastectomy in 1811 – attributed her disease to her inability to breastfeed properly: ‘They have made me wean my Child! … What that has cost me!’

Menstruation was seen as particularly hazardous. The surgeon Thomas Denman wrote: ‘Women who menstruate irregularly or with pain … are suspected to be more liable to Cancer than those who are regular, or who do not suffer at these times.’ However, their risk only increased after menopause. Denman considered ‘women about the time of the cessation of the menses’ most liable to cancer. Elderly women were blighted by a dual threat: their gender and their age. While surgeons insisted their theories were based on clinical observation, designating these various female-specific processes as causes of cancer supported their broader thoughts about female biology.

Eighteenth-century theory dictated that all diseases were explained by an imbalance in ‘humours’: black bile, yellow bile, blood and phlegm. Into the 19th century the insufficient drainage of various substances continued to be invoked as a cause of cancer; women’s ‘coldness and humidity’ made them particularly prone to disease. Menstruation was the primary mechanism by which the female body cleansed the system of black bile and its regularity was seen as central to a woman’s wellbeing. Certain situations in which the menses were disrupted or had been terminated were, therefore, especially dangerous: pregnancy, breastfeeding and menopause. Similarly, when the female body and its breasts were not used for their ‘correct’ purpose – childbearing and rearing – the risk of breast cancer increased.

The historian Marjo Kaartinen has noted that 18th-century theorists considered just ‘being female and having breasts’ a threat to a woman’s health. This way of thinking about female biology suggested that women were more likely to suffer from all cancers, not just cancer of the breast. Denman wrote: ‘It can hardly be doubted … that women are more liable to Cancer than men.’

This association between womanhood and disease, and between breastfeeding, pregnancy, menopause and cancer, is still part of our 21st-century understanding of breast cancer: certain female-specific processes make you more or less likely to succumb to it. On its website, the breast cancer charity Breakthrough lists various ways you can reduce or increase your chances of disease. According to contemporary research, having children early and breastfeeding them reduces your risk, and the later a woman begins her family, the higher her risk. The contraceptive pill, growing older and the menopause also increase your risk of breast cancer.

Drawing attention to such historical continuities questions the social and cultural environments that make certain medical assumptions possible. The causes of cancer suggested by Denman, Nooth and friends were informed by their understandings of female biology and female inferiority more generally. They were working within a school of thought that suggested any deviation from ‘appropriate’ womanhood could have hazardous consequences for a woman’s health. While the role of the historian might not be to deny the validity of 21st-century medical research, it is part of our remit to question cultural assumptions that continue to have some effect on both the conclusions of scientists and the way those conclusions are accepted by the broader public.

Agnes Arnold-Forster is a PhD candidate at King’s College London.

TIME Diet/Nutrition

Should I Drink Fat-Free Half and Half?

5/5 experts say no.

“Hell no!” says Dr. David Katz, director of the Yale University Prevention Research Center, followed by: “What is it?” Those were sentiments echoed by all of our experts in this week’s burning food question.

Lest you, too, are left scratching your head, here’s the lowdown. Half-and-half math is simple: whole milk plus cream. The fat-free version requires some more advanced calculations, however. “It typically replaces the milk fat with corn syrup and thickeners,” says Julia Zumpano, an RD at Cleveland Clinic’s Heart and Vascular Institute. (Kristi King, senior clinical dietitian at Texas Children’s Hospital, agrees that the real thing is better than additives.) The ingredient list on a typical brand of fat-free half-and-half includes fat-free milk, corn syrup, carrageenan, cream, artificial color, disodium phosphate, guar gum and vitamin A palmitate. It has half the calories (20) of regular half-and-half and about twice the sodium (20-30 mg), plus added sugar (1-2 grams).

“Fat-free half-and-half strikes me as an absolutely unnecessary product,” says Mario Kratz, PhD, a dairy researcher and nutrition scientist at the Fred Hutchinson Cancer Research Center in Seattle. It exists, of course, because people want the rich texture and flavor and calcium benefits without the fat or calories. But that dairy phobia is misguided, according to Kratz’s recent review on dairy. “Our work shows that consuming dairy foods in their full-fat form (rather than nonfat or low-fat) is associated with lower weight gain, a lower risk of obesity, and possibly even lower risks for type 2 diabetes and cardiovascular disease,” he says.

MORE Why Full-Fat Dairy May Be Healthier Than Low-Fat

These findings are largely drawn from observational studies, so they can’t establish cause in the way that a randomized controlled trial can, Kratz cautions. Nor do the findings imply that chugging a carton of regular half-and-half is a good idea—just that drinking the fat-free version might be a worse one.

“I didn’t know there was such a thing as fat-free half-and-half. Sounds awful,” says Andrew Weil, MD, founder of the Arizona Center for Integrative Medicine. Weil steers people away from nonfat dairy products (though he cops to dipping into fat-free sour cream every once in a while) because taking the fat out of dairy, he says, might have hormonal effects. “Milk contains natural sex hormones,” he says. “The centrifugation process for preparing nonfat milk and products made from it causes differential concentration of male and female hormones in the separated watery and fatty components.” Some studies link skim milk to increased risk of type-1 diabetes, male acne and infertility in women, he says.

Without the fat, half and half is a lot like the skim milk that makes it up—it just isn’t quite whole.

Illustration by Lon Tweeten for TIME

Read next: Should I Eat Falafel?


TIME TIME 100

Meet the Women Scientists of TIME 100

Joanne Liu
Bryan Schutmaat for TIME

These five influential women are pioneers in the fields of science and medicine

It will surprise no one to learn that women are vastly underrepresented in science. But this year’s TIME 100 includes five outstanding women who are making huge strides in medicine, genetics and infectious disease.

Read more about these five influential scientists.

Dr. Joanne Liu, International president of Doctors Without Borders/Médecins Sans Frontières (MSF)
Liu and her team at MSF were the first to respond to the Ebola outbreak in Guinea. She has become a leader in the response, and has fiercely and publicly criticized the international community for its slowness to act.

Emmanuelle Charpentier & Jennifer Doudna, Creators of gene-editing technology
Charpentier and Doudna developed a groundbreaking gene-editing technique called CRISPR-Cas9, which allows scientists to add or remove genetic material as they please. The process has major implications for a variety of health problems from HIV to sickle cell anemia to cancer. In theory, CRISPR-Cas9 could be used to edit any human gene.

Dr. Pardis Sabeti, Geneticist who sequenced the Ebola genome from the most recent outbreak
Sabeti and her team are responsible for quickly sequencing the genome of the Ebola virus that has ravaged Guinea, Sierra Leone and Liberia. The task was important, since it determined that the disease was indeed spreading from person to person. Many of her collaborators and fellow researchers died during the outbreak. When she’s out of the lab, Sabeti sings in a rock band.

Elizabeth Holmes, Health technology entrepreneur
Holmes is the CEO of Theranos, a blood testing company that has challenged the traditional lab testing model. She studied chemistry before dropping out of Stanford University her sophomore year to start her company, and at age 31 she made Forbes’ Billionaires List as the youngest self-made woman billionaire.

TIME Addiction

E-Cig Flavors May Be Dangerous, Study Says

Elizabeth Renstrom for TIME

Why you might want to reconsider that cotton candy e-cig

The chemicals used to flavor e-cigarettes may surpass safe levels, a new study says.

The study, published in the journal Tobacco Control, reveals that high exposure to these chemicals could cause respiratory irritation. The chemicals used to flavor e-cigarettes are the same flavors often added to foods, which the FDA has determined to be generally recognized as safe in food. However, the authors of the new study say the levels they found raise safety concerns and point to a need for regulation, and that these chemicals may be more dangerous when inhaled than when ingested in food.

“Chronic inhalation of these ingredients has not really been studied much at all,” says study author James F. Pankow, a professor of chemistry and civil & environmental engineering at Portland State University.

In the study, Pankow and his colleagues assessed the levels and types of flavor chemicals used in 30 different e-cigarette refill bottles, including a wide variety of flavors like tobacco, menthol, vanilla, cherry, coffee, chocolate, grape, apple, cotton candy and bubble gum. In 13 of the 30 products, the flavor chemicals made up more than 1% of the refill liquid volume, the researchers found, and the chemical levels were higher than 2% in seven of the liquids. Two of the liquids had levels of flavor chemicals higher than 3%.

The researchers found that some of the flavor chemicals used were benzaldehyde and vanillin, which are known to be respiratory irritants and have exposure limits for the workplace. However, when Pankow and his colleagues estimated consumption rates, they found that an e-cigarette liquid consumption rate of about 5 ml per day puts users at an exposure of twice the recommended occupational limits. “That’s probably not a good thing,” says Pankow.
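The scale of the exposure is easy to see from the study’s own figures. A rough sketch of the intake arithmetic, assuming a liquid density of about 1.1 g/ml (typical of propylene-glycol/glycerin bases; the density is my assumption, while the 5 ml/day rate and the 1-3% flavor fractions come from the article):

```python
# Rough daily intake of flavor chemicals implied by the study's figures.
# Density is an assumption (~1.1 g/ml for a PG/VG base); the flavor
# fractions are the thresholds reported in the study (1%, 2%, 3% of
# the refill liquid).
DENSITY_G_PER_ML = 1.1   # assumed density of e-liquid
daily_ml = 5.0           # consumption rate used by the study authors

for flavor_fraction in (0.01, 0.02, 0.03):
    grams_liquid = daily_ml * DENSITY_G_PER_ML
    mg_flavor = grams_liquid * flavor_fraction * 1000  # grams -> mg
    print(f"{flavor_fraction:.0%} flavor liquid -> ~{mg_flavor:.0f} mg of flavor chemicals per day")
```

At 2% flavoring, that works out to on the order of 100 mg of flavor chemicals inhaled per day, which gives a sense of why the authors compare the figure to occupational exposure limits.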

The study authors point out several concerns about flavoring, including the fear that flavored e-cigarettes might attract young people and the fact that flavored e-cigarettes don’t usually list the levels of specific chemicals that are present in the liquids.

“The point is that when e-cigarettes manufacturers talk about these things as being food grade or food-like, they are sort of suggesting that use of flavors is equivalent to using them in foods,” says Pankow. “Never mind the fact that these things have not really been tested for safety, but in food FDA requires labeling ingredients. If they are going to say these are food-like, then why don’t they list the ingredients? It’s also not food-like product because you are inhaling it not ingesting.”

The researchers note that the small sample size doesn’t necessarily represent the entirety of the growing e-cigarette market. But they conclude that a broader survey would likely have produced similar results, and that high levels of certain flavor chemicals are likely present in many products.

TIME Biology

Here’s Why You Have a Chin

Gorgeous—and pretty much useless
Chev Wilkinson; Getty Images

Hint: You could do perfectly well without it

Nature is nothing if not parsimonious, especially when it comes to the human body. There’s a reason we don’t have webbed feet or nut-cracking beaks like other species, and that’s because we don’t need them. The system isn’t perfect, of course. If you ever wind up having painful abdominal surgery, odds are pretty fair that it will be your good-for-nothing appendix that’s to blame. And wisdom teeth seem a lot less wise when you consider how often they fall down on the job and need to get yanked.

As it turns out, the same why-bother pointlessness is true of what you might consider one of your loveliest features: your chin.

Researchers have long wondered what the adaptive purpose of the chin could possibly be. Sexual selection seems like an obvious answer, since an attractive chin increases your chances of mating. But a feature needs a function before it can appear in the first place. Only then can it be assigned some aesthetic value.

The other, better answer is all about chewing. The jaw exerts enormous forces when it bites and chews—up to 70 lbs. per sq. in. (32 kg per 6.5 sq. cm) for the molars. Conscious clenching increases the figure, and people who grind their teeth in their sleep may exceed the average force 10-fold. What’s more, the jaw moves in more than just one axis, both chewing up and down and grinding side to side.

That, so the thinking went, might increase bone mass in the same way physical exercise builds muscle mass. And bone mass, in turn, may produce the chin. The problem with the theory, however, is that it doesn’t account for Neanderthals and other primates—including the great apes—which lack prominent chins but in many cases have far more powerful bites than we do.

To answer the riddle, Nathan Holton, a post-doctoral researcher who specializes in craniofacial structure at the University of Iowa school of orthodontics, selected 37 of the many subjects whose facial measurements have been taken regularly from age 3 to young adulthood as part of the longstanding Iowa Facial Growth Study (yes, there is such a thing).

With the help of basic physics, it’s possible to determine how much force any one jaw exerts without the subjects ever having to be tested directly with a bite gauge. Researchers measure the geometry of what orthodontic researchers call the mandibular symphysis (what everyone else just calls the chin region) and compare it to the bending moment arm: the distance between where a force is initially applied (in this case, by the jaw muscles) and where that force is eventually felt (the chin). Together, those yield a pretty good measure of the force exerted.

“Think about removing the lug nuts from a wheel on your car,” Holton wrote in an e-mail to TIME. “The longer the wrench, the easier it is because the longer wrench increases the moment arm, allowing you to create more force.”
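Holton’s lug-nut analogy is just the lever law: torque equals force times moment arm, so producing the same torque with a longer arm takes proportionally less force. A minimal illustration (the numbers are arbitrary, chosen only to show the proportionality):

```python
# The lever law behind the lug-nut analogy: for a fixed torque,
# the force required falls in proportion to the moment arm length.
def force_needed(torque_nm: float, arm_m: float) -> float:
    """Force (N) required to produce `torque_nm` with lever arm `arm_m`."""
    return torque_nm / arm_m

torque = 100.0  # N·m needed to free the lug nut (arbitrary example value)
for arm in (0.2, 0.4):  # a 20 cm wrench vs a 40 cm wrench
    print(f"{arm * 100:.0f} cm wrench -> {force_needed(torque, arm):.0f} N")
# Doubling the wrench length halves the required force: 500 N vs 250 N.
```

The jaw works the same way in reverse: for a given muscle force, the geometry of the moment arm determines how much force the chin region has to resist.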

And more force, in this case, should mean more bone mass in the chin—but that’s not what the results of the new research showed. Not only did the two turn out to be unrelated in the 37 subjects studied, but Holton and his colleagues even found that as the face matures, the chin is less adept at resisting mechanical forces, which is the whole reason it was assumed to grow more pronounced in the first place.

So why did we grow chins at all? The answer is, we didn’t. Holton and his collaborator, University of Iowa anthropologist Robert Franciscus, instead suspect that the face shrank away from behind the chin as primitive and pre-humans became modern humans, making it appear larger relative to everything else. The reason, as with so many things in the human species, has to do with male behavior—specifically violent male behavior.

As humans migrated from Africa 20,000 years ago and settled down into societies, males had to become less competitive and more cooperative—giving an advantage to those with lower testosterone levels. And reduced testosterone softens and shrinks the craniofacial structure.

“What we are arguing is that modern humans had an advantage at some point to have a well-connected social network,” Franciscus said in a statement accompanying the study. “And for that to happen, males had to tolerate each other. There had to be more curiosity and inquisitiveness than aggression, and the evidence of that lies in facial architecture.”

It wasn’t until we had our chins that we set about assigning value to them—strong ones, weak ones, angular, round, cleft or dimpled, depending on your tastes. Those tastes—and the mating choices that arise from them—ensure that the chin will stay. It might be biomechanically useless, but you’d look awfully silly without one.

Read next: Can Plastic Surgery Make You More Likeable? A Close Look at a New Study

