TIME Mental Health/Psychology

Survey: 15% of Med Students in the U.K. Have Considered Suicide

Adam Berry—Getty Images A doctor holds a stethoscope on September 5, 2012 in Berlin, Germany.

A new survey of medical students in the UK illustrates complex mental health issues

Before medical students earn their MDs and start the hard work of being doctors, they have to make it through medical school—and that can be a grueling time, according to the results of a new survey published in the journal Student BMJ.

In the survey, which was taken by 1,122 medical students in the UK, 30% said that they had experienced a mental health condition, or had been treated for one, during medical school. Of those students, 80% said they felt a lack of support from their medical schools. About 15% of all respondents said they had considered suicide at some point during their medical school careers.

MORE: Why The Toxic Treatment Of Doctors Needs To Change

The survey was small and therefore not necessarily representative of all doctors in training in that country; the number of people who completed it represents only about 2% of medical students in the UK, according to the paper. But the answers highlight the stress that some doctors experience. Fierce competition among students and an unforgiving exam schedule are some of the reasons that mental health issues are so prevalent, the authors say.

More than 8% of the students in the survey reported using brain-enhancing drugs for academic reasons. “As a postgraduate student studying undergraduate medicine I worry for my younger colleagues,” said one student in the survey. “I am stunned by the amount who take prescription medication during exam time.”

Mental health problems are stigmatized in medicine, and that attitude trickles down from the highest reaches of the medical hierarchy, the survey authors say. TIME’s recent story on the mental health of American physicians shows that the many stresses of medicine only tend to get worse after medical school, and many young physicians in training have high levels of depression and anxiety. Up to 400 American physicians die by suicide every year, according to the American Foundation for Suicide Prevention.

Administrators are actively trying to figure out how to reduce the stigma students feel when asking for, or needing, help with their mental health. At Cardiff University, for instance, students can refer themselves, without consequence, for mental health help, and the referrals have increased fourfold since the program began.

Not everyone who had a mental health issue in medical school felt unsupported, the survey also found. Nearly 60% of students with an issue—but without suicidal thoughts—said they felt well supported by their school. “I couldn’t ask for better support from my medical school,” one student wrote. “I was embarrassed to approach them—but they were great and so understanding.”

TIME Crime

Men Who Buy Sex Are More Prone to Sexual Violence, Study Says

Sex buyers share characteristics with men who commit sexual violence

Men who buy sex are more prone to sexual coercion and are more likely to report a history of sexual violence, according to a new study.

The study of 101 men in the Boston area, published Monday in the Journal of Interpersonal Violence, found that men who hire prostitutes tend to have less empathy for women and tend to share characteristics with sexually violent men. The researchers screened 1,200 men in order to isolate demographically comparable groups to interview. “Both groups tend to have a preference for impersonal sex, a fear of rejection by women, a history of having committed sexually aggressive acts and a hostile masculine self-identification,” said UCLA professor Neil Malamuth, who co-authored the study, in a statement. “Those who buy sex, on average, have less empathy for women in prostitution and view them as intrinsically different from other women.”

The researchers define “hostile masculinity” as a hostile and narcissistic desire to have power over women. One man told researchers he thought of prostitution like buying a cup of coffee: “When you’re done, you throw it out.”

The study also found that men who buy sex are more likely to rape and commit other sexual offenses. The study comes as more and more jurisdictions are focusing on targeting johns rather than prostitutes in their efforts to curb prostitution and sex trafficking, and on the heels of Amnesty International’s vote in August to recommend the complete decriminalization of prostitution for buyers and sellers. Read about the effort to target sex-buyers in the United States here.

The study was co-authored by Melissa Farley, who runs Prostitution Research & Education, a nonprofit that studies prostitution and sex trafficking. In its mission statement, PRE says it is dedicated to abolishing prostitution altogether. The study was also funded by Hunt Alternatives.

SPECIAL REPORT: Catching Johns: Inside the National Push to Arrest Men Who Buy Sex

 

TIME Heart Disease

Here’s How To Find Out Your Real Heart Age

Getty Images

So much for being "young at heart"

The age on your birth certificate may say one thing, but the age of your heart is likely significantly older.

A new report from the U.S. Centers for Disease Control and Prevention (CDC) released on Tuesday reveals that three out of four Americans have a predicted heart age that’s older than their real age, which means they are at greater risk for heart attacks and strokes.

A person’s heart age is based on risk factors like blood pressure levels, whether they smoke and how much they weigh.

(Calculate your heart age here, if you’re between ages 30 and 74.)
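To make the idea concrete, here is a minimal sketch of how a calculator like this works: a risk score is computed from a person’s risk factors, and heart age is the age at which someone with otherwise typical risk factors would have the same score. The CDC tool is based on the Framingham risk model; the risk factors chosen and all of the coefficients below are illustrative placeholders, not the published values.

```python
import math

def predicted_risk(age, systolic_bp, smoker, bmi):
    # Hypothetical log-linear risk score; the weights are placeholders,
    # not the coefficients used by the CDC/Framingham model.
    return (1.0 * math.log(age)
            + 0.5 * math.log(systolic_bp)
            + 0.2 * (1 if smoker else 0)
            + 0.2 * math.log(bmi))

def heart_age(age, systolic_bp, smoker, bmi, normal_bp=125, normal_bmi=22.5):
    # "Heart age" is the age at which a non-smoker with typical blood
    # pressure and weight would have the same predicted risk.
    target = predicted_risk(age, systolic_bp, smoker, bmi)
    for candidate in range(30, 101):
        if predicted_risk(candidate, normal_bp, False, normal_bmi) >= target:
            return candidate
    return 100

# Example: a 50-year-old smoker with elevated blood pressure and a BMI of 28
print(heart_age(50, 140, True, 28))  # 68 with these placeholder weights
```

The real calculator uses the published Framingham coefficients and additional inputs such as diabetes status, but the structure is the same: map risk factors to a predicted risk, then translate that risk back into an age.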

In the new study, the researchers analyzed data collected from every state and from the Framingham Heart Study and estimated that about 69 million U.S. adults had a heart age older than their actual age. For men, the average heart age was about eight years older than their chronological age; for women, their hearts were an average of five years older than their real age.

MORE: This is the Worst Type of Fat for Your Heart

The researchers found some notable demographic differences. Heart age was highest among black men and women: black men had hearts three to four years older than white and Hispanic men, while black women had hearts five to seven years older than white and Hispanic women. Southern adults also had notably higher heart ages overall.

In the report, the study authors argue that heart age is a simple way for clinicians to convey heart disease risk to patients, one that might motivate Americans to adopt heart-protective lifestyle changes like quitting smoking, eating better or exercising more often.

TIME Research

America’s Smoking Rate Continues to Drop

Smoking
Stephan Geyer—Getty Images

But there are some at-risk groups, including LGBT people and Native Americans

The smoking rate for American adults has continued to fall, according to a new report.

The national smoking rate stands at 17% this year, a drop from the 18% reported in 2013, the Centers for Disease Control and Prevention found. Smoking rates were actually even lower earlier this year, when they were hovering around 15%, but some say the lower number is attributable to New Year’s resolutions.

Other demographic trends were released in the report. Multiracial people had the highest smoking rates, at 26.8%, with those of Native American ancestry close behind at 26.1%; experts believe that ceremonial use of tobacco affects Native American smoking rates. Men were more likely to smoke than women, as were those who identified as LGBT. More than 29% of smokers were below the poverty line.

The CDC notes that smoking is “the single largest preventable cause of death and disease” in the U.S., with 480,000 deaths per year being traced to cigarettes.

TIME Diet/Nutrition

Prevent Peanut Allergies by Giving Kids Peanuts, Doctors Say

The American Academy of Pediatrics points to a recent study that shows peanut allergy rates fell as much as 81% when this strategy was used

The American Academy of Pediatrics has some advice for parents trying to prevent their kids from developing peanut allergies: Give them peanuts.

As counterintuitive as it may seem, new research from the National Institutes of Health indicates that putting peanuts out of reach of children actually could make an eventual problem worse, the American Academy of Pediatrics said in a statement.

The group suggested that health care providers could recommend introducing peanut-containing products to children between the ages of 4 and 11 months, particularly those with severe eczema or egg allergies. Because whole peanuts can be a choking hazard, the AAP suggests creamy peanut butter for infants, carefully graduating to peanuts in their whole form around age 4.

About 5% of American children have a peanut allergy, which is higher than worldwide rates of between 1% and 3%. Research also points to a recent spike in peanut allergies among U.S. kids, with many experts estimating that rates may have tripled in the past decade alone.

The NIH’s study, released in February, found that introducing peanut products to high-risk infants was safe and led to a remarkable 81% reduction in the risk of developing a peanut allergy later.

Read Next: The Surprising Way to Treat Peanut Allergies

TIME Developmental Disorders

Researchers Zero in on the Best Way to Diagnose Autism

TIME.com stock health autism puzzle pieces
Illustration by Sydney Rae Hass for TIME

What’s the most reliable way to know if your child has autism? Is it a genetic test? Or are more traditional behavioral assessments, which measure talking and social skills, more accurate? The latest research provides some answers

Autism is a complex developmental disorder, and diagnosing it properly usually involves a combination of different tests. In the latest issue of JAMA, scientists provide the most up-to-date assessment yet of which tests work best for detecting genetic mutations associated with certain kinds of autism. Categorizing the various forms of autism will be important to guide parents to the proper care, the researchers say.

Traditionally, autism is diagnosed with behavioral tests that assess whether kids are meeting developmental milestones, such as talking, interacting with their parents and siblings, and learning to give and take in social situations. In recent years, researchers have been working on other ways to detect and potentially diagnose autism. Scientists have identified more than 100 genes connected with a higher risk of developing autism.

MORE: Study Finds Possible Association Between Autism and Air Pollution

Stephen Scherer, director of the Centre for Applied Genomics and a professor at the Hospital for Sick Children in Toronto, and his colleagues conducted a comparison to see how the genetic tests matched up, both against each other and against the more conventional behavioral evaluations.

They studied 258 children who were diagnosed with autism spectrum disorder. All had chromosomal microarray analysis, a form of genetic testing that looks specifically for abnormalities in the chromosomes; some also had more extensive genetic testing, called whole-exome sequencing.

MORE: How Brain Scans Can Diagnose Autism With 97% Accuracy

The two genetic tests were roughly equally capable of detecting a genetic basis for autism, each yielding a result in about 8% to 9% of the children. Even though they perform similarly, however, more labs and clinicians are favoring whole-exome sequencing, says Scherer. That’s concerning because the two tests pick up markers for different kinds of autism, and dropping the chromosomal test in favor of the more high-tech whole-exome sequencing would miss about half of the possible genetic predictors of autism. Together, the two gene-based tests can point to a genetic diagnosis in nearly 16% of cases.

“We need to use both technologies now,” he says. “If we only used one, we would miss some important information.”

The tests aren’t cheap. The chromosome-based test costs about $500, and exome sequencing slightly more. Ideally, this research suggests, both tests would be done for any child referred to a developmental pediatrician who suspects autism. But the reality is that, for now, insurers may not cover both.

Scherer’s group also looked at how non-genetic evaluations matched up with the genetic testing. Using factors such as brain scans to look for physical differences that might indicate autism, they divided the children into three groups based on the extent of physical anomalies they showed. Among children with more physical abnormalities, the two types of genetic testing together identified a genetic basis for autism in 37.5% of cases.

MORE: Autism Rises: More Children than Ever Have Autism, but Is the Increase Real?

That suggests that the most accurate diagnosis of autism may come from combining all three types of tests. Not only that, says Scherer, but such testing can also categorize the type of autism that a child may have. “We need to start looking at each autism case individually, and come up with the best recommendations,” he says.

For now, based on the results of the study, he recommends that behavioral testing be the first step. Then, the chromosomal test should be done to see if it yields any additional information about a connection to autism. Even if it does not, that doesn’t mean that there aren’t genetic factors in play. If the chromosomal test is negative, Scherer argues that in some cases the whole exome sequencing might be useful.

Working with genetic counselors can help parents decide if and when this type of genetic testing is needed. “The message is that we need to use all technologies to get as much detailed information as we can to marry them all together,” he says.

TIME Aging

Weight at Age 50 Connected to When a Person Gets Alzheimer’s

Being overweight may bring symptoms of the disease on earlier

Middle-aged Americans have one more reason to keep an eye on the scale as they age: research shows that people who are overweight at age 50 may develop Alzheimer’s sooner than those at a healthy weight.

Scientists at the National Institutes of Health studied midlife obesity’s connection to Alzheimer’s and reported in a study published Tuesday in the journal Molecular Psychiatry that being overweight or obese in middle age is linked to developing Alzheimer’s earlier.

“Maintaining a healthy BMI at midlife is likely to have long-lasting protective effects,” Dr. Madhav Thambisetty, lead author of the study and a researcher at the NIH’s National Institute on Aging, told the Associated Press.

BMI stands for body mass index, a common medical indicator calculated as a person’s weight in kilograms divided by the square of their height in meters. The medical community normally considers a BMI of 25 or higher to be overweight.

Read More: New Study Identifies 9 Risk Factors for Alzheimer’s Disease

The research team used the Baltimore Longitudinal Study of Aging, a project developed to track how healthy people age, to find the correlation. The study used the records of approximately 1,400 people who’d taken regular cognitive tests for 14 years; 142 of them developed Alzheimer’s. The researchers then used the records of those 142 people to determine their BMI at age 50 and discovered that each additional unit of BMI meant Alzheimer’s struck about 6.5 months sooner.
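A rough back-of-the-envelope sketch of that arithmetic, using the figures reported above: BMI is weight in kilograms divided by height in meters squared, and each BMI unit at midlife is associated with roughly 6.5 months earlier onset. The reference BMI of 25 and the linear scaling below are illustrative simplifications of the reported association, not the study’s statistical model.

```python
def bmi(weight_kg, height_m):
    # Body mass index: weight (kg) divided by height (m) squared.
    return weight_kg / height_m ** 2

def months_earlier_onset(midlife_bmi, reference_bmi=25.0, months_per_unit=6.5):
    # Illustrative only: scales the reported 6.5-months-per-BMI-unit
    # association linearly above an assumed reference BMI of 25.
    return max(0.0, midlife_bmi - reference_bmi) * months_per_unit

person = bmi(95, 1.75)                      # about 31, in the obese range
print(round(person, 1))                     # 31.0
print(round(months_earlier_onset(person)))  # about 39 months, i.e. over 3 years
```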

Researchers aren’t sure if the reverse is true, that is, whether having a healthy BMI and being slim at midlife protects against developing Alzheimer’s later on. They’re also not sure if losing weight after age 50 lessens the impact or delays Alzheimer’s.

Regardless, it’s a result that has many worried, particularly given the increasingly large population of obese middle-aged adults across the world. About 46 million people currently suffer from Alzheimer’s disease, with the number projected to double in the next 20 years.

TIME Diet/Nutrition

5 Foods That Taste Better in September Than They Will All Year

Here's what should be on your grocery list this month

Never know what’s growing now? Let’s take it one month at a time, with TIME‘s Foods That Taste Better Now Than They Will All Year.

As summer draws to a close, you might think your best days of produce are behind you. But there are plenty of fruits and vegetables to fall for in September. We asked Joan Casanova, spokesperson for Bonnie Plants, what early-fall items are worth watching.

Swiss chard: The deep green leaves and colorful stems of this veggie make it both beautiful and nutritious, says Casanova. Swiss chard is one vegetable that tolerates both cool temperatures and the heat, so you will see tasty varieties in September.

Rutabaga: A fall favorite, this root vegetable can be chunked or mashed, similar to potatoes. “It ripens best in cool autumn weather, taking on its characteristic mild, rich flavor after fall frosts descend on the garden,” says Casanova.

Lettuce: While lettuce is known for growing fast in full sun, Casanova says it is one of few vegetables that also does well in the shade. Home gardeners can grow lettuce in a small space, too.

Turnip leaves: These greens are extremely easy to grow in the fall; as nights become longer and cooler, turnip greens get crisper and sweeter, says Casanova.

Leeks: Leeks are sweet, mild and gentle on the digestive system, Casanova says. They don’t produce bulbs like onions do, but they “stash their flavor in thick, juicy stems that look like huge scallions.”

TIME Diet/Nutrition

McDonald’s is Making a Big Change to the McMuffin

Egg McMuffin McDonald's
Mike Blake—Reuters An Egg McMuffin meal is pictured at a McDonald's restaurant in Encinitas, Calif. on Aug. 13, 2015.

The McMuffin will soon be made with real butter

McDonald’s is going to make McMuffins with real butter.

McDonald’s is changing how it makes its biscuits, English muffins and bagels, CNBC reports, transitioning from liquid margarine to real butter.

Some stores are already advertising the change.

According to CNBC, one sign at a Manhattan McDonald’s location said: “We’re proud to cook breakfast items on the grill with real butter and we toast our English Muffins, biscuits and bagels with real butter too.”

McDonald’s has not responded to requests for confirmation or comment.

It was unclear how long the chain has been using real butter on its breakfast items.

[CNBC]

TIME Diet/Nutrition

How the Diets of Early Humans Explain Our Eating Habits

healthiest foods, health food, diet, nutrition, time.com stock, dark meat, poultry, chicken
Danny Kim for TIME

Meat formed the crucial lean-season food for the Neanderthal people during successive winters

Much attention is being given to what people ate in the distant past as a guide to what we should eat today. Advocates of the so-called palaeodiet recommend that we should avoid carbohydrates and load our plates with red meat and fat. Its critics, on the other hand, argue that these are the same ingredients that would set us up for heart attacks. Moreover, these animal-derived foods require more space to produce on our crowded planet filled with starving humans.

A factual foundation for the debate is provided by a review of the eating patterns of early humans and how we adapted to digest starches softened by cooking. The researchers contend that it was digestible starches, rather than extra protein from meat, that provided the extra energy needed to fuel bigger brains.

But the most striking thing about human diets is just how variable they have been and the adaptations that have taken place. Furthermore, the American evolutionary biologist Marlene Zuk, in her book Paleofantasy, contends that these dietary adaptations are not fixed by what our ancestors ate in caves at some time in the past.

So are our energy, or protein, needs much different from those of other mammals of similar size? Brains demand a lot of energy, but so do the liver and the digestive tract. The extra nutrition that we need for brain work may be counterbalanced, at least partially, by a lesser need for:

  • a long gut to process poor quality foods, or
  • a large liver to handle nasty chemicals in these plant parts.

Once built, a large brain does not require extra sources of protein to maintain its activities.

My studies on the dietary requirements of savanna-inhabiting herbivores highlight how these animals must cope with the dry season when most herbage is brown and indigestible even with the aid of microbial symbionts in the gut.

But carnivores do not have this problem because the dry season is when weakened herbivores are most readily killed, especially when they concentrate around scarce waterholes.

The role of carbs among early humans

Meat has long been part of human diets, along with carbohydrates provided by fruits, tubers and grains. We can get by without it, obtaining protein from milk or, with some planning, from legumes.

The early humans that consumed the most meat were the Neanderthals, who lived in Europe many thousands of years ago but were not our ancestors. Meat formed the crucial lean-season food for the Neanderthal people during successive winters when plants were seasonally buried under deep snow, and later also for the modern humans who spread through Eurasia and displaced them around 40,000 years ago.

Unlike in tropical Africa, meat could be stored during the freezing winters of the far north to provide a reliable food source, especially in the form of large carcasses of elephant-like proboscideans.

This led to a wave of large mammal extinctions as humans spread rapidly into Australia and entered the Americas towards the end of the last Ice Age. By that time hunting technology had been honed and meat routinely supplemented plant food, but the latter remained the dietary staple for African hunter-gatherers like the Bushmen or San people into modern times.

The food journey within evolution

Coping with the intensifying dry season in the expanding African savanna was a critical issue for human ancestors during the evolutionary transition from ape-men to the first humans between three and two million years ago. How did our ape-men ancestors gather enough to eat during this time of the year, when nutritious fruits and leaves were scarce?

This was when meat, or at least the marrow left within bones, could have become a nutritional fallback, probably acquired by scavenging from animal carcasses not completely consumed by big fierce carnivores, along with underground storage organs of plants.

Obtaining this meat required more walking and hence longer limbs, hands freed to carry, security in numbers and stone weapons to throw at threatening carnivore fangs, but not much expansion in cranial capacity. These were features of the early Australopithecines.

At this early time, another branch of ape-men, placed in the genus Paranthropus, took a different adaptive route. They developed huge jaws to chew on tough plant foods extracted from underground storage organs to get them through the dry season.

The last representative of this genus faded out nearly a million years ago when this strategy eventually became unviable. About that time the lineage leading to early humans discovered cooking, or at least how to use it effectively to make starches stored by plants more readily digestible, according to the article in The Quarterly Review of Biology.

Adding this reliably found source of energy to the proteins acquired more opportunistically by hunting animals or gathering shellfish provided the means to survive through seasonal bottlenecks in food availability and build even bigger brains and the adaptations that followed.

A supporting adaptation was to store more body fat to get through the lean periods, especially among women supporting dependent offspring. This works against us now that foods supplying carbohydrates are plentiful.

The modern day dilemma

The problems we currently face are that we retain a craving for sugar, which was scarce in the past, while most of the starchy carbohydrates we eat are highly refined. This means losing out on the other nutrients in plant parts, like minerals and vitamins, and, most basically, fibre.

A meat-based diet could have a role to play for people who have a propensity to store fat by filling the gut for longer and alleviating desires to snack on sweets between meals. More important generally is the need to exercise so that we are hungry enough to consume sufficient food to provide the scarce micronutrients that we also require for healthy bodies.

The best advice is to eat lots of things: meat if you can afford it and justify its planetary costs to produce, but also all kinds of good food, as minimally refined and processed as you can obtain (apart from wines).

The ConversationNorman Owen-Smith is Emeritus Research Professor of African Ecology at University of the Witwatersrand.

This article was originally published on The Conversation. Read the original article.
