TIME public health

1 in 3 People Worldwide Don’t Have Proper Toilets, Report Says

Jay Directo—AFP/Getty Images A cleanup volunteer scoops plastic waste at an open sewer in Manila on May 4, 2015. Nongovernmental environmental groups are calling for national legislation to prevent the plastic waste that clogs waterways.

Lack of proper sanitation facilities increases risk of waterborne diseases

About 2.4 billion people — or roughly one-third of the world’s population — still lack access to proper toilets, according to a report published Tuesday by the World Health Organization and UNICEF.

The study warns that progress on sanitation is falling short of the targets outlined in the U.N.’s Millennium Development Goals, even though significant improvements have been made in related areas, including access to safe drinking water. Today, only 68% of the world’s population has access to proper sanitation facilities, nine percentage points short of the 77% target. Many of those who lack proper toilets and defecate in the open live in South Asia and sub-Saharan Africa, according to the report.

“Until everyone has access to adequate sanitation facilities, the quality of water supplies will be undermined and too many people will continue to die from waterborne and water-related diseases,” said WHO public-health director Dr. Maria Neira in a statement.

The U.N. is expected to outline new Sustainable Development Goals in September, with a goal of expanding sanitation facilities and eliminating open defecation by 2030.

TIME Diet/Nutrition

This Kind of Food Is Why America Is So Fat, Study Says

Elizabeth Renstrom for TIME

More calories in our food supply means more overeating

Worldwide, countries are dealing with a serious obesity problem. In the U.S. alone, more than two-thirds of adults are overweight or obese. Now a new study suggests it likely has a lot to do with the makeup of our food.

The new study, published in the Bulletin of the World Health Organization, looked at both obesity rates and the supply of energy-dense (meaning high-calorie) foods in 69 countries, and found that both body weight and available calories had increased in 56 of those countries since 1971.

The finding was especially notable in high-income countries. “This suggests that, in high-income countries, a growing and excessive food supply is contributing to higher energy intake, as well as to increasing food waste,” the authors write. In the U.S. alone, the food energy supply went up by 768 calories per person between 1971 and 2008.

A wide reduction in physical activity may also be a contributing factor, the authors note. However, the surplus of available calories is likely leading people to overeat, which in turn is adding pounds for many people. Other factors, like pollution and gut bacteria, should also be studied further to understand how they may contribute to weight gain, the researchers argue.

To combat the problem, the researchers argue that comprehensive approaches will be necessary. For instance, nationwide policies should restrict the marketing of unhealthy food to young people, and more packaged foods should carry front-of-package nutrition labeling.

As always, eating fresh foods rather than processed ones and exercising regularly are two healthy habits worth adopting.

TIME employee benefits

Why Employers Are Offering More Generous Benefit Packages

Doctors Seek Higher Fees From Health Insurers
Adam Berry—Getty Images

Fewer workers are using nap rooms

Employers are offering more generous benefit packages, primarily driven by improvements in health care coverage.

Of the 402 human resource departments that responded to a survey from the Society for Human Resource Management, 35% said they were improving their benefits packages in 2015 and 58% said they were keeping them the same. Last year, only 28% of respondents said they were improving their packages.

Over the past five years, employers have especially improved mental health coverage, with 91% offering it in 2015, up from 82% in 2011. Over the same period, contraception coverage also expanded, with 83% of employers offering it this year, up from 69%, and the share offering critical illness insurance rose from 22% to 34%. Health savings accounts, which are tax-deductible accounts for medical expenses, have also seen an uptick, with usage rising 8% in the past five years.

As health care expenses rise for the majority of companies, many are offering preventive health benefits in order to tamp down spending in the long run, the study found. Those include health and lifestyle coaching, wellness programs and smoking-cessation programs, among others. Meanwhile, nap rooms, which were counted as a preventive measure in the study, logged a 4% decrease in the past five years.

In terms of leave benefits, paid maternity and parental leave has increased in prevalence in the past five years. Flexibility is on the rise too: 56% of employers this year reported allowing workers to telecommute on an ad hoc basis, compared with 42% in 2011.

According to the study, employers sink much of employee compensation into benefits that workers don’t notice in their paychecks: almost a third of private-industry employee compensation came in the form of benefits in 2014. “In an environment with limited compensation growth in most sectors of the U.S. economy, a competitive benefits package can make the difference in attracting top talent to an organization,” the study said.

TIME Research

Your Diet May Be Causing Your Urinary Tract Infections

Getty Images

A new study reveals that factors related to diet might play a part in urinary tract infections

Tough-to-treat urinary tract infections (UTIs) that are resistant to antibiotics are on the rise. Now, in a new study of human urine published in the Journal of Biological Chemistry, researchers say they’ve discovered why some people are more prone to the infections than others. Intriguingly, diet may have something to do with it.

Early on in an infection, cells produce a protein called siderocalin that blocks bacterial growth, including the growth of E. coli that often causes UTIs, says Jeffrey P. Henderson, MD, PhD, assistant professor of medicine at Washington University School of Medicine in St. Louis and senior author of the study. (It does this by keeping iron away from the bacteria, which need it to thrive.) The researchers wanted to see how the protein worked differently in various samples of urine at restricting the growth of E. coli, so they analyzed the urine from about 50 men and women.

“We found, kind of to our surprise, that there was a really wide range between individuals and how well this protein worked, just depending on that individual’s urinary composition,” says Henderson.

Two common factors emerged in urine with a better ability to resist bacterial growth: it had a high pH (one that’s more alkaline, in other words) and higher levels of certain metabolites formed by gut microbes. Those metabolites aren’t made by human cells, Henderson says; rather, they come from the diet or are produced by gut bacteria from dietary sources. “It looks like this protein that’s part of your immune system is able to use metabolites in the diet as grips to hold onto iron and keep it away from pathogenic bacteria,” Henderson says. In some people, that system is set up really well, he says, but in those who get recurrent UTIs, it doesn’t seem to work as well.

Both urine pH and metabolite production may be able to be changed through diet, and doing so could potentially offer a treatment strategy in the future, he says. “It may be that we have to adjust multiple things at the same time to get the system to work well, but the appealing part is this is not an antibiotic strategy,” he says. “It may allow you to keep your normal flora while keeping bacteria out of the urinary tract.”

Physicians already know how to raise urinary pH with things like calcium supplements, and alkalizing agents are already used in the U.K. as over-the-counter UTI treatments, Henderson says. Knowing how to encourage the metabolites is trickier. The molecules come from phenolic, or aromatic, compounds, Henderson says, and robust food sources include those that we more often hear are rich in antioxidants: coffee, tea, colorful berries, red wine and dark chocolate.

And yes: cranberries, too, are known to make urinary aromatics, which may be why cranberry products are so often used as UTI remedies, Henderson says. “One thing this suggests is that maybe the reason it’s not more effective is that people need both cranberries and a higher urine pH, or they need cranberries and appropriate inhabitants of their intestine, or the right microbiome composition in their gut, for the cranberry part to work properly.”

A treatment without antibiotics would be a boon, but it’s likely a several-pronged approach and for now, more research is needed. “We still have a few more details to iron out before we know exactly how to do that.”

TIME diseases

West Nile Virus Found in New York City Mosquitoes

Getty Images

The first of the season arrived in Staten Island and Queens

The New York City Health Department has detected the West Nile Virus in the city’s mosquitoes for the first time this summer, though no human cases have been reported yet.

The mosquitoes were found in Glen Oaks, Queens, and New Dorp Beach, Staten Island. The city plans to set up more traps and apply larvicide in affected areas.

“The most effective way to keep mosquito populations low is to remove standing water from items like buckets, gutters, planters, or any other receptacles that might be outdoors. New Yorkers are also encouraged to wear mosquito repellent and cover their arms and legs if they’re outside at dawn or dusk in areas with mosquitoes,” Health Commissioner Dr. Mary Bassett said in a press release.

West Nile can cause neurological disease and flu-like symptoms, though not all those bitten become sick.

TIME public health

California Governor Jerry Brown Signs Mandatory Vaccine Law

Law abolishes exemptions for personal beliefs

California Governor Jerry Brown signed a mandatory school vaccination bill into law Tuesday, abolishing the “personal belief” exemption that many parents use as a loophole to avoid vaccinating their children.

Now, under California law, which is among the strictest in the country, children will not be able to enroll in public school unless they have been vaccinated against diseases like measles and whooping cough. The law includes an exemption for children who have a medical reason to remain unvaccinated (like an immune system disorder) and can prove it with a doctor’s note. Parents who decline to vaccinate their children for personal or religious reasons will have to home-school them or send them to a public independent-study program off school grounds.

Students who are unvaccinated because of “personal belief” who are already in public elementary school can stay until they’re in 7th grade, and then the parents will either have to vaccinate them or home-school them. Daycare students can stay until kindergarten, when they have to be either vaccinated or home-schooled. In the fall of 2014, almost 3% of California kindergartners were unvaccinated because of personal belief. Preschools in the most affluent areas are also the least likely to vaccinate, according to the Los Angeles Times.

The bill was proposed after a measles outbreak at Disneyland infected more than 150 people, many of whom needed to be hospitalized. Supporters of the law argue that it is based on medical consensus that vaccinations improve public health. Opponents, who have been picketing outside the California legislature, argue that it’s an attack on personal freedom.

TIME public health

Here’s Where You’re Most Likely to Own a Gun in the U.S.

U.S. citizens own more than 270 million guns

Americans love their guns. U.S. citizens own more than 270 million of them, nearly 9 for every 10 Americans. No other country even comes close to matching that rate.

Now, a new study in the journal Injury Prevention shows just how much gun culture varies within the U.S. In Alaska, the state with the highest rate of gun ownership, more than 60% of residents own a gun. In Delaware, the state with the lowest rate, 5% of residents own a gun. Overall, one-third of American adults own a gun. Public health researchers say this information could help inform how to reduce gun violence.

“When you look at different states, you see a wide variation in these rates, and it mirrors the gun death rate,” said Bindu Kalesan, an assistant professor at Columbia University’s public health school.

Idaho, West Virginia, Wyoming and Montana round out the list of states with the highest gun ownership, following Alaska. Delaware, Rhode Island, New York, New Jersey and New Hampshire are the only states where fewer than 15% of residents own guns, according to the research.

Read More: Gun Fatality Rates Vary Wildly By State, Study Finds

The study also found a strong correlation between gun ownership and living in a so-called social gun culture. In such cultures, friends and family tend to own firearms and community members may attend gun-themed social events. Nearly 8% of respondents even said that their social life with family members involves guns. The correlation may suggest a way to reduce gun deaths and injuries outside of federal lawmaking, which has proven ineffective, Kalesan says. Instead of focusing on big policy changes, public health advocates may want to focus on changing gun culture.

“We need to think about strategies for social change like we did with tobacco,” she said.

Gun violence remains a leading cause of death and injury in the U.S. More than 30,000 people are killed in firearm-related incidents each year, according to the Centers for Disease Control and Prevention (CDC). And while a third of those deaths are the result of homicide, many more are the result of suicide and accidents. More than 200,000 people are injured each year.

But treating gun violence as a public health issue is going to be an uphill battle, even without relying on the federal government for new laws. Research dollars, for instance, are nearly impossible to come by, Kalesan said. The CDC, a huge source of money for health research, hasn’t given out funding for gun violence studies, and other funders are similarly reluctant to touch such a controversial issue.

TIME vaccines

Why Jerry Brown Was Right to Sign the California Vaccine Bill

Bad choice: Anti-vaxxers protesting the California vaccine bill
Rich Pedroncelli—AP

Jeffrey Kluger is Editor at Large for TIME.

The governor had a chance to protect thousands of children—and he did

Updated: June 30, 2015, 2:32 PM EDT

California does not often make common cause with Mississippi and West Virginia. America’s blue-red divide doesn’t come any wider than it does between the liberal laboratory of the Pacific West and the conservative cornerstones of the Old South. But with a single signature on a single bill, California Gov. Jerry Brown ensured that the most populous state in the nation joined the two far smaller ones in what ought to be a simple, primal mission: keeping children healthy.

The law, which passed the California legislature with bipartisan majorities, does a straightforward job—removing the religious and personal belief exemptions that allowed parents to refuse to vaccinate their children. The legislation leaves standing the medical exemption—the waiver families receive when a child has a manifest medical condition like a compromised immune system that would make vaccines dangerous. Under the new rules, families without the medical waiver face a choice: get your kids the shots or prepare to home-school them, which ensures they get an education but protects other children from whatever pathogens they may be carrying.

Mississippi and West Virginia are the only other states in the country that currently have such no-nonsense rules, and they’ve got the stellar vaccination rates to prove it: fully 99.9% of those states’ kids are up to date on all their shots. California was right to follow the example of those southern-fried smarts. Only 90.4% of the Golden State’s kindergartners had their full complement of vaccinations in the 2014–2015 school year. The worst offenders are the parents in the too-rich, too-famous, too-smart-by-half provinces of Silicon Valley, where vaccination rates in some day care centers struggle to crack the 50% mark.

That matters—a lot. When vaccine coverage falls below 95%, communities begin to lose what’s known as herd immunity, the protection a fully inoculated population provides to the relative handful of its members who can’t be vaccinated. California has suffered the consequences of that, with outbreaks of whooping cough and mumps across the state. Earlier this year, more than 100 cases of measles in California and Mexico were traced to a single unvaccinated visitor to Disneyland. That outbreak, at one of the state’s most iconic destinations, at last got Sacramento’s attention, and the new law, though hotly debated, passed.

Brown was vague at first about whether he would sign the bill and that left a lot of health policy experts worried. He had signed an earlier bill that preserved the personal belief exemption but at least made it harder for families to claim one. No longer could parents simply check a box on a form—an awfully easy thing to do without giving the matter much thought. Under the previous law, they would have to visit a health care provider who would sign a statement confirming that the parents had been informed of the benefits (too many to enumerate) and the risks (vanishingly small) of vaccination. Once they’re in the doctor’s office, plenty of parents come around. But Brown, a one-time Jesuit seminarian who has made no secret of his spiritual side over the years, carved out an exception in that law for religious beliefs.

He was right not to make the same mistake this time. There was a time when religious exemptions were no cause for worry. The share of Americans whose faith forbids vaccinations is exceedingly small, and as long as the herd remained intact, those kids would remain safe. But that was before the nonsense factory of the anti-vaccine community went into operation, churning out all manner of misinformation about autism and brain damage and big pharma conspiring with big government to inject unsuspecting children with toxins. The result: Vaccine rates have plummeted nationwide, and children have paid the price.

The tension between religious liberty and civic responsibility is hardly a new issue in the American system. If your religion does no harm to anyone else—least of all kids—you ought to be free to practice it in peace. But if that faith requires prayer to treat pediatric cancer or laying on of hands as a cure for severe pneumonia, the state ought to be able to intervene and provide proper care if you won’t and prosecute you if your child is injured or killed. In some states that’s indeed possible but in others it’s not, and a complex patchwork governs the level of care each state will or won’t mandate.

Mandatory testing for lead levels in blood? OK in most places, but not if you live in Delaware, Maine, Kansas, Illinois, Massachusetts, New Jersey and Rhode Island, where religious exemptions are available. Mandatory eyedrops to help prevent blindness in newborns? An important preventive for kids born to mothers with certain kinds of STDs—but they may be out of luck if they’re born in Colorado, Delaware, Florida, Idaho, Iowa, Maine, Michigan, Minnesota, Nevada, or Pennsylvania.

The kids, it’s worth noting, did not choose to be born in states with weak protections. And they don’t choose either to be born to parents who look at vaccines and see in them something sinister or dangerous or strangely unholy.

Anti-vax parents came into a world of medically rational adults who had seen the wages of polio or diphtheria or smallpox or whooping cough and were grateful for a preventive that could eliminate those horrors. Jerry Brown himself came into that world too. Contemporary children deserve the same kind of wisdom and the same kind of care the grown-ups around them enjoyed. And California children deserve a governor who will see to it that they get it.

Today Brown lived up to that responsibility.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME medicine

How This Common Drug Can Have Lasting Effects on Kids

Antibiotics are prescribed for a range of childhood ailments, from ear to throat infections. But the drugs may be changing kids’ health in potentially unwelcome ways

In a study published in Nature Communications, scientists document the possible long-term effects of antibiotics when they’re used early in life. Their study involved mice, but the team used the drugs in doses and treatment regimens that mimic those frequently administered in young children.

Dr. Martin Blaser, professor of medicine and microbiology at New York University Langone Medical Center, and his colleagues tested three different antibiotic regimens: one involving amoxicillin, another involving macrolides and a final one that combined the two. They compared these animals to mice that received a placebo. The mice got antibiotics 10 to 15 days after birth, then again 27 days later and finally after day 39. They lived for 160 days, at which point they were sacrificed and their gut bacteria were studied.

MORE: Here’s What Eating Nothing But McDonalds for 10 Days Does to Your Gut Bacteria

Compared to the mice taking the placebo, the antibiotic-treated animals had less diverse communities of bacteria, and the proportions of the bugs living in their guts were also different. The macrolides seemed to have the biggest effect on reducing microbial richness, while amoxicillin led to abnormally large bones. The changes in the microbiome persisted even to the animals’ death, nearly four months after their last antibiotic dose.

“There are really long-term, probably permanent effects on the microbiome from antibiotics,” says Blaser. “We showed changes in the richness and the community structure, and also the genes present in the bacteria.”

MORE: Antibiotics Before Age 2 Increase Risk of Childhood Obesity

What this means for humans still isn’t clear from this study, but the findings do provide hints. Other studies that have analyzed the potential effects of antibiotics found that children receiving more rounds of the drugs because of early infections tend to be heavier and are more likely to be obese as adolescents and adults. And the earlier children are exposed to the drugs, the more likely their metabolism is to be affected.

Blaser notes that antibiotics are a necessary and potentially life-saving treatment for some, but for many infections, their risks might be greater than their benefits. “If what we found in mice is true for human children, then this is yet another reason to be cautious in using antibiotics,” he says. “We know there are kids who are severely ill who must have antibiotics. But there is a larger number of kids who are only mildly ill. The question is, what proportion of them really need antibiotics?” Based on the animal data, he says, the first two to three years of life are particularly important for development, and doctors and parents should be judicious about prescribing antibiotics during this sensitive time.

TIME Healthcare

This Vitamin May Be Behind Your Acne Problems

It can be found in your burgers and cheese

Vitamin B12 is notably found in beef, dairy and some fish. It’s been used to improve memory and combat anemia. Now, according to a study just published in Science Translational Medicine and reported by The Verge, it may be linked to acne. It’s still early, so researchers don’t want everyone freaking out and nixing burgers and cheese from their diets, but it’s important to note that B12 changes how the genes of facial bacteria behave, a shift that aids inflammation. The vitamin has been connected to acne in studies since the 1950s, but the researchers say that evidence was mostly anecdotal.

“It has been reported several times that people who take B12 develop acne,” Huiying Li, a molecular pharmacologist at the University of California, Los Angeles, and a co-author of the study, told The Verge. “So it’s exciting that we found that the potential link between B12 and acne is through the skin bacteria.”

Acne is still largely a mystery to researchers, even though 80 percent of teens and young adults have to deal with the pesky skin condition. An oily secretion known as sebum and faulty cells that line hair follicles play a role, but Li and her team wanted to see how bacteria factor into acne development.

The study found, in a small group of people, that those who take B12 develop high levels of the vitamin in their skin (which sounds like a good thing), but that the skin bacterium Propionibacterium acnes then lowers its own production of B12, causing an imbalance. The bacteria instead produce more porphyrins, naturally occurring chemicals in the body that are known to induce inflammation, which is where acne begins.

Li says that the “main message is that skin bacteria are important.” But until other researchers confirm the link between B12 and acne in a larger number of people, dermatologists won’t really be able to make clinical recommendations one way or the other. “I don’t want people to misinterpret the results by not taking B12,” she says.

Let’s just drink more water and eat more berries until we know for sure what’s going on.

This article originally appeared on MIMI
