TIME Diet/Nutrition

McDonald's Is Making a Big Change to the McMuffin

Mike Blake—Reuters An Egg McMuffin meal is pictured at a McDonald's restaurant in Encinitas, Calif. on Aug. 13, 2015.

The McMuffin will soon be made with real butter

McDonald’s is going to make McMuffins with real butter.

McDonald’s is changing how it makes its biscuits, English muffins and bagels, CNBC reports, transitioning from liquid margarine to the real stuff.

Some stores are already advertising the change.

According to CNBC, one sign at a Manhattan McDonald’s location said: “We’re proud to cook breakfast items on the grill with real butter and we toast our English Muffins, biscuits and bagels with real butter too.”

McDonald’s has not responded to requests for confirmation or comment.

It is unclear how long the chain has been using real butter on its breakfast items.

[CNBC]

TIME Diet/Nutrition

How the Diets of Early Humans Explain Our Eating Habits

Danny Kim for TIME

Meat formed the crucial lean-season food for the Neanderthal people during successive winters

Much attention is being given to what people ate in the distant past as a guide to what we should eat today. Advocates of the so-called palaeodiet recommend that we avoid carbohydrates and load our plates with red meat and fat. Its critics, on the other hand, argue that these are the very ingredients that would set us up for heart attacks. Moreover, these animal-derived foods require more space to produce on a crowded planet with many starving humans.

A factual foundation for the debate is provided by a review of the eating patterns of early humans and how we adapted to digest starches softened by cooking. The researchers contend that it was digestible starches, rather than extra protein from meat, that provided the additional energy needed to fuel bigger brains.

But the most striking thing about human diets is just how variable they have been, and how many adaptations have taken place. Furthermore, the American evolutionary biologist Marlene Zuk contends in her book Paleofantasy that these dietary adaptations are not fixed by what our ancestors ate in caves at some time in the past.

So are our energy, or protein, needs much different from those of other mammals of similar size? Brains demand a lot of energy, but so do the liver and the digestive tract. The extra nutrition that we need for brain work may be counterbalanced, at least partially, by a lesser need for:

  • a long gut to process poor quality foods, or
  • a large liver to handle nasty chemicals in these plant parts.

Once built, a large brain does not require extra sources of protein to maintain its activities.

My studies on the dietary requirements of savanna-inhabiting herbivores highlight how these animals must cope with the dry season when most herbage is brown and indigestible even with the aid of microbial symbionts in the gut.

But carnivores do not have this problem because the dry season is when weakened herbivores are most readily killed, especially when they concentrate around scarce waterholes.

The role of carbs among early humans

Meat has long been part of human diets, along with carbohydrates provided by fruits, tubers and grains. We can get by without it, obtaining protein from milk or, with some planning, from legumes.

The early humans that consumed the most meat were the Neanderthals, who lived in Europe many thousands of years ago but were not our ancestors. Meat formed the crucial lean-season food for the Neanderthals during successive winters, when plants were seasonally buried under deep snow, and later also for the modern humans who spread through Eurasia and displaced them around 40,000 years ago.

Unlike in tropical Africa, meat could be stored through the freezing winters of the far north to provide a reliable food source, especially in the form of the large carcasses of elephant-like proboscideans.

This led to a wave of large mammal extinctions as humans spread rapidly into Australia and entered the Americas towards the end of the last Ice Age. By that time hunting technology had been honed and meat routinely supplemented plant food, but the latter remained the dietary staple for African hunter-gatherers like the Bushmen or San people into modern times.

The food journey within evolution

Coping with the intensifying dry season in the expanding African savanna was a critical issue for human ancestors during the evolutionary transition from ape-men to the first humans, between three and two million years ago. How did our ape-men ancestors gather sufficient food during this time of year, when nutritious fruits and leaves were scarce?

This was when meat, or at least the marrow left within bones, could have become a nutritional fallback, probably acquired by scavenging from animal carcasses not completely consumed by big fierce carnivores, along with underground storage organs of plants.

Obtaining this meat required more walking and hence longer limbs, hands freed to carry, security in numbers and stone weapons to throw at threatening carnivore fangs, but not much expansion in cranial capacity. These were features of the early australopithecines.

At this early time, another branch of ape-men, placed in the genus Paranthropus, took a different adaptive route. They developed huge jaws to chew on tough plant foods extracted from underground storage organs to get them through the dry season.

The last representative of this genus faded out nearly a million years ago when this strategy eventually became unviable. About that time the lineage leading to early humans discovered cooking, or at least how to use it effectively to make starches stored by plants more readily digestible, according to the article in The Quarterly Review of Biology.

Adding this reliably found source of energy to the proteins acquired more opportunistically by hunting animals or gathering shellfish provided the means to survive through seasonal bottlenecks in food availability and build even bigger brains and the adaptations that followed.

A supporting adaptation was to store more body fat to get through the lean periods, especially among women supporting dependent offspring. This works against us now that foods supplying carbohydrates are plentiful.

The modern day dilemma

The problems we currently face are that we retain a craving for sugar, which was scarce in the past, while most of the starchy carbohydrates we eat are highly refined. This means losing out on the other nutrients in plant parts, like minerals and vitamins and, most basically, fibre.

A meat-based diet could have a role to play for people who have a propensity to store fat by filling the gut for longer and alleviating desires to snack on sweets between meals. More important generally is the need to exercise so that we are hungry enough to consume sufficient food to provide the scarce micronutrients that we also require for healthy bodies.

The best advice is to eat lots of things: meat if you can afford it and can justify its planetary costs to produce, but also all kinds of good food, as little refined and processed as you can obtain (apart from wines).

Norman Owen-Smith is Emeritus Research Professor of African Ecology at the University of the Witwatersrand.

This article was originally published on The Conversation. Read the original article.

TIME public health

Lack of Sleep Dramatically Raises Your Risk For Getting Sick

Illustration by Sydney Rae Hass for TIME

People who sleep six hours a night or less are four times as likely to get a cold as those who sleep more than seven hours a night, a new study finds

If you want to stay healthy, skip sleep at your own risk. According to the results of a new study, people who slept six hours a night or less were four times as likely to get sick after being exposed to the cold virus compared with those who got more sleep.

The study, published in the September issue of the journal Sleep, looked at 164 healthy adults who volunteered to catch a cold for science. The researchers first equipped the volunteers with a wrist gadget to monitor how much they slept per night over the course of a week. A couple of weeks later, they brought them into the lab and injected live rhinovirus into their nose. They then quarantined them in a hotel for five days and took a virus culture from their nose each day to see who got sick.

MORE: Here’s How Hugs Can Prevent The Flu

How many hours a person slept, it turns out, was one of the strongest predictors of whether or not they got sick—even more than other factors like a person’s age, body mass, stress levels or emotional state. People who slept six hours a night or less were four times as likely to develop a cold compared to people who slept more than seven hours a night. Those who got less than five hours of sleep a night were at 4.5 times that risk.

The study wasn’t designed to figure out the link between sleep and sickness, but Aric Prather, lead author of the study and assistant professor of psychiatry at the University of California, San Francisco, had some theories. “We know that sleep plays an important role in regulating the immune system,” he says. When we don’t sleep enough, our internal environment shifts to make us less effective at fighting off a virus, he explains; studies have shown that important immune cells are increased in the blood, meaning they’re not where we really need them to be—in the immune organs like the lymph nodes—to effectively fight off viruses.

Shortened sleep also seems to alter the inflammatory response, which helps our bodies clear out viruses when it’s functioning properly, he says.

“This is really the first convincing evidence that objectively verified sleep is associated with susceptibility to the common cold, which is a huge deal for the sleep research community,” Prather says.

Sleeping more isn’t quite a cure for the common cold, but it could go a long way in protecting you from getting sick in the first place.

TIME Research

Your Kid’s Gigantic Backpack Is a Health Risk

Getty Images

Like adorable turtles, their little limbs poking out from under outsized shells, kids shuffle their ways to school bearing on their shoulders ever-heavier backpacks. Even high schoolers have to bend forward beneath books and binders to cart their cargo to and from school. They’re burdensome (and can be goofy-looking), but are they dangerous?

Yes, say many experts.

“Since at least 1998, we’ve noticed backpacks getting bigger and heavier, and not in proportion to the kids’ sizes,” says Dr. Karen Jacobs, a clinical professor at Boston University and spokesperson for the American Occupational Therapy Association (AOTA), which sponsors a school backpack awareness day. Jacobs says crowded schools and scant locker space appear to be driving the phenomenon.

A 2010 study from the University of California, San Diego, concludes, “backpack loads are responsible for a significant amount of back pain in children.” The same study says a full third of kids aged 11 to 14 report back pain. Other research from 2011 came to a similar conclusion.

“Kids are saying ‘My back hurts, my neck and my shoulders hurt,’” Jacobs says. “A heavy backpack can also contribute to headaches and problems concentrating at school.”

Like the frame of a house, the spine is what keeps your child’s body sturdy and upright. Put too much weight on this frame while a young body is still developing, and it could change a kid’s posture, compress his spine, and impair growth, says Rob Danoff, a doctor of osteopathic medicine and a certified family physician with Philadelphia’s Aria Health System. “It also might contribute to back problems or injuries when your child’s older,” Danoff says.

How heavy is too heavy? “As a general rule, research shows the backpack should be no more than 10 to 20 percent of your child’s body weight to avoid pain or potential injury,” Jacobs says. “We like to err on the side of caution and recommend 10 percent.” (Danoff’s recommendation—no more than 15 percent—falls in line with Jacobs’.)

For an elementary school child who weighs just 50 or 60 pounds, a couple of textbooks and lunch could push a pack beyond the safe threshold. For that reason, Jacobs says it’s important to check your child’s backpack every day to ensure she’s carrying only what she needs. “We’ve noticed that students are taking lots and lots of water to school with them, which is a lot of unnecessary weight,” Jacobs says. “We’re telling parents to send empty water bottles and have their kids fill them at school.”
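The percentage guideline is simple arithmetic. As an illustrative sketch (the function and its name are our own, not from the AOTA):

```python
def max_pack_weight(child_weight_lb, fraction=0.10):
    """Heaviest recommended backpack weight, in pounds.

    Defaults to the conservative 10% guideline Jacobs recommends;
    pass fraction=0.15 (Danoff's limit) or 0.20 for the looser end
    of the research range.
    """
    return child_weight_lb * fraction

# A 55 lb elementary schooler: roughly 5.5 lb of books and lunch
# at the 10% guideline, about 11 lb at the loosest 20% threshold.
print(max_pack_weight(55))
print(max_pack_weight(55, fraction=0.20))
```

A quick check like this makes it obvious why even two textbooks can put a small child over the limit.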

She also recommends positioning the heaviest items in the middle of the pack and close to your child’s back.

Danoff says proper fit and design are important to relieve pressure from your little guy or girl’s spine and shoulders. You want a backpack made for someone your child’s size, he says. Padded shoulder straps and a cushioned back will also prevent aches and pains.

Finally, for crafty parents who may be considering non-backpack options—like a small roller bag—Jacobs says some schools have already started banning rollers because they pose tripping hazards, or may litter classroom aisles or hallways in the event of a fire.

If all this is exasperating, take heart: it probably won’t be long before every text or course packet your child needs is digitized, and schools stock tablets in every classroom. At which point, we can start to panic about tech neck instead.


TIME Diet/Nutrition

3 Delicious Things to Make With Bananas (That Aren’t Smoothies)

From pancakes to wholesome desserts

You already eat bananas as a snack and toss them into smoothies, of course—but this potassium-loaded, fiber-rich fruit can do so much more. Here are three creative ideas, straight from our food director’s kitchen.

Grain-Free Banana Pancakes

Beth Lipton

Have you gone gluten-free, or are you cutting back on grains? That doesn’t mean you have to give up pancakes for breakfast. Try these this weekend (double or triple the recipe if you’re feeding a crowd):

Yield: 4 pancakes

1 ripe banana
1 large egg
Generous pinch of salt
3 Tbsp. almond meal
¼ tsp. vanilla extract (optional)
Coconut oil (for cooking)

In a bowl, mash banana well with a fork. Add egg, salt, almond meal and vanilla, if using, and stir with the fork until well combined. Melt a generous amount of coconut oil in a skillet over medium-low heat. Pour in batter (use about 3 Tbsp. batter for each pancake), spacing well in skillet. Cook for about 8 to 10 minutes total, flipping halfway through. Note: These can be tricky to flip, so resist the urge to try until the pancakes are firm on the bottom.

If cooking in batches, keep cooked pancakes warm on a plate in a 200ºF oven until all pancakes are cooked. Serve hot, with yogurt and berries or a drizzle of maple syrup, if desired.

Chocolate Banana “N’ice Cream”

Beth Lipton

Dairy- and refined sugar-free! This is a great treat for anyone, but especially if you’re lactose-intolerant.

Serves: 1

½ cup dairy-free milk of choice (I used homemade cashew milk)
1 ripe banana, sliced and frozen
3 to 4 Tbsp. raw cacao
Generous pinch of salt
1 Tbsp. coconut oil
1 Tbsp. maple syrup

Combine all ingredients in a high-speed blender and blend until smooth and very thick. Thin with more nut milk, if needed. Transfer to a bowl and freeze.

Roasted Bananas

Beth Lipton

Yep, you can roast bananas. This recipe makes for a sweet, spiced, and wholesome treat for breakfast or dessert.

Serves: 2 to 4

1 Tbsp. unsalted butter (preferably grass-fed, such as Kerrygold)
2 bananas, thickly sliced on a bias
1 Tbsp. fresh lemon juice
Pinch of salt
2 to 3 tsp. raw honey
1/8 to ¼ tsp. cinnamon
1/8 to ¼ tsp. ground ginger

Preheat oven to 400ºF. While oven is heating, place butter in an 8-inch square pan, place in oven and allow it to melt. (Watch carefully; you don’t want butter to brown.) Remove baking dish from oven. Place bananas in pan in a single layer. Sprinkle with lemon juice, then salt. Drizzle with honey and sprinkle with cinnamon and ginger. Roast until bananas are soft and lightly darkened, about 10 to 15 minutes. Serve warm or transfer to a bowl, let cool, then cover and refrigerate. Serve with plain yogurt, over oatmeal or chia pudding, or simply topped with chopped nuts or granola.

Want to know more about bananas? Read up on their surprising history and get more details about their abundant health benefits.

This article originally appeared on Health.com


TIME Research

Your Kids Should Know About the Dangers of Drinking By Age 10, Doctors Say

Getty Images

Kids should know about the dangers of alcohol before their first sip

Health care professionals should be talking to children about the risks of alcoholic drinks when they are as young as nine, according to a new report from the American Academy of Pediatrics (AAP).

“Surveys indicate that children start to think positively about alcohol between ages 9 and 13 years,” the AAP authors write in the report. “The more young people are exposed to alcohol advertising and marketing, the more likely they are to drink, and if they are already drinking, this exposure leads them to drink more. Therefore, it is very important to start talking to children about the dangers of drinking as early as 9 years of age.”

In the United States, alcohol is the substance most commonly abused by kids and adolescents. The new report says that 21% of young people say they had more than a sip of an alcoholic beverage before they were 13 years old, and 79% have tried alcoholic drinks by the time they were seniors in high school.

The study also found that 80% of adolescents say their parents are the biggest influence on whether they drink or not, which suggests parents have a role as well. “We must approach drinking in children, particularly binge drinking, differently than we do in adults,” said co-author and pediatrician Dr. Lorena Siqueira, a member of the AAP Committee on Substance Abuse. “Given their lack of experience with alcohol and smaller bodies, children and adolescents can have serious consequences — including death — with their first episode of binge drinking.”

Other research reviewed by the AAP committee suggested that continued use of alcohol at a young age can hinder brain development, lead to alcohol-induced brain damage, and increase the risk of substance use problems later on. The AAP says every pediatrician should screen their adolescent patients for alcohol use during appointments and offer preventative messaging.

The report authors focused specifically on the risks of binge drinking, which the report classifies as three or more drinks in a two-hour period for girls ages 9 to 17. For boys, it’s three or more drinks in two hours at ages 9 to 13, four or more at ages 14 to 15, and five or more at ages 16 to 17. The authors note that drinking rates increase in high school, with 36% to 50% of high school students drinking and 28% to 60% binge drinking.
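The report’s age- and sex-specific thresholds amount to a small lookup table. A minimal sketch (the helper name is hypothetical, not from the AAP):

```python
def binge_threshold(age, sex):
    """Minimum number of drinks in a two-hour period that the AAP
    report classifies as binge drinking, by age (9-17) and sex.
    """
    if not 9 <= age <= 17:
        raise ValueError("the report's thresholds cover ages 9 through 17")
    if sex == "female":
        return 3  # three or more drinks, for all girls ages 9-17
    # For boys, the threshold rises with age.
    if age <= 13:
        return 3
    if age <= 15:
        return 4
    return 5

# Five or more drinks in two hours counts as a binge for a 16-year-old boy.
print(binge_threshold(16, "male"))
```

Writing the rule out this way makes the asymmetry plain: the girls’ threshold is flat, while the boys’ threshold steps up at ages 14 and 16.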

TIME Health Care

Physicians Avoid Conversations About Religion in the ICU

Elizabeth Renstrom for TIME

Even though it's important to patients and their families

Religion and spirituality are not common topics of discussion in intensive care units (ICUs), and doctors often go out of their way to avoid them—even though religion is often very important to patients and their medical surrogates during end-of-life care, a new study shows.

In the study, published Monday in the journal JAMA Internal Medicine, researchers listened to audio recordings of 249 meetings between surrogates of critically ill patients and health care professionals in 13 different ICUs across the country. The goal was to investigate the religious or spiritual content in these talks. The researchers found that although religion was considered important to 77.6% of the surrogates (a surrogate is a family member or another person responsible for making medical decisions for a patient), conversations about religious and spiritual topics occurred in less than 20% of the goals-of-care conversations. Health care professionals rarely “explored the patient’s or family’s religious or spiritual ideas.”

When conversations about spirituality did occur in some of these end-of-life care conversations, the researchers found that 65% of the time the topic was initiated by the surrogate. Health care professionals raised the issue of spirituality only 5.6% of the time.

The types of religious conversations surrogates would bring up fell into categories such as: referencing their religious or spiritual beliefs, having the notion that the physician is God’s tool to aid in the healing of their loved one, and the idea that the end of life would be a new beginning. For example, surrogates said things like, “All I can do is pray for her to continue to get better and maybe one o’ these days, she can walk outta here.” Or, “I’m very, very optimistic because I know our faith is strong.”

The most common response among health care providers when a surrogate brought up religion or spirituality was to change the subject. In only eight conferences did a health care professional try to understand the beliefs of the surrogate by doing things like asking about the patient’s religious beliefs. “Our findings suggest that religious considerations—viewed as important to a large proportion of Americans—are often absent from end-of-life conversations,” the authors wrote. “This may signal a need for changes in health care delivery in ICUs.”

The study authors concluded that one potential solution would be to “redesign” health care processes so that spiritual care providers were a larger part of end-of-life care discussions for patients who value spirituality and religion.

In a corresponding editorial, health care professionals who were not involved in the study wrote: “Although we health care professionals struggle to connect spirituality and medicine as evidenced by the many and mounting articles that refute or explicate their connection, our patients and families typically do not struggle. For most, thoughts of what is most sacred, of what transcends the finitude of human life, come flooding in the moment the physician shares the news of the serious illness or the telephone call comes urging the listener to the bedside of a critically ill loved one.”

The new study suggests that religion and spirituality may be a conversation that people want to have at the end of life, and they are not getting it from their health care providers. Finding a solution for this discrepancy could be in patients’ and health care professionals’ best interest, the editorial said.

TIME Infectious Disease

Blue Bell Ice Cream Slowly Returns to Stores

The ice cream is making its way back into select markets after spurring a listeria outbreak

Blue Bell Creameries resumed delivering ice cream to select regions on Monday, several months after shutting down production when Blue Bell products were identified as the source of a multi-state listeria outbreak.

The first deliveries of ice cream were made early Monday morning in Brenham, Texas, where a local NBC News station says freezers were well stocked with Blue Bell ice cream.

Blue Bell production was shut down earlier this year after it was determined that Americans were getting sick with listeriosis from consuming Blue Bell products. The multi-state outbreak hospitalized 10 people, and three people died. In April, Blue Bell recalled all of its products.

MORE: How Ice Cream Gets Contaminated—and Sometimes Kills

In early August, Alabama approved Blue Bell’s request to resume production and delivery. Since the company still only has limited production capacity, it plans to re-enter 15 states in five phases. Monday began the first phase, which includes distribution to the Brenham, Houston and Austin, Texas areas and Birmingham and Montgomery, Ala.

“Over the past several months we have been working to make our facilities even better, and to ensure that everything we produce is safe, wholesome and of the highest quality for you to enjoy,” Ricky Dickson, vice president of sales and marketing for Blue Bell, said in a statement.

Blue Bell has announced where it will be distributing next.

TIME Food

Lawsuit Accuses Nestlé of Using Slave-Caught Fish in Fancy Feast

Elise Amendola—AP Fancy Feast cat food cans are photographed in Boston on March 19, 2015.

California residents brought a class-action lawsuit

A class-action lawsuit filed by California residents claims that Nestlé purchases fish from a Thai supplier known to use slave labor—and uses that fish in Fancy Feast cat food.

The suit was brought by consumers who say they would not have bought the product if they had known it had ties to slave labor, according to Bloomberg. Their lawyer says that “By hiding this from public view, Nestlé has effectively tricked millions of consumers into supporting and encouraging slave labor on floating prisons.”

Nestlé would not comment specifically on the suit, but told Bloomberg that it was working with an NGO “to identify where and why forced labor and human rights abuses may be taking place” in the region, and that forced labor “has no place in our supply chain.”

[Bloomberg]

TIME Diet/Nutrition

How to Tell If Your Grass-Fed Beef Is Real

Getty Images

A range of practices and labels persists amid a lack of regulation

When you buy a pound of hamburger in the grocery store, you’re likely to be bombarded by an incredible assortment of labels. With all-natural, grass-fed, free-range, pastured, sustainably sourced, and certified organic options to choose from, it’s not easy to parse which beef is actually the best.

In recent years, demand for grass-fed beef has grown rapidly, thanks to the popularity of high-protein diets and growing consumer awareness about the overuse of antibiotics on farms and other related concerns. Grass-fed beef is also seen as nutritionally superior to its corn-fed counterparts, thanks to the omega-3 fatty acids that cows ingest when they graze on clover and other grasses. Grass-fed burger chains are popping up all over the country, and even Carl’s Jr. began offering a grass-fed burger earlier this year.

But what exactly do we mean when we say “grass-fed”? And is all grass-fed beef the same?

It’s All in the Finishing

“All cattle are grass-fed at one time in their life, until most end up in a feedlot where they’re finished on grain,” says Texas rancher Gerry Shudde. Indeed, most cows spend at least six months eating grass before they are “finished,” or fattened up, on grain. The National Cattlemen’s Beef Association puts that number at 12 months, but most grain-finished beef cows don’t live beyond 18 months.

According to Nicolette Hahn Niman, a rancher and the author of Defending Beef, the real number likely falls somewhere in the middle. “On average, the cattle in the U.S. that is going through feedlots is slaughtered at 14-16 months,” she says. “They do grow fatter and faster if they’re being fed grain, so they’re going into feedlots at younger ages to shorten that time as much as possible.” In a feedlot environment, grain causes cows to put on about one pound for every six pounds of feed they eat. In contrast, grass-fed cows are slaughtered anywhere between 18 and 36 months.

“When you keep cattle on grass their whole lives, and truly have them forage for a diet that their bodies have evolved to eat, you allow them to grow at a slower pace,” says Niman. Not surprisingly, caring for the animal for so long can be expensive for ranchers and consumers.

Many informed eaters will tell you that this slower process results in a signature flavor and distinct leanness that set grass-fed beef apart from its corn-fed counterpart, but the fact is that beef producers can label their product “grass-fed” even if the animal is fed grain over the course of its lifetime. Unlike the lengthy auditing process involved in the U.S. Department of Agriculture’s (USDA) organic certification, the use of “grass-fed” is only regulated under the agency’s “marketing claim standards.”

According to these standards, grass-fed cows are supposed to be given continuous access to rangeland, and they cannot be fed grains or grain by-products. In the event of drought or other “adverse weather conditions,” farmers are allowed to bend these rules if the animal’s wellness is in jeopardy, but they must maintain meticulous records. Unfortunately, these regulations are, for the most part, a paper tiger.

Missing Oversight

Marilyn Noble of the American Grassfed Association argues that beef producers have little incentive to stick with those rules. “It’s a big issue, and there is a lot of misunderstanding. The Agricultural Marketing Service developed the grass-fed standard, but the Food Safety and Inspection Service actually enforces it,” says Noble. “The two organizations, even though they’re both part of the USDA, don’t communicate especially well. You see a lot of beef labeled as ‘grass-fed,’ but whether or not it actually meets that standard is questionable.”

Noble’s skepticism is rooted in the fact that, for the most part, the USDA allows producers to determine whether or not their beef meets the grass-fed beef marketing claim standard. Noble says farms “self-certify” their own beef, and the Food Safety and Inspection Service generally goes along with their claim. The ubiquitous “naturally raised” label on meat has no enforceable meaning either, and further muddles a consumer’s ability to find beef that has been exclusively raised on pasture.

The American Grassfed Association, established in 2003, has far more stringent standards for its own label than the USDA, and hires third-party auditors to inspect the farms of its 100-plus certified producers across the country each year.

Farmers’ markets are also often full of vendors offering grass-fed beef from their own pastures. And the rising popularity of meat CSAs and whole animal buying clubs is an indication of how dramatically this trend has grown in recent years. With these options, consumers can talk directly to farmers to find out how their beef was raised. Many of these producers have begun using the term “pasture raised,” another unregulated labeling term that is popular among ranchers.

Even Whole Foods has adopted some of this farm-to-market language in its meat sourcing standards. For example, “pasture-centered” farms score a 4 out of 5 on the grocer’s Animal Welfare Rating scale (owned by Global Animal Partnership). In reality, Niman says, these animals may not be doing much of the foraging that gives grass-fed beef its nutritional benefits.

“[Whole Foods] has been encouraging this segment of beef in the marketplace where animals are roaming on a small area with vegetative cover,” says Niman. “But they’re being provided feed, and not actually getting most of their nutrition from foraging. It’s almost like a feedlot.”

At BN Ranch, which Nicolette operates with her husband, Bill Niman, “the godfather of sustainable meat” and founder of Niman Ranch, cattle are given more than two years to slowly develop fat. For the Nimans, good “eating quality” in the beef is paramount. But, Nicolette says, that’s not always the case on farms where people are “doing it for philosophical reasons. They believe that grazing is ecologically superior, and that it is the right way to raise cattle. The things that are motivating them are not eating quality.”

As a result, grass-fed beef’s lean flavor is often seen as inferior. Some chefs, particularly in fine-dining steakhouses, still resist serving grass-fed beef in favor of corn-fed, USDA prime beef, because of its fat content.

Worth the Wait

Michael Sohocki, chef of Restaurant Gwendolyn in San Antonio, Texas, chooses grass-fed beef over the cheaper, richer, corn-fed cuts because he firmly believes that the process is worth the extra time and money. And his discerning diners come to his restaurant because they know the meat has been properly sourced. “When you eat stockyard beef, all of that beef is the same,” says Sohocki. “It’s done that way to guarantee its consistency. That’s what McDonald’s specializes in.”

Sohocki calls grass-fed beef “the only trustworthy product left in this world.” He sources it from nearby Shudde Ranch, where Jeanne and Gerry Shudde make a point of raising a specialized cross-breed of cattle suited to naturally developing fat on pasture.

“Our [cows] are on grass when they’re with their mother. And when separated, they stay on the grass,” says Gerry Shudde.

The Shuddes decided to go grass-fed by chance after acquiring a herd of Longhorn cattle that they planned to cross-breed with their own. The offspring did not fare well, but the Shuddes ultimately decided to keep the longhorn cows. When they butchered a six-year-old cow, which had been raised on grass for much longer than usual, Jeanne says, “It was really tender. We thought ‘gosh, this tastes better than what we get in the grocery store.’”

From there, the Shuddes developed their own, new breed of grass-fed cattle. They were already raising cows without antibiotics or hormones, and their farm eventually evolved into a completely grass-fed operation by 2002. Still, they had to find the right cow to produce the quality of beef that they desired. “Most of the animals that you find today have been genetically selected to do well in a feedlot environment,” says Jeanne. “If you take them and put them on grass and think they will [taste good], I’d say maybe, maybe not. But if you take an animal that is genetically survival-oriented, it will become well-marbled on grass.”

Their own cows are now a cross between that original herd of Longhorn cattle and a heritage Devon bull. “Our belief is that if they eat what they evolved to eat, and live in the way that they have evolved to, the nutrition for the animal’s survival will be there,” says Jeanne. “If the nutrition is there, humans will get that nutrition when we eat the meat.”

This article originally appeared on Civil Eats

