On May 23, U.S. Surgeon General Dr. Vivek Murthy issued an advisory warning about the impact that social media is having on the mental health of young people.
“I issued this advisory because this is an urgent crisis,” Murthy tells TIME. “In the effort to maximize the benefit and minimize the harms of social media on children, we have not made enough progress. As a consequence, I worry about the mental health and well-being of our children.”
In a conversation with TIME, Murthy discusses how parents, policy makers, researchers, and technology companies can and should come together to make social media platforms safer for children. (This interview has been condensed and edited for clarity.)
TIME: Why do you think social media’s impact on young people is so concerning?
Since becoming Surgeon General, I have focused primarily on mental health and well-being, which I see as the defining public health crisis of our time.
And youth are a particular point of concern. As I've traveled around the country and talked to families about mental-health concerns, the No. 1 question I get from parents is about social media: "Is social media safe for my kids?" And many kids raise the same concerns. At roundtables I've had with middle-school students, high-school students, and college students, they often proactively bring up social media.
The three things they have told me most consistently are: 1. That social media often made them feel worse about themselves; 2. That it made them feel worse about their friendships; and 3. That they couldn’t get off of it. As one student told me, “I feel great during the day, then take out my phone and get on social media and see all of these people doing things without me, or accomplishing incredible things—having incredible bodies and living incredible lives—and suddenly I feel worse about myself.” It’s a common theme.
The reason I issued the advisory is to answer the question that so many parents have been posing to me about social media.
What does your report conclude about social media and youth mental health?
After reviewing the available evidence, including publicly available research, published data, and consultations with independent experts, we reached two conclusions: first, that there isn't enough data to say that social media platforms are safe for kids, and second, that there is growing evidence that social media use is associated with harms.
Do policy makers and technology companies have a responsibility to ensure that their platforms are safe for children?
I 100% see this as a responsibility for policy makers and technology companies. Any company that produces a product consumed by kids has a fundamental responsibility to ensure it is safe for children, and that it helps rather than harms them.
We don't ask parents to inspect the brakes on cars that children will ride in, or the ingredients in medications that children take, or to conduct chemical analyses of the paint used in toys made for children to make sure they are safe. Instead, we set standards and enforce them, usually through government, to make sure that manufacturers meet them.
That's what is missing here. We can't have technology companies set their own standards; we don't do that in any other sector where kids' well-being is at stake. But that's largely what has been happening over the past 20 years.
What are some specific standards that policy makers can set for social media use among children?
We need to strengthen protections for kids through safety standards, especially by protecting kids from exposure to harmful content. Too many kids are exposed to sexual and violent content, as well as harassment and abuse online. That should not be happening.
We can take a page from the safety standards applied to other products for children, and those standards should include rules around age. While 13 is the minimum age most platforms use to allow users to join, we should keep in mind two things. First, it's terribly enforced: 40% of eight-to-12-year-olds are on social media. Second, 13 [years old] did not come from a health assessment of the appropriate age for kids to be on social media. It came from COPPA [the Children's Online Privacy Protection Rule], a law that set the age below which children's data cannot be collected and shared without parental consent. We need to understand at what age it is appropriate for a child to start using these platforms.
Are there data to inform at what age children can safely start using social media?
That’s another thing that standards set by policy makers can do: ensure that technology companies share data that’s relevant from their platforms. I hear from researchers all the time who are not able to get full access to the data they need to fully understand the impact platforms are having on children. As a parent myself, I don’t want to feel that there is information that is hidden from me about the impact products my kids are using may have on their mental health and well-being.
Should standards also include restrictions on certain types of content for younger users?
Effective standards would protect kids from harmful content. And these standards not only need to be set but need to be enforced. It’s important for parents and kids to be at the table to help inform how these standards are shaped.
These platforms have been designed to maximize how much time kids are spending on them. One thing new standards can do is to minimize the features that lead to excessive use, especially among younger children.
I acknowledge that companies are trying to take steps to make platforms safer, but it’s really not sufficient. Time matters. Children only have one childhood, and every day, every month, every year matters in the life and development of a child.