Instagram and TikTok are failing users with eating disorders
Dr. Jason Nagata has seen it happen time and again. As an assistant professor of pediatrics at the University of California, San Francisco, he treats teenagers who’ve been hospitalized because of their eating disorders. Even as patients lie in their hospital beds, he says, many still post and share dieting and weight loss content on social media.
“People with eating disorders can get trapped in a vicious cycle of content related to disordered eating and weight loss,” says Nagata. In cases like these, he says, hospital staff may need to take away a patient’s access to social media to help with their recovery.
For years, social media platforms like Instagram and parent company Meta (formerly Facebook) have been criticized for harming users' mental health, particularly among younger audiences. Use of these sites has also been linked to negative body image and disordered eating, due to the abundance of idealized, heavily edited images that reinforce unrealistic beauty ideals. TikTok, the ByteDance-owned social video site that's exploded in popularity, has also been criticized for surfacing pro-eating-disorder videos to teens.
(Note: This story is based on Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations, including CNET.)
Over the last several months, TikTok and Instagram have released features and resources designed to support people with eating disorders. But more recently, Instagram's efforts have been overshadowed by internal documents known as the Facebook Files, which were first reported by The Wall Street Journal. These documents include results from the social network's own research into the mental health impact of its platforms. Among its findings, the company noted that 33% of Instagram users and 11% of Facebook users think the platforms make their own body image issues worse. Additionally, more than half of Instagram users reported body dissatisfaction.
“Overall,” Meta concludes in one document, “there is substantial evidence to suggest that Instagram and Facebook use can increase body dissatisfaction.”
Following The Wall Street Journal’s publication of Meta’s internal reports on teen mental health, which had been leaked by Frances Haugen, a former Facebook product manager, the social network publicly shared annotated versions of its research decks. Regarding data on body image issues for teen girls, Meta wrote that the responses reflect how people “already experiencing hard moments” felt about Instagram, rather than the general population of teenage Instagram users.
Still, Meta’s findings have experts like Renee Engeln, a psychology professor at Northwestern University, concerned. Issues around eating disorders and body image are “much too broad to be solved by a link to some resources,” she said. “People aren’t suffering for lack of a link to resources. When the whole culture is just a big toxic soup, it’s not gonna solve our problems.”
Eating disorders and body image issues have become more prominent during the COVID-19 pandemic, Engeln says. One study from the University of Pennsylvania published in November found that the number of people hospitalized for eating disorders in the US doubled during the pandemic. Other research found that those who had an eating disorder before the pandemic saw it become worse.
In addition to fostering a loss of social support and daily structure, lockdowns changed many of our eating and exercise habits — all while we spent more time indoors scrolling through our phones, Engeln says.
“If you were tiptoeing around the edge of an eating disorder, COVID might have nudged you over that line,” she said.
The power of algorithms
Experts believe algorithms on sites like TikTok and Instagram can play a key role in worsening negative body image. If someone’s struggling with their weight, for instance, they tend to seek out information related to weight loss, which signals to the platform that they want more of that content. As a result, feeds become filled with harmful posts that play to those insecurities.
“The algorithms are doing their best to optimize, and they’re not necessarily optimized for a person’s well being,” says Jeff Hancock, communications professor and founding director of the Stanford Social Media Lab. “They’re optimized for profit-making for these companies. And that typically is around increasing engagement and attention so that advertising is more effective.”
Elisa Aas recovered from bulimia and orthorexia (an unhealthy obsession with healthy eating) in 2014 and is now a coach for people with eating disorders. She uses her Instagram page @followtheintuition, which has more than 8,000 followers, to share tips and lessons with others who may be struggling. Aas says that, on the one hand, social media can help people struggling with eating disorders find much-needed support and resources. But on the other hand, it can be hard not to compare oneself to others or to scroll endlessly through toxic diet-related content, which can make people feel worse.
“No matter how aware you are, it’s so easy to get sucked into the comparison,” Aas said. “Recovery content can become triggering because there’s a lot of comparison even there.”
Social media companies say they have measures in place to help people who may be struggling with these issues. A TikTok representative told CNET, “We take a holistic approach to support the well-being of people on TikTok, including screen time reminders, settings to control viewing preferences and access to expert help on eating disorders and self-harm across our platform. We’re committed to fostering a supportive environment for people who choose to share their personal wellness journeys and to removing content that normalizes or glorifies disordered eating which violates our Community Guidelines.”
In a blog post on its newsroom page about “limiting unwanted content,” TikTok advises users to long-press on a video they don’t want to see and tap the “Not interested” button, which will show “less of that sort of video in the future.”
Instagram didn’t provide CNET with an on-the-record comment. But users can flag content on their Explore page they don’t want to see by tapping the three dots in the upper right corner of a post and choosing “Not Interested.” Instagram doesn’t block or remove content related to admission of or recovery from suicide, self-harm or eating disorders, but it doesn’t recommend that type of content to users. Earlier this year, Instagram also launched a tool that gives people more control over how much sensitive content they see in Explore.
In its internal reports, Facebook notes that “mass media has long been blamed for body dissatisfaction” and writes that it’s now looking into what social media’s impact might be. While traditional media is also rife with altered images, Hancock says social media’s pitfall is that users compare themselves to peers, in addition to influencers and celebrities. When content comes from someone more relatable, it can be harder to distinguish what’s real and what’s not.
There’s also a distance between us and the celebrities featured in movies, TV shows and magazines, Engeln notes. We tend to have a better understanding that what we see there isn’t real because those stars and models aren’t people we know. But we don’t always have that same filter when it comes to content online.
“One of the most powerful things social media does is give us the impression that certain kinds of lives or certain kinds of appearances are more accessible than they actually are. Because we’re not just seeing celebrities who look perfect — we know to mistrust that,” she says. “But we also see images of our peers that have been perfected and curated and edited, and those feel more real to us.”
A solution to these issues remains elusive. Researchers haven’t yet gathered enough data on TikTok to better understand its impact, Engeln says. The app’s algorithm, which presents varied information to different users depending on their interests, can make it hard to study. That also makes it difficult to figure out what changes should be made.
“We — the users, the posters, the creators — have to change things, or else we just have to walk away,” Engeln says. “Because you cannot count on a social media company to protect your mental health. That’s not what they do.”
In its internal reports, Instagram lists some suggestions on how to mitigate body dissatisfaction that have been “made in previous research, but that are not necessarily backed by any specific findings.” One recommendation is to include a disclosure when someone creates an account, reminding them that “images on Instagram are often edited.” Another suggestion: “Some work concluded that the best thing to do is to reduce social media use and screen time, given the difficulty of limiting exposure to body dissatisfaction-inducing content.”
That’s what Nagata, who works with hospitalized patients with eating disorders, recommends for anyone struggling. He suggests turning off notifications, avoiding social media use before going to sleep and keeping your phone away from your bed while sleeping. But, he adds, much of the responsibility still falls on companies to moderate harmful posts and to be more transparent about their research findings and about the algorithms that point users toward specific content.
“Although there is growing awareness about social media and body image issues,” he says, “there will need to be more changes by social media companies or government regulation to improve these issues.”