“Pro-ana” communities—websites, blogs, forums, and social media spaces dedicated to promoting eating disorders like anorexia—have been a fixture of the web more or less since its inception. So it’s no surprise that, as BuzzFeed reported last month, some TikTok users have found disturbing pro-ana content on their For You page, a personalized section of the platform that displays videos users are likely to enjoy.

Discovering, damage-controlling, and deleting pro-ana content has become a rite of passage for web companies. In 2001, Yahoo removed 113 pro-ana websites from its servers. MySpace, Tumblr, Instagram, Pinterest, Reddit, and many other social media platforms have faced pro-ana problems. This well-publicized history makes it frustrating that TikTok wasn’t better prepared, beyond claiming it doesn’t allow “content that promotes eating habits that are likely to cause health issues.” But now that TikTok’s policies are under a microscope, what guidance will the company take from a longer history of regulating online pro-ana communities, and exactly how worried should its users be?

Dr. Ysabel Gerrard is a lecturer in digital media and society at the University of Sheffield. Her research on social media content moderation has been featured in venues like The Guardian and The Washington Post. She also consults for social media companies, including Instagram.

The problem TikTok has right now is that its For You page is working exactly as it should: It gives users a personalized and therefore pleasurable experience by showing them what they likely want to see. I’ve previously written about the same problem playing out on Instagram, Pinterest, and Tumblr. Recommendation algorithms like this are the bread and butter of social media platforms. The happier you are on a platform, the likelier you are to stay, and the longer you stay, the more profitable data you generate for the company.

But the problem—a problem most major social media companies have faced—is that recommendation algorithms aren’t really trained to make moral and health-related judgments about the kinds of content they recommend. Do you like cats? TikTok thinks you do, based on what you’re liking and searching for, so its algorithm will show you more cats. Yay cats! But the exact same formula applies to potentially harmful forms of content. Do you have anorexia? TikTok thinks you do, so here’s a bunch of triggering videos. Have at it!
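
To make this concrete, here is a deliberately simplified sketch, in Python, of how an engagement-driven recommender can reinforce whatever interest it detects. TikTok’s real system is proprietary and far more sophisticated; every function name, tag, and weight below is invented for illustration.

```python
# A toy, content-agnostic recommender: rank candidate videos by how well
# their tags match what the user has already engaged with. Nothing here is
# TikTok's actual code; it only illustrates the general shape of the idea.
from collections import Counter

def recommend(engagement_history, candidate_videos, k=5):
    """Rank candidates by overlap with the user's weighted tag history.

    engagement_history: list of (tags, weight) pairs; a "like" might carry
    more weight than a completed view. The scorer has no notion of whether
    a tag is harmless ("cats") or harmful; it only counts engagement.
    """
    interest = Counter()
    for tags, weight in engagement_history:
        for tag in tags:
            interest[tag] += weight

    def score(video):
        return sum(interest[tag] for tag in video["tags"])

    return sorted(candidate_videos, key=score, reverse=True)[:k]

# The same formula rewards any interest equally:
history = [({"cats", "pets"}, 3.0), ({"cats"}, 1.0)]
videos = [{"id": 1, "tags": {"cats"}}, {"id": 2, "tags": {"cooking"}}]
print(recommend(history, videos, k=1))  # the cat video wins
```

Nothing in the scoring step distinguishes a benign topic from a harmful one; that judgment has to be imposed from outside the ranking formula, which is exactly what moderation policies attempt.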

In a recent BuzzFeed article, some TikTok users shared anecdotes of randomly receiving recommendations for pro-ana videos through their For You page. It is difficult to describe pro-ana behaviors without triggering readers, but they might involve sharing diet tips and purging methods, writing personal stories, and pairing up with a “buddy” to further encourage weight loss. We know from charities like Beat that eating disorder patients often report feeling “triggered” by certain images or words. If a TikTok user continuously sees triggering posts on their For You page, this could very well harm them. But one of the frustrations social media researchers have is that the inner workings of recommendation systems like the For You page are notoriously opaque, making it difficult to figure out why particular users see certain recommendations while others don’t. A recent New Media & Society article describes how social media users often develop elaborate theories about how recommender systems work, a practice its author calls “algorithmic gossip.”

Without dismissing anyone’s claims about their For You recommendations, readers should know that users who are not engaging with videos related to eating disorders are highly unlikely to have them randomly recommended. A TikTok spokesperson explained that users can also adjust the content they see by, for example, “hearting” videos, clicking “not interested,” and following users. “In doing so, through time users will see more of the content they prefer.”
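
Continuing the toy model above, a “not interested” click could plausibly be modeled as a negative engagement weight, so matching videos sink in the ranking. This is conjecture about the general shape of such a mechanism, not TikTok’s actual implementation.

```python
# Hypothetical extension, reusing recommend() from the sketch above:
# negative feedback as a negative weight on a video's tags.
history = [
    ({"cats", "pets"}, 3.0),   # "hearted" a cat video
    ({"unwanted"}, -5.0),      # clicked "not interested"
]
videos = [{"id": 1, "tags": {"cats"}}, {"id": 2, "tags": {"unwanted"}}]
print(recommend(history, videos, k=2))  # the flagged topic falls to the bottom
```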

Whenever stories like BuzzFeed’s appear, I always worry that social media companies will respond by panicking and prohibiting all content relating to eating disorders, even if it’s about recovery or support.

Researchers have long known that social media and older online communities can offer support for people with stigmatized conditions like eating disorders. For example, Reddit’s decision to remove the r/proED sub in 2018 was met with outcry from community members who explained that, despite its name, the sub wasn’t actually used as a space to promote eating disorders and functioned more like a support group.

With more appropriate moderation, there’s no reason TikTok can’t offer an extra space for people to express their feelings and share their experiences in a highly creative way. It could also become a helpful resource for people struggling with eating disorders. Secrecy is one of the hallmarks of an eating disorder, meaning social media sometimes exists as a sufferer’s only form of support. With this in mind, TikTok could develop genuinely useful eating disorder resources that go beyond sending users a list of contact details for local charities, “the 2020 equivalent of handing a teen a tri-fold brochure,” as psychiatrists Neha Chaudhary and Nina Vasan recently wrote in WIRED. Pinterest, for example, has pioneered a series of well-being exercises that it recommends to users searching for self-harm-related Pins.
