Young users of a major video-sharing social media platform are creating and distributing a huge amount of content promoting “extremely unsafe” weight-loss techniques, eating disorder experts have warned.
- Young people are sharing videos on TikTok which show “dangerous restrictive eating”
- The Butterfly Foundation said it had been “increasingly alerted” to the videos
- A TikTok spokesperson said the platform has a range of moderation strategies
Since its launch by a Chinese company in 2016, TikTok has grown a worldwide user base of more than 800 million people, about half of whom are estimated to be under the age of 34.
It is primarily known for its young users’ videos of dance challenges, animals and humorous lip sync takes on politics.
However, eating disorder support group the Butterfly Foundation said a failure to restrict access to videos promoting unhealthy weight loss methods was an urgent problem that had got worse amid the coronavirus pandemic.
“These videos depict potentially harmful content that has the ability to reinforce negative feelings, attitudes and behaviours — in relation to body image, food and diet — to a vulnerable youth audience,” national helpline team leader Amelia Trinick said.
Potentially thousands of users are sharing videos — often captioned with the words “what I eat in a day” and overlaid with pop music — which count the calories of every meal, offer recipes for water-based weight-loss drinks, and provide tips on how to lose weight rapidly.
Another video, captioned “how did u loose weight?” (sic), was followed by a photo of cigarettes.
A self-professed eating disorder survivor posted a video from what appeared to be a hospital room, saying she was on fluids and “so scared of water weight” she had gained.
Another video shows a young girl wandering into a room with a bag of potato chips, then watching a video of model Emily Ratajkowski, before putting down the chips.
The caption states: “TikTok reminds me not to eat”.
Ms Trinick said the Butterfly Foundation has been “increasingly alerted to problematic content” on TikTok in recent months.
“The videos also highlight our fixation with the societal ideal that ‘thin is best’ and promote extremely unsafe weight loss methods to an impressionable audience.”
The Butterfly Foundation’s annual Insights in Body Esteem survey of more than 5,000 Australians last year showed “alarming” results, demonstrating social media’s influence on how young people view their bodies.
Of those surveyed, 48 per cent indicated they were dissatisfied or very dissatisfied with their appearance.
Experts are also concerned about an increase in disordered eating during the coronavirus pandemic, with the Butterfly Foundation’s helpline receiving “many” calls from people experiencing eating disorders and facing “a unique set of challenges and triggers”.
“People living with an eating disorder during this time have indicated a significant increase in eating disorder behaviours and thoughts due to the high levels of stress and uncertainty associated with COVID-19,” Ms Trinick said.
‘Loopholes’ allowing children to bypass restrictions
Under the self-harm section of TikTok’s terms and conditions, the company stipulates content that promotes eating habits that are “likely to cause health issues” is “not allowed on the platform”.
“Do not post content that supports pro-ana [anorexia] or other dangerous behaviour to lose weight,” it states.
Despite those warnings, Ms Trinick said many videos showed young people engaging in “dangerous restrictive dieting behaviours to lose excessive amounts of weight”.
“While this in itself is an issue, what is even more worrying is that these behaviours are being shared with other TikTok users who may then engage in the same behaviours or make body, weight, shape, appearance comparisons to the person in the original video — who may indeed have an eating disorder,” she said.
“Unfortunately, the issue of exposure to harmful content such as this is heightened by the fact that TikTok, unlike other social media platforms, is relatively unmoderated.”
The Butterfly Foundation’s head of communications Melissa Wilson echoed those sentiments, saying it was “incumbent on the platform providers to include safety messaging and other support mechanisms to mitigate this risk”.
She said while TikTok had added some “help” functions and banned certain hashtags, the foundation had “identified loopholes” that allowed people — including children younger than the app’s minimum age of 13 — to access potentially harmful content.
“While we are concerned that TikTok is targeted at a younger demographic … the bigger concern is the lack of moderation and safety messaging,” Ms Wilson said.
“Due to the user-generated and largely unmoderated nature of TikTok, protecting people from harmful content is extremely challenging.”
Ms Wilson said that on other social media channels, such as Instagram, there are “greater search restrictions”, and that help functions “are more obvious”.
Flinders University senior lecturer and psychologist Ivanka Prichard is also concerned about the videos on TikTok, and on other image-based platforms accessed by young people.
“They idealise thinness and being skinny, and present people who appear to have no qualifications providing nutrition and fitness advice,” Dr Prichard said.
“Experimental research on other platforms shows that exposure to this type of imagery leads to greater body image concern and negative mood.
“Adhering to advice from social media in relation to diets … is associated with greater dietary restraint.”
TikTok ‘committed’ to safe content
Dr Prichard said it was unlikely the people filming and sharing the videos in question recognised the issues associated with them.
“For the most part, young people are probably sharing them because they may want to help others or to share with others what they are doing,” she said.
“They probably don’t realise the potential harm that these types of videos could have.”
A TikTok spokesperson said the platform’s content moderation is undertaken by “global safety teams” which “comprise experienced industry professionals” and “collaborate closely with regional regulators, policymakers, government, and law enforcement agencies” to promote safety.
TikTok filters and removes “red-flag language including those related to eating disorders”, and directs users who search for that content to support resources, the spokesperson said.
“We care deeply about the complex and multi-faceted issue of eating disorders,” the spokesperson said.
“If we become aware of any content that violates our terms of service and community guidelines, we will take immediate action to remove content, terminate accounts, and report cases to law enforcement as appropriate.”
The Butterfly Foundation has welcomed TikTok’s plans to establish an Australian office and said it had engaged with the company “to work together on these issues”.
If you or anyone you know is experiencing body image concerns or an eating disorder, you can call the Butterfly Foundation National Helpline on 1800 33 4673.