Seven families in France have filed a lawsuit against the popular video-sharing app TikTok, alleging that the platform's algorithm exposed their teenage children to harmful content promoting suicide, self-harm, and eating disorders. Two of the teenagers, both girls aged 15, took their own lives, and the families say this exposure contributed to their deaths.
The group lawsuit, filed in the Créteil court, is the first grouped action of its kind in Europe, according to the families' lawyer, Laure Boutron-Marmion. The families claim TikTok's algorithm recommended videos that promoted self-harm, suicide, and eating disorders, damaging the mental health of the young users.
The lawsuit calls for TikTok to take responsibility for the content it provides to young, impressionable users. “The parents want TikTok’s legal liability to be recognized in court”, she said, adding: “This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings.”
The parents argue that TikTok's algorithm pushes harmful content at young people and has damaged the mental well-being of many teenagers.
TikTok, like other social media platforms, has long faced scrutiny over its content moderation and the impact of its algorithm on young users. Other major platforms, including Meta's Facebook and Instagram, have faced similar concerns: hundreds of lawsuits in the United States accuse these companies of designing addictive products that harm children's mental health.
Last month, more than a dozen US states and the District of Columbia filed lawsuits against the Chinese-owned TikTok, alleging that it damages children's mental health with a product designed to be used compulsively and excessively.
Responding to the lawsuits, a TikTok spokesperson said: “We strongly disagree with these claims, many of which we believe to be inaccurate and misleading.” The company has previously stated that it takes issues related to children's mental health seriously. Its chief executive, Shou Zi Chew, told US lawmakers this year that the company had invested in measures to protect young people who use the app.