News

Meta, TikTok, Snapchat and other social media platforms have long been criticized for failing to remove content deemed harmful to teens, including videos and images of self-harm.
Through the Mental Health Coalition’s Thrive program, Meta, Snap and TikTok will be able to share signals about violating suicide or self-harm content with one another so they can investigate and address the same material on their own platforms.
Meta, Snap, TikTok, and the Mental Health Coalition developed Thrive to stop graphic self-harm and suicide content from spreading across platforms such as Instagram and Facebook.
Meta, Snap, and TikTok have launched a joint initiative called Thrive, aimed at combating the spread of suicide and self-harm content online by sharing "signals" to identify and address such content.
Thrive, which counts Meta, Snap, and TikTok as founding members, will give platforms a way to share hashes (essentially unique fingerprints) of graphic suicide and self-harm content so that participating companies can check for the same material on their own services.
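The reports describe the hash-sharing mechanism only at a high level. As a rough illustration of the "unique fingerprint" idea, here is a minimal sketch, assuming a simple SHA-256 digest of a media file and an in-memory shared list; Thrive's actual signal format and exchange infrastructure are not detailed in these snippets, and real systems typically use perceptual hashes that tolerate re-encoding and small edits.

```python
import hashlib
from pathlib import Path

# Hypothetical shared store of hashes flagged by participating platforms.
# This stands in for whatever exchange mechanism Thrive actually uses.
shared_hashes: set[str] = set()

def fingerprint(media_path: Path) -> str:
    """Return a SHA-256 hex digest of a media file's raw bytes."""
    return hashlib.sha256(media_path.read_bytes()).hexdigest()

def flag_content(media_path: Path) -> None:
    """A platform that removes violating content shares its hash, not the content itself."""
    shared_hashes.add(fingerprint(media_path))

def matches_known_signal(media_path: Path) -> bool:
    """Other platforms can check uploads against the shared hash list."""
    return fingerprint(media_path) in shared_hashes
```

The point of sharing hashes rather than the media is that platforms can recognize previously flagged content without redistributing the graphic material itself.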
Meta, TikTok and Snap are partnering with the Mental Health Coalition to launch a program that invites companies to share signals about graphic content depicting self-harm or suicide.
Meta is teaming up with Snapchat and TikTok as part of a new initiative to prevent content featuring suicide or self-harm from spreading across social media platforms, Meta said Thursday.
In an attempt to prevent suicide and self-harm content from spreading online, the nonprofit Mental Health Coalition (MHC) today announced a new program, Thrive, aimed at encouraging online platforms to share signals about such content with one another.
Sept. 12 (UPI) -- Three of the biggest social media platforms are teaming up to address online content that features suicide and self-harm, Meta announced Thursday. Meta, the owner of Facebook and Instagram, is partnering with Snap and TikTok on the effort.