Source: www.nbcnews.com
Original article: Meta teams up with Snap and TikTok to address self-harm content
Date: 2024-09-12
Meta announced a partnership with Snapchat and TikTok to launch a program called "Thrive," aimed at preventing content featuring suicide or self-harm from spreading across multiple social platforms. The program was created in conjunction with The Mental Health Coalition, a group of mental health organizations working to destigmatize these issues.
Meta said that under Thrive it will identify content featuring suicide or self-harm and flag it to TikTok and Snapchat so those platforms can also investigate whether the same or similar content has been posted on their apps. When matching content is found, the platforms remove it and flag it in the Thrive database so other social media companies can act on it as well.
A Meta spokesperson described Thrive as a database that all participating companies can access. Meta said it is using technology it created and uses in conjunction with the Tech Coalition's Lantern program (an organization that aims to make technology safer for children, whose members include Amazon, Apple, Google and others) to ensure that data shared in Thrive is shared securely.
When content featuring suicide or self-harm is identified on a Meta platform, it is assigned a number known as a "hash." Other social platforms can then use that number to search the Thrive database for the same content and remove it where necessary.
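The article does not specify Thrive's hashing scheme; industry hash-sharing programs of this kind typically use perceptual hashes (such as PDQ for images) so that re-encoded or lightly edited copies still match. The following is a minimal sketch of the flag-and-check flow, using a plain SHA-256 digest as a hypothetical stand-in for the real hash and an in-memory set standing in for the shared database:

```python
import hashlib

# Hypothetical stand-in for the shared Thrive database. In practice this
# would be a secure, access-controlled service shared by the platforms.
thrive_hashes: set[str] = set()

def compute_hash(media_bytes: bytes) -> str:
    """Digest the raw media bytes. SHA-256 is an illustrative choice;
    a real system would likely use a perceptual hash so that near-
    duplicates of the same image or video still match."""
    return hashlib.sha256(media_bytes).hexdigest()

def flag_content(media_bytes: bytes) -> str:
    """Called by the platform that first identifies violating content:
    hash it and add only the hash (a content signal, with no account
    or user information) to the shared database."""
    h = compute_hash(media_bytes)
    thrive_hashes.add(h)
    return h

def check_content(media_bytes: bytes) -> bool:
    """Called by a participating platform to ask whether an upload
    matches content already flagged elsewhere."""
    return compute_hash(media_bytes) in thrive_hashes
```

Because only hashes are exchanged, platforms can compare content without sharing the media itself or any account data, consistent with Meta's statement that the signals "represent content only."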
Meta also noted that it removed 12 million pieces of content featuring suicide or self-harm from Facebook and Instagram between April and June. The partnership is intended to prevent this content from spreading across platforms while still preserving important conversations about self-harm and suicide. If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline, or chat live at 988lifeline.org. Additional support resources are available at SpeakingOfSuicide.com/resources.
Original text:
Meta is teaming up with Snapchat and TikTok as part of a new initiative to prevent content featuring suicide or self-harm from spreading across the social media platforms, Meta said Thursday in a blog post. The program, named Thrive, was created in conjunction with The Mental Health Coalition, a group of mental health organizations working to destigmatize these issues.

Under Thrive, Meta will identify content that features suicidal themes or self-harm and flag that content to TikTok and Snapchat so they can also investigate if the same or similar content has been posted to those apps. A spokesperson for Meta described Thrive as a database that all participating companies will have access to. Meta said it is using technology it created and uses in conjunction with the Tech Coalition’s Lantern program — an organization that aims to make technology safe for children and includes companies like Amazon, Apple, Google, Discord, OpenAI and more — to ensure that data shared in Thrive is done so securely.

The spokesperson for Meta said that when content featuring suicide or self-harm is discovered, it will be removed from the platform and then flagged in the Thrive database so other social media companies can act. Meta’s blog post made clear that the program is intended to target content — not users. “We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals,” Antigone Davis, Meta’s global head of safety, wrote in the post.

When content featuring suicide or self-harm is identified on a Meta platform, it will be assigned a number known as a “hash,” according to the spokesperson. That hash can then be checked by the other social media companies so they can search for the same content and remove it if it exists on one of the participating platforms.

Social media platforms, including Meta, TikTok and Snapchat, have long been criticized for not doing more to moderate content that teens consume, including video and images of self-harm. All three platforms have been sued by parents and communities who said content on the platforms led to suicides. Additionally, in 2021, leaked internal research, known as the “Facebook Papers,” revealed Meta was aware that Instagram, which it owns, could have harmful effects on teen girls. A study from the National Library of Medicine shows that a major uptick in minors using social media has led to an increase in depression and suicidal ideation in those groups. The study also suggests that young people who engage in self-harm are more active on social media.

Earlier this year, the company announced that it would begin removing and limiting sensitive “age-inappropriate” content from teenagers’ feeds on its apps and that it had plans to hide search results and terms relating to suicide, self-harm and eating disorders for all users. In its blog post Thursday, Meta said that it removed 12 million pieces of content featuring suicide and self-harm from Facebook and Instagram from April to June. While Meta said it still wants to facilitate important conversations around self-harm and suicide, it hopes Thrive will help to keep graphic content off participating platforms.

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.