By Familyguide Contributor
In a collaborative effort to combat the spread of self-harm content, Meta has joined forces with Snap and TikTok to keep such material from circulating across their platforms.
“We’ve partnered with the Mental Health Coalition to launch Thrive, an innovative program enabling tech companies to share information about content violating suicide or self-harm policies, ultimately preventing its proliferation across various platforms,” Meta announced.
The company explained that while they have previously implemented internal safeguards, they believe a cross-platform alliance will yield more effective results.
“To truly address this content effectively, cooperation among tech companies is crucial,” they stated. “That’s why we’ve collaborated with the Mental Health Coalition to establish Thrive, the pioneering signal-sharing program for exchanging information about content that violates suicide and self-harm policies.”
When self-harm or suicide content is identified on one platform, Thrive will alert the other participating apps so they can act quickly to keep it from spreading elsewhere.
“Meta revealed it’s utilizing technology developed in conjunction with the Tech Coalition’s Lantern program – which aims to ensure technology is safe for children and includes companies like Amazon, Apple, Google, Discord, OpenAI and others – to guarantee secure data sharing within Thrive,” NBC reported.
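Meta hasn't published Thrive's internals, but public descriptions of signal-sharing programs like this suggest platforms exchange hashes (digital fingerprints) of violating content rather than the content itself. Below is a minimal sketch of that general hash-matching technique; the `SignalRegistry` class and its methods are assumptions made purely for illustration, not Thrive's actual API:

```python
import hashlib

# Illustrative sketch of hash-based signal sharing. Nothing here reflects
# Thrive's real implementation; all names are hypothetical.

class SignalRegistry:
    """A shared registry of hashes for content found to violate policy."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def share_signal(self, content: bytes) -> str:
        # Hash the violating media so only the fingerprint is shared;
        # the raw content never leaves the reporting platform.
        digest = hashlib.sha256(content).hexdigest()
        self._hashes.add(digest)
        return digest

    def matches(self, content: bytes) -> bool:
        # Other platforms check new uploads against the shared hashes.
        return hashlib.sha256(content).hexdigest() in self._hashes


# Example: platform A flags content; platform B later catches a re-upload.
registry = SignalRegistry()
registry.share_signal(b"<flagged media bytes>")
print(registry.matches(b"<flagged media bytes>"))    # True: re-upload caught
print(registry.matches(b"<unrelated media bytes>"))  # False: no match
```

Because only the fingerprint travels between companies, the flagged material itself never has to be re-shared to be recognized elsewhere.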
Meta says it removed more than 12 million pieces of suicide and self-harm content between April and June of this year.
“From April to June this year, we took action on more than 12 million instances of suicide and self-harm content on Facebook and Instagram,” the blog post detailed. “While we permit discussions about experiences with suicide and self-harm – as long as they’re not graphic or promotional – this year we’ve implemented important measures to make such content less discoverable in Search and completely hidden from teens, even when shared by someone they follow.”
“The incorporation of signal sharing, along with cross-industry collaboration moderated by an independent and neutral intermediary, marks a significant advancement in industry cooperation and public protection regarding the global public health crisis of suicide, ultimately saving lives,” stated Dr. Dan Reidenberg, Thrive’s director.
Social media platforms have faced criticism for inadequate protections for teenagers. Snapchat recently came under fire for failing to protect minors and is the subject of a lawsuit in New Mexico. Familyguide previously reported:
The lawsuit alleges that certain Snapchat features facilitate predators’ access to children, stating: “Snap’s features, including its algorithms, which analyze users’ consumption patterns to recommend content and other users aligned with their interests, function to connect children with adult predators and drug dealers, and deliver a stream of sexualized, drug-related, or other dangerous content to children, predators, and others.”
The lawsuit further asserts that Snapchat failed to implement appropriate safety measures to protect users, and cites the rape of an 11-year-old girl who was targeted by a predator through the app.
While it’s commendable that companies are collaborating to enhance social media safety, these platforms still carry inherent risks. Ultimately, parents bear the responsibility of safeguarding their children and monitoring their online activities.