Facebook, Instagram To Censor Posts About Suicide – Meta


Facebook and Instagram’s parent company, Meta Platforms Inc., has announced that it will stop showing posts about suicide and eating disorders to teens.

According to Meta, teens will only see content that is appropriate for their age on both platforms. In a recent blog post, Meta stated that minors will be placed under “the most restrictive” content control settings on Instagram and Facebook.

The social media giant also said it will hide certain types of content and restrict additional search terms in the ‘Search’ and ‘Explore’ features. Content related to suicide, self-harm, and eating disorders will become harder to find and will not be recommended.

Meta said the changes are intended to make its platforms safe and age-appropriate for young people, in line with guidance from experts in adolescent development, psychology, and mental health.

“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.

“We have developed more than 30 tools and resources to support teens and their parents, and we have spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive.

“We are announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.

“Now, when people search for terms related to suicide, self-harm and eating disorders, we will start hiding these related results and will direct them to expert resources for help.”
