Meta Announces Measures to Protect Teens From Harmful Content

Meta said Tuesday it will hide inappropriate content from teenagers' Instagram and Facebook accounts, including posts about suicide, self-harm, and eating disorders. In a blog post, the Menlo Park-based company said that in addition to not recommending such "age-inappropriate" material to teens, it will stop showing that material in their feeds even when another account shares it. "We want teens to have a safe, age-appropriate experience on our apps," Meta said.

Teen users, provided they did not lie about their age when signing up for Instagram or Facebook, will also have their accounts placed on the most restrictive settings, and they will be blocked from searching for potentially harmful terms. "Take the example of someone posting about their ongoing struggle with self-harm thoughts. This is an important story, and can help destigmatize these issues, but it's a complex topic and isn't necessarily suitable for all young people," Meta said. "Now, we'll eliminate this type of content from teens' Instagram and Facebook experiences, as well as other types of age-inappropriate content."

Dozens of U.S. states have sued Meta, accusing the company of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children. Critics said the newly announced measures do not go far enough.

Josh Golin, executive director of Fairplay, a children's online advocacy organization, called Meta's announcement yet another desperate attempt to avoid regulation. If the company is capable of hiding pro-suicide and eating disorder content, he asked, why did it wait until 2024 to announce these changes?
