Meta, the parent company of Instagram and Facebook, has announced significant changes to protect teenagers from exposure to harmful content related to self-harm and eating disorders. In response to increasing scrutiny over its impact on minors, Meta is now actively limiting such content in teen users’ Instagram Feeds and Stories. This marks a shift from its earlier stance, under which it permitted these posts to circulate on the rationale that they helped destigmatize mental health issues.
A key aspect of the new policy changes how teens interact with potentially sensitive content. Searches for terms linked to suicide, self-harm, and eating disorders will now have their related results hidden, with users redirected to professional help resources instead. This builds on Meta’s existing practice of concealing search results that violate its rules, broadening it to cover a wider range of terms.
Furthermore, Meta is placing teen users into its most restrictive content control setting by default. This setting has already been applied to new teenage accounts and will soon be extended to all existing young users. The aim is to make it harder for teens to encounter sensitive content in areas like Search and Explore. Although teens can still adjust their settings, Meta intends to use notifications to encourage them to maintain a more private online experience.
This move is part of Meta’s ongoing effort to create a safer online environment for its younger audience, acknowledging that certain content may not be appropriate for all age groups. The initiative reflects a growing awareness of the need to balance openness with the protection of vulnerable users, particularly teenagers.