
Instagram, owned by Meta, will notify parents in the US, UK, Australia, and Canada if their teens repeatedly search for suicide or self-harm terms, provided the family is enrolled in the platform's parental supervision program. The new alert system complements existing content blocks and directs users to support resources. The move comes amid global regulatory pressure and ongoing litigation against Meta over child safety concerns. Some experts caution that forced disclosures may have unintended effects; Meta plans to extend the alerts to AI interactions later this year.