
Instagram, owned by Meta, will notify parents enrolled in its parental supervision program if their teens repeatedly search for suicide- or self-harm-related terms within a short period. The feature, launching soon in the US, UK, Australia, and Canada, is intended to help parents support their children by sending alerts via email, text, WhatsApp, or in-app notification. Instagram already blocks such content and directs users to helplines. Meta faces ongoing legal scrutiny over child safety and is developing similar alerts for AI interactions.