
A US lawsuit alleges that Meta delayed implementing safety tools for teenagers on Instagram despite being internally aware of the risks since 2018. Instagram head Adam Mosseri acknowledged the potential harms of unsolicited explicit images in private messages but said the company had to balance user privacy with safety. In April 2024, Instagram introduced a feature that blurs explicit images in direct messages sent to teens. Court documents reveal that nearly 20% of users aged 13-15 reported seeing unwanted sexual content, heightening concerns over platform safety and addiction.