Meta will notify parents when their child repeatedly searches Instagram for terms related to self-harm or suicide within a short time frame, and the system will expand beyond the U.S., U.K., Australia, and Canada later this year. The company is also building similar alerts for teens’ AI conversations about self-harm. Notifications go only to parents who use Instagram’s parental supervision feature and are designed to trigger cautiously, based on search patterns and expert guidance. #Instagram #Meta
Key points
- Meta will notify parents if teens repeatedly search Instagram for self-harm or suicide terms within a short period.
- Alerts go only to parents who use Instagram’s parental supervision feature and will roll out to more countries later this year.
- Meta is developing a similar notification tool for teens’ AI conversations about self-harm.
- Notifications will be delivered via WhatsApp, email, or text and include advice on how to talk to teens about self-harm.
- The change comes amid legal and legislative pressure over Instagram’s impact on young people, including ongoing trials and the Kids Online Safety Act debate.
Read More: https://therecord.media/instagram-alert-parents-children-search-terms-self-harm