Washington, February 26: Instagram has announced that it will begin notifying parents if their teenage children repeatedly search for terms related to self-harm within a short period.
In a statement, the platform, owned by Meta Platforms, said the feature will apply to parents who have opted into Instagram’s supervision settings. The alerts are designed to help families identify potentially concerning online behaviour and provide timely support to minors.
The company said the new measure complements its existing policies prohibiting content that promotes or glorifies suicide or self-harm, as part of broader efforts to enhance safety for young users.
The announcement comes amid increasing global pressure on technology companies, and moves by governments, to strengthen online protections for children and teenagers.
Australia has already moved to restrict social media use for children under 16, while the United Kingdom, Spain, Greece and Slovenia are exploring similar regulatory measures aimed at safeguarding young internet users. (QNA)