Social media giant Meta said on Tuesday it was rolling out a raft of measures to boost the safety of young users on its Instagram platform, the latest firm to address the issue.
Campaigners have long accused tech giants of failing to protect teenagers from harmful content, and Instagram’s popularity with young people has placed it firmly in the firing line.
Meta, which also owns Facebook and WhatsApp, said parents and guardians would be able to set time limits on children’s scrolling on Instagram.
And young users will now see nudges encouraging them to look at other subjects if they are spending too much time on content about a single topic.
“It is crucial for us to develop tools that respect the privacy and autonomy of young people while involving parents in the experience,” said Clotilde Briend of Meta during a media briefing.
Instagram was rocked last year by revelations from whistleblower Frances Haugen suggesting executives were aware the platform could harm the mental health of young users, particularly teenage girls.
Meta has consistently denied the claims but has since faced a series of grillings in the US Congress and suggestions that regulation could be on the way.
Other apps, including video-sharing platform TikTok, have also been criticized over fears that young people were finding it hard to tear themselves away from their content.
Last week, TikTok announced young people would get nudges prompting them to take a break from scrolling — similar to an Instagram feature that has already been rolled out.
On Tuesday, Meta also announced new measures for its virtual reality headsets.
Parents and guardians will be able to block apps, view what their child is looking at on another device, and see how long their child is spending with the headset on.