Instagram is set to make teen accounts private by default in an effort to safeguard younger users from online dangers. Starting this week, new accounts for anyone under 18 in the U.S., U.K., Canada, and Australia will be private, with existing teen accounts migrating over the next two months.
This change, aimed at reducing unwanted contact and exposure to harmful content, is part of a broader push by parent company Meta to address growing concerns over how social media affects the mental health of young users.
Under the new rules, teens will only be able to receive direct messages from people they follow. In addition, "sensitive content," such as violent videos or posts promoting cosmetic procedures, will be limited. Teens will also receive reminders to log off after 60 minutes and can activate a "sleep mode" that silences notifications between 10 p.m. and 7 a.m. While these safety measures will apply to all teens, 16- and 17-year-olds will have the option to disable them, whereas users under 16 will need parental approval for any changes.
Meta's Naomi Gleit, head of product, said the move targets three primary concerns from parents: inappropriate content, unwanted contact, and excessive time spent on the app. However, critics argue that these steps are not enough to combat the underlying risks.
U.S. Surgeon General Vivek Murthy and New York Attorney General Letitia James both called for more aggressive measures, with James labelling Meta's changes "an important first step" but insufficient.
The announcement follows a wave of lawsuits against Meta by multiple U.S. states, accusing the company of contributing to the youth mental health crisis by designing addictive features on its platforms. Although Meta hinted at potential short-term declines in teen engagement, analysts like Jasmine Enberg from Emarketer believe the revenue impact will be minimal. She added that while the changes might curb some harmful behaviours, teenagers are likely to find ways around the restrictions, and the new limits could even prompt them to seek out workarounds.
Meta is also expanding parental control options, allowing parents to monitor their child's online activity through its Family Center. With these changes, parents will be able to view who is messaging their teens, potentially aiding conversations about online safety and harassment.
Yet, as Murthy pointed out, the burden of monitoring remains largely on parents, who may struggle to keep up with the rapidly evolving digital landscape.