Meta has unveiled a series of new safety measures aimed at enhancing protections for teens and children on Instagram. These updates are part of the company’s ongoing commitment to preventing both direct and indirect harm to young users on its platforms.
Key among the new features is an update to direct messages (DMs) in Teen Accounts. Teens will now see safety tips, block/report options, and account details such as join date displayed prominently at the top of new chats. Meta has also introduced a combined "block and report" option to streamline the reporting of potentially harmful accounts.
According to Meta, these tools are already having an impact. In June 2025 alone, teens blocked accounts one million times and reported another one million accounts after seeing in-app Safety Notices. Additionally, a new Location Notice, which alerts users when they are chatting with someone in another country, was viewed one million times, with 10% of viewers tapping for more information, helping to deter scams and sextortion attempts.
Meta also reported a positive response to its global nudity protection feature, which automatically blurs suspected nude images. Ninety-nine percent of users, including teens, have kept the feature turned on. In June, 40% of blurred images remained unopened, and nearly 45% of people chose not to forward such content after seeing a warning.
Importantly, Meta is extending teen protection features to adult-managed accounts that primarily feature children, such as those run by parents or by managers representing child influencers. These accounts will now have stricter messaging settings and offensive-comment filters enabled by default. Suspicious adults, particularly those previously blocked by teens, will be prevented from finding or interacting with these accounts.
In a broader crackdown, Meta said it has removed nearly 135,000 Instagram accounts for posting sexualized comments or soliciting images from child-focused accounts, and an additional 500,000 related accounts across Instagram and Facebook. The company is also sharing data with other tech firms via the Tech Coalition’s Lantern program to combat child exploitation across platforms.