
On September 17, 2024, Meta (2024) announced Instagram Teen Accounts, which automatically place teens into built-in protections and are meant to reassure parents that teens are having safe experiences. On April 8, 2025, Meta (2025) added further protections, so that teens under 16 cannot go Live or turn off protections against unwanted images in DMs without a parent's permission. Meta also announced that it would begin making Teen Accounts available on Facebook and Messenger.
What Are "Instagram Teen Accounts"?
According to Figure 1 and Video 1, Instagram Teen Accounts, created by Meta, automatically apply settings that limit who teens can contact and what content they can see, which is intended to protect teens from harmful or unhealthy content online. Parents can also decide when Instagram should be locked and how long it can be used. The safety settings of these accounts can only be changed with a parent's approval, so parents can supervise, in real time, whether their kids are browsing safely. The main purpose of Instagram Teen Accounts, then, is not only to reassure parents but also to protect teens from the negative effects of a complicated online environment and to guide them in exercising their digital rights.
Why Did Meta Create "Teen Accounts"?
The biggest reason is that digital rights matter to all digital media users, and especially to teens, who have less experience in judging content and protecting themselves online. Digital rights, understood as human rights in the internet environment, include basic elements such as access, privacy, security and freedom of speech, which are fundamental to every participant's online activity. How to protect these rights has received far more attention than before. Yet teens, although among the most active and fluent users of digital platforms, are often ignored or excluded from decisions about how those platforms are designed and governed. Creating Instagram Teen Accounts, and expanding the protections to cover more online activities that may harm teens, can therefore keep them away from harmful, irrational or extreme content on the internet, at least to some extent.
In the internet environment, privacy issues acquire a new significance through the volume of information that can be provided, the trade-offs between privacy rights and access to free online services, and the scope for commercial interests and government agencies to use big data for personal profiling in ways to which the user has not given informed consent (Flew, 2021).
On Instagram, people tend to post what they like or what is happening, and everyone can read it at the same time. In that moment, they may ignore how much privacy they are exposing and say, "Who cares? I don't have anything to hide." But privacy isn't about hiding; it's about having control over your personal boundaries. Every click, every like and every scroll is collected as data and used to analyze your preferences and match you with recommendations and advertisements. Because teens are so engaged on Instagram, their information and preferences are more likely to be targeted by advertisers and harmful content creators, which unconsciously pushes them to give up their boundaries. With accounts private by default, teens need to approve new followers, and people who don't follow them can't see their content or interact with them (Meta, 2024).
Security closely follows privacy whenever we mention digital rights. Security does not simply mean staying free of hacking or computer viruses; it covers everything from information leakage to harm to mental health. For teens, security can be broken into three elements: protecting their mental health from online harassment or bullying, avoiding creepy behavior such as unsolicited DMs or image-based abuse, and preventing schools or apps from collecting unnecessary data without consent. Research shows that many teens, especially girls and LGBTQ+ youth, are more likely to experience harassment online (Tumber & Waisbord, 2017). Before Teen Accounts were released, accounts were public as soon as they were signed up. Teen Accounts now set up a boundary, maintained by Meta and parents, that reduces teens' exposure to unrelated groups and commercial organizations so they can engage and interact in a friendlier, cleaner online environment.
Are "Teen Accounts" Effective?
The answer depends on whom you ask. Meta says that, since September, there have been at least 54 million active Teen Accounts globally, and that 97% of teens aged 13-15 have stayed within the built-in restrictions (Meta, 2025). After months in practice, though, opinions differ on whether the accounts are effective. Let's see what people say.
In Video 2, Natasha praises Teen Accounts highly and points to filtered content, message protection and sleep mode as her favorite features. She also says her relationship with her daughters has grown closer, because Teen Accounts opened a door for her to learn what her daughters are interested in and connect with them more. From a parent's perspective, Teen Accounts work as an automatic filter that screens out some harmful content and creepy adults by default, which reassures parents and helps them protect their children from dangers online. At the same time, Teen Accounts let parents control when their kids use Instagram and what they do there, which helps them stay connected with their children and supervise them more closely.
Unlike parents, however, many experts take a skeptical view of Teen Accounts. They admit it is a good start for social platforms to protect teens' digital rights, but whether it actually works remains unclear. "Eight months after Meta rolled out Teen Accounts on Instagram, we've had silence from Mark Zuckerberg about whether this has actually been effective and even what sensitive content it actually tackles," said Andy Burrows, chief executive of the Molly Rose Foundation (McMahon, 2025). In other words, how Teen Accounts operate has not been explained or demonstrated to the public in any transparent way.
“For these changes to be truly effective, they must be combined with proactive measures so dangerous content doesn’t proliferate on Instagram, Facebook and Messenger in the first place,” said Matthew Sowemimo, the associate head of policy for child safety online at the NSPCC (Milmo, 2025).
Platforms like Facebook and Instagram aren't public spaces; they're corporate environments governed by internal policies that are often opaque and inconsistent (Suzor, 2019). Moderation is outsourced, automation makes mistakes, and users rarely know what they can appeal. As Sowemimo points out, Teen Accounts are only one expression of the responsibility that companies like Meta should already have taken for teenagers' digital rights online. What really matters is that these companies and platforms consciously regulate the content people post and how users behave, in order to create a friendly and clean environment not only for teens but for everyone.
Whether this is an effective practice therefore still needs serious consideration. From a user's perspective, Teen Accounts give parents an overall view of what their children experience on Instagram, whom they connect with and when they can use the app, which eases parents' concerns about the dangers their kids may encounter. Teen users, in turn, can stay in a healthier online environment thanks to the default restrictions. But another question remains: will teens follow the rules, or will they "find a way around safety settings" (McMahon, 2025)?
Viewed more objectively, the experts' argument does make sense. Because of the lack of transparency about how Teen Accounts operate, we can't tell whether the content that appears in them actually meets the promised safety standards. Digital platforms make their own rules, and audiences and users are expected to accept and follow them. So it is not Teen Accounts themselves, but companies like Meta and governments taking responsibility for setting safety standards and supervision, that will make a real difference to the protection of teens' digital rights.
As a result, Teen Accounts cannot yet be widely accepted as an effective way to protect teens' digital rights. There is still a long way to go. But it is a good start.
What Should We Do?
Given that Teen Accounts still have limitations, what can we do to improve things?
If you are a teen, the first step is to realize how important it is to protect your own digital rights.
Know Your Settings! Go through your account privacy settings: make your profile private, limit who can DM or tag you, and review permissions for third-party apps.
Be Aware of Data Collection! Every time you use a digital platform, your data is collected. Read the fine print (or the summaries) and turn off anything you don't need.
Speak Up and Push Back! Whether it's reporting a problem to the platform or speaking out on social media, your voice matters.
Support Each Other! Talk to friends about privacy and security, share tips, and back each other up when someone is being targeted online.
For adults, companies and governments, it is your duty to set a good example for teens and build strong walls to protect them.
Strengthen digital privacy laws, especially for under-18s.
Limit corporate data collection, especially in schools.
Require platforms to build stricter safety settings into their design.
Fund digital media literacy education for adults that goes beyond basic safety.
Conclusion
In short, Teen Accounts are a good example, and a good start, for digital platforms taking the protection of teenagers' digital rights seriously. To some extent, they make a difference by drawing a boundary between the content teens can see and the content they cannot reach. They also give parents an opportunity to understand their children and a way to guide how they navigate the internet. However, Teen Accounts still have limitations, and addressing them will require the cooperation and commitment of different parts of society.
As digital technologies continue to shape the way young people communicate, learn and express themselves, it is imperative that we recognize teenagers not merely as passive users but as active participants and stakeholders in the digital ecosystem. Ensuring their privacy, security and digital rights is not a secondary concern; it is fundamental to building a safe, healthy and equitable online environment. To achieve this, platforms must adopt transparent and youth-informed policies, educators must prioritize digital media literacy, and policymakers must advance protections that reflect the realities of young users today. Ultimately, safeguarding the digital rights of teenagers is not just about responding to current risks; it is about empowering the next generation to navigate and shape the digital world with confidence and dignity.
References
Flew, T. (2021). Regulating Platforms. John Wiley & Sons.
Tumber, H., & Waisbord, S. (2017). Media and human rights. Wiley.
Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10–24). Cambridge University Press.
Meta. (2024, September 17). Introducing Instagram Teen Accounts: Built-In Protections for Teens, Peace of Mind for Parents. https://about.fb.com/news/2024/09/instagram-teen-accounts/
Meta. (2025, April 8). We’re Introducing New Built-In Restrictions for Instagram Teen Accounts, and Expanding to Facebook and Messenger. https://about.fb.com/news/2025/04/introducing-new-built-in-restrictions-instagram-teen-accounts-expanding-facebook-messenger/
McMahon, L. (2025, April 8). Meta expands Teen Accounts to Facebook and Messenger. BBC. https://www.bbc.com/news/articles/cvgqe6yv0yzo
Milmo, D. (2025, April 8). Meta blocks livestreaming by teenagers on Instagram. The Guardian. https://www.theguardian.com/technology/2025/apr/08/meta-blocks-livestreaming-by-teenagers-on-instagram
Instagram. (2024, September 26). Instagram Teen Accounts. YouTube. https://www.youtube.com/watch?v=A9PUMeAQRbA
Meta. (2025, April 8). We’re Introducing New Built-In Restrictions for Instagram Teen Accounts, and Expanding to Facebook and Messenger. https://about.fb.com/wp-content/uploads/2025/04/04_Parent-Spotlight.mp4?_=1