In September 2023, the EU fined TikTok €345 million for failing to comply with the GDPR (General Data Protection Regulation), and in particular for serious failings in how the platform was designed to protect the data of underage users. The investigation found that accounts belonging to users aged 13 to 17 were made public by default at registration, meaning that the content they posted was visible to anyone, a serious violation of minors’ privacy. The fine was issued by Ireland’s Data Protection Commission (DPC) and is the largest TikTok has received from a regulator to date. Crucially, this was not a technical mistake: it was a consequence of how TikTok’s platform was designed, and that design has had real effects on the digital privacy of minors.

Platforms set privacy to “public”
TikTok accounts for users aged 13 to 17 were set to public by default, and the recommendation system then pushed their content in front of strangers. This is not an accidental slip-up but a design choice that reflects the platform’s commercial purposes.
The rapid pace of technological development on digital platforms, combined with opaque rule-setting and lagging regulatory mechanisms, has allowed platforms to keep eroding the rights and interests of their users, especially marginalised groups such as the young people discussed here. Privacy does not simply mean absolute control over information; it means allowing information to circulate in the appropriate context. By the same token, an invasion of privacy is not only a matter of whether information has leaked, but also of whether the context in which it is used has been changed without the user’s knowledge and consent. Adolescents upload short videos in order to socialise, but TikTok places those videos in an algorithmic recommendation system and pushes them to strangers, destroying the social context they were made for. This goes against the contextual norms described by Nissenbaum (2018), and the flow of minors’ content to unanticipated audiences raises serious privacy concerns for them.
So does setting a minor’s account to public by default violate the minor’s privacy? Digital platforms routinely violate users’ privacy by breaking the original social context of their content through algorithmic recommendation and other means. For minors, uploading a video is an act of social interaction, not consent to be analysed and profiled by an algorithm. Even if TikTok considers the setting “legal”, this contextual misalignment amounts to an invasion of minors’ privacy.
The private governance of “default settings”

On TikTok, accounts for 13- to 17-year-olds were set to public by default, meaning that the content they posted could be freely viewed, commented on, and even commercially exploited by strangers. A TikTok spokesperson said: “We made changes to these features and settings before the investigation began, such as setting all accounts under the age of 16 as private by default.” In fact, TikTok had long avoided responsibility for privacy by blurring the line between “user responsibility” and “platform responsibility”, until the EU fine forced it to change some of its practices. The platform claims that users can change these settings, but this apparent freedom of choice exists only on the surface. Most minors are not experienced users of digital platforms; it is hard for them to keep track of data usage policies, let alone change privacy settings. That makes TikTok’s claim that users can adjust their own account privacy largely a moot point.
Platforms are not neutral tools but decision-makers with governance powers. Suzor (2019) notes that “there is no such thing as a ‘neutral’ platform; all platforms make decisions about their rules and technical design that determine the types of content people can publish and the types of content that are visible.” Teenagers are not sophisticated “data consumers”: they have a limited understanding of privacy policies and platform algorithms, and they are the user group most susceptible to manipulation by platforms. Meanwhile, family management features are so complex that parents unfamiliar with digital technology cannot use them effectively. All of this shows that the exposure of minors’ privacy is not the result of individual carelessness but of the default design of digital platforms.
TikTok’s rules may look user-controllable, but its real power is hidden in its default settings and algorithmic structure. TikTok does not merely fail to protect the privacy of its users; it actively manages them. Default settings are one manifestation of these rules: they not only guide user behaviour but also shape users’ perception of the “right” way to use the platform, so that the platform’s benefits, such as generating more data and commercialising it, are realised “voluntarily”. This opaque and non-participatory system of governance leaves users unable to know how they are being “governed”, just as minors do not know to whom their TikTok content is being pushed or who is seeing it.
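To make the point about defaults concrete, here is a minimal, purely illustrative Python sketch. It is not TikTok’s actual code, and every name in it is invented; it only shows how a single default value chosen by the platform, rather than any decision by the user, ends up determining who can see a minor’s content unless the user actively finds and changes the setting.

```python
# Illustrative sketch only: a hypothetical model of how a platform-chosen
# default, not a user decision, determines a minor's exposure.
from dataclasses import dataclass

@dataclass
class Account:
    age: int
    is_public: bool

def create_account(age: int, public_by_default: bool = True) -> Account:
    # The starting state is chosen by the platform, not the user.
    return Account(age=age, is_public=public_by_default)

def audience(account: Account) -> str:
    return ("anyone, including recommendation feeds"
            if account.is_public else "approved followers only")

# Most 13-17-year-olds never revisit this setting, so the default decides the outcome.
teen = create_account(age=14)
print(audience(teen))            # -> anyone, including recommendation feeds

safer_default = create_account(age=14, public_by_default=False)
print(audience(safer_default))   # -> approved followers only
```

The only difference between the two accounts above is a default the platform picked; the user did nothing differently.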
How minors are “guided” by algorithms

One of TikTok’s most controversial features is its recommendation algorithm. Once a user’s video receives some initial interaction, it is pushed out to a wider circle of strangers, encouraging the user to post more content in exchange for more traffic. This way of getting users to “expose themselves” continuously and unconsciously in exchange for traffic not only violates their privacy but also amounts to a kind of algorithmic domestication. The platform constructs a cycle of rewards, and this mechanism is not neutral. It is ostensibly designed to enhance the user experience, but in practice it builds an algorithm-driven attention economy in which user behaviour is predicted, guided and even manipulated, and the process often lacks transparency and accountability.
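The feedback loop described above can be illustrated with a small, purely hypothetical Python model. This is not TikTok’s real recommendation algorithm; the parameters (engagement rate, audience boost per interaction) are invented solely to show how early interactions can compound into exposure far beyond anything the poster chose.

```python
# Toy model of an engagement-driven amplification loop (illustrative only,
# not TikTok's actual algorithm; all parameters are invented).
def amplification_loop(initial_audience: int = 50, rounds: int = 5,
                       engagement_rate: float = 0.1,
                       boost_per_interaction: int = 20) -> list[int]:
    audience_sizes = [initial_audience]
    audience = initial_audience
    for _ in range(rounds):
        interactions = int(audience * engagement_rate)     # likes/comments from the current audience
        audience += interactions * boost_per_interaction   # each interaction unlocks a wider push
        audience_sizes.append(audience)
    return audience_sizes

print(amplification_loop())
# [50, 150, 450, 1350, 4050, 12150]: exposure compounds round after round,
# without any further decision by the teenager who posted the video.
```

Even in this toy version, the growth is driven entirely by the loop’s own parameters; the poster’s only act was the initial upload.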
The psychological needs of teenagers using social media are easily exploited by such algorithms, which lead them to “expose themselves” unconsciously in exchange for large traffic rewards. Karppinen (2017) argues that digital platforms are builders of rights environments: the realisation of digital human rights relies on structural conditions, not just on individual behaviour or consent mechanisms. Through complex default settings and opaque data mechanisms, TikTok turns the “right to privacy” into something its technological environment makes unattainable. The insidiousness of recommendation algorithms, which collect, analyse and exploit behavioural data without informing the user, both deepens the invasion of teenagers’ privacy and blurs the platform’s liability for the consequences of its actions. Under a digital human rights framework, we must revisit the supposed “neutrality” of algorithms and insist that platforms take responsibility for the social consequences that recommendation algorithms may cause or have already caused.
When “acceptance of terms” becomes digitally mandatory

Like other platforms, TikTok requires users to accept complex terms of service and privacy policies when they sign up. These terms are written in highly specialised legal jargon that even adult users struggle to understand, let alone teenagers. On the face of it, accepting these terms is a process of “voluntary consent”, but as Marwick and boyd (2019) note, such “informed consent” often functions as a disguised coercive mechanism for marginalised user groups. Teenagers are frequently portrayed as “tech-savvy” without being given actual control over their information; the seemingly empowering label belies their marginalised position in digital society. Rather than giving “active consent”, they passively accept the platform’s invasion of their privacy because they lack the legal awareness and technical ability to protect themselves.
This seemingly “legitimised” structure masks the platform’s leading role by placing responsibility on the user. By having users agree to the terms, the platform shifts the burden onto them, and users, steered by the platform’s logic, are drawn into a form of participation they can neither refuse nor fully understand. Through structural design, the platform creates a forced choice: users can only gain the opportunity to participate and be seen at the cost of losing the right to manage their own privacy and control their own data.
For teenagers, who lack the ability to recognise the consequences of algorithmic manipulation, this is a dangerous bargain. More importantly, adolescents often find it hard to balance the desire to be seen with the need to protect their privacy. Encouraged by algorithms, they choose exposure without understanding how that information will be captured, analysed and monetised by the platform, because they do not understand how recommender systems work and do not know how to set their content to private. Digital platforms are designed to achieve exactly this kind of wider content distribution, and their algorithms reinforce the behaviour: the more visible users become, the more personal information they are likely to expose. This exposure is not entirely a matter of free will; it is the result of the platform’s behavioural steering through traffic incentives.
The EU’s fines are “treating the symptoms, not the cause.”

The EU’s €345 million fine sends a strong signal, but it is only a legal response. The real challenge is to ensure that platforms are designed with human rights in mind from the start, not as an afterthought. While the GDPR emphasises data minimisation and user control, in practice digital platforms can still evade their responsibilities by drafting complex terms and conditions. The TikTok case cannot be solved by after-the-fact fines alone; platform governance cannot rely solely on legal penalties, and human rights principles should instead be built into platform governance and technology policy (Karppinen, 2017).
Nor is this just a TikTok problem; it is a global one, and TikTok is not the only “privacy violator”. Meta, the owner of Instagram, was fined €405 million by Ireland’s data regulator for allowing teenagers to create accounts that publicly displayed their phone numbers and email addresses, a penalty the Data Protection Commission confirmed after an investigation into a potential breach of the EU’s General Data Protection Regulation (GDPR). These incidents point to the same underlying facts: teenagers are a prime group of active users; the more data a platform has, the more money it can make; and the more addicted the users, the more stable the clicks it can count on. Such problems are therefore not the mistakes or negligence of individual platforms, but the inevitable product of the logic of platform capitalism.
Conclusion
The TikTok case is not just a failure of privacy management and settings; it is a wake-up call. Nor is it an isolated case: large digital platforms such as Instagram have also come under regulatory scrutiny over the privacy of teenagers and children. Digital platforms have reshaped the way we connect with others, and when they fail to protect the privacy of their users, especially teenagers and other marginalised groups, that failure is a design feature of a system that lacks ethics. Fines alone cannot fix it.
This is a global ethical crisis for digital platforms. On social platforms dominated by technology and business logic, privacy is no longer simply a matter of “passive protection” but a structural right that needs to be actively designed for. Every time a teenager posts a video on a digital platform, they are not “choosing to be exposed”; they are being structurally directed to expose themselves. Instead of placing the responsibility for privacy protection on the shoulders of teenage users, we need to reconfigure the underlying logic of platform design and governance. As Karppinen (2017) argues, technological governance should not focus only on “efficiency” or “innovation” but should be grounded in human rights, especially for digitally marginalised groups such as adolescents. The root cause of the privacy crisis on platforms lies not in the ignorance of users, but in platforms evading their responsibilities in the name of platform design.
References
Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media and Human Rights (1st ed., pp. 95–103). Routledge. https://doi.org/10.4324/9781315619835-9
Marwick, A. E., & boyd, d. (2019). Understanding privacy at the margins: Introduction. International Journal of Communication, 13, 1157–1165. https://ijoc.org/index.php/ijoc/article/view/7053
Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9
Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10–24). Cambridge: Cambridge University Press.
BBC News. (2023). TikTok fined €345m over children’s data privacy. https://www.bbc.com/news/technology-66819174
Milmo, D. (2022). Instagram owner Meta fined €405m over handling of teens’ data. The Guardian. https://www.theguardian.com/technology/2022/sep/05/instagram-owner-meta-fined-405m-over-handling-of-teens-data