The Price of Connection: Privacy, Security, and Digital Rights

Have you ever experienced this? You’re happily scrolling through TikTok when suddenly the next video promotes the exact product from a clip you just watched and liked. Or after scanning a QR code to order food, you’re prompted to provide your email, phone number, and other personal details before checkout. Maybe you signed up for a membership, only to find your inbox flooded with unsolicited promotional emails—with no option to unsubscribe… Some shrug it off: “I’ve got nothing to hide anyway.” Others quip, “Who even cares about privacy these days?” But perhaps this indifference is merely resignation—a silent surrender to the fact that our privacy is no longer safeguarded online.

So today, do we truly have the freedom to choose whether we’re “seen”? Reality might be far more unsettling than you think.

1. ‘Who makes the rules?’ The platform’s rules are not up to you at all

On the surface, social media platforms appear relatively ‘neutral’: users can speak freely, adjust their privacy settings, delete their accounts, and so on. It may seem like everything is in the user’s hands. But is that really the case?

As early as 2009, Facebook’s CEO Mark Zuckerberg put forward a seemingly ‘democratic’ idea: to make Facebook a more democratic platform, users could participate in shaping its Terms of Service by voting on proposed changes. What looked like an experiment in digital democracy turned into a farce. Facebook set the voting threshold very high: more than 30% of active users had to vote for the result to be binding. At the time, that meant hundreds of millions of users participating at once, an almost impossible task.

Three years later, when a vote on a proposed change exposed the problem, Facebook repudiated the promise and blamed it on a former employee (Suzor, 2019). That was the end of this attempt at user participation in rule-making.

This incident reveals a clear problem: the terms of service we see in apps are not negotiated between you and the platform; they are pre-written by the company as a ‘user contract’. By clicking ‘Agree’, you accept all of them at once, whether that means the platform collecting, using, or even selling your usage data, or deleting your posts and blocking your account without prior notice.

On the face of it, joining the platform is your choice. In reality, the platform has already decided what you have to accept.

What’s even more problematic is that these rules are usually buried in dozens of pages of privacy terms that most people never read. And even if you do read them all and find something you’re not happy with, you can’t use the app unless you accept the contract. Do you think you can avoid exposure by turning off location in your settings and never liking, favoriting, or commenting? In reality, the platform can deduce your preferences through its algorithms alone. It tracks how long you linger on each video or post, whether you open the comments, whether you skim quickly, whether you speed up playback, and so on. These signals are enough for the platform’s algorithms to build a user profile of you, as the sketch below illustrates.
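
To make that concrete, here is a minimal, hypothetical sketch in Python of how passive signals alone can rank a user’s interests. Every field name and number here is invented for illustration; real recommender systems use far richer signals and far more sophisticated models.

```python
from collections import defaultdict

# Hypothetical viewing events: no likes, comments, or shares involved.
events = [
    {"topic": "sneakers", "dwell_seconds": 45, "video_length": 50, "opened_comments": True},
    {"topic": "cooking",  "dwell_seconds": 3,  "video_length": 60, "opened_comments": False},
    {"topic": "sneakers", "dwell_seconds": 30, "video_length": 30, "opened_comments": False},
]

profile = defaultdict(float)
for e in events:
    # How much of the clip was actually watched (capped at 100%).
    completion = min(e["dwell_seconds"] / e["video_length"], 1.0)
    # Opening the comments is treated as an extra signal of interest.
    score = completion + (0.5 if e["opened_comments"] else 0.0)
    profile[e["topic"]] += score

# Rank topics by inferred interest: 'sneakers' wins without a single like.
print(sorted(profile.items(), key=lambda kv: kv[1], reverse=True))
```

Even this toy version surfaces ‘sneakers’ as the top interest from dwell time alone. The point is that silence is still data.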

So the control we seem to have over our privacy and digital rights is an illusion, created by algorithms, commercial interests and opaque rules.

2. The illusion of choice: most people are compelled to share their data

But there’s another view you often see online: ‘Don’t like being tracked? Then don’t use these platforms.’ It makes opting out sound like clicking a button, but for the vast majority of users, privacy isn’t a matter of ‘wanting to’; it’s a matter of having no choice.

In the apps we use every day, data is often shared not ‘by choice’ but out of necessity. As Marwick and Boyd (2018) document, marginalised groups often have to hand over personal information in exchange for access to basic services, such as applying for a job, government assistance or medical support. In the United States, for example, some employers require job applicants to provide a personal credit report even though it has nothing to do with their ability to do the job; refuse to submit it, and you may not even get an interview (Marwick & Boyd, 2018).

A similar situation exists in Australia. A University of Sydney survey showed that 57 per cent of respondents were concerned about online privacy in general, and distrust of data collection and use by businesses was especially high. Yet only 38 per cent of respondents felt genuinely in control of their privacy (Goggin et al., 2017). People on low incomes, the elderly, and those from non-English-speaking backgrounds are particularly vulnerable: when they access government services or digital platforms, they struggle to understand complex privacy terms and have to agree passively, or else lose access to basic services.

In other words, ‘privacy’ has become a quid pro quo. You give up pieces of personal information as the price of admission to work, everyday services or healthcare. In such an environment, choice itself becomes a privilege.

So the next time you see someone’s privacy being abused, don’t be so quick to blame them with ‘Who told you to use it?’ For many people, life no longer allows them to say no. As Marwick and Boyd (2018) point out, most of us surrender our privacy neither in complete ignorance nor in outright resistance. We live in a world where we must rely on data platforms: we know the risks, but we have no real alternative. All we can do is hope that the platforms don’t abuse their power.

3. Privacy is a resource and a privilege in the digital society

As mentioned before, platforms interfere with users’ digital rights by controlling information through rules they write themselves. Privacy is no longer an equal right for all. This begs the question: who really has the power to ‘protect privacy’? And who is responsible when users’ privacy is violated?

Privacy is widely described as a right we are all born with equally, but in today’s digital society it has become a resource. If you want to protect your privacy, you have to be tech-savvy, understand complicated terms and conditions, have time to study app settings, and even be able to afford not to use certain services.

As Kari Karppinen argues in the chapter ‘Human Rights and the Digital’, privacy is not just your freedom from being monitored; it also includes positive rights: the structural prerequisites needed to realise that freedom (Karppinen, 2017). Simply telling people ‘you can turn off location’ or ‘log out of the app’ is not enough. True digital privacy means having active control over one’s data life. If people don’t have fair access to the internet, understandable contractual terms, protected spaces for speech, and regulation of platform power, then even if they theoretically have a right to privacy, they can’t really exercise it. This is why ‘positive rights’ matter in a digital society: the real realisation of the right to privacy relies on a whole set of basic conditions, such as adequate equipment, the ability to understand what is happening, and the backing of the law, rather than on unilateral choices or efforts by the individual (Karppinen, 2017).

But in reality, many people struggle even to understand the terms of service, let alone protect themselves through complex laws or systems. Children can be lured into spending large amounts of money in ‘free-to-play’ games (ACCC, 2023), and many people fall prey to online scams because they are unfamiliar with the risks to their privacy. Since 2020, the Australian Federal Police have prevented approximately $83 million from falling into the hands of cybercriminals, and even that is just the tip of the iceberg (AFP, 2024). These phenomena are common today; in my own neighbourhood there have been two fraud cases, with losses running to hundreds of thousands of RMB.

These cases illustrate one point: privacy is not an equal right by default, but something that requires ability, resources and institutional guarantees to be truly realised. It is therefore not enough for platforms simply to provide an ‘opt out’, ‘settings’ or ‘decline’ button. Platforms and policymakers should consider how to give people the support and protection they need to actually exercise their digital rights.

4. ‘Networked privacy’: we need a more sophisticated understanding and design

So far, we have seen how platforms make rules that override users’ rights, why users cannot really ‘opt out’, and how privacy has become a resource in practice. Now we need to think further: is privacy really a ‘personal’ issue?

Maybe you’ve thought: ‘I never share my life on the platform, and I don’t turn on my location, so there’s no way my information can be leaked online.’ But the truth is that privacy is no longer a personal issue.

As Marwick and Boyd (2018) suggest, privacy today is no longer a matter of individual technical actions but a ‘networked practice’. Even if you are careful never to share your life online, you can’t stop friends and acquaintances from @-mentioning you, uploading your photos, or discussing your life on social media. Your data will still be exposed (AFP, 2024).

In other words, however careful a person is, they cannot stop their relationships from pulling them into online data. This further extends the earlier point that ‘privacy is not up to you’: even if you never voluntarily surrender your privacy, the people and world around you will expose you to the network.
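
A toy example shows how little a person’s own silence protects them. In the hypothetical Python sketch below, every name and field is invented: ‘bob’ posts nothing himself, yet his friends’ tagged posts are enough to guess his city.

```python
from collections import Counter

# Hypothetical posts by bob's friends. bob himself has no activity at all.
friends_posts = [
    {"author": "alice", "tags": ["bob"], "location": "Sydney"},
    {"author": "carol", "tags": ["bob"], "location": "Sydney"},
    {"author": "dave",  "tags": ["bob"], "location": "Melbourne"},
]

# Count where bob keeps being tagged; the majority is a guess at his city.
sightings = Counter(p["location"] for p in friends_posts if "bob" in p["tags"])
print(sightings.most_common(1))  # [('Sydney', 2)]
```

Nothing here needed bob’s consent: the inference runs entirely on other people’s choices, which is exactly what ‘networked privacy’ means.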

This is particularly complex for marginalised social groups. Some users in difficulty are forced to disclose their medical conditions, family income and even personal details in order to receive financial support on a platform; some communities, such as Australian Aboriginal communities, may place greater emphasis on ‘group privacy’; and sometimes ‘being seen’ is itself a way for users to fight against marginalisation. None of these behaviours can be explained by whether a setting is public or private, and they go far beyond the ‘logic of privacy controls’ designed by platforms.

Instead, this tells us that privacy is not just a technical option but a product of social structures, cultural contexts and interpersonal relationships. If technology, law and platform rules ignore these complexities, no real protection of privacy can be achieved.

What should we do when privacy becomes a luxury?

Maybe you’ve had the feeling that the moment you open an app, the whole world seems to know what you’re up to. Or maybe you’ve been asked to fill in a pile of ‘personal information’ just to use a free service. What did you think in that moment?

But the question is, do we really have a choice?

In the digital age, privacy is no longer just about switching off location. For many people, privacy has become something you need time, knowledge and resources to protect. Not everyone has the ability and patience to read and understand dozens of pages of terms of service; not everyone can correctly identify scams; and not every child understands in-game purchase traps.

When you realise that even ‘refusing to share’ has become a privilege, it’s time to make a change.

What can we do?

  • For ourselves: stop thinking ‘I have no privacy anyway, so it doesn’t matter.’ Before authorising an app, take a moment to look at what it asks for, and say no if something feels wrong.
  • For platforms: don’t just offer an ‘opt out’. Simplify the terms and conditions and provide genuinely transparent algorithmic controls, instead of making users spend hours digging through settings before each use.
  • For governments: privacy protection depends not only on technology but also on policy and regulation. Vulnerable groups should be given equitable digital education and corresponding protection mechanisms.

We can’t rely on a single ‘setting’ to protect privacy, nor can we rely on individual choice to resist an entire system. But we can start by questioning “who makes the rules” and by caring about the data security of people around us.

It’s not that we don’t have secrets. We just haven’t really been allowed to keep them.

References

ACCC urges app industry to adopt new principles following “sweep” of children’s game apps. (n.d.). ACCC. Retrieved April 10, 2025, from https://www.accc.gov.au/media-release/accc-urges-app-industry-to-adopt-new-principles-following-sweep-of-childrens-game-apps

AFP takes the fight to cybercriminals in 2024. (n.d.). Australian Federal Police. Retrieved April 11, 2025, from https://www.afp.gov.au/news-centre/media-release/afp-takes-fight-cybercriminals-2024

ByteDance. (2025). AI-generated image using Doubao [AI-generated image]. https://www.doubao.com/

Digital Inequality: Unequal Access to Technology and Internet. (n.d.). Easy-Peasy.AI. Retrieved April 12, 2025, from https://easy-peasy.ai/ai-image-generator/images/digital-divide-contrasting-high-tech-metropolis-low-tech-rural-area

Forget pay to view, pay for privacy is the latest problem for people online. (n.d.). TechRadar. Retrieved April 12, 2025, from https://www.techradar.com/computing/cyber-security/forget-pay-to-view-pay-for-privacy-is-the-latest-problem-for-people-online

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Adele, W., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia. https://ses.library.usyd.edu.au/handle/2123/17587

Karppinen, K. (2017). Human rights and the digital. In The Routledge Companion to Media and Human Rights. Routledge.

Marwick, A. E., & Boyd, D. (2018). Understanding Privacy at the Margins: Introduction. International Journal of Communication (Online), 1157–1166.

OpenAI. (2025). AI-generated image using ChatGPT and DALL·E [AI-generated image]. https://chat.openai.com/

Suzor, N. P. (Ed.). (2019). Who Makes the Rules? In Lawless: The Secret Rules That Govern our Digital Lives (pp. 10–24). Cambridge University Press. https://www.cambridge.org/core/books/lawless/who-makes-the-rules/6688999078ABFE0821E84D76A055BE70

Under the Gun, Facebook Relents on Privacy. (n.d.). WIRED. Retrieved April 12, 2025, from https://www.wired.com/2011/11/facebook-relents-on-privacy/
