
Do you really agree?
Do you remember the last time you ‘agreed’? Maybe it was accepting your classmates’ point in class. Maybe it was allowing your friends to use your personal belongings. Or maybe it was agreeing to the privacy terms of an online platform when you logged in. When you consent offline, you usually understand the specifics and are certain of your consent. However, when you click ‘I agree’ on an online platform, are you really agreeing?

Source: The Grounds of Alexandria (n.d.).
When faced with a privacy policy that runs to thousands of words, most people simply scroll to the bottom and accept it without reading. As a result, few people ever read the terms and conditions carefully. In other words, by the time we casually ‘agree’ to an online platform, we have already legally handed over our data to it. With this information, platforms know almost everything about us: our home address, what kind of content we like to watch, even what time we go to bed at night.
What’s worse, this ‘consent’ doesn’t give us real freedom of choice. If you reject the privacy terms, the platform denies you access to its services and forces you off the page, which can make it impossible to carry on with your normal online socializing, work and life. Nor can users change a single word of the terms: the options are set in advance by the platform, and you either accept them in their entirety or are locked out.
In other words, privacy freedom is an illusion. This post will explore how digital platforms silently extract our personal data and monetize it—without our full awareness or control.
The power play behind ‘I Agree’
Behind every seemingly simple button lies a complex power play, and users are likely to give up privacy and rights without realizing it. Dark patterns are a common design tactic across platforms: interfaces that deliberately place obstacles or barriers in the user’s path to make a particular action harder to take (Deceptive Patterns, 2025). By building deliberately convoluted processes, they lead users to surrender their right to choose without noticing. There appears to be a choice, but in practice there is none.

In January 2022, the French regulator CNIL imposed a record €210 million in fines on Google and Facebook for deliberately designing misleading interfaces that made it difficult for users to refuse the collection of ‘cookies’ (small files that track a user’s behavior). When users visit these websites, ‘accept all cookies’ is a single, prominent click. But users who want to refuse tracking find no equally accessible ‘reject’ button (BBC News, 2022). On the surface, people clicking ‘agree’ are making their own choice; in truth, the platforms are quietly steering them. As BBC News (2022) points out, this manipulative design violates the principle of informed and freely given consent required by the EU’s General Data Protection Regulation (GDPR).
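The asymmetry the CNIL objected to can be captured in a toy model. The sketch below is purely illustrative (the class, method names and step counts are hypothetical, not any platform’s actual code): accepting takes one interaction, while rejecting is buried behind extra navigation.

```python
# Toy model of an asymmetric cookie-consent flow (hypothetical; it
# illustrates the dark pattern the CNIL fined, not real platform code).

class ConsentBanner:
    def __init__(self):
        self.clicks = 0
        self.consented = None  # None until the user decides

    def accept_all(self):
        """'Accept all' is a single, prominent action."""
        self.clicks += 1
        self.consented = True

    def reject_all(self):
        """Rejecting is hidden behind extra navigation steps."""
        self.clicks += 1  # open 'Manage options'
        self.clicks += 1  # untick the pre-selected purposes
        self.clicks += 1  # finally click 'Confirm choices'
        self.consented = False

accepting = ConsentBanner()
accepting.accept_all()

rejecting = ConsentBanner()
rejecting.reject_all()

# Equal rights on paper, unequal effort in practice: the design
# nudges users toward 'accept'.
print(accepting.clicks, rejecting.clicks)  # prints: 1 3
```

The point of the sketch is that both outcomes are formally available, yet the cost of reaching them differs, which is exactly what GDPR’s ‘freely given consent’ standard forbids.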
Beyond that, even when people do agree to a privacy policy, most have no clear idea how much power they are granting the platform. In 2022, Australia’s second-largest telecom company, Optus, suffered a massive data breach that exposed the sensitive information of nearly 10 million subscribers, including ID numbers, home addresses and birthdates (Slater and Gordon, 2025). Many users were shocked to learn how much data the company had collected without their knowledge (Ratnatunga, 2022). They recalled signing up with a simple click, never realizing that such vast amounts of personal data were being stored. Optus’s privacy policy made no clear mention of how long this data would be retained, who would have access to it, or what safeguards were in place. In other words, users had agreed, but they didn’t know what they were agreeing to. Optus operated under a vague and outdated privacy framework that allowed excessive data harvesting by default. This wasn’t informed consent; it was quiet compliance, engineered through legal complexity and user inertia.
These examples reveal a consistent strategy: digital platforms minimize resistance and maximize data access through quiet manipulation. As Flew (2021, p. 74) points out, the central conflict in platform governance is how to balance economic interests with the protection of users’ privacy. The prevalence of dark patterns suggests that many platforms are more inclined to sacrifice users’ rights in exchange for maximum commercial gain.
Privacy is not a Commodity—it’s a Right

Figure 3. How digital platforms process data. Source: TechFunnel (n.d.).
When privacy is treated as a commodity to be traded away by ‘consent’, it is our rights that are really being sacrificed. Many free platforms demand our data before offering their services. But is this really a fair exchange?
Europe has given a clear answer: privacy is a basic human right, not a product for sale. In 2018, the EU introduced the General Data Protection Regulation (GDPR), which mandates that any collection of personal data be based on clear, informed and freely given consent. The GDPR also grants users further data rights, such as the ‘right to be forgotten’ (requiring platforms to delete personal data) and the ‘right to data portability’ (enabling users to transfer data to other platforms more easily) (Goggin et al., 2017, p. 5).
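These two GDPR rights can be sketched in a few lines. The class and method names below are illustrative assumptions (real platforms implement these behind verified request workflows and formal APIs), but they show the substance of each right: erasure removes the data entirely, and portability returns it in a machine-readable format the user can take elsewhere.

```python
# Minimal sketch of the two GDPR user rights discussed above
# (hypothetical names; not any platform's actual implementation).
import json

class UserDataStore:
    def __init__(self):
        self._records = {}  # user_id -> collected personal data

    def collect(self, user_id, data):
        """The platform accumulates personal data over time."""
        self._records.setdefault(user_id, {}).update(data)

    def erase(self, user_id):
        """Right to be forgotten: delete all personal data on request."""
        self._records.pop(user_id, None)

    def export(self, user_id):
        """Right to data portability: return the data in a
        machine-readable format (JSON here)."""
        return json.dumps(self._records.get(user_id, {}))

store = UserDataStore()
store.collect("u1", {"email": "user@example.com", "bedtime": "23:30"})
portable = store.export("u1")  # the user can take this elsewhere
store.erase("u1")
print(store.export("u1"))  # prints: {}
```

The design point is that both rights put the user, not the platform, in control of the data’s lifetime and location.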

In contrast, privacy protection and governance in other countries still lag behind. The Optus breach exposed the gaps in Australia’s legal protections: after the incident, the regulator found that the country’s current privacy framework simply could not hold the platform effectively accountable. This is what happens when user data is treated not as something to be protected but as an asset to be mined, stored and sold. Nor is the breach an isolated symptom: people browsing online platforms in Australia often start receiving nuisance phone calls soon after logging in or registering an account. All of this stems from the fact that Australia’s legal provisions on privacy protection remain incomplete, so platforms face little risk of penalty even when they buy and sell data. As a result, digital platforms in Australia can comfortably ignore users’ basic rights and exploit their data for profit.
Admittedly, in today’s information age, big data is the new oil, a precious resource. But that does not make privacy a tradeable item. Goggin et al. (2017, p. 5) argue that protecting privacy in the digital age is not just about shielding individuals; it goes to the quality of social justice and democracy. The real issue in platform governance is that privacy should be treated as a right protected by regulation, not as a commodity for platforms to peddle at will. Unless we explicitly reject this logic of privacy-as-transaction, users will remain at a disadvantage and platforms will keep profiting from their data with ease.
Privacy is a Luxury?
However, even if privacy is recognized as a fundamental right, it is not enjoyed equally, especially on digital platforms. In practice, privacy is more of a privilege than a right everyone holds equally. Marwick and boyd (2018) argue that privacy has become a luxury that only those of higher socio-economic status can afford to protect, while vulnerable groups are often unable to refuse being monitored through their data.
In platform societies, ordinary or marginalized individuals frequently struggle to keep control over their personal information. One key reason is that vulnerable groups often lack digital literacy. Many older people, for example, do not understand how their personal data is collected and used when they browse websites or shop online, and so readily hand over sensitive information. Without the skills to navigate complex privacy settings, these users are far more exposed to data misuse and leakage.

Secondly, when ordinary people want to gain attention, traffic or economic opportunities through social media, they are usually forced to hand over more private information to the platforms. Ordinary streamers on live-streaming platforms, for example, must provide a great deal of private information, such as their real names, addresses and even bank account details, to pass the platform’s vetting and gain referrals. The privileged, by contrast, whether celebrities, social elites or affluent users, do not need to expose so much. They have more resources to manage their online privacy, and platforms tend to protect their information proactively to preserve the platform’s public image and financial interests. In short, privacy becomes a form of digital privilege.
This difference also shows in how platforms respond to privacy breaches. The Optus data breach is a striking example of the divide: ordinary users who suffer a breach or misuse often receive vague excuses such as ‘technical error’, or are blamed for ‘user negligence’. The incident affected millions of Australians, and the burden of risk and uncertainty fell squarely on ordinary people while platform responsibility remained minimal. When a privacy incident involves the privileged, by contrast, platforms usually respond quickly, take timely protective measures and handle the matter efficiently. This differential treatment reinforces the vulnerable position of the underprivileged in digital society.
This privacy inequality in platform governance no longer stops at the individual level; it in fact further entrenches social inequality and makes marginalised groups even more vulnerable in the social structure. Privacy protection is no longer just an individual issue, it is a widespread injustice and discrimination that exists in our society.
What kind of platform governance do we need?

Source: Institute for Digital Transformation (n.d.).
To address these issues, we must fundamentally rethink how platforms are governed. A more ethical and democratic governance model should revolve around two core principles: transparency and user empowerment.
Firstly, platforms should clearly disclose how they collect, process, and use user data, as well as the rules behind content moderation. At present, algorithms and data processing systems operate as black boxes. Users have no way of knowing where their data is going. This needs to change so that users have a clear understanding of how platforms use their data.
Secondly, stronger and clearer legal frameworks are essential. The EU’s GDPR provides a good example, forcing platforms to respect users’ right to informed consent and data controllability (GDPR-Info.eu., 2025). Such a legal framework can effectively reduce arbitrary invasions of privacy rights by platforms and force them to take actual responsibility for protecting user data.
Finally, platform governance should give users real control over their data. Users should be able, for example, to easily delete their data or freely transfer it to another platform, rather than being offered nothing beyond ‘I agree’.
Implementing these changes is entirely possible. They would not only boost transparency and rebuild user trust but also place real limits on platform overreach. More importantly, they would mark a critical step toward treating privacy as a right worth protecting.
Let’s stop pretending we have a choice

Source: ScholarBlogs (2023).
Privacy should never be a commodity that can be traded, let alone a privilege that only a select few can enjoy. The future of digital platforms should be a more transparent and fair space. Each of us has the right to know exactly where our data is going and how it is being used. We should also have the right to freely reject these unfair terms without being forced to give up the basics of life in the digital age as a result. As ordinary users, we must care about our privacy, demand transparency in platform governance, and push for stronger legal protections. These aren’t just personal concerns—they’re the foundation of digital justice.
Let’s stop pretending that we have a choice; it’s time to take back the real freedom that belongs to us!
References
Auth0. (2022, October 17). Practical privacy: A guide for everyone [Blog image]. https://auth0.com/blog/practical-privacy-a-guide-for-everyone/
BBC News. (2022, January 6). Google and Facebook fined over cookie tracking consent by French regulators [News image]. https://www.bbc.com/news/technology-59909647
BBC News. (2022, January 6). Google and Facebook fined over cookie tracking consent by French regulators. BBC News. https://www.bbc.com/news/technology-59909647
Deceptive Patterns. (2025). Obstruction. Deceptive Patterns. https://www.deceptive.design/types/obstruction
Flew, T. (2021). Regulating Platforms (pp. 72–79). Cambridge: Polity Press.
GDPR-Info.eu. (2025). Art. 7 GDPR – Conditions for consent. https://gdpr-info.eu/art-7-gdpr/
Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital rights in Australia (p. 5). Sydney: University of Sydney. https://ses.library.usyd.edu.au/handle/2123/17587
Institute for Digital Transformation. (n.d.). Platform governance in organizations [Web image]. https://www.institutefordigitaltransformation.org/platform-governance-in-organizations/
Marwick, A. E., & boyd, d. (2018). Privacy at the margins. International Journal of Communication, 12, 1157–1165.
Ratnatunga, J. (2022, October 5). Optus data hack: The dark side of invading social media privacy. CMA Australia On Target. https://ontarget.cmaaustralia.edu.au/optus-data-hack-the-dark-side-of-invading-social-media-privacy/
ScholarBlogs. (2023, October 21). Human right [Blog image]. https://scholarblogs.emory.edu/queercultures101/2023/10/21/human-right/
Slater and Gordon. (2025). Optus data breach class action. Slater and Gordon. https://www.slatergordon.com.au/class-actions/current-class-actions/optus-data-breach
TechFunnel. (n.d.). Transaction processing system [Web image]. https://www.techfunnel.com/fintech/transaction-processing-system/
The Grounds of Alexandria. (n.d.). Homepage screenshot [Screenshot]. https://thegrounds.com.au/