
When you scroll through your social feed, search for information, or shop on your phone, every click and every comment quietly leaves a digital footprint. After being analyzed and pieced together, these footprints become valuable “raw material” in the hands of companies. As some economists have put it, “data is the new oil of the digital economy” (The University of Queensland, 2018). In this digital age we enjoy convenience, but we pay for it with our privacy. In this article, I want to talk with you about a topic that seems distant yet is closely tied to all of us: data commodification and user rights. I hope it leaves you with a clearer picture of the issue and some ideas for confronting the privacy crisis of the digital age.
The Data Wave in the Digital Age: The Rise of Data Commodification
Let’s first talk about what data commodification is. Put simply, it means turning your personal information, such as what videos you like to watch, where you often eat and what you like to buy, into something that can be bought and sold. Over the past two or three decades, the rapid development of the Internet has made it easier than ever to find information and communicate. At the same time, companies and platforms have realized that every move a user makes on a platform can be converted into economic value. After running that user information through complex data analysis, they use it for precisely targeted advertising and personalized recommendations.
As Kitchin points out in “The Data Revolution”, we live in an era of “big data”: vast amounts of information are collected, stored and processed, and the commercial potential they contain is something traditional economic models can hardly match (Kitchin, 2014). Through algorithms and data mining, companies can predict user behavior and thereby greatly improve marketing efficiency and conversion rates. In other words, every one of our online actions may be “sold” in exchange for corporate profit.
However, behind this seemingly win-win arrangement lies a serious problem: do users really know how their data is used? In many cases we click “agree” without a second thought and hand over privacy rights that should be protected by law to the data giants.
The loss of user rights in data commodification: privacy versus commercial interests
“Choice” or forced consent? — The dilemma of privacy terms
I am sure many of you have run into this situation: when you sign up for a new app or visit a website, a pop-up appears asking whether you accept cookies.

Often, the actual privacy policy and terms of service are glossed over in that pop-up. The vast majority of users never click through to read them, let alone work their way through the lengthy documents that appear if they do. Most people simply click “Agree” as quickly as possible so they can start using the service. The problem is that this “agree” is not a real choice; it is closer to a forced compromise.
As Suzor points out, the real rules of social media platforms are messy and contested, which makes them difficult for ordinary users to understand (Suzor, 2019). As a result, users unknowingly hand the rights to their data over to the platform, and the platform converts that data into commercial gain behind the scenes. In other words, while we appear to enjoy the service “for free”, we are actually paying with our privacy.
The black-box operation of the data trading market

Data is not only a tool for making advertising more efficient; it is also a commodity that can be openly traded. Data brokers build detailed user profiles by integrating information from different platforms and then sell this data to advertisers, credit agencies and even political consulting firms.
Data brokers such as Acxiom, Oracle and Experian, for example, run large and largely hidden systems for collecting, processing and selling data. Users are usually unaware that their data has already been sold many times over in this market. Once data is commodified, its privacy dimension is weakened and personal information becomes a “tradable” resource (van Dijck, Poell and de Waal, 2016).
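To make the idea of profile-building more concrete, here is a minimal Python sketch of how scattered records from different sources might be merged into a single profile keyed on a common identifier. Every field name and record below is invented for illustration; real broker pipelines are proprietary and operate at a vastly larger scale.

```python
from collections import defaultdict

# Invented example records, standing in for feeds a broker might buy from
# different sources. Real data sets are keyed on many identifiers (hashed
# emails, device IDs, postal addresses), not just one field.
shopping_data = [
    {"email": "alex@example.com", "source": "retailer", "purchases": ["running shoes", "protein bars"]},
]
location_data = [
    {"email": "alex@example.com", "source": "fitness_app", "frequent_area": "inner-city gym district"},
]
browsing_data = [
    {"email": "alex@example.com", "source": "ad_network", "topics": ["marathon training", "sports watches"]},
]

def build_profiles(*datasets):
    """Merge records that share an identifier into one profile per person."""
    profiles = defaultdict(lambda: {"sources": set()})
    for dataset in datasets:
        for record in dataset:
            profile = profiles[record["email"]]
            profile["sources"].add(record["source"])
            for field, value in record.items():
                if field not in ("email", "source"):
                    profile[field] = value
    return dict(profiles)

profiles = build_profiles(shopping_data, location_data, browsing_data)
print(profiles["alex@example.com"])
# The merged profile now reads like "health-conscious runner in the inner city":
# exactly the kind of audience segment that is attractive to advertisers.
```

The point is not the code itself but how cheaply separate, individually harmless data points combine into something far more revealing.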
This not only leaves users at a disadvantage created by information asymmetry, it also leaves the use of data without adequate oversight and constraint (Flew, 2021). The law can rarely trace every link in the chain through which this data circulates, let alone hold data brokers accountable. In the current digital ecosystem, users’ privacy rights are steadily being marginalized, while companies keep accumulating capital and power through data transactions.
Unequal game between users and platforms
Many digital platforms present themselves as providers of “free” services, but in reality they profit from private data. Platforms use big-data algorithms to pinpoint users’ interests and behavior, giving advertisers highly efficient targeting channels. Users, meanwhile, receive neither corresponding rewards nor control; they can only passively accept the platform’s “services”.
Users, then, always seem to be at a disadvantage. The platform’s power to set the rules and interpret the terms is almost unlimited, and users can only accept them or walk away. Do we still have a real choice? As Zuboff argues in “The Age of Surveillance Capitalism”, through its monopoly on data the platform not only mines our intimate inner lives but also seeks to shape, direct and control them, ultimately forming a new ruling model of “surveillance capitalism” (Zuboff, 2019).
Australia’s metadata retention law: a case study on the imbalance between data commodification and user rights

Australia’s metadata retention law provides a striking real-world example. The scheme came into force in 2015 and requires telecommunications companies and Internet service providers (ISPs) to retain specific user metadata for at least two years. Analyzing this case gives a more concrete sense of how data is turned into a commodity and how that process challenges users’ privacy rights.
The Basics of Metadata Retention Law
Australia’s metadata retention laws require telecommunications operators to store certain information about user communications (Fair, 2015). This information includes:
- The time, date and duration of the communication.
- The identities of the parties to the communication.
- The geographic location of the device used.
Data commodification reflected in the metadata retention law
As noted above, data commodification refers to turning data into a resource that can be bought, sold or traded. In Australia’s metadata retention law, this is reflected in the government’s treatment of metadata as a key asset for law enforcement and intelligence agencies. The “value” of this data lies not only in its immediate uses but also in its potential for wide application. The scheme, however, sweeps in the everyday communications data of all users, regardless of whether they are involved in any criminal activity. This indiscriminate collection highlights the systematic treatment of data as a commodity.
Imbalance of user rights
The metadata retention law exposes a significant imbalance in the protection of user rights, especially in terms of privacy rights. This is particularly reflected in the following points:
- Broad coverage: the law applies to all users, not just criminal suspects, which means everyone’s privacy may be affected.
- Low access threshold: unlike the content of communications, which requires a court order to access, metadata can be obtained without judicial approval. In the early years of the scheme, more than 80 agencies exploited legal loopholes to access metadata, far beyond the original legislative intent of targeting serious crime (Jenkins, 2020).
- Privacy threat: even without the content of communications, metadata can still reveal a detailed picture of a person’s life. By analyzing when and where someone makes their daily calls, for instance, one can infer their work habits, social circle and even personal preferences, as the short sketch below illustrates.
This imbalance of power reflects that national security interests are placed above personal privacy, while users’ control over their own data is weakened.
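To illustrate that last point, here is a small Python sketch over invented call records. It is not any agency’s actual tooling, just a toy demonstration of how bare metadata (who, when, roughly where) can be turned into a crude picture of someone’s routine and social circle.

```python
from collections import Counter
from datetime import datetime

# Invented metadata records: no call content, just who, when and where.
call_records = [
    {"contact": "Sam",    "timestamp": "2025-03-03 08:45", "cell_area": "CBD"},
    {"contact": "Sam",    "timestamp": "2025-03-04 09:10", "cell_area": "CBD"},
    {"contact": "Dr Lee", "timestamp": "2025-03-05 13:00", "cell_area": "Medical precinct"},
    {"contact": "Sam",    "timestamp": "2025-03-05 18:30", "cell_area": "Suburb A"},
    {"contact": "Mum",    "timestamp": "2025-03-05 21:15", "cell_area": "Suburb A"},
    {"contact": "Mum",    "timestamp": "2025-03-06 21:05", "cell_area": "Suburb A"},
]

def infer_patterns(records):
    """Derive rough lifestyle signals from bare call metadata."""
    hour_of = lambda r: datetime.strptime(r["timestamp"], "%Y-%m-%d %H:%M").hour
    contacts = Counter(r["contact"] for r in records)
    daytime_areas = Counter(r["cell_area"] for r in records if 9 <= hour_of(r) < 18)
    evening_areas = Counter(r["cell_area"] for r in records if hour_of(r) >= 18)
    return {
        "closest_contacts": contacts.most_common(2),       # likely social circle
        "likely_work_area": daytime_areas.most_common(1),  # where daytime calls cluster
        "likely_home_area": evening_areas.most_common(1),  # where evening calls cluster
    }

print(infer_patterns(call_records))
# Even this handful of rows hints at a workplace, a home suburb, a close contact
# and a possible medical appointment, all without reading a single message.
```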
Controversy and Challenges
Since its implementation, Australia’s metadata retention law has sparked widespread controversy and legal challenge. Critics argue that this “comprehensive surveillance” approach lacks necessity and proportionality and creates risks of data misuse and leakage. The public has questioned why ordinary citizens’ communications data must be stored for so long when the oversight mechanisms around it remain immature. Although the government has proposed reforms, such as tightening access to the data, these measures have not resolved the underlying problems (Goggin et al., 2017).
Lessons from the case
Australia’s metadata retention law vividly demonstrates the tension between data commodification and user rights. In the digital age, governments and businesses pursue data as a valuable commodity, but that pursuit often comes at the expense of personal privacy. The case reminds us that legal frameworks need to strike a balance between exploiting the value of data and protecting user rights. Only with stricter oversight and transparent mechanisms can we ensure that those who trade in data do not erode basic civil rights.
Australia’s metadata retention law is not an isolated case; many countries have introduced similar legislation. The UK, for example, enacted the Data Retention and Investigatory Powers Act in 2014, requiring ISPs and telecommunications companies to retain communications data to support law enforcement and counter-terrorism, and the law was challenged in court in 2015. How to balance the exploitation of data’s value against the protection of users’ privacy is clearly a global question.
Digital Policy and Governance: Inadequacies of the Existing Framework and Directions for Improvement
Limitations of Existing Regulations
In recent years, some countries and regions have established laws that strengthen the protection of user privacy to a degree. For example:
- California Consumer Privacy Act (CCPA): Californian residents can ask companies what data they have collected about them and can request that it be deleted.
- General Data Protection Regulation (GDPR): users in the EU can access their data and object to companies processing it.
But the reality is not that simple. Although the law gives us rights, there are plenty of gaps in how it works in practice. Many apps still just put an “Agree” button in front of you while the terms read like a novel, and some companies use design tricks, such as carefully chosen button colors, to nudge you toward clicking “Agree”. Google famously tested 41 shades of blue to find the one people were most likely to click (Hu, 2022).
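That color-testing anecdote is essentially a multivariate experiment: each user is assigned one variant, and whichever variant gets the highest click-through rate wins. The sketch below is a toy Python illustration of that mechanic with invented shade values and click probabilities; it is not Google’s actual methodology.

```python
import random
from collections import defaultdict

# Invented hex values standing in for the many shades that might be tested.
shades = ["#1a73e8", "#1b66c9", "#2b7de9"]

impressions = defaultdict(int)
clicks = defaultdict(int)

def serve_button(user_id):
    """Assign each user a shade based on their ID, so repeat visits see the same button."""
    shade = shades[hash(user_id) % len(shades)]
    impressions[shade] += 1
    return shade

def record_click(shade):
    clicks[shade] += 1

# Simulate traffic with invented, arbitrary per-shade click probabilities.
true_rates = {"#1a73e8": 0.30, "#1b66c9": 0.25, "#2b7de9": 0.27}
for user_id in range(10_000):
    shade = serve_button(user_id)
    if random.random() < true_rates[shade]:
        record_click(shade)

for shade in shades:
    rate = clicks[shade] / impressions[shade]
    print(f"{shade}: {rate:.3f} click-through rate")
# The "winning" shade is simply whichever color nudged the most people to click.
```

The same mechanic that optimizes an ad link can just as easily be pointed at a consent button, which is part of what makes dark patterns so hard to regulate.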
Globally, the complexity of data commodification and data trading still makes it hard for traditional laws to bite. Many data brokers and multinational platforms exploit legal loopholes to move user data around the world as a commercial asset, while existing regulations usually reach only specific platforms or jurisdictions (Ayoub and Goitein, 2024).
New model of digital governance: multi-party participation and data citizenship
To address these problems, scholars have proposed new approaches to governance. Some suggest the concept of “data citizenship”: treating users as active participants in the digital ecosystem rather than merely passive providers of information. Giving users greater rights to take part in data governance encourages them to scrutinize and give feedback on how data is collected, stored and used, which helps build a more transparent and fair model of digital governance (Hintz et al., 2022).
In addition, a multi-party governance model is seen as another possible solution. Under this model, governments, enterprises, non-governmental organizations and user representatives all take part in formulating and implementing digital policy, keeping the interests of all parties in balance and promoting data-use rules that serve the public interest. Only in this way can we protect users’ privacy and information security while still encouraging innovation and business development.
Conclusion: Building a fairer and more transparent digital future
In the digital age, data has undoubtedly become a valuable resource, yet user privacy and personal rights are too often forced to give way to commercial interests. The tension between data commodification and user rights reflects the imbalances and loopholes in the current system of digital governance. We need to strengthen privacy protection not only through technology and law, but also by raising public awareness of the value of privacy at the social and cultural level. Only then can we gradually build a fairer, more transparent and more accountable digital ecosystem.
References
ABC News (Australia) (2014). Metadata only retained for ‘only most serious crime’. [online] YouTube. Available at: https://www.youtube.com/watch?v=9odwhHo7jsM [Accessed 12 Apr. 2025].
Australian Government (2015). Telecommunications (Interception and Access) Amendment (Data Retention) Act 2015. [online] Federal Register of Legislation. Available at: https://www.legislation.gov.au/C2015A00039/latest/text.
Ayoub, E. and Goitein, E. (2024). Closing the Data Broker Loophole. [online] Brennan Center for Justice. Available at: https://www.brennancenter.org/our-work/research-reports/closing-data-broker-loophole.
Fair, P. (2015). Australia’s New Metadata Retention Laws. [online] Connect On Tech. Available at: https://connectontech.bakermckenzie.com/2015-12-22-australias-new-metadata-retention-laws/ [Accessed 11 Apr. 2025].
Flew, T. (2021). Regulating Platforms. John Wiley & Sons.
Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L. and Bailo, F. (2017). Digital Rights in Australia. [online] Sydney: University of Sydney. ISBN 978-0-646-98077-5.
Hintz, A., Dencik, L., Redden, J., Treré, E., Brand, J. and Warne, H. (2022). Civic Participation in the Datafied Society: Towards Democratic Auditing? [online] Data Justice Lab. Available at: https://datajusticelab.org/wp-content/uploads/2022/08/CivicParticipation_DataJusticeLab_Report2022.pdf.
Hu, W. (2022). Multivariate test — Google’s 41 shades of blue. [online] Medium. Available at: https://medium.com/@whystudying/multivariate-test-googles-41-shades-of-blue-846d2c5781a8.
Jenkins, S. (2020). Departments slammed over management of data-retention laws. [online] The Mandarin. Available at: https://www.themandarin.com.au/126374-departments-slammed-over-management-of-data-retention-laws/ [Accessed 11 Apr. 2025].
van Dijck, J., Poell, T. and de Waal, M. (2016). De platformsamenleving: Strijd om publieke waarden in een online wereld [The platform society: the struggle over public values in an online world]. Amsterdam: Amsterdam University Press. doi:https://doi.org/10.5117/9789462984615.
Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures & Their Consequences. SAGE Publications Ltd. doi:https://doi.org/10.4135/9781473909472.
Mellea, J. (2021). Buying and Selling Data, Part 1. [online] Foundation for a Human Internet. Available at: https://medium.com/humanid/buying-and-selling-data-part-1-3846e7d98e6b.
Parliament of Australia (2019). Parliament of Australia. [online] Aph.gov.au. Available at: https://www.aph.gov.au/.
Rainie, L. (2018). Americans’ Complicated Feelings about Social Media in an Era of Privacy Concerns. [online] Pew Research Center. Available at: https://www.pewresearch.org/short-reads/2018/03/27/americans-complicated-feelings-about-social-media-in-an-era-of-privacy-concerns/.
Suzor, N.P. (2019). Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge: Cambridge University Press.
The University of Queensland (2018). Data is the new oil of the digital economy. [online] Uq.edu.au. Available at: https://stories.uq.edu.au/shorthand-uq/eait/ingenuity/data-is-the-new-oil/ [Accessed 10 Apr. 2025].
Zuboff, S. (2019). The Age of Surveillance Capitalism: the Fight for a Human Future at the New Frontier of Power. New York: Public Affairs.