“Who’s in Control of Whom?” – Power, Platforms, and Users

[Image: Antonenko, 2022]

In the information age, the power relationship between digital platforms and users has become increasingly unbalanced. Platforms gain insight into users’ interests, preferences, and consumption patterns by collecting user data, including search history, clicking habits, and in some cases even biometric information captured through cameras and microphones. In this data-driven environment, platforms have gradually acquired virtually unlimited bargaining power, while users, through their seemingly free use of these services, have ceded much of their privacy and autonomy (Suzor, 2019).

[Image: Anadolu Agency, 2020]

Why do platforms do this?

The more a platform knows about its users, the more accurately it can place advertisements, and the higher its click-through and conversion rates become.

Digital advertising has become a major source of revenue for large technology companies, and its share of their income is likely to keep rising in the coming years (Marto & Le, 2024). This business model drives platforms to tighten their control over user behavior: to capture greater economic benefits, they constantly optimize the algorithms that analyze and predict users’ interests, behavioral habits, and even emotional tendencies, and use them to push highly personalized content and advertisements.

In this process, platforms use algorithms to link users’ interests and preferences so that users remain immersed in a circular information world, the so-called ‘filter bubble’ (Pariser, 2011). Inside the bubble, the information a user receives gradually becomes closed and homogeneous, and the user’s existing opinions and preferences are continually reinforced. For example, when a user repeatedly searches for or browses videos or products related to ‘Korean make-up’ on a social media platform, the platform tacitly assumes that they are interested in this kind of content and pushes more beauty bloggers, product reviews, and shopping links about Korean make-up. Over time, the user’s feed becomes dominated by such content, which deepens their attachment to Korean make-up while crowding out other beauty brands and styles that may be equally worth their attention.
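
To make this feedback loop concrete, here is a minimal, purely illustrative Python sketch, not any platform’s real recommender: the categories, scores, and reinforcement rule are invented assumptions, but they show how repeatedly favoring whatever already scores highest gradually narrows what a user sees.

```python
import random
from collections import Counter

# Hypothetical interest scores for one user (illustrative values only).
interest = {"korean_makeup": 1.0, "skincare": 1.0, "fitness": 1.0, "cooking": 1.0}

def recommend(scores, k=3):
    """Pick k items to show, weighted toward the highest-scoring interests."""
    categories = list(scores)
    weights = [scores[c] ** 2 for c in categories]  # squaring amplifies the current leader
    return random.choices(categories, weights=weights, k=k)

def simulate(rounds=20):
    """Run the recommend-and-reinforce loop and count what gets shown."""
    shown = Counter()
    for _ in range(rounds):
        for category in recommend(interest):
            shown[category] += 1
            interest[category] += 0.5  # every impression reinforces that interest
    return shown

random.seed(1)
interest["korean_makeup"] += 0.5  # a few early searches give one topic a head start
print(simulate())  # the head-start topic typically ends up dominating the feed
```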

Although this highly personalized recommendation mechanism improves the user experience to a certain extent by serving users seemingly ‘preferred’ content, it comes at a substantial privacy cost. Whether liking, retweeting, and commenting on social media, or browsing, searching, and rating on the platform, every behavior becomes a key source of data for algorithmic optimization and ad targeting, and so feeds the platform’s commercial returns. This ‘user-centric’ logic is essentially built on continuous monitoring of user behavior: by analyzing each action, the platform adjusts the recommended content to keep users engaging with and consuming more content.
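
The same point can be made about ad targeting with a second, equally hypothetical sketch: the signal names and weights below are invented for illustration, but they show how routine interactions such as likes, shares, and searches are converted into a relevance score that determines who sees (and who pays for) an ad.

```python
from dataclasses import dataclass

# Hypothetical behavioral signals for one user in one topic;
# the feature names and weights are illustrative, not a real ad system.
@dataclass
class Behavior:
    likes: int            # posts liked in this topic
    shares: int           # posts shared or retweeted
    searches: int         # related search queries
    watch_minutes: float  # time spent on related videos

def ad_relevance_score(b: Behavior) -> float:
    """Toy targeting score: stronger signals of intent get heavier weights."""
    return 1.0 * b.likes + 2.0 * b.shares + 3.0 * b.searches + 0.1 * b.watch_minutes

engaged_user = Behavior(likes=14, shares=3, searches=9, watch_minutes=120.0)
casual_user = Behavior(likes=1, shares=0, searches=0, watch_minutes=5.0)

# The more behavioral data the platform holds, the more precisely it can
# rank users for an advertiser's campaign and charge for that precision.
print(ad_relevance_score(engaged_user))  # 59.0
print(ad_relevance_score(casual_user))   # 1.5
```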

Why don’t users refuse?

Platforms rely on ‘default consent’, and users struggle to realize their data is being exploited

Behind the veneer of ‘service optimization’ and ‘personalized recommendations’, it is difficult for users to realize that they are being exploited and that their data has been commoditized.

Moreover, as recipients of platform services, users are often in a vulnerable position with limited choices. They must accept the platform’s carefully crafted privacy policy on a ‘take it or leave it’ basis, with little chance of truly understanding how the platform collects, uses, and processes their personal information (ACCC, 2024).

Australian survey data bears this out:

  • A 2023 consumer study found that 74 percent of Australians do not want their personal information shared with or sold to other companies (ACCC, 2024).

  • The Office of the Australian Information Commissioner’s (OAIC) 2023 Australian Community Attitudes to Privacy Survey found that 84 percent of Australians want more control and choice over the collection and use of their personal information (Cass-Gottlieb, 2023).

  • 69 percent of Australians believe it is unfair or unreasonable for organizations to track, analyze, and target advertising to adults online based on their personal information (Cass-Gottlieb, 2023).

  • Consumers are increasingly asked to hand over personal information or other data in order to access important services (ACCC, 2024).

These figures show that users are not genuinely satisfied with platforms’ privacy policies; they accept them not out of preference but because they have little choice but to acquiesce.

This imbalance of rights between platforms and users has also led to a lack of transparency and accountability on the part of platforms when it comes to privacy protection, data security, and content moderation.

Case Study

[Image: NET25, 2023]

In recent years, large-scale leaks and misuse of user information have occurred around the world, and in 2023 a class action lawsuit against Google in California, USA, sparked widespread public concern about privacy violations on digital platforms. According to the lawsuit, Google had collected hundreds of millions of users’ web browsing records, device information, search content, and location data in ‘incognito mode’ through its Chrome browser, Google Analytics, Ad Manager, and other services around the world (Reuters, 2023). In other words, Google secretly tracked the internet use of millions of people who thought they were browsing the web privately. After the case came to light, public trust in the platform plummeted, and people began to question whether platforms are doing their due diligence in data management and privacy protection.

As the plaintiffs put it, by collecting this data Google can learn about users’ friends, hobbies, favorite foods, shopping habits, and the ‘potentially embarrassing things’ they search for online (Reuters, 2023).

In the end, Google agreed to settle the lawsuit, which had sought at least US$5 billion in damages, by destroying billions of records of browsing data collected in ‘incognito mode’ and by revising its data policies, including more clearly informing users that their data may still be collected in incognito mode (Reuters, 2024).

The case is symbolically important, having been described as a historic step toward honesty and accountability from mainstream technology companies (Reuters, 2024), and it reveals how far platforms’ definition of ‘privacy’ falls short of public expectations. ‘Incognito mode’ led users to believe that their browsing activity was neither recorded nor tracked, when in fact it was. This reinforces Suzor’s (2019) observation that power between platforms and users is unbalanced and that promises of ‘more user engagement’ are often a mere formality.

How to break this power imbalance

To truly break this power imbalance, it is not enough to rely solely on the platform’s own self-restraint and moral commitment.

As early as 2009, facing a strong backlash against changes to its privacy policy, Facebook CEO Mark Zuckerberg promised that Facebook would become ‘more democratic’ and that changes to the site’s terms of service would have to reflect the values and principles of its users (Suzor, 2019). While this promise appeared to give users more say, in practice users’ actual voice in the platform’s rule-making has remained extremely limited and has had no substantial impact on Facebook’s governance model.

[Image: Johnston, 2019]

  • The government should strengthen regulation and improve legislation

In the digital space, users’ data and privacy rights are facing unprecedented challenges. As a defender of the public interest, the government must more actively take on the responsibility of digital governance to ensure that users’ privacy and data are effectively protected.

Firstly, legislation needs to be improved to clarify platforms’ legal responsibilities. The EU’s General Data Protection Regulation (GDPR), for example, establishes users’ rights to be informed about, access, delete, and object to the processing of their personal data (GDPR, 2013, Art. 15). Such legislation compels platforms to obtain users’ express consent before collecting and using data, increases the transparency of companies’ handling of user data, and institutionally safeguards users’ data sovereignty.

Australia has also accelerated the reform of its privacy legislation in recent years. To better protect Australians’ privacy, the government has begun overhauling the existing Privacy Act 1988. The reforms introduce two landmark changes: first, a statutory tort that makes serious invasions of privacy actionable; and second, new criminal offenses for doxxing, the malicious release of personal data online (MinterEllison, 2024).

Furthermore, in 2021 Australia introduced the News Media and Digital Platforms Mandatory Bargaining Code, a world-first piece of legislation facilitated by the Australian Competition and Consumer Commission (ACCC) (Davies, 2023). The legislation requires tech giants like Google and Facebook to negotiate fairly with news media over the use of their content and reasonable compensation. This initiative not only responds to the news media industry’s concerns about content ‘pillaging’ but also demonstrates the government’s commitment to curbing the monopolistic behavior of tech giants in the market.

In addition to legislative reform, the government should build privacy protection and data rights into the education system to help users, especially young users who are highly dependent on social media platforms, strengthen their ability to protect themselves in the digital space.

  • Platforms are not just neutral service providers, but governors of the digital environment

Firstly, at the point of collection, platforms should clearly explain to users what data is collected, for what purpose, and how it will be safeguarded, so that users do not unknowingly give up their privacy rights because of information asymmetry.

Second, platforms should safeguard users’ autonomy over their personal information by allowing them to view, modify, or delete it at any time.

Third, while digital platforms and their user communities are reasonably skeptical of government regulation of online content, this does not mean that platforms can avoid their responsibilities; they should be subject to regular review and assessment by independent third-party organizations (Flew, 2019).

Finally, platforms should establish sound accountability mechanisms to ensure that their algorithms and recommendation systems operate fairly, impartially, and transparently, and are not allowed to create filter bubbles or manipulate users’ behavior.

  • User data is currency (Pehar, 2024)

In the digital age, every user’s privacy and data are at risk. Users should regularly check their privacy settings and limit platforms’ access to their data to ensure that their personal information is not over-collected or misused.

[Image: Pehar, 2024]

“By treating your data as important currency, like you would real-life money, you will be able to protect it more securely than most people” (Pehar, 2024).

Conclusion

The imbalance of power between platforms and users has severely weakened users’ control over their data and privacy in the digital space. As platforms monitor user behavior ever more deeply and collect data ever more extensively, users’ autonomy is constantly eroded. Restructuring the platform governance system, strengthening legal regulation, and improving users’ data literacy have therefore become urgent tasks. Platforms should not serve merely as tools for capital gain; they should assume responsibility for the well-being of society and the common interests of humankind.

References

Anadolu Agency. (2020, July 26). Governments should protect personal data on digital platforms. Daily Sabah. https://www.dailysabah.com/business/tech/governments-should-protect-personal-data-on-digital-platforms 

Antonenko, D. (2022, December 16). Data Privacy vs Data Security: Which Should You Prioritize? Businesstechweekly.com. https://www.businesstechweekly.com/cybersecurity/data-security/data-privacy-vs-data-security/ 

Australian Competition and Consumer Commission (ACCC). (2024, May 21). Consumers lack visibility and choice over data collection practices. https://www.accc.gov.au/media-release/consumers-lack-visibility-and-choice-over-data-collection-practices

Davies, R. (2023, August 31). Policy case study: the impact of digital platforms paying for news in Australia – Media Freedom Coalition. Media Freedom Coalition. https://mediafreedomcoalition.org/news/2023/bargaining-codes-what-benefits/ 

GDPR. (2013). Art. 15 GDPR – Right of access by the data subject. General Data Protection Regulation (GDPR). https://gdpr-info.eu/art-15-gdpr/

Cass-Gottlieb, G. (2023, October 17). Regulatory priorities and challenges posed by the digital economy [Speech]. Australian Competition and Consumer Commission. https://www.accc.gov.au/about-us/news/speeches/regulatory-priorities-and-challenges-posed-by-the-digital-economy-speech

Johnston, M. (2019). The Top 6 Shareholders of Facebook. Investopedia. https://www.investopedia.com/articles/insights/082216/top-9-shareholders-facebook-fb.asp 

Marto, R., & Le, H. (2024, October 10). The Rise of Digital Advertising and Its Economic Implications. Stlouisfed.org; Federal Reserve Bank of St. Louis. https://www.stlouisfed.org/on-the-economy/2024/oct/rise-digital-advertising-economic-implications 

NET25. (2023, December 29). Google settles $5 “Billion” lawsuit for “Incognito mode” | Mata Ng Agila International. YouTube. https://www.youtube.com/watch?v=veKkPI1xB0o 

MinterEllison. (2024). On the road: Australia’s privacy law overhaul begins. https://www.minterellison.com/articles/australias-privacy-law-overhaul-begins

Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin Press.

Pehar, D. (2024, August 12). Council Post: In The Digital Age, Our Data Is Currency. Forbes. https://www.forbes.com/councils/forbestechcouncil/2020/02/20/in-the-digital-age-our-data-is-currency/ 

Reuters. (2023, December 29). Google agrees to settle $5bn lawsuit claiming it secretly tracked users. The Guardian. https://www.theguardian.com/technology/2023/dec/29/google-lawsuit-settlement-incognito-mode 

Reuters. (2024, April 1). Google to destroy billions of private browsing records to settle lawsuit. The Guardian. https://www.theguardian.com/technology/2024/apr/01/google-destroying-browsing-data-privacy-lawsuit 

Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.

Flew, T. (2019). Platforms on trial. Intermedia, 46(2), 18–23. https://eprints.qut.edu.au/120461
