Your Digital Rights, Redefined by Big Tech

Customer or Product?

“If something online is free, you’re not the customer – you’re the product.”

— Jonathan Zittrain, 2012

It’s a sentence we’ve all heard, and most of us accept it quietly. We scroll through social media and search engines, click ‘I Agree’ on yet another cookie banner, and wonder why Instagram is suddenly showing us ads for the shoes we mentioned once near our phones. Digital tracking is everywhere, and most of us have no idea how to deal with it.

On March 22, 2012, Dr. Jonathan Zittrain, a professor of Internet law at Harvard Law School, posted the line on Twitter, noting that it was not original to him. It traces back to the artist Richard Serra, whose short video “Television Delivers People” made the same point about broadcast television (Quote Investigator, 2017). Thirteen years have passed between Zittrain’s tweet and this blog post. In that time the Internet has become a daily essential: every account, message, and piece of information we use to communicate with the people in our lives is left online and becomes, in a sense, part of who we are. Yet issues of digital rights and privacy keep resurfacing in public view, and arguably getting worse. In one American survey, 54% of respondents were concerned that ‘computers and technology are being used to invade privacy’; twenty years later, 91% of adults agreed or strongly agreed that consumers have lost control of their personal information, and only 9% trusted social media companies to protect their data (Flew, 2021).

This does not happen for no reason. Digital privacy scandals involving social media platforms and big tech have recurred from the early days of the Internet to the present, even though the problem was identified long ago.

In 2022, Instagram was fined by Ireland’s Data Protection Commission for allowing underage users to operate business accounts, which made their email addresses and phone numbers publicly visible (McCallum & Gerken, 2022). X (formerly known as Twitter) changed its API (Application Programming Interface) access and removed content, while leaked user data included private emails and phone numbers (Stokel-Walker, 2023). In 2024, TikTok was accused of using its algorithm to shape users’ behaviour, of contributing to mental health issues in teens, and of using their data without consent (The Sydney Morning Herald, 2024).

So why is this issue so hard to solve completely? The public and scholarly conversations around privacy and security on social media platforms are no longer only about hackers, scams, hate speech, or password leaks. They are about something deeper and more structural: how social media and big tech companies redefine what privacy means, while governments race to catch up.

What’s Actually Happening to Our Digital Privacy?

Most of us think of privacy on the Internet as the digital right to keep our personal information to ourselves. That includes our name, address, phone numbers, browsing history, photo albums, search queries, and even the weird things we Google at 3 a.m. However, as concerns about digital privacy grow, that idea of privacy is constantly being reshaped, stretched, and traded, usually without our knowledge. What used to be a simple expectation (“I don’t want strangers tracking me”) has turned into a complex shifting of responsibility back and forth between platforms, advertisers, regulators, governments, and algorithms.

The Path of Google’s Redefinition of Digital Privacy

Imagine you’re browsing the internet. You look up some new sneakers, check out a few cooking blogs, scroll through a travel site, and watch a YouTube video about fitness. A few days later, your Instagram feed is full of sneaker ads, you’re getting vacation deals in your Gmail, and somehow, magically, there’s a protein powder ad on every news site you visit. Are you used to it by now? This is surveillance advertising, and it’s been happening for years. But in the last few years, that “magic” has become more controversial than ever.

Recently, Google, the multinational technology company best known for its search engine, has also come under public scrutiny for its advertising algorithms. To address privacy concerns that have bothered the public for a long time, Google made one of its biggest changes yet: moving away from third-party cookies, a journey from FLoC to the Topics API, all under the umbrella of a bigger plan called the Privacy Sandbox.

Before getting into that process, let’s start with a simple introduction to cookies. They are not advertisements themselves; they are tiny bits of data that websites store in users’ browsers. Some of them are useful, keeping you logged in or remembering your shopping cart. Third-party cookies, however, are set by domains other than the site you are visiting, and advertisers use them to track users’ paths across websites and build profiles of target audiences: what you like, where you go, and what you buy.
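To make that tracking mechanism concrete, here is a minimal sketch (in Python, with invented site names) of how a third-party tracker can stitch visits to unrelated sites into one profile, simply because every site embeds the same tracker and the browser sends back the same cookie each time:

```python
# Minimal sketch of third-party cookie tracking. Everything here is
# illustrative; real ad networks are vastly more complex.
import uuid
from collections import defaultdict

class Tracker:
    """Stands in for an ad network whose script is embedded on many sites."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie ID -> sites visited

    def handle_request(self, cookie_id, visited_site):
        # If the browser sends no cookie yet, mint one (like Set-Cookie).
        if cookie_id is None:
            cookie_id = str(uuid.uuid4())
        # The same ID arrives from every site embedding the tracker,
        # so visits to unrelated sites join into one cross-site profile.
        self.profiles[cookie_id].append(visited_site)
        return cookie_id

tracker = Tracker()
cookie = tracker.handle_request(None, "sneaker-shop.example")    # cookie set here
cookie = tracker.handle_request(cookie, "cooking-blog.example")  # same ID, new site
cookie = tracker.handle_request(cookie, "travel-site.example")

print(tracker.profiles[cookie])
# ['sneaker-shop.example', 'cooking-blog.example', 'travel-site.example']
```

The key point is that no single site told the tracker anything unusual; the cross-site profile emerges purely from reusing one identifier everywhere.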

After announcing it would phase out third-party cookies in order to protect users’ digital rights, Google experimented with a new advertising tool: FLoC (Federated Learning of Cohorts). According to Google, it was developed to address the privacy implications of tailored advertising, such as tracking cookies and device fingerprinting, which can reveal users’ browsing history across sites to advertisers or ad platforms (Google, 2021). It sounds like a great next step for keeping users away from the risk of privacy leaks. However, although FLoC required access to only a little data from each user, Google collects large amounts of non-identifiable data to produce powerful correlations, so it can make highly accurate predictions about users from very little information (Eliot & Wood, 2022). Put simply, predicting things about users from even a little data still carries privacy risks, and could lead to discrimination and digital redlining, where certain groups (for example, people grouped by health conditions or income levels) are shown different ads or services.
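The cohort idea behind FLoC can be sketched in a few lines. This is not Google’s actual algorithm (real FLoC used a SimHash-style clustering computed inside Chrome); it is a toy illustration, with invented category names, of how a “non-identifiable” cohort ID can still encode sensitive interests:

```python
# Toy illustration of cohort-based targeting in the spirit of FLoC.
# A browsing profile is reduced to one of a fixed number of cohort
# buckets; the user's identity is hidden, but the cohort itself can
# still reveal what its members have in common.
import hashlib

def cohort_id(visited_categories, num_cohorts=1000):
    """Map a browsing profile to one of a fixed number of cohort buckets."""
    fingerprint = ",".join(sorted(set(visited_categories)))
    digest = hashlib.sha256(fingerprint.encode()).hexdigest()
    return int(digest, 16) % num_cohorts

alice = cohort_id(["fitness", "travel", "cooking"])
bob   = cohort_id(["cooking", "fitness", "travel"])  # same interests, any order
carol = cohort_id(["health-conditions", "payday-loans"])

assert alice == bob  # identical profiles share a cohort: the "anonymity"
print(alice, carol)  # but the cohort still groups people by what they browsed
```

The critics’ worry is visible even in this toy version: Carol’s cohort groups her with everyone who browsed sensitive categories, which is exactly the kind of correlation that can enable digital redlining.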

As these concerns were raised, Google abandoned FLoC in early 2022 and proposed a replacement: the Topics API (Application Programming Interface). This is a new system for interest-based advertising that works by pinpointing interests based on users’ online activity (Roth, 2022). It tracks the types of websites you visit and tags users’ profiles with broad topics like:

  • “Fitness”
  • “Food”
  • “Books”
  • “Music”

When a user visits a website that carries ads, the browser shares relevant topics the moment the page loads, so the site can show relevant ads. Compared with FLoC, this process has several upgrades:

  1. Users’ private information is not leaked to advertisers, and users are not followed across websites.
  2. Topics are only stored for three weeks and then deleted.
  3. Users have the right to view their topics and turn them off.
  4. Google doesn’t get the interest lists (everything happens in the browser!)
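The design described above can be simulated in miniature. Assuming an invented site-to-topic mapping, and simplifying the real API (which keeps the top five topics per weekly epoch, not one), the sketch below shows the two properties that matter: topics are computed in the browser, and only three weekly epochs are ever retained:

```python
# Toy simulation of the Topics API design: the browser maps visited
# sites to broad topics, keeps only the last three weekly "epochs",
# and reveals only those topics to a site that asks. The site-to-topic
# mapping below is invented for illustration.
from collections import Counter, deque

SITE_TOPICS = {  # hypothetical classifier output
    "sneaker-shop.example": "Fitness",
    "cooking-blog.example": "Food",
    "travel-site.example": "Travel",
}

class Browser:
    def __init__(self):
        self.current_week = Counter()  # topic observations this week
        self.epochs = deque(maxlen=3)  # only three weeks are retained

    def visit(self, site):
        topic = SITE_TOPICS.get(site)
        if topic:
            self.current_week[topic] += 1

    def end_of_week(self):
        # Keep the most frequent topic of the week; older epochs fall
        # off the deque automatically, so nothing outlives three weeks.
        if self.current_week:
            self.epochs.append(self.current_week.most_common(1)[0][0])
        self.current_week = Counter()

    def browsing_topics(self):
        # What a site calling the API gets to see: broad topics only.
        return list(self.epochs)

b = Browser()
for week in (["sneaker-shop.example"], ["cooking-blog.example"],
             ["travel-site.example"], ["sneaker-shop.example"]):
    for site in week:
        b.visit(site)
    b.end_of_week()

print(b.browsing_topics())  # ['Food', 'Travel', 'Fitness'] - week one has expired
```

Notice that the raw browsing history never leaves the `Browser` object; only the coarse topic labels do, and they age out after three epochs.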

But it still raises concerns among privacy critics. The Electronic Frontier Foundation, an international non-profit digital rights group founded in 1990, argued that this system may still identify users through browser fingerprinting, a technique that can expose information about users’ demographics and potentially result in discriminatory targeted ads (Roth, 2022). It’s still your data, it’s still used to shape what you see online, and you’re still being tagged for advertisers.
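Fingerprinting is easy to illustrate: no cookie is stored anywhere, yet combining ordinary attributes a browser freely reveals produces a stable, near-unique identifier. The attribute values below are invented for illustration:

```python
# Minimal sketch of browser fingerprinting: no cookie is set, but the
# combination of everyday attributes identifies the browser anyway.
import hashlib

def fingerprint(attributes):
    """Hash a browser's visible attributes into a stable identifier."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "2560x1440",
    "timezone": "Australia/Sydney",
    "fonts": "Arial,Helvetica,Noto Sans",
    "language": "en-AU",
}

print(fingerprint(browser))
# Each attribute is common on its own; together they can single you out,
# and the identifier is the same on every site that computes it.
```

This is why deleting cookies, or replacing them with the Topics API, does not by itself end cross-site identification.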

Google then bundled these technologies into one package, which it calls the Privacy Sandbox. According to Google, it reduces cross-site and cross-app tracking and makes three promises (Google):

  • Build new technology to keep information private
  • Enable publishers and developers to keep online content free
  • Collaborate with the industry to build new internet privacy standards

In fact, by replacing third-party tracking with browser-based tracking, Google now owns both the front door and the back end of the digital advertising machine, algorithm included. Smaller ad companies are squeezed out. And users like you and me still have very little say in how our data is used. So while the Privacy Sandbox is technically more secure than cookies, it also represents a new kind of bargain in which users give up control of their privacy:

  1. We give up transparency. Most people don’t know this system even exists, let alone how to turn it off.
  2. We give up control. Google’s algorithms decide what you’re interested in. You can’t exactly tell your browser, “No, I’m not into food/sports/books.”
  3. We gain a false sense of protection. The word “privacy” becomes a marketing slogan for Google and for merchants, not a fundamental right for users.

Put this alongside the privacy news stories at the beginning of this post and a common trend among big tech companies emerges: they use the language of safety and ethics to empower themselves as a marketing point, not to empower users.

Zooming Out: How Platform Governance Plays a Role

The conversation around privacy can’t be separated from how platforms are governed. Who sets the rules? Who decides what’s ethical? And who holds the power? This is where national privacy laws, international frameworks, and public advocacy all intersect. In countries like Australia, there’s growing awareness of platformization, a situation in which a few large companies dominate the infrastructure of online interaction. The legal reality is that social media platforms belong to the companies that create them, and those companies have almost absolute power over how they are run (Suzor, 2019). That includes Google, Meta, Amazon, and others.

Do you still remember the long terms of service you agreed to when you registered? Informed consent is very difficult to give, because these terms are hard to understand (and can change without notice to users), so digital platforms are granted almost absolute discretion to make and enforce the rules as their operators see fit (Flew, 2021). These companies are private entities, but they increasingly play the role of public utilities. That means their approach to privacy isn’t necessarily about protecting you; it’s about protecting their business model and their profits. The lack of accountability in moderation systems likewise leaves users confused about why their content was suspended, or analysed (Suzor, 2019).

When Google rolled out the Privacy Sandbox, it wasn’t just updating its technical infrastructure and algorithms. It was protecting a core business, one that depends on knowing what we do, what we like, what we fear, and what we might buy, rather than protecting users’ privacy and digital rights. That is why the way we define and defend privacy will shape the future of our democracies, economies, and everyday lives.

If we leave it to the tech giants to decide what privacy means, we shouldn’t be surprised when it looks a lot like surveillance with a shiny new name. In conclusion, it is essential to fight for our digital rights and to bring third-party institutions such as regulators and civil society groups into the conversation to balance the power of big tech. The battle for privacy is not about hiding. It’s about choosing who gets to see, shape, and profit from your life, and on whose terms.

References

Eliot, D., & Wood, D. M. (2022). Culling the FLoC: Market forces, regulatory regimes and Google’s (mis)steps on the path away from targeted advertising. Information Polity, 27(2), 259-. https://doi.org/10.3233/IP-211535

Google. (2021, May 18). FLoC. Retrieved from https://privacysandbox.google.com/archive/floc

McCallum, S., & Gerken, T. (2022, September 6). Instagram fined €405m over children’s data privacy. BBC. Retrieved from https://www.bbc.com/news/technology-62800884

Quote Investigator. (2017, July 16). Quote origin: You’re not the customer; you’re the product. Retrieved from https://quoteinvestigator.com/2017/07/16/product/

Roth, E. (2022, January 26). Google abandons FLoC, introduces Topics API to replace tracking cookies. The Verge. Retrieved from https://www.theverge.com/2022/1/25/22900567/google-floc-abandon-topics-api-cookies-tracking

Stokel-Walker, C. (2023, March 10). Twitter’s $42,000-per-month API prices out nearly everyone. WIRED. Retrieved from https://www.wired.com/story/twitter-data-api-prices-out-nearly-everyone/


Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10–24). Cambridge, UK: Cambridge University Press.

Flew, T. (2021). Privacy and security. In Regulating platforms (pp. 72–79). Cambridge: Polity.

The Sydney Morning Herald. (2024, October 9). TikTok accused of profiting from addicted teens and being a ‘virtual strip club’. Retrieved from https://www.smh.com.au/world/north-america/dopamine-inducing-more-than-a-dozen-us-states-sue-tiktok-claiming-it-is-addictive-and-harms-children-20241009-p5kgu4.html
