The Illusion of Privacy: When “Off” Is Not an Option Anymore

Google’s $60 Million Lesson: When Off Isn’t Really Off

In 2018, Australians learned a hard truth about digital privacy: sometimes “off” doesn’t actually mean off. The Australian Competition & Consumer Commission (ACCC) took Google to court after it emerged that the tech giant was still collecting users’ location data even when people thought they had disabled tracking (Coldewey, 2018). The issue centred on two settings in Google accounts: “Location History” and “Web & App Activity.” Google’s interface led users to believe that turning Location History off would stop location tracking entirely. Meanwhile, the second setting, Web & App Activity, quietly logged location data whenever users used Google services such as Search or Maps (Byrne, 2022). It was on by default and not obvious to users, creating a false sense of security. In August 2022, the Federal Court of Australia hit Google with an A$60 million fine for misleading consumers about these practices (Byrne, 2022). The court found Google had represented that the “Location History” toggle was the only account setting affecting whether location data was collected, when in fact the Web & App Activity setting also collected it (Byrne, 2022).

This Google saga vividly illustrates how digital platforms can give users an illusion of privacy and control while continuing business as usual behind the scenes. Millions of users thought they had opted out of tracking, yet Google’s systems kept quietly gathering their whereabouts to fuel its advertising engine (Byrne, 2022). Google later claimed it had “fixed the problem” by 2018 and clarified its settings (Byrne, 2022). But the damage was done, not so much to Google’s bottom line as to user trust. It’s a textbook case of what regulators and experts warn about: tech giants maintaining extensive control over user data while giving us little more than the feeling of power.

Dark Patterns: The Art of Deceptive Design

If you’ve ever struggled to find the “unsubscribe” button or been confused by a privacy setting, you may have been the victim of a dark pattern. The term, coined by UX researcher Harry Brignull in 2010, describes design tricks that intentionally mislead or nudge users into choices they might not otherwise make, often to the platform’s benefit. In the context of privacy, dark patterns give users the impression they’re making a privacy-friendly choice while subtly undermining that choice in the background (Coldewey, 2018). Google’s two-location-settings fiasco is a prime example. The interface told users, “With Location History off, the places you go are no longer stored,” suggesting that turning that setting off would stop all tracking. Yet Google still recorded location data through other means. The design misled users into feeling secure while Google retained control over their data (Coldewey, 2018). Simply put, Google offered a comforting toggle but hid the real controls elsewhere.

Some common dark pattern tactics in UX design include:

Misleading Labels

Using confusing language or toggles that do the opposite of what users expect. (For example, a setting might appear to be “Off”, but another hidden setting continues the tracking – as in Google’s case.)

Default Opt-Ins

Privacy-invasive options that are turned on by default, often buried in settings. Users may be unaware of these defaults, or find it too tedious to opt out. Google’s Web & App Activity being on by default while Location History was off is a case in point (De Souza, 2022); a minimal sketch of this logic follows the list.

Nagging and “Confirmshaming”

Repeatedly prompting users to enable data-sharing features, or shaming them for declining. For instance, a dialogue asking, “Are you sure you want to limit this feature?” is designed to guilt users into leaving tracking on (Colgate, 2022; Office of the Attorney General for the District of Columbia, 2022). Google was accused of warning that certain apps wouldn’t function properly unless location tracking was enabled (Zakrzewski, 2022).
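To make the mechanics concrete, here is a minimal Python sketch of how the “misleading label” and “default opt-in” tactics combine. It is an illustration only, not Google’s actual code, and the setting names simply mirror the two toggles described above: the switch users are shown is off, yet logging continues through a second, default-on setting.

```python
from dataclasses import dataclass


@dataclass
class AccountSettings:
    # The toggle users are told controls location tracking.
    location_history: bool = True
    # A separate setting, on by default and buried elsewhere in the account.
    web_and_app_activity: bool = True


def should_store_location(settings: AccountSettings) -> bool:
    """Location ends up stored if *either* setting permits it."""
    return settings.location_history or settings.web_and_app_activity


# A user flips the only switch they know about...
settings = AccountSettings(location_history=False)

# ...but the hidden default keeps the data flowing.
print(should_store_location(settings))  # prints: True
```

The dark pattern lies in the gap between the label the user sees (“Location History is off”) and the “either/or” condition buried in the logic.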

These tricks exploit human psychology and the information asymmetry between users and platforms. Companies hold vastly more information and power than their users (Humphry, 2025). Most of us click “I agree” just to get rid of pop-ups; we cannot realistically read every word of those policies. In fact, one estimate found it would take 76 work days to read all the privacy policies we encounter in a year (Wagstaff, 2012). No wonder only about 9% of people say they always read privacy policies (Auxier et al., 2019). This imbalance makes it easy for dark patterns to thrive. As the U.S. Federal Trade Commission noted in a 2022 report, many companies now use “sophisticated design practices… that trick or manipulate consumers into… giving up their privacy” (Federal Trade Commission, 2022). In short, the deck is stacked against the average user.

Contextual Integrity: Violating Privacy Expectations

Why do these deceptive practices feel so insidious? One explanation lies in the concept of contextual integrity, a privacy theory developed by scholar Helen Nissenbaum (Nissenbaum, 2004, 2018). Instead of viewing privacy as simply keeping data secret or giving users total control, contextual integrity asks whether information flows appropriately within its context. In other words, data shared in a particular setting should only be used in ways consistent with the norms and expectations of that situation (Nissenbaum, 2004). When information provided for one purpose is repurposed in a different context without our understanding, it violates those expectations, and thus feels like a privacy violation, even if we technically “agreed” to it in the fine print (Nissenbaum, 2018).

Consider the Google location saga through this lens. The user’s expectation was straightforward: “I turned off Location History, so Google should not track where I go.” The reality was different: Google still tracked location, just under a different setting. That is a breach of contextual integrity: it violated the informational norms of the situation, because turning something off should mean the data collection stops, period.
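A toy model helps show why this registers as a violation. The Python sketch below is a deliberate simplification, not Nissenbaum’s formal framework, and the context names and norm lists are invented for illustration: each context carries the uses people expect in it, and a flow is appropriate only if the intended use matches the norms of the context in which the data was shared.

```python
# Toy model of contextual integrity: data "remembers" the context it was
# shared in, and each context defines the uses people would expect there.
CONTEXT_NORMS = {
    "maps_navigation": {"provide_directions", "improve_routing"},
    "account_security": {"verify_login"},
}


def flow_is_appropriate(origin_context: str, intended_use: str) -> bool:
    """A flow preserves contextual integrity only if the intended use is
    among the norms of the context where the data was originally shared."""
    return intended_use in CONTEXT_NORMS.get(origin_context, set())


# What the user expects: location shared while navigating is used to navigate.
print(flow_is_appropriate("maps_navigation", "provide_directions"))  # True

# What actually happened: the same data quietly feeding the ad engine.
print(flow_is_appropriate("maps_navigation", "ad_targeting"))        # False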

Case Study: Facebook’s Two-Factor “Bait-and-Switch”

Another striking example of lost contextual integrity came in 2018, when users and researchers discovered a disturbing privacy bait-and-switch by Facebook. The company was using phone numbers that people provided for security purposes—like two-factor authentication (2FA)—and repurposing them for targeted advertising (Wolverton, 2018). In other words, a number you gave only to protect your account was quietly used to help marketers target you with ads. This wasn’t an accident or a breach; it was an intentional design choice. Users were never clearly told this would happen, and it flew in the face of what anyone would reasonably expect. A piece of information shared for the purpose of account security was quietly diverted into a completely different context—Facebook’s advertising machine—without genuine consent or awareness. This was a blatant violation of contextual integrity. Facebook essentially pulled a bait-and-switch on its users: offering security on the front end while conducting surveillance for profit on the back end (Wolverton, 2018).
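The engineering counterpart to contextual integrity is purpose limitation: record why each piece of data was collected and refuse any use outside that purpose. The fragment below is a hypothetical Python sketch, with invented class and method names, showing how a phone number collected strictly for 2FA would simply be unavailable to an ad-targeting lookup.

```python
class PurposeViolation(Exception):
    """Raised when data is requested for a purpose it was not collected for."""


class PersonalDataStore:
    """A toy data store that tags every value with its allowed purposes."""

    def __init__(self):
        self._records = {}  # key -> (value, allowed purposes)

    def store(self, key, value, allowed_purposes):
        self._records[key] = (value, set(allowed_purposes))

    def fetch(self, key, purpose):
        value, allowed = self._records[key]
        if purpose not in allowed:
            raise PurposeViolation(f"{key!r} was not collected for {purpose!r}")
        return value


store = PersonalDataStore()
# The number was handed over for two-factor authentication, nothing else.
store.store("phone_number", "+61 4xx xxx xxx", allowed_purposes={"2fa"})

print(store.fetch("phone_number", purpose="2fa"))  # allowed

try:
    store.fetch("phone_number", purpose="ad_targeting")
except PurposeViolation as err:
    print(err)  # 'phone_number' was not collected for 'ad_targeting'
```

Facebook’s design did the opposite: a number supplied in the security context flowed straight into ad targeting.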

Public reaction was fierce. Privacy advocates and tech journalists lambasted Facebook for this practice. The Electronic Frontier Foundation called it “deceptive and invasive,” noting it was contrary to user expectations and to Facebook’s own previous promises (Gebhart, 2018). Indeed, Facebook had long assured users that information given for security would not be used for ads—a promise it quietly broke. The backlash even caught regulators’ attention. By 2019, under a settlement with the U.S. Federal Trade Commission, Facebook was ordered to stop using 2FA phone numbers for advertising (Gebhart, 2018). But by then, the damage was done: millions of users felt tricked, and trust in the platform was further eroded. This case underlines how information asymmetry and dark patterns converge: Facebook had vast troves of user data and knew exactly how to monetise it, while users had little clue their security info would be mined for profit. The “illusion of control” in this scenario was the friendly prompt to “add your phone for extra security” — users assumed that data would stay within the security context. In reality, it did not.

Weak Laws, Big Tech: Australia’s Privacy Act vs. Europe’s GDPR

All of this raises the question: Where are the laws and regulators when users are misled about their privacy? The truth is that current legal frameworks struggle to keep Big Tech in check, and Australia is no exception. The ACCC’s win against Google came under consumer law (for misleading conduct) rather than under privacy law because Australia’s privacy legislation was simply too weak (De Souza, 2022). At the time, the Privacy Act 1988, Australia’s main privacy statute, had woefully low penalties for breaches. The ACCC noted that Privacy Act penalties were far smaller than those available under consumer protection law, so it turned to consumer law to seek a stronger punishment (De Souza, 2022). That’s how Google ended up with a record A$60 million fine, an amount the Privacy Act would not have allowed (De Souza, 2022). Even so, for a company like Google, A$60 million was just a slap on the wrist, representing a tiny fraction of its revenues.

Australia’s Privacy Act also lacks many of the safeguards of modern data protection regimes. For example, it permits “implied consent” in many situations, whereas the EU’s General Data Protection Regulation (GDPR) requires consent to be explicit and informed. And unlike the GDPR, which gives individuals strong rights to access, delete, or port their personal data, Australia’s law mainly allows people to access and correct their information, with no general right to deletion or portability (European Council, n.d.).

Enforcement is another differentiator. The GDPR is often hailed as the gold standard not just for its principles but for its penalties. European regulators can fine companies up to 4% of global annual turnover, which for giants like Google or Facebook can reach into the billions. And they have shown their teeth: Google was fined €50 million in France in 2019 over opaque consent practices (De Souza, 2022), and in 2023 Meta was hit with a record €1.2 billion fine for unlawful data transfers under GDPR. Fines of that magnitude get Big Tech’s attention.
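For a sense of scale: for the most serious infringements, the GDPR caps fines at the higher of €20 million or 4% of worldwide annual turnover, so the ceiling grows with the company. The back-of-the-envelope Python snippet below uses a purely hypothetical turnover figure to show why that ceiling dwarfs a flat A$60 million penalty.

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements: the higher of
    EUR 20 million or 4% of worldwide annual turnover (Art. 83(5))."""
    return max(20_000_000, 0.04 * annual_turnover_eur)


# Hypothetical platform with EUR 250 billion in worldwide annual turnover.
print(f"EUR {gdpr_max_fine(250e9):,.0f}")  # EUR 10,000,000,000 - a 10-billion ceiling
```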

Recognising these gaps, Australia has started to reform its privacy laws. In late 2022, lawmakers raised the maximum Privacy Act penalty for serious breaches to at least AU$50 million (around €30 million), and in 2023 a government review recommended “strengthen[ing] and modernis[ing]” the law (Attorney-General’s Department, 2023), potentially adding rights such as data erasure and tighter consent rules. But such reforms will take time, and for now Australians remain less protected.

The bottom line is that Europe’s GDPR is a far more powerful tool for holding companies accountable than Australia’s current framework. GDPR treats privacy as a fundamental right, putting the burden on companies to justify data collection and build privacy by design. Australia’s law, born in the late 1980s and only patchily updated since, has tended to prioritise business convenience alongside user privacy. The result is that Australians (and people in countries with similarly outdated laws) often have to fall back on general consumer protection rules to address privacy harms, as we saw with the ACCC v Google case. It’s telling that the ACCC had to use consumer law at all; ideally, a privacy regulator armed with a strong statute would handle such cases directly. Stronger privacy laws and enforcement powers would give regulators the ability to turn those “strong messages” into actions that Big Tech cannot ignore.

Reclaiming Reality from Illusion

From Google’s sneaky location tracking to Facebook’s two-factor phone number bait-and-switch, it’s clear that consumers often overestimate how much control they really have. Tech platforms have become masters of offering the illusion of privacy—a comforting toggle here, a reassuring policy statement there—while they continue to harvest and monetise data at an industrial scale. These practices erode user trust, undermine autonomy, and can even have societal implications (think of Cambridge Analytica’s role in Brexit). In each case, the opacity and manipulation are deliberate, not accidental.

Meaningful reform is essential to change this status quo. On the one hand, platforms should design services ethically, with no more deceptive defaults or buried opt-outs. Users should be able to set their privacy preferences without running an obstacle course of confusion. Regulators can help by outlawing the most deceptive design tricks; California, for example, has banned manipulative opt-out interfaces (Hosch & Morris, 2021). On the other hand, we need legal muscle. GDPR has shown that strong laws coupled with enforcement can push even the largest companies to improve (or at least slap them with a fine in the billions when they don’t). If Australia follows through on its proposed Privacy Act reforms, requiring explicit consent, giving users a right to delete their data, and imposing much steeper fines closer to GDPR’s scale, companies would have a greater incentive to respect user privacy. Warnings that companies “must not mislead consumers about how their data is being collected and used” need to be backed by real consequences, not just words (ACCC, 2021).

Ultimately, we must end the charade of the privacy illusion. People shouldn’t have to be tech experts to know what happens with their information. Real transparency, user-friendly controls, and laws with teeth can help rebuild the trust that’s been lost. As things stand, the balance of information is heavily skewed toward companies, but as the saying goes, sunlight is the best disinfectant. Strong privacy rights and enforcement won’t stifle innovation; they ensure that innovation serves users and society, not just Big Tech’s data ambitions. In the end, reclaiming privacy in the digital age will require enormous effort on multiple fronts, but that fight is long overdue. The sooner we dispel the illusion and confront reality, the sooner we can create a digital world built on trust and respect instead of misdirection and power imbalance.

Bibliography

Attorney-General’s Department. (2023, March 31). Government response to the Privacy Act Review Report. https://consultations.ag.gov.au/integrity/privacy-act-review-report/

Australian Competition and Consumer Commission. (2021, April 16). Google misled consumers about the collection and use of location data [Media release]. https://www.accc.gov.au/media-release/google-misled-consumers-about-the-collection-and-use-of-location-data

Auxier, B., Rainie, L., Anderson, M., Perrin, A., Kumar, M., & Turner, E. (2019, November 15). Americans’ attitudes and experiences with privacy policies and laws. Pew Research Center. https://www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/

Byrne, E. (2022, August 12). Google fined $60m for misleading some Australian mobile users about collection of location data. ABC News. https://www.abc.net.au/news/2022-08-12/google-fined-60m-misleading-mobile-users-location-data/101329790

Coldewey, D. (2018, August 13). Google keeps a history of your locations even when Location History is off. TechCrunch. https://techcrunch.com/2018/08/13/google-keeps-a-history-of-your-locations-even-when-location-history-is-off/

Colgate, J. L. (2022, January 28). Google Accused of Using “Dark Patterns” to Track Users Who Did Not Want to Be Tracked. Privacy Zone. https://www.theprivacylaw.com/2022/01/google-accused-of-using-dark-patterns-to-track-users-who-did-not-want-to-be-tracked/

De Souza, R. (2022, August 18). Australia: Google agrees to pay AUD 60 million for misleading consumers regarding the collection of location data. Privacy Matters. https://privacymatters.dlapiper.com/2022/08/australia-google-agrees-to-pay-aud-60-million-for-misleading-consumers-regarding-the-collection-of-location-data/

European Council. (n.d.). The general data protection regulation. Consilium. Retrieved April 2, 2025, from https://www.consilium.europa.eu/en/policies/data-protection-regulation/

Federal Trade Commission. (2022, September 15). FTC Report Shows Rise in Sophisticated Dark Patterns Designed to Trick and Trap Consumers. Federal Trade Commission. https://www.ftc.gov/news-events/news/press-releases/2022/09/ftc-report-shows-rise-sophisticated-dark-patterns-designed-trick-trap-consumers

Gebhart, G. (2018, September 27). You Gave Facebook Your Number For Security. They Used It For Ads. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2018/09/you-gave-facebook-your-number-security-they-used-it-ads

Hosch & Morris. (2021, March 20). Privacy Plus+: Additional CCPA Regulations – An Illusion of Privacy. Hosch & Morris, PLLC. https://www.hoschmorris.com/privacy-plus-news/additional-ccpa-regulations

Humphry, D. J. (2025). ARIN6902 Digital Policy and Governance: Week 4: Issues of Concern: Privacy, Security and Digital Rights.

Nissenbaum, H. (2004). Privacy as Contextual Integrity. Washington Law Review, 79(1), 119.

Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9

Office of the Attorney General for the District of Columbia. (2022, December 30). AG Racine Announces Google Must Pay $9.5 Million for Using “Dark Patterns” and Deceptive Location Tracking Practices that Invade Users’ Privacy. https://oag.dc.gov/release/ag-racine-announces-google-must-pay-95-million

GDPR Local. (n.d.). Privacy wars: Comparing Australia’s data protection with GDPR! Retrieved April 7, 2025, from https://gdprlocal.com/privacy-wars-comparing-australias-data-protection-with-gdpr/

Wagstaff, K. (2012, March 6). You’d Need 76 Work Days to Read All Your Privacy Policies Each Year. TIME. https://techland.time.com/2012/03/06/youd-need-76-work-days-to-read-all-your-privacy-policies-each-year/

Wolverton, T. (2018, September 28). Facebook Uses Two-Factor Authentication Data to Target Ads. Business Insider. https://www.businessinsider.com/facebook-uses-two-factor-authentication-data-to-target-ads-2018-9

Zakrzewski, C. (2022, January 24). Google deceived consumers about how it profits from their location data, attorneys general allege in lawsuits. The Washington Post. https://www.washingtonpost.com/technology/2022/01/24/google-location-data-ags-lawsuit/
