Privacy Dilemmas in the Digital Age: The Facial Recognition Controversy

Grocery shopping, and a face ‘swiped’ by accident

While you’re still browsing the aisles, and again as you check out and pay for your purchases, overhead surveillance is watching your every move and facial expression, parsing your movements to judge whether you are stealing now or have stolen before.

It sounds like a scene from a sci-fi film, but it exists in the real world. According to SBS News (2022), Kmart has been testing facial recognition technology in some of its shops, claiming it will ‘enhance the shopping experience’ and ‘reduce theft’. Another company, Bunnings, has done the same. But do consumers really know they are being ‘swiped’? Is their data, and their privacy, safe? In this age of ubiquitous information, has our face become an invisible membership card?

The double-edged sword of facial recognition: security or surveillance?

As you browse supermarket aisles, hidden cameras may be analysing your facial features in real time. Australian retailer Kmart recently rolled out facial recognition systems that scan facial points to create biometric fingerprints (SBS News, 2022). While stores celebrate reduced theft and faster checkouts, shoppers are left questioning the invisible cost of this convenience.
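
To make the mechanics concrete, the sketch below shows how watchlist-style face matching generally works: a face is reduced to a numerical embedding, and new faces are matched by distance. It is a minimal illustration using the open-source face_recognition library; the file names, the watchlist idea and the 0.6 tolerance are assumptions for the example, not details of Kmart’s actual system.

```python
# Minimal sketch of watchlist-style face matching, assuming the
# open-source face_recognition library. File names and the 0.6
# tolerance are illustrative; nothing here describes Kmart's system.
import face_recognition

# Enrol a person of interest from a reference photo: the library
# detects the face and reduces it to a 128-dimensional embedding.
reference = face_recognition.load_image_file("watchlist_photo.jpg")
watchlist = face_recognition.face_encodings(reference)  # list of embeddings

# A frame captured from an overhead camera.
frame = face_recognition.load_image_file("camera_frame.jpg")

for candidate in face_recognition.face_encodings(frame):
    # compare_faces measures the distance between embeddings; faces
    # closer than `tolerance` are treated as the same person.
    matches = face_recognition.compare_faces(watchlist, candidate, tolerance=0.6)
    if any(matches):
        print("Possible watchlist match - flag for human review")
```

The important point for the privacy debate is that the embedding, not the photo, is the ‘biometric fingerprint’: once computed, it can be stored, copied and matched indefinitely.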

From a company’s perspective, the technology does have its benefits. Retailers say it aims to reduce in-store theft, especially in high-risk areas, while enhancing the customer experience and optimising shop operations (Zhou, 2024).

Yet privacy concerns cast a long shadow. A careful reading of the privacy policies of companies such as Kmart reveals that they collect not just facial information but also date of birth, purchase history, location data and more (Kmart Australia Group, n.d.). From a consumer’s perspective, the technology raises several concerns. First is the privacy risk: if biometric data is compromised, it can be used by hackers for deepfake attacks (Agarwal & Ratha, 2023). Second is non-transparent collection: a CHOICE consumer survey found that 76% of Australians had their facial data collected unknowingly, often through discreet ceiling cameras (CHOICE, 2022), turning routine shopping trips into involuntary data-gathering sessions. Additionally, unlike passwords, facial features cannot be reset if compromised; security experts warn these datasets could become goldmines for deepfake fraud or identity theft (Dunsin, 2025).
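
The ‘can’t be reset’ point deserves to be made concrete. The hypothetical sketch below contrasts a password credential, which can be re-salted and re-issued after a breach, with a biometric embedding, which is derived from a stable physical feature; all values are invented for illustration.

```python
# Toy contrast between resettable and non-resettable credentials.
# All values are invented; this illustrates reasoning, not a product.
import hashlib
import secrets

def store_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a fresh random salt (PBKDF2-SHA256)."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# After a breach the fix is trivial: choose a new password and the
# stolen salt-and-digest record becomes worthless.
old_credential = store_password("hunter2")
new_credential = store_password("correct-horse-battery-staple")
assert old_credential != new_credential  # the credential has genuinely changed

# A face has no such move. Re-scanning the same person yields roughly
# the same vector, so a stolen template keeps matching for life.
face_embedding_2022 = [0.12, -0.45, 0.33]  # stand-in for a 128-d vector
face_embedding_2025 = [0.12, -0.44, 0.33]  # same person, years later
```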

This furore reveals not only the ethical dilemma of the technology but also a new kind of power game in the digital age. When our biometric features become a circulating currency, perhaps we should ask: on the scales of convenience and privacy, how much ‘face tax’ are we willing to pay for a hassle-free shopping experience? The answer may be hidden in the moment you look up at the camera the next time you walk into a supermarket.

Theoretical critique: When Supermarket Surveillance Becomes a Data Harvester

The importance of privacy has long gone beyond the simple protection of personal secrets; it constitutes the cornerstone of human dignity in the digital age and the invisible operating system on which modern society runs. The facial recognition fiasco at Australia’s Kmart is like a mirror held up to modern society’s ‘data game’. More than a simple privacy controversy, it exposed the disturbing reality that our faces are becoming ‘digital commodities’ on supermarket shelves. Just as ranchers use fences to keep their cattle and sheep penned in, companies are using cameras and algorithms to herd humans into data pastures, where each person’s biometrics become ‘data wool’ to be harvested. The U.S. Federal Trade Commission (FTC, 2024) warns that while traditional passwords can be changed if stolen, compromised facial data is like permanently losing the keys to your life to criminals. Worse still, this data can be traded on the black market for many times the price of credit card information (IBM, 2024).

Although such companies claim that this is simply to reduce theft, it has sparked widespread controversy, with many Australians concerned about their privacy and security. Research shows that companies collect detailed information about consumers and employees while disclosing as little as possible about themselves to regulators; internet companies collect user data but object to users controlling their own digital profiles (Pasquale, 2015). Do you think the ‘privacy agreement’ you sign at the supermarket really protects your information? They may already have your face in their data pockets by the time you click the ‘Agree’ button. AI can now accurately infer your preferences from browsing history, likes and favourites alone (Just & Latzer, 2016). There are even dynamic video-synthesis tools (such as DeepFaceLive) that make static photos ‘move’. Once facial information is leaked, it is as if the lock cylinder of your front door were permanently for sale on the black market, and it is difficult for the user to know how the information leaked or how to erase its footprint. More frightening still, criminals can use these techniques to generate dynamic face videos that threaten personal and property security. Kmart’s later statement is a classic piece of crisis communications in the face of public outcry: the supermarket claimed to have suspended the measure (Zhou, 2022), all the while avoiding the central question of whether the facial data already entered into the system is still proliferating silently deep within its servers.

The Dilemma of Protecting Privacy in Australia

The primary law protecting privacy in Australia is the Privacy Act 1988, enacted by the Federal Government. The Act sets out the Australian Privacy Principles (APPs) (Anyanwu, 2013) and applies to Australian Government agencies and certain private sector organisations (Alazab et al., 2021). The APPs cover all aspects of the collection, use, disclosure, storage and security of personal information (Alazab et al., 2021), and they require relevant entities to take reasonable steps to protect the personal information they hold and to ensure it is handled fairly and lawfully. Despite Australia’s efforts to enhance data protection through the Privacy Act, significant flaws persist in its legal framework. The Act lags behind technological developments, leaving regulatory gaps, especially for biometric data such as facial recognition (Witzleb, 2023). Companies can exploit these loopholes, implementing facial recognition without explicit user consent under loosely defined “reasonable use” standards. Moreover, Australia lacks a statutory tort of privacy, making it difficult for victims to seek legal redress (Alazab et al., 2021).

Meanwhile, people’s perception of privacy and security is inadequate. Survey data show that while 57% of respondents are concerned about companies invading their privacy online, views on whether online privacy concerns are exaggerated, and on ‘I have nothing to hide’, vary by age, gender and frequency of social media use (Goggin et al., 2017). Francis (2017) likewise argues that although people say they are concerned about privacy, their behaviour does not show it. In Australia, even when people find their privacy violated, it is difficult for victims to bring an action in court, largely because of key differences between Australia and the EU in privacy enforcement: Australia lacks constitutional privacy protections, and individuals must first complain to the organisation involved and then rely on the under-resourced OAIC to investigate if they are unsuccessful. This cascade of limited recourse mechanisms makes it difficult for citizens to access justice directly and perpetuates a dearth of case law on data privacy (Daly, 2017). Witzleb (2023) also highlights that the High Court of Australia, in a 2001 judgment, identified obstacles to recognising privacy torts at common law.

The Notifiable Data Breaches (NDB) scheme further reveals operational weaknesses. Organisations retain excessive discretion to determine whether a breach causes serious harm and are granted a 30-day window to respond, which is often extended (Alazab et al., 2021). Additionally, the burden of managing breach consequences is shifted onto individuals, for example by recommending password changes rather than mandating corporate preventive measures.
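
The 30-day window itself is simple calendar arithmetic, which underlines how little it asks of the organisation; the dates below are invented for illustration.

```python
# Illustrative timeline for the NDB scheme's 30-day assessment window;
# the dates are invented for the example.
from datetime import date, timedelta

breach_suspected = date(2024, 3, 1)
assessment_deadline = breach_suspected + timedelta(days=30)

print(f"Decision on 'serious harm' due by {assessment_deadline}")
# For that entire month - plus any extension - affected individuals
# may hear nothing at all about the breach.
```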

Finally, rapid technological advancements, particularly AI and big data, exacerbate risks of data misuse. Current laws insufficiently address automated decision-making and algorithmic discrimination, leading to a widening gap between privacy protection and technological innovation (Anyanwu, 2013).

Decoding China’s Surveillance Governance for Australia’s Privacy System

Firstly, sound laws and regulations are a ‘tranquilliser’ for customers. Effective protection of customer and user privacy requires an overhaul of Australia’s legal framework; the Privacy Act 1988, while an important cornerstone, is lagging behind the challenges of modern technology. In contrast, China’s legislative process in the area of personal information and privacy protection has accelerated significantly, gradually building a comprehensive protection system spanning national laws, administrative regulations, local regulations and technical standards, such as the Personal Information Protection Law and the Data Security Law.

As for the details of the law, biometric data, and facial recognition technology in particular, must be explicitly regulated. Without a comprehensive set of laws and regulations, companies can operate in the regulatory void and invoke ‘fair use’ standards to deploy intrusive technologies without users’ explicit consent. New law should make the collection and use of biometric data subject to express, voluntary consent. Here, too, there are lessons to be learnt from China: under the Personal Information Protection Law (2021), biometric collection requires explicit consent and cannot be the sole authentication method (China Briefing, 2024).
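
Translated into engineering terms, the PIPL-style rule described above implies two checks before any face template is created: a recorded, explicit opt-in, and at least one non-biometric authentication path. The sketch below is a hypothetical illustration of those two checks; none of the names reflect an actual statute or product.

```python
# Hypothetical consent gate reflecting the PIPL-style rules described
# above: explicit opt-in before biometric processing, and biometrics
# never the sole authentication method. All names are illustrative.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    user_id: str
    biometric_opt_in: bool          # must be a separate, explicit 'yes'
    alternative_method: str | None  # e.g. 'pin' or 'card' - required fallback

def may_process_face(consent: ConsentRecord) -> bool:
    """Allow face processing only with opt-in AND a non-biometric fallback."""
    return consent.biometric_opt_in and consent.alternative_method is not None

# A shopper who never ticked the biometric box is simply not scanned.
shopper = ConsentRecord("u123", biometric_opt_in=False, alternative_method="card")
assert not may_process_face(shopper)
```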

What’s more, China’s Regulations on the Administration of Algorithmic Recommendations for Internet Information Services require an algorithm filing system, whereby operators of key algorithms, such as takeaway-delivery platforms and social recommendation systems, must submit a description of their rationale and an ethical assessment to the relevant department. Australia could likewise tighten its regulations along these lines, restricting companies that collect user information and easing public anxiety about privacy; a hypothetical sketch of such a filing follows.
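
As a thought experiment, an Australian equivalent of such a filing could be a structured record that a regulator can index and audit. Everything in the sketch below is invented for illustration; neither China’s regulator nor any Australian one publishes this schema.

```python
# Hypothetical 'algorithm filing' record, loosely modelled on the
# filing system described above. All field names are invented.
from dataclasses import dataclass

@dataclass
class AlgorithmFiling:
    operator: str              # company deploying the algorithm
    system_name: str           # e.g. a recommendation or face-matching system
    purpose: str               # plain-language rationale
    data_collected: list[str]  # categories of personal information used
    ethical_assessment: str    # summary of the self-assessment submitted

filing = AlgorithmFiling(
    operator="ExampleMart Pty Ltd",
    system_name="in-store face matching",
    purpose="loss prevention in high-risk stores",
    data_collected=["facial template", "visit timestamp"],
    ethical_assessment="proportionality and bias review attached",
)
```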

According to the Personal Information Protection Law of the People’s Republic of China (PIPL), people can complain to the internet information department, the market supervision administration or industry authorities (EDPB, 2021). Compared with Australia, where complaints funnel through the OAIC, China’s multi-departmental approach allows more proactive administrative intervention.

Given national and cultural differences, Australia need not replicate the Chinese model wholesale, but it can learn from China’s experience in regulation and accountability. As real-world cameras begin to parse customers’ faces and expressions, the only way to make AI truly work for people’s benefit is to give the law real teeth.

Conclusion

The Kmart case exposes Australia as sitting on a fault line between technology giants and civil rights. The relevant authorities need both to accelerate legislation and practice in algorithmic accountability and to defend the privacy rights of individuals. Only when technology returns to its instrumental nature can people truly let their guard down.

References

Agarwal, A., & Ratha, N. (2023). Manipulating faces for identity theft via morphing and deepfake: Digital privacy. In Handbook of statistics (Vol. 48, pp. 223–241). Elsevier.

Alazab, M., Hong, S. H., & Ng, J. (2021). Louder bark with no bite: Privacy protection through the regulation of mandatory data breach notification in Australia. Future Generation Computer Systems, 116, 22–29.

Anyanwu, C. (2013). Challenges to privacy law in the age of social media: An Australian perspective. Australian Journal of Communication, 40(3), 121–137.

Australian Bureau of Statistics. (2021). Privacy impact assessment: National Health Measures Study. Australian Bureau of Statistics. https://www.abs.gov.au/about/legislation-and-policy/privacy/privacy-impact-assessments/National%20Health%20Measures%20Study%20PIA.pdf

China Briefing. (2024, March 5). China’s facial recognition regulations 2025: Stricter rules for data privacy. China Briefing. https://www.china-briefing.com/news/china-facial-recognition-regulations-2025/

CHOICE. (2022). Kmart, Bunnings and The Good Guys using facial recognition technology in stores. CHOICE. https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store

Daly, A. (2017). Privacy in automation: An appraisal of the emerging Australian approach. Computer Law & Security Review, 33(6), 836–846.

Dunsin, D. (2025). Deepfake and Biometric Spoofing: AI-Driven Identity Fraud and Countermeasures.

European Data Protection Board (EDPB). (2021). Legal study on government access to personal data held by private sector companies. Retrieved April 5, 2025, from https://www.edpb.europa.eu/system/files/2022-01/legalstudy_on_government_access_0.pdf

Federal Trade Commission. (2024). Privacy and data security update: 2023. https://www.ftc.gov/system/files/ftc_gov/pdf/2024.03.21-PrivacyandDataSecurityUpdate-508.pdf

Francis, L. P., & Francis, J. G. (2017). Privacy: What everyone needs to know. Oxford University Press.

Goggin, G. (2017). Digital rights in Australia. The University of Sydney. https://ses.library.usyd.edu.au/handle/2123/17587

IBM. (2024). Data privacy examples: 10 examples of data privacy mishaps. IBM. https://www.ibm.com/think/topics/data-privacy-examples

Just, N., & Latzer, M. (2016). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Kmart Australia Group. (n.d.). Privacy policy. Kmart Australia. Retrieved July 15, 2024, from https://www.kmart.com.au/privacy-policy/

Pasquale, F. (2015). Introduction: The need to know. In The black box society: The secret algorithms that control money and information (pp. 1–18). Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch.3

SBS News. (2022, June 14). ‘Creepy and invasive’: Kmart, Bunnings and The Good Guys accused of using facial recognition technology. SBS News. https://www.sbs.com.au/news/article/creepy-and-invasive-kmart-bunnings-and-the-good-guys-accused-of-using-facial-recognition-technology/h08q8evb1

Witzleb, N. (2023). Responding to global trends? Privacy law reform in Australia. In Data disclosure: Global developments and perspectives (pp. 147–168).

Zhou, N. (2024, November 19). Bunnings’ use of facial recognition technology in stores may breach law, experts say. The Guardian. https://www.theguardian.com/australia-news/2024/nov/19/bunnings-facial-recognition-technology-breach-stores-ntwnfb
