Privacy protection in the digital age: does the 2023 Australian privacy law reform work?

Real-life privacy threats in the mobile phone era

With the rapid development of digital media over the past two decades, the reach of the Internet and social media into daily life has skyrocketed. According to the Australian Bureau of Statistics (ABS), mobile phone penetration was 54 per cent in 2005 and had reached 120 per cent by 2020. At the same time, this means that our personal information and data are continuously synchronised to businesses, application service providers and third-party platforms. How, then, do we gain protection and rights over this information? In response to these issues, in 2023 the Australian government released the Privacy Act Review Report to address the privacy problems caused by this widespread use of data. But how well have its proposals been implemented, and how much have they improved people's daily lives? We explore this in the sections that follow.

Core Changes in Australia’s Proposed Privacy Law Reform 2023

Let us first focus on the major reforms proposed in Australia's Privacy Act Review Report 2023. In my view, three parts are most closely tied to daily life. The first is the regulation of targeted advertising: the practice by which a company or service provider analyses a user's behaviour (e.g. time of use, type of content viewed, locations visited) to infer their preferences and deliver personalised content with precision. This kind of data modelling is particularly common on large platforms (Attorney-General's Department, 2023). The key proposal here is to prohibit targeted advertising to minors in order to protect their healthy development. Because minors have not yet formed mature judgement and are highly susceptible to advertising, this amendment resembles the child protection mechanisms in the European Union's GDPR.

The second part of the reform proposal is the introduction of the 'fair and reasonable' principle, an area that was relatively vague in the previous Privacy Act. The two core ideas behind this reform are 'reasonable expectations' and 'contextual integrity'. Two simple examples illustrate them. When I shop online, I expect the platform to collect my shopping data only to serve my purchase goal, not to push advertising unrelated to what I set out to buy; using my data only in ways I would anticipate is a 'reasonable expectation'. Similarly, 'contextual integrity' means that my shopping data may only be used within the shopping platform and cannot be passed to an advertising platform for commercial purposes.

The third part is 'data subject rights', under which users can require businesses to delete their personal data, move it, restrict its processing, or object to its processing. These four rights strengthen users' ability to control their own data and keep it in their own hands. Whether they are useful in practice, however, and whether they are supported by the relevant technologies and organisations, remains open to discussion.
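The four rights above can be pictured as a single request-handling interface. The sketch below is purely illustrative: none of these names or structures come from the Act, and a real implementation would sit behind identity verification, audit logging and statutory response deadlines.

```python
from dataclasses import dataclass, field
from enum import Enum

class Right(Enum):
    """The four data subject rights discussed above."""
    ERASE = "delete"
    PORT = "move"
    RESTRICT = "restrict processing"
    OBJECT = "object to processing"

@dataclass
class UserRecord:
    data: dict = field(default_factory=dict)
    restricted: bool = False   # processing paused at the user's request
    objection: bool = False    # user has objected to processing

def handle_request(store: dict, user_id: str, right: Right):
    """Apply a data subject request against a simple in-memory store."""
    record = store.get(user_id)
    if record is None:
        return "no data held"
    if right is Right.ERASE:
        del store[user_id]            # right to deletion
        return "erased"
    if right is Right.PORT:
        return dict(record.data)      # portable, machine-readable copy
    if right is Right.RESTRICT:
        record.restricted = True      # right to restrict processing
        return "processing restricted"
    record.objection = True           # right to object
    return "objection recorded"
```

The point of the sketch is that each right is cheap to express in code; the hard part, as the paragraph above notes, is the surrounding enforcement and interoperability, not the mechanics.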

Privacy Law Reform Proposals Coexist with Progress and Shortcomings

Children using social media

Children’s privacy protection remains inadequate

Although the targeted advertising reforms include a child protection component, they lack concrete operational measures; most of the proposals remain at the level of principle rather than child-centred legislation. Compared to the GDPR, for example, there are no clear criteria defining the age of a 'child'. There is also little technical regulation of the platforms children use every day, such as which platforms meet the standards for children's use. Protections against child-targeted advertising are likewise insufficient: although such advertising is restricted, children can still be served personalised ads triggered by tagging, searching and other behaviours, and the concrete implementation criteria and checklists are not clearly set out (Livingstone & Third, 2017).

To address this, a specific child protection checklist should be established and platform supervision strengthened: for example, strictly prohibiting data analysis of, and any form of advertising push to, children and adolescents without lawful permission, and offering modes free of advertising and data tracking, such as a 'child mode'. The role of parents as guardians could also be strengthened through a 'parental supervision and consent mechanism', so that the child mode can only be activated with parental permission, explicitly shielding children's development from undesirable content. The EU's GDPR could also serve as a reference for setting control obligations according to the sensitivity of a platform's data, for example different levels of regulation by platform type (education, video entertainment, social, shopping).

The principle of ‘fairness and reasonableness’: is it really working?

“When people consent, they do so under conditions of inequality, information asymmetry, and functional dependence on platforms.”— Suzor, 2019.

The 'fair and reasonable' principle adapts and improves some of the vague and structurally unreasonable concepts in the Privacy Act 1988. It proposes a new approach to governance that is, in effect, a further interpretation of the theory of 'contextual integrity': it judges information handling not by whether the user has consented but by whether the information is used appropriately as it flows. The crux of this theory, however, is an ethical judgement about the context of processing, and the proposal establishes no framework of ethical standards, such as external 'risk assessment committees'. The proposal also leaves 'reasonable expectation' vaguely defined, which allows companies to circumvent the Act; giving platforms the primary right to decide what is 'reasonable' could become a tool for legitimising their own data processing (Flew, 2021). Moreover, unlike the GDPR, Australia's 2023 reform proposal does not make breach of the fairness and reasonableness principle an offence, and it lacks strong regulation.

To better address these issues, we can refer to the requirements of the Legitimate Interest Assessment (LIA) in the GDPR and establish in advance a unified industry model for judging 'reasonable expectations', giving users and platforms a standardised assessment tool.
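As an illustration only, such a standardised tool might take the form of a weighted checklist that a platform must score before a proposed data use. The questions, weights and passing threshold below are invented for this sketch; they are not drawn from the GDPR's LIA or from the Australian proposal.

```python
# Hypothetical 'reasonable expectations' checklist. Questions, weights and the
# 75% pass threshold are illustrative assumptions, not regulatory requirements.
QUESTIONS = {
    "purpose_matches_collection_context": 3,  # contextual integrity
    "user_would_expect_this_use": 3,          # reasonable expectations
    "data_is_minimised": 2,
    "easy_opt_out_available": 2,
    "no_child_or_sensitive_data": 3,
}

def assess(answers: dict) -> dict:
    """Score a proposed data use; a True answer earns the question's weight."""
    score = sum(w for q, w in QUESTIONS.items() if answers.get(q, False))
    total = sum(QUESTIONS.values())
    # Child/sensitive data is treated as a hard gate, not just a score item.
    passed = answers.get("no_child_or_sensitive_data", False) and score >= 0.75 * total
    return {"score": score, "total": total, "passed": passed}
```

A shared model like this would at least make a platform's 'reasonableness' judgement auditable, rather than leaving it to the platform's private discretion.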

Yoti Company

For example, the UK ICO's Regulatory Sandbox, launched in 2019, is designed to help organisations innovate in their use of personal data while ensuring it is used lawfully and compliantly (Information Commissioner's Office, 2022). One instance is the ICO's partnership with the identity verification company Yoti. Yoti uses artificial intelligence to analyse users' facial data to estimate their age; the technology is widely deployed on platforms that need to verify a user's age, and is therefore prone to breaching data protection regulations. The main point of contention under the UK GDPR was whether the biometric data was used to identify individuals: if it was, it would count as a special category of data subject to stricter processing conditions to prevent infringement of personal data. Working with the ICO resolved this concern: Yoti showed that the facial image data used to estimate a user's age served only to filter and classify the user, not to identify them, and was therefore not processed as a special category of data in contravention of the UK GDPR (Yoti, 2024). The Australian government could draw on the UK's Regulatory Sandbox mechanism so that businesses and users have a uniform assessment framework.

We need to take our rights into our own hands.

The proposed Privacy Act reform sets out a clear scope of user data control under 'data subject rights'. 'These rights empower users to move from being passive to active in the digital ecosystem, an important step forward from data ownership to data sovereignty' (Albrecht, 2016). The new proposal treats the user as a participant in data governance, strengthens users' rights, aligns more closely with the EU GDPR, and is very helpful for cross-border mutual recognition of data. A series of issues nonetheless remain unresolved: the lack of enforcement mechanisms and technical support makes user rights hard to exercise, and the lack of harmonised data standards means different enterprises may use different data structures, making migration of users' personal data difficult in practice.

Some measures could remedy these shortcomings. First, the binding force of the reform proposal needs to be reinforced: 'Empowerment must be matched by accountability if it is to be more than symbolic' (Flew, 2021). Incorporating the powers referred to in the reforms into the provisions of the Privacy Act would make them fundamental rights for users.

Advertising accounts for approximately 98% of Meta’s revenue. Its platforms include Facebook, Instagram and WhatsApp. Photograph: Nikolas Kokovlis/NurPhoto/REX/Shutterstock


For instance, in a complaint in the UK in March this year, Meta faced an allegation from a UK user, Tanya O'Carroll, that it had breached the UK's data protection laws by continuing to process her personal data for targeted advertising despite her explicit request that it stop, a position ultimately backed by the UK Information Commissioner's Office (ICO) (Hern, 2025). The case clearly demonstrates that, although the Privacy Act Review Report 2023 is a real conceptual advance for user privacy protection, it still lacks operationalisation, incorporation into law and enforcement mechanisms.

‘Paper rights’ are not what we really want.

Australia's proposed privacy law reform of 2023 is undoubtedly a key step towards protecting user privacy and data autonomy in the digital age. Truly effective digital rights, however, cannot stop at policy provisions. The next step in Australia's implementation must be to take these rights from paper into real life through clear enforcement mechanisms, supporting technical infrastructure and legally binding frameworks. Protecting privacy is not just a legal and technological challenge; it is a social movement about fairness, justice and dignity. What each of us needs is not 'paper rights' that exist only on paper, but digital sovereignty that can actually be exercised.

Reference List

Attorney-General’s Department. (2023, February 16). Privacy Act review report. Australian Government. https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report

Albrecht, J. P. (2016). How the GDPR will change the world. European Data Protection Law Review, 2, 287.

Livingstone, S., & Third, A. (2017). Children and young people’s rights in the digital age. New Media & Society, 19(5), 657–670.

Lynskey, O. (2015). The foundations of EU data protection law. Oxford University Press.

Flew, T. (2021). Regulating platforms. Polity Press.

Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.

Information Commissioner’s Office. (2022). Regulatory Sandbox Final Report: Yoti. https://ico.org.uk/media/for-organisations/documents/4020427/yoti-sandbox-exit_report_20220522.pdf

Yoti. (2024). Yoti Age Estimation White Paper – September 2024. https://www.yoti.com/wp-content/uploads/2024/11/Yoti-Age-Estimation-White-Paper-September-2024-PUBLIC.pdf

Hern, A. (2025, March 22). Meta confirms it is considering charging UK users for ad-free version. The Guardian. https://www.theguardian.com/technology/2025/mar/22/meta-confirms-it-is-considering-charging-uk-users-for-ad-free-version
