In our journey towards a digitized world, “I have read and agree to the User Agreement and Privacy Policy” has become a default option we must face every time we download a new app. Without ticking that box, we can’t proceed. Yet, the truth is, few of us have the patience to sift through those lengthy, dense terms that could rival a graduation thesis.
Our daily online communications are in fact built on infrastructure owned by service providers. Legally, when we register for a new service, we possess only the rights stipulated in that contract (Suzor, 2019). Even if we do read the terms, our options when confronted with disagreeable or unacceptable clauses are severely limited: we can either accept them or opt out and never reach the next screen.
In this era, increasingly permeated by digitalization, every time we click the “Agree” button online, we are silently conducting a transaction: exchanging our personal information for the convenience of life. But what is the true cost of this transaction? Faced with such a trade-off, are we truly prepared?
“The age of privacy is over”
– Mark Zuckerberg
Digital Privacy and Rights Are Indispensable
Privacy has long been regarded as an inalienable human right. As early as 1890, Warren and Brandeis introduced the concept of “the right to be let alone” in the Harvard Law Review, emphasizing an individual’s control over their information, private life, and personal space – the right not to be disturbed by the outside world.
However, in today’s digital world, the definition of privacy has far exceeded the traditional sense of the “right to be alone” or the simple need to prevent unauthorized access to personal information. The potential leakage of personal privacy online has become one of the focal points of concern in the digital media and internet sectors (Flew, 2021).
Moreover, digital rights – emerging human rights that have developed alongside the internet (Hutt, 2015) – ensure our safety, freedom, and fair use of digital technology in the digital age. As technology advances and digital life deepens, these rights have become increasingly important.
Who is Sharing Our Data?
With the advancement of 5G technology, high-speed connections, low latency, and low power consumption have become possible, turning the electronic products around us into real-time online sensors. The widespread adoption of the Internet of Things has blurred the boundary between online and offline life, transforming people’s physical existence into streams of network data. From everyday shopping to buying a house or a car, every aspect of our lives generates massive amounts of data.
In this context, the issue of privacy rights has gained a new dimension – it’s not only about the provision of information but also about the balance between privacy rights and free online services, as well as the extent to which businesses and government institutions utilize big data for personal analysis without user consent (Flew, 2021).
When we visit a new website, we are usually prompted to accept all cookies. These cookies help the browser remember not only our shopping history on Taobao, from SK-II to the iPhone 15, but also our usernames and passwords, simplifying the login process. However, cookies record more than shopping information: they can also track the news we browse, the movies we watch, and the diseases and medications we search for online.
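The mechanics behind this tracking are simple: the server sends a Set-Cookie header, the browser stores it, and the stored identifier travels back with every later request to the same site. A minimal sketch using Python’s standard library, with invented header names and values purely for illustration:

```python
from http.cookies import SimpleCookie

# A hypothetical Set-Cookie header a shopping site might send
# (the cookie names and values here are invented for illustration).
raw = "session_id=abc123; tracker=user-42; Path=/; Max-Age=31536000"

# The browser parses and stores each cookie it receives...
jar = SimpleCookie()
jar.load(raw)

# ...and returns the stored values with every subsequent request,
# which is what lets the site (or an embedded third party) recognise
# the same visitor across pages and sessions.
for name, morsel in jar.items():
    print(f"{name}={morsel.value}")
```

A long `Max-Age` like the one above is what turns a convenience cookie into a persistent identifier: the browser will keep sending it back for a year unless the user clears it.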
As we become increasingly reliant on online shopping, socializing, and information sharing, our digital rights – especially privacy rights and the right to freedom of speech – become particularly important. We must clearly understand how companies, governments, and internet giants (such as TikTok, Facebook, Google) are using our data. Are these data being treated fairly and cautiously, or are they being sold or shared without our consent?
When Face-Changing Apps Surpass Your Consent
In recent years, AI face-changing apps have seen massive popularity on social media. These applications allow users to upload personal photos and choose the facial features of celebrities they wish to mimic, creating natural and realistic face-swapped results.
Zao, a standout in this craze, has not only achieved tremendous success in the Chinese market but also enabled many users worldwide to fulfill their dreams of sharing the screen with celebrities.
However, behind these simple upload-and-click actions lies a potential risk to user privacy. Using face-changing software like Zao involves more than mobile phone verification at login: to download or share a face-swapped video, users must perform a series of actions in front of the camera – blinking, turning the head, opening the mouth – to verify their identity, much like the facial information collection used in face-recognition payments.
More worryingly, according to the user agreement, the software may freely use and modify the user’s portrait and grant those rights to any third party for information trading. This means that more people could see and use your photos, or even spread them further.
Zhu Wei, the deputy director of the Communication Law Research Center at China University of Political Science and Law, pointed out that this process involves significant risks of personal information privacy leakage. He emphasized that collecting users’ dynamic and static facial photos essentially gathers the user’s core facial recognition information, the most sensitive part of personal privacy. Once misused, these details could pose a severe threat to the user’s property and personal safety (Pan, 2019).
Do We Really Care About Our Privacy?
Recently, Penguin Intelligence conducted a detailed survey of Chinese netizens through its research platform, the results of which are compiled in the “Chinese Netizens’ Personal Privacy Status Report.”
The survey reveals a thought-provoking status quo:
- Only 16.1% of respondents thoroughly read the “terms and agreements” they are presented with. Over 40% either skim through them or simply check “agree.” The main reasons cited for not reading are the excessive length of the agreements and the complexity of the terminology.
- The survey also found that nearly 40% of netizens frequently worry about their information being leaked online, while less than 10% are “not worried at all.”
These findings raise a fundamental question: what is our attitude towards privacy in the digital age? While concerns about data leakage are widespread, most people seem willing to sacrifice a degree of privacy for convenience. This suggests that although privacy rights are universally deemed important, when privacy conflicts with daily convenience, our choices often favor convenience.
Should We Entrust Our Data to the Government?
As digitalization deepens, personal privacy protection has become a global focus. The United States, a pioneer in personal privacy protection, introduced the Computer Matching and Privacy Protection Act as early as 1988 to address new challenges of the internet era. The European Union followed with the landmark General Data Protection Regulation (GDPR), adopted in 2016 and in force since 2018, which requires explicit consent from users before their data is processed, enhancing the transparency and fairness of data handling.
However, data protection legislation differs significantly across regions. Europe treats the protection of personal information as part of human rights, aiming to secure personal data and individual privacy. The United States, by contrast, focuses more on the property rights in data, emphasizing consumer protection and national security (Jia & Zhao, 2022). In Asia, China enacted the Personal Information Protection Law in 2021, marking an increased emphasis on personal data protection, though its handling of big data and new privacy trends still needs strengthening and refinement.
Australia is also actively exploring best practices for data protection. Through the Australian Competition and Consumer Commission’s (ACCC) ongoing public inquiries, which cover personal information protection, the relationship between platforms and data brokers, and transparency requirements, it strives to create a fairer and more transparent data-processing environment.
Yet, laws alone cannot be the sole means of protection; safeguarding privacy requires collaboration from multiple fronts.
- Purpose of collection: Government data collection typically serves public services, national security, and public health monitoring, whereas companies usually collect data for commercial purposes, such as personalized advertising, market analysis, and product development.
- Regulation and oversight: In some countries and regions, governmental bodies are subject to stricter regulation and public oversight, and their data usage must comply with specific legal frameworks. Corporate data-handling policies and practices tend to be more flexible, though they too face strict regulation in some jurisdictions (such as the EU).
- Data usage and sharing: Governmental data sharing usually follows specific legal procedures, whereas corporate data sharing may rest on commercial agreements that leave users in the dark. Online service terms grant service operators and digital platforms considerable discretion in setting and enforcing their rules (Suzor, 2019).
In this global digital era, although laws provide a basic framework for data protection, collaboration among individuals, governments, and corporations in data sharing and privacy protection remains crucial. This is not just a legal issue but a broad societal concern, involving how we balance personal privacy rights with public interests and secure personal data in the digitalization process.
Finding the Balance Between Personal Privacy Rights and Public Interests
In our daily lives, we often face choices about whether to share personal information for a service or product. For example, downloading an app may require access to our location, contact list, and other sensitive information. In these instances, we must weigh the value of this information exchange.
Public interests often involve areas beyond individual interests, such as public safety and health. Against this backdrop, restricting certain privacy rights may become inevitable. The Health Code application during the COVID-19 pandemic is one example. In China, people needed to scan their Health Code wherever they went to record their location information for tracking and controlling the spread of the virus. While this practice somewhat limited personal privacy, it was deemed necessary for public health safety (Flew, 2021).
Thus, finding a balance between personal privacy rights and public interests requires extensive dialogue among policymakers, technology platform operators, and the public. By establishing clear legal frameworks, strengthening data protection measures, raising public awareness, and ensuring transparency and fairness in the process, we can protect individual privacy while also addressing public interests.
Conclusion
In the current digital era, complete anonymity is nearly impossible. Our focus should therefore not rest solely on whether privacy and data are recorded or known, but on the right to know and the right to benefit: everyone is entitled to understand how their data is used and to benefit from it. Governments are responsible for developing policies that ensure appropriate regulation of how any platform collects, analyzes, and uses data.
Ensuring data security and personal privacy protection requires the joint efforts of law, technology, and society, as well as careful consideration and understanding from individuals before sharing personal data. Raising awareness about personal privacy and understanding the context of data usage are particularly important in a digital environment.
As digitalization progresses, finding an appropriate balance between maintaining personal privacy and pursuing public interests has become a significant challenge we face. This issue transcends technology and law, involving complex societal values and ethical principles. How we achieve this balance in digital society governance will be key to advancing societal progress.
References
Flew, T. (2021). Regulating Platforms (pp. 99–110). Polity Press.
Hutt, R. (2015, November 13). What are your digital rights? World Economic Forum. https://www.weforum.org/agenda/2015/11/what-are-your-digital-rights-explainer
Jia, W., & Zhao, L. (2022). Personal data protection in the era of the data economy. Institute for Global Cooperation and Understanding, Peking University. https://www.igcu.pku.edu.cn/info/1026/4963.htm
Pan, Y. (2019). Behind the AI face-swapping app trend: Potential for leaking core personal information. The Paper. https://www.thepaper.cn/newsDetail_forward_4311257
Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.