Introduction
In today’s digital age, personal information has become our most valuable currency. Every click, every search, and every social media interaction is cataloged, quietly sketching a picture of our digital identity. Yet when we swipe the screen or click “yes” on a privacy agreement, do we really know where our data is going? A 2018 survey found that 91% of adults agreed or strongly agreed that consumers have lost control over how companies collect and use their personal information, and only 9% of respondents were “very confident” that their data would be protected by social media companies (Rainie, 2018). Even so, when digital privacy is at stake, most people still accept the privacy terms proposed by major platforms. This is where a peculiar dilemma emerges: even though we express concern about our privacy online, our actions often suggest a startling indifference. As our lives become increasingly intertwined with the Internet, privacy has transformed from a basic right into a complex paradox, and people’s control over it has fallen into an ever more complicated dilemma.
Privacy and the Privacy Paradox in the Digital Realm
The phrase “the right to be let alone” is frequently cited as the earliest definition of privacy. It comes from a seminal essay by Samuel Warren and Louis Brandeis (later a Supreme Court Justice) published in the Harvard Law Review in 1890 (Francis & Francis, 2017). However, with the rapid development of the Internet and digital technology, the scope of personal data has expanded greatly, driving the evolution of the definition of privacy. Privacy is no longer seen merely as the confidentiality of information; it now encompasses the reasonable use of data, control over how data are used, and requirements for transparency (Gellert, 2021). At the same time, people’s concerns have shifted from the simple protection of personal information to a comprehensive consideration of how that information is collected, processed, and used (Marmor, 2021). This shift suggests that privacy has become more complex, involving not only the protection of personal information but also the impact of technology.
The term “privacy paradox” first emerged in the late 20th century as the Internet became widely adopted, and as concerns about online privacy grew it gained significant academic and public attention in the early 2000s. The concept describes the discrepancy between people’s expressed concerns about privacy and their actual behavior online: although individuals claim to value their privacy highly and fear having their personal information shared, they continue practices that compromise it, such as posting personal information on social media or agreeing to ambiguous privacy policies without fully understanding them (Martin, 2020).
Does the privacy paradox mean people don’t care about their privacy?
In the digital environment, the amount of information that can be shared, the trade-offs between privacy rights and access to free online services, and the potential for commercial interests and government agencies to use big data for personal profiling have all given privacy issues new significance (Flew, 2021). The privacy paradox has likewise become more complicated. But does online participation and information disclosure mean that consumers inherently accept, or simply disregard, privacy risks? Do people really not care about their privacy?
The answer is clearly no. The concept of a privacy trade-off is often used to explain why the privacy paradox exists: users are assumed to have made a risk–return assessment between privacy protection and access to services, and ultimately to have chosen convenience over privacy. In fact, however, there is a huge information gap between users and companies. The terms of most online service agreements are complex, vague, and legalistic, and users are often given only an all-or-nothing choice (Flew, 2021). As a result, many users may not fully understand how their data are collected and used, or the risks involved.
Kirsten Martin (2020) strongly opposes the way companies use the privacy paradox to rationalize privacy violations, arguing that this rationalization rests on flawed assumptions about consumer behavior and privacy expectations. Through experiments using factorial vignette surveys and trust games, she shows that privacy concerns persist after data disclosure: consumers maintain strong privacy expectations even after handing information to companies, and breaches of privacy norms seriously erode consumer trust. These findings highlight that consumers do not simply trade privacy for benefits, as privacy paradox narratives often claim; they still value privacy and react negatively when it is violated. One way to resolve the paradox, then, is to acknowledge that disclosing information does not mean individuals no longer value their privacy or have given up all expectations of it. Companies and online platforms have a responsibility to respect and protect individuals’ privacy even after personal information has been disclosed. This involves a broader understanding of privacy that goes beyond the act of disclosure, recognizing privacy as a core value to be upheld in all circumstances.
Case Study: The Zoom privacy leak
As demand for remote work and online learning surged during the COVID-19 pandemic, Zoom quickly became the platform of choice for millions of people around the world. Its rapid growth, however, also exposed significant concerns about data collection and information security.
In 2020, Zoom was sued by customers who claimed that its sharing of data with Facebook constituted an invasion of personal privacy. Analysis of the app confirmed that the iOS version of Zoom was indeed sharing analytics data with Facebook, covering not only users who logged in to Zoom through Facebook but also those who had no Facebook account at all (Cox, 2020). This kind of data transfer is actually quite common: many apps share information with Facebook and use the Facebook software development kit (SDK) to implement functionality more easily.

The crux of the matter is that, given Zoom’s ambiguous privacy terms, users might never become aware of this. The privacy policy stated that the company might collect users’ “Facebook profile information (when you use Facebook to log in to our products or create an account for our products),” but it never mentioned collecting information from Zoom users who had never opened a Facebook account and sending that data to Facebook. Users might never know that logging into the platform sent their data to another service.

After the news became public, Zoom’s CEO, Eric Yuan, issued a statement confirming the data collection. He explained that Zoom used the Facebook SDK to implement the “Log in with Facebook” feature as another convenient way to access the platform. “However, we recently became aware that the Facebook SDK was collecting unnecessary device data,” the statement reads. To resolve the problem, Zoom removed the Facebook SDK and reconfigured the feature so that users could still log in with their Facebook accounts (Yuan, 2020). In effect, Zoom exploited a gray area in its privacy policy and created a dilemma of user informed consent.
The case reveals that in complex technical environments, even when users are provided with privacy policies, their ability to understand and control their personal data remains very limited.
What should governments do?
Zoom’s case highlights the need for governments to adopt laws and policies that protect users’ online privacy.
- Strengthening data protection laws
Strong regulatory frameworks and enforcement mechanisms play a key role in privacy protection. For example, the European Union’s General Data Protection Regulation (GDPR) sets strict guidelines for the processing and protection of personal data. The GDPR not only emphasizes consent and data minimization but also gives people substantial control over their personal data, including the ability to access, edit, and remove information (Dove, 2021). The strictness and broad coverage of such laws force companies like Zoom to adopt privacy-first policies at scale. Moreover, under the GDPR, companies face significant penalties for non-compliance, which can effectively deter the misuse of personal data: fines may amount to as much as 4% of a company’s annual worldwide turnover (Dove, 2021), which helps ensure the law’s effectiveness.
- Enforcing transparency requirements
In addition to strict and effective regulations, enforcing transparency requirements is another major way to protect user privacy. Take the California Consumer Privacy Act (CCPA) as an example: it requires companies to disclose what data they collect, why they collect it, and with whom they share it. The CCPA also authorizes the California Attorney General to enforce these rules, underscoring the government’s role in protecting consumer privacy (de la Torre, 2018). Recognized as one of the most important pieces of privacy legislation in the United States, the CCPA illustrates key steps a government can take to increase transparency and protect users’ digital privacy. Returning to the case of Zoom: after the privacy leaks emerged, Zoom began publishing transparency reports detailing the data requests it received from governments and other entities. This suggests that transparency can both build consumer trust and hold companies accountable for handling user data ethically.
- Promoting digital literacy and public awareness
Individuals’ understanding of privacy-related information directly affects the security of their online privacy. Research has shown a strong correlation between a person’s level of digital literacy and their online privacy behaviors: greater digital literacy leads to more careful privacy practices (Park, 2013). By improving digital literacy, governments can empower users to make informed decisions and take proactive steps to protect their privacy. The Australian government, for example, has invested in digital literacy programs aimed at educating citizens about online privacy risks and data protection. The “Be Connected” program is a prime example: a campaign specifically designed to improve the digital skills and online-safety awareness of older people, it provides a wide range of resources and personalized support to help them understand digital technology, including using devices safely, managing online privacy, and understanding social media platforms (Australian Government, n.d.). In addition, the Australian Media Literacy Alliance (AMLA) promotes critical engagement with media for all age groups. The knowledge it seeks to popularize, including how media is produced, who it is intended for, and the technologies involved in its creation and distribution, is critical to preventing misinformation and managing privacy online (AMLA, n.d.). These programs highlight the proactive steps governments are taking to improve citizens’ digital literacy, ensuring they can protect their privacy online and navigate the digital world more safely.
Conclusion
In conclusion, privacy security remains a major issue in an increasingly complex digital environment, and the information gap between users and companies keeps the privacy paradox alive. Although consumers may not fully understand the impact of their online actions, they still fundamentally care about their privacy and should be given meaningful control over it.
As described above, the case of Zoom illustrates the complex relationship between digital convenience and privacy concerns. The changing digital landscape requires both technology companies and regulators to bridge the gap between privacy concerns and consumer behavior. Strengthening data protection laws, enforcing transparency requirements, and promoting public awareness are three main ways to help consumers gain more control over their privacy. Ultimately, resolving the privacy paradox requires a collaborative effort from all stakeholders to create an environment where digital engagement is safe, private, and user-centric. By acknowledging the complexities of digital privacy and proactively addressing its multifaceted challenges, society can navigate the waters of online information sharing more confidently and safely.
Reference list
Be Connected. (n.d.). Be connected – every Australian online. Australian Government. https://beconnected.esafety.gov.au/
Cox, J. (2020, March 27). Zoom iOS app sends data to Facebook even if you don’t have a Facebook account. VICE. https://www.vice.com/en/article/k7e599/zoom-ios-app-sends-data-to-facebook-even-if-you-dont-have-a-facebook-account
de la Torre, L. (2018). A guide to the California Consumer Privacy Act of 2018. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3275571
Dove, E. S. (2021). The EU General Data Protection Regulation: Implications for international scientific research in the digital era. Journal of Law, Medicine & Ethics, 46(4), 1013–1030. https://doi.org/10.1177/1073110518822003
Flew, T. (2021). Issues of Concern. Regulating platforms. Polity Press.
Francis, L., & Francis, J. G. (2017). Privacy: What everyone needs to know. Oxford University Press.
Gellert, R. (2021). Personal Data’s ever-expanding scope in smart environments and possible path(s) for regulating emerging digital technologies. International Data Privacy Law, 11(2), 196–208. https://doi.org/10.1093/idpl/ipaa023
Marmor, A. (2021). Privacy in social media. In C. Véliz (Ed.), The Oxford Handbook of Digital Ethics (pp. 575–589). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780198857815.013.31
Martin, K. (2020). Breaking the privacy paradox: The value of privacy and associated duty of firms. Business Ethics Quarterly, 30(1), 65–96. https://doi.org/10.1017/beq.2019.24
Park, Y. J. (2013). Digital Literacy and Privacy Behavior Online. Communication Research, 40(2), 215–236. https://doi.org/10.1177/0093650211418338
Rainie, L. (2018). Americans’ complicated feelings about social media in an era of privacy concerns. Pew Research Center.
The Australian Media Literacy Alliance. (n.d.). The Australian Media Literacy Alliance. AMLA. https://medialiteracy.org.au/
Yuan, E. S. (2020, March 27). Zoom’s use of Facebook’s SDK in iOS client. Zoom Blog. https://www.zoom.com/en/blog/zoom-use-of-facebook-sdk-in-ios-client/