
In our contemporary society, where technology advances exponentially, the digital wave is sweeping across the world, ostensibly treating all members of society equally and creating the illusion of fair coverage. In-depth analysis, however, reveals a serious problem of age inequality lurking behind the scenes. Access to privacy is particularly difficult for those who are marginalized in other areas of life (Marwick & Boyd, 2025). Older persons, a significant part of the social fabric, are now caught in an unprecedented privacy dilemma. As technology iterates, they are being marginalized under the guise of “goodwill,” owing to factors such as a lack of technological adaptability and gaps in information literacy, and are struggling to survive the wave of digital transformation.

What is Privacy?
The right to privacy is often considered an inherent and fundamental human right and one of the cornerstones of individual freedom and dignity. Nonetheless, in reality, this right is not absolute, but often needs to be balanced against other legal rights, social obligations and widely held moral or legal norms (Flew, 2025).
From a historical and philosophical perspective, the right to privacy is a rich concept that varies with historical and cultural context. Scholar Alexandra Rengel points out that the right to privacy is not a single, fixed concept but encompasses multiple meanings and layers. It can be understood as the right of a person to be treated as an individual: people should not be regarded merely as part of a group but should be respected and treated individually. It also implies that an individual has the right to organize his or her own life without undue interference, whether from the surveillance of state power or the excessive prying of others.
In addition, the right to privacy includes the freedom to keep secrets – that is, the individual can decide for himself or herself what information is made public and what is kept secret. It is also about the right to control personal information, a point that is particularly important in the digital age: how to protect users’ sovereignty over their own data amid widespread data collection by online platforms and tech companies has become a central topic in global privacy debates.
Taking it a step further, Rengel argues that the right to privacy also involves the protection of individuality, personality, and human dignity. Each person has a unique lifestyle, thoughts, feelings, and behaviors, and the guarantee of the right to privacy is precisely a recognition of and respect for these differences. Finally, the right to privacy is also expressed in autonomous control over one’s intimate relationships and certain private areas of one’s life. For example, a person has the right to decide whether or not to disclose information about his or her family life, emotional relationships, or medical treatment without coercion or intrusion from outside (Rengel, 2013).

The Exploitation of Privacy Behind ‘Senior-Friendly’ Products
Many tech products on the market now fly the ‘senior-friendly’ flag, such as health-monitoring bracelets and smart-home devices. Under the guise of caring for the health and convenience of the elderly, these products are quietly collecting large amounts of sensitive data.
More than 90% of health-related websites have code that triggers data transfers to third parties (Libert et al., 2015). From the health status of the elderly to their whereabouts and even their consumption habits, nothing is left out. Moreover, older adults in one study held simplistic conceptions of information privacy in general, including what information was gathered, where it was kept, who might access it, and its potential uses; for most participants, these were unfamiliar, ambiguous, or simply boring ideas (Lorenzen-Huber et al., 2010).
When companies promote these products, their complex data-use terms and privacy policies read like an indecipherable text to the less digitally literate elderly. This begs the question of whether companies are exploiting the cognitive gap between older people and modern technology, viewing them as a ‘low-risk, high-value’ source of data. From the perspective of technological anthropology, this is a manifestation of ageism in the digital age: elderly people are constructed as the ‘technological other’ because of their relatively slow acceptance of new technologies, their privacy is deemed less important in this marginalized perception, and they in turn become the target of reckless corporate data collection. Analyzed in terms of political economy, the data of the elderly population has enormous commercial value in areas such as health insurance and the pension industry. By collecting such data, companies can precisely segment the market, develop targeted products and services, and maximize profit, while older people become targets of exploitation in the process.

Forced ‘Digital Survival’ and the Sacrifice of Privacy
Currently, a wave of comprehensive digitization of public services is advancing at an unprecedented rate. While this change has brought convenience to most people, it poses a huge challenge to older people. In China, for example, medical registration, an important service for the health of the elderly, has moved almost entirely online. In the past, seniors could register at the hospital window with the assistance of staff without any problems; now they must navigate a complicated online registration process.
To access these basic public services, the elderly must also accept complicated privacy terms. When registering for various online service platforms, they are confronted with long, obscure privacy clauses. Often unfamiliar with electronic devices, they are unable to study these carefully, yet to enjoy the associated rights and benefits they have no choice but to click ‘agree’; otherwise, they cannot use the services at all. It is an entirely helpless position.
During the epidemic, the ubiquity of the ‘health code’ exposed this problem to the fullest extent. In many situations, such as traveling and entering public places, the ‘health code’ became an indispensable passport. Many elderly people who do not have smartphones, or who have them but do not know how to operate the relevant software, have been mercilessly excluded from public life and reduced to ‘digital outcasts.’
One news report described an elderly man who was barred from entering a Dalian metro station because he could not present a health code or a paper epidemic pass. Although the metro operator later reflected on its failure to accommodate special populations, the problem extends nationwide: health-code checking requirements vary across regions and occasions, and accommodations for special groups are similarly inconsistent (Leopard’s Ear Health, 2019).
This phenomenon is by no means merely a problem of technical application; at a deeper level, it reflects the neglect of the rights and interests of elderly groups in society’s digital transformation. The elderly are forced into a difficult choice between enjoying basic services and sacrificing privacy, a situation that is clearly unreasonable. In vigorously promoting digitalization, society has prioritized rapid technological development and the convenience of the young while failing to take full account of the special needs and abilities of the elderly. Older people learn new things relatively slowly and are less skilled with electronic devices, yet society has not provided them with adequate transition programs and assistance. This leaves them in an extremely awkward position, struggling in the digital wave and unable to share smoothly in the fruits of social development.

The Paradox of Anti-fraud Propaganda
Society has long emphasized that ‘older people should be wary of online scams,’ and this is necessary. What we cannot ignore, however, is the negative role that platforms play in this process.
Misinformation, or information that lacks veracity, is something older individuals frequently encounter on social media while looking for further information (Jaster & Lanius, 2021). Misinformation is a broad term encompassing a variety of information hazards such as fabrications, rumors, conspiracies, hoaxes, fakes, and clickbait (Flintham et al., 2018); it has been spreading on platforms in ever-increasing amounts and on an ever-widening range of subjects, from elections and referendums to health issues and climate change (McClure Haughey et al., 2022). Unsurprisingly, many of these misinformation topics were, and perhaps still are, of interest to older adults (Sharevski & Vander Loop, 2023). But while we tend to blame privacy risks simply on the ‘ignorance’ of older people, we overlook the original sin of the algorithms behind the tech companies.
Addressing the question of how AI systems encode, generate, and reinforce age-related biases, one study identified two main phenomena that contribute to digital ageism. First, older people are severely underrepresented in the datasets used to train models, largely due to sampling bias, such as their low engagement with data sources like social media. Second, older people’s data is often crudely lumped into broad age groups, masking the diversity within the group and reflecting an age bias in the data-labeling process itself. These issues suggest that AI systems may entrench and amplify discrimination against older people at the design and training stages (Chu et al., 2023).
In pursuit of traffic and profit, tech companies use algorithms to push out large amounts of low-quality and even harmful information, and older people are more vulnerable to being misled by this information due to their lack of digital literacy. This practice of putting the blame entirely on the elderly conceals the responsibility of tech companies in creating a bad online environment. At a deeper level, this reflects an unfair attitude in society’s treatment of the elderly, which does not really address the root causes of the problem but simply blames the victims.
Gaps in Legal Protection
In addition to the health sector, several general data privacy laws have been enacted in recent years. While these laws expand consumers’ rights regarding how their data is collected and processed, they also have limitations that are particularly relevant to older adults (Friedman et al., 2022). For example, the EU General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) allow consumers to know how data processors collect and use their data, enabling them to make informed choices about the entities they interact with. In reality, however, GDPR- and CCPA-compliant privacy notices can be lengthy, dense, and difficult for users to read or understand (van Ooijen & Vrabec, 2018).
Yet due to cognitive impairment or a lack of digital literacy, older persons find it difficult to understand complex privacy provisions and to be genuinely ‘informed.’ This leaves considerable gaps in the law’s protection of older people’s privacy. The law seems to acquiesce to treating older people as ‘incapacitated’ data subjects, and this acquiescence deprives them of their rights in disguise.
From the perspective of law-making, the unique characteristics of the elderly group have not been fully considered, making it impossible for the law to effectively protect their privacy rights and interests in the implementation process. During rapid technological development, the law must keep pace with the times, fill this gap, and provide practical protection mechanisms for the elderly.
Conclusion
The privacy dilemma of the elderly is not a mere technical issue but a microcosm of the social power structure. In the age of technology, the elderly are regarded as objects in need of care on the one hand, yet on the other they have become objects to be harvested for data. Solving this problem cannot stop at the superficial level of ‘teaching the elderly to use mobile phones.’ A real solution requires fundamentally reconstructing the responsibility of technology companies so that they fully respect the privacy rights and interests of the elderly while pursuing commercial interests. Public services must be designed more humanely, fully considering the needs and abilities of the elderly and avoiding excluding them from digital services. The law must establish special protection mechanisms for vulnerable groups to fill the current gaps in protecting the privacy of the elderly. Only in this way can the elderly enjoy dignity and security in the digital age and no longer be a forgotten group on the digital fringe.
References
Chu, C. H., Donato-Woodger, S., Khan, S. S., Nyrup, R., Leslie, K., Lyn, A., Shi, T., Bianchi, A., Rahimi, S. A., & Grenier, A. (2023). Age-related bias and artificial intelligence: a scoping review. Humanities and Social Sciences Communications, 10(1), 1–17. https://doi.org/10.1057/s41599-023-01999-y
Flew, T. (2025). Regulating Platforms. Vitalsource.com. https://bookshelf.vitalsource.com/reader/books/9781509537099
Flintham, M., Karner, C., Creswick, H., Gupta, N., Moran, S., & Bachour, K. (2018). Falling for Fake News: Investigating the Consumption of News via Social Media. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3173574.3173950
Friedman, A. B., Pathmanabhan, C., Glicksman, A., Demiris, G., Cappola, A. R., & McCoy, M. S. (2022). Addressing Online Health Privacy Risks for Older Adults: A Perspective on Ethical Considerations and Recommendations. Gerontology and Geriatric Medicine, 8, 233372142210957. https://doi.org/10.1177/23337214221095705
Jaster, R., & Lanius, D. (2021). Speaking of Fake News. The Epistemology of Fake News, 19–45. https://doi.org/10.1093/oso/9780198863977.003.0002
Leopard’s Ear Health. (2019). Elderly people can’t move an inch without a health code, CCTV anchor Haixia calls for attention. Cn-Healthcare.com. https://www.cn-healthcare.com/articlewm/20200914/content-1145628.html
Libert, T., Grande, D., & Asch, D. A. (2015). What web browsing reveals about your health. BMJ, 351, h5974. https://doi.org/10.1136/bmj.h5974
Lorenzen-Huber, L., Boutain, M., Camp, L. J., Shankar, K., & Connelly, K. H. (2010). Privacy, Technology, and Aging: A Proposed Framework. Ageing International, 36(2), 232–252. https://doi.org/10.1007/s12126-010-9083-y
Marwick, A. E., & Boyd, D. (2025). SAML Login Redirection. Exlibrisgroup.com. https://sydney.leganto.exlibrisgroup.com/leganto/public/61USYD_INST/citation/44140578700005106?auth=SAML
McClure Haughey, M., Povolo, M., & Starbird, K. (2022). Bridging Contextual and Methodological Gaps on the “Misinformation Beat”: Insights from Journalist-Researcher Collaborations at Speed. CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3491102.3517503
Rengel, A. (2013). Privacy in the 21st Century. The Hague: Martinus Nijhoff Publishers.
Sharevski, F., & Vander Loop, J. (2023). Older Adults’ Experiences with Misinformation on Social Media. arXiv. https://arxiv.org/html/2312.09354v1/#S1
van Ooijen, I., & Vrabec, H. U. (2018). Does the GDPR Enhance Consumers’ Control over Personal Data? An Analysis from a Behavioural Perspective. Journal of Consumer Policy, 42(1), 91–107. https://doi.org/10.1007/s10603-018-9399-7