ManXiao/Patently Apple
The all-seeing surveillance at the centre of Bentham's Panopticon has become a metaphor for the Internet age: the fear that cyberspace is turning into a panoptic prison is slowly becoming a valid worry.
With the introduction of the Internet and artificial intelligence into our lives, AI has shown the ability to match or even surpass human work in some specific fields (Maccario & Naldi, 2022). But while AI makes life convenient, it also raises the problem of stealing user privacy and violating users’ digital rights. Voice assistants promise a convenient life, yet one after another they have been revealed to violate user privacy, whether Amazon’s Alexa or Apple’s Siri, which is known for its privacy protection. Is the voice assistant a convenient tool, or a means for technology companies to harvest data and violate privacy? The digital rights of users should be taken seriously.
What are digital rights
Digital rights are all the freedoms and rights that people have in the digital age. One example is the right to control one’s personal data, including how it is collected, used, and shared on digital platforms (Subramanya & Yi, 2006). Others include the right to privacy, the right to access information, digital security, and freedom of speech (Subramanya & Yi, 2006). Additionally, people have the right to participate in digital culture, to access digital devices, and to access and transmit information on the Internet on an equal footing, without unjustifiable interference or obstruction from governments or network service providers (Subramanya & Yi, 2006).
How voice assistants violate users’ digital rights
When we use voice assistant services, we rely on platforms such as Apple or Amazon. Platforms are carriers of services, and many companies are at heart service providers rather than technology platforms in the true sense of the word (Flew, 2022). For example, third-party service providers do not develop Siri, but they provide maintenance and information-processing services for it. This means that when we use a brand’s AI services, our information is processed not only by the brand’s developer but also by third-party service providers.
When users decide to use a voice assistant, their data is gathered and saved, including location data, search history, and voice commands. Furthermore, the voice assistant gathers the user’s voice data for labelling. A former employee of one of Apple’s outsourcers disclosed in a prior Wall Street Journal article that Apple provides the outsourcer with recordings of conversations with Siri that have been uploaded to its servers, in order to assess Siri’s effectiveness and determine whether its responses are accurate (Schönherr et al., 2022). It should be highlighted, though, that these recordings of Siri conversations also contain private and sensitive information about users’ personal lives, such as business negotiations and doctor-patient interactions (Schönherr et al., 2022).
Furthermore, voice assistants can exploit users’ personal information for profit (Flew, 2022). In the Siri case mentioned above, for example, Apple may use user conversations to enhance the service, but there is also a push for targeted advertising. We frequently perceive our phones as “eavesdropping” on our lives when they display advertisements related to subjects we may have discussed, a practice known as data abuse. At the same time, some voice assistants may impose excessive censorship and restrictions on users’ speech, such as blocking discussion of specific topics or viewpoints, thus violating users’ right to free speech. Fortune reported that Google’s smart speakers automatically block some sensitive topics and simply refuse to answer them.
Who is responsible for violating digital rights
As mentioned above, a voice assistant is a system plug-in developed by a developer, and permission to access data can be obtained not only by the developer itself but also by the third-party service provider that maintains the system, along with other parties that may be involved in an infringement.
The first principle of data analytics is that every participant, event, and transaction can be made visible and calculable (Flew, 2022). As a result, some voice assistant developers gather user voice data in order to enhance functionality and modernize their systems, and infringement takes place during this collection and storage process. However, even as developers work to improve their offerings, some businesses also sell data to other services. One voice privacy scandal involved the US tech giant Facebook: the company gathered voice recordings from users and employed independent contractors to transcribe user conversations captured by Messenger into text (Backes et al., 2023). The idea was to provide a foundation for training the company’s AI.
Furthermore, there is a close relationship between digital services and platforms. Rather than having applications and services that sit on top of digital platforms, the leading digital platform companies also hold dominant positions within digital services environments (Flew, 2022). Therefore, platform operators must also bear part of the responsibility: when users use voice services on social media and e-commerce platforms, there is a possibility of infringement. Although platform operators are generally not liable for user-generated content under Section 230 of the Communications Decency Act, in some cases they may be required to take action to prevent infringement (Goldman, 2017).
In addition, technology suppliers can also be a key source of information leakage and infringement. Some voice recognition and data processing technology suppliers ship technology with loopholes or defects that cause user information to be leaked (Backes et al., 2023). And for some brands of voice assistant services, maintenance and processing are handled by outsourcers.
Apple has claimed that it has the strictest privacy policy and does not disclose users’ personal privacy, but that does not mean its outsourcers maintain the same rules. In the earlier case of Apple’s leaked recordings, the official explanation was that the technology was being optimized for system upgrades (Sharon, 2020). In addition, since iOS 10, Apple has used a data collection strategy called “differential privacy”, which uses hash algorithms, segmented sampling, and mathematical noise injection to learn as much as possible about a group as a whole while learning as little as possible about any one of its members (Sharon, 2020). Simply put, Apple does not match data to an individual but mixes it with hundreds of millions of other pieces of information.
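Apple’s exact implementation is proprietary, but the noise-injection step at the heart of differential privacy can be sketched in a few lines. The example below is a generic illustration, not Apple’s algorithm: it answers the question “how many users typed a given emoji?” with Laplace noise added, so the aggregate stays useful while no single user’s contribution can be isolated. The function names and the epsilon value are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(values, target, epsilon=1.0):
    """Count occurrences of `target`, then add Laplace noise calibrated to
    sensitivity 1 (adding or removing one user changes the count by at most 1)."""
    true_count = sum(1 for v in values if v == target)
    return true_count + laplace_noise(1.0 / epsilon)

# 10,000 simulated users; the noisy aggregate stays close to the truth,
# but any individual user's record is hidden in the noise.
emoji_log = ["😀"] * 4000 + ["😂"] * 6000
noisy = private_count(emoji_log, "😀", epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the analyst still learns roughly how popular the emoji is, but cannot tell whether any particular user contributed to the count.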
According to reports in The Guardian, external employees have access to an exceptionally large amount of data, and Apple’s subcontractors are hardly ever audited (Sharon, 2020). Yet speech is extremely difficult to separate from other types of data. With enough samples, voice recognition becomes very powerful: by combining location data with analysis of speech features such as tone, speech rate, pauses, and emphasis, a speaker can be located. It is therefore not that hard to pick out a specific individual in a large crowd.
How should rights be regulated and upheld
Regulation can be not only coercive or restrictive (i.e. providing a red light for dealing with undesirable forms of behaviour or action) but also enabling or facilitative (i.e. providing a green light for the orderly enactment of specific opportunities) (Flew, 2022).
Laws and regulations offer a comprehensive standard for judging such events. On March 25, 2024, the European Union launched investigations into Apple, Google, and Facebook’s parent Meta on suspicion that they were failing to comply with a landmark new European law designed to promote competition in digital services. According to the European Commission, various practices of the three companies had “failed to comply effectively” with the Digital Markets Act (DMA), which had come into force earlier that month. “Significant fines” could be imposed, according to EU Commissioner Thierry Breton, should the investigation reveal a “lack of full compliance.” “To comply with the Digital Markets Act, we have made significant changes to the way we operate our services in Europe,” said Google’s head of competition, Oliver Bethell, in a statement. In a sense, such regulation protects users’ digital rights, controls the business practices of the companies concerned, and governs freedom of speech and access to information on digital platforms.
Additionally, a dedicated regulatory body ought to be set up to handle relevant disputes and enforce relevant laws and regulations. Similar to the smart speakers that tech companies rushed to adopt years ago, Baidu Browser, a Chinese platform service, has launched a voice assistant that can be used in TVs, cars, and furniture simultaneously. Viewed from a different perspective, having a smart speaker in your living room that is always online and capable of recording is akin to giving other people access to a switched-on tape recorder.
To address this concern, Baidu ensures the security of sensitive data: it can use irreversible algorithms to encrypt and store it, so that the original data can be obtained only after specific decryption operations; and, according to the requirements of different types of business applications, corresponding access control methods are designed to protect sensitive data (Fu, 2019). To ensure that sensitive data is accessed and modified only by certain people, appropriate access permissions and data manipulation permissions must be set in the big data system (Fu, 2019). At the same time, a complete protection mechanism for personal sensitive information should be established, including user identification and verification, access control lists, security logging, and security policy management (Fu, 2019). And to ensure the authenticity and reliability of data sources and prevent the spread of false information, users must undergo real-name authentication when data is collected and used.
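As a rough illustration of two of these measures, irreversible storage and access control lists, the sketch below hashes a sensitive value with a salted key-derivation function (so it can be verified but never reversed) and gates operations through a simple role-based ACL. The roles, function names, and iteration count are illustrative assumptions, not Baidu’s actual implementation.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # cost factor; higher makes brute-force guessing slower

def store_sensitive(value, salt=None):
    """Hash a sensitive value irreversibly with a per-record salt.
    Returns (salt, digest); the original value itself is never stored."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", value.encode(), salt, ITERATIONS)
    return salt, digest

def verify_sensitive(value, salt, digest):
    """A stored digest can only be checked against a candidate value,
    never decoded back to the original."""
    candidate = hashlib.pbkdf2_hmac("sha256", value.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

# Minimal access control list: which operations each role may perform.
ACL = {
    "admin":   {"read", "write"},
    "auditor": {"read"},
    "support": set(),  # no direct access to sensitive data
}

def allowed(role, operation):
    """Check a role against the ACL before touching sensitive data."""
    return operation in ACL.get(role, set())
```

In a real system the ACL would live in a central policy engine and every access would also be written to a security log, matching the logging and policy-management steps listed above.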
Therefore, with strict enough regulation, and with industry executives who have sound professional ethics, respect self-regulatory norms, and restrain members’ behaviour, users’ digital rights can be protected. Given the transnational and globalized nature of the digital environment, international cooperation and standardization are also important means of protecting digital rights: countries can strengthen cooperation to jointly develop international standards and guidelines to address digital rights issues.
Conclusion
Technology facilitates life, but it should not become a tool for capital, or for those who intend to exploit our lives for profit. Scientific and technological progress, and the participation of artificial intelligence, should be something to look forward to in a better life, not a symbol of the “panoptic prison”. We must defend our digital rights and make technology and artificial intelligence a convenient part of a better life.
References
Backes, C., Jungfleisch, J., & Pültz, S. (2023). OK Google or not OK Google? Voice assistants and the protection of privacy in families. European Union and Its Neighbours in a Globalized World, 207–222. https://doi.org/10.1007/978-3-031-40801-4_13
Digital Markets Act: Apple, Google and Meta at risk of ‘heavy’ fines as Europe launches new probes. (n.d.). CNN Business. https://www.cnn.com/2024/03/25/tech/digital-markets-act-apple-google-meta/index.html
Flew, T. (2022). Regulating platforms. Polity Press.
Fu, T. (2019). China’s personal information protection in a data-driven economy: A privacy policy study of Alibaba, Baidu and Tencent. Global Media and Communication, 15(2), 195–213. https://doi.org/10.1177/1742766519846644
Goldman, E. (2017, August 29). The ten most important Section 230 rulings. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3025943
Maccario, G., & Naldi, M. (2022). Privacy in smart speakers: A systematic literature review. Security and Privacy, 6(1). https://doi.org/10.1002/spy2.274
Schönherr, L., Golla, M., Eisenhofer, T., Wiele, J., Kolossa, D., & Holz, T. (2022). Exploring accidental triggers of smart speakers. Computer Speech & Language, 73, 101328. https://doi.org/10.1016/j.csl.2021.101328
Sharon, T. (2020). Blind-sided by privacy? Digital contact tracing, the Apple/Google API and Big Tech’s newfound role as global health policy makers. Ethics and Information Technology, 23(S1), 45–57. https://doi.org/10.1007/s10676-020-09547-x
Subramanya, S. R., & Yi, B. K. (2006). Digital rights management. IEEE Potentials, 25(2), 31–34. https://doi.org/10.1109/mp.2006.1649008
Image reference
Backes, C., Jungfleisch, J., & Pültz, S. (2023). OK Google or not OK Google? Voice assistants and the protection of privacy in families. European Union and Its Neighbours in a Globalized World, 207–222. https://doi.org/10.1007/978-3-031-40801-4_13
Digital Markets Act: Apple, Google and Meta at risk of ‘heavy’ fines as Europe launches new probes. (n.d.). CNN Business. https://www.cnn.com/2024/03/25/tech/digital-markets-act-apple-google-meta/index.html