Technology Breakthrough: AI ‘Resurrects’ Loved Ones. Dream or Nightmare?

In China, the Qingming Festival on April 4 is an important annual tradition for all Chinese: on this day, people sweep tombs and pay tribute to their departed loved ones. This year, AI brought new ways to remember those who have passed away. AI builds virtual personas from known information about the deceased’s appearance, personality, and experiences, so that people can see the faces and hear the voices of their departed loved ones in video. Scenes once confined to the sci-fi TV series “Black Mirror” are gradually becoming reality in China. AI technology has certainly given better emotional support to those who have lost loved ones, but will it prove a dream come true or a nightmare?

AI “resurrection” has become a nascent industry

In the past few months, there have been numerous cases of AI “resurrection”. Earlier this year, the famous Taiwanese musician Bao Xiaobai used AI to “resurrect” his daughter, who had died after a long illness. In a video posted on his social media, his “daughter” interacts with people and even sings a birthday song for her mother. Many supporters see AI “resurrection” as a legitimate outlet for emotional attachment: with the help of AI, departed friends and family can make basic expressions and movements and even talk to you, preserving the deceased’s voice and smile, commemorating the dead while comforting the living. But some netizens disagree: “People always have to face reality. AI ‘resurrection’ is not a real resurrection; it will only let people keep indulging in fantasies they cannot escape.”

Subsequently, some social media influencers, seeking traffic, used AI without authorization to “resurrect” Coco Lee, Qiao Renliang, and other deceased Chinese celebrities, attracting the attention of large numbers of fans. On YouTube, you can likewise find netizens using AI to make stars such as Michael Jackson “cover” other artists’ songs, and audiences can watch deceased musicians such as Buddy Holly and Roy Orbison give holographic “live” performances.

A search for “AI resurrection” on Taobao, China’s e-commerce platform, shows that several shops already sell related products. I consulted one merchant at random and learned that to make an AI “resurrection” video, you need to provide a front-facing photo of the deceased, and there are several options: 20 yuan (about 4 Australian dollars) to make the person move; 60 yuan (about 12 Australian dollars) to make the person move and speak in a fixed AI voice, reading a script you supply; and 100 yuan (about 20 Australian dollars) for movement and speech with voice cloning, for which you must provide your own audio of the deceased.
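As a concrete sketch of the price list above, the three tiers can be modeled in a few lines. The tier names are my own labels, and the exchange rate is the rough 1 yuan ≈ 0.2 AUD implied by the article’s own conversions:

```python
# Illustrative model of the three Taobao service tiers described above.
# Tier names and the exchange rate are assumptions for this sketch.
TIERS_YUAN = {
    "animate": 20,               # make the person move
    "animate_tts": 60,           # move + speak in a fixed AI voice
    "animate_voice_clone": 100,  # move + speak with a cloned voice (needs audio)
}

YUAN_TO_AUD = 0.2  # rough rate implied by the article's conversions

def price_aud(tier: str) -> float:
    """Return the approximate price of a tier in Australian dollars."""
    return TIERS_YUAN[tier] * YUAN_TO_AUD

if __name__ == "__main__":
    for tier, yuan in TIERS_YUAN.items():
        print(f"{tier}: {yuan} yuan, ~{price_aud(tier):.0f} AUD")
```

Reproducing the article’s figures, the tiers work out to roughly 4, 12, and 20 Australian dollars respectively.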

Searching “AI resurrection” on Taobao returns several product categories

Judging from the product detail pages, buyers fall roughly into two groups: those expressing grief and missing loved ones, and those using the resulting AI works for hype and monetization. AI “resurrection” has clearly become an online marketing gimmick, which should make us reflect on whether such improper uses of AI violate ethics, morals, laws, and regulations. AI “resurrection” is a wonderful dream, but we must also be aware that a human being cannot be replicated exactly, at least with current technology. Although we can seek emotional support from AI “resurrection”, it may yet turn into a nightmare.

An employee demonstrates the digital “resurrection” process at an AI technology company based in Nanning, south China’s Guangxi Zhuang Autonomous Region, April 1, 2024. (Hua, 2024)

Using technical means to “resurrect” the deceased may indeed bring risks of infringement, fraud, and ethical harm. For example, using the deceased’s photos, audio, and other materials for AI “resurrection” without the consent of the family may infringe on the legitimate rights and interests of the deceased and their family, while improper use of the deceased’s image, and its impact on family members’ emotions, may provoke social controversy and moral doubt.

Do individuals and groups have the right to challenge algorithmic decisions that affect their personal or collective well-being (Flew, 2021)? iMedia Research data shows that in 2022 the core market for this industry reached 12.08 billion yuan (about 2.5 billion Australian dollars), driving a surrounding market of 186.61 billion yuan (about 38 billion Australian dollars). By 2025, the core market for virtual digital humans is expected to reach 48.06 billion yuan, with the peripheral market it drives approaching 640.27 billion yuan (Times, 2024).

In the current digital economy, marketing is prioritized over productivity: the market is more inclined to support start-ups that can identify potential consumers than to reward innovators who invent better products (better mousetraps, say). The analysis, or “profiling”, of personal information has become a major business activity, and the flexibility and innovative power of networked technologies mean that even actions intended to serve consumers can be bypassed or repurposed as tools for profit. We are not going to stop the flow of data; instead, we need to become more knowledgeable about the entities behind it and learn to control their use of it. We need to hold business and government to the same standard of openness that they impose upon us, complement their scrutiny with new forms of accountability, and enforce the laws that define fair and unfair uses of information (Pasquale, 2015).
However, many issues remain to be resolved: the qualifications of organizations engaged in AI “resurrection”, the rights and responsibilities of those who initiate it, the boundaries of the technical use of digital images, and the supervision mechanism for “digital human” products.

Governance concerns

AI technology itself is neither right nor wrong; the key lies in applying it correctly and operating services within the scope of the law. Every technology has two sides: AI has great potential to enhance productivity, but its development also raises issues of morality and ethics, legal compliance, and data security. AI governance can be broadly understood as an institutional extension in two directions: vertically, beyond traditional governmental institutions, and horizontally, to non-public actors such as private enterprise and industry self-regulation. Such governance requires moving beyond approaches that rely solely on government to include non-governmental organizations and the use of technology itself as a governance tool. Governance should also be multi-level, from local to global, going beyond any single mechanism and adopting diverse strategies and tools to address the challenges posed by AI (Just & Latzer, 2017).

China already has relevant regulations. The “Administrative Provisions on Deep Synthesis in Internet-Based Information Services”, which came into effect in January 2023, govern services that generate synthetic digital humans through artificial intelligence and that use other people’s biometric information to synthesize digital images (Zhang, 2023). Under these provisions, generating another person’s face or voice, or synthesizing their image, requires that person’s separate consent. This is consistent with China’s Personal Information Protection Law, since such biometric information constitutes sensitive personal information. AI “resurrection” also faces scrutiny under the Civil Code: Article 1019 stipulates that no organization or individual may use information technology to forge or infringe the portrait rights of others without the rights holder’s consent, and Article 994 stipulates that if the portrait of a deceased person is infringed, their spouse, children, and parents have the right to demand that the perpetrator bear civil liability in accordance with the law (Wininger, 2020). Legislation usually lags behind new technologies, and it is difficult to grasp how changes manifest themselves, because an algorithm of this nature is like a black box: opaque, complex, and hard for non-experts to understand (Pasquale, 2015). Governing AI “resurrections” and managing the associated risks means there is constant pressure to address the consequences of algorithmic decisions. How a particular platform operates, and what it prioritizes or downgrades, will be shaped by the internal and organizational cultures of the companies, communities, and IT specialists who develop such algorithms.
Questions of algorithmic governance therefore come back to questions about the governance of digital platform companies (Flew, 2021).

It should be noted that there is an essential difference between “AI resurrecting relatives” and “AI resurrecting celebrities”: the former is initiated by the deceased’s relatives, while the latter is an infringement. Yet the former can also provoke moral and ethical crises. Another difference between automated algorithmic reality constructions and realities traditionally constructed by mass media lies in the actor constellation: the potential role of technology itself as an actor (Andrejevic, 2019). Not all family members can accept or understand this new technology. In a US survey, 59% of respondents were against the idea of their own digital resurrection; opt-in rules therefore seem socially desirable, with the default being that digital resurrection is prohibited and allowed only with the consent of the deceased (Iwasaki, 2023). The collision of different emotions can easily create ethical risks within families, affecting both respect for the deceased and the emotional relationships among the living.

The platform governance triangle places companies, states, and non-governmental organizations at the three vertices of a triangle.

Overall, the bottom-line issues of AI “resurrection” must be resolved through the platform governance triangle (Gorwa, 2019a, 2019b). Scientific and technological development should proceed in a way that upholds the rule of law, protects individual rights and interests, and maintains an ethical bottom line. The rollout of AI “resurrection” technology requires more regulation and standardization so that technological innovation and the protection of human values and dignity advance together.

Can the dream of AI “resurrection” become a reality?

Zack Kass, former head of global commercialization at OpenAI, said in a discussion of AI “resurrection”: “The reality is machines don’t have souls” (Huang, 2024). The technological core of AI “resurrection” is nothing more than the integration and mimicry of information provided by people, reconstructing the deceased’s every move through algorithmic inference. AI systems seek to extract the mutable, private, divergent experiences of our corporeal selves, but the result is a cartoon sketch that cannot capture the nuances of emotional experience in the world (Crawford, 2021, p. 179). AI can’t do anything a rational, average person couldn’t do given enough time. People can imitate other humans by dressing up as them, copying their mannerisms, and aping their voices, but we can’t read each other’s minds; we can only guess what someone else is thinking. AI is no different: no amount of data in the world can predict what a person will do next, or would have done were they still alive (Greene, 2021). AI “resurrection” is a wonderful dream, but we must also be aware that a human being cannot be replicated exactly, at least with current technology. We should seek emotional support from it without letting it turn into a nightmare, which means the technology must develop on the premise of following social and ethical norms and complying with laws and regulations. That will require every country to seek a balance between innovation and regulation and to develop more laws and policies specific to this application in the future.

References

  1. Andrejevic, M. (2019). Automated Media (1st ed.). Routledge.
  2. Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. (pp. 179). New Haven: Yale University Press.
  3. Flew, T. (2021). Issues of Concern. In T. Flew, Regulating platforms (pp. 91–96). Polity.
  4. Gorwa, R. (2019a). The platform governance triangle: Conceptualising the informal regulation of online content. Internet Policy Review, 8(2).
  5. Gorwa, R. (2019b). What is platform governance? Information, Communication & Society, 22(6), 854-71.
  6. Greene, T. (2021). What real AI developers and Black Mirror both get wrong about digital resurrection. Newstex.
  7. Huang, Y. (2024, March 23). Zach Kass, former head of global commercialization at OpenAI, talks about ai resurrecting the dead: machines have no souls. Chinanews. Retrieved 2024, from https://www.chinanews.com.cn/cj/2024/03-23/10185517.shtml.
  8. Iwasaki, M. (2023, December 27). Digital cloning of the dead: Exploring the optimal default rule. De Gruyter. https://www.degruyter.com/document/doi/10.1515/ajle-2023-0125/html
  9. Just, N., & Latzer, M. (2017). Governance by algorithms: reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238-258. https://doi.org/10.1177/0163443716643157
  10. Pasquale, F. (2015). Digital reputation in an era of runaway data. In The Black Box Society: The Secret Algorithms That Control Money and Information (pp. 19–58). Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch.4
  11. Times, G. (2024, March 24). Bringing back deceased beloved ones through AI technology becomes a new, controversial business in China as ‘era of digital humans’ approaches. Global Times. https://www.globaltimes.cn/page/202403/1309721.shtml
  12. Wininger, A. A. (2020, May 29). China’s new Civil Law adds right of publicity. China IP Law Update. https://www.chinaiplawupdate.com/2020/05/chinas-new-civil-law-adds-right-of-publicity/
  13. Zhang, L. (2023, April 26). China: Provisions on Deep Synthesis Technology enter into effect. The Library of Congress. https://www.loc.gov/item/global-legal-monitor/2023-04-25/china-provisions-on-deep-synthesis-technology-enter-into-effect/

Images

Hua, X. (2024, April 6). Digital “resurrection” services stir debate on love, death and AI. Xinhua. https://english.news.cn/20240406/f1792109b193454ea822085dd9c83680/c.html

Notice: An OpenAI tool was used only to generate images; it was not used to write the text of this blog post.
