
Deadbots: The digital soul that can speak for you after your death

NEW DELHI | Edited By: Nikhil Pandey | Updated: May 13, 2022, 05:11 PM IST
Developing a deadbot that can replicate a person's personality requires a large amount of personal data, such as social network data, which has been shown to reveal extremely sensitive traits. Photograph: (Twitter)

Story highlights

Virtual assistants are now a threat to home privacy; news recommenders change our understanding of the world; risk-prediction systems advise social workers on which children to safeguard from abuse; and data-driven hiring tools assess your chances of finding a job. Many people, however, are unsure about the ethics of machine learning.

Eight years after his lover died, a bereaved Canadian man utilised cutting-edge artificial intelligence software to have life-like online "chats" with her. Joshua Barbeau, 33, told the San Francisco Chronicle that he paid $5 to participate in a beta test of GPT-3, artificial intelligence software created by a research group co-founded by Elon Musk.

Barbeau said he used her past text messages and Facebook posts to make the chatbot resemble his late lover's writing voice. Barbeau lost his 23-year-old sweetheart, Jessica Pereira, in 2012.

Robots that converse with dead people

Barbeau was able to exchange text messages with an artificial "Jessica" using a deadbot, a form of chatbot. Despite the case's ethically contentious nature, I rarely came across materials that went beyond the bare facts and examined it through an explicit normative lens: why would it be right or wrong, ethically desirable or repugnant, to develop a deadbot?

Let's put things in perspective before we tackle these questions: Jason Rohrer, a game developer, created Project December to let people customise chatbots with the personality they wanted to interact with, provided they paid for it.


The project was built on the GPT-3 API. GPT-3 is a text-generating language model developed by OpenAI, an artificial intelligence research firm.
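Project December's actual implementation has not been published, but a minimal sketch of the general technique, conditioning a GPT-3 completion on sample messages so that replies imitate a person's writing style, might look like the following. The persona name, the sample lines, the prompt wording and the parameter choices here are illustrative assumptions, not Project December's code; the sketch assumes the pre-1.0 "openai" Python package and a valid API key.

    # A minimal sketch, NOT Project December's actual code: conditioning
    # a GPT-3 completion on sample messages so replies imitate a style.
    # Assumes the pre-1.0 "openai" Python package and a valid API key.
    import openai

    openai.api_key = "sk-..."  # placeholder; your OpenAI API key

    # Hypothetical sample messages supplied by the user to seed the persona.
    PERSONA_SAMPLES = (
        "Jessica: miss you already!! drive safe ok\n"
        "Jessica: lol that movie was SO bad, loved it\n"
    )

    def persona_reply(user_message: str) -> str:
        # Frame the exchange as a dialogue transcript; GPT-3 then
        # continues the transcript in the persona's voice.
        prompt = (
            "The following is a chat with Jessica, who writes casually, "
            "warmly, and with lots of exclamation marks.\n\n"
            + PERSONA_SAMPLES
            + f"Friend: {user_message}\n"
            + "Jessica:"
        )
        response = openai.Completion.create(
            engine="davinci",   # a base GPT-3 model from the beta period
            prompt=prompt,
            max_tokens=60,
            temperature=0.9,    # higher temperature = more varied phrasing
            stop=["\n"],        # stop at the end of Jessica's line
        )
        return response.choices[0].text.strip()

    print(persona_reply("Hey, it's been a while. How are you?"))

The stop sequence keeps the model from writing both sides of the conversation. Note that the mimicry improves with the amount of genuine messages supplied, which is precisely why deadbots depend on large volumes of personal data.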

Barbeau's case caused a schism between Rohrer and OpenAI, because the company's guidelines explicitly prohibit GPT-3 from being used for sexual, romantic, self-harm, or bullying purposes.

Rohrer shut down the GPT-3 version of Project December, calling OpenAI's approach hyper-moralistic and arguing that people like Barbeau were "consenting adults."

While we may all have opinions on whether developing a machine-learning deadbot is a good idea or not, articulating the implications is far from simple. This is why it's crucial to tackle the ethical issues posed by the case one by one.


Nikhil Pandey

Nikhil Pandey is TEAM LEAD - DIGITAL CONTENT with WION. He follows politics, sports and entertainment. He tweets at @Nikhil_Pandey04.