AI Offers Digital Immortality for Deceased Loved Ones—But Should It?
AI chatbots called “griefbots” or “deadbots” offer a new digital way to grieve our loved ones but raise ethical and privacy concerns.
Rachel Feltman: For Scientific American’s Science Quickly, I’m Rachel Feltman.
The idea of digital life after death is something science fiction has been exploring for ages. Back in 2013 a chilling episode of the hit show Black Mirror called “Be Right Back” followed a grieving woman who came to rely on an imperfect AI copy of her dead partner. More recently the idea of digital copies of the deceased even made it into a comedy with Amazon Prime’s show Upload.
Feltman: Thank you so much for coming on to chat today.
Katarzyna Nowaczyk-Basińska: Thanks so much for having me. I’m super excited about this.
Feltman: So how did you first get interested in studying, as you call them, “griefbots” or “deadbots”?
Nowaczyk-Basińska: I’m always laughing that this topic has found me. It wasn’t me who was searching for this particular topic; it was, rather, the other way around. When I was still a student we were asked to prepare an assignment. I was studying media studies with elements of art and performance, and the topic was very broad, simply “body.” So I did my research, and I was looking for some inspiration, and that was the very first time I came across a website called Eterni.me, and I was absolutely hooked by this idea that someone was offering me digital immortalization.
It was almost a decade ago, and I thought, “It’s so creepy; it’s fascinating at the same time. It’s strange, and I really want to know more.” So I prepared that assignment, then I chose digital immortality as the subject of my master’s, and the master’s evolved into a Ph.D., and after 10 years [laughs] I’m still in this field, working professionally on this topic.
Nowaczyk-Basińska: So actually, 10 years ago commercial companies sold the promise …
Feltman: Mm.
Nowaczyk-Basińska: And [tries] to find links between different pieces of information and extrapolates the most probable answer you would give in a certain context. So obviously, when your postmortem avatar is speaking, it’s just the prediction of: “How would that person react in this particular moment and in this particular context?” It’s based on a very sophisticated calculation, and that’s the whole magic behind this.
Feltman: So what does this landscape look like right now? What kinds of products are people engaging with and how?
Feltman: And are we seeing any cultural differences in how different people are reacting to and engaging with these products?
Nowaczyk-Basińska: So that’s the main question that I am trying to pursue right now because I’m leading a project that is called “Imaginaries of Immortality in the Age of AI: An Intercultural Analysis.” In this project we try to understand how people from different cultural backgrounds perceive the idea of digital immortality. Poland, India and China are our three selected countries for this research because it’s not enough to hear only one perspective, the dominant narrative from the West.
Feltman: Mm.
Feltman: Yeah, obviously this sounds like a really complex issue, but what would you say are some of the biggest and most pressing ethical concerns around this that we need to figure out?
Nowaczyk-Basińska: So the list is pretty long, but I would say the most pressing issue is the question of consent. When you create a postmortem avatar of yourself, so you are the data donor, the situation seems to be pretty straightforward because we can assume that you explicitly consent to the use of your personal data. But what about the situation when we have a third party engaged? What if I would like to create a postmortem avatar of my mother? Do I have the right to take my private correspondence with her, share it with a commercial company and let them make use of and reuse this material?
And another variation on the question of consent is something that we call the “principle of mutual consent.” We use this in the article that I co-authored with my colleague from CFI, Dr. Tomasz Hollanek. We introduced this idea because I think that we quite often lose sight of the fact that when we create a postmortem avatar, it’s not only about us …
Feltman: Hmm.
The other thing is the exploitation of data for profit. Digital immortality is part of commercial markets. We have the term “digital afterlife industry,” which I think speaks volumes about where we are. Ten years ago it was a niche, and that niche has evolved into a fully fledged industry: the digital afterlife industry.
Feltman: Hmm.
Nowaczyk-Basińska: and a griefbot of, I don’t know, their parents. It may be devastating and really hard to cope with.
Nowaczyk-Basińska: I think they could serve as a form of interactive archives. It’s very risky to use them in a grieving process, but when we put them in a different context, as a source of knowledge, I think that’s a potential …
Feltman: Mm.
Feltman: Sure, and maybe even in personal use, less like, “Oh, this is my grandmother who I can now have personal conversations with while grieving,” and more like, “Oh, you can go ask your great-grandmother about her childhood in more of a, like, family history kind of way.” Does that make sense?
Nowaczyk-Basińska: Yes, absolutely, absolutely. So to shift the emphasis and not necessarily focus on the grieving process, which is a very risky thing, but rather try to build archives …
Feltman: Mm.
Nowaczyk-Basińska: And new sources of knowledge, accessible knowledge.
Feltman: Yeah, very cool. What do you think is important for consumers to keep in mind if they’re considering engaging with griefbots or deadbots?
Nowaczyk-Basińska: So first of all, that it’s not a universal remedy. It works for some people, but it doesn’t necessarily have to work the same way for me because I’m a different person and I go through the grieving process entirely differently. So definitely, that’s a very personal thing, and grief is also a very personal and intimate experience, so we should keep in mind that it’s not for everyone.
Feltman: Thank you so much for coming on to talk through this with us. I’m really looking forward to seeing your future research on it.
Nowaczyk-Basińska: Thank you so much for the invitation. It was a pleasure.
Feltman: That’s all for today’s episode. We’ll be back on Friday to talk about why the world needs to start paying more attention to fungi.
For Scientific American, this is Rachel Feltman. See you next time!
Rachel Feltman is former executive editor of Popular Science and forever host of the podcast The Weirdest Thing I Learned This Week. She previously founded the blog Speaking of Science for the Washington Post.
Naeem Amarsy is a documentary filmmaker and multimedia editor based in New York City.
Fonda Mwangi is a multimedia editor at Scientific American. She previously worked as an audio producer at Axios, The Recount and WTOP News. She holds a master’s degree in journalism and public affairs from American University in Washington, D.C.
Alex Sugiura is a Peabody and Pulitzer Prize–winning composer, editor and podcast producer based in Brooklyn, N.Y. He has worked on projects for Bloomberg, Axios, Crooked Media and Spotify, among others.