0:37 | Intro. [Recording date: March 27, 2024.] Russ Roberts: Today is March 27th, 2024, and my guest is psychologist Paul Bloom of the University of Toronto. His Substack is called Small Potatoes, and I love it. This is Paul's sixth appearance on EconTalk. He was last here in December of 2023, talking about whether artificial intelligence can be moral. Paul, welcome back to EconTalk. Paul Bloom: Glad to be back. Thanks for having me. Russ Roberts: I want to let listeners know that this episode may touch on some adult or dark themes. You may want to listen in advance before sharing with children. |
1:09 | Russ Roberts: And our topic for today is a recent essay on your Substack. The title was "Be Right Back," which described a scenario for the future--a scenario I would call a certain kind of immortality--that you Paul called a 'blessing and an abomination,' which I thought was the perfect framing of what is perhaps, I think almost certainly coming for us in the afterlife that we're about to experience. What is that, Paul? Paul Bloom: I like the terms 'abomination' and 'blessing.' There's a nice sort of Biblical resonance to them. And, what I'm imagining is a world in which artificial intelligence [AI] is capable of mimicking real people. And, there's all sorts of usages you could imagine this having. I think a lot of people would enjoy having celebrities in their house they could interact with on their phones or on their whatever--their Alexa system. And, I think people would enjoy connecting to friends of theirs, family members who are out of town, who, in the middle of the night you wake up, you want to talk to your wife or your husband, they're asleep. So, you just kind of start talking to the simulation. I'm most interested in and most troubled by--and I think we're going to talk about this a bit--in cases where these AI simulations are people who have died. And, my inspiration for this--my inspiration for the title of the Substack, "Be Right Back"--is a Black Mirror episode. And, you and I talked about this: we don't want to have spoilers for the episode. People should just, if they have Netflix, they could watch it. But, the setup for this is: a woman--a young woman's husband just dies suddenly. And, in this sort of alternative future, she has the ability to have a simulation of him created. And, it ends up with--and I'm not telling you more than what's in the trailer--but it ends up with the simulation being a robot version of him which is indistinguishable from the original. But, the part I was most interested in, because it's most plausible, is the simulation is just online. So, they upload all the videos of him, everything he's written, everything he's commented on, all his DMs [direct messages], and texts, and so on. It establishes personality that way. And then she could talk to him over the phone, and she finds this--at first, she greets this possibility when it's raised to her with horror and disgust, the hallmarks of an abomination. But later she finds it addictive, tremendously moving, powerful; and it takes away some of her grief. And, since this does not seem tremendously futuristic, it seems we're close in certain ways, I want to talk about what that would be like and the implications. Russ Roberts: It reminds me--I've not seen the Black Mirror [Russ accidentally calls it Dark Mirror] episode, but I think it's nice because Black Mirror is, in theory I suppose, a science fiction series. We're very close to living in a science fiction world, rather than imagining it or reading it. There's a movie I love--or at least I loved a long time ago when I saw it--called Starman. [BEGIN SPOILER ALERT] And again--there's spoilers coming--it's about a young widow who is mourning her husband, and I think the movie opens with her drinking a glass of wine late at night, watching home movies of her husband--who is gone. He has died young, tragically. And, she's drunk because she can't deal with this loss. 
And, an alien creature slips in through a window--it comes in like a star beam, a beam of light--goes through a family album and finds a lock of his hair, I think is what this creature finds. And, within a very short period of time, the widow finds herself in the presence of a perfect, cloned DNA [Deoxyribonucleic acid] replica of her husband. But of course, without his memories--it's a twist on the Black Mirror version--she is, of course, deeply attracted. This is not a robot, though. It's a flesh-and-blood creature. It looks just like her husband; and we watch as she and this clone get to know each other, although the clone's motive is not the same as hers. The clone is on a mission from outer space. [END SPOILER ALERT] So, you can watch that if you want. But the power of that movie, and I think the power of Black Mirror and what we're going to talk about today is: as human beings, our finitude, our mortality is unbearable. And, religion provides, or used to, I think much more effectively, some solace for that with the idea of an afterlife, or reuniting. How many movies are there that exploit this human urge? One of my favorites is Heaven Can Wait, which is a magnificent--I love that movie. But, there's--I think it's called Truly Madly Deeply [Russ accidentally says Clearly Sadly Deeply]. That movie was so powerful. Alan Rickman is the star of it. I watched it once and I can't watch it again. It's a masterpiece. It's too sad. |
6:38 | Russ Roberts: So, as human beings, we cannot cope with our mortality. We long for immortality. And, as human beings, we do it many different ways. Our children, our books, the memories of the people who are alive when we go. But, this is a different level; and it's different. Paul Bloom: So, there are two questions I could ask you about this--both personal. But, the first is too sad and I won't even ask it, which is: would you want this of the people you love now, if you were to lose them? And I mean, honestly, when you lose them because we all lose people. What would you think of such a substitute? But, I'll ask you a different question, which is: If I go online, I could see hours and hours and hours of conversation with you, and video with you--more than just about anybody I know. So, imagine--and it's not crazy--we upload all of that and we could have a Russ Roberts simulation. And it would be a little bit--it wouldn't be the way you would normally talk to your family, and so on. It's a little bit more talking to people about intellectual matters. But there's a lot of stuff there, and it would be a pretty good simulation. Would you participate in creating one for when you die, for the people you leave behind? Russ Roberts: Can I answer the first one, also? Paul Bloom: Yeah, definitely. Russ Roberts: One of the things I love about having Paul Bloom as a guest is that he sometimes asks questions, which means I get to be the guest, which I appreciate--lonely here as the host for 940 episodes. Paul Bloom: I take away some of the responsibility. Russ Roberts: What? Paul Bloom: I take away some of the responsibility. Russ Roberts: Exactly. Paul Bloom: I should get a share of revenues from this episode. Russ Roberts: Yeah. We'll talk about that. Paul Bloom: We'll talk about that. Russ Roberts: Yeah. The first question I think is a very profound question, obviously. And, what's beautiful about your essay is it is a speculation about what might be coming quite soon, actually. But, it forces you to think about life, not just simulations of death. I lost my father about four years ago. I was very close to him. And I had dreaded that day for a long, long time. I was so close to him when I was younger, I was afraid--much more of his death, obviously, than my own--but just worried that I would struggle to cope with a world without my dad. And, when he got older, and sick, and lost some of his mental capability, I was surprised after his death how little I missed him. I still miss him, but I thought it would be much stronger. And, part of the reason I didn't miss him as much as I expected, is that when he died, he wasn't the same person he was when he was younger. I didn't miss the man who passed away at 89 years old. I missed the man who was the 50-year-old--or even better, the 40-year-old father of me as a young boy or as a young man--who I turned to, who I wanted his approval, and so on. And so, if you said to me, 'Would you want a recreation of your father?' I would say, 'Yeah, but not the one at the end.' Don't use all the data. And, you give this example about tweaking the avatar--the simulation. I want the 40- to 58-year-old Dad of mine, who was funny and wise and treated me a certain way--differently than he did at the end, in the last, say, five to 10 years of his life. So, that's the first thought I have about that. The second question is: You know, people have already done what you're talking about. They've uploaded transcripts from podcasts or writing of people. 
And, at the current level it's not very good-- Paul Bloom: No, it's not good at all-- Russ Roberts: I can have a conversation with Adam Smith based on his books that are online. And, it's disappointing. It's not very interesting. But it will get better. In fact, it will get better and better and better. And, the idea of whether--you posed the incredibly creepy, but I think inevitable dilemma: How much of my life would I spend preparing for others to enjoy this? How much of my own day-to-day life would I record so that my loved ones, when I'm gone, or strangers--forget my loved ones--strangers could enjoy my character, my persona? And, that's just a--I'm not sure it's an abomination, but it's a creepy thought. What are your thoughts on those two issues? Paul Bloom: It wouldn't be--I don't see it as much work, actually. I can imagine us just carrying around an unobtrusive recorder as we go about our lives and talk to our children and our partners and our friends. Sort of like a podcast, living one's life as a podcast, but just collecting a lot of data. And, I mean, there are two questions. The second question: would I participate in leaving something behind? If I felt the people who are close to me would want that, I see some negatives. I see it. I would be--I'm most sort of troubled and curious about children, about young children. I have older children who are out in the world and maybe they would enjoy being able to go onto a computer, have a little conversation with the version of me once I've passed. But, imagine a child whose mother or father has died at age five or 10. I tell the story in a Substack of my own--I don't tell this-- the first time I've ever talked about this--but my mother died when I was 10, of--and I was extremely close to her. And I think I would have wanted to[?] simulate--to hear her voice again. And I'm old enough so that for her, there weren't--I couldn't get a QuickTime movie. There was not a whole lot of a video that all of us leave behind now. So, I've never heard her voice again once I had my last conversation with her. But, would it have been good for me? Or would it have sort of blocked a grieving process in some way? Suppose my wife dies and I have a simulation of her [?] and I just enjoy talking to her, I'd have the same conversation--she knows me, or appears to know me--as well as my wife does. Would I ever seek anybody else? Or would I just spend my time talking to her? And, I don't know the answer. I think those are sort of two separate questions: What would we want? And then, the second question is what's good for us? |
13:35 | Russ Roberts: The other thought I had--and I'm going to phrase this about someone else; it's funny how it's hard to talk about it, about you or me--imagine someone who loses their spouse, has the avatar available, or created, the chatbot. And, again, it could be on your phone: You talk about it like you'd call them--but of course you'd call them up. They will be hovering in the room with you in 3-D hologram form with all the gestures, just like the Starman DNA thing. They'll have the physical features: again, you'll pick the year that one wants one's spouse to come back from-- Paul Bloom: And, you know what we'll have, which we can do already. It'll have the voice. Russ Roberts: Yeah. Paul Bloom: The voice, not just the voice as if in a mechanical sort of replication-- Russ Roberts: Not Alexa-- Paul Bloom: The same--yeah, the same cadences, the same use of vocabulary. You know, I hear you for 10 seconds; I know it's you. And, I could understand intellectually that an AI, even right now--I think there have been famous cases of this--can fake it. So, I'm not talking to Russ Roberts, I'm talking to an AI deepfake. But, I think that will be so compelling. [POSSIBLE SPOILER] And, in the Black Mirror episode, when she stops sort of messaging him over the computer and he says, 'I can give you--if you want, you could talk to me.' She hears his voice and then breaks down and is so moved; and watching it, we're moved, too. [END SPOILER] Sorry I cut you off. Russ Roberts: No, that's all right. I was going to say two things. One, my children--one of my children in particular--is a superb mimic, and we don't need the avatar if they want to hear me. And, he knows all my catchphrases: he knows them better than I do, because he's noticed them. I won't reveal it, but my children have a WhatsApp group named after one of my common phrases. And, until I became aware of that, which was I think an accident: I'm not in the group, obviously-- Paul Bloom: No, of course not-- Russ Roberts: it's for them to talk about me. I didn't realize, 'Oh yeah, I say that a lot,' and they've figured that out and made that the name of the group. But, what I was going to say about the spouse--and this for me is where it gets particularly interesting and dark. Let me introduce my thought on this with a story. I went to a memorial service and a woman had lost her husband of 70-plus years of marriage, and for a reason not worth going into, I was friendly with both of them, but I was not a close friend. And, the woman revealed something very personal to me, partly because I think I wasn't a close, close friend. She said--this memorial service took place after, some time after the death of the husband--she said, 'I talk to him all the time.' And I said, 'Of course you do.' And she said, 'My friends think I should stop doing that. I should get over it.' And I said, 'I don't think so. I think it would be weird if you didn't talk to him. You talked to him for 70 years. Why would you stop?' But, what I'm thinking is that that's a particular case, people in their 90s. But, if God forbid someone lost a spouse fairly young or certainly at midlife, and you had that avatar, and you're saying, would that delay you from going out and dealing with your grief? But, there's a second possibility, which is if you then remarried, would you not be tempted, and maybe encouraged by your new spouse to continue that relationship? 
Just like people who break up with their spouses, or divorce, or separate from their partners, say--and I'm always--I guess it varies tremendously by the relationship of the person--but they say, 'Oh, we're still friends.' And, often the person who is the new spouse--the new partner--resents that. Sometimes they encourage it, sometimes they resent it. But, imagine if on your watch, one could chat with one's former spouse about their problems, including their problems with their new spouse or new partner, and say, 'I'm having trouble,' because often our relationships with our friends are as an ear, a shoulder to cry on. Anyway, that's coming. I don't think there's any doubt that that is coming. Paul Bloom: I don't know. |
18:41 | Russ Roberts: But, my view is--the way I phrased it; you phrased it a different way--I phrased it, and we'll come back to this. I'm not ready to talk about this yet, Paul, but I think the right question is: Can we possibly resist this future? Should we, which is the question I think you were asking. And, if we should--if we decide that this is not healthy for ourselves, is it possible? I'm not sure it's possible. When I think about the seductiveness of screens and social media--let's move away from spouses. If I have a chance to hang out with Adam Smith--let me give you three scenarios and let you react to it. So, at the end of my book on Adam Smith, I imagine having a drink with him. And, I love that idea, right? So, imagine I could conjure up his avatar, and have that drink and we could talk about tariffs, we could talk about why he at the end of his life was a customs official. I can find out more about David Hume, right? And, this is a world where Adam Smith is more than just the collection of his writings. This is the Adam Smith whose every conversation he's had with David Hume has been recorded and saved. And so, I can find out about not just hanging out with David Hume--of course, I guess I could have both of them over; and I could watch their interaction, which would be charming, and I get to be their new friend, and so on. So, that's one level. The second level is: I say, 'Paul, I literally enjoy talking to you and it's a shame we only talk every three or four months. Could we have a Zoom relationship where we have a drink together now and then?' And, I have a friend I do this with: we have coffee now and then. We don't do it much since I moved to Jerusalem because of the time difference, but I used to talk to him now and then. So that's Level Two. So, Level One is a fake, but maybe really real Adam Smith: hard to describe it as fake if it has absorbed all of his interactions with his friends, his mom, and his writing. Second level is: I'm hanging out with you, but it's over Zoom and I can't smell how smoky your scotch is. The third level is: I have a friend here I like to have a l'chaim with now and then--maybe he's not as interesting as you, Paul. So, instead of seeing him in the flesh, I hang out with Paul Bloom and I hang out with Adam Smith. And eventually, maybe Paul--I hate to say it--I might also want to hang out with someone else, not just Adam Smith, but you'd fall--you'd slide down the totem pole. I'd hang out with Michael Jordan because he'd have an avatar for sale that would let you interact with him. Forget living people. I'd have a whole host of Adam Smith-like extraordinary conversationalists. Dorothy Parker would be in my living room; Samuel Johnson. It would be: Why would I ever spend time with you, Paul, over Zoom and certainly not with--I won't pick a name--but my friend here in Jerusalem who is not nearly--possibly but maybe--but could not, maybe can't compete with my online friends. In which case I am living a total, digital life. Is that appealing to you? Paul Bloom: Wow[?]. There's a lot there. Sometimes people are given this question and maybe for dating services and so on, if you could have three people over for dinner, who would they be? And, no, I don't know. Maybe for you it would be Adam Smith, Dorothy Parker, and your late father, and you have a great conversation and then maybe another night you choose another three--sports or current economists. And, well, now we can. In the near future, in our lifetimes, maybe we can. 
And, part of what's seductive about this, and part of why it's worrying, is that any individual in reality is at times sleepy, impatient, rude, self-centered, uninteresting. Conversations don't always go the way you wanted. Maybe I really want your advice on something, I want to tell you something, but maybe you're bored, or maybe you want to tell me something and you one-up me on my story. I didn't want that. I wanted sympathy[?]. But of course, the AI will be just right. And, it could be--there's an analogy here with pornography. There's an analogy here with super-sweet foods. That, our minds have evolved to have certain tastes--evolved through evolution, through culture--to have certain tastes. And, we have tastes in people. We're looking for kindness and love and patience and humor. And, what if these simulations can do that better than real people can? Where would the draw be of real people? Putting aside the physicality, which maybe AI can't do, but most of my relationships with people are not physical in any way. So, I'm perfectly happy just talking to them. And, maybe AI does better talking. And, it's easy to see this as a dystopia. You lose contact with your friend. Why would you--you gave the reasons for why--when you think Adam Smith is wiser, when you could conjure up somebody who is much better? And so, in some way that's terrible. But, it could also be the end of loneliness. I mean, I'll make the other argument, which is: you know, you and I are, I think, very, very fortunate that we have people who love us, and are, to varying extents, enmeshed in communities. You're a university President, you're more enmeshed than I--you're more enmeshed than I'd want to be. But, there probably are not--there are not days and days that you go without human contact, without anybody interested in you talking to you. But, there are people--people not far from either one of us right now--who haven't spoken to another person for a long time and are desperate for human interest. And, what if AIs could scratch that itch? People mock those who seek out, you know, AI boyfriends and AI girlfriends, but loneliness is awful. There are few psychological torments worse than loneliness--and it's possible AI could fix that. And that's the case for it. |
25:15 | Russ Roberts: You write the following. We've touched on this, but I'm going to take a variant on it. "How much do we want"--this is a quote: How much do we want the simulations to correspond to the people they simulate? A couple is married for thirty years, the husband dies from a long illness, and his widow misses him desperately. They had been taping their interactions for many years--they knew this time would come--so the simulation she later signs up for is excellent; it's just like talking with him. Their conversations are an enormous relief. But nobody is perfect. Her husband had his flaws; while he loved her very much, he could be sharply critical, and in his later years, he was forgetful, telling the same stories over and over again. Can she contact the firm that provides the simulation and ask for a few tweaks? Endquote. And, I am reminded of a song I've quoted on here before. The song is "It Had to Be You." It goes like this--I'm not going to sing it because I have a little trouble with the melody. It's a little bit challenging in parts, so I'm just going to read it. It had to be you, it had to be you. I wandered around and finally found the somebody who Could make me be true, could make me feel blue, And even be glad just to be sad thinking of you. And, here's the key part. That was pretty good, though: Some others I've seen might never be mean, Might never be cross or try to be boss, But they wouldn't do, For nobody else gave me a thrill. With all your faults, I love you still. It had to be you, wonderful you, It had to be you. So, who would add faults to their avatar? Who would not tweak that simulation to take out the obnoxious criticisms of a spouse, the moments of cruelty that probably the person who said them might even regret? They might even be happy that the spouse takes them out when they're gone; but then they're not human. Really. Paul Bloom: There's an analogy with food, which is: we have engineered food that hits all of our buttons--sugary sodas and impossibly fatty meat--and it just lights us up. But sometimes you eat this food and afterwards you don't feel right and you want real food. You want real food that isn't gussied up and energized. When you're a kid, candy is wonderful and it's hard to eat vegetables for many kids. They don't want to eat vegetables. There's none of the bang of food. But, vegetables can be terrific. And, you are making a case for a similar point with people, which is: it might be that a perfectly designed avatar of somebody I love, all the flaws removed, would be inhuman, and, you know, wouldn't come off right. It might be that to be seen as human, to be appreciated as human, you have to re-insert some flaws. So, a little bit of repeating the same story twice, a little bit of a bite at a comment. And then, part of you says, 'Yeah, this is a person.' I wanted to go back to something you said about the widow because I found that a great story. And, in my piece, I quote a friend of mine, the developmental psychologist, Paul Harris. And, Paul has this wonderful essay on death, and how we respond to death. And he points out that a very common picture among developmental psychologists, starting with Bowlby, a great attachment theorist, is that what happens when somebody close to you dies is first you don't believe it, and then you respond with anger and despair and all of these emotions. But, it turns out it's more complicated than that. 
It turns out that studies with adults--with widows, actually--there's a big study of widows, finds exactly like the story you told me. Widows very often report continuing conversations with their dead husbands, hearing their dead husbands' voices, keeping things that he owned and he used, around them as reminders, having photographs to remind them. And then, there was a similar study with children--children who lost their parents, children who lost their siblings, same thing. They would hear from them, they would talk to them. But, if you ask the children, 'Do you understand that your father is really dead?' 'Yeah, of course.' Only a tiny minority expressed any doubt. They fully come to grips with the fact these people are gone, and yet in their minds, they resume a relationship with them. And, in some way, then, this AI would just facilitate that. It's like a prop to continue this, maybe. And, you can imagine it having some therapeutic uses that way. |
30:17 | Russ Roberts: Well, I'm pretty sure my mom still talks to my dad, and, like that story I told, and guess what? She doesn't need the avatar. When you've been married--they were married 60 or so years, almost 70 years, 69 years. She kind of had the data. She's still of sound mind, and she doesn't need the simulation to remember how--not just my dad's catchphrases and favorite things to say, but she can, I'm sure as I can, have a very good conversation with him in my head. And, similarly, my wife, thank God, is alive--we talk all the time. Without her, me and her. Because I think of things to say and I think of what she'd say back and so on. That is part of an enduring friendship or marriage. And, I think what's troubling about this--and I think we should talk a little bit about the abomination part, because we kind of haven't, we've said it's weird or creepy, but abomination is a very strong word. The abomination part is about the tweaking or the altering, and then the relying on--I think it's not just, 'It'd be kind of cool to ask a question of Adam Smith,' or to watch a video, by the way, of someone who has advice for me. I don't have to conjure up some crazy AI science fiction thing. I turn to dead people all the time. I read their books. It's fine. Nobody thinks it's weird. The weird part, I think, is twofold. It's the tweaking to produce what you want as opposed to the reality that was. And then, the second part is living in that world full time. And I think that will be the challenge. Just like junk food is seductive, I think the appeal of digital friends, both romantic and sexual as well as--that'll be much more interesting I think, than me trying to have a drink with Adam Smith. But, I think that world, that retreat from the human flesh-and-blood world is what is creepy, abomination-ish. Paul Bloom: Yeah. I agree. I think that there's a couple of things. One is: I don't find anything creepy about an Adam Smith simulation. That's all--it could be intellectually stimulating, all good fun. But, imagine somebody whose child dies--say, a teenage child--and then there's a simulation. He talks to the child and shares stories and talks about, 'Oh, remember when we--.' There's something about that which might be repellent, above and beyond any sort of implication it has for the grieving process, and for how you spend your life. It might be repellent because this machine is purporting to be somebody who it's not. And, even if you have--and I discussed this a bit in the essay--even if it's sort of very careful to say--you say 'Remember when we--' and then, the simulation comes back, says, 'I should remind you that although I have the voice of your son, I'm not really him.' But, even if it does that, there's something--I don't know, unholy about a machine trying to replicate faithfully somebody you love, so that you could pretend that they are that person. And then, I'll also add just something sort of practical, which is: everyone has observed and is panicking about the extraordinarily addictive powers of the Internet and social media and artificial worlds we live in. Jonathan Haidt most recently has a book coming out on a topic, and this just adds to it: 'Oh, great. Now we have the people we love accessible online.' And, all of that makes it less likely you'll see your friend or Zoom with your real friend, and more likely you'll just press a button and get it through the computer, and get something better. 
Russ Roberts: Yeah, I hadn't thought about this, but of course it's obvious that it's going to change. We'll be in competition with these creatures, and it will change how we interact. And, only the most extraordinary, perhaps--maybe only the most extraordinary people--will have real friends. And, the less attractive, less charismatic people will be driven to a fully online existence. That's painful. But, I want to come back to your word--you want to comment on that? Paul Bloom: Yeah. I think here's another way of putting your point, which is--it's an issue which sometimes comes up. I have an essay with some friends of mine at Toronto on empathic therapy done through AI. And, one problem with it is that just like a child who eats a lot of candy and drinks a lot of sugary soda, and then won't go near real food, you could imagine a case where people become used to these perfectly compliant, frictionless, incredibly interested in you, incredibly witty AI simulations. And so, real people just are not--they don't match up. Why would I want a real girlfriend when my AI girlfriend is so much--is so interested in me, loves me so much, and has no needs of her own apparently, and just cares all about me? And, what would that do to people? Russ Roberts: Yeah. I'm 69, Paul. I forget how old you are. So, our 23-year-old listeners, of which there are a few--maybe more than a few--may find this puzzling, this conversation. They're more used to technological comfort than we are. I think there would be a big age gap in what is considered abominable and what is considered a blessing. |
36:57 | Russ Roberts: But, I want to come back to that word 'unholy,' because I think that gets at something you touch on only obliquely in the essay, which is religion. So, religion believes--most religions, I think, certainly the Judaeo-Christian ones--believe that you have a soul. There's something divine about your essence, and when you die, something happens to that soul. You don't just decompose as a physical object. You're different from a dog, and you're different from a table. And, you have--humanity is maybe crooked timber, but it has a spark of the divine. Certainly in Judaism, and I won't speak for other religions. But, most religions deal with some kind of afterlife, some kind of hope for reuniting, and so on. And that worldview has diminished in the West: it has become less appealing. And, we had a conversation a long time ago, which got into my book--ended up in my book, Wild Problems--about whether it's better to be a philosopher or a pig. Whether it's better to live the life examined, the examined life of the philosopher, or to be cavorting, and enjoying a physical life. And, as religion diminishes, I think it's harder and harder to reclaim anything other than utilitarian physical pleasure. And so, you and I, we're older. We come from a different era. We still have in us some unease about some of these scenarios. I think younger people, particularly secular young people, would find some of our unease both baffling and perhaps silly. Life, if you're not religious, life's to be enjoyed, and why wouldn't you spend it with the best possible experiences with those avatars? You know, it goes back to this wonderful idea of Robert Nozick's, The Experience Machine, where you hook yourself up to a machine, you program it, and while you're on the machine, you will think you are the greatest golfer of all time, the President of the United States, the doctor who cures cancer, the rock star who plays before 100,000 people, and so on. You choose whatever you want. And, while you're hooked up to the machine, it'll feel real. And then, you die when you finish your life on the machine; and you accomplish nothing. And, for those of us--myself, for example--who feel that life has some kind of purpose and that we have things to achieve and growth to experience, these digital alternatives are abominable and creepy. But, I think for most people, I don't think it bothers them at all. And, when Nozick wrote that Experience Machine example, which was back in the 1970s, I think most people would have been horrified by it. And--I'm sure there's data on this; people have asked about this--and I know that in the modern world, meaning now, many more people are willing to live that, do that. They say they would. Even though they would do nothing with their life other than lie on a table but feel like they were doing something. And, I think your examples get at that. To interact with a machine--a digital avatar--day-to-day, instead of real human beings might be very pleasant. It's like the pig. But it's not the philosopher's life. Paul Bloom: I never thought of it that way. I've never connected the idea that being immersed with sort of artificial friends is like being in Nozick's Experience Machine, where you have interactions that ultimately lead to nothing. They're sort of all in your head. They're in your head: now your head is supplemented by a machine, but it's still all in your head. 
You're not really making a connection, you're not really establishing a relationship, you're not really changing people's lives. And, I think that's a really clever way to put it, and maybe helps us figure out what we might find so disturbing about it. It's an escape from reality. Sure, maybe you're happier, maybe it's more [?], maybe it alleviates your loneliness better. But in reality, the person you love is dead, and you should be seeking out after a certain amount of time, when the grieving has ended, more people. In reality, yeah, your AI simulation is fantastic company, but your friend is a flesh-and-blood person; and connecting to an actual person means so much more. So, yeah, I hadn't thought of it that way, but that is one way of establishing the unease I have with the substance. Even if it had no other consequences, even if people were just happier and more satisfied. I'll push back on one thing and maybe this would take us too far afield, so you could just kind of let me ramble and then ignore it. But, I'm not sure that kids today are becoming more utilitarian. I think it's complicated. You're certainly right: they're less religious. There's been a decline in the sort of religious belief, in an afterlife, and so on that's easily documented. But, I think it's interesting: I look at this sort of, so-called social justice, so-called woke movement of our time where you have people in their 20s--your younger listeners--who think of the world radically different from the way you and I do. But, it's not like they are hardcore utilitarians. Rather, they're actually caught up in all sorts of high-level values like diversity and social justice. With regard to sexuality, they tend to be more sharply moralistic than older people. I think probably promiscuity is more acceptable to a 40-year-old than a 20-year-old. And so, I just think it's kind of complicated, and my sense is more that our values and extent to which we are sort of utilitarian as opposed to seeking out higher values really ebbs and flows through history and doesn't simply show a simpler pattern. Russ Roberts: I agree with that. It's a great correction. |
43:22 | Russ Roberts: Let me just say a different version of it. After my father died, I went through the traditional Jewish practice of mourning, which is quite complicated, and really quite extraordinary, and I'm very grateful for it. But, it's a lot of work. It involves saying a certain prayer three times a day in the presence of at least 10 men. It's 10 people if you're following a different version of Judaism than I follow. But it's 10 people at least, which means having to interrupt your day at least twice, because two of the three are combined. It involves abstaining from live music if you're Orthodox--as I am--for 11 months. And, for seven days you sit on the ground and don't do anything except accept visitors. That's so-called Shiva, which is a variant on the Hebrew word Shiv'ah, which is seven. And, many people like this practice when they're in the presence of it, because they see it has a slow, steady, cathartic effect. And, the alternative--which I don't mean to pick on, this is not a religious comment, this is just a statement about grieving in general--when Mel Carnahan and his son died tragically in a plane crash while he was campaigning, I think, for the Senate, his wife, immediately--like, within a week; I think within a day--began campaigning in his place. His advisors and others encouraged her to--'There's higher values than grieving and mourning. You should just--Don't think about it, or do, but either way, get back to life.' This is the equivalent of you don't talk to your dead husband. It's 'You need to get back to life.' A different kind of life. And, an alternative version of that is: If you're sad after you've lost your father, you should take some drugs. Get drunk, or take an antidepressant, or get high. Because, why would you want to suffer? And, a religious person is into suffering. Let's be honest. Suffering is at the core of the Judaeo-Christian tradition--not necessarily something you seek out, but it is part of the experience of being a human being, is to suffer and to be sad, and to deal with that, and to learn how to grow from that. And, if I can ramble on for one more minute, one of my favorite stories--and if anyone knows the source, I'd love to hear it, but I love this story. It sounds like a Hasidic story. I cannot find it. But, the story is that the master has a student, and the student comes to the master for wisdom. And, the master says, 'Here is a stone and here is a tower. Your job is to get the stone through the doorway of the tower and climb the stairs and take the stone to the top of the tower.' And, the student picks up the stone and he takes it to the door of the tower; and it doesn't fit. He can't get it through the door. He turns the stone, he rotates the stone. Nothing. He goes to the master, says, 'I can't. It's impossible. You've given me an impossible task.' And the master takes a hammer and he splits the stone and says, 'Now you can get it through the door.' And he says, 'The stone is your heart, and it must be broken before it can climb and be elevated.' This simple--it's a parable. This simple parable, I think, says something about life. But, I understand the alternative, which is: Why would you suffer if you could avoid it? And so, the idea that we should endure pain or misery--if we have a digital alternative, is--I understand that view. So, while I agree with you that young people are not merely pleasure-seekers: even in the absence of religion, of course, they look for purpose. We're human beings. Human beings want to belong. 
We want to have a reason to live. People who pursue pleasure in a utilitarian way tend not to be very happy, although they may have a lot of pleasure. At least that's my religious bias. I don't know if it's backed up by the data. You'll tell me, Paul. But, I understand that; and I think, if religion does die out, seeking pleasure will become, maybe, more attractive. And, at the same time, I agree with you: there could be a pendulum swing back toward a search for meaning if, or when, these pleasures of digital simplicity turn out not to be so appealing after all. Paul Bloom: Yeah. I know better than to try to throw some happiness data at you. We've talked before; I know your skepticism--and I share some of it. So, I'm not going to try. I agree with your defense of grief. I think that there's two defenses of grief, and also the structures that religions like Judaism have set up in order to allow us to experience grief. And, I think there's a genius, at least--I'm also most familiar with the Jewish way of doing things, and there's a genius to it. It's titrated. It's titrated to--the rituals you experience would be for your father, but not for your second cousin. They are for a week, but then for a month they are lessened, but they don't go away. There's all sorts of restrictions. And then, for a year. But then after a year it's over. And, the assumption is--am I right that after a year? Russ Roberts: Well, except it's never over in Judaism. After a year--and I'm glad you made that point--it slowly ebbs, which mimics, of course, your own--usually--your own mourning and grief. But, the most intense is that year. But then, after that year is over--it's actually 11 months--annually, you observe the death date of the parent. It's called the yahrtzeit; and, you light a candle, you go back into synagogue to find a group of 10 people to say the prayer of mourning. That mourning prayer, by the way, is called Kaddish, which people have heard of for a variety of reasons. But, I can't help but mention that that Kaddish prayer is an exaltation of the Divine. It is not anything remotely like what you think would be your duty to say in the face of death. That's a longer story. We'll put that aside. But yeah: It's titrated for the kind of person who passed away. And, it certainly, over time, changes. Paul Bloom: Well, I think that there's two defenses of grief. One is a sort of cold-blooded, almost secular version that it's better for you. It's psychologically better for you if you grieve, as opposed to try to rush away and live your regular life. But, I think there's more--there's a different kind of defense, which I could imagine you'd be sympathetic to, which is: It's a matter of proper respect for those who have died. It's respect for your love for them, respect for them. If after my wife died, the next day I went dancing, you would think--even if it turned out to be psychologically better for you, for me; as a psychologist, 'Oh, this is great. This will help them recover. They will be so much happier,'--at least I feel, 'Yeah, but that's wrong. You shouldn't do that. Somebody you love--what kind of character do you have? What kind of relationship did you have with this person that you could dance afterwards?' And, I share the idea that suffering and struggle is an important part of life. I wrote a book called The Sweet Spot, which was all about that theme. But, I also--and I think you do, too--say, 'Yeah, but pleasure also matters.' 
I wouldn't take somebody--suppose there is somebody, and, say, he is seventy years old; he has no family and no friends, and he's desperately lonely. And maybe for reasons having to do with his character or his social position, he's not in a position to get friends. Maybe he's a difficult person. But he's lonely. And, now we have on offer good company for him--good, caring, loving company, a simulation. What is it for you and me to say, 'Oh no, that's an abomination. Stay clear. If you can't find real people, find nobody.' If he's dying of thirst and all there is is a sugary soda, he should indulge. |
52:27 | Russ Roberts: Well, part of me likes that, but part of me says, 'That's my uncle.' And, if I can pass him off to a charming, beautiful, young avatar that entertains him in various ways, I won't bother. Phew: I don't have to bother visiting him in the old age home or in his apartment, which doesn't smell the way I like things to smell. And, the decoration--the decor of it--is tawdry and out-of-date; and it just depresses me to go visit him. So, that's part of the reason he doesn't have any friends, is that we haven't risen to the occasion of seeing him. Paul Bloom: No, I hadn't thought of that. But that is true. That is true: in that, once there are substitutes for friendship, for companionship, it lessens the moral obligation we would have to reach out to people in need. And, I would say sometimes the cost-benefit analysis there goes, 'Yeah, it's still better to provide it because maybe you wouldn't have seen your uncle that much. So, now you see him once a year instead of once a month. But, he is still very lonely when nobody is around. So, this will make a difference for him.' Russ Roberts: Fair enough. Paul Bloom: And, I think we agree on this: Both sets of values are on the table. There's a certain way in which these AI substitutes are wrong, and they're wrong in the same way that plugging yourself into a Nozick Experience Machine is wrong, or injecting a constant stream of heroin into your system through a tube and living your life like that--in the way that that's wrong. But, there are also ways in which this could sort of help alleviate a lot of human pain. I guess maybe if I had to sort of sum up a policy recommendation, I would say people who are in a position to live good lives--they're young, they're able to, they're out in the world, there's nothing horribly wrong with them--they shouldn't succumb to these substitutes. They should try to go out into the real world. But people on the periphery maybe would benefit from these substitutes. Russ Roberts: I like that. I'm reminded of my--both Mom and Dad would do this--my dad when he was alive, and my mom, still, will call me for a question. And, my first thought: I get frustrated at that. I used to. Because I'd say to myself, 'But, can't you Google it? I mean, come on. Why are you asking me this? It's ridiculous.' And then, I realized at some point that they're not calling to get the answer. They're calling to talk to me, and they don't want the avatar. They want me. Paul Bloom: They wouldn't be happy if you brought over to their house one day a box that said 'Russ Roberts, Son' in the big print they make for the elderly, with a single big button. You've got to push the button, and then it says, 'Oh, hi, what could I do? I am here to help.' And, they wouldn't be happy with that. Russ Roberts: Oh, that's a good one. Yeah. |
55:49 | Russ Roberts: So, we've been talking about loved ones, maybe a celebrity. In the old days, your friends and potential romantic partners were the people you ran into at work or at play. Now you have the Internet. You can, in theory--'It's much better: you're not limited to the people who live near you. You can find people anywhere you want! As long as they're on the app.' And, it's soon going to be maybe a choice among not just every living person who has access to the Internet, but any person who has ever lived, as your best bud. Could we resist that? Let's get back to the question I raised earlier. You're suggesting that talented young people should resist it, but could they? And then, I want us to go back to should. You said 'wrong,' which surprised me. I would have thought you'd have said unhealthy. It strikes you as wrong? Paul Bloom: Yeah, I could be moralistic. I'm comfortable with that. I think for some of us, we should resist it, just like we shouldn't always drink sugary sodas, just like we shouldn't hide away. And, the equivalent to Nozick's Experience Machine is probably ordering a pizza on Uber Eats and watching Netflix; and the time goes by quickly and there's a certain pleasure to it. And, I think that's fine for your occasional evening. Do it all the time, where that defines your life, and not only is it kind of bad for you in the long run, but it's wrong. It's squandering away your limited time on earth where you could be connecting to people, and making the world better, even in the small ways of causing another person pleasure and amusement and human connection. Could we resist it? Your mileage may vary. I know people who are off social media. I think people--all the distractions. Some people--my older son, in a world full of online material, my older son reads books. He once took a summer and read Russian literature. He doesn't even like to read on an iPad. He has a Washington, D.C. apartment which is filled with books. And so, some people can. And, some people--and I got to admit, I'm one of them--are somewhat addicted--more than I'd want to be--to the Internet. So, right now, sometimes I wake up at 2:00 in the morning and I can't get back to sleep. And, there's my phone and I do a bit of scrolling. And, before you know it, it's 4:00 in the morning--because the algorithms, particularly on Facebook, are so powerful. It knows exactly what I want. I watch endless clips of Key and Peele and Little Sopranos, and it knows what I like. And there I go--two hours gone. And I feel ashamed of myself and angry at myself. And so, I'm vulnerable to this. But then, there are people who are less vulnerable. And, I think there's a really hard problem, the sort of problem that I think you're very interested in. To what extent should we say, 'Well, let people do what they want. They'll make their choices'? And, to what extent should the state come in and say, 'This has to be regulated because people are not capable of choosing properly and it's destroying people's lives'? And, one way to make the division, for instance--and this brings us back to Jon Haidt--is to say: Children should operate under different rules than adults. We can be paternalistic towards children, for obvious reasons. And so, you could say: 'Under 17, no simulations for you.' Okay, fine: 'Fine, there's a little nanny simulation for when we're out of town. But, besides that--.' And, 'You could have your Adam Smith, but you're not allowed to have your pop stars, and no Britney Spears for you. No Taylor Swift for you.' 
But then, when you're an adult, you do what you want. Russ Roberts: Yeah, I don't--it won't surprise you: I'm not comfortable with it, even though I think this is unhealthy potentially for some and healthy for others. I don't want the state involved. But, for children, I think you're right. The irony is that we've given our children, at younger and younger ages over the last 10 years, access to all of what the Internet has to offer, because of social pressure: to do otherwise is to be cruel to your children--they'd be cut off from so many of the good parts of the online experience. So, we let them get the whole package. |
1:00:28 | Russ Roberts: I want to add two things from recent episodes that--get your reaction to. I talked with Brian Klaas about--I think it's Nietzsche's idea. I don't know if Nietzsche talks about it, if other people--if it's his idea--but this idea of amor fati--of loving one's fate. And, loving one's fate means you embrace everything that's happened to you, the good and the bad. And, of course, that's a different way of saying that the Experience Machine is not the right way to go. But, it does--what we're talking about is the fate you love could be very different because you can hang out with people who only like you, who love you, praise you, whatever you want. And by the way: we've assumed everybody would want a perfect spouse, partner. There are many people who would want an imperfect one, I assume, and would want those tweaks to go the other direction: 'Make it harder for me. I want to suffer.' Paul Bloom: Many people who would say: 'I want my wife to be more critical. I want my husband to be less attentive.' Russ Roberts: Yeah. Well, as my--I quoted, my friend's father before: 'Until I got married, I was an idiot.' Many people actually see benefits to having a spouse they live up to rather than exploit. But, what do you think of that idea? That there's something in loving one's fate, and that if one could--fate, by the way, almost by definition means the things that happen to you that you can't control. And, we're talking about it where you control a lot more of the things that happen to you, at least inside. Paul Bloom: It's funny you mentioned that episode with Brian Klaas. Because I love that episode. And I actually wrote a Substack piece called "Clumsy Gods" where I talked about this aligned[?] from Will MacAskill. And, it fits with the theme that all sorts of things we do have these unpredictable consequences, and there's no way we can regulate them. I walk down the street: I distract somebody. There's this tumbling of dominoes in all of our actions, and setting us up for fate that we can't predict. And yeah, I think to some extent there's that. Of course, to some extent--this is always sort of a 'On the one hand/on the other hand'--we do want to sort of savor our autonomy and our agency. We want to--it's I think, an important part of being a person, to say, 'I'm going to do this. I'm going to affect things and I'm going to try to establish a certain future for myself and for people I love.' But, I think you're right, that that has to be leavened by an understanding that our fate is beyond our control, and maybe we should learn to love the outcome. Russ Roberts: And, there's a tension between control--our huge human desire, I think, to avoid the uncertainty that we're talking about--and then, the joy of serendipity. The unexpected pleasure is--I treasure those so much. All the things that have happened to me that I didn't plan, didn't control, didn't anticipate, and enrich my life even in a three-minute conversation. Paul Bloom: Yeah. And, in some way, it's a problem with our technological pleasures where they're too consistent, they're too reliable. They give us just what we want. And, you know, I remember--I used to love bookstores; and I still do, but not as much. And, I just wander around the shelves, and see what catches my eye. But, now when I go on, when I look for books, it's typically through Amazon or something like that. And, I kind of know what I want. Or the algorithm tells me what I want. And then I get it. 
And I probably, on balance, read a lot more books that I like now than when I was 20. But, there are fewer surprises. And, there's a benefit to surprise. Russ Roberts: Yeah. I mentioned this, I think, at some point: it fascinates me how many authors I have come to love--this would be true digital or in print, in physical print--who I didn't know of. So, you still have plenty of good serendipity, and there's greatness there. And, it's hidden below the surface: if we only stick to the books that are easily found, that we know about and that everyone loves and recommends, we'll miss some real gems that for us can be transformative. Paul Bloom: And to bring it back: these simulations of people will likely give us exactly what we want. And, that's partially what's wrong with them. Russ Roberts: Yeah, it's true. Two things I just want to mention; I'll just mention one. Just by chance today--amor fati--I read a story called "The Birthmark" by Nathaniel Hawthorne, and it's about exactly what we're talking about. It's about a man who loves his wife except for this really terribly annoying birthmark that, for him, makes her beauty imperfect. And, well, you can imagine what that story's about. It's four pages. We'll put a link up to it. Way out of copyright. We can link to it with abandon. |
1:05:55 | Russ Roberts: I want to close by referencing the conversation I had with Charles Duhigg about conversation. So, you and I have never met in person. I joked earlier about interacting with you online. If we lived in the same city, I like to think we would get together now and then. I don't know how often, but now and then. It's a cliché, probably--or it should be--that most of our human interactions are rather dull. And, Charles Duhigg's book is an attempt to improve our communication with each other. And, mindfulness is one way that we can do that. Religion is another way. If you see another person as being made in the image of God, you're going to treat them in a certain way. But, I think the up-side of our entire conversation is that I made a remark that this would make--this phenomenon we're talking about, of simulations--would make people compete harder to be more attractive to other human beings. But it would also create a different kind of conversation, a different kind of interaction with real human beings, maybe much better. So, a huge part of our interactions with each other are mediocre--in my lifetime. Maybe others' are different. You, listening at home, could tell me otherwise. But, I think I'm always struck by how much of our banter and chit-chat with other human beings is merely: 'You're another human being. I'm another human being. It's nice to see you. Bye.' And, real conversation--deep, connecting, stimulating, provocative, poetic, moving--it's rare. It's rare here on EconTalk: I've had 940 conversations of an hour plus, and most of them are pretty good, I like to think; and only a handful are extraordinary, which maybe is the way humanity has to be. And maybe none of them are extraordinary to others, but they're extraordinary to me, which is the only point I'm making. And, I like the idea--I like to think that if we had these online options, which by the way we've assumed are going to work perfectly. They may get better, but they may ultimately have certain flaws like a glitch where they repeat certain things over and over again. We were talking about human beings who repeat stories. They may have technological challenges, they may-- Paul Bloom: Or they may interject every five minutes with, 'And now a commercial for better health,' or something like that. Russ Roberts: But, I like the idea that maybe this would make our human interactions so much more intense, and so much more precious, and maybe we would think longer and harder about how to make that so. So, that's my optimistic take on your piece. Paul Bloom: I like that. So, having talked with these infinitely charming, intelligent simulations, we up our game regarding people. We say, 'This is how it could be done.' I like that idea. That's a very optimistic take. So, instead of always eating sugary, fatty foods, we say, 'Wow, food could be tasty. Let's try to do that with real food.' And, I like that idea. I will, however, just put in a plug for mediocre conversations. So, I have three regular commitments in my life for Zoom conversations with friends, two of them weekly, both for an hour--it's sort of scheduled and we're good friends who are far away. One monthly. And, I enjoy them, but I'm curious how you find this. So, I find Zoom conversations wearying in all sorts of ways. One is you just spend an hour on those conversations, an hour staring into somebody's face talking. And that's great: I think they are far deeper conversations than I would otherwise have. 
But recently I was with a friend and we went to a coffee shop together and we just gossiped. We didn't look in each other's faces. Sometimes we would just sit and stare out and look at other things, and it was so much more human, and part of what [?]--so much was mediocre: we didn't say anything. We just kind of made small talk and joked about things. But the lack of purposeful intensity made it more human, more social. Russ Roberts: Yeah. I don't mean to suggest that you have to talk about Nietzsche every time you have an interaction--or maybe Kierkegaard with a stranger, or Isaiah, the Book of Isaiah. I don't think that's--that's a fair point. Many of our interactions with our loved ones are formulaic, and they're the better for it because there's a comfort there. Paul Bloom: Yeah. I remember, again, when my older son was young--in the evenings he would knock on my study door. The rest of my family would go to sleep early. We were both night owls, so he'd knock on my study door. I would come downstairs with him to the kitchen. He'd make himself a bowl of cereal. He'd just done his homework. I'd pour myself a whiskey, and he just talked to me, and I wouldn't say very much. He talked to me about his ideas and anything, and we did this for, like, a year. We did this pretty regularly, and I really cherish that time--though sometimes that's not much of a conversation. But-- Russ Roberts: It's heaven-- Paul Bloom: It's heaven. So, yeah, I leave a space for that. But, I also agree with you and Duhigg, which is: too many of our conversations--I've had conversations where--it's an interesting question why it's so hard to have a serious, deep conversation with somebody. It's so much safer to talk about superficial topics and not, say, ask a question like, 'What do you think of your job? Are you happy? Are you happy about the choices you made?' Interesting questions, and I always find I'm so much to blame because I find it very awkward to do these things; and maybe it's a sort of moral cowardice--it's just easier to slide by. Russ Roberts: And, I think it's just hard to pay attention. I mean, I find I fail in conversations with the people around me because I'm thinking about what I'm going to have for lunch or what I'm going to do next at the meeting at 2:00 or this little minor success I had that I'm savoring. And I forget I'm actually talking to you--Paul, forgive me--and now it's over. You're off to your meeting and I missed it. I feel like a lot of the time we miss those chances because we don't have the chance, the ability, the skill to focus the way we could, if we were better at it. Paul Bloom: I really do like your idea that AI interlocutors could help cure us of this--could make us better. That interacting with inhuman things could make us more human. Russ Roberts: We'll see. I think--I said 'we'll see.' I think we will see. My guest today has been Paul Bloom. His Substack is Small Potatoes. Go subscribe. It's excellent all the time. Paul, thanks for being part of EconTalk. Paul Bloom: Hey, thanks for having me. This was great. |
READER COMMENTS
Jacob
Apr 22 2024 at 12:36pm
Hi Russ, thanks for this episode. I’m 18 – a freshman studying Econ at UofT – I found Bloom’s perspective so interesting that I went straight to the academic calendar only to be disappointed that all he teaches is a 4th-year Psychology seminar.
I wanted to push back a little bit on how you perceive younger people’s response to these technologies because I think it ignores that for a Canadian my age, more than 15% of our life was spent in quarantine from the pandemic. My best friend has an autoimmune disorder and I didn’t see him in person a single time from March 2020 to October 2022. For most of us, those 3 years – spent mostly on the internet – were isolating and the antithesis of everything we expected of our youth. As such, any technology which has the potential to replace real social interaction – AI Chatbots, VR, New Social Media Platforms – is viewed with deep suspicion, not as a novel innovation but as the precursor to an “eternal pandemic” in which we are isolated from each other and interact solely through technology that does not have our interests at heart.
I, alongside most of my circle, have no social media, and a common sentiment among people my age is that if UofT didn’t require a phone for coursework, we wouldn’t have one. Just as the pandemic lockdowns were perceived as robbing our youth in service of those who have already lived long lives, these technologies are perceived as robbing it for Big Tech, who profit not off our flourishing but from having us sit alone and scroll our lives away.
While this is anecdotal, I think the failure of Facebook’s attempt to pivot to a “Metaverse” online world demonstrates that younger people, despite their irreligiosity, reject the idea that they should live hedonist lives alone in their homes, and are unlikely to embrace the AI immortality that you discussed in this episode.
Chag Sameach to all who are celebrating tonight, and thanks for another great episode.
Sylvain Ribes
Apr 25 2024 at 8:56am
Hmm. Interesting comment. I came to say something along those lines. I don’t think the suspicion is limited to your generation (I’m 38).
I do feel like Russ and Paul are being way too “pessimistic” with regards to what people would want.
It’s only a minority of people who are so alienated by tech that they become reclusive, and I would need to see more evidence that tech is the root cause anyway.
We fantasize about those poor souls a lot, but I think the overwhelming majority of people would thoroughly reject the idea of simulating their loved ones.
Matt
Apr 22 2024 at 2:15pm
Thanks so much to Dr. Bloom for bringing up loneliness. It is so easy for those who are well-connected to mock or pooh-pooh things that really help others. (Not just tech, but antidepressants, etc.)
Adam
Apr 23 2024 at 9:55am
Probably worth discussing the massive physical infrastructure background that makes this possible, and what that might imply for the actual purposes of these person-constructs.
Also, this moves only towards masturbation, isolation, and narcissism, especially for people who don’t have pre-existing strong interpersonal bonds.
Far from ending loneliness, this is a deeply lonely vision. Very sad and very lonely.
Stephen
Apr 24 2024 at 3:49pm
I am only about 2/3 of the way through the episode so if this comes up in the final 1/3 my apologies… the holographic AI conversation made me wonder about pets that folks have lost along the way – I wonder if some people would want the opportunity to see them again in this AI form? There’s a mention above of the younger generation living a big portion of life under quarantine or in online school and socialization situations (I am in my 60’s and was not too far behind Russ at NC State). Those pets and rescues became an even bigger deal for companionship during those years. Just a thought – if I had to guess I would say my grown children would have mixed emotions about seeing family pets again from their childhood or even pets they have lost in their adult lives… looking forward to the conversation about that.
Trent
Apr 24 2024 at 4:29pm
Regarding the idea that with AI we’ll be able to “talk” to anyone in history, I don’t see how that’s possible. Even with the best AI engine ever.
When you think of all the people who have been lost to history… where there’s no record of a person at all… I don’t see how that person can be recreated by AI. And some civilizations, like ancient China, didn’t record their people’s history, but used oracle shells to record inventories of goods at specific times… how can AI recreate those people?
Frank C Graves
Apr 24 2024 at 11:29pm
Not sure if this is redundant, because I already tried to post this once but do not see it, yet I cannot imagine it was rejected: I have 3 related reactions to the question of whether we would want an AI avatar of a loved one, all pushing me towards “no” or “not so much”. First is the simple tension over what or who we would think we are communicating with. Would we perceive the avatar as having some sort of consciousness, or would we think it is just a very smart user of historical data from our joint lives that was being parroted back in some stylistically believable way? I think the uncertainty would become problematic. Perhaps regardless of that, there are two ways in which an avatar would seem inherently unsatisfying. We may think we would like an adoring virtual soulmate, but that appetite seems to ignore that one of the key strengths of a relationship comes from giving, not just (or even as much) from receiving. In my experience, it is the fact that your mate or friend sometimes needs you and allows you to help take care of them through complex, painful or trying situations that cements the relationship. If the avatar were not also vulnerable and able to allow you to give, it would seem hollow to me. (Yet if that affect could be programmed in, would you really want to pander to it, i.e., to try to cheer up your avatar?) Another dimension of a sustainable relationship is growth with each other through sharing of new experiences (good or bad). It is hard to imagine going with your avatar to visit a new city, try a new restaurant, or watch a new movie. Maybe AI will give them such informational context to discuss, but they will not have literally shared the experience with you. If so, they are likely to eventually feel stuck in the past and perhaps stale to you as you move on.
That said, I could enjoy an evening with a virtual Adam Smith. Short-lived intellectual encounters seem very attractive. However, as a thought experiment, imagine how this could become ambiguous as to what to make of it: Consider having dinner with some virtual Founding Fathers of the US and asking them what they think about gun control or other modern issues that have outgrown the conditions of the early republic. Would we think their answers were anything other than a canned recitation indirectly created by the way they had been trained by the modern AI programmers (not necessarily deliberately)? Would we think we understood more about the textualist or originalist legal viewpoints we should apply today? Or is it just a game?
[Hi, Frank. Regarding your re-attempts to post your comment: I’m not sure what happened, but I, too, had a connection problem, with some material being wiped out in the last few hours on Econlib/EconTalk. We apologize for any inconvenience. Thanks for your correct assumption that your comment was not rejected. — Econlib Ed.]
Gregg Tavares
May 7 2024 at 3:21am
Talking to famous people from the past is already something people do via role play. There are college courses where each student is assigned a figure to research and then debates something from their pov. Of course the student gets a special lesson, but the other students get the experience of hearing this figure’s pov, even if it’s an imperfect role play.
To me, it would be pretty awesome to be able to ask Euclid to explain math to me, since he understood it so much better than my math teacher. I’d love to be able to ask various philosophers whether they understand the implications of their pov. Maybe Lenin, who saw the world through the state of the world he was living in at the time; I’d wonder, given a look at today, whether he would come to the same conclusions. I get that the AI would be making it up, but so would a professor channeling what they think Lenin would have thought. The experience still seems like it would be thought-provoking.
Eric
Apr 25 2024 at 10:54am
Throughout the episode both Russ and guest Paul Bloom repeatedly sense that there would be something wrong about spending your life interacting with a simulation (whether as an AI simulation of a loved one or a full simulation of life in Nozick’s Experience Machine). Yet it seems elusive to identify exactly why that is so.
I would suggest the core reason why all such scenarios would be wrong is that they all indulge the natural and common self-centered illusion that “My life is meant for me.” Philosopher Hannah Arendt has said:
“Every generation, civilization is invaded by barbarians – we call them ‘children’.”
We enter this world self-centered, aware only of what we feel, concerned only for our own pleasure and satisfaction, and largely oblivious to others as others. Yet Jesus and others point toward a very different path for what God intends life to be, a path that is fundamentally oriented toward loving others, not merely loving ourselves.
Jesus said to him, “Love the Lord your God with all your heart, with all your soul, and with all your mind. This is the greatest and most important command. The second is like it: Love your neighbor as yourself. All the Law and the Prophets depend on these two commands.”
The Apostle Paul taught that God gives different gifts to different people so that they would need each other, just as a body needs all the different members of the body and no member is complete on its own. The result of that way is that no individual is meant to be a self-sufficient island caring only for themselves and their own benefit.
Above, first commenter Jacob was headed in a good and encouraging direction with his observation that “young people … reject the idea that they should live hedonist lives alone in their homes”. Mere self love is a shriveled misunderstanding of what humans are meant to be.
Everyone is meant to use some of what they have received for the benefit of others who will need that. We know from economics a bit about how the division of labor and specialization can work to benefit society as a whole. In reality, God’s design runs even deeper than that. We are meant to be different in ways that enable us to contribute to what others need and build up others in love.
David
Apr 29 2024 at 7:50am
Let me just dump here a few random thoughts.
So Russ says, at one point: “as religion diminishes, I think it’s harder and harder to reclaim anything other than utilitarian physical pleasure….And so, you and I, we’re older. We come from a different era. We still have in us some unease about some of these scenarios. I think younger people, particularly secular young people, would find some of our unease both baffling and perhaps silly. Life, if you’re not religious, life’s to be enjoyed, and why wouldn’t you spend it with the best possible experiences with those avatars?”
Quite a dualism that Russ posits here (it’s really not the first time). The divide is roughly as follows: liberal – secular – hedonistic – utilitarian – materialistic – individualistic – lacking spirituality vs religious – spiritual – thoughtful – involved in a community.
In the postmodern fashion, we could go on here to give a lot of examples from daily life of how religion, the way it’s practised by its adherents, is anything but spiritual (just think of Hamas for a second) and how, in contrast, we see one thoughtful atheist after another arriving on this very show most Mondays.
One thing is for certain, though – no matter how many thoughtful atheists arrive on Russ’ show every Monday for a talk, he will stick to his favourite strawman!
Ron Spinner
May 3 2024 at 5:02am
My kids complain that they and their kids are on the cell phone too much. I tell them that we haven’t yet learned how to use the internet and social media and eventually we will figure it out. We now have AI to deal with before we have learned how to control our use of social media.
Maybe the answer is for us to strengthen our self-control muscle. Then we can take whatever comes at us more successfully.
Stephen
May 13 2024 at 12:54pm
Yet another AI show, but still interesting. When the discussion focused on the concept of using an AI avatar of a deceased relative as a therapeutic means to ease grief, I wondered whether any consideration has been given to Artificial Emotion. There’s lots of discussion on EconTalk and elsewhere about a malevolent AI, but “malevolent” implies emotion, not intelligence. Someone can be intelligent but emotionless. Someone can be full of emotion and not very intelligent. So what are we creating with AI? Can it lead to AE? And if so, what would happen to the AI avatar used to overcome grief? Would it then grieve for its patient? Has this been explored?