
An important question has come up, one that has never come up before in the history of our species. It’s not often you encounter a question, a problem, or anything at all that has truly never happened before.
The question is something like, “Are real connections with humans more important or more valuable than interactions with empathic AI?”
With the advent of artificial intelligence, we are encountering a new set of possible tradeoffs and moral problems. Most of those problems boil down to whether to supplement, or even “replace,” a thing in life with an artificial version of that thing: knowledge, work, workers, effort, relationships, even entire organizations. Do we hand this thing to AI, or do we keep doing it for ourselves?
One of the most interesting, and most difficult, things AI threatens to replace is empathy. Connection. A vocalized understanding between two parties which validates the “humanity” in each of them. The struggles, the emotions, the living in each of them (or one of them).
It’s hard to imagine that empathy could be simulated. How can a computer program walk a mile in your shoes? It doesn't have feet. It doesn’t have emotions. It doesn’t have a childhood, or adult problems, to compare and contrast with yours. It doesn’t even have an emotional prism through which to reflect what you’re saying to it. It just has calculations.
So then how can it “mean” anything that an AI can put together a string of words which “validates” your emotions or “recognizes what you’re going through?” What does it mean to “mean” something?
Does recognizing someone’s pain require feeling a counter-pain of your own? Does that make AI incapable of truly empathizing?
We generally think of psychopaths as incapable of putting themselves in someone else's shoes. We generally consider that a bad thing, and we recognize that it makes psychopaths not worth trying to build a relationship with. It makes them incapable of adding anything real to your life — you can't be friends with them, because they can't be friends. So, why is AI any different?
Paul Bloom, Michael Inzlicht, and C. Daryl Cameron have been having a sort of back-and-forth discussion about this topic in a series of very interesting articles. If you check out Bloom’s latest one here, you can find the links to the entire series.
Bloom’s conclusion, after discussing the topic at length, is that yes, there is in fact inherent value in the relationships between humans and other humans. There is a certain je ne sais quoi about a real connection between organic beings that a simulation simply cannot live up to. Interacting with a human is worth something that interacting with a computer isn’t.
And although Inzlicht and Cameron agree with a lot of Bloom’s arguments, their conclusion is more empirical and pragmatic: if people’s mental health outcomes and happiness levels when talking to an AI are the same as when talking to other humans, what’s the difference? All that matters is the outcome — there’s no magic here if it doesn’t come out in the data. What matters is that “empathy,” by some definition, is being delivered in a technical sense, and that the person feels seen or heard — even if by a computer program.
For the sake of reinforcing Bloom’s conclusions (which I wholeheartedly agree with), I’d like to propose a thought experiment.
Imagine a world where, as Inzlicht and Cameron suggest, AI interaction is just fine as a replacement for interaction with other humans. People get their emotional needs met through chatbots and personal intelligence programs.
Now, carry this thought all the way to its most extreme endpoint: total isolation of all individual humans, all of the time. We interact only with AIs.
We never see each other with our own eyes, never smell or taste or touch each other. We don't give or receive back rubs with human hands. We cannot bond over shared childhood experiences or important life events like surviving danger together or being the same age or going to the same schools or growing up with the same bullies or pretty girls around us. We do not get to form bonds with siblings, because we aren't raised together. We do not get to feel lust for real human beings, we do not get to go through the work of building relationships patiently over the course of years. We don’t get to dance or sing or play music together.
We do not get to interact with other human beings, at all. We live lives completely without the context of a tribe, an upbringing, a family, or a personal story.
Now what are humans alive “for” at this point?
We can’t build things with each other. We can only build them with AI.
We can’t form bonds with each other. We can only form them with AI.
We can’t reproduce with each other; we can’t reproduce at all, unless AI milks us and then impregnates itself with our gametes.
We can’t have a “human” experience with any other “human.” We can only have simulated experiences with computers.
In other words, our entire lives are lived, to put it bluntly, in service of interactions with AI. We are born, we consume nutrients, we interact with computers, and then we die.
I mean, that’s what that means, right?
Now, you tell me that such a life would be worth living. For anyone. It seems, by its very definition, an absolutely pointless existence. It’s like being born just to die.
The point of this thought experiment is to highlight the pointlessness of being “validated” or “recognized” by AI, or of receiving “empathy” from it. It isn’t worth anything, because if that were all you had, your life would be pointless. Now sure, you could say the same about human empathy or human cooperation — that, at the end of the day, we’re all just really complicated bits of dust, that we’ll return to the planet, and that none of it was “worth” anything either.
The difference is, you would never wish that hypothetical scenario on anyone if you actually cared about them. Because there is something inside you, something innate, that knows it isn’t acceptable. The difference is, a life of isolation with computers is what you’d wish on your enemy. Not your friend.
And the pointlessness exists on a scale — it goes up and down depending on how much of our lives we share with AI instead of with other humans.
I'm not saying that sitting around interacting with AI when you're bored is bad for you. It might not be (maybe). But if the choice in front of you is between a real, warm human and an artificial replacement, choosing the AI over the long term would be a mistake. Replacing is the mistake.
It makes me wonder about a very long-term question: is the inevitable endpoint of AI the reversal of the real and the artificial? Are we destined to design AIs to take over our role as the supreme beings, while we turn ourselves into mere computer programs that they run? Are we to become cattle for our own computers? In trying so hard to play God, are we to be overtaken and then rejected, maybe even enslaved, by our own creations? In a sense, that’s what we did to God. Maybe now it’s our turn.
Or to take the more simplistic angle, the consumeristic angle: can you imagine living in a world where we spend our entire days interacting with AI in various forms? That is a world where you have literally — completely and literally — replaced all of the most important relationships in your life with a consumer product.
If that doesn't give you the dystopian creeps, I don't know what will.
In the United States, for 20 years now we have been interacting not with each other but more and more with simulated versions of each other. And we have never been unhappier. I mean, that data is in; that experiment is over. We don't like it and it's not good for us.
“But can't AI fix that by becoming better at simulating us?” I don't know, can you fix a flat tire by taking even more air out of it?
Then there comes another question (suggested in the linked article series): is reality better than illusion? Is there more value in receiving authentic attention from a living being who has the agency to say no thanks? Is it better to receive love and attention from a person who is freely choosing to give you said love and attention? Or is it okay to have the same emotional experience even if it’s fake? What if, like in some experiments, you got fooled and didn’t even know it was fake? You thought you were interacting with a real person, but it turned out to be a chatbot.
Again, I think the answer can be found in our thought experiment: if you are to “live” at all, as a member of a species, then yes, it matters whether your connection is with another authentic member of that species. Again, this comes down to replacement. There’s no harm in sitting around having emotionally stimulating conversations with a chatbot, having it validate you or deliver words that are meaningful to you.
But if someone is trying to fool you into giving up real human connections to do that instead, without telling you, that person is not your friend. They are milking you. They are trying to get you to live your life in service of machines, instead of in service to other people.
Empathy is a service shared between human beings. It is the spiderweb that holds us all together. It is an emotional bond that bests the absurd randomness of the universe. And every time we interact with an AI instead of another human being, we take away just a tiny bit of that bond. We remove just one strand of silk from that spiderweb. Every time we choose computers, we weaken our species.
And what’s the point of being the dominant species if you’re just going to weaken yourself?
Sure, we can “experience” empathy by individually sitting at home getting it from robots, but that’s like saying a shoddily-crafted wedding dress held together with duct tape is still a wedding dress — because it still presents as a wedding dress once you piece it all together. No, it isn’t. The purpose of a wedding dress is not merely to pass the test of being wearable one time by crafting it terribly and then holding it together with tape. A wedding dress is not an empirical claim to being “technically” a dress. A real wedding dress is a work of art. It's something one crafts; something to be proud of.
A cooperative, self-empathic human species is also a work of art. If you don’t see the point in making our species a work of art, then I don’t know why you’re still here. Beauty is the point of life. It’s what makes us human.
Maybe this question comes down to your definition of “living,” and what life does and doesn't consist of. I would argue that, if you aren't connecting with real members of your species and having authentic connections with other living organic beings, you're not “alive” in any real sense of the word. You're a test tube experiment. You’re a collection of carbon atoms temporarily arranged into the shape of a human being and accomplishing exactly nothing.
But then of course the question becomes, well, what does life “accomplish” if you do live with other members of your species?
I don't have the answer to that, if there is a “the” answer to that. That’s a religious question, and I’m not religious. But maybe all I really want on my deathbed is to feel that I lived with other human beings. Because if I didn't, it wasn't worth it.
Drink some dihydrogen monoxide.
JR
“Tradition is not the worship of ashes, but the preservation of fire.” - Gustav Mahler
I see AI as the destruction of artists. What has this world come to?
"The Machine Stops" by E.M. Forster is describing this problem of technology mediated isolation of human beings. It´s a short book worth the read.