“Are we ready for robot relationships?” asked chair Luke Dormehl at the British Academy debate on 21st February. A staple of science fiction, the question isn’t as strange as it first seems. From SatNavs to Apple’s Siri, talking robots are now a key part of everyday life. “The robots are ready for us, but are we ready for them?” asked Dormehl. “How will our relationships with these emerging technologies develop, and what will they look like? Are sex robots an actual thing, and if they are, should they be embraced physically and emotionally, or worried about?”
The first speaker was Margaret Boden, a Research Professor of Cognitive Sciences at the University of Sussex. She opened by saying: “I suppose my initial response to the question ‘Are we ready for robot relationships?’ is that perhaps we’re too ready for them.” The ubiquity of phones, she argued, has made young people totally reliant on technology. “I believe that there are some young men—I assume they’re young men—maybe not, who actually speak to Siri more than 250 times a day to try and have sexual conversations.” They are using Siri as they would porn, said Boden, and are not deceiving themselves that they are having an actual relationship. In a different category are so-called robot carers or robot companions. She said she was worried that such carers were being used in old people’s homes to provide “conversation, comfort, solace,” because they can only provide an imitation of the real thing. They can’t laugh at your joke, or groan at it. If a robot expresses sympathy and says, “I’m sorry,” it can’t really be sorry.
There’s a little robot seal called Paro, she went on, that you can cuddle, and it does give people some comfort. But it doesn’t talk. So it’s not pretending, it’s giving you the sort of comfort which a teddy bear could give. “It’s not possible to have a genuine relationship with one of these things. They’re not human beings. They aren’t even dogs.”
Kathleen Richardson, Senior Research Fellow in the Ethics of Robotics at De Montfort University, questioned whether it would really be progress if we were to merge ourselves with machines. Rather, by blurring the lines between humans and robots, we risk dehumanising ourselves. “I want to suggest the origin story of relational robots and AI is about property,” she said. To create robots as friends or lovers relies on a narrow and instrumental view of relationships. The other becomes “something that you can use rather than relate to.”
“We only give meaning to our existence because of others. Because we love and we have been loved.” Have we lost so much hope in other people that we wish to use robots to replace the genuine human relationships we should be cultivating—especially with the old?
John Danaher is a lecturer at the School of Law, NUI Galway. He said he was going to “play the villain of the piece” because he was going to say something positive about robot relationships. Firstly, “robot relationships are already happening, and they are inevitable.” Take the story of Boomer, killed in Iraq in the late 2000s and given a full military burial, complete with a 21-gun salute and a Purple Heart. The catch was that Boomer was a bomb-disposal robot. He was such a “valued comrade,” said Danaher, that his fellow soldiers thought he deserved a proper burial.
Is this a good or a bad thing? “I don’t think there’s anything intrinsically problematic about a professional-style relationship with a robot,” he said. You might regret the unemployment, but if your barista were a robot, that doesn’t cause many ethical problems. What about having a robot as a friend? Aristotle distinguished three styles of friendship: the Utility Friendship (the rich person who gives you a contact); the Pleasure Friendship (someone you play tennis with, for example); and the Virtue Friendship, the style he valued most highly, which involves two roughly equal people interacting in a meaningful way. Maybe robots can’t be your Virtue friend, but so what?
“There are reasons to be concerned here, and I don’t want to underplay them,” he said. But: “I think robots could help to facilitate a more equal footing between partners. If we can outsource some of our need for utility and pleasure to the robot, that might clear or open up a pathway to Aristotelian friendships.” He cited the example of an autistic child who developed a “friendship” with Siri, which was always kind and never got frustrated with his behaviour.
Nicole Dewandre, a researcher based in the European Commission Joint Research Centre, said that we will not be able to relate to robots unless we change our definition of what it means to be human. Descartes suggested rationality as a criterion to distinguish man from nature. We have naturalised rationality as a proxy for humanness so deeply that we end up looking at computers and robots as if they were human.
“When women look in the fridge to see if there is milk, this is not deemed to be a very smart task. But when the fridge is programmed to send a signal when there is milk missing, then it is a smart fridge.” So we shouldn’t be freaked out by the so-called achievements of these artefacts.
In The Human Condition, Hannah Arendt argued that we are equal because we are all human, and that we are not reducible to our attributes. “This combination between equality and uniqueness is wonderful,” said Dewandre. “In conclusion, we shall be ready for robot relationships,” but only if we make big changes to the way we perceive ourselves as human.
The Chair grasped the nettle and asked the panel about sex robots. Kathleen Richardson said it was important to distinguish between masturbation aids, which have been around a long time, and sex with another human being. In effect, you cannot have sex with a robot because it isn’t human; but you can have an orgasm with it. Since it cannot experience pleasure, the experience lacks the mutuality necessary for a true sexual experience. Margaret Boden agreed: “If it’s going to be personal love, that can only happen between two people, two systems if you like, which have complex motivation, their own interests, their own preferences, and they recognise their own and the other person’s preferences.”
John Danaher said it was inevitable that such robots would be created; in fact, they already had been. Some have disturbing functions involving paedophilia and rape, and in order to counter that, “We should at least try to regulate them so that more positive attitudes towards consent are represented in these artefacts.”
With additional help from Sophie Marquand