Do we need robot law? The question might seem to belong more to the world of science fiction than reality; but technological advances have made it a pressing issue. At a British Academy debate on 31st January in London, the chair Hannah Devlin, a Guardian journalist, began with an arresting anecdote.
Last year, a man in a Tesla in self-driving mode was killed when the car collided with a truck that neither the vehicle nor its driver saw. Was the technology to blame, or the human being? And if the authorities don’t have complete access to the technology, how can they adjudicate?
Patrick Haggard FBA, Professor of Cognitive Neuroscience at University College London, set up the argument about the responsibility of robots. “From a cognitive point of view, there are two things you need: you need to know what you’re doing; and you need to know what the result of your actions will be.” As a child, when you hit someone, for example, you realise they will be upset. “A sense of agency is crucial for regulating our behaviour.” Robots, right now, lack this agency.
Could robots learn agency? Haggard thought that, with advances in deep learning and large databases, this could happen. But it would create further problems. When humans learn as children, they proceed by “trial and error.” They have parents to guide them and are usually too small to do much harm. But are we prepared to accept robots making the same mistakes? What happens when an intelligent system running a car makes a mistake—misses an approaching truck, for example?
We will reach a point when “robotics” are deeply embedded within society—and that will mean we will need to “fundamentally change the way society works.”
Professor Noel Sharkey, Emeritus Professor of Artificial Intelligence and Robotics at the University of Sheffield, outlined this brave new world. Amazon is already doing robot deliveries; in New Zealand you can get a pizza delivered by drone. Sharkey himself has a robot vacuum cleaner, and believed that in a short time everyone in the room would have one.
There is very little joined-up thinking on these issues, he said. Who is going to control all these devices flying round our heads? Moreover, many more jobs will be lost than created, and we have to prepare…