Robots are increasingly taking on human tasks. When things go wrong, companies need to be held accountable.

By Noel Sharkey, Aimee van Wynsberghe / January 30, 2017
We could be rushing headlong into a revolution in robotics without due caution. Industrial robots are not a new phenomenon, but there is now an upsurge in service robots for everything from healthcare to the care of children and the elderly; from cooking and preparing food to making and serving cocktails; from domestic cleaning to agriculture and farming; from policing, security and killing in armed conflict to monitoring climate change; and from robot surgery to robot intimacy and protecting endangered species.
There were 4.7 million robots sold for personal and domestic use in 2014, including a 542 per cent increase in assistive robots for the elderly and disabled. This figure is forecast to rise to 35 million by 2018 at a conservative estimate. And the predictions do not include the rapid development of driverless technology. Autonomous cars, trucks and buses are set to change our roads forever and revolutionise our transport and delivery services. Not to mention how our farming practices will change with automated tractors, ploughs and threshing machines.
The lure of massive new international markets is pushing governments and corporations to view robotics as a powerful economic driver, and they are starting to pour funding into its development. Many companies and startups are creating a multitude of new robot applications in what is becoming a highly competitive market that will drive innovation.
Despite the disruptive impact that such automation could have in our workplaces, our streets and our homes, governments are paying little more than lip service to the potential societal and ethical hazards. There are certainly protections for our rights built into existing law, but there are cracks. For example, a case involving parents going off to work and leaving a child in the care of a robot would have to be prosecuted on grounds of neglect. But who is responsible if something goes wrong: the parents or the robot manufacturers? Without clear regulations and laws, major robotics corporations could develop a variety of dubious practices.
We propose that robotics should be treated as an exceptional category that requires its own laws and/or regulations to rule out ambiguity and preserve human values in the face of a new autonomous technology. We need to ensure a clear chain of human accountability for when things go wrong. We need to ensure that humans remain in full control of processes that impact on human lives: for example, life-and-death decisions should never be delegated to a robot.
The European Union parliament is making a first stab at a new robot law and will vote on a report by MEP Mady Delvaux in February 2017. However, it is difficult for policymakers and legislators to understand the extent of the detailed issues and to keep up with such rapidly emerging developments. While the report is a great start on issues such as accountability and responsibility, some of it is unrealistic and wanders into sci-fi tropes. There is clearly scope for independent advice from technology scholars who have no vested interests in funding or the companies involved.
The development of new robot law would benefit from cognisance of the burgeoning literature (from the early 2000s) on the ethical and societal issues in robotics, both in general and on specific applications. We need to bridge the gap between these discussions and legal discussions, and to promote concrete action: developing codes of conduct for responsible and accountable research; developing guidelines for responsible design and manufacturing practices; and advocating for new national and international policy along with the generation of new regulations and laws.
It is vital to this enterprise that all stakeholders are involved. It requires an integrated multidisciplinary approach, combining researchers and designers from law, social science, philosophy and robotics, working in tandem with manufacturers, policymakers and lawmakers, as well as engaging with the public. This is essential if we are to strive for responsible and accountable developments and practice in robotics without stifling innovation or trampling on people’s research or commerce.
This is why we co-founded the Foundation for Responsible Robotics together with many of the world’s leading tech scholars, writers and roboticists in 2015. We are concerned with the sensitive task of working out regulations that limit the risks to our humanity whilst still maintaining progress and innovation in robotics research. If robotics is to have a successful future, we need to ensure that new developments will be created responsibly and with due consideration for basic human rights such as the rights to privacy, dignity, autonomy and life.
The British Academy debate “Do We Need Robot Law?” takes place on 31st January.