The answers people give to this question can be both fascinating and disturbing
by Julian Baggini / November 13, 2018
Published in the December 2018 issue of Prospect Magazine
The other day, someone confidently told me that we were heading for a golden age for philosophy. With the growth of AI, he explained, there’s going to be a hell of a lot of work figuring out the ethical rules to govern machine behaviour.
Since it is something of a mantra for me that there is no algorithm for ethics, this filled me with despair. Whether I like it or not, though, we are going to have to devise ethical algorithms to serve as imperfect proxies for the messy moral judgments that until now it has fallen to humans to make.
With this in mind, an international team of researchers has been gathering data from 233 countries on what people believe the life-saving priorities of autonomous vehicles should be, in situations where a fatality is unavoidable. The decisions of over two million people were collected and analysed; the results make for fascinating and disturbing reading.
There are few surprises at the top of the list of people whose lives should be prioritised: babies, children and pregnant mothers. More worrying is that the lives of athletic men and women are valued more highly than those of their overweight counterparts. The homeless also count for less than executives but more than old people. The least-valued humans are criminals, who come after dogs in the list of the world’s priorities, with only cats more dispensable.
What to make of this? The pessimistic response is that it just confirms that moral judgments are not based on thought-through principles but knee-jerk reactions. Worse, these reactions betray morally indefensible prejudices.