Listened to a really interesting podcast in which they likened the current ethical dilemmas associated with driverless cars to the famous trolley problem.
If a driverless car has to “make a decision” between hitting a pedestrian or swerving out of the way and potentially killing its occupant, which should it be programmed to choose? An engineer for Mercedes said their cars will swerve to avoid the pedestrian. But who’s going to buy a car knowing that it will deliberately kill you in order to save someone else?
The podcast also discussed the idea that the sensors on these cars will get so good that they will be able to detect not only a pedestrian, but also that pedestrian’s age, gender, race, etc. If a car carrying a young child is about to hit two elderly people, does the car swerve and kill the child, or does it take age into account and kill the elderly people instead?
Essentially, decisions are going to have to be made, and algorithms developed, that place different values on human life. Interesting stuff.