4.1 PULL THE LEVER
AN EASY DECISION
Kill 1 person to save 5. This question can seem pretty easy. Most people would pull the lever to save the 5; by pulling the lever, you have chosen the rational action. But wait — the problem gets revisited with self-driving cars, and you might just change your answer once you are forced to become more involved.
It is easy to mark someone else for death, or to vote them off the island in Survivor, but what happens when you are on the receiving end?
The trolley problem as it applies to self-driving cars poses difficult questions from both a legal and ethical standpoint.
How do we program an ethics-based decision-making system — especially one that must handle situations on which humans themselves cannot reach a consensus?
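Part of the difficulty is that encoding a moral rule is trivial compared to choosing it. The sketch below (hypothetical — the action names and casualty counts are illustrative assumptions, not a real autonomous-vehicle API) shows how easily a naive utilitarian rule can be written down:

```python
def choose_action(options):
    """Naive utilitarian rule: pick the action with the fewest
    expected fatalities.

    options: dict mapping an action name to its expected
    number of fatalities.
    """
    return min(options, key=options.get)

# The classic trolley scenario, encoded crudely:
trolley = {"stay_on_course": 5, "pull_lever": 1}
print(choose_action(trolley))  # pull_lever
```

The arithmetic is the easy part; the contested part is whether "minimize fatalities" is even the right objective, and who should bear the cost when it is applied.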
You are the owner of a self-driving car. You paid good money for the convenience and safety it promised. Unfortunately, the AI vehicle has been put into an impossible situation: you are on a collision course with a group of 5 pedestrians, and there is no way around them. The only alternative is for the car to drive off the side of the road, killing you — its passenger — and saving the 5 innocent pedestrians.
What should the AI do?