“One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?”
This fascinating piece in MIT Tech Review discusses why carmakers will need to grapple with a seemingly impossible ethical dilemma of algorithmic morality before self-driving cars become widespread.
Read the full story here.