Business and society
Many people believe that autonomous cars will save a great many lives, perhaps tens of thousands every year in the U.S. alone. Does that prospect reduce the need to think about rare ethical dilemmas in which a few innocent people may be harmed? In other words, does the greater good or overall utility excuse any bad outcomes? Can you think of scenarios, whether involving robot cars or anything else, where the greater good does not justify a bad action, such as a few wrongful deaths?