Will your self-driving car be programmed to kill you if it means saving more strangers?

The computer brains inside autonomous vehicles will be fast enough to make life-or-death decisions. But should they? A bioethicist weighs in on a thorny problem of the dawning robot age.

Source: www.sciencedaily.com

The idea of a car being smarter than its driver is unappealing to most of the people who will be buying cars in the future. What the machine lacks is a conscience, a uniquely human possession. A conscience requires moral training and is shaped by the environment each of us lives in. I don’t think anyone will be able to train a robot or a car to replicate that, or to own the liability that goes with it.
