As Wallach, Lin and other ethicists wrestle with the philosophical complexities, Gerdes is conducting real-world experiments. This summer on a racetrack in northern California, he’ll test automated vehicles programmed to follow ethical rules to make split-second decisions, such as when it’s appropriate to disobey traffic laws and cross a double yellow line to make room for bicyclists or cars that are double-parked.
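The kind of rule Gerdes describes could, in principle, be encoded as an explicit safety check. The sketch below is purely illustrative; the class names, thresholds, and logic are my assumptions, not Gerdes's actual system:

```python
# Hypothetical sketch of a "may cross the double yellow line" rule.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str              # e.g. "bicyclist", "double_parked_car"
    lateral_gap_m: float   # clearance available if the car stays in its lane

MIN_SAFE_GAP_M = 1.0       # assumed minimum safe passing clearance

def may_cross_double_yellow(obstacle: Obstacle, oncoming_clear: bool) -> bool:
    """Permit a brief lane-line violation only when staying in lane would
    squeeze a road user below the safe clearance AND the oncoming lane
    is verifiably clear."""
    needs_room = obstacle.kind in {"bicyclist", "double_parked_car"}
    too_tight = obstacle.lateral_gap_m < MIN_SAFE_GAP_M
    return needs_room and too_tight and oncoming_clear

# A cyclist with only 0.5 m of clearance and a clear oncoming lane:
print(may_cross_double_yellow(Obstacle("bicyclist", 0.5), True))   # True
# Same cyclist, but oncoming traffic present:
print(may_cross_double_yellow(Obstacle("bicyclist", 0.5), False))  # False
```

Even a toy rule like this shows why the problem is hard: every threshold and every "is the oncoming lane clear" judgment has to be made by sensors and software in a split second.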
Gerdes is also working with Toyota to find ways for an autonomous car to quickly hand back control to a human driver. Even such a handoff is fraught with peril, he says, especially as cars do more and driving skills degrade.
Ultimately, the problem with giving an autonomous automobile the power to make consequential decisions is that, like the robots of science fiction, a self-driving car still lacks empathy and the ability to comprehend nuance.
“There’s no sensor that’s yet been designed,” Gerdes says, “that’s as good as the human eye and the human brain.”
I have used the human-eye illustration in the past, emphasizing that 40% of the central nervous system's connections join the eye to the brain, double that of most other senses. Eyes are critical to driving, and brains are even more so. The sensors and camera technology we develop will keep improving, but they will augment the eye and brain, not replace them. There is plenty of room for innovation and business in ADAS even if we never get a car to fully drive itself in complex environments.