Ethics of Driverless Cars

Google has been developing the driverless car for over half a decade, and with each passing year these automated chauffeurs move farther away from science fiction and closer to reality. Five states have passed laws enabling such vehicles, and as of April of this year, over 700,000 miles have been driven without an accident.

While commercialization is still a long way off, last week a debate raged over the ethical situations that may arise when a driverless vehicle is put into a situation where a crash is imminent. As a hypothetical example: if a crash were unavoidable and the car had to choose where to steer, should it collide with a school bus or with a Volvo? Physics dictates that hitting the object of larger mass will statistically be the safest, but there are kids on board! What is the correct decision? Is the law, and by extension the public, prepared for cars that “choose” the best outcome?
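To make the “car that chooses” idea concrete, here is a deliberately crude sketch of how a planner might score crash options by expected harm. Every class name, number, and weighting below is hypothetical; a real system would be vastly more complicated, if it ever made such tradeoffs explicitly at all.

```python
from dataclasses import dataclass

@dataclass
class CrashOption:
    description: str
    occupants_at_risk: int     # people in the other vehicle (assumed)
    injury_severity: float     # 0.0 (no injury) to 1.0 (fatal), a rough guess

def expected_harm(option: CrashOption) -> float:
    """Toy harm metric: people at risk weighted by estimated severity."""
    return option.occupants_at_risk * option.injury_severity

def choose_least_harmful(options: list) -> CrashOption:
    """Pick whichever option the planner 'believes' minimizes harm."""
    return min(options, key=expected_harm)

options = [
    CrashOption("steer into the school bus", occupants_at_risk=30, injury_severity=0.2),
    CrashOption("steer into the Volvo", occupants_at_risk=2, injury_severity=0.6),
]
best = choose_least_harmful(options)
print(f"Planner picks: {best.description} (expected harm {expected_harm(best):.1f})")
```

Even this toy version shows where the ethical trouble starts: the answer depends entirely on how harm is defined and weighted, and someone has to write those numbers down before the crash ever happens.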

Patrick Lin at Wired.com asked this question in a blog post on May 6th, and the topic was picked up by Gizmodo, PopSci, and Slate. Patrick was quick to point out that this hypothetical “imminent crash” situation is highly improbable, but so are fatal accidents. In the U.S., a traffic fatality occurs about once every 100 million vehicle-miles traveled; that means you could drive for more than 100 lifetimes and never be involved in a fatal crash. For now, the question “Should a robot be allowed to make decisions that affect the health of human beings, even in the name of minimizing damage?” is more of a thought experiment than a practical problem, but as the technology evolves to be capable of making those types of decisions, ethical discourse should be encouraged.
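For the curious, a quick back-of-the-envelope check of that “more than 100 lifetimes” figure, assuming roughly 13,500 miles of driving per year over a 60-year driving career (both are assumed figures, not numbers from the post):

```python
# Back-of-the-envelope check of the "more than 100 lifetimes" claim.
# Annual mileage and driving-career length are assumptions.

fatality_interval_miles = 100_000_000  # ~1 U.S. traffic fatality per 100M vehicle-miles
miles_per_year = 13_500                # assumed average annual mileage
driving_years = 60                     # assumed length of a driving career

lifetime_miles = miles_per_year * driving_years        # 810,000 miles
lifetimes = fatality_interval_miles / lifetime_miles   # ~123 lifetimes
print(f"Miles driven in one lifetime: {lifetime_miles:,}")
print(f"Lifetimes of driving per expected fatality: {lifetimes:.0f}")
```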


In the future, every device will be able to talk to every other device. What will our commute be like when our cars know who is inside them and what vehicles are around them, and no one speeds? Who is responsible when an automated car makes a mistake, causing property damage or contributing to an accident? Will we see insurance cases where the makers of automated-car software are sued for damages? Some states have proactively passed laws addressing this very question, stating that the human passenger is responsible for the actions of the automated vehicle, but time will tell whether this holds up in court.

The larger challenge isn’t thinking through ethical dilemmas; it’s setting accurate expectations with users and the general public, who might otherwise find themselves surprised in bad ways by autonomous cars. Whatever answer to an ethical dilemma the car industry leans toward will not satisfy everyone.

~PL

Posted in Hardware, Science, Security, Shifting Perspectives, Technology.
