Ethical questions such as ‘who is responsible when an autonomous vehicle crashes’ are now set to be played out in the legal system
It is a terrible tragedy, but one that seemed inevitable in the development of autonomous vehicle technology. Overnight, in the US, it emerged that a 40-year-old man named Joshua Brown had died when the Tesla he was riding in drove under an 18-wheel truck while in ‘autonomous mode’.
In a blog post published Thursday, Tesla revealed that the National Highway Traffic Safety Administration (NHTSA) will launch an inquiry into the accident.
Tesla could have avoided technical commentary and defensive rhetoric, and instead responded with “We will take this seriously” followed by a heartfelt apology. That would have been enough.
However, aside from the condolences offered to the family at the bottom of the statement, Tesla immediately went into defence mode: it pointed out that autonomous driving must be explicitly turned on by drivers, that drivers are advised to keep their hands on the wheel, and that the technology is not perfect.
In the second sentence, Tesla was quick to point out that this was the first known fatality in some 130 million miles of driving with Autopilot activated, compared with the US average of ‘one death per 94 million miles driven’ or the worldwide figure of ‘one death per 60 million miles’.
The safety argument should be reserved for the NHTSA inquiry; it is not one to be publicly posited in the first paragraph of a ‘condolence’ statement. Tesla just leveraged a family’s tragic loss to highlight the safety advantages of its product. Nice one.
Unfortunately for proponents of driverless vehicles, the community is now forced to confront the cloud that has been hanging over autonomous vehicle development since the technology first moved towards real-world deployment.
The question is: When someone dies in a driverless car accident, who is to blame?
Here is Tesla’s version of events.
What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
In this statement, it is clear the autopilot failed because it encountered a ‘one in a million’ circumstance which bamboozled the technology.
But is that a good excuse? No, of course not.
As Tesla readily admits, the technology is not perfect, and this incident is Example A.
If the statement is to be believed, it also sounds as if Brown, had he been driving a normal car or had a better view of the semi-truck, could have stopped before the accident. So, is it Brown’s fault for not paying enough attention? Isn’t ‘not paying attention’ supposed to be a feature rather than a bug?
What about the truck driver? Was his position on the highway reasonable? Even if the truck had made a risky move, would a human-powered vehicle have been able to avoid the accident? Or, at the very least, make a desperate move to minimise the damage?
As of last night, these questions are no longer game theory for autonomous vehicles.
If the Brown family decides to sue Tesla, the legal teams on both sides will grapple with these questions, and the resulting answers will set concrete legal precedents going forward.
The video below, from Patrick Lin, does a fantastic job of laying out the ethical dilemmas posed by autonomous technology. The difference is that, as of today, it is no longer a ‘thought experiment’.
Anyone with reasonable critical thinking knew there would be a fatality in the driverless car experiment. The moment is here; hopefully it can bring about positive change.
The post Tesla’s autonomous vehicle death brings ethical debate to real world appeared first on e27.