What Will Happen When a Self-Driving Car Kills a Bystander?

by CSTPR Visiting Scholar, Jack Stilgoe

The Guardian
June 24, 2017

As a social scientist researching emerging technologies, I am fascinated by the bumps, scrapes and abrupt turns of self-driving cars as they accelerate towards the market. Here is a technology whose algorithms are learning how to behave in the wild. For the rest of us to make sense of the opportunities, we need to get beneath the hyper-optimistic story offered by the would-be disruptors. This is why accidents are so important. They shatter the veneer, forcing society and its innovators to confront the very real uncertainties of technology.

In May last year, a crash in Florida made Joshua Brown the first casualty of a self-driving car. His Tesla Model S, in Autopilot mode, failed to see a truck that was crossing his path. Without slowing, his car drove between the wheels of the trailer at 74mph. The Tesla’s roof was torn off and Brown died instantly. Purely by chance, nobody else was harmed.

I have already written about the report earlier this year from the National Highway Traffic Safety Administration (NHTSA), a body whose responsibilities include issuing product recalls for defective cars. The NHTSA report largely exonerated Tesla, blaming the driver for failing to understand the technology’s limits. This week, the other group investigating the crash, the National Transportation Safety Board (NTSB), released its own reports.

The NTSB is an independent body originally set up to investigate plane crashes. It is one reason why the airline industry’s safety record is so good and, more importantly, has improved so quickly over the last few decades. The NTSB’s job is not to lay blame but to find out what happened, so that the chances of things happening again can be reduced.

The 500-plus pages of NTSB reports include interviews, weather reports, blood tests and a detailed autopsy of the Tesla Autopilot system. There is a gruesome medical report of the injuries to the driver’s head as his car passed under the truck, and a transcript from the only witness to have come forward. The witness was surprised that the car was travelling so quickly before and after the crash. He reported seeing:

A white cloud, like just a big white explosion… and the car came out from under that trailer and it was bouncing…I didn’t even know… it was a Tesla until the highway patrol lady interviewed me two weeks later…. She said it’s a Tesla and it has Autopilot, and I didn’t know they had that in those cars.

The car kept going because nobody was in control.

The easy explanation for the crash is that the truck driver was in the wrong place, moving slowly across a road that he didn’t have time to cross. However, a human driver might still have been able to swerve or brake. We know from data Tesla gave to the NTSB that the brakes were never applied. We also know that Brown’s 40-minute journey consisted of two and a half minutes of conventional driving followed by 37 and a half minutes of hands-free Autopilot. While in Autopilot mode, he touched the wheel every five minutes or so in response to the car’s warnings, but otherwise kept his hands off the wheel. There was no evidence Brown was watching a Harry Potter film, as was widely reported after the crash, but an SD card found in the car did contain tunes from the Harry Potter soundtrack. Read more …

Jack Stilgoe is a senior lecturer in the department of Science and Technology Studies at University College London. He teaches courses on science and technology policy, responsible science and innovation and the governance of emerging technologies.

