Who’s to blame when a self-driving car crashes?

While we are constantly told by Silicon Valley geniuses that software will always make safer decisions than a human, it’s clearly not the case

In March 2018, Elaine Herzberg became the first person to be killed by a robotic, autonomous car. As she crossed the road in Tempe, Arizona, a Volvo SUV being tested by Uber failed to take avoiding action and hit her.

Elaine Herzberg will some day be the answer to a question in a television quiz show, simply because she chose the wrong moment to step off a kerb and into the street. In March 2018, Herzberg became, tragically, the first person to be killed in a collision with a robotic, autonomous car. As she crossed the road in Tempe, Arizona, that fateful evening, a Volvo SUV being tested by the taxi and ride-hailing service Uber failed to take avoiding action and struck and fatally injured her.

Who's to blame? Not Uber, at any rate. Yavapai County attorney Sheila Sullivan Polk stated in a letter last month: "After a very thorough review of all evidence presented, this office has determined that there is no basis for criminal liability for the Uber corporation." It's possible that Rafaela Vasquez, the Uber employee who was in the car at the time and was supposed to have been shepherding the autonomous systems as they drove the car around, might yet face criminal charges. Ms Vasquez had been watching TV on a smartphone at the time of the crash.

To say that the Uber incident opened up worm-filled cans that the car and tech industries, so hell-bent on driving us into an autonomous future, would rather have kept sealed is putting it mildly. Although Uber was cleared of any wrongdoing, many of those who once evangelised about robot cars have, since the incident, taken a hasty step back.

According to Philip McNamara of Mobility X: “As much as people would like to say that Level 5 [fully-autonomous, with no input from the driver] autonomy will be reached in the next couple of years, I’m not so sure. The systems are really good for motorway use, but not so great on rural roads. I think you could be looking at maybe as much as 20 years for Level 5 autonomy to come to fruition. I mean, the work being done by the likes of Nvidia in AI vehicles is amazing, but even then I don’t think it can deal with absolutely every situation that you can find on the road.”

Software

Software will get better, of course, but our roads won’t and neither will we. And we, we humans, will be sharing road space with software-driven cars for some decades to come (assuming that software-driven cars ever get beyond the lab and limited city-centre use, which is not quite the given it once was). Accidents are going to happen, and while poor Ms Herzberg was the first, she won’t be the last.

Which is where the liability question comes in. If you or I drive in a manner that causes someone's death, however accidentally and unintentionally, we can be prosecuted and jailed for that death. What do we do with software? Well, according to Mobileye, we design it so that it can never be liable. Mobileye is an Israeli company at the forefront of the kind of sensor technology that robotic cars will rely on to 'see' the world around them, and its solution to the problem is to make robotic car-driving software 100 per cent legal, so that it can never be blamed as the specific cause of any crash. That's not necessarily the same as being safe - after all, it's technically illegal to cross a solid white line on a road, but if you're doing it to swerve away from a tractor that has just unexpectedly emerged from a roadside field, you'd be preventing an accident. Mobileye's approach seems to suggest that a robotic car would accept such a collision in order to remain entirely legal.
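To make that trade-off concrete, here is a minimal, purely illustrative sketch - not Mobileye's actual system; the manoeuvres, risk figures and function names are all invented for this example - of how a planner that ranks legality above collision avoidance can choose differently from one that ranks them the other way round:

```python
# Hypothetical, simplified planner sketch. Not Mobileye's real system:
# the manoeuvres and their properties are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Manoeuvre:
    name: str
    is_legal: bool         # stays within the letter of the road rules?
    collision_risk: float  # estimated probability of a collision, 0.0-1.0

# The tractor scenario from the text: braking in-lane still risks a
# collision, while swerving across the solid white line avoids it.
options = [
    Manoeuvre("brake hard in lane", is_legal=True, collision_risk=0.7),
    Manoeuvre("cross solid white line", is_legal=False, collision_risk=0.05),
]

def legality_first(options):
    """Never break a road rule; pick the safest legal manoeuvre."""
    legal = [m for m in options if m.is_legal]
    return min(legal or options, key=lambda m: m.collision_risk)

def safety_first(options):
    """Pick the lowest-risk manoeuvre, legal or not."""
    return min(options, key=lambda m: m.collision_risk)

print("legality-first chooses:", legality_first(options).name)  # brake hard in lane
print("safety-first chooses:", safety_first(options).name)      # cross solid white line
```

Under these made-up numbers, the legality-first policy accepts a seven-in-ten chance of a collision rather than cross the white line; the safety-first policy breaks the rule and avoids the crash. Which ranking the law should demand is exactly the question the courts have yet to settle.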

Assumptions

Michael Nelson, a risk expert from Eversheds Sutherland in New York, argues that such assumptions probably won’t wash, legally speaking. “Regretfully, I do think that we’re going to have to run through the courts, and we’re going to get a very mixed bag because we always do, and technology always outpaces the law. There’s only so much forethought that regulation or legislation can provide. Can you imagine the myriad of scenarios that are going to play out over the next 20 years?

“I think that the potential threat to getting this technology out into the field, where it can do good, is the absence of a rigorous risk-transfer system, with a clarity of law, and a great insurance compensation system in place. I think the lack of that will impede the introduction of this really important technology, which will save lives. The Uber accident in Arizona was widely talked about, but what wasn’t widely talked about is that 13 other pedestrians were killed by automobiles that same night in Arizona.”

Therein lies the rub. While we are constantly told by the Silicon Valley geniuses that software will always and forever make better, safer decisions than a human, it’s clearly not the case. Even the best software makes mistakes, throws errors and gets caught in runtime loops. The phone in your pocket runs some of the most sophisticated software ever conceived, but you still occasionally need to reboot it to pick up the wifi signal. Equally, the best facial recognition software - and facial recognition is a relatively simple pattern-recognition task compared to the immense complexity of driving a car - is only around 85 per cent accurate, and that’s on a really, really good day. Worse still, as seems to be the case in the recent tragedies involving Boeing’s 737 Max airliner, overly complex software can trigger disaster even with the most experienced and highly trained humans still in the control loop.

The conclusion? Don't listen to the autonomous vehicle hype, and sure as heck don't believe any of it when you do hear it. As Michiel van Ratingen, Euro NCAP secretary general, said: "Euro NCAP's message is clear - cars, even those with advanced driver assistance systems, need a vigilant, attentive driver behind the wheel at all times. It is imperative that state-of-the-art passive and active safety systems remain available in the background as a vital safety backup."