Tesla Autopilot Crashes: With at Least a Dozen Dead, “Who’s at Fault, Man or Machine?”


Andres Jasso/Unsplash

After a Tesla on autopilot recently killed two people in China, and with many other drivers reporting self-driving system malfunctions, the automaker is facing increased scrutiny over its technology, yet it regularly attempts to shift the blame

by Lauren Richards

From July 2021 to October 2022, the US Department of Transportation reported 605 crashes involving vehicles equipped with advanced driver assistance systems (ADAS) – aka autopilot – and 474 of them were Teslas, nearly four out of five of the accidents.

On November 13, a disturbing video of a deadly Tesla crash in Guangdong, China, went viral. The footage showed a Tesla appearing to try to park, then moments later swerving erratically back onto the road and accelerating uncontrollably until it finally crashed into a building, killing two people and injuring three more.

And China matters to Tesla: it is the company’s second-largest market, accounting for 60 percent of sales. So what happens in China can adversely affect the company’s bottom line.

Concerningly – now that Twitter is in Elon Musk’s hands – many users reported that tweets containing the disturbing footage kept getting deleted from the platform.

The driver of the white Tesla Model Y – an unnamed 55-year-old – reportedly lost control of his vehicle and claimed he had an issue with the brake pedal as he attempted to stop outside his family’s store. He asserted that although he repeatedly tried to apply the brakes throughout the few minutes of acceleration, the car’s automated systems malfunctioned and a technical problem prevented him from stopping the vehicle.

The November 5 Tesla crash in Chaozhou, a city in China’s southern Guangdong province, killed a motorcyclist and a high school girl on a bicycle.

Tesla has pledged to assist local police in investigating the fatal incident, but it has denied all allegations that its vehicle or technology is to blame for the crash, citing both the footage, which shows no brake lights illuminated, and its own data logs, which reveal no attempt by the driver to depress the brake pedal for the duration of the uncontrolled journey.

In its own investigation of the events leading up to the crash, Tesla reported that it was in fact the accelerator that was heavily engaged throughout.

This is not the first time Tesla’s pioneering autopilot technology has been linked to fatal road accidents, nor is it the first time the automaker has dismissed suggestions that its own software or hardware was at fault.

Self-driving cars under scrutiny

The National Highway Traffic Safety Administration (NHTSA) – the US federal government’s road safety agency – has since June 2021 required all automakers and vehicle tech companies to provide “timely and transparent notification” of any road accident involving automated or advanced driver assistance systems (ADAS) within 24 hours.

Of the 600-plus collisions reported since last summer, 18 have been fatal, most of them involving Tesla vehicles; two of those fatal crashes were reported to NHTSA between September and October this year. Since June, the agency has also upgraded its special investigation into possible defects in 830,000 Tesla vehicles.

Fatalities and near misses linked to Tesla autopilot have been stacking up around the world for several years; Elon Musk’s company has denied responsibility for many of them, and in some cases has filed defamation lawsuits against its accusers.

The first known death tied to Tesla’s self-driving functionality happened in 2016, when 40-year-old Joshua Brown’s Model S, driving on autopilot, went full-speed into a white 18-wheel truck after its sensors failed to distinguish the trailer against the bright sky. Brown was killed when the top of his car was “torn off by the force of the collision.”

Another three people were killed last year in two separate Tesla autopilot crashes; both accidents were caused by self-driving Model S cars veering off the road and bursting into flames after hitting trees. Both incidents raised questions about the fire safety of the cars’ lithium batteries as well as their operating systems, as reports revealed that at least one of the passengers may have been killed by the fire rather than the collision, having been unable to open their door.

Tesla has shared condolences in relation to these tragedies, but the take-home message of its responses has mostly been about shifting the blame.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” said Tesla.

In August this year, an activist group released a report showing some Tesla cars in self-driving mode were unable to detect children in the road.

The report is part of a broader campaign, The Dawn Project, to independently test Tesla car safety. The project is funded by Dan O’Dowd, a California software entrepreneur and billionaire who has often criticized Elon Musk, holding him responsible for Tesla’s “reckless,” unsafe self-driving vehicles.

On Tuesday this week, the Australian government also recalled more than 1,000 Teslas over a software calibration issue causing steering defects, warning that the “Electronic Power Assist Steering system (EPAS) may not operate as intended.”

With its technology under increased scrutiny, Tesla has also begun cracking down hard on critics who voice their concerns.

Last April, at the Shanghai auto show, a woman climbed atop one of the Teslas on display to protest that she had almost died when her car’s brakes failed – one of two vocal anti-Tesla Chinese citizens who were later silenced and sued by the car giant.

Zhang Yazhou and Han Chao were both forced to pay compensation and publicly apologize for making “unverified, ungrounded” claims that wrongfully tarnished Tesla’s brand.

As these reports of Tesla-related accidents have piled up, many have suggested the company’s marketing strategy may be partly to blame: playing up the automated functionality to a level that appears to remove responsibility from drivers could make them inattentive and over-reliant on the vehicle’s autopilot capabilities.

Autopilot undermines human agency

In the first case to attempt to navigate the tricky landscape of human agency in self-driving cars, Kevin George Aziz Riad is currently facing trial after crashing his Tesla Model S into another car in 2019, killing the two people inside it.

On the face of it, the verdict in Riad’s case seems obvious, but given that his car was in autopilot mode at the time of the crash, the jury is faced with a difficult question. As law professor and self-driving car expert Edward Walters put it: “who’s at fault, man or machine?”

Tesla is not facing any criminal charges over the incident, but the nature of the crash has created space for a much-needed conversation on the philosophical issue of driver vs. vehicle autonomy, a topic that is sure to surface time and time again as more manufacturers roll out autopilot and driverless features.

This landmark case in California, the recent fatal crash in China, and many other autopilot-related road accidents all underline the urgent need for more clarity on the safety of self-driving vehicle technologies.

But as well as raising questions about autonomy, safety, and responsibility, these reports have also opened the floor for consumers and autonomous vehicle corporations to debate a deeper question: behind the wheel, who is more trustworthy, humans or technology?

Silicon Valley has always believed in the superiority of technology, and that self-driving cars would result in fewer accidents and fewer road deaths. This belief is even reflected in the optimistic tone of the US Department of Transportation’s page on “Automated Vehicle Safety.”

But that belief is bound to be shaken as more road accidents expose the limits of the technology.

It is encouraging that the Transportation Department has developed an “interactive AV test tracking tool” to keep the public informed about developments in this area. As time passes and evidence accumulates, an answer may begin to emerge.


About the Author

Lauren Richards

Lauren is a research scientist turned writer, currently working in open-access publishing in London alongside a Journalism Internship at Impakter. As a graduate of Medical Biological Sciences, Lauren’s origins in science have taught her to be forever curious, which is reflected in her love for sharing new concepts, perspectives, and ideas. When not reading/writing about science, culture, art, and everything in between, Lauren can most likely be found in a coffee shop or travelling.

First published in Impakter.


