A Tesla on autopilot killed two people in Gardena. Is the driver guilty of manslaughter?
On Dec. 29, 2019, a Honda Civic pulled up to the intersection of Artesia Boulevard and Vermont Avenue in Gardena. It was just after midnight. The traffic light was green.
As the car proceeded through the intersection, a 2016 Tesla Model S on Autopilot exited a freeway, ran through a red light and crashed into the Civic. The Civic’s driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.
Nearly two years later, prosecutors in Los Angeles County filed two counts of vehicular manslaughter against the driver of the Tesla, 27-year-old Kevin George Aziz Riad. Experts believe it is the first felony prosecution in the United States of a driver accused of causing a fatality while using a partially automated driver-assist system.
As such, the case represents a milestone in the increasingly confusing world of automated driving.
“It’s a wake-up call for drivers,” said Alain Kornhauser, director of the self-driving car program at Princeton University. “It certainly makes us, all of a sudden, not become so complacent in the use of these things that we forget about the fact that we’re the ones that are responsible — not only for our own safety but for the safety of others.”
While automated capabilities are intended to assist drivers, systems with names like Autopilot, Super Cruise and ProPilot can mislead consumers into believing the cars are capable of much more than they really are, Kornhauser said.
Yet even as fully autonomous cars are being tested on public roads, automakers, technology companies, organizations that set engineering standards, regulators and legislators have failed to make clear to the public — and in some cases one another — what the technical differences are, or who is subject to legal liability when people are injured or killed.
Riad, a limousine service driver, has pleaded not guilty and is free on bail while the case is pending. His attorney did not respond to a request for comment Tuesday.
Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’” Kornhauser said. “Just like when I was driving a ’55 Chevy — I’m the one that’s responsible for making sure that it stays between the white lines.”
After the deadly collision in Gardena, the National Highway Traffic Safety Administration opened an investigation to determine what went wrong. Though the court documents filed in Los Angeles do not mention Autopilot by name, the agency is expected soon to release findings indicating that the technology was engaged.
A parade of similar probes in the years since has continued to question the safety and reliability of automated driving features, including a larger investigation into as many as 765,000 Tesla cars built between 2014 and 2021.
Last year, NHTSA ordered dozens of automobile and technology companies to report crash data on automated vehicles in order to better monitor their safety.
No commercially available motor vehicle today can completely drive itself, the agency said. Tesla’s Autopilot feature is classified as “Level 2” vehicle autonomy, which means the vehicle can control steering and acceleration, but a human in the driver’s seat can take control at any time.
“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” an NHTSA spokesperson said. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
The outcome of the case could set a precedent for an industry idling at the corner of cutting-edge technology and basic human error.
Bryant Walker Smith, a professor at the University of South Carolina and an expert in laws related to automated motor vehicles, said that while companies routinely face civil liability, they “rarely face criminal liability for design decisions.”
Yet Smith takes issue with the assertion that drivers — and only drivers — are liable at or below Level 2 automation. For example, if the airbag of a well-maintained car were to suddenly explode and cause a crash, the driver would not be culpable, he said.
“Conversely, if an automaker sells a dangerous car, they might be civilly liable. And, at least in theory, they might face criminal liability. This, however, is a rare prospect,” he said, noting that when the state of Indiana prosecuted Ford over a crash involving its Pinto model more than four decades ago, the company was acquitted.
A representative for Tesla, which has disbanded its media relations department, could not be reached for comment.
Many legal experts agree that liability for Level 2 systems like Autopilot lies squarely with the driver — not with the companies whose marketing may lead consumers to believe the features are more capable than they are.
But the California Department of Motor Vehicles is struggling with confusion over Tesla’s Full Self-Driving feature, a cutting-edge version of Autopilot intended to eventually do just what the name says: provide full autonomy, to the point where no human at all is needed to drive.
While other autonomous car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its testing with its own customers, charging car owners $12,000 for the privilege.
And while the other autonomous technology companies are required to report crashes and system failures to the Department of Motor Vehicles under its test-permit system, the agency has been allowing Tesla to opt out of those regulations.
After pressure from state legislators, prompted by alarming YouTube and Twitter videos showing Full Self-Driving’s poor performance, the DMV said earlier this month that it is “revisiting” its stance on the Tesla technology.
The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving systems — one that bars companies from marketing their cars as autonomous when they are not.
That review began eight months ago; the DMV described it in an email to The Times as “ongoing.”
Amid the confusion over automated cars, what is less cloudy is the real tragedy that results when they crash.
In 2020, authorities in Arizona filed negligent homicide charges against the driver of an Uber SUV that struck and killed a pedestrian during a 2018 test of fully autonomous capabilities. The victim of that collision, Elaine Herzberg, is believed to be the first pedestrian killed by a self-driving vehicle.
In Los Angeles, the families of Lopez and Nieves-Lopez have filed lawsuits against Riad and Tesla.
Arsen Sarapinian, an attorney for the Nieves family, said Tuesday that they are closely monitoring the criminal case, awaiting the results of NHTSA’s investigative report and hoping for justice.
But, Sarapinian said, “neither the pending criminal case nor the civil lawsuit will bring back Ms. Nieves or Mr. Lopez.”