
Feds probe Tesla Autopilot in Newport Beach crash that killed 3

Debris is collected during an investigation into a crash that killed three people in a Tesla Model S on Pacific Coast Highway on May 12. This week, the National Highway Traffic Safety Administration said it had sent a team to investigate whether the Tesla had its Autopilot system on.
(Don Leach / Times Community News)

Federal authorities are investigating whether a Tesla involved in a crash that left three people dead and three others injured last week in Newport Beach had its Autopilot system activated at the time of the wreck.

A special crash investigation team was sent for the May 12 incident on Coast Highway, the National Highway Traffic Safety Administration said Wednesday.

In that crash, Newport Beach police were called around 12:45 a.m. to the 3000 block of Coast Highway, where they found a 2022 Tesla Model S sedan had crashed into a curb and hit construction equipment.


Three people were found dead in the Tesla; they were identified last week as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff’s Department.

Three construction workers suffered injuries that were not life-threatening, police said, adding that the department’s Major Accident Investigation Team had been brought in.

Tesla, which has disbanded its media relations department, did not respond Wednesday to a request for comment from The Times about NHTSA’s investigation into the Orange County crash.


The federal investigation is part of the agency’s broader probe into crashes involving advanced driver-assist systems such as Tesla’s Autopilot. Investigators have been sent to 34 crashes since 2016 in which the systems were in use or suspected of being in use; 28 of those crashes involved Teslas, according to an NHTSA document released Wednesday.

In those 34 crashes, 15 people were killed and at least 15 others were injured, and all but one of the deaths occurred in crashes involving Teslas, according to the document.


NHTSA told The Times on Wednesday evening that it does not comment on open investigations.


In addition to those crashes, NHTSA is investigating several incidents in which Teslas on Autopilot crashed into emergency vehicles parked along roadways despite flashing lights or hazard cones, as well as a string of complaints that the Autopilot system triggered “phantom braking” at high speeds for no apparent reason.

NHTSA also is investigating two crashes involving Volvos, one Navya shuttle crash, two involving Cadillacs, one in a Lexus and one in a Hyundai. One of the Volvo crashes involved an Uber autonomous test vehicle that struck and killed a pedestrian in Arizona in March 2018.

In Los Angeles County, the district attorney’s office filed in January what experts believe is the first felony prosecution in the United States of a driver accused of causing a fatality while using a partially automated driver-assist system.

The charges came two years after the crash in Gardena. Kevin George Aziz Riad, 27, was at the wheel of a 2016 Tesla Model S on Autopilot on Dec. 29, 2019, when it exited a freeway, ran through a red light and crashed into a Honda Civic.

The Civic’s driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.

Riad faces two counts of vehicular manslaughter.

Tesla has warned drivers using Autopilot, as well as its so-called Full Self-Driving system, that the cars can’t drive themselves and that drivers must be ready to intervene at all times.


Last June, NHTSA ordered dozens of automobile and technology companies to report crash data on automated vehicles in order to better monitor their safety.

No commercially available motor vehicle can completely drive itself, the agency said. Tesla’s Autopilot feature is classified as “Level 2” vehicle autonomy, which means the vehicle can control steering and acceleration, but a human in the driver’s seat can take control at any time.

“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” according to an NHTSA spokesperson. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

Many legal experts say liability for crashes involving Level 2 systems such as Autopilot lies squarely with the driver, not with companies marketing technologies that may lead consumers to believe the features are more capable than they are.

But the California Department of Motor Vehicles is struggling with confusion over Tesla’s Full Self-Driving feature, a cutting-edge version of Autopilot intended to eventually do just what the name says: provide full autonomy, to the point where no human is needed to drive.


Although other autonomous car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its testing using its own customers, charging car owners $12,000 for the privilege.


Other autonomous technology companies are required to report crashes and system failures to the DMV under its test-permit system, but the agency has been allowing Tesla to opt out of those regulations.

After pressure from state legislators, prompted by scary videos on YouTube and Twitter pointing out Full Self-Driving’s poor performance, the DMV said in January that it was “revisiting†its stance on the Tesla technology.

The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving systems — one that bars companies from marketing their cars as autonomous when they are not.

Times staff writer Russ Mitchell and the Associated Press contributed to this report.
