
Your Tesla could explain why it crashed. But good luck getting its Autopilot data

Emergency personnel work at the scene where a Tesla Model X SUV crashed into a barrier on U.S. Highway 101 in Mountain View, Calif., in 2018, killing engineer Walter Huang.
(Associated Press)

On Jan. 21, 2019, Michael Casuga drove his new Tesla Model 3 southbound on Santiago Canyon Road, a two-lane highway that twists through hilly woodlands east of Santa Ana.

He wasn’t alone, in one sense: Tesla’s semiautonomous driver-assist system, known as Autopilot — which can steer, brake and change lanes — was activated. Suddenly and without warning, Casuga claims in a Superior Court of California lawsuit, Autopilot yanked the car left. The Tesla crossed a double yellow line and, without braking, drove through the oncoming lane and crashed into a ditch, all before Casuga was able to retake control.

Tesla confirmed Autopilot was engaged, according to the suit, but said the driver was to blame, not the technology. Casuga’s attorney, Mike Nelson in New York City, asked Tesla to release the data to show exactly what happened. Tesla refused, the suit claims, and referred Casuga and his lawyer to the car’s event data recorder, known as the black box. But the black box — a common feature in cars since the early 2000s — doesn’t record Autopilot data. Autopilot information is captured and stored separately, often sent over the airwaves to Tesla’s remote cloud computer repository.
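As a rough illustration of that split, a standard event data recorder keeps only a brief snapshot of basic vehicle dynamics, while the driver-assist record described above lives in a separate, manufacturer-held data stream. The sketch below is purely illustrative; the field names are hypothetical, not Tesla’s actual formats.

```python
# Purely illustrative sketch; hypothetical field names, not Tesla's actual formats.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class BlackBoxRecord:
    """Roughly what a standard event data recorder captures: a few seconds of
    basic vehicle dynamics leading up to a crash."""
    speed_mph: List[float]
    brake_applied: List[bool]
    throttle_pct: List[float]
    seatbelt_buckled: bool
    airbag_deployed: bool

@dataclass
class DriverAssistTelemetry:
    """The separately stored, manufacturer-held record described above: whether
    the system was engaged and what it decided to do."""
    assist_engaged: bool
    steering_commands: List[float]         # what the system asked the wheel to do
    braking_commands: List[float]          # what the system asked the brakes to do
    driver_takeover_time: Optional[float]  # when, if ever, the human retook control
```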


Finding out who or what caused a car crash should be easier today. Cars have become computers on wheels, bristling with sensors, data processors and memory chips. The information “is significantly better and more potentially useful than ever,” said Jason Levine, executive director of the Center for Auto Safety, an advocacy group.

But the ownership and accessibility of much of those data are in flux, as legislators and regulators play catch-up with the fact that human beings and mobile robot systems increasingly share the driving.


Casuga’s lawsuit is an attempt to get a court order for Tesla to turn over his car’s data, Nelson said. If the data were recorded on the car’s black box, Casuga would have legal access. But no laws or regulatory requirements give car owners the right to access operational information, not even basic safety data, if it’s not on the black box. (And in some states, even the black box information doesn’t belong to the car owner.)


“The car manufacturer knows what happened,” Nelson said. But short of a court order, a carmaker is not bound to release information on semiautonomous driving systems to a car owner, a lawyer, safety researchers or even a police department investigating a fatal crash (though The Times found no evidence that Tesla or other companies are resisting police requests).

Only federal safety regulators have an on-demand right to crash data that a manufacturer collects onboard but does not store on the black box.

A rare public airing of driver-assist technology’s role in traffic crashes will occur Tuesday in a meeting of the National Transportation Safety Board, where two fatal Tesla incidents involving Autopilot will be discussed, including the 2018 Model X crash that killed Apple engineer Walter Huang in Mountain View, Calif. The meeting will be viewable via webcast.


Levine hopes some basic questions will be addressed at the meeting. “How is the technology working? Is it failing? When it’s failing, is it operator’s failure or the technology’s failure?” Levine said. “You can’t always determine all of that from the [black box] event data recorder.”

Tesla did not respond to multiple requests for comment.

At a time when massive data sets can be run through sophisticated computers to reveal trends in safety performance, robot car data are locked in manufacturer silos, so far unavailable to independent safety researchers even in anonymized form. Autonomous driving is a nascent field crowded with competitors backed by billions of dollars jockeying for market leadership, and they jealously guard proprietary technology. But safety experts say an overly aggressive approach to intellectual property is getting in the way of basic safety assessment.

“Data associated with a crash, or even near-crash, of a vehicle operating under automated control should not be deemed proprietary,” said Alain Kornhauser, professor of operations research at Princeton University, a specialist in driverless technology and policy, and founder of the annual Smart Driving Car summit. “Safety must be a cooperative effort, not a ‘I know something you don’t know’ competitive play.”


It’s fairly safe to assume that, properly designed and planned for, fleets of fully automated robot cars, trucks and buses one day will be safer than vehicles driven by humans. But getting to the point where humans feel comfortable sharing the road with robots will require demonstrating how and how well the technology works.

“There are many places where self-driving cars are going to be safer. I have a driver-assisted Subaru and I love it,” said Madeleine Clare Elish, who studies the confluence of humans and automated systems at the research group Data & Society. “But that doesn’t mean it cancels out all the complications, including when the technologies fail.”

Elish’s recently published academic paper, titled “Moral Crumple Zones,” concludes that when blame is assessed in major accidents involving humans and advanced technology, human operators tend to feel the heat even when the technology is poorly designed.


Technology defects are inevitable, and are not limited to hardware and software glitches. Major gaps in logic and sensing still limit the ability of robot cars to handle the baroque complexities of the real world.

After a Tesla Model S on Autopilot drove under a semitrailer in Gainesville, Fla., in 2016, killing the car’s driver, Joshua Brown, the National Highway Traffic Safety Administration concluded Autopilot couldn’t tell the side of a truck from the overcast sky. The agency said the system’s track record wasn’t long enough to conclude Autopilot was defective. Tesla changed the supplier of its sensor system after that crash and created its own. But a similar Autopilot-related crash in Delray Beach, Fla., in 2019 also involved a Tesla that had its top sheared off after running under a truck. The NTSB is investigating that second crash.

Tesla Chief Executive Elon Musk has referred to Autopilot as “beta” software, a term for early versions of programs whose everyday users help identify bugs. Tesla’s manual tells drivers they must keep a hand on the steering wheel at all times, an admonition that Tesla uses as a defense in Autopilot lawsuits (even though Musk himself has sometimes not followed the advice).

Tesla is hardly alone in selling semiautonomous systems. Most manufacturers offer some form of driver-assist system, including Nissan’s ProPilot, Volvo’s Pilot Assist, Subaru’s EyeSight and Cadillac’s Super Cruise. But with its lane-change and automatic-parking features, Tesla’s technology is closer to the cutting edge. (Without sufficient data, it’s unknown whether Autopilot crashes actually occur more often than crashes involving other such systems, or whether Tesla simply receives more news coverage.)

Developers such as Waymo and Zoox chose to skip over semiautonomous systems to focus on full autonomy, to the point where a steering wheel will be unnecessary. Early experiments at Waymo — a subsidiary of Google parent Alphabet — showed human drivers are slow to correct when an automated system makes a mistake.


Musk has long insisted that Teslas operating on Autopilot already are safer than the average vehicle on the highway and safer than Teslas driven with the system switched off. After Sen. Edward J. Markey (D-Mass.) asked Tesla how it prevents drivers from abusing Autopilot, the company said its data monitoring of hundreds of thousands of Teslas on the highways shows cars with Autopilot engaged crash less often than cars with Autopilot disengaged.


But Tesla has not released the data for safety researchers to evaluate. Statisticians at Rand Corp. and elsewhere have questioned the methodology behind those claims, citing problems with sample size, sample representation and other issues. Princeton’s Kornhauser said he offered to do an independent safety evaluation of Tesla’s claims, using anonymized data. Tesla never responded to his invitation.
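One way to see the sample-representation concern is a toy comparison with invented numbers: if assisted miles are driven mostly on highways, where crashes are rarer per mile, a pooled crash rate can flatter the system even when it is no safer on any given road type. A minimal sketch, assuming made-up figures:

```python
# Toy illustration with invented numbers of why a pooled "Autopilot on vs. off"
# crash rate can mislead when the two sets of miles come from different roads.

data = {
    # road type: (miles_on, crashes_on, miles_off, crashes_off)
    "highway": (9_000_000, 10, 1_000_000, 1),
    "city":    (1_000_000, 4, 9_000_000, 27),
}

def per_million(crashes, miles):
    return crashes / miles * 1_000_000

miles_on = sum(v[0] for v in data.values())
crashes_on = sum(v[1] for v in data.values())
miles_off = sum(v[2] for v in data.values())
crashes_off = sum(v[3] for v in data.values())

print("pooled, assist on :", per_million(crashes_on, miles_on))    # 1.4
print("pooled, assist off:", per_million(crashes_off, miles_off))  # 2.8

# Controlling for road type tells the opposite story in this invented example.
for road, (m_on, c_on, m_off, c_off) in data.items():
    print(road, "on:", per_million(c_on, m_on), "off:", per_million(c_off, m_off))
# highway on: ~1.1, off: 1.0; city on: 4.0, off: 3.0
```

In this made-up example the pooled figures favor the assisted miles even though the per-road rates do not, which is the kind of confounding independent researchers would want to check for.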

Just how far behind safety regulators are in data collection can be seen in the current strategic plan posted online by NHTSA, an arm of the Department of Transportation.

NHTSA assembles national traffic fatality data by collecting a sample of local and state-level police crash records and extrapolating that into national estimates. In its strategic plan, called “The Road Ahead,” the agency said it plans “enhancements” to the data collection system, “such as enabling states to transfer crash data electronically” instead of on paper. These are the numbers Musk is comparing his secret Autopilot data against.
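The extrapolation itself is conceptually simple. Here is a minimal sketch of the general idea, with invented counts and weights rather than NHTSA’s actual sampling design:

```python
# Sketch of sample-based extrapolation (invented numbers; not NHTSA's actual
# sampling design). Each sampled jurisdiction's crash count is scaled by a
# weight reflecting how many similar, unsampled jurisdictions it stands in for.

sampled_reports = [
    # (jurisdiction, crashes_in_sample, sampling_weight)
    ("County A", 120, 25.0),
    ("County B",  45, 40.0),
    ("County C", 310, 10.0),
]

national_estimate = sum(count * weight for _, count, weight in sampled_reports)
print(f"Estimated national total: {national_estimate:,.0f} crashes")
```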

Nothing about Autopilot or driver-assist systems from other manufacturers finds its way into the statistical estimates. The reporting forms filled out by police at crash scenes are created, for the most part, at the state level.

In California, police forms include dozens of checkboxes but currently none relate to any kind of automated or semiautomated driving system, though an individual officer can add written notes. That means automated driver-assist technologies such as automatic lane change or adaptive cruise control are not reflected in national statistics.


The California Highway Patrol said officers aren’t currently required to ask about semiautonomous systems, and if they do, the information is included in the narrative portion of the crash report. But the CHP is working on changes to its crash reporting form that will “contain data elements relative to the use of Motor Vehicle Automated Driving Systems. The new data elements will allow officers to document whether a vehicle was equipped with [driver-assist technology] and if so, whether the technology was engaged at the time of the crash, and at what level of automation the vehicle was engaged,” the CHP said in a written statement.
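The fields the CHP describes would amount to a handful of new data elements on the report. A hedged sketch of what they might look like, with hypothetical names, since the actual form revision is still in progress:

```python
# Hypothetical sketch of the data elements the CHP statement describes; field
# names are invented, and the actual form revision is still in progress.
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class AutomationLevel(IntEnum):
    """SAE J3016-style driving automation levels, 0 (none) through 5 (full)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

@dataclass
class AutomationReportFields:
    ads_equipped: bool                           # vehicle equipped with driver-assist tech?
    ads_engaged: Optional[bool]                  # engaged at the time of the crash?
    automation_level: Optional[AutomationLevel]  # level of automation in use
    narrative_notes: str = ""                    # today, free-text notes are all an officer has
```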

How local police departments investigate crashes that could involve semiautonomous systems varies.

Last May, a speeding Tesla slammed into a Ford Fiesta in downtown Berkeley. Four passengers were inside the Ford; a woman in the back seat was killed. In July, a rented Tesla Model 3 slammed into two vacationing pedestrians in San Francisco, a couple celebrating their anniversary. The husband was killed. And in late December, a Tesla Model S ran a red light in Los Angeles near Gardena and struck a 2006 Honda Civic, killing both Honda occupants.

The Times asked all three police departments to discuss how they investigate Tesla crashes in which the question of whether Autopilot was switched on might be relevant, and for specifics on the fatal incidents.

Berkeley Police Department spokesman Byron White said the department determined through witness interviews that Autopilot was not a factor in the crash. Tesla was consulted and was helpful, he added, though he would not say whether the department had asked the company to provide data.


Adam Lobsinger, a spokesman for the San Francisco Police Department, said in a written statement that “preliminary information” indicates that Autopilot was not switched on but added: “Investigators from the San Francisco Police Department’s Traffic Collision Investigation Unit are preparing search warrants and working with Tesla Inc. to obtain documentary evidence. Investigators also removed a data storage device from the Tesla. The information contained in the device will be analyzed to help determine the actions and events that led up to the collision.”

Joshua Rubenstein, spokesman for the LAPD, said by email, “we don’t have that information” and declined to elaborate. NHTSA is also investigating the L.A. crash.


Safety advocates say it’s difficult for police to know whether a car is even equipped with robot driving technology beyond what a driver tells them after a crash. Every major safety advocacy group has recommended that the government require the presence of such technology be encoded in the vehicle identification number issued with each new car. No government action has been taken.

Even black box data are difficult for some departments to access. Each automaker issues its own proprietary access cables and other equipment, which can cost hundreds of dollars.

Both NHTSA and NTSB have access to all crash data when they investigate. NHTSA holds the power to force a recall of unsafe vehicles. The NTSB is a stand-alone agency with no enforcement power, but its investigation-based safety recommendations carry weight, in part because it chooses cases, it says, “that can advance our knowledge of safety issues.” NHTSA is currently probing at least a dozen Autopilot-related crashes. The NTSB has 16 traffic safety investigations open, four of them involving Teslas.

NHTSA has taken a light hand in setting rules for robot car development. Under Presidents Obama and Trump, agency officials have said that not enough is known about what regulations may be needed and that they do not want to stifle innovation in the meantime.


There’s been some movement on the data front. In January, Transportation Secretary Elaine Chao announced that six automakers — General Motors, Fiat Chrysler, Nissan, Honda, Toyota and Mitsubishi — have agreed to share data on their semiautonomous driver-assist systems to see where improvements are needed. Tesla is not involved.
