
Tesla to expand self-driving software test — but only to drivers it deems worthy

A dashboard in a Tesla Model S P90D. The company is expanding a test of its Full Self-Driving software, but only to drivers it deems qualified. (Chris Walker / TNS)

The wait is almost over for some Tesla Inc. customers to get access to driver-assistance technology the company has marketed in controversial ways — as long as they’re on their best behavior.

Chief Executive Elon Musk has said that on Saturday, the electric-car maker will roll out an updated version of its Full Self-Driving beta software, which until now has been available to only about 2,000 people.

Those with access to this ever-updating software — a mix of Tesla employees and fervent Musk fans — have spent almost a year honing a system that the company has charged customers as much as $10,000 to use at some point in the future. Tesla says the system, often referred to as FSD, is designed to someday handle both short- and long-distance trips without driver intervention.


It’s unclear how broad the wider release will be because of a curveball Musk threw this month. The CEO tweeted that the download button customers were set to begin seeing Friday will request car owners’ permission for Tesla to assess their driving behavior for seven days. If the company deems the behavior good, it will grant access to FSD beta.

The expanded access and the surprise condition mark the latest twist involving FSD and Autopilot, the driver-assistance system that has divided Tesla watchers for years. Musk’s fostering of the perception that Tesla is a self-driving leader has helped make it the world’s most valuable automaker by far. But others have taken issue with what they see as a reckless and misleading approach to deploying technology that isn’t ready. The U.S. National Highway Traffic Safety Administration recently opened its second defect investigation into Autopilot since 2016.


“This is another example of Tesla marching to its own drum. It’s like, damn the torpedoes, full speed ahead,” Gene Munster, a co-founder of investing firm Loup Ventures, said by phone. “Setting aside some of the regulatory concerns and pushback, Tesla is determined to move forward on its own agenda.”


NHTSA started investigating Autopilot in August after almost a dozen collisions involving first-responder vehicles. The regulator, which has the authority to deem cars defective and order recalls, is assessing the technologies and methods Tesla uses to monitor, assist and enforce drivers’ engagement when using Autopilot. It’s also looking into the system’s detection of objects and events on the road, and how it responds.

Musk first announced his plan to sell FSD in October 2016, a few months after he told a tech conference he considered autonomous driving to be “basically a solved problem.”

In April 2019, he predicted that roughly a year later, Tesla’s technology would advance to the point that drivers wouldn’t need to pay attention.


In March of this year, however, Musk announced Tesla had revoked FSD beta from drivers who didn’t pay enough attention to the road.

The new head of the other U.S. agency that investigates auto crashes, the National Transportation Safety Board, has taken umbrage at this sort of mixed messaging.

“Whether it’s Tesla or anyone else, it is incumbent on these manufacturers to be honest in what their technology does and does not do,” Jennifer Homendy told Bloomberg News in her first interview after she was sworn in last month.

Homendy has since called Tesla’s use of the term Full Self-Driving “misleading and irresponsible” and expressed concern to the Wall Street Journal about FSD’s readiness to be used by more drivers on public roads.

“For investors, it’s terrifying,” said Taylor Ogan, CEO of Boston hedge fund Snow Bull Capital, who has closely watched videos of FSD beta testers at times demonstrating the software’s shortcomings. “It’s like the CEO of a drug company broadening the test pool of the experimental drug that the FDA is investigating for potentially hurting people.”

Tara Goddard, an urban planning professor at Texas A&M University who’s researching how auto safety tech and automation are being marketed to consumers, questions whether Tesla’s seven-day evaluation of drivers’ behavior goes far enough to weed out unsafe users.


She pointed to a Tesla enthusiast’s recent blog post giving car owners pointers on how the company is likely to judge their driving.

“People are already saying, ‘Here’s how you game the system to make sure you can opt in and use it how you want,’” Goddard said. “I just worry that we’re going to see an uptick in this being used in places where it’s really not ready to be used — and not by professional drivers.”
