Tesla stopped reporting its Autopilot safety numbers online. Why?

Like clockwork, Tesla reported Autopilot safety statistics once every quarter starting in 2018. Last year, those reports stopped.

At the same time, the National Highway Traffic Safety Administration, the nation’s top auto safety regulator, began requiring crash reports from automakers that sell so-called advanced driver assistance systems such as Autopilot. It started publishing these numbers in June. And those numbers don’t look good for Autopilot.

Tesla won’t say why it stopped reporting its safety statistics, which measure crash rates per mile driven. The company does not have a media relations department. A tweet sent to Tesla Chief Executive Elon Musk inviting comment did not receive a response.

However, Tesla’s critics are happy to talk about the situation. Taylor Ogan, chief executive of fund management firm Snow Bull Capital, held a Twitter Spaces event on Thursday to offer his own interpretation of Tesla’s safety numbers. He thinks he knows why the company stopped reporting its safety record: “Because it got a lot worse.”

Also Thursday, NHTSA announced it had added two more crashes to the dozens of Tesla self-driving incidents it is already investigating. One involved eight vehicles, including a Tesla Model S, on the San Francisco Bay Bridge on Thanksgiving Day. Through Friday’s close, Tesla shares had lost 65% of their value this year.

Ogan, using NHTSA accident numbers, previous Tesla reports, sales figures and other records, concluded that the number of Tesla accidents reported on US roads has grown much faster than Tesla sales. Average monthly growth in new Teslas since NHTSA issued its standing order was 6 percent, he estimates, while the comparable crash figures rose 21 percent.

Tesla’s Autopilot accident numbers are much higher than those for similar driver-assistance systems from General Motors and Ford. Tesla reported 516 accidents from July 2021 to November 2022, while Ford reported seven and GM two.

Of course, Tesla has many more vehicles equipped with driver-assistance systems than the competition: about a million, Ogan said, roughly 10 times more than Ford. All the same, at Ford’s per-vehicle rate Tesla would have reported a total of about 70 NHTSA-reported accidents since last summer. Instead, Tesla reported 516.
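As a rough illustration of that back-of-the-envelope math, using only the figures cited in this article (the 10x fleet-size ratio is Ogan’s estimate, not an official count), the normalization works out like this:

```python
# Back-of-the-envelope normalization of NHTSA-reported ADAS crash counts
# by fleet size, using the figures cited in this article. The ~10x
# fleet-size ratio is Ogan's estimate, not an official count.

ford_crashes = 7        # Ford crashes reported to NHTSA, July 2021 - Nov. 2022
tesla_crashes = 516     # Tesla crashes reported over the same window
fleet_ratio = 10        # Tesla's ADAS fleet is roughly 10x Ford's, per Ogan

# Crash count Tesla would show if its per-vehicle rate matched Ford's
expected_at_ford_rate = ford_crashes * fleet_ratio   # 70

print(f"Expected at Ford's rate: {expected_at_ford_rate}")
print(f"Actually reported:       {tesla_crashes}")
print(f"Ratio: {tesla_crashes / expected_at_ford_rate:.1f}x")   # ~7.4x
```

Even after granting the larger fleet, the reported count is roughly seven times what Ford’s per-vehicle rate would predict.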

Tesla’s quarterly safety reports were always controversial. They put Tesla Autopilot in a good light: During the fourth quarter of 2021, Tesla reported one accident for every 4.31 million miles driven in Autopilot-equipped cars. The company compared that to government statistics that show one accident for every 484,000 miles driven on the nation’s roads, for all vehicles and all drivers.
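Taken at face value, that comparison implies Autopilot-engaged driving crashes roughly nine times less often per mile. A minimal sketch of the arithmetic, using the Q4 2021 figures above:

```python
# How Tesla's quarterly comparison reads at face value, using the Q4 2021
# figures cited above. Crash rate = 1 / (miles per crash).

autopilot_miles_per_crash = 4_310_000   # Tesla-reported, Autopilot-equipped cars
us_fleet_miles_per_crash = 484_000      # government figure, all vehicles and drivers

MILLION = 1_000_000
autopilot_rate = MILLION / autopilot_miles_per_crash   # ~0.23 crashes per million miles
us_fleet_rate = MILLION / us_fleet_miles_per_crash     # ~2.07 crashes per million miles

print(f"Autopilot:  {autopilot_rate:.2f} crashes per million miles")
print(f"US average: {us_fleet_rate:.2f} crashes per million miles")
print(f"Implied advantage: {us_fleet_rate / autopilot_rate:.1f}x")   # ~8.9x
```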

But statisticians have pointed to serious analytical flaws, including the fact that Tesla’s figures cover newer cars driven largely on highways, while the general government statistics include cars of all ages driven on highways, country roads and neighborhood streets. In other words, it is an apples-to-oranges comparison.

None of the statistics, neither Tesla’s nor the government’s, separates Autopilot from the company’s controversial Full Self-Driving feature. FSD is a $15,000 option that’s more aspirational than its name suggests: no car sold today is fully autonomous, including those with FSD.

Autopilot combines adaptive cruise control with lane keeping and lane change systems on highways. FSD is marketed as advanced AI technology that can navigate neighborhood streets, stop and go at traffic lights, turn on busy streets, and generally behave as if the car were driving itself. The fine print, however, makes it clear that the human driver must be in full control and is legally responsible for accidents, including those involving injury and death.

The internet is full of videos of FSD behaving badly: turning into oncoming traffic, mistaking railroad tracks for roads, running red lights, and more.

The number of injuries and deaths related to Autopilot and FSD is unknown, except perhaps inside Tesla. Publicly available safety statistics on autonomous and semi-autonomous vehicles are scarce. Meanwhile, the accident reporting system in the US is rooted in 1960s methodology, and there appears to be no serious attempt to update it for the digital world, either at NHTSA or elsewhere.

NHTSA’s standing order on driver-assistance crash reporting, issued in 2021, relies on the auto companies themselves for accurate and complete reporting. (Musk has misrepresented FSD’s safety record in the past, including a claim that the technology had not been involved in any accidents, when the public record made clear that it had.)

Not all information submitted to NHTSA is available for public scrutiny.

Ogan, who drives an FSD-equipped Tesla, said releasing more of that information would bring far greater transparency to the safety of robot cars, at Tesla and other automakers. Tesla once reported its Autopilot utilization rate, but no longer does. “I’m looking at the only data available,” he said.

The California Department of Motor Vehicles has been investigating whether Tesla is violating its rule against marketing vehicles as fully autonomous when they are not. Musk has clearly stated that the company plans to develop FSD into a fully autonomous robotaxi that Tesla owners could rent out for extra money. He had promised a million of them on the road by 2020, but that date came and went, and no fully autonomous Tesla exists. The DMV declined to comment.

FSD’s safety and capabilities are, by Musk’s own admission, existential concerns, especially as Tesla’s stock continues to sink. In a June interview with Axios, he said that “solving” FSD is “really the difference between Tesla being worth a lot of money and being worth basically zero.”
