Last week, Tesla revealed a new section of its website dedicated to reporting safety data for its advanced driver assistance systems, Autopilot and Full Self-Driving (FSD).
The new hub appears to be an effort to move beyond the company’s traditional quarterly safety reports, which have been criticized for ignoring basic facts about traffic statistics, and toward something more verifiable and reliable. And given that Tesla’s future depends on people trusting its self-driving technology, the stakes couldn’t be higher.
But safety experts say the updated report is too little, too late.
“Yes, on the surface it looks like FSD is performing quite well,” said Noah Goodall, a civil engineer who has published several peer-reviewed studies about Tesla Autopilot. “But I have little confidence in these figures because of Tesla’s past deceptions.”
Tesla owners have driven 6.47 billion miles on FSD – and counting. The site literally has a miles counter that is always ticking upward. “Full Self-Driving (Supervised) keeps you safe,” the site reads, noting that Tesla owners using FSD drive approximately 5.1 million miles before a major collision and about 1.5 million miles before a minor collision. That’s far better than the average American driver, who travels 699,000 miles before a major collision and 229,000 miles before a minor collision.
One of the common criticisms of Tesla’s quarterly safety reports was that they focused almost exclusively on Autopilot, a less capable driver assistance feature that is used primarily on highways, rather than FSD, which can be used on local roads. The reports did not account for the fact that crashes are more common on city streets and undivided roads than on the highways where Autopilot is most often engaged.
The new safety center finally separates highway miles from non-highway miles, which Carnegie Mellon University autonomous vehicle expert Philip Koopman calls a “good start.” But on his Substack, Koopman highlights several details that he says undermine Tesla’s claim that drivers who use FSD are safer than those who don’t.
For example, the company’s claim that “a brand new Tesla loaded with safety technology is safer than a used car without that technology,” he says, is akin to claiming that a particular high school produces the best athletes because its students can run faster than the average American citizen, including disabled people and those living in nursing homes.
Koopman also noted that the safety report did not include any information about people injured or killed in crashes involving Autopilot or FSD. Tesla claims that injury reports “are inconsistently provided by drivers through voluntary reporting or otherwise not accessible to Tesla due to health-related privacy laws.” But Koopman says Tesla probably has a good idea of the number of deaths in FSD-related crashes, if only by counting the incoming lawsuits.
“Tesla has released a document full of marketing puffery, and not any serious safety analysis,” Koopman concluded.
Waymo, which operates robotaxis in five US cities, regularly publishes safety data on its own online hub. But that company also publishes peer-reviewed studies to support its claims that its fully driverless vehicles outperform human drivers. This level of independent verification is completely absent from Tesla’s reports. In fact, Goodall said he has trouble publishing studies based on Tesla’s numbers because reviewers assume the figures are unreliable.
“None of this data is independently verified, so I’m forced to trust Tesla here,” Goodall told The Verge. “But it’s very difficult given their history of deceptive practices when it comes to safety data.”