Tesla only counts crashes with pyrotechnic (airbag) deployments. NHTSA has stated that these account for only ~18% of police-reported crashes on average [1], a figure that can be derived from publicly available datasets. No competent statistician or scientist would miss a literal ~5x undercount that laypeople frequently point to as a source of uncorrected bias and that is easily derivable from well-known public datasets. They make no attempt to account for other, subtler or less easily computable forms of bias before blasting the numbers at the top of their lungs to convince customers to risk their lives.
That is intentional falsification meant to push product and has no place in civil society.
[1] https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf
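The ~5x claim follows directly from the 18% figure in the cited report. A minimal sketch of the arithmetic (the fleet size here is a hypothetical illustration, not Tesla data; only the 18% share comes from the NHTSA report):

```python
# Undercount implied by NHTSA's finding that only 18% of police-reported
# crashes involve airbag (pyrotechnic) deployments.
airbag_share = 0.18  # from NHTSA 2021 FARS / CRSS review, per the report above

# Hypothetical fleet: if there were 1000 police-reportable crashes but only
# those with airbag deployment were counted, the observed count would be:
true_crashes = 1000
observed = true_crashes * airbag_share  # 180 crashes counted, 820 missed

# Correction factor versus a baseline that counts all police-reported crashes:
undercount_factor = 1 / airbag_share  # ~5.6x

print(f"Counted {observed:.0f} of {true_crashes} crashes")
print(f"Crash rate understated by ~{undercount_factor:.1f}x")
```

In other words, a per-mile crash rate computed only from airbag-deployment crashes and compared against an all-crashes baseline is biased low by roughly a factor of 5–6, before any of the subtler biases are even considered.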
there's no such thing as perfect safety
Tesla does not release any verifiable data on FSD / Autopilot, and for a good reason (good for them).
If you have information that Tesla is faking the statistics it gives to the government, you should make your evidence public.
Tesla is pretty much the only car maker that does NOT provide data on autonomous car testing. It's not mandatory.
As for the concerns about data accuracy, they are very much public now. Here is the latest ongoing NHTSA investigation into Tesla:
https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf
Some quotes:
"Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting. Tesla receives telematic data from its vehicles, when appropriate cellular connectivity exists and the antenna is not damaged during a crash, that support both crash notification and aggregation of fleet vehicle mileage. Tesla largely receives data for crashes only with pyrotechnic deployment, which are a minority of police reported crashes. A review of NHTSA’s 2021 FARS and Crash Report Sampling System (CRSS) finds that only 18 percent of police-reported crashes include airbag deployments."
"ODI uses all sources of crash data, including crash telematics data, when identifying crashes that warrant additional follow-up or investigation. ODI’s review uncovered crashes for which Autopilot was engaged that Tesla was not notified of via telematics. Prior to the recall, Tesla vehicles with Autopilot engaged had a pattern of frontal plane crashes that would have been avoidable by attentive drivers, which appropriately resulted in a safety defect finding.
Peer Comparison: Data gathered from peer IR letters helped ODI document the state of the L2 market in the United States, as well as each manufacturer’s approach to the development, design choices, deployment, and improvement of its systems. A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities."