Tesla reportedly asked highway safety officials to redact information about whether driver-assistance software was in use during crashes

Tesla's Autopilot system has come under federal scrutiny in an ongoing NHTSA investigation covering hundreds of thousands of the company's vehicles.
  • Tesla asked a federal agency to redact information about whether driver-assistance software was in use during crashes, The New Yorker reported.
  • Tesla has faced numerous controversies over Autopilot and Full Self-Driving in recent years.
  • The NHTSA launched a probe into Autopilot in 2021 that covered hundreds of thousands of Teslas.

Tesla directed the National Highway Traffic Safety Administration to redact information about whether driver-assistance software was in use by vehicles involved in crashes, The New Yorker reported as part of an investigation into Elon Musk's relationship with the US government.

"Tesla requested redaction of fields of the crash report based on a claim that those fields contained confidential business information," an NHTSA spokesperson told Insider in a statement. "The Vehicle Safety Act explicitly restricts NHTSA's ability to release what the companies label as confidential information. Once any company claims confidentiality, NHTSA is legally obligated to treat it as confidential unless/until NHTSA goes through a legal process to deny the claim."  

Tesla did not immediately respond to Insider's request for comment, nor did it respond to The New Yorker's request for comment.

While Musk has promised for years that fully self-driving Teslas are just around the corner, the company's Autopilot and Full Self-Driving systems have weathered a number of controversies.

Autopilot, which is designed chiefly to assist on highways, is the driver-assistance software built into all Teslas, while Full Self-Driving is a beta add-on that costs $15,000. Full Self-Driving is more advanced, allowing cars to change lanes, recognize stop signs and traffic lights, and park, Tesla says. The company adds that "a fully attentive driver" should remain behind the wheel whenever either feature is in use.

In June, The Washington Post reported that there had been 736 crashes and 17 deaths involving Teslas in Autopilot mode since 2019.

Steven Cliff, a former deputy administrator of the NHTSA, told The New Yorker he'd seen data indicating Tesla vehicles were involved in "a disproportionate number of crashes involving emergency vehicles," but that the agency hadn't yet determined whether human drivers or Tesla's software were to blame.

The NHTSA announced in June 2021 that it was investigating the role of Tesla's Autopilot in 30 crashes that killed 10 people between 2016 and 2021.

Two months later, the agency announced another investigation into the feature after identifying 11 crashes since 2018 in which Teslas hit vehicles at first-responder sites. Seven of the 11 crashes identified resulted in injuries, and one resulted in a death.

The NHTSA said at the time that its probe would cover Tesla Model X, Model Y, Model S, and Model 3 vehicles made between 2014 and 2021, or roughly 765,000 cars. In June 2022, the agency upgraded the probe, saying it would now examine data from 830,000 Tesla vehicles.

A Department of Justice criminal investigation has also been underway, with Tesla confirming in February that the DOJ requested documents about the Autopilot and Full Self-Driving features.

A spokesperson for the NHTSA told The New Yorker that "multiple investigations remain open."


