Breaking news

Federal regulators find “serious safety flaw” in Tesla Autopilot linked to hundreds of crashes


Federal authorities say a “serious safety flaw” in Tesla’s Autopilot driver-assistance system has been linked to at least 467 crashes, including 13 resulting in fatalities and “many more” resulting in serious injuries.

The findings come from a National Highway Traffic Safety Administration (NHTSA) analysis of 956 crashes in which Tesla Autopilot was thought to have been in use. The results of the nearly three-year investigation were published Friday.

The NHTSA report said Tesla’s Autopilot design “has led to foreseeable misuse and avoidable crashes,” and that the system did not “sufficiently ensure driver attention and appropriate use.”

The NHTSA document states that Autopilot’s “weak driver engagement system” was not appropriate for the system’s permissive operating capabilities, meaning Autopilot can remain active even when a driver is not paying adequate attention to the road or the driving task. Driver engagement systems include a range of prompts, such as “nags” or chimes that remind the driver to pay attention and keep their hands on the steering wheel, as well as in-cabin cameras that can detect when the driver is not looking at the road.

The agency also said it is opening a new investigation into the effectiveness of a software update Tesla issued as part of a recall in December. That update was intended to fix the Autopilot defects NHTSA identified during this same investigation.

The voluntary recall, carried out via an over-the-air software update, covered 2 million Tesla vehicles in the U.S. and was meant specifically to improve driver monitoring systems in Teslas equipped with Autopilot.

That software update may not have been sufficient, NHTSA suggested in Friday’s report, since additional crashes involving Autopilot continue to be reported.

In the most recent example, a Tesla driver struck and killed a motorcyclist in Snohomish County, Washington, on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the crash.

NHTSA’s findings are the latest in a series of regulatory and watchdog reports questioning the safety of Tesla’s automated driving technology, which the company has touted as a key competitive differentiator from other automakers.

On its website, Tesla says Autopilot is designed to reduce the driver’s “workload” through advanced cruise control and automatic steering technology.

Tesla did not immediately respond to requests for comment on Friday’s NHTSA report, which were sent to the company’s press inbox, its investor relations team and Lars Moravy, the company’s vice president of vehicle engineering.

After the release of the NHTSA report, Sens. Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issued a statement urging federal regulators to require Tesla to restrict its Autopilot feature “to the roads it was designed for.”

On the website hosting its owner’s manuals, Tesla warns drivers not to operate Autopilot’s Autosteer feature “in areas where bicyclists or pedestrians may be present,” among a series of other warnings.

“We urge the agency to take all necessary steps to prevent these vehicles from endangering lives,” the senators said.

Earlier this month, Tesla settled a lawsuit brought by the family of Walter Huang, an Apple engineer and father of two who died in a crash when his Tesla Model X, with Autopilot engaged, struck a highway barrier. Tesla has sought to keep the terms of the settlement out of public view.

Despite these developments, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If someone doesn’t believe that Tesla is going to solve the self-driving problem, I don’t think they should be an investor in the company,” Musk said on Tesla’s earnings call Tuesday. “We will, and we are,” he added.

Musk has promised customers and shareholders for years that Tesla would be able to turn its existing cars into self-driving vehicles through software updates. So far, however, the company offers only driver-assistance systems and has not yet produced a self-driving vehicle.

He has made safety claims about Tesla’s driver-assistance systems without allowing third parties to review the company’s data.

For example, in 2021, Musk claimed in a social media post that Teslas using Autopilot were roughly 10 times less likely to be involved in an accident than the average car.

Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, said he views Tesla’s marketing and claims as “autonowashing.” In response to the NHTSA report, he said he hopes Tesla will take the agency’s concerns seriously and act on them.

“People are dying due to misplaced confidence in Tesla Autopilot’s capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on the map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”

A version of this story was published on NBCNews.com.

