US investigation of Tesla's "Autopilot" system expanded

The US Department of Transportation has expanded its investigation into Tesla's Autopilot driver assistance system after a series of rear-end collisions. Since the investigation began in August, the agency has identified six further incidents in which Teslas with the Autopilot system switched on crashed into emergency vehicles parked at the side of the road. The investigation originally covered eleven such accidents; the most recent crash happened in January.

The investigation is now being expanded, among other things, to include the evaluation of additional data, the traffic authority NHTSA announced in a document published on Thursday (local time). The agency is also looking at more than 100 Autopilot accidents in which no emergency vehicles were involved, and will examine to what extent the electric car manufacturer's system increases the risk of human error. The NHTSA sees signs that in around 50 of the accidents investigated, the drivers reacted inadequately to the traffic situation.

“Autopilot” is just an assistance system

Tesla itself points out to customers that Autopilot is only an assistance system, and that the person in the driver's seat must therefore keep their hands on the steering wheel at all times and be ready to take control. Nevertheless, drivers repeatedly rely completely on the Autopilot system. Tesla tightened its safety measures a few years ago: the software notices when the driver's hands are not on the wheel and emits warning tones after a short time. According to the NHTSA, the current Autopilot investigation covers an estimated 830,000 vehicles across all four current model series from the model years 2014 to 2022.

The NHTSA had already examined the Autopilot system after a fatal accident in 2016. At the time, a driver died when his Tesla crashed under the trailer of a semi truck crossing the road. The NHTSA concluded that the system had worked correctly within its capabilities, but that the human driver had over-relied on it: the Autopilot system had not recognized the trailer with its white side panel and had not braked, and the driver did not respond either.

Musk: Autopilot makes driving safer

In the current investigation, the NHTSA pointed out that in all of the rear-end collisions, the fire and ambulance vehicles were clearly identifiable, among other things because their emergency lights were switched on. Tesla released a software update in September last year intended to help Autopilot recognize vehicles with distinctive flashing lights even in difficult lighting conditions. The NHTSA subsequently questioned why the update was not declared a recall.

Tesla boss Elon Musk has always emphasized that Autopilot makes driving safer and helps to avoid accidents. For several months, the company has been letting selected beta testers try out the next version of the software, which adds more functions for city traffic. Many videos circulating on the web show the software making mistakes, and the NHTSA has already requested information about the testing on public roads.

Since February, the NHTSA has also been investigating Tesla over reports of sudden braking, triggered by 354 complaints within nine months that the Autopilot system suddenly and unexpectedly activated the brakes. The authority has also requested information from other car manufacturers about their assistance systems.

