Why do Tesla cars keep crashing into emergency response vehicles? Federal safety agency is investigating



In theory, identifying and avoiding stationary objects marked by hazard cones or flashing lights ought to be among the easiest challenges for any autonomous-driving or driver-assist system.

Yet at least 11 times over the last seven years, cars made by Tesla Inc. and running its software have failed this test, slamming into emergency vehicles that were parked on roads and highways. Now the National Highway Traffic Safety Administration wants to know why.

A federal investigation announced Monday covers Tesla cars built between 2014 and 2021, including Models S, X, 3 and Y. If the probe results in a recall, as many as 765,000 vehicles could be affected.

The 11 crashes at issue resulted in 17 injuries and one death. Three took place in Southern California.

The new investigation indicates that the safety agency, under President Biden and Transportation Secretary Pete Buttigieg, is paying more attention to automated driving safety than it did under the more laissez-faire Trump administration. In June, the NHTSA ordered automobile manufacturers, including Tesla, to forward data on crashes involving automated systems to the agency.

It’s about time, said Alain Kornhauser, director of the self-driving car program at Princeton University. “Teslas are running into stationary objects,” he said. “They shouldn’t be.”

Autopilot is the brand name for Tesla’s partially automated driving system, which combines adaptive cruise control with automatic steering and lane changing. Tesla’s manuals say the driver must pay full attention, but the feature has been marketed as a hands-free system, with Chief Executive Elon Musk appearing on television news shows behind the wheel of various Teslas, hands raised high in the air and a grin on his face.

In a statement emailed to reporters Monday, the NHTSA noted that “no commercially available motor vehicles today are capable of driving themselves … and all State laws hold human drivers responsible for operation of their vehicles.”

But while human drivers indeed can be held legally liable in such crashes, the NHTSA is investigating Autopilot’s ability to identify obstacles in the roadway and its “event detection and response” systems. The technical language obscures some common-sense questions that surely will be explored: Why does Autopilot sometimes fail to see fire trucks and police cars with lights flashing? Why did its automatic emergency braking systems not engage?

As countless YouTube videos attest, Autopilot is easy to cheat. It monitors driver attention by sensing hands on the steering wheel, but weights can be hung on the wheel to defeat the monitoring system. Autopilot has frequently been misused by drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.

Other manufacturers use sophisticated cameras that analyze eye and head movement to assess driver attention. Those systems issue warnings and eventually cut power to the car if the driver fails to keep eyes on the road.

Tesla has disbanded its media relations department. A request for comment directed to Musk’s Twitter account drew no response.

The emergency vehicle incidents under NHTSA investigation include a 2018 crash into the back of a fire truck stopped on the 405 in Culver City; a 2018 crash into a parked police cruiser in Laguna Beach; and a crash last month into an unoccupied California Highway Patrol car in San Diego.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes, has recommended that the NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that the NHTSA require Tesla to have a better system to make sure drivers pay attention.

The NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies such as the NHTSA.

Tesla is also under review by the California Department of Motor Vehicles for its marketing of “Full Self-Driving” technology. That’s a significant enhancement to Autopilot that extends automated driving to city streets, with the claimed ability to handle traffic signals and make turns at intersections. The feature costs $10,000, which includes future enhancements, but Tesla itself has noted that Full Self-Driving does not make the car self-driving. DMV regulations bar auto manufacturers from making false claims about automated driving capabilities.

The DMV launched its review in mid-May. Asked how long the review might take, a DMV spokeswoman said via email, “The review is ongoing. The DMV does not comment on an ongoing review.”

Tesla has issued Full Self-Driving in various “beta” versions to select customers, who test the system on public roads. Other companies deploying experimental autonomous car technology on public roads use trained test drivers and report their safety progress to the state. Asked why Tesla was not required to do the same with its Full Self-Driving beta, the DMV noted that the feature is not actually full self-driving and so does not fall under those requirements. Tesla does tell Full Self-Driving beta testers that they must pay full attention.

The Times has repeatedly asked to interview DMV officials to clarify the agency’s stance; those requests have been declined.

The Associated Press contributed to this report.




