
Tesla's Autopilot is promoted at a showroom in Zurich, Switzerland, on March 28, 2018. Arnd Wiegmann/Reuters

A U.S. senator on Friday urged Tesla Inc to rebrand its driver assistance system Autopilot, saying it has “an inherently misleading name” and is subject to potentially dangerous misuse.

But Tesla said in a letter that it had taken steps to ensure driver engagement with the system and enhance its safety features.

The electric automaker introduced new warnings for red lights and stop signs last year “to minimize the potential risk of red light- or stop sign-running as a result of temporary driver inattention,” Tesla said in the letter.

Senator Edward Markey said he believed the potential dangers of Autopilot can be overcome. But he called for “rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel.”

Markey’s comments came in a press release, with a copy of a Dec. 20 letter from Tesla addressing some of the Democratic senator’s concerns attached.

Autopilot has been engaged in at least three Tesla vehicles involved in fatal U.S. crashes since 2016.

Crashes involving Autopilot have raised questions about the driver-assistance system’s ability to detect hazards, especially stationary objects.

There are mounting safety concerns globally about systems that can perform driving tasks for extended stretches of time with little or no human intervention, but which cannot completely replace human drivers.

Markey cited videos of Tesla drivers who appeared to fall asleep behind the wheel while using Autopilot, and others in which drivers said they could defeat safeguards by sticking a banana or water bottle in the steering wheel to make it appear they were in control of the vehicle.

Tesla, in its letter, said its revisions to steering wheel monitoring meant that in most situations “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver.”

It added that devices “marketed to trick Autopilot, may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages.”

Tesla also wrote that while videos like those cited by Markey showed “a few bad actors who are grossly abusing Autopilot” they represented only “a very small percentage of our customer base.”

Earlier this month, the U.S. National Highway Traffic Safety Administration (NHTSA) said it was launching an investigation into a 14th crash involving a Tesla vehicle in which it suspects Autopilot or another advanced driver assistance system was in use.

NHTSA is probing a Dec. 29 fatal crash of a Model S Tesla in Gardena, California. In that incident, the vehicle exited the 91 Freeway, ran a red light and struck a 2006 Honda Civic, killing its two occupants.

The National Transportation Safety Board will hold a Feb. 25 hearing to determine the probable cause of a 2018 fatal Tesla Autopilot crash in Mountain View, California.
