Tesla ordered by auto regulators to provide data on 'Elon mode' Autopilot configuration

Tesla received a special order from federal auto safety regulators requiring it to provide detailed information about its driver assistance and driver monitoring systems, including a previously secret configuration known as "Elon Mode."

Typically, when a Tesla driver uses the company's driver assistance systems, which are marketed as Autopilot, Full Self-Driving or FSD Beta options, a visual symbol blinks on the car's touchscreen to prompt the driver to engage the steering wheel. If the driver does not take the wheel within a certain time frame, the "nag" escalates to a beeping sound. If the driver still does not take control, the vehicle can disable its advanced driver assistance features for the remainder of the trip or even longer.

On July 26, the National Highway Traffic Safety Administration (NHTSA) sent Tesla a letter and special order asking for details about the use of this special configuration, including how many cars Tesla had authorized to use it. In the letter and order, acting chief counsel John Donaldson stated:

NHTSA has concerns about the safety impact of recent changes made to Tesla's Driver Monitoring System. The concern stems from information that suggests it is possible for owners of Tesla vehicles to alter Autopilot's configurations for driver monitoring to enable them to drive the vehicle for longer periods in Autopilot without Autopilot prompting them to apply torque on the steering wheel.

Tesla was given an August 25 deadline to provide all of the information the agency requested. The company responded on time but asked NHTSA to treat its response as confidential. Tesla did not immediately respond to CNBC's request for comment.

Automotive safety researcher and Carnegie Mellon University associate professor of computer engineering Philip Koopman told CNBC after the order was made public: "It seems that NHTSA takes a dim view of cheat codes that permit disabling safety features such as driver monitoring. I agree. Production software should not include hidden features that compromise safety."

In recent press interviews, NHTSA's acting administrator, Ann Carlson, has suggested that the agency is close to a decision.

Tesla has been telling regulators, including the California DMV and NHTSA, that its driver assistance systems, including FSD Beta, are only "level 2" and do not make its cars autonomous, even though the company markets them under names that may confuse the issue. Tesla CEO Elon Musk, who also owns and runs the social network X, formerly Twitter, often implies that Tesla vehicles are self-driving.

Over the weekend, Musk livestreamed a test drive on X in a Tesla equipped with a still-in-development version of the company's FSD software (v12). He used a mobile phone to stream the demo while driving and chatting with Ashok Elluswamy, Tesla's director of Autopilot software, and had his hands off the steering yoke at times. According to Musk, the whole drive was "like waving a flag in front of NHTSA." In the "Using Autopilot" section of its website, Tesla warns drivers to "stay alert, maintain control and keep your hands on your steering wheel."

Grep VC managing partner Bruno Bowden, a machine learning expert and investor in autonomous vehicle startup Wayve, said the demo showed that Tesla is making some improvements to its technology but still has a long way to go before it can offer a safe, self-driving system.

During the drive, Bowden observed, the Tesla system nearly blew through a red light, requiring an intervention by Musk, who braked in time to avoid any danger.