Tesla received a special order from federal auto safety regulators, demanding that the company provide troves of data about its driver assistance and driver monitoring systems, as well as a once-secret configuration known as “Elon mode.”
Typically, when a driver engages the company’s driver assistance systems, sold as Autopilot, Full Self-Driving, or FSD Beta options, a visual symbol flashes on the car’s touchscreen, prompting the driver to apply slight force to the steering wheel. If the driver leaves the wheel unattended for too long, the “nag” escalates to a beeping sound. If the driver still does not respond at that point, the vehicle can disable its advanced driver assistance features for the rest of the drive, or longer.
As CNBC previously reported, with the “Elon Mode” configuration enabled, Tesla can allow drivers to use the company’s Autopilot, FSD, or FSD Beta systems without the so-called “nag.”
NHTSA sent Tesla a letter and special order on July 26, requesting details about the use of this special configuration, including how many cars and drivers Tesla has authorized to use it. The filing was added to the agency’s website on Tuesday and was first reported by Bloomberg.
In the letter and special order, the agency’s acting chief counsel, John Donaldson, wrote:
“NHTSA is concerned about the safety implications of recent changes to Tesla’s driver monitoring system. This concern is based on available information suggesting that it may be possible for vehicle owners to change Autopilot’s driver monitoring configuration to allow the driver to operate the vehicle in Autopilot for extended periods without Autopilot prompting the driver to apply torque to the steering wheel.”
Tesla was given a deadline of August 25 to provide all the information the agency demanded. The company responded on time, but it requested, and was granted, confidential treatment of its response by NHTSA. The company did not immediately respond to CNBC’s request for comment.
After the order was made public, Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, told CNBC: “NHTSA seems to take a dim view of cheat codes that would allow disabling safety features like driver monitoring. I agree. Hidden features that reduce safety have no place in production software.”
Koopman also noted that NHTSA has yet to complete its investigations into a series of crashes in which Tesla’s Autopilot systems may have been a contributing factor, including a string of “fatal truck under-run crashes” and collisions in which Tesla vehicles struck stationary emergency vehicles. Ann Carlson, the acting administrator of NHTSA, said in recent press interviews that those investigations are close to a conclusion.
For years, Tesla has told regulators, including NHTSA and the California DMV, that its driver assistance systems, including FSD Beta, are only “Level 2” and do not make its cars self-driving, even though it markets them under brand names that can confuse the matter. Tesla CEO Elon Musk, who also owns and runs the social network X (formerly known as Twitter), has often implied that Tesla vehicles are self-driving.
Over the weekend, Musk live-streamed a test drive of a Tesla equipped with a version of the FSD software (v12) that the company is still developing. During the demonstration, Musk used a mobile device he held while driving to livestream and chat with his passenger, Ashok Elluswamy, Tesla’s head of software engineering for Autopilot.
The blurry video stream did not show the full details of the touchscreen, nor did it demonstrate that Musk had his hands on the steering yoke, ready to take over the driving task at any moment. At times, his hands were clearly off the yoke.
Greg Lindsay, an urban tech researcher at Cornell University, said Musk’s use of the system may have violated Tesla’s own terms of use for Autopilot, FSD, and FSD Beta. He told CNBC that the entire drive was like “waving a red flag in front of NHTSA.”
In a section of Tesla’s website titled “Using Autopilot, Enhanced Autopilot and Full Self-Driving Capability,” the company warns drivers that “it is your responsibility to remain alert, keep your hands on the wheel and maintain control of your vehicle at all times.”
Bruno Bowden, managing partner at Grep VC, a machine learning expert and an investor in self-driving car startup Wayve, said the demo showed Tesla is making some improvements to its technology, but its system is still a long way from delivering safe self-driving.
He observed that, during the drive, the Tesla system nearly ran a red light, forcing Musk to intervene and brake in time to avoid any danger.