
US agency says Tesla’s public statements imply that its vehicles can drive themselves. They can’t


DETROIT — The U.S. government’s highway safety agency says Tesla is telling drivers in public statements that its vehicles can drive themselves, conflicting with owners manuals and briefings with the agency saying the electric vehicles require human supervision.

The National Highway Traffic Safety Administration is asking the company to “revisit its communications” to make sure its messages are consistent with user instructions.

The request came in a May email to the company from Gregory Magno, a division chief with the agency’s Office of Defects Investigation. It was attached to a letter seeking information on a probe into crashes involving Tesla’s “Full Self-Driving” system in low-visibility conditions. The letter was posted Friday on the agency’s website.

The agency began the investigation in October after getting reports of four crashes involving “Full Self-Driving” when Teslas encountered sun glare, fog and airborne dust. An Arizona pedestrian was killed in one of the crashes.

Critics, including Transportation Secretary Pete Buttigieg, have long accused Tesla of using deceptive names for its partially automated driving systems, including “Full Self-Driving” and “Autopilot,” both of which have been viewed by owners as fully autonomous.

The letter and email raise further questions about whether Full Self-Driving will be ready for use without human drivers on public roads, as Tesla CEO Elon Musk has predicted. Much of Tesla’s stock valuation hinges on the company deploying a fleet of autonomous robotaxis.

Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.

A message was sent Friday seeking comment from Tesla.

In the email, Magno writes that Tesla briefed the agency in April on an offer of a free trial of “Full Self-Driving” and emphasized that the owner’s manual, user interface and a YouTube video tell drivers that they have to remain vigilant and in full control of their vehicles.

But Magno cited seven posts or reposts by Tesla’s account on X, the social media platform owned by Musk, that Magno said indicated that Full Self-Driving is capable of driving itself.

“Tesla’s X account has reposted or endorsed postings that exhibit disengaged driver behavior,” Magno wrote. “We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task.”

The postings may encourage drivers to view Full Self-Driving, which now has the word “supervised” next to it in Tesla materials, as a “chauffeur or robotaxi rather than a partial automation/driver assist system that requires persistent attention and intermittent intervention by the driver,” Magno wrote.

On April 11, for instance, Tesla reposted a story about a man who used Full Self-Driving to travel 13 miles (21 kilometers) from his home to an emergency room during a heart attack just after the free trial began on April 1. A version of Full Self-Driving helped the owner “get to the hospital when he needed immediate medical attention,” the post said.

In addition, Tesla says on its website that use of Full Self-Driving and Autopilot without human supervision depends on “achieving reliability” and regulatory approval, Magno wrote. But the statement is accompanied by a video of a man driving on local roads with his hands on his knees, with a statement that, “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself,” the email said.

In the letter seeking information on driving in low-visibility conditions, Magno wrote that the investigation will focus on the system’s ability to perform in low-visibility conditions caused by “relatively common traffic occurrences.”

Drivers, he wrote, may not be told by the car that they should decide where Full Self-Driving can safely operate or fully understand the capabilities of the system.

“This investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded,” Magno wrote.

The letter asks Tesla to describe all visual or audio warnings that drivers get that the system “is unable to detect and respond to any reduced visibility condition.”

The agency gave Tesla until Dec. 18 to respond to the letter, but the company can ask for an extension.

That means the investigation is unlikely to be finished by the time President-elect Donald Trump takes office in January, and Trump has said he would put Musk in charge of a government efficiency commission to audit agencies and eliminate fraud. Musk spent at least $119 million in a campaign to get Trump elected, and Trump has spoken against government regulations.

Auto safety advocates fear that if Musk gains some control over NHTSA, the Full Self-Driving and other investigations into Tesla could be derailed.

Musk even floated the idea of helping to develop national safety standards for self-driving vehicles.

“Of course the fox wants to build the henhouse,” said Michael Brooks, executive director of the Center for Auto Safety, a nonprofit watchdog group.

He added that he can’t think of anyone who would agree that a business mogul should have direct involvement in regulations that affect the mogul’s companies.

“That’s a huge problem for democracy, really,” Brooks said.


