Some Tesla engineers were aghast, said former employees with knowledge of his reaction, speaking on the condition of anonymity for fear of retribution. They contacted a trusted former executive for advice on how to talk Musk out of it, in previously unreported pushback. Without radar, Teslas would be susceptible to basic perception errors if the cameras were obscured by raindrops or even bright sunlight, problems that could lead to crashes.
Musk was unconvinced and overruled his engineers. In May 2021 Tesla announced it was eliminating radar on new cars. Soon after, the company began disabling radar in cars already on the road. The result, according to interviews with nearly a dozen former employees and test drivers, safety officials and other experts, was an uptick in crashes, near misses and other embarrassing mistakes by Tesla vehicles suddenly deprived of a critical sensor.
Musk has described the Tesla “Full Self-Driving” technology as “the difference between Tesla being worth a lot of money and being worth basically zero,” but his dream of autonomous cars is hitting roadblocks.
In recent weeks, Tesla has recalled and suspended the rollout of the technology to eligible vehicles amid concerns that its cars could disobey the speed limit and blow through stop signs, according to federal officials. Customer complaints have been piling up, including a lawsuit filed in federal court last month claiming that Musk has overstated the technology’s capabilities. And regulators and government officials are scrutinizing Tesla’s system and its past claims as evidence of safety problems mounts, according to company filings.
In interviews, former Tesla employees who worked on Tesla’s driver-assistance software attributed the company’s troubles to the rapid pace of development, cost-cutting measures like Musk’s decision to eliminate radar — which strayed from industry practice — and other problems unique to Tesla.
They said Musk’s erratic leadership style also played a role, forcing them to work at a breakneck pace to develop the technology and to push it out to the public before it was ready. Some said they are worried that, even today, the software is not safe to be used on public roads. Most spoke on the condition of anonymity for fear of retribution.
“The system was only progressing very slowly internally” but “the public wanted a product in their hands,” said John Bernal, a former Tesla test operator who worked in its Autopilot department. He was fired in February 2022 when the company alleged improper use of the technology after he had posted videos of Full Self-Driving in action.
“Elon keeps tweeting, ‘Oh we’re almost there, we’re almost there,’” Bernal said. But “internally, we’re nowhere close, so now we have to work harder and harder and harder.” The team has also bled members in recent months, including senior executives.
Meanwhile, Musk pulled dozens of Tesla engineers to work with code at Twitter, the struggling social media platform Musk purchased with fanfare last fall, according to people familiar with the matter, who spoke on the condition of anonymity for fear of retribution, and documents reviewed by The Washington Post. Earlier this month, after Tesla failed to announce a big new product on investor day, the company’s stock sank 6 percent.
Musk has defended the company’s actions as long-term bets, with the prospect of unlocking tremendous value, and Tesla has said vehicles in Full Self-Driving mode crash at one-fifth the rate of vehicles being driven normally. Musk and Tesla did not respond to repeated requests for comment.
But the story of Full Self-Driving offers a vivid example of how the world’s richest person has complicated one of his biggest bets through rash decision-making, a stubborn insistence on doing things differently, and unyielding confidence in a vision that has yet to be proven.
“No one believed me that working for Elon was the way it was until they saw how he operated Twitter,” Bernal said, calling Twitter “just the tip of the iceberg on how he operates Tesla.”
In April 2019, at a showcase dubbed “Autonomy Investor Day,” Musk made perhaps his boldest prediction as Tesla’s chief executive. “By the middle of next year, we’ll have over a million Tesla cars on the road with full self-driving hardware,” Musk told a roomful of investors. The software updates automatically over the air, and Full Self-Driving would be so reliable, he said, the driver “could go to sleep.”
Investors were sold. The following year, Tesla’s stock price soared, making it the most valuable automaker and helping Musk become the world’s richest person. Full Self-Driving followed Autopilot, which was launched in 2014 and went on to allow cars to navigate highways, from steering and changing lanes to adjusting speed. Full Self-Driving aimed to bring those capabilities to city and residential streets, a far more difficult task.
The cars rely on a combination of hardware and software to do so. Eight cameras capture real-time footage of activity surrounding the car, allowing the car to assess hazards like pedestrians or bicyclists and maneuver accordingly.
To deliver on his promise, Musk assembled a star team of engineers willing to work long hours and problem-solve deep into the night. Musk would test the latest software on his own car, then he and other executives would compile “fix-it” requests for their engineers.
Those patchwork fixes gave the illusion of relentless progress but masked the lack of a coherent development strategy, former employees said. While competitors such as Alphabet-owned Waymo adopted strict testing protocols that limited where self-driving software could operate, Tesla eventually pushed Full Self-Driving out to 360,000 owners — who paid up to $15,000 to be eligible for the features — and let them activate it at their own discretion.
Tesla’s philosophy is simple: The more data (in this case, driving footage) the artificial intelligence guiding the car is exposed to, the faster it learns. But that crude model also means there is a lighter safety net. Tesla has chosen to effectively allow the software to learn on its own, developing sensibilities akin to a brain via technology dubbed “neural nets” with fewer rules, the former employees said. While this has the potential to speed the process, it boils down to essentially a trial-and-error method of training.
Rivals at Waymo and Apple take a different approach to autonomy, setting rules for the software and addressing any violations of those constraints, according to Silicon Valley insiders with knowledge of company practices, who spoke on the condition of anonymity because they were not authorized to speak publicly. Companies developing self-driving cars also typically use sophisticated lidar and radar systems that help the software map out their surroundings in detail.
Waymo spokesperson Julia Ilina said there are evident differences between the companies’ approaches, pointing to Waymo’s goal of full autonomy and emphasis on machine learning. Apple declined to comment for this story.
Tesla’s method has at times proven problematic. Around two years ago, a popular YouTuber captured footage of the software struggling to navigate San Francisco’s famously winding Lombard Street in a video that garnered tens of thousands of views. So Tesla engineers built invisible barriers into the software — akin to bumpers in a bowling alley — to help the cars stay on the road, Bernal said. Subsequent YouTube videos showed them operating smoothly.
That gave Bernal pause. As an internal tester who drove that stretch of road as part of his job, he knew the polished performance was far from the typical experience on public streets elsewhere.
Radar originally played a major role in the design of the Tesla vehicles and software, supplementing the cameras by offering a reality check of what was around, particularly if vision might be obscured. Tesla also used ultrasonic sensors, shorter-range devices that detect obstructions within inches of the car. (The company announced last year it was eliminating those as well.)
Even with radar, Teslas were less sophisticated than the lidar- and radar-equipped cars of competitors.
“One of the key advantages of lidar is that it will never fail to see a train or truck, even if it doesn’t know what it is,” said Brad Templeton, a longtime self-driving car developer and consultant who worked on Google’s self-driving car. “It knows there is an object in front and the vehicle can stop without knowing more than that.”
Cameras need to understand what they see to be effective, relying on Tesla workers who label images the vehicles record, including things like stop signs and trains, to help the software understand how to react.
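The distinction Templeton and the labeling work point to can be sketched in a few lines of toy Python (an illustration of the general idea, not any company’s actual software): a range sensor can trigger braking on an unidentified object, while a vision-only system must first classify what it sees.

```python
# Toy illustration (hypothetical, not Tesla's or any competitor's real code)
# of how a range sensor and a camera-only classifier fail differently.

def lidar_should_brake(distance_m, threshold_m=30.0):
    # Lidar/radar reports a range to *something*; the system can stop
    # without knowing what the object is.
    return distance_m < threshold_m

def camera_should_brake(predicted_label, confidence, min_confidence=0.8):
    # A vision-only system must first recognize the object; an unfamiliar
    # or washed-out detection may never be labeled a hazard at all.
    hazards = {"pedestrian", "vehicle", "train", "stop_sign"}
    return predicted_label in hazards and confidence >= min_confidence

# An odd-looking truck the classifier has never seen, 12 meters ahead:
print(lidar_should_brake(12.0))                     # True: object detected, brake
print(camera_should_brake("unknown_object", 0.35))  # False: nothing recognized
```

The sketch shows why labeling matters so much in a camera-only design: the car’s reaction depends entirely on what the classifier decides it is looking at.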
Toward the end of 2020, Autopilot employees turned on their computers to find in-house workplace monitoring software installed, former employees said. It monitored keystrokes and mouse clicks, and kept track of their image labeling. If the mouse did not move for a period of time, a timer started — and employees could be reprimanded, up to being fired, for periods of inactivity, the former employees said.
After a group pushing to unionize Tesla’s Buffalo factory raised concerns about its workplace monitoring last month, Tesla responded in a blog post. “The reason there is time monitoring for image labeling is to improve the ease of use of our labeling software,” it said, adding “its purpose is to calculate how long it takes to label an image.”
Musk had championed the “vision-only” approach as simpler, cheaper and more intuitive. “The road system is designed for cameras (eyes) & neural nets (brains),” he tweeted in February 2022.
Some of the people who spoke with The Post said that approach has introduced risks. “I just knew that putting that software out in the streets would not be safe,” said a former Tesla Autopilot engineer who spoke on the condition of anonymity for fear of retaliation. “You can’t predict what the car’s going to do.”
A rise in vehicle crashes
After Tesla announced it was removing radar in May 2021, the problems were almost immediately noticeable, the former employees said. That period coincided with the expansion of the Full Self-Driving testing program from thousands to tens of thousands of drivers. Suddenly, cars were allegedly stopping for imaginary hazards, misinterpreting street signs, and failing to detect obstacles such as emergency vehicles, according to complaints filed with regulators.
Some of the people who spoke with The Post attributed Tesla’s sudden uptick in “phantom braking” reports — where the cars aggressively slow down from high speeds — to the lack of radar. A Post analysis of National Highway Traffic Safety Administration data showed incidents surged last year, prompting a federal regulatory investigation.
The data showed reports of “phantom braking” rose to 107 complaints over three months, compared with only 34 in the preceding 22 months. After The Post highlighted the problem in a news report, NHTSA received about 250 complaints of the issue in a two-week period. The agency then opened an investigation, saying it had received 354 complaints of the problem spanning a period of nine months.
Months earlier, NHTSA had opened an investigation into Autopilot over roughly a dozen reports of Teslas crashing into parked emergency vehicles. The latest example came to light this month as the agency confirmed it was investigating a February fatal crash involving a Tesla and a firetruck. Experts say radar has served as a way to double check what the cameras, which are susceptible to being washed out by bright light, are seeing.
“It’s not the sole reason they’re having [trouble] but it’s a big part of it,” said Missy Cummings, a former senior safety adviser for NHTSA, who has criticized the company’s approach and recused herself on matters related to Tesla. “The radar helped detect objects in the forward field. [For] computer vision which is rife with errors, it serves as a sensor fusion way to check if there is a problem.”
Musk, as the chief tester, also asked for frequent bug fixes to the software, requiring engineers to go in and adjust code. “Nobody comes up with a good idea while being chased by a tiger,” a former senior executive recalled an engineer on the project telling him.
Musk’s resistance to suggestions led to a culture of deference, former employees said. Tesla fired employees who pushed back on his approach. The company was also pushing out so many updates to its software that in late 2021, NHTSA publicly admonished Tesla for issuing fixes without a formal recall notice.
Last year, Musk decided to buy Twitter, something that became a distraction for the Tesla chief executive, former employees of both companies said. After taking the helm in October, he diverted dozens of engineers — including on Autopilot and Full Self-Driving — to work there with him, further setting back Tesla, according to former employees and documents reviewed by The Post. Software updates that were otherwise issued every two weeks were suddenly spaced out over periods of months, as Tesla worked through bugs and chased more ambitious targets.
Some lament Musk’s involvement at Twitter, saying he needs to refocus on Tesla to finish what he started. Ross Gerber, a Tesla investor who is running for a seat on the company’s board over concerns about its perceived inaction on Musk’s dueling role as head of Twitter, said Full Self-Driving heralds a bright future for Tesla.
“We love Elon. He’s the innovator of our time,” he said. “All we want to see is him working full time back at Tesla again.”
Tesla engineers have been burning out, quitting and looking for opportunities elsewhere. Andrej Karpathy, Tesla’s director of artificial intelligence, took a months-long sabbatical last year before leaving Tesla and taking a position this year at OpenAI, the company behind language-modeling software ChatGPT.
“Since Andrej was writing all the code by himself, naturally, things have come to a grinding halt,” Musk said on an earnings call last year, noting he was speaking in jest.
Ashok Elluswamy, Tesla’s director of Autopilot, has taken on work at Musk’s other company, Twitter, according to employees and documents reviewed by The Post.
One of the former employees said that he left for Waymo. “They weren’t really wondering if their car’s going to run the stop sign,” the engineer said. “They’re just focusing on making the whole thing achievable in the long term, as opposed to hurrying it up.”
The Justice Department has requested documents related to Full Self-Driving as part of an ongoing probe, and the Securities and Exchange Commission is looking into Musk’s role in pushing Tesla’s self-driving claims, part of a larger investigation, according to Bloomberg News.
The lawsuit filed in February alleges that Tesla made “false and misleading” statements, arguing Tesla “significantly overstated” the safety and performance of Autopilot and Full Self-Driving.
That is in addition to NHTSA’s two probes into Autopilot, one of which examines crashes into parked emergency vehicles. That investigation has been upgraded to a more advanced stage: an engineering analysis. The other, into “phantom braking” reports, is ongoing.
At an investor showcase this month, Musk appeared alongside more than a dozen Tesla employees onstage, touting the company’s broad array of expertise. But the company failed to offer any major developments on Full Self-Driving, despite a segment on the technology.
And some of Musk’s most loyal customers have given up hope that his initial promise will come true. Charles Cook, a commercial pilot and engineer from Jacksonville, Fla., owns a Tesla Model Y that he frequently drives in Full Self-Driving mode.
While he is amazed at what the technology can do, he is surprised by both the slow pace of progress and the state of Musk’s promises. “Someone might have purchased Full Self-Driving thinking they were going to have a robotaxi by now and spent their hard-earned money on that,” he said.
“Now his engineers may have laughed at that” but “a customer may have spent $15,000 thinking they’re going to have it next year.” Those customers, he said, lost out.
“I do not believe you can remove the driver on this hardware suite, ever,” he said.