Wall Street Analyst Nearly Crashes While Testing Tesla’s “Full Self Driving”

All the Chinese EV companies that offer full self-driving or an AI driving aid include it for free. Tesla locks it and makes you pay $8,000 to unlock it after you’ve already paid $80,000 for the car.

And then it kills you.

CNN:

Tesla CEO Elon Musk says anyone who doubts how valuable robotaxis will make the company should test drive its latest self-driving car. A Wall Street analyst did just that – and said the car almost crashed.

The experience of William Stein, an analyst with Truist Securities, is not unique; others have also reported problems with Tesla’s full self driving, or FSD, feature. But a note from Stein raises questions as to whether the autonomous driving and robotaxis Musk is betting Tesla’s future on are as close as he claims.

Tesla’s FSD driver-assist feature is sold as an $8,000 option. It can navigate city streets and highways, as long as a human driver is ready to take over at any time. Musk claims FSD is safer than a human driver, and he has pegged Tesla’s future on creating a fleet of robotaxis, carrying riders without a driver at the wheel.

“I would encourage anyone to understand the system better to simply try it out, let the car drive you around,” he said on an investors call earlier this month after reporting disappointing second quarter financial results. “Once people use it, they tend to continue using it. So it’s vastly compelling.”

But there were problems when Truist’s Stein recently tried a special “demo mode” only available to Tesla employees during demonstration drives, he wrote in a note. He wrote that the car made a number of illegal maneuvers while in FSD mode, including switching lanes on a portion of highway with solid white lines indicating lane changes were prohibited.

In addition “the Model Y accelerated through an intersection as the car in front of us had only partly completed a right-turn. My quick intervention was absolutely required to avoid an otherwise certain accident,” he wrote. “Another intervention was required when a police officer used hand motions to signal to us to pull to the side of the road to allow a funeral procession to pass.”

The system is “no better, arguably worse, than last time” when he tested it in April, Stein wrote.

Worse than last time.

I can believe it.

The Cybertruck is the worst car Tesla ever put out, so it makes sense that their software would also be getting worse.

It’s all programmed by Indians whose code always gets worse the more they work on it. That is the nature of Indian code. They are famous for this. It just gets longer and more convoluted, and more things break.