#Tesla vs. Police 🥊 (FSD V12.3.6 FAIL – AutoPilot trying to pass a cyclist, instead FSD aims at a cop head on 💥 🤦🏼‍♂️) – (Filmed 5/3/24)
One of the most horrific drives I had with FSD yet! Tesla #AutoPilot tries passing a cyclist, would’ve been a head-on collision with a police… pic.twitter.com/nFarRgTdxR
— FixorFkit (@FixorFkit) May 5, 2024
Tesla drivers could just go to Canada and kill themselves for free. The government will pay for it.
Instead, they spend $100,000 on an Elon Musk Brand Death Machine.
Highly illogical.
U.S. auto safety investigators are seeking detailed answers and documents from Tesla in a probe into the automaker’s December recall of more than 2 million vehicles to install new Autopilot safeguards.
The National Highway Traffic Safety Administration (NHTSA) said last month it was investigating after receiving reports of 20 crashes involving vehicles that had the Autopilot software updates installed under Tesla’s recall. The agency’s letter said it had identified “several concerns” regarding the recall.
Tesla said in December its largest-ever recall covering 2.03 million U.S. vehicles – or nearly all of its vehicles on U.S. roads – was to better ensure drivers pay attention when using its advanced driver assistance system.
The NHTSA recall investigation covers Model Y, X, S and 3 vehicles and the Cybertruck in the U.S. equipped with Autopilot, produced between the 2012 and 2024 model years.
Tesla, which did not immediately respond to a request for comment, has said repeatedly that Autopilot does not make vehicles self-driving and is intended for use with a fully attentive driver who is prepared to take over and has hands on the steering wheel.
Yeah, it’s not actually self-driving, because Tesla’s self-driving doesn’t work. It is sort of these people’s own fault for not understanding that. The company says you have to keep your hands on the wheel and your foot on the brake, because it will eventually crash and you will have to stop it.
I mean, it’s understandable that people get used to it after a week or so, assume it’s not going to crash, and start playing with their phones while using the “Autopilot.” But the company is saying “look, this doesn’t actually work.”
The name “Autopilot” does imply it is fully self-driving. But Tesla has said continually that it is not, and will eventually crash if you’re not paying attention. It’s therefore questionable what the point even is. Isn’t the entire purpose of self-driving to allow you to do other things in the car, or to drive while you’re drunk? If it doesn’t allow you to do that, it is just a dumb gimmick that is bound to lead to disaster.
But the real question is:
Why doesn’t it work, Elon?
Google’s self-driving works.
China’s self-driving cars work.
You were the first in this industry, Elon.
Why doesn’t your car work?
In 2024, 95% of Teslas produced are defective, jamming up Tesla’s notoriously incompetent “service” centers, which means Tesla will spend billions on warranty fixes — liabilities that will continue to eat into free cash flow, bankrupting $TSLA by 2027 at this rate. https://t.co/VCXmD15e8h
— Facts Chaser 🌎 🤦🏻♂️ (@Factschaser) May 7, 2024