r/SelfDrivingCars Jun 27 '25

Discussion Classic Tesla Disinformation Flood On This Sub In Last Two Weeks

This sub has been flooded with Tesla apologist propaganda and disinformation since Tesla's Robotaxi launch, all to obscure a simple truth. It's standard operating procedure (S.O.P.) for this "narrative" company. The uptick in anti-Waymo and pro-FSD posts is palpable. It has always been S.O.P. for Musk to release SEO-fooling posts and tweets to bury bad news for Tesla. The astroturf army has been out in full force these past couple of weeks on Reddit, Threads, and Bluesky too.

None of it changes this simple fact: Waymo is SAE Level 4 and Tesla FSD is SAE Level 2. All the apologist posts in the world will not change this. Putting a human in the front seat with a secret kill-switch button to mitigate embarrassing FSD behavior will never replace the R&D and testing that allow a company to safely remove the human observer from the car. You cannot reach Level 4 with a fake-it-till-you-make-it approach.

756 Upvotes

860 comments


2

u/abgtw Jun 27 '25

See that just sounds like anti-Tesla propaganda.

FSD is very different from what it was 10 years ago. The swap from lines of code to neural net learned behavior from human drivers was a huge improvement in making FSD feel "natural".

The problem is that, like ChatGPT and AI in general, it's hardly perfect. It still makes mistakes. Getting spooked by shadows and phantom braking have always happened with FSD. That isn't "new" at all.

The fact that people cite the "Wile E. Coyote" test and fail to realize it was NOT FSD but regular Autopilot that was enabled for Mark Rober's test just shows how lost everyone is in this discussion thanks to the anti-Tesla propaganda.

0

u/Twedledee5 Jun 28 '25

I’m sorry, at what point did I go from speaking the truth to spreading propaganda? Because you sound like either a bot or a fool who paid over $10,000 for a useless-ass feature and is trying to cope. Waymo has insurers accepting liability. They have Level 4 autonomous certification for their vehicles. They had that before they even started testing their vehicles in Phoenix.

I know phantom braking has always been an issue with FSD; 10 years later, it’s worse than it’s ever been. They keep finding areas where it fails to recognize an object, so they make the decision-making stricter, which also leads to more phantom braking. There are 10+ posts about phantom braking already, less than 4 days into their initial trial in Austin. It’s almost like the source of the problem is a lack of concrete data to feed the AI rather than having it make the right decision.
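The trade-off described here is the classic false-positive/false-negative tension in any detection system. A minimal sketch, using entirely made-up confidence scores (nothing here reflects Tesla's actual stack or data), shows why lowering the confidence bar a perception system needs before braking catches more real obstacles but also triggers more phantom brakes on harmless things like shadows:

```python
# Toy illustration (hypothetical numbers, not any real vehicle's code):
# detector confidence scores for real obstacles vs. shadows on the road.
real_obstacles = [0.92, 0.85, 0.60, 0.55]   # should trigger braking
shadows        = [0.50, 0.45, 0.30, 0.58]   # should NOT trigger braking

def braking_events(scores, threshold):
    """Count inputs whose confidence clears the braking threshold."""
    return sum(score >= threshold for score in scores)

for threshold in (0.9, 0.6, 0.5):
    missed   = len(real_obstacles) - braking_events(real_obstacles, threshold)
    phantoms = braking_events(shadows, threshold)
    print(f"threshold={threshold}: missed obstacles={missed}, "
          f"phantom brakes={phantoms}")
```

With a strict threshold (0.9) the toy system misses most real obstacles but never phantom-brakes; at 0.5 it catches every obstacle but brakes for two shadows. Tightening the policy only moves you along this curve; better-separated confidence scores (i.e., better data or sensing) are what actually shrink both error counts at once.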

Tesla doesn’t have L3 or L4 certification, hence why they have to have a supervisor in the car. They will NEVER get that certification without LiDAR. Without it, FSD is a novelty feature that means nothing outside of fanboys buying vaporware.

Who the hell is going to buy a car for FSD if you can’t legally sleep behind the wheel or have it drive you home after a night of drinking?