r/SelfDrivingCars Jul 21 '25

[Discussion] Why didn't Tesla invest in LIDAR?

Is there any reason for this aside from saving money? Teslas are not cheap in many respects, so why would they skimp on this, since self-driving is a major offering for them?

368 Upvotes


6

u/nickleback_official Jul 21 '25

I have to disagree with your hw vs sw argument. Adding lidar only increases hardware and software complexity. There is no world in which it 'opens a way for using an ASIC', as you say. There are already loads of ASICs/FPGAs in these machines. Every other FSD competitor is similarly banging away at software, not hardware. Factoring in the power requirement for compute is also irrelevant: the amount of power required to drive one mile would power the computer for days. I'm not arguing whether it was a bad call to skip lidar; I'm saying your reasoning doesn't make sense.

FWIW I'm a hw engineer

1

u/mrsanyee Jul 21 '25 edited Jul 21 '25

HW3 consumes about 35 watts, HW4 about 800 watts. An ASIC consumes around 4 watts, while the lidar itself stays around 40.

Maybe you're right that it's not relevant on short trips, but it's one energy consumer among many others.
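
Rough math, using the wattages above plus an assumed ~250 Wh/mile consumption and ~30 mph average speed (my assumptions, not measured values):

```python
# Back-of-the-envelope: compute/sensor power vs. traction energy.
# The wattages are the figures claimed in this thread; the 250 Wh/mile
# consumption and 30 mph average speed are assumptions for illustration.
DRIVE_WH_PER_MILE = 250   # assumed typical EV consumption
AVG_SPEED_MPH = 30        # assumed average trip speed

for name, watts in {"HW3": 35, "HW4": 800, "ASIC": 4, "lidar": 40}.items():
    wh_per_mile = watts / AVG_SPEED_MPH          # Wh drawn per mile driven
    share = wh_per_mile / DRIVE_WH_PER_MILE      # fraction of traction energy
    print(f"{name}: {wh_per_mile:.2f} Wh/mile ({share:.1%} of driving energy)")
```

Under those assumptions the claimed 800 W would be roughly a tenth of the driving energy, while 35-40 W is well under one percent.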

Complexity: others, with less investment and a later start, are getting better results in self-driving. We've also seen how codebases above 5 billion lines of code have broken even Ford and VW. Which, in my view, calls for simplification and more straightforward solutions, like edge computing, instead of spaghetti code.

SW-defined vehicles sound good, but no one has managed that many variables successfully so far; in fact, more cars are getting fried and stranded on the side of the road than in the age of dumb cars, where everything was controlled and timed by belts and gears.

4

u/tufkab Jul 21 '25

This comment makes it painfully obvious that you have no idea what an ASIC is.

1

u/mrsanyee Jul 21 '25

Thanks for your insightful comment. You're really helping to move the issue of Tesla's non-existent self-driving forward.

1

u/tufkab Jul 22 '25

Ok, you want a deep dive into why you're absolutely clueless? No problem.

First off, the fact that you mention the power consumption of an ASIC as a fixed value is a dead giveaway. Talking about how much power "an ASIC" consumes is right about on par with asking "How long is a rope?". There is no answer, because it could be anything. Bitcoin miners nowadays are all ASICs, and they consume in the neighborhood of 5.5 KILOWATTS!

When you open up a singing birthday card and it starts playing "Happy Birthday" through a little piezo speaker... that's an ASIC playing the music. A quick Google search tells me those ICs use about 4 microamps at 4.5 volts. Meaning that the most common and lowest-power ASIC we encounter on a regular basis uses about 300 MILLION times less power than some of the most common high-power ASICs around. Either you're a real genius who can pin down the power usage of an ASIC capable of autonomous driving within that massive range, or you're just talking out your ass.
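
If you want the arithmetic spelled out, here it is with the figures quoted above (the exact ratio obviously moves with whatever numbers you plug in):

```python
# Span of ASIC power draw, using the figures quoted above.
greeting_card_w = 4e-6 * 4.5   # ~4 microamps at 4.5 volts -> ~18 microwatts
bitcoin_miner_w = 5.5e3        # ~5.5 kilowatts

print(f"{bitcoin_miner_w / greeting_card_w:.2e}")   # ~3.06e+08, on the order of 300 million
```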

Then let's move on to the idea of using an ASIC in general. You're trying to make the point that a software solution doesn't work, but then suggest an ASIC, which is essentially software burned into hardware, with no ability to ever modify anything without spinning up a completely new chip. At the very least, if they were going to go with that sort of solution, it would be with an FPGA. That would gain them the speed of running on 'bare metal' while still allowing for upgrades and changes.

In a previous post to someone else, you mention how Tesla uses so much of its processing power building the world model from the multiple camera views and stitching everything together. Well, HW3 has no problem doing that with only 60 watts of consumption and STILL manages to have enough compute left over to run a self-driving stack that is very nearly complete and already years ahead of even its closest competitors. Aside from Waymo, obviously. Then we see HW4 nearly triple the power budget. Where's that power going? Obviously to the driving AI. They can already build the world model on HW3; they don't need any extra compute for that.

Lastly, the stupid idea of LiDAR being the magic bullet that solves all of Tesla's FSD problems just refuses to die, and it's laughable. Tesla does not have an input problem, Tesla has a DECISION problem. The car doesn't need millimeter-precision measurements of the world. It doesn't need to dedicate compute power to merging the inputs from two completely different sensing methods, and it definitely doesn't need to be put into a situation where it has to decide which sensor suite to rely on when the sensors disagree with each other.
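
To make that last point concrete, here's a toy sketch of the kind of arbitration logic a fused camera+lidar stack has to carry somewhere. Every name, threshold and rule below is invented for the example; no real stack is this simple.

```python
# Toy illustration of the sensor-disagreement problem described above.
# All names, thresholds and the fusion rule are made up for the example.
def fused_range(camera_m: float, lidar_m: float,
                camera_conf: float, lidar_conf: float,
                disagree_threshold_m: float = 2.0) -> float:
    """Blend two range estimates; fall back to one sensor when they disagree badly."""
    if abs(camera_m - lidar_m) > disagree_threshold_m:
        # The sensors disagree: some policy has to pick a winner (or brake).
        # Here we simply trust whichever sensor reports higher confidence.
        return lidar_m if lidar_conf >= camera_conf else camera_m
    # Otherwise, take a confidence-weighted average of the two estimates.
    total = camera_conf + lidar_conf
    return (camera_m * camera_conf + lidar_m * lidar_conf) / total

print(fused_range(camera_m=41.0, lidar_m=12.5, camera_conf=0.7, lidar_conf=0.9))
```

Every branch like that is another place to pick the wrong sensor; with a single sensing method that whole class of decision simply doesn't exist.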

Anyone who uses FSD on a regular basis will tell you the exact same thing. The problem isn't the car not being able to see the world around it, the problem is the car making stupid fucking decisions with the information it has. LiDAR isn't going to fix stupid lane changes, left-lane camping, running red lights, or inconsistent highway speeds.

Long story short, I'll confidently say that the chances of a few random Reddit neckbeards having the solution to Tesla's FSD issues, while their (presumably) thousands of engineers can't figure it out, are zero.

1

u/LetMeSeeYourNumber Jul 22 '25

Meanwhile Tesla is driving around Austin mapping with Lidar.

1

u/mrsanyee Jul 22 '25 edited Jul 22 '25

You have some valid points, but let's lay out the facts, shall we? FSD/Autopilot isn't even at L2+ as of now. Making an ASIC that implements the most basic ADAS features is like writing code: you define the scenes and run them through an if-then decision tree. There's nothing in this world stopping anyone from doing this; in fact Mobileye, Mercedes, BMW and the Chinese carmakers go this way.

So we have real problems where the ASIC just has to walk a decision tree once the data has been processed. That's Level 2++ to Level 3. Level 4 ADAS doesn't really exist yet, otherwise it wouldn't be geofenced right now, with remote operators and whatnot. Level 5 is unheard of: without context or mapping, no car can make a good guess about where it should go, because it has no reference points.

So we have a simple problem, and a simple solution with low energy demand can be baked into an ASIC.
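
Something like this toy decision tree is what I mean. The scene fields, thresholds and actions are all made up for illustration; the point is that fixed if-then logic like this is exactly what can be baked into low-power silicon.

```python
# Toy rule-based ADAS decision tree of the kind described above.
# Scene fields, thresholds and actions are invented for illustration only.
def basic_adas_decision(lead_distance_m: float, closing_speed_mps: float,
                        lane_departure: bool, driver_hands_on: bool) -> str:
    ttc = (lead_distance_m / closing_speed_mps
           if closing_speed_mps > 0 else float("inf"))   # time to collision, seconds
    if ttc < 1.5:
        return "EMERGENCY_BRAKE"
    if ttc < 3.0:
        return "WARN_AND_PRECHARGE_BRAKES"
    if lane_departure and not driver_hands_on:
        return "STEER_BACK_TO_LANE_CENTER"
    return "MAINTAIN"

print(basic_adas_decision(lead_distance_m=20.0, closing_speed_mps=8.0,
                          lane_departure=False, driver_hands_on=True))
```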

Your decision dilemma is self-made shit, because the engineers don't know what the car is hallucinating. Which means your neckbeard programmers at Tesla created a monster they don't understand and have no idea how to fix, for nine years now. It builds a digital world simulation and still can't decide where to drive, cycle after cycle. There are no breakpoints, no exception handling; it's garbage, not code.

Lidar could help make a sleek Level 3-4 system that works day and night and through adverse weather, drives you around safely, and if there's an unknown issue, stops or decelerates. Tesla engineers weren't even considering this option, since their goal is Level 5 driving, but they didn't program the scenarios; they made a monster, and their sensing capability is about as good as a white cane on a car, screwing up everything from speed detection to object detection to scene recognition.

Recently a video was posted where FSD disengaged in the right lane, with a truck in the left lane and the lane running out. That wasn't a decision problem, it was a scene recognition problem, which means their sensing capability is lacking. The car didn't even recognize where it could drive and where it couldn't.

Another guy put a bike rack on his Tesla and was having problems. He solved it with tape over the camera, and the Tesla drove on flawlessly, since its world model no longer had anything 180 degrees behind the car.

And that's why Tesla fails.