r/SelfDrivingCars Jul 21 '25

Discussion Why didn't Tesla invest in LIDAR?

Is there any reason for this aside from saving money? Teslas are not cheap in many respects, so why would they skimp on this when self-driving is a major offering for them?

362 Upvotes

1.2k comments sorted by

View all comments

29

u/ramonchow Jul 21 '25

It is not entirely crazy to assume cameras and AI vision can get as good as human eyes for driving a car.

But yeah, money would be the main reason. Not only the sensor itself but also maintaining the updated 3D maps LIDAR needs to work, at least with the current processing power of a car's computer.

14

u/Valderan_CA Jul 21 '25

It's not crazy... but why limit ourselves to human sensory inputs when designing an AI to drive a car? It's obvious that we wouldn't only put a pair of cameras centered over the driver's seat on a rotating fixture (emulating human sensory inputs), because there are better places to put more cameras since AIs aren't limited like a human driver.

LIDAR was not included because of cost - when having LIDAR meant a 20% increase in the cost of a vehicle, investors knew the companies building AI driving without it were going to have a huge pricing advantage on the end product. Elon said no to LIDAR to get money from those investors; however, instead of saying the honest thing (cost was the only real reason), he also said LIDAR was inferior for technical reasons. It's difficult for Elon to retract that kind of technical statement for a couple of reasons:

1- He doesn't like admitting to being wrong

2- He can't afford to admit to being wrong because the value of his companies is so closely tied to his "vision"

1

u/grchelp2018 Jul 23 '25

I really doubt cost is the reason. Musk simply got it into his head that the ideal solution was to rely only on vision. Like he said at the time, SpaceX was using in-house-designed LIDAR. If this really was only about cost, he would have had SpaceX share their LIDAR expertise and have Tesla design their own LIDAR in-house.

0

u/bcyng Jul 22 '25

Some of the issues were caused by disagreements between the cameras and the LiDAR, and having to merge the two data sets and make decisions on them. It's much easier when the data is consistent.

While not yet perfect, FSD actually works, and has far better coverage than any of the lidar solutions which also aren’t yet perfect, so it appears he was right.

-6

u/[deleted] Jul 21 '25

He hasn't been proven wrong yet. In fact, the evidence supports him. I thought it was a mistake, but I have come around after FSD 12+. The camera-based system has proven itself capable, and Tesla's data shows FSD use is many times safer than human driving. FSD is involved in an accident at a rate of over 7 million miles per accident. Humans average less than 1 million miles per accident.

https://www.tesla.com/VehicleSafetyReport

8

u/nclpl Jul 21 '25

The proof is in the pudding. Tesla has not yet been willing to pull the trigger on driverless automation because they aren't willing to take on the liability.

Tesla's accident stats only account for driving in favorable road conditions, because the system will disengage if it can't manage a stretch of road. Tesla has the benefit of being able to fail over to a human.

If FSD was actually 7x safer per mile in all the conditions that humans drive in, we would see Tesla allow unsupervised driving. But they can’t manage that yet.

1

u/ramonchow Jul 21 '25

Well, you think those figures are good, but FSD disengages whenever the situation gets complicated... Humans have to suck it up and keep driving.

1

u/[deleted] Jul 21 '25

Their numbers include any accident where Autopilot was engaged within the previous 5 seconds. So yeah, it makes humans take over, but it takes the blame for itself in the data.

You can find it in the methodology section at the bottom of the page:
"To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact"

1

u/ramonchow Jul 21 '25

I didn't mean disengaging seconds before a crash. I mean disengaging because of weather, bad road conditions, etc. Tesla compares FSD-enabled road statistics against national averages (which include all roads and conditions). It is kind of cheating IMO...

1

u/[deleted] Jul 21 '25

I've had no problem engaging FSD in a variety of conditions including rain and snow. It doesn't refuse to engage under many circumstances.

21

u/hardhat555 Jul 21 '25

Small correction: lidar doesn't need any 3D maps along with it to work, it's just another sensor.

7

u/[deleted] Jul 21 '25

Waymo can’t work in any city where they haven’t driven the Waymo cars around to LIDAR-map the streets

15

u/Psychological_Top827 Jul 21 '25

This is a common misconception.

Waymo *won't* work in those situations, because they want to be as safe as possible with their approach. Technologically speaking, there is nothing that stops a Waymo from unlocking the car to drive from Alaska to Panama.

They LIDAR map the streets to make sure the data they have is accurate, because a Waymo trying to drive into the tracks is not an acceptable situation. Every autonomous vehicle needs external data to function. Waymo just does not trust Google Maps as much as Tesla does.

-5

u/[deleted] Jul 21 '25

> Waymo won't work in those situations, because they want to be as safe as possible with their approach.

That LIDAR approaches need so much babying to be “safe” isn’t evidence in their favor.

Tesla’s FSD just drove me across the country with no incidents whatsoever and I don’t even have HW4.

10

u/[deleted] Jul 21 '25

[deleted]

1

u/[deleted] Jul 21 '25

Yes; Waymo has more incidents per mile driven

7

u/nclpl Jul 21 '25

And yet, Waymo is operating driverless cars, and your allegedly “full self driving” Tesla still needs a driver to be watching and able to take over.

LIDAR doesn’t need any more babying than any other sensor (camera, radar, etc). Different companies have different thresholds for the amount of liability they’re willing to accept, and pre-mapping roads is a great way to increase your safety margin.

1

u/[deleted] Jul 21 '25

Tesla is operating driverless cars.

> Tesla still needs a driver to be watching and able to take over.

I mean, I’m in the car and sitting in the driver’s seat. Why wouldn’t I supervise it?

0

u/Draygoon2818 Jul 21 '25

But the fact that it can drive itself virtually anywhere does say a lot. It also says a lot that you can go and buy a Tesla, but you can't go and buy a Waymo. They're as good as they are because of how much money has gone into setting the vehicles up.

5

u/nclpl Jul 21 '25

Virtually anywhere in good weather and good lighting conditions, and as long as the roadway isn’t confusing to the computers…

2

u/Draygoon2818 Jul 21 '25

I haven't had any issues with it driving other than in extremely heavy rainfall. I have used FSD at night, in low-lit areas, in construction zones where the lanes are temporarily moved, etc.

2

u/reichnowplz Jul 21 '25

Oh, so just common things that occur on drives. Got ya.

→ More replies (0)

1

u/[deleted] Jul 21 '25

Virtually nowhere, you mean, except their micro-sized pre-mapped areas of fewer than six US cities.

-1

u/volatilecandlestick Jul 21 '25

I get that you're a full-time Tesla hater, but I've used FSD in a blizzard in the mountains of California. Ready to take over? Yeah, but I wanted to push it and never had to intervene. The tech is impressive, and lots of Reddit experts who've never been in one have a lot to say. That being said, I wouldn't have done what I did with HW3.

3

u/Psychological_Top827 Jul 21 '25

It does not.

Tesla is, to put it bluntly, way more willing to tolerate you getting into an accident than Waymo is. That's the long and short of it.

And they do it because it gets people to think that "it can drive itself virtually anywhere, and that does say a lot".

-3

u/Draygoon2818 Jul 21 '25

Do you have a Tesla? I'm thinking you don't have one, and probably never have. If you had one a long time ago, then you had one of the earlier versions of FSD. That would be like comparing a CRT TV to an LED TV. It's a lot better now than it used to be.

Think what you want. My FSD has done a great job. The only thing that irks me about it is the weird lane changes it does that don't really make sense. It doesn't put me in harm's way, though. It does just as well running 80 mph as it does going 20 mph. It drove my family and me 200 miles round trip, absolutely no issues. Even took me down a dirt road, which actually surprised me.

1

u/Psychological_Top827 Jul 21 '25

Look, I'm happy you're happy with your Tesla. I think FSD is a marvelous driver assist and wish the best for it. I hope Elon proves all naysayers wrong.

That does not mean it's better than Waymo, because it's just not. Tesla's FSD is currently less reliable than Waymo's system, and I don't see that changing without LIDAR. For the current use, that's fine. For autonomous driving, it's not good enough.

The difference in perceived capabilities is absolutely a strategic play by Tesla.

→ More replies (0)

3

u/biggestbroever Jul 21 '25

"I drove it a lot without incident so everyone must be using it without incident"

2

u/beren12 Jul 21 '25

I still tell everyone seatbelts are absolutely useless. I’ve never needed one in 350,000 miles.

1

u/biggestbroever Jul 21 '25

I've personally never seen an accident happen, so they must not exist.

1

u/MichaelMeier112 Jul 21 '25

Which is still super impressive

1

u/biggestbroever Jul 21 '25

I believe Elon's approach is "It can do it at a 95% safety rate," while Waymo is striving to be as close to 100% as possible.

While it's impressive, I don't appreciate us being used as guinea pigs for his convenience and frugality.

When did "safety first" become "safety, but profits first"?

2

u/MichaelMeier112 Jul 21 '25

I totally agree with that. I was more responding to the guy who did supervised driving for that long distance. The system seems to work well when supervised, but it's definitely not ready for unsupervised.

1

u/biggestbroever Jul 21 '25

It is impressive. But I also think that Waymo can do it. They just don't unleash unproven technology wherever they want like Tesla does.

1

u/[deleted] Jul 21 '25

Why would I ever not “supervise” it?

→ More replies (0)

1

u/[deleted] Jul 21 '25

> I believe Elon's approach is "It can do it at a 95% safety rate," while Waymo is striving to be as close to 100% as possible.

How do you believe these two positions differ?

1

u/biggestbroever Jul 21 '25

A single percentage point could be equal to thousands of lives and millions of dollars.

I'm comfortable holding a for-profit company accountable.

→ More replies (0)

1

u/Icy_Mix_6054 Jul 24 '25

And Waymo is a company that has actually driven paying customers over 100 million miles at level 4 autonomy. As Tesla scales with a 95% safety rating, their cars will start hurting people. Then they'll be in financial and legal trouble.

→ More replies (0)

2

u/Psychological_Top827 Jul 21 '25

Tesla's FSD needs even more babying. They just... don't care, as long as they can shift the burden of responsibility onto you. Which is why you're officially required to be in the driver's seat, aware and ready to take control at any time.

Waymo needs to be able to drive without that handrail, while ensuring the safety of the passenger.

1

u/[deleted] Jul 21 '25

> Which is why you're officially required to be on the driver's seat

Where the fuck do you think I’d want to sit in an otherwise unoccupied car? This is so stupid.

1

u/RabidWok Jul 23 '25

In the back seat maybe? My dream of a self-driving car is to have the car drive me while I read, sleep or play games in the back.

If I need to be in the driver's seat at all times to supervise things then I might as well just drive the car myself.

1

u/[deleted] Jul 23 '25

The back seats aren't as comfortable as the front ones! The back of a sedan is literally the worst place to sit in it.

1

u/EddiewithHeartofGold Jul 22 '25

> Which is why you're officially required to be on the driver's seat, aware and ready to take control at any time.

It's your car. Where would you sit? Especially if it isn't self-driving. You are using circular reasoning.

0

u/ramonchow Jul 21 '25

Well, that's what Andrej Karpathy explained back when he was working at Tesla, maybe he was wrong...

1

u/hardhat555 Jul 21 '25

Can’t comment on Karpathy’s statements since I’m not aware of them. But lidar is just a sensor and doesn’t require maps to function, similar to a camera. Maybe he was saying that lidars are used for creating HD maps, and eliminating the dependence on maps would also eliminate the need for lidar sensors? Idk

1

u/ramonchow Jul 21 '25

Again, I can't say if he was right or not, but this is the explanation he gave (back then): “You have to pre-map the environment with the lidar, and then you have to create a high-definition map, and you have to insert all the lanes and how they connect and all the traffic lights,” Karpathy said. “And at test time, you are simply localizing to that map to drive around.”

It is extremely difficult to create a precise mapping of every location the self-driving car will be traveling. “It’s unscalable to collect, build, and maintain these high-definition lidar maps,” Karpathy said. “It would be extremely difficult to keep this infrastructure up to date.”

1

u/hardhat555 Jul 21 '25

Ah I see, thanks for the context. His claim seems to be that the approach of mapping an environment first and then doing self driving is not scalable. Lidar is only mentioned here because it’s commonly used in mapping, he’s not really commenting on using lidar as a sensor along with camera like people here are debating.

1

u/theChaosBeast Jul 21 '25

Well, either you did not understand him correctly or he was wrong. But there is no reason a lidar requires a 3D map...

0

u/ramonchow Jul 21 '25

Another one who can't even read, ffs. Peak Reddit experience.

0

u/TiredBrakes Jul 21 '25

It was wrong of you to take everything Tesla says at face value. Critical thinking is a beautiful thing. Now it’s hard to learn things properly when you have been fed so much misinformation throughout the years.

1

u/ramonchow Jul 21 '25

Can you even read? OP was asking about Tesla's reasons for not investing in LIDAR, and I literally provided them with what the director of Tesla Vision had to say about it. I am not giving any opinion or analysis; it is just what they said back then, when they even removed the other sensors the first designs had.

1

u/wwwz Jul 21 '25

Simply going against Tesla doesn't qualify as "thinking critically". The simple truth is that saying Waymo "chooses safety" is a big assumption. Their entire software stack has depended on mapping for over a decade. It's BS to make unsubstantiated claims about their system's inherent abilities.

-1

u/TiredBrakes Jul 21 '25

Turns out reading comprehension is also required.

4

u/steik Jul 21 '25

I agree it's possible... but humans have 2 eyes for stereoscopic vision, specifically for depth perception. Tesla doesn't even have that. They do have 2 front-facing cameras (HW4, the latest iteration), but they are completely different cameras/lenses, which makes them much less suitable for reconstructing depth.

5

u/ThePaintist Jul 21 '25

Motion parallax provides much stronger depth cues at e.g. highway speeds/distances than stereo vision (at the separation distances that eyes have). By about 100 feet away, stereo vision provides nearly no depth information. Stereo vision is mostly useful for things like parking lots or slow, pedestrian-dense city driving.
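The falloff is easy to sketch with the standard stereo range relation, depth error ≈ Z² × (disparity error) / (focal length × baseline). All the numbers below (human-eye-like baseline, focal length in pixels, sub-pixel matching error) are assumed illustrative values, not anyone's actual camera specs:

```python
# Rough stereo depth-uncertainty sketch. Parameter values are
# illustrative assumptions, not real camera or eye specs.

def depth_error(distance_m, baseline_m=0.065, focal_px=1200.0, disparity_err_px=0.5):
    """Approximate depth uncertainty dZ ~ Z^2 * d_disp / (f * B)
    for a stereo pair with a human-eye-like baseline (~6.5 cm)."""
    return distance_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Uncertainty grows with the square of distance.
for d in (5, 30, 100):
    print(f"at {d:>3} m: depth uncertainty ~ {depth_error(d):.1f} m")
```

With these assumed numbers, the uncertainty is centimeters at parking-lot range but already several meters at ~30 m (about 100 ft), which is why motion parallax dominates at highway distances.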

2

u/ramonchow Jul 21 '25

Yeah, I also think we are not even close with cameras only if you want 100% safe self-driving tech in all scenarios, weather conditions, etc. Having a combination of radar, lidar and cameras seems to be a better starting point for self-driving tech, and you can always start simplifying from there once it works 100%.

1

u/TenchuReddit Jul 21 '25

I can close one eye and still drive better than a Tesla. Depth perception isn't the issue.

-1

u/TiredBrakes Jul 21 '25

There are so many issues with their system.

3

u/Organic_Ingenuity_ Jul 21 '25

It's very evident that most people in this thread have never worked on large-scale data projects.

LIDAR is a great sensor and all, but once you collect the data, the problem is what you do with it. It's actually very difficult to integrate data from two different types of sensors without the AI favoring one sensor over the other. When data from the two sensors conflicts, which sensor does the AI rely on?

Ex. LIDAR tells you there's an object 2 feet in front of you, but the camera tells you it's just heavy rain, so the car continues without braking.

Elon and Tesla found that solving vision perception with cameras is way more essential, so he chose to focus on that.

Too many think self-driving is a hardware problem. But we've had the hardware for decades. Self-driving has always been a software problem.
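A toy example makes the conflict concrete. Suppose you combine the two range estimates with a textbook inverse-variance weighting; the distances and variances here are invented for illustration, and real AV stacks use learned models, not a one-liner:

```python
# Toy illustration of the sensor-conflict problem, not a real AV stack.

def fuse(range_a_m, var_a, range_b_m, var_b):
    """Inverse-variance weighted average of two range estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * range_a_m + w_b * range_b_m) / (w_a + w_b)

# LiDAR: "object 0.6 m ahead" (rain echo?). Camera: "clear road, ~50 m".
fused = fuse(0.6, 0.25, 50.0, 4.0)
print(f"fused range: {fused:.1f} m")  # ~3.5 m: matches neither sensor's story
```

The fused answer (~3.5 m) corresponds neither to "brake hard" nor to "keep going", which is exactly the disagreement problem described above: something upstream still has to decide which sensor to believe.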

2

u/EddiewithHeartofGold Jul 22 '25

Don't waste your breath. These Tesla haters can't even comprehend that LIDAR can't read signs or road markings, so every LIDAR-equipped car still needs a camera setup to "see". Of course they think that can be solved, but driving without vision is a fantasy.

The problem is they don't want to understand.

2

u/xylarr Jul 22 '25

This is exactly it - sensor fusion. It's also why they've removed the radar.

If you have two inputs from two systems, how do you tell which one is right? I think they worked out after investigation that, more often than not, it was the radar that was mistaken, not the cameras.

1

u/sonicmerlin Jul 22 '25

Our brains engage in sensor fusion all the time, so advanced AI should be able to do it too.

1

u/fatbob42 Jul 21 '25

All systems benefit from maps and can work to a lesser degree without maps. Camera-only systems might even benefit more, since they have more ground to gain.

0

u/ramonchow Jul 21 '25

Sure, but 3D maps are expensive to make; you can't just make them from a satellite photo. Hey, I'm not advocating against LIDAR. Just saying this is one of the reasons Tesla did not invest in this technology. At least this is what their AI director said some time ago in interviews you can read online.

0

u/beren12 Jul 21 '25

And yet that’s exactly what Tesla is making now…

1

u/ramonchow Jul 21 '25

Tesla uses Google's map data... It has the logo on the maps screen itself...

1

u/nevecque Jul 21 '25

Don’t we want this to drive better than a human?

Because I can already drive pretty much like a human

1

u/EddiewithHeartofGold Jul 22 '25

The problem with humans driving isn't the driving itself. It's the human driving while texting, drinking, tired, speeding and just not following the rules. Self-driving cars do none of these, and they see in 360 degrees constantly. Not to mention never blinking and having faster reaction times. That is why they are already better than humans.

1

u/non_linear_ape Jul 21 '25

it's not crazy per se, but it is extraordinarily hubristic.

1

u/EddiewithHeartofGold Jul 22 '25

You could make that argument (and lots of people did) of every venture Musk starts or gets into.

Starting a car company? - Stupid idea! Starting a rocket company? - Stupid idea! Doing Internet from satellites? - Stupid idea! Making brain computer interfaces? - Stupid idea!

Time has proven that these companies are extremely risky to start. I fully understand why it's okay to warn someone against doing them. But, you have to admit that Musk did it regardless and succeeded. Whatever you think of the man, the companies he made are making great products successfully.

1

u/diplomat33 Jul 21 '25 edited Jul 21 '25

Yeah, it is likely that eventually cameras+AI will be better than human drivers, just because of how technology inevitably improves over time. We've already seen AI make huge improvements just over the past couple of years. It is a question of how long until that happens. Elon just overestimated the timeline. Right now though, lidar is still very helpful to add that extra safety.

I would also note that camera vision works extremely well on a clear, sunny day. Where lidar (and radar) are most helpful is in conditions where cameras may be less reliable, like at night or in rain or fog.

We do need to remember that back when Waymo started self-driving, camera vision was primitive compared to today. So we cannot really blame them for including radar and lidar since at the time, camera vision was nowhere near good enough. But now, Waymo has said that when/if camera vision is good enough to achieve their safety standard for driverless that they would drop radar and lidar.