Discussion (39 Comments) · Read Original on HackerNews
https://www.youtube.com/watch?v=0xumyEf-WRI&t=1203s
https://electrek.co/2025/07/29/another-huge-chinese-self-dri...
XPENG (a major Chinese ADAS brand) recently decided to copy Tesla's vision-only, AI world-model data approach, after originally focusing only on LIDAR: https://electrek.co/2026/04/29/xpeng-vla-2-test-drive-tesla-...
There's also been talk of companies pushing a hybrid LIDAR+vision approach on custom hardware, since merging the two datasets is complex. So the answer might eventually land somewhere in between, rather than companies choosing one or the other based on cost.
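For context on why merging the two datasets is hard: the core step is just projecting each LIDAR point into the camera image through the sensors' calibration, but everything (extrinsics, timing, occlusion handling) has to be right for it to work. A minimal sketch of that projection step, with purely illustrative calibration numbers (none of this is from any real vehicle's calibration):

```python
# Hypothetical sketch: project a LIDAR point into camera pixel coordinates.
# R, t are assumed extrinsics (LIDAR frame -> camera frame); fx, fy, cx, cy
# are assumed pinhole intrinsics. All values below are made up for illustration.

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def project_lidar_point(p_lidar, R, t, fx, fy, cx, cy):
    """Transform a LIDAR point into the camera frame, then apply the
    pinhole model. Returns (u, v, depth) or None if behind the camera."""
    p_cam = [a + b for a, b in zip(mat_vec(R, p_lidar), t)]
    x, y, z = p_cam
    if z <= 0:  # point behind the image plane: not visible to this camera
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v, z)

# Identity extrinsics (sensors co-located) just to show the call shape.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.0]
uvz = project_lidar_point([1.0, 0.5, 10.0], R, t, 600.0, 600.0, 320.0, 240.0)
print(uvz)  # (380.0, 270.0, 10.0)
```

The hard parts the sketch ignores (time-synchronizing the sensors, rolling-shutter correction, deciding which sensor to trust when they disagree) are exactly where the "complex to merge" complaint comes from.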
- Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.
- Virtually all of the Chinese cars' infotainment systems were basically 1:1 copies of Tesla's. I couldn't find any that genuinely tried something unique lol
Three things can be simultaneously true:
* Tesla's cameras are sufficient for some scenarios.
* Tesla's cameras are insufficient for other scenarios.
* A system with good data and bad algorithmic processing is still going to be bad. The Chinese vehicles almost always fail the tests because they see the obstacle but drive into it anyway.
Yes, Waymo exists, but the amount of training data they have is a few orders of magnitude lower.
All of this said, since Karpathy left they have slowly looked at adding new sensors (most recently radar), so who knows what the future holds for Tesla's sensor suite.
'Cause if not, it would be hilarious to do that to a clapped-out van...
Especially in right-hand-drive markets (non-US), it's even worse than Toyota's radar cruise.
I’ve nearly been killed by it about 5 times because it randomly steers into fences and things. It also randomly fails to change lanes (about 1 in 100 attempts), and then just steers to full lock and goes out of control.
Model 3 - Highland
I can’t recall anytime either Autopilot or FSD put me in danger though.
For right hand drive markets, it seems to be a stripped down version of FSD 10 or 11. It automatically changes lanes, takes corners and highway exits, but does not stop at traffic lights. It drives exactly in the middle of the lane, doesn’t shuffle over for trucks, and is easily confused.
Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation. The human failure tends to be failing to detect a real obstacle, vs. slowing for a phantom one.
This is just too much. If you don't like it, don't use it. But to pretend that stomps-the-brakes-every-few-hours is a stop-ship kind of safety bug is, quite frankly, ridiculous.
Wait... what are we counting as an "unexpected braking event"? I can't think of anything I do with brakes that would not be counted as ordinary braking that happens anywhere near as often as every two hours.
Don't most cars do something like that now? I'm curious what's different between Tesla and, say, a Honda Accord?
It seems other HW3 cars might get an FSD-lite version. There's no official way to upgrade from HW3 to HW4.
Any other administration and I would be willing to grant the benefit of the doubt, but Musk's spent a lot of money to corrupt government agencies over the past year and a half so that he could get silly pronouncements that the most dangerous "advanced" driving system in the world is somehow also the safest. (More people have been killed by Tesla's ADAS systems than every other automaker's ADAS systems, in the world, combined.)