Hacker News

> makes a strange decision ... for no obvious reason

Presumably it thought it was a person or vehicle so it dipped to avoid dazzling.

Not what you'd want in this exact case if you were making an intelligent decision, but not really 'strange' is it?

Also high-beam assist isn't any kind of unusual Tesla feature - it's pretty standard on all new cars.



It's a feature of many cars, and the majority of manufacturers buy a standard, well-working chip for it (usually from Bosch). Same with the automatic wipers.

Tesla is the odd one out: they decided to implement those kinds of features themselves, using their own machine learning algorithms on the cameras, which results in random weird behavior that just doesn't happen with other brands.

It's a fundamental problem with Tesla: they think they know better, but the result is shit. Apart from the wipers and now the high beams, Autopilot is another example. While Tesla was a pioneer in the area, it now performs worse than most other brands' adaptive cruise control. They've shot themselves in the foot with computer vision, with many phantom braking events as a result.


> It's a feature of many cars, and the majority of them buy a standard well working chip for it (usually from Bosch).

Then we do not have the same definition of "well working" and "standard".

Of the 6 non-Tesla cars I've driven lately, 4 have automatic wipers that go nuts for no reason.

Automatic high beams generally tend to be even worse on cars without matrix lights.


> standard well working chip

What is it about the problem that means it needs custom silicon?


You want to dip them for someone on the pavement, but not for someone crossing a major road at night.
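That decision rule can be sketched in a few lines. This is a hypothetical illustration, not any manufacturer's actual logic; the `Detection` record and its fields are invented for the example, and a real system would fuse camera, radar, and speed data rather than a bare class label:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str         # hypothetical object class, e.g. "pedestrian" or "vehicle"
    on_roadway: bool  # True if the object is on the road surface itself

def should_dip(detections: list[Detection]) -> bool:
    """Dip high beams for vehicles and for pedestrians beside the road
    (to avoid dazzling them), but keep them on for someone crossing the
    roadway, where illuminating them matters more than glare."""
    for d in detections:
        if d.kind == "vehicle":
            return True
        if d.kind == "pedestrian" and not d.on_roadway:
            return True
    return False
```

The point of the sketch is that the hard part isn't the rule but the classification feeding it: misjudge "pedestrian on the pavement" versus "pedestrian crossing" and the lights do the wrong thing either way.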


Ya, I have it on my Corolla. Nothing special.



