
>>Who gets the speeding ticket in this case?

Whoever owns the algorithm. Or at least whoever's name the license/permission was issued in. If it's an organization, the top management that signed off on this has to take the blame.



Legally, the person behind the wheel was still the driver. They are responsible for both the speeding and for killing a pedestrian. At this stage it's no different than using cruise control - you are still responsible for what happens.


I really hope you're wrong. If the legal system doesn't distinguish between cruise control and SAE level 3 autonomy, the legal system needs to get its shit together.


IMO as long as there is a human in the driver’s seat who is expected to intervene, they should bear the consequences of failing to do so.


No, that's bullshit. It's physically impossible for a human to intervene on the timescales involved in motor accidents. Autonomy that requires an ever-vigilant driver, ready to intervene at any second, is literally worse than no autonomy at all, because if the driver isn't actively driving most of the time, their attention is guaranteed to stray.


I agree with you - but that's literally the stage we're at. What we have right now is like "advanced" cruise control - the person behind the wheel is still legally defined as the driver and bears responsibility for what happens. The law "allows" these systems on the road, but there is no framework out there which would shift the responsibility to anyone else but the person behind the wheel.

>> It's physically impossible for a human to intervene on the timescales involved in motor accidents.

That remains true even without any automatic driving tech - you are responsible even for accidents which happen too quickly for anyone to intervene. Obviously if you have some evidence (dashcam) showing that you couldn't avoid the accident, you should be found not guilty, but the person going to court will be you - not the maker of your car's cruise control/radar system/whatever.


Currently have two cars: a '14 Mazda3 with AEB, Lane Departure alert, radar cruise, BLIS, and rear cross alert - and an '11 Outback with none of that (but with DSC and ABS, as well as AWD).

The assists certainly help more than anything else in the car, so I feel that the Mazda is much safer to drive in heavy traffic than the older Outback.

The cruise has autonomy only over controlling the speed, including applying the brakes, but it is still autonomy. Of course, since my hands never leave the wheel, it may not fit with what you have in mind.

Having said that, Mazda (or Bosch?) really nailed their radar: it has never failed to pick up motorbike riders, even though the manual warns us not to expect it to work on them.

I feel more confident in a system where the ambition is smaller, yet execution more solid.

Fwiw I also tested the AEB against cardboard boxes, driving through them at 30 km/h without touching the accelerator at all, and came away very impressed by the system. It intervened so late that I felt for sure it wasn't going to work, but it did - the first time there was a very slight impact, the next two were complete stops with small margins.
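For the curious, that last-second trigger is usually explained as a time-to-collision (TTC) check. Here's a toy Python sketch of the idea - the function name and every threshold are my own guesses for illustration, not Mazda's or Bosch's actual calibration:

    # Toy sketch of a time-to-collision (TTC) trigger as an AEB system
    # might use one. All values here are illustrative, not real tuning.

    def should_brake(gap_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 0.9) -> bool:
        """True if the obstacle would be hit within the TTC threshold."""
        if closing_speed_mps <= 0:       # not closing on the obstacle
            return False
        ttc = gap_m / closing_speed_mps  # seconds to impact at current rate
        return ttc < ttc_threshold_s

    # At 30 km/h (~8.3 m/s) this would wait until the box is ~7.5 m away
    # before braking - which is why the intervention feels so last-second.
    print(should_brake(gap_m=10.0, closing_speed_mps=30 / 3.6))  # False
    print(should_brake(gap_m=7.0, closing_speed_mps=30 / 3.6))   # True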

This stuff is guaranteed to save lives and prevent costly crashes (I generally refuse to use the word "accident") on a grander scale.


The latest top-end Toyota RAV4s have that too. It's quite amazing how well they are able to hold the cruise speed and maintain distance behind a car.
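For a rough idea of what that distance-keeping amounts to, here's a toy Python sketch (the proportional scaling, the function name and the constants are my own invention, not Toyota's actual controller):

    # Toy sketch of radar-cruise gap keeping: hold the set speed when the
    # road is clear, otherwise scale speed down in proportion to how much
    # the following gap falls short. Gains and gap are illustrative only.

    def cruise_target_speed(set_speed: float, lead_gap_m: float,
                            own_speed: float, time_gap_s: float = 2.0) -> float:
        desired_gap = own_speed * time_gap_s  # e.g. the 2-second rule
        if lead_gap_m >= desired_gap:
            return set_speed                  # road clear: hold set speed
        return max(0.0, own_speed * lead_gap_m / desired_gap)

    # Following at 25 m/s with a car 40 m ahead (desired gap 50 m) -> 20.0:
    print(cruise_target_speed(set_speed=27.0, lead_gap_m=40.0, own_speed=25.0))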

I do love that even though they have a ton of driver alerting features, hands have to be on the wheel at all times.

Either you have full autonomy without hands on the wheel or you don't. There is no middle ground - it's a recipe for disaster.


Bullshit?? It may be autonomous, but these cars are still far away from driverless. YOU get in the car, you know the limitations - you just said you consider yourself physically incapable of responding in time to motor accidents, and that the safety will be worse than a non-autonomous car. Sounds to me like what's bullshit is your entitlement to step into an autonomous vehicle when you know it diminishes road safety. Autonomous vehicles can in theory become safer than human drivers; what is bullshit is that you want to drive them now, when they are strictly not yet safer than a human, and to do so without consequences.


I attended an Intelligent Transport Systems (ITS) summit last year in Australia. The theme very much centred around Autonomous Cars and the legality, insurance/liabilities and enhancements.

There are several states in the USA that are more progressive than others (CA namely). But with many working groups in and around the legal side, the ambiguity will hopefully be a thing of the past.

In Australia, they are mandating that by some year soon (don't have it on hand), to achieve a 5-star Safety Rating, some level of automation needs to exist. Features such as lane departure or ABS will become as standard as aircon.


Assuming ABS means "Anti-Lock Braking System" in this context, isn't that already standard? I can't think of a (recent) car with an ANCAP rating of 5 that doesn't have ABS. I'm not sure I would even classify ABS as automation in the same way that something like lane departure is automation. ABS has been around (in some form) since the 1950s, and works by just adjusting braking based on the relative turning rates of each wheel. Compared to lane departure, ABS is more like a tire pressure sensor.
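For comparison, here's roughly the wheel-slip check ABS is built on, as a toy Python sketch - the function name is mine and the 20% slip threshold is a textbook figure, not any manufacturer's calibration:

    # Rough sketch of the slip check at the heart of ABS: compare each
    # wheel's speed to the vehicle's, and release brake pressure on any
    # wheel turning much slower than the car (i.e. locking up).

    def wheels_to_release(vehicle_speed: float, wheel_speeds: list[float],
                          max_slip: float = 0.2) -> list[int]:
        locked = []
        for i, w in enumerate(wheel_speeds):
            slip = (vehicle_speed - w) / vehicle_speed  # 0 rolling, 1 locked
            if slip > max_slip:
                locked.append(i)
        return locked

    # Front-left wheel (index 0) nearly locked while braking from 20 m/s:
    print(wheels_to_release(20.0, [5.0, 19.0, 18.5, 19.2]))  # [0]

That reactive, per-wheel logic is why it feels closer to a tire pressure sensor than to lane departure: it never decides where the car should go, it only modulates what the driver already commanded.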


Generally ABS does mean anti-lock braking system, but my guess is that they meant "Automatic Braking System"?


It creates an incentive to buy autonomous cars that are well programmed.


Does this responsibility stay with the driver, despite this clearly being an Uber operation? Aside from the victim, did self-driving tech just get its first, uhm, "martyr"?


By law (and please correct me if I'm wrong), the driver of the vehicle is responsible for everything that happens with the vehicle. Why would it matter whether the vehicle is owned by UPS, FedEx, Pizza Hut or Uber? Is a truck driver not responsible for an accident just because they drive for a large corporation?

Let me put it this way - my Mercedes has an emergency stop feature when it detects pedestrians in front of the car. If I'm on cruise control and the car hits someone, could I possibly blame it on Mercedes? Of course not. I'm still the driver behind the wheel and those systems are meant to help - not replace my attention.

What we have now in these semi-autonomous vehicles is nothing more than glorified cruise control - and I don't think the law treats it any differently (at least not yet).

Now, if Uber (or anyone else) builds cars with no driver at all - sure, we can start talking about shifting the responsibility to the corporation. But for now, the driver is behind the wheel for a reason.


From the article:

The San Francisco Chronicle late Monday reported that Tempe Police Chief Sylvia Moir said that from viewing videos taken from the vehicle “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." (bit.ly/2IADRUF)

Moir told the Chronicle, “I suspect preliminarily it appears that the Uber would likely not be at fault in this accident,” but she did not rule out that charges could be filed against the operator in the Uber vehicle, the paper reported.


I would be interested in hearing more about how she qualified that statement. Are shadows a known limitation with some systems, or only Uber's?



