Hacker News

When I was getting my learner's permit I was practicing on a curvy mountain road. I saw a person with a camera phone on the "drop off" side of the road, taking a picture of what, from my angle, would've been a boring cliff face. But I knew that a camera phone photographer was probably actually photographing a human, who would likely be on the other side of the blind corner. So I slowed down, despite my instructor chastising me for doing so. When we came around the corner and there was a person in the road, she asked, "How did you know?" If a self-driving vehicle can't make predictions about "irrational" pedestrian behaviour like this and slow down, it shouldn't be on the road.


Your own story disproves your conclusion. Your driving instructor, like many (many) other drivers, didn't make the inference that you did. People don't drive well. Cars are dangerous.

The criterion (really the only criterion) for whether or not automatic vehicles "should be on the road" is whether or not they are safer than the alternative. And that certainly doesn't include "being as good a driver as jdavis703 was in this particular anecdote".


The instructor, though, wasn't the one driving.

Being engaged in a particular role relative to a situation can (I believe) alter one's capacity to perceive and respond to it.

This isn't to say that for definite the instructor would have had the insight described in the parent post, but it is to say that you can't rule this out just based on the information that as a passenger they didn't have the insight.


> This isn't to say that for definite the instructor would have had the insight described in the parent post, but it is to say that you can't rule this out just based on the information that as a passenger they didn't have the insight.

That's a senseless digression though. Again, the criterion to use when deciding whether autonomous vehicles are safe is whether autonomous vehicles are measurably as safe as human drivers, and very much not whether we can "rule out" the possibility that the machine might be subject to failure modes that human drivers are already known to have anyway.

For every scenario and hypothetical like this one you can imagine where an autonomous vehicle would fail, I can come up with an equally hypothetical reason why they're better (hell, just read any of the media coverage of them). Both arguments are meaningless without numbers and analysis, cf. the very article we're discussing.


> That's a senseless digression though.

In terms of your agenda, it might make no sense. It just irks me when I see simplistic assumptions about human information processing that don't take the "situatedness" of the processor into account.

I have no views, and did not attempt to comment, on machine information processing in this instance. I just wanted to point out that you can't simply go from the instructor's actually missing the inference as a passenger to the conclusion that they would have missed it as a driver. That's it :)


But I think we can say that many human drivers would not have slowed down in that situation.


The self driving accidents might be the easiest for humans to avoid, but if accident rates are lower for self-driving cars, it's still worth it. Let's look at overall numbers and severity, not how avoidable it would be for a human.


I am a massive proponent of self-driving vehicles for exactly the same reasons you indicated. However, I do have a huge reservation: this tech can only peak if it is the only thing on the road.

My outlook on the transition period is not very optimistic. Humans are good at anticipating crazy behavior. Fatalities increasing (before they eventually decrease) is currently just as likely as the promised miracle in my mind - we don't have enough data to sway me in either direction just yet.


Should your driving instructor also not be on the road?


Probably not. But I also failed my first driving test for going too slow (I was within the minimum and maximum speed range, but new drivers going slow is, I guess, a sign of an unconfident driver).


It’s been a while since I took a driving test, but I recall the test being about more than staying within the speed limit.

There was also staying in between the lines, following road signs, and integrating with traffic. Do you think you might have failed not because you weren’t pegged at the speed limit, but instead because you weren’t capable of driving safely?

I wasn’t there, so it could’ve been because you really were just driving too slowly. But maybe it was more than that.


They tell you why you failed when the test is over. And it was because of that.


Maybe shouldn't be a driving instructor.


I wouldn't particularly blame your instructor; a lot of people would have reacted the same way, which to me proves that it's not just AI, it's the roads and the driving that are unsafe. As pedestrians, we learn what's dangerous and what we can't do, and as drivers, we assume pedestrians' behavior from our own knowledge. But both AI and humans are subject to occasional 'irrational behavior'. Humans are not safe either in that situation. Even if in some cases a human might have an advantage over AI, AI has many other advantages over humans. It would be a big mistake to reject AI just because there are situations where its performance can be inferior to some humans.

I agree there's still a lot of work to be done, but my point is that the problem is the driving system itself, and a lot of people die because of it. AI can't directly change that, but if it can still do better than humans (in a few years), then it's worth it.


A proper (e.g. non-Uber) self driving car would never take a blind corner at a speed that doesn't allow it to come to a complete stop before it reaches its vision range.
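That rule can be sketched with basic kinematics: the car's reaction-time travel plus its braking distance must fit inside the visible road ahead, i.e. v*t_react + v^2/(2a) <= d. The deceleration and reaction-time values below are illustrative assumptions, not figures from any actual self-driving system.

```python
import math

def max_safe_speed(sight_distance_m, decel_mps2=6.0, reaction_s=0.2):
    """Largest speed v (m/s) such that reaction travel plus braking
    distance fits within the visible road ahead:

        v * reaction_s + v**2 / (2 * decel_mps2) <= sight_distance_m

    Solved for v with the quadratic formula (positive root).
    decel_mps2 and reaction_s are illustrative placeholder values."""
    a = 1.0 / (2.0 * decel_mps2)
    b = reaction_s
    c = -sight_distance_m
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# A blind corner with only 20 m of visible road:
v = max_safe_speed(20.0)
print(f"{v:.1f} m/s ≈ {v * 3.6:.0f} km/h")
```

With these assumed numbers, 20 m of visibility caps the speed at roughly 50 km/h; halve the sight distance and the safe speed drops sharply, since braking distance grows with the square of speed.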



