And it’d be morally unacceptable to allow public deployment of software that has not been sufficiently tested. As other comments have pointed out, self-driving cars have undergone ridiculously little testing. In fact, based only on the objective statistics, it is very unlikely that they are anywhere near as good at driving as humans.
There is no reason self-driving cars can’t be tested in private. Companies can hire pedestrians to interact with the cars, and the software can go through the same certification process that buildings and vehicles currently go through.
It’s a false dichotomy to say that you can either have self-driving cars or minimally safe and accountable development, but not both.
I'm just going to point out that nearly every time someone gets their learner's permit, or graduates from a learner's permit to a full license, an insufficiently tested driver is allowed on the road in order to develop more skills and become a better driver. Often they kill people in the process of learning. Should we require years of private training for every human driver as well?
They can't even share what they learn with each other effectively.
Our city (which is pretty small tbh, 150k) had 424 DUI arrests just this past weekend (Fri-Sun) due to St. Patrick's day. This is despite availability of uber, lyft, taxis, public transit, and bar services which will give you a ride home and allow you to park your car until morning.
Ignoring these facts is just as intellectually dishonest.
These cases are not what the parent comment was referring to.
The PP was insinuating that people with a learner's permit (and, by law, an experienced driver in the passenger seat) or people who just got a license (and hence were, literally, tested) are "insufficiently tested".
The PP was responding to the claim that letting "insufficiently tested" systems on the road with the goal of letting them improve is irresponsible.
For the response to have any merit, you need to cite accident statistics for people with learner's permits, or new drivers.
To be clear, the testing for getting a learner's permit (where I live, at least) is 7 out of 10 questions on a multiple-choice test, and the test for a driver's license is a 20-minute drive-about where the driver more or less gets to choose the area they'll drive in and, by picking the date, the weather when the test is done.
I don't think it's even a very tough argument that these are, at best, limited filters on actual driver skill. I've known people who literally went to another city for favourable conditions for their driver test. I've known of people who passed having driven not much more than a few hours in their lives.
Nowadays, if you want to be the supervising driver in the passenger seat for a learner, you need to pass a somewhat more difficult test and be older.
Also, I'm really not talking about this specific case, but I think it's particularly relevant that in this case there was a qualified driver able to take over for the autonomous vehicle, which is actually more supervision than a 14-year-old with a learner's permit has.
Are you really scaling off a single data point? I feel like you'd also have to compare the types of driving. What is the death rate per 100 million miles on city streets? If you exclude highway miles, I'd imagine it's much worse.
It is the only data we have. Ironically, one reason for the paucity of such data is that these companies have been so reluctant to make public their records—the reason this accident occurred in Arizona and not California is because Arizona has relaxed reporting requirements. And the commenter notes that extrapolating might not be wise.
Are you really asserting your statements without any fact whatsoever?
I'm curious what your dispute is here. Unless you're reading something into it that I didn't say. I certainly didn't say most new drivers kill someone. But to say that this is not a thing that happens with a decent level of frequency is just silly. There is an actual reason insurance is higher for teenagers.
The post I'm responding to, from my perspective, has a flippant attitude towards the deaths caused by human drivers.
A learning driver on their first day has at least 16 years of experience with traffic (and life in general), and thus a fine model of which actions can map to injury and loss of life.
I'm sure there are some people who are attentive enough about traffic before they start studying for a learner's permit (at 14 most places, I believe, not 16) to be described that way, but many are not. Especially if you live and go to school in a suburban area your experience with traffic is probably largely a) crossing not very busy streets, b) riding a bike on, again, not very busy streets where traffic laws are honestly barely obeyed anyways, and c) getting on a bus or being driven everywhere, at which point maybe you pay attention or maybe you don't.
There's definitely going to be some osmosis but I think you're pretty vastly overstating it (and including several years in which your ability to even comprehend what traffic is is going to be severely limited).
That's not even remotely true. When I started learning to drive I had 0 experience with traffic. When I was fully licensed I had less than 2 years experience.
The difference is we have a lot of experience with the human brain, and not much at all with self-driving cars. All we have to go on are the statistics, which say that although humans are bad drivers, self-driving cars are even worse (or at best that we don’t have sufficient information).
Again, why not just test them in private and hire people to be pedestrians?
> why not just test them in private and hire people to be pedestrians?
And get these employees to do what, exactly? You obviously can't tell them to put their lives at risk to test what happens if they walk into a car's path, or if they ride a bike in a car's blind spot wearing dark clothes at night, or if they fall from a motorcycle in front of a car after hitting a raccoon.
The dichotomy here is that private courses are inherently orderly and the real world is inherently chaotic.
That is exactly what those employees would have to be paid and consent to do—just like test drivers. In fact, someone else linked a Waymo blogpost saying that they have employees do exactly those things (walking into cars’ paths, lying down on skateboards...)
How is it not OK to have consenting people do these things but OK to have random people on the street participate in exactly the same tests?
People willingly operate unsafe machinery in 3rd world country factories but personally I don't think it's OK when they get crippled due to some accident or malfunction.
If all you're testing are scenarios that are known to be safe for the employee in question, then what exactly are you gaining from that testing?
It's not a matter of whether they are better or worse, it's a matter of whether these tests are sufficiently realistic.
Anyone with half a brain would hopefully quickly realize that you don't actually need a live person to lie down on a skateboard. Heck, you can simulate a much riskier jaywalking scenario with a mannequin on a dolly than with a living person.
Waymo is deploying its fleets on public roads too, which suggests to me that they think that private course tests can only get you so far.
> Waymo is deploying its fleets on public roads too, which suggests to me that they think that private course tests can only get you so far.
No, it just suggests that building and operating huge private courses that realistically emulate daily traffic situations in cities is much more expensive than just (ab)using the "real" public infrastructure paid for by tax dollars for your beta-testing needs, and that Waymo (just like Uber) takes full advantage of this chance to privatize gains and socialize losses.
Yes, we're obliged to double down. On safety. The Uber self-driving cars are known to be unsafe around pedestrians. Or, another way of saying it: "people walking". Or just "people". Like you and like me.
I had one try to run me down in the crosswalk (it utterly failed to yield to pedestrians in the crosswalk) while Uber was running their ill-fated test in San Francisco. I walked into their offices on Harrison and asked to file a bug report. They laughed it off. Saw the exact same thing happen a couple days later. They need to stop till they figure out how to do it safely.
Absolutely not. Current data points at self-driving cars being more dangerous than human-driven cars. Double down on real-environment testing and you'll expose the public to increased danger until the software becomes good enough. You're basically telling random people to publicly share the risks so that other people in the future may or may not be safer, while private companies profit from the fruit of the research. Sincerely, screw that: public testing should start only after controlled testing proves the system to be at least no more dangerous than the status quo, and the companies should use testers who opt in and are adequately compensated for their liability to injury, instead of killing off random bystanders for the "greater good", which in truth is just what they call their payroll savings.
When considering whether they're net saving lives, we are now doing so from the starting point that self-driving cars presently have a worse fatality record per real-world mile driven than humans, including the portion of miles driven by humans who are unfit to drive or wilfully reckless.
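As a rough back-of-envelope (the mileage and the human baseline below are assumed, illustrative figures, not authoritative ones):

    # Back-of-envelope fatality-rate comparison; all inputs are rough assumptions.
    human_rate_per_100m_miles = 1.2   # approximate US average across all road types
    av_miles_driven = 3_000_000       # assumed order of magnitude for one AV test program
    av_fatalities = 1                 # the single known fatality so far

    av_rate_per_100m_miles = av_fatalities / av_miles_driven * 100_000_000
    print(av_rate_per_100m_miles)     # roughly 33 per 100M miles vs ~1.2 for humans

With a single data point the uncertainty is enormous, but even generous assumptions about the miles driven don't bring the rate anywhere near the human baseline.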
Self-driving cars can also be tested in real-life environments with safety drivers behind the wheel, whose own driving shortcomings are unlikely to coincide with software errors. Based on the OEM's own telemetry, this is believed to have prevented several accidents with early-generation Waymo vehicles.
(Edit: and apparently one was present in this vehicle and was unable to prevent the accident)
Of course, there's a PR benefit of taking the drivers out of the car at this stage of their development, and it appears this has taken priority...
Sure, you could test it privately. Probably quite extensively. But if you have any inkling of how deep learning works, you might agree that this would not be sufficient in any way. The only way that will really work is on real-world streets.
And if you had any inkling of how robotics (or indeed engineering!) works, I'm sure you would know the value of testing dangerous and extremely immature equipment with people who consented to be your test subjects, and in controlled environments. "Deep learning" is no excuse.
Certainly in the long run we would want to test in the real world. The argument is whether or not we are there yet. Other commenters have made a compelling argument, again based on the statistics, that we are not there yet. Do you disagree, and if so, based on what facts?
I never made an excuse nor said we should be testing these on live streets.
What I did say is that with deep learning it simply is not possible to "test" in a simulated environment to any level of certainty.
You can use simulated environments with test subjects to help develop such a system (and should do so). But you will never be able to adequately test such a system in this environment.
Why? The state space of traditional robotics and engineering problems is far more constrained. So much so that you can't even compare the fields in my opinion.
> "it’d be morally unacceptable to allow public deployment of software that has not been sufficiently tested"
Does "sufficiently tested" mean you get the end-product a year later? What if the end-product saves 10,000 lives a year? In that case what cost is morally acceptable to get there a year faster?
You're misinterpreting me. I meant "nobody" can make such statements. As in, if you (somebody) have a product and you (somebody) haven't tested it enough, you (somebody) can't state that it will save N lives per year.
EDIT: Here is a link to a thread replying to a parent which is now auto-collapsed. https://news.ycombinator.com/item?id=16620968