Do you program? How can you program and not be aware of the X deaths, Y privacy violations and pretend you aren't part of that system?
Is every programmer in the world partly responsible for every death caused by every programming bug ever? Should we blame all civic engineers every time a bridge collapses?
I am an individual, not "part of a system". Simply being a driver doesn't make me a reckless or dangerous one. For example, 10K of those deaths involve DUI, and I am certainly not part of that system. I'm sure 95% of drivers never kill anyone.
>I am an individual, not "part of a system". Simply being a driver doesn't make me a reckless or dangerous one. For example, 10K of those deaths involve DUI, and I am certainly not part of that system. I'm sure 95% of drivers never kill anyone.
Massive misunderstanding. As a driver your actions form part of the traffic around you; even responsible drivers are "attached" to the system and put pressure on it. Maybe one day somebody rear-ends you and you are knocked forward into a crosswalk, or you brake suddenly to avoid hitting a pedestrian and cause an accident behind you. Or maybe your very safe driving leaves enough room for somebody to take a foolish risk cutting in front of you and lose control of their car. Collectively, society accepts that getting places quickly is worth the cost we pay in human lives. And each road user accepts a small risk of death or injury beyond their control in order to get somewhere.
It's almost impossible to drive a car without threatening small children and other people with lethal force. People regularly wait on the sides of roads when every traditional custom, convention and current law demands that car drivers stop, but they never do it. "I want to get to the shops quicker" is not an excuse for levelling a gun at someone and asking them to please step aside.
Moreover, it is illegal to drive a car safely. If you try it, you will be pulled over. You may be fined or lose your licence. Try it and see. So yes, being a car driver does make you a dangerous one.
Perhaps the same standard should be applied to human drivers: if a human makes a mistake and kills somebody, the rest should be taken off the road for retraining.
I wonder how the issue developed when the automobile was first introduced. I remember something about needing to have a guy walking in front of the vehicle waving a warning flag, but apparently that didn't last long.
Yeah, possibly. On one hand, the Uber setup should be examined for flaws. On the other hand, we shouldn't get paranoid about the occasional traffic death when we've already decided (for human drivers) that this is acceptable collateral damage in the transport system.
The entire pitch for autonomous driving is that it's safer than human drivers. If that's false it's not clear why we should continue to allow Uber to test their work on public roads.
It could be safer than human drivers and still cause quite a few deaths. We can't evaluate such statistics from one accident. We don't even know if this particular accident would have been avoided by a typical human driver (there was apparently one behind the wheel of the vehicle).
That's reasonable while the system is in testing and in small scale use. If it ever becomes a major part of the transportation system, it won't be desirable to shut it all down every time there's an accident.
Not a meaningful counterargument here. Someone can not only drive, but even have a history of being an at-fault driver who caused a death, and still justifiably not believe that public roads should be used for private testing in which the profit is private but the harm is socialized, or that money compensates for lost lives.
And yet, people and companies undertake activities all the time that involve risks to life, and we do indeed attach a monetary value when those lives are harmed.
Or, alternatively, we do indeed have a cap to the amount of resources we are willing to expend to save a life - lives are not of infinite value.
What a completely terrible thing to say. This isn't about saving a life, this is about not killing somebody. Just because a company like Uber thinks it can make a lot of money doesn't mean it can simply take risks like these.
Every time you sell food, you take the risk of killing people if something goes wrong. And what about carrying people in planes? These risks are taken continuously, for profit. How is that different?
Both of those industries have tremendous regulations in place to prevent accidents and injuries. If someone gets salmonella poisoning and it is traced back to a company, there is a massive recall at the company's expense. Air travel is one of the safest modes of transportation available (statistically) because of the NTSB and the rules/regulations put in place after each and every accident.
Air travel is only safe because companies have taken these risks with people's lives. A society that takes no risks is a society that will achieve nothing new.
Air travel and eating at a restaurant are opt-in actions. To avoid this risk, you would have to opt out of the public road system, which you are effectively required to use.
We wouldn't accept it if people were killed by being sold poisoned food, at least not where I'm from. Plane crashes are investigated, licenses are suspended, and blacklists are kept; furthermore, software is tested and verified before it is used in production. We shouldn't accept excessive risks just because a profit can be made - see the regulations on truck and bus drivers.
But what makes you think Uber didn't test their software? When Boeing introduces a carbon-fiber fuselage, it is taking risks with people's lives. They do reasonable testing, but a technology isn't proven until it has been widely used for a long time. No risk = no innovation.
Since there is absolutely no binding federal regulation I don't have a lot of confidence that the level of testing is comparable to what's done for airplanes.
It's all well and good to not like the choice, but the choices still have to be made - how much are we willing to give up economically in order to reduce immediate risk to lives? Included in this must be the consideration that economic value can be used to save lives, through higher living standards and better health care.
Did you ever take an engineering ethics course? One of the things talked about is the monetary value placed on a human life. You can't make that value infinitely high or literally nothing can happen. You also don't want it super low.
We get it, it's sad and money isn't everything, but the world can't stop because of a single death. If there is legal liability here then it'll be handled.