cross-posted from: https://derp.foo/post/81940
There is a discussion on Hacker News, but feel free to comment here as well.
The whole argument for self-driving cars is that they don’t make stupid mistakes. Making those mistakes merely less often than humans isn’t a good argument.
I am putting my trust in the technology. It needs to work, otherwise I’m not going to use it. Working 99% of the time is not good enough, especially when the failure happened because the system simply wasn’t good enough.
Being better than a human driver is exactly what they need to be. Currently, they fail at somewhat different things than human drivers do, so if we combine both we get the fewest mistakes. The car never sleeps, and the human never mistakes a bridge column for the road.
That’s not the argument for self-driving cars at all. The argument for self-driving cars is that people hate driving because it’s a huge stressful time sink. An additional benefit of self-driving cars is that computers have better reaction times than humans and don’t stare at a phone screen while flying down the freeway at 70 mph.
If we find that self-driving cars get into, say, 50% fewer serious accidents per 100 miles than human drivers, that would mean tens of thousands fewer deaths and hundreds of thousands fewer injuries. Your objection is that this isn’t good enough because you demand zero serious accidents? That’s preposterous.
Is there any progress on the question of who is responsible for such accidents? Of course fewer accidents are desirable, but if the manufacturer is suddenly responsible for deaths rather than the human behind the wheel, then there is a very big incentive to have zero serious accidents. If the system is not perfect from the start, at some point the industry and the government have to decide how to handle this.
I believe the manufacturer should be liable for damage caused by their product due to manufacturing defects and faulty software. This incentivizes manufacturers to make the safest product possible to reduce their liability. If it turns out that it’s not possible for manufacturers to make these cars safe enough to be profitable, then so be it.
I don’t think exceeding 99% reliability is realistic with self-driving cars… There’s always going to be a margin of error with anything.
Probably. But we can do way better than what we have now.