
RE: First self-driving car fatal accident

in #technology · 7 years ago (edited)

So just to recap: humans need a driving licence, but self-driving cars do not even require a human? Is that even legal? Is anyone surprised this accident happened? And does the government just accept it as a necessary statistic for the development of the technology? Self-driving cars on roads with no pedestrians or other human-operated vehicles might work if they were all programmed to recognise and avoid each other, coordinated from a central control point like trains. How much of this kind of testing has been done without human drivers? So many questions, and not much actual research before they try to put these on public roads. Self-driving cars are a bit like asking cows to have road safety awareness. They are not quite there yet.
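If you squint, that "central control point like trains" idea is basically block signalling. A purely hypothetical toy version might look like the sketch below; the class and segment names are made up, and this is nothing like how any real deployment actually coordinates cars.

```python
# Toy "central control" coordination, loosely modelled on railway block signalling:
# a car may only enter a road segment once the dispatcher has reserved it for that car.
# Purely hypothetical -- real self-driving deployments do not work like this.

class CentralDispatcher:
    def __init__(self):
        self.reservations = {}  # segment_id -> car_id currently holding it

    def request_segment(self, car_id: str, segment_id: str) -> bool:
        """Grant the segment if it is free; otherwise the car must wait."""
        holder = self.reservations.get(segment_id)
        if holder is None or holder == car_id:
            self.reservations[segment_id] = car_id
            return True
        return False

    def release_segment(self, car_id: str, segment_id: str) -> None:
        """Free the segment once the car has cleared it."""
        if self.reservations.get(segment_id) == car_id:
            del self.reservations[segment_id]

dispatcher = CentralDispatcher()
print(dispatcher.request_segment("car_A", "main_st_block_3"))  # True: granted
print(dispatcher.request_segment("car_B", "main_st_block_3"))  # False: must wait
dispatcher.release_segment("car_A", "main_st_block_3")
print(dispatcher.request_segment("car_B", "main_st_block_3"))  # True: now granted
```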


They have been under testing for years now. 2,660 pedestrians were killed by cars driven by people with driving licences... Until we know what actually happened, you might want to Google some of your questions, as there are many answers out there if you take a look.

Here, let me get you started.
https://medium.com/waymo/waymos-fleet-reaches-4-million-self-driven-miles-b28f32de495a

On the nightly news tonight, some were asking why these cars are being tested in cities with major traffic congestion. Some thought they should be tested in remote areas before being placed on busy roads.

Good point. You would think they would be tested thoroughly in an empty car park with real-time simulations first?

Self-driving cars are already less accident-prone than humans on average, but that doesn't mean a self-driving car is less accident-prone than YOU. So, are YOU a good driver? Everyone SHOULD have a different answer to that question, but if you're in your 40s, have been driving since your teens, and have never been in a real accident, self-driving cars are probably not up to your standard, and thus less safe for you. But if you are an average or below-average driver, then self-driving cars already have a lower accident rate than you.
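The fair way to make that comparison is per mile driven rather than raw crash counts. The figures below are made-up placeholders, not real statistics; they only show the arithmetic behind "the fleet average can be better than an average driver but worse than a careful one".

```python
# Hypothetical per-mile comparison of crash rates.
# All numbers are placeholders for illustration only, NOT real statistics.

def crashes_per_million_miles(crashes: int, miles_driven: float) -> float:
    """Normalise a raw crash count by exposure (miles driven)."""
    return crashes / (miles_driven / 1_000_000)

human_average_rate = crashes_per_million_miles(crashes=2, miles_driven=1_000_000)   # 2.00
careful_driver_rate = crashes_per_million_miles(crashes=0, miles_driven=500_000)    # 0.00
self_driving_rate = crashes_per_million_miles(crashes=1, miles_driven=4_000_000)    # 0.25

print(f"average human : {human_average_rate:.2f} crashes per million miles")
print(f"careful human : {careful_driver_rate:.2f} crashes per million miles")
print(f"self-driving  : {self_driving_rate:.2f} crashes per million miles")
```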

Really? How can you say that when self-driving cars have next to no statistical driving history to compare against!?

But they are quite there already. You're not asking a car to do a cross-country race; you're asking an AI the widely studied and even perfected question, "How do you get from point A to point B in the least amount of time without hitting anything?" Sure, reading about a pedestrian being killed by an AI is unsettling. But it is simply not the machine's fault. Perhaps the headline should read "Pedestrian killed by error in self-driving AI programming". The humans behind the screen are at fault, not the car itself. Tesla is doing wonderfully with their self-driving cars because their AI team is wonderful, and you'd be surprised at just how consistently AIs have beaten humans at these sorts of tasks, both in speed and safety.
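For what it's worth, the "A to B without hitting anything" formulation really is textbook stuff. A toy grid search like the sketch below (a plain breadth-first search over a hypothetical obstacle grid, nothing like a production motion planner) is where every course on the subject starts.

```python
# Toy path planning on a grid with obstacles: a minimal breadth-first search.
# This is the textbook "A to B without hitting anything" problem, not a real planner.
from collections import deque

def shortest_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, avoiding 1-cells."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no obstacle-free route exists

# 0 = free road, 1 = obstacle (e.g. a parked car)
grid = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(shortest_path(grid, start=(0, 0), goal=(2, 3)))
```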

How does the AI know what speed to travel at? Can it read speed limit signs? And what if there are no signs?
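In practice the answer is usually layered: a detected sign, then map data, then a conservative default. The sketch below is only a guess at that fallback logic; the function names and limits are assumptions, not any vendor's actual system.

```python
# Crude sketch of layered speed-limit selection: posted sign first, then map data,
# then a conservative default. Hypothetical logic -- not how any real vendor's system works.
from typing import Optional

DEFAULT_URBAN_LIMIT_KPH = 30  # conservative fallback when nothing else is known

def choose_speed_limit(sign_reading_kph: Optional[int],
                       map_limit_kph: Optional[int]) -> int:
    """Pick the speed limit to obey, preferring a freshly read sign."""
    if sign_reading_kph is not None:
        return sign_reading_kph       # camera read a posted sign
    if map_limit_kph is not None:
        return map_limit_kph          # fall back to map / navigation data
    return DEFAULT_URBAN_LIMIT_KPH    # no sign, no map data: stay conservative

# Example: no visible sign, but the map says this street is 50 km/h.
print(choose_speed_limit(sign_reading_kph=None, map_limit_kph=50))  # -> 50
```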

A lot of these comments are going on the assumption that the self-driving car failed, but police are already saying it looks like the pedestrian was at fault. If this had been a human driver, it wouldn't have made national news. This is a tragic accident, but nothing more.

"Watch out for self driving cars because they will not break if you step in front of them" ?