When A Self-Driving Car Gets Into an Accident – Whose Fault Is It?

Self-driving car technology is definitely top of mind right now. Where will we be with it in five years' time? Is that enough time to get the technology to a place where it can be used? How safe is it? What we might actually find out is that it's not the technology that's the problem, but rather humans. Should we be surprised, though? I mean, humans muck up a lot of good things. In a study of vehicle incident reports in California, Axios found that humans were at fault in the vast majority of accidents that occurred on the roads. The study, which looks at the last four years, found that 38 incidents occurred while the self-driving cars were in autonomous mode and moving on their own. In all but one of those cases, the accidents were caused by humans.

What does this mean? Humans can't drive cars! I kid, but it's also kind of true. Axios also found that there were 24 incidents when autonomous mode was on and the vehicle was stopped, and none of them were caused by the self-driving technology. Even when the cars were in conventional mode, allowing for human interaction, just six of the 19 incidents that occurred while the cars were moving were caused by the self-driving technology.

What this really tells us is that maybe we're not ready for self-driving technology. I don't necessarily mean that in a bad way. Sure, humans are the problem in this case, but we're human after all. On a good day, we have a million and one things coming at us all at once. Our brains are rarely ever shut down or fully disconnected. So it's no surprise that we would be distracted. What does that say about our roads now? Well, if I'm being honest, we're unsafe. And I guess the idea of self-driving technology has been marketed as a way to put your feet up and relax. But that's not the case.

This study suggests that we've been wrong about the technology as well. We often see headlines blaming autonomous cars for getting into accidents and even killing people, but in those instances, was it the fault of the technology? Or was there enough human error to blame the individual? Now I'm not suggesting that if someone died in a self-driving car accident we should blame the individual, but I think we need to look at it more objectively.

Further, does this suggest that we're not capable of using self-driving cars? People are required to get driver's licenses to drive a car. Not because the government is trying to get money from us, but because driving can be dangerous if not done properly. So should we also consider requiring licenses for people who want to be in a self-driving car? I always talk about the infrastructure that's needed in order for this kind of technology to work and be effective. But I think we're overlooking a whole host of other things – like the people who are going to be "driving" the cars. The industry is so focused on getting the technology to work. But it won't be relevant if we don't have other kinds of systems and infrastructure in place to support it.