As technology advances, we ask ourselves questions that would have seemed ridiculous scant decades ago. One of them looms directly ahead: Should automated, self-driving cars be legal, and under what conditions?
Not long ago, this query might have seemed absurd, like asking how to tax land ownership on Jupiter. Now, however, it is a very real possibility, one that companies, lawmakers and individuals all must face.
For starters, where does blame lie in the case of an accident? Can drivers be ticketed for losing attention while their cars drive themselves? Can a driver sue the company that created and installed the automated technology when it fails? Can intoxicated people use their self-driving cars to get home safely, or would that still be considered driving under the influence?
These are just a few of the countless questions raised by this oncoming innovation.
Additionally, there will have to be strict regulation of the self-driving technology to ensure that it is dependable and professionally made. As Time writer Adam Cohen puts it: “Just because your neighbor Jeb is able to jerry-rig his car to drive itself using an old PC and some fishing tackle, that does not mean he should be allowed to.”
At the end of the day, most of these questions boil down to safety. The central argument is that self-driving cars are simply safer. Driving while eating, applying makeup, texting or drunk: these and other operator errors cause most traffic accidents each year. The hope is that automated cars could eliminate these errors and prevent deaths. So far, the evidence supports it: Automated cars are involved in fewer accidents and deaths than cars driven by people. As we are often reminded, humans are imperfect.
According to the National Highway Traffic Safety Administration, there were 34,080 traffic fatalities in the United States in 2012. Behind that number are friends and family members who died needlessly. A study by the Eno Center for Transportation released Oct. 24, however, makes some reassuring claims. According to the study, 1,100 lives could be saved each year if just 10 percent of the driving population used self-driving vehicles. On its own, that number is remarkable; imagine how many lives would be saved if 50, 60 or even 100 percent of cars were semi- or fully automated.
To all appearances, this is fantastic news. If we perfect the automated car in the next couple of decades, we could save hundreds of thousands of lives. There can't possibly be an issue with these statistics.
Unfortunately, there kind of is.
The problem is that in saving lives, we will essentially be trading one set of lives that would have been lost for an entirely different set. Consider a person whose poor or inattentive driving would, on some tragic day, have cost them and their passengers their lives. Replicated proportionately across the population of the U.S., that is a great many deaths. Self-driving cars will do away with that danger, and those poor drivers will be saved.
The few who do die in automated accidents, however, will be entirely different people. A person in an automated car might be an excellent and responsible driver, yet an unfortunate technical or mechanical failure in the car could lead to their demise.
You might ask: “So what? We would still be helping countless people.”
The answer may seem simple, but what we will really be deciding is the worth of a human life. Can we trade one life for another? How about 100 lives for just one other? An inconsolable mother who has just lost a son in an accident may well say her son's life was worth 1,000 others. It is impossible to make these choices using broad generalities. This is a far-reaching and deeply consequential moral question.
Overall, the invention of automated cars hints at a brighter future, one in which fewer people are injured or killed needlessly on the road. The sprawling layout of our cities, towns and suburbs makes driving a necessity for most Americans, and self-driving cars might be just what we need to bring down the death toll. Still, the implementation of automated vehicles is not without moral implications.
It is our social responsibility to consider all of these ethical dilemmas and ask ourselves: What is the value of a human life?