Self-Driving Cars

This is a topic that has been fairly prevalent in automotive news, and I thought I would bring it up. Nowadays, automation is becoming more of a reality in our lives. Much of the time, we no longer have to put our own labor into everyday tasks; we simply click a button and a computer takes control. This brings me to Google, which, along with several other major car companies, has been developing an “autonomous vehicle”. This in itself raises some interesting ethical questions. For instance, how comfortable would you be with leaving your life in the hands of a computer-driven vehicle? Obviously, the concept is still in its very early stages of testing and won’t reach the market for a considerable time, but the idea might still trouble a lot of people.

There have been arguments that autonomous vehicles will drastically reduce traffic accidents because they would eliminate driver error. However, to what extent could an automaker argue that its vehicles are that safe, and when an accident does inevitably occur, who is to blame? In terms of moral rights theory, a person has a basic right to life, which implies that people have a duty not to harm them or take their life. Could we impose this duty on a computer, and how would we enforce it? If it is somehow possible to hack the integrated systems on these vehicles, that could be a recipe for disaster. Most of these cars are expected to communicate with one another wirelessly to make travel as efficient as possible, and if that communication were compromised, serious consequences could follow; the sketch below illustrates the kind of message authentication that could help. Would you say that autonomous vehicles are the right direction for the future of transit?
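To make the hacking concern a little more concrete, here is a minimal sketch of how a vehicle-to-vehicle message might be authenticated so that tampering is detectable. The message format, key, and function names here are hypothetical illustrations, not taken from any actual vehicle platform; real V2V security standards (such as IEEE 1609.2) rely on per-vehicle certificates and asymmetric signatures rather than the single shared key assumed below.

```python
import hmac
import hashlib
import json

# Hypothetical shared key for illustration only. Real V2V security
# uses per-vehicle certificates and asymmetric signatures instead.
SHARED_KEY = b"demo-key-not-for-production"

def sign_message(payload: dict) -> dict:
    """Attach an HMAC tag so receivers can detect tampering."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_message(message: dict) -> bool:
    """Recompute the tag and compare it in constant time."""
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

# A vehicle broadcasts its position and speed to nearby cars.
msg = sign_message({"vehicle_id": "car-42", "speed_mps": 13.4, "lane": 2})
assert verify_message(msg)

# An attacker who alters the payload without the key is detected.
msg["payload"]["speed_mps"] = 40.0
assert not verify_message(msg)
```

The point of the sketch is simply that integrity checks, not just encryption, are what stop a forged “the lane ahead is clear” message from propagating between cars.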
