
Ethics of Self-Driving Cars

Putting AI behind the wheel raises many questions and concerns, and with this technology on the verge of becoming our reality, we must examine the ethics of self-driving cars.

By Ossiana Tepfenhart · Published 7 years ago · 4 min read

Self-driving cars are slated to revolutionize the way transportation happens. No longer will people have to worry about wasting time finding parking or arguing over whose fault an accident was.

If you listen to the hype, transportation will be safer, cheaper, and more convenient than ever with self-driving cars. Even major ride-hailing services like Uber are working on creating fleets of self-driving cars to ferry people to their destinations more affordably.

But along with the rise in popularity comes a rise in concerns about the morals and ethics of self-driving cars. Are we really ready to tackle the hard-hitting questions that come with having AI behind the wheel?

Whose fault is an accident?

One of the most obvious problems with self-driving cars arises in the event of an accident. If a driver collides with a self-driving car, is the accident the car's fault or the driver's? Moreover, how are accidents involving driverless cars even supposed to play out in court?

Self-driving cars cannot argue their logic in a court of law, and plenty of things can happen that could cause a driverless car to kill someone. The car's owner may not even be behind the wheel, so are they still responsible?

No matter how you look at it, there's something morally wrong with the driverless car debate. If the owner of the car is held responsible, it can be argued that you're punishing an innocent person, because they weren't the one driving. Yet, if no one is held responsible, that doesn't seem fair either.

If the driverless car is headed towards a crowd and "knows" it can't stop in time, who dies?

Programmers working on driverless cars are having a tough time figuring out how to handle the possibility of a lose-lose scenario. If the car enters a situation in which it must hit a toddler, a little old lady, or a guardrail, which way does it steer?

If you program the car to hit the child or the granny, then they will be injured through no fault of their own. If you program the car to hit the guardrail, there's a very good chance that the riders of the car will be injured.
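To make the dilemma concrete, here is a minimal, purely hypothetical sketch of what "programming the choice" could look like. The maneuvers and harm scores below are invented for illustration only; they do not reflect how any real manufacturer's software actually works.

```python
# Purely illustrative: a toy "lose-lose" decision rule.
# The maneuvers and harm scores are hypothetical assumptions,
# not how any real self-driving system is programmed.

# Each possible maneuver maps to an estimated harm score (lower = "less bad").
hypothetical_outcomes = {
    "swerve_toward_pedestrian": 0.9,   # likely injures a bystander
    "swerve_into_guardrail": 0.6,      # likely injures the passengers
    "brake_straight_ahead": 0.8,       # likely hits whoever is in the lane
}

def choose_maneuver(outcomes):
    """Pick whichever maneuver has the lowest estimated harm score.

    The uncomfortable part is that someone had to decide, in advance,
    how each kind of harm gets weighted."""
    return min(outcomes, key=outcomes.get)

print(choose_maneuver(hypothetical_outcomes))
# -> "swerve_into_guardrail" under these made-up weights
```

Whichever weights end up in that table, the code will dutifully injure somebody, and that is precisely the point of the objection.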

Once again, this is a problem with no good solution; no matter what the cars are programmed to do, it will be a morally reprehensible choice. This scenario is very unlikely, but it could happen in theory, and it raises real concerns in any discussion of the ethics of self-driving cars.

Regulations in the works may even try to mandate self-sacrifice for driverless car buyers, but obviously, potential buyers may not want that. After all, who wants a car that would kill them?

What happens if hackers take control of a driverless car?

Imagine you are a high-ranking government official who's about to go public about something extremely corrupt happening at the state level. In recent days, you've gotten warnings to step down and keep your mouth shut. You're worried about people trying to hurt you.

Even so, you are going to move forward with your life despite it all. The morning of your major corruption exposé, you get into your self-driving car. It starts up and begins driving you toward the capitol building... and then it starts speeding up past the 65-mile-per-hour speed limit.

It continues to accelerate, clocking past 100 miles an hour, and you can't get out because the doors won't open. Then the car drives off a cliff, falling hundreds of feet. The last thing you see before you die is the steering wheel hitting your face.

Yes, this can happen.

If a driverless car is hacked, there's nothing to say it won't end up remotely controlled by someone who wants to hurt you, or simply go rogue, which raises yet more questions about the ethics of self-driving cars. Moreover, this isn't even a new concept; the CIA has been tinkering with the idea of hacking car computers for ages.

If driverless cars become commonplace, the hackers who break into them will also grow in number. Who's to say that someone won't decide to off someone else using those hacking skills?

And what about tickets?

Tickets are an important source of local government revenue, but that's not the moral problem where driverless cars are concerned. Morally, people shouldn't be dangerous drivers anyway; that's why tickets were invented. But when it comes to the ethics of self-driving cars, tickets become a hot topic.

What happens if a driverless car parks in an area where parking isn't allowed? Since a person wasn't behind the wheel, does that mean the owner shouldn't pay? Moreover, why should they have to pay for a ticket when the fault was the car's?

A question with no answers?

No matter how you look at it, driverless cars may end up causing major problems for drivers and pedestrians alike. The ethics surrounding self-driving cars make you ask, "Is the invention really worth it?"


About the Creator

Ossiana Tepfenhart

Ossiana Tepfenhart is a writer based out of New Jersey. This is her work account. She loves gifts and tips, so if you like something, tip her!

