The Moral Dilemma of Self-Driven Cars

in StemSocial, 7 months ago (edited)


According to a report published by the WHO in 2015, around 1.25 million people die annually in car accidents. Since roughly 94% of these accidents are caused by human error, the idea of removing human error from the equation emerged: cars that drive themselves, designed to make errors almost never.

Self-driven cars won't only make the roads safer; they will also make traffic more efficient. It's a concept supported by nearly everyone who is aware of it. But how would you implement it?

Technology isn't the Issue

When the idea was first introduced, the technology may not have been advanced enough to make it real. Today, we are closer than ever. The real issue is how the car should behave in certain situations.

The Trolley Problem of Self-Driven Cars

The trolley problem is the perfect place to start discussing this issue. When forced into a decision, how should the car behave when it must hit either five people or one? Five people or three? Five children on one side and five adults on the other? Or, just when you thought you were safe, what if one of the options sends you, the passenger, to your death?

Philosophy Schools in Self-Driven Cars

According to Jeremy Bentham's utilitarianism, the car should make the decision that causes the least total harm to society. According to Immanuel Kant, however, if you intentionally pre-plan a decision to harm someone (and the car's decision would have to be programmed in advance), then you are effectively committing murder; if we go with the third option, you are committing suicide. Kant really leaves us no way out except to let fate decide.
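To make the contrast concrete, here is a toy sketch of the two decision rules. Everything in it is hypothetical (the option names, the harm scores, the idea that ethics could be reduced to a scoring function at all); real autonomous-driving software is not written this way, and this is only an illustration of the philosophical difference.

```python
def utilitarian_choice(options):
    """Bentham's rule: pick the option with the least total expected harm."""
    return min(options, key=lambda opt: sum(opt["harms"]))

def kantian_choice(options):
    """Kant's objection: any pre-programmed decision to harm someone is
    impermissible, so only a harm-free option may ever be selected."""
    permissible = [opt for opt in options if sum(opt["harms"]) == 0]
    return permissible[0] if permissible else None  # None = "let fate decide"

# Two hypothetical outcomes: swerve (harms the passenger) or stay the course
# (harms five pedestrians). Each harm entry represents one casualty.
options = [
    {"name": "swerve", "harms": [1]},
    {"name": "stay",   "harms": [1, 1, 1, 1, 1]},
]

print(utilitarian_choice(options)["name"])  # -> swerve (1 casualty beats 5)
print(kantian_choice(options))              # -> None (no permissible option)
```

The utilitarian rule always returns an answer, even when that answer sacrifices the passenger; the Kantian rule simply refuses to choose, which is exactly the "no way out except fate" position described above.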

Kantism or Utilitarianism: Which One Do You Prefer For Your Car?

When asked which of these two schools is better to follow, most people answer with Bentham's utilitarianism, which makes sense: in an accident, the goal is always to minimize the number of victims. However, the same people also say they would not buy a car that might sacrifice them in order to save as many people as possible.

Everyone wants self-driven cars to be utilitarian in nature unless, of course, they are the ones riding in them. This creates a problem for manufacturers. If they build a car that sacrifices the driver whenever that minimizes the total number of victims, no one will buy it. If they make the car prioritize its passengers, they harm the public interest; such a decision would provoke public outrage and could expose the manufacturer, and perhaps even the customer, to lawsuits.

In 2016, when asked how their cars would behave in such situations, most companies avoided the question. A Volvo spokesperson said that their cars don't get into accidents. In Mercedes's case, however, Christoph von Hugo, the head of the safety department, answered:

If you know you can save at least one person, at least save the one in the car.

The statement triggered an outrage, and Mercedes ended up retracting von Hugo's words, which had been meant to reassure its customers.

The Dilemma Once More

The dilemma is that we want a future that guarantees our safety, but that can't happen without self-driven cars. Yet people won't buy a car knowing that, in very rare cases, it would sacrifice their lives to save other people. And if it doesn't do that, the manufacturer invites public outrage. That takes us full circle, back to our usual way of driving, which causes 1.25 million deaths a year, 94% of them avoidable.

In Conclusion

Maybe this conversation is still ahead of its time. However, three years ago we actually had the first self-driven car accident and fatality. With a touch of irony, the car itself was made by "Our cars don't get into accidents" Volvo. The accident also happened while the car was being supervised by an Uber safety operator. The question is: who do you blame for such an accident? It's a question that even the Maricopa County grand jury couldn't fully answer; although it indicted the safety driver, the trial has yet to start because of the complexity of the issue.

There's definitely a little intellectual luxury at play here, since these accidents are very rare. Critics argue that most of these scenarios practically never occur, and that humans simply find a certain joy in diving into such dark hypotheticals; as long as we remain there, these critics suggest, we won't move forward.

That's my take and I look forward to reading your opinions and takes on the subject.


Great read!
One day we will actually get to a point where self-driven cars are quite common, perhaps even the norm. Technology moves fast nowadays.

But I think what will forestall the issues you raised is to develop the transport system and network as a whole to improve the safety of self-driven cars. It's not just about working on technology in the cars to make them well-suited enough. We would have to bring technological advancements to the roads too. Perhaps we could create networks on the road and recognition points for cars such that they can safely run on them.

For instance, a train would always keep to its rails because the rails were built for it.
In a similar manner, a transport network with features built for the smooth running of self-driven cars would be the best option for its success.

Your breakdown actually makes sense. I guess the complication would be that the switch between regular cars and self-driven cars won't be instant. But your point about trains stands; maybe the way cars move should also be rethought.

I have trust issues. Cannot totally entrust my life into the hands of some AI. Also, I think driving is particularly fun for me, especially if the road is smooth.

Also, if you add the possibility of hacking, it could turn into a nightmare. I didn't mention it because it was off-topic.

And you and I are alike in that, driving is really fun and cathartic. I can't see myself enjoying just sitting there for an hour or so.

At the end of the day, it is not very different from entrusting your life to the hands of other drivers, is it?

That would be so if you are not driving yourself.

I actually meant that even if you are the best driver in the world, the fact that others are on the road too puts you at risk. Depending on what they do, things can indeed get wild. For that reason, having AI deal with it is similar (and maybe a little bit better), at least in my opinion.

To me, this is all a matter of probabilities and risks. If, globally, the number of deaths on the roads decreases, then self-driving cars are a good thing. For this reason, the dilemma is not really a dilemma, from my own perspective. The technology is of course not risk-free (no one claims it is), but the risks are reduced. End of story. I may however be a little bit extremist here ;)

As I describe at the end, it is a luxury thought. Each scenario requires a lot of variables to line up; I doubt all of them together add up to even 1% of accidents. But it is a thought I found intriguing, especially since, as I mentioned in the post, there was a fatality caused by one of these cars.

You're not an extremist, you are right. If self-driven cars become a thing, we are literally talking about saving more than a million lives annually. I am just a sucker for minor details lol.

Thank you for your comment. I was about to start exploring the posts on this community as you commented.

I think my reasoning is quite rational. But rationality is often not what dictates life in the society :)

Welcome around. I was personally quite away for some time, although I have some time this week. I hope to be able to constantly sneak around from now on, but I cannot guarantee it. We will see.

Well, then that means my post was one of the ones you snuck around to, lol.

I think my reasoning is quite rational. But rationality is often not what dictates life in the society

It definitely should. But overall, this is why I brought this up for discussion.

It was selected by STEMsocial. This helps to attract me ;)

Hahahaha. Honesty is all you could ask for from anyone, really.

I'd love to drive my own car. No thanks to self-driven cars. Great article!

Right?! Like, I really want to drive my own car around and listen to music, the whole experience.

Cars are meant to be driven. It's liberating when you drive your own car.

Couldn't have said it better.

Wow, nice technology.

Okay.


One of the problems that I see is giving up freedom for safety. This could be very dangerous for our society. Your ideas make a lot of sense as long as freedom is kept as the most important concept.

The thing you didn't mention here is the danger of these cars getting hacked. I'm well aware that not everyone has experienced something as horrible as what we went through when we were hacked some years ago. But I can assure you that if someone did, and had no sense of privacy left for a certain period, they would forever change their way of thinking about anything AI-related.

AI can be used for great things, but at the same time, evil people can do very bad things with it. Cars should be programmed safely, but anything with a computer can be hacked, and that would not make me feel safe at all if it were the only option left for driving.

I personally hope that that era is still years away honestly. I'm not ready for that :)

Hi Amir. Self driving cars don't sound safe to me.

You are one of the few who has even voiced this issue, I think. Do we want to eliminate the human factor entirely? The answer will define the future.

The thought of autonomous cars always makes me think of the Johnny Cabs in the original Total Recall movie. lol

I guess over time, if the streets were full of these cars, things could be much safer and more efficient in moving traffic.

Self-driven cars are coming really fast. As far as who should survive, I think a random pick is the fairest thing to do LOL.