That’s an interesting title. How can a machine have morals? Isn’t that something only humans have? Right or wrong, only a human can judge that, right? Well, sometimes my Golden Retriever seems to know when she’s been bad. So maybe it’s not just humans with a moral compass.
Talk of self-driving cars seems to be all the rage now. They’re more fuel- and energy-efficient, they give people with disabilities the freedom to travel at will, and we could redesign our cities for efficiency, since street space for parking cars won’t be necessary. That would be really cool: your car drops you off, then goes to park itself. Then it comes to get you when you’re ready to leave. Awesome.
But what about the moral aspect of self-driving cars? If a collision is imminent, does the car choose to swerve and hit pedestrians, or to potentially injure the passengers inside? Of course computers don’t have souls or a conscience, but they do have fancy algorithms designed by humans. There will be plenty of controversy surrounding the protection of those in or around autonomous vehicles. Do you save the people who paid lots of money to ride inside the car? Or do you look out for the greater good of society and save the group of pedestrians on the side of the road? What do you think?
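To make the dilemma concrete, here’s a toy sketch of what such a “fancy algorithm” could look like. This is purely illustrative: the maneuvers, harm scores, and weights are all invented for this example, and no real autonomous vehicle is known to work this way. The point is that someone, a human, has to choose the weights.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_harm: float    # estimated harm to people inside the car (0..1)
    pedestrian_harm: float  # estimated harm to people outside the car (0..1)

def choose_maneuver(options, occupant_weight=1.0, pedestrian_weight=1.0):
    """Pick the maneuver with the lowest weighted total harm.

    The weights encode the moral question from the post: raising
    occupant_weight protects the buyer inside the car, raising
    pedestrian_weight protects the crowd on the sidewalk.
    """
    return min(
        options,
        key=lambda m: occupant_weight * m.occupant_harm
                      + pedestrian_weight * m.pedestrian_harm,
    )

# Hypothetical emergency scenario with made-up harm estimates.
options = [
    Maneuver("brake hard, stay in lane", occupant_harm=0.6, pedestrian_harm=0.1),
    Maneuver("swerve toward the curb", occupant_harm=0.1, pedestrian_harm=0.7),
]

# Weighing everyone equally favors braking; weighting the occupants
# heavily enough flips the choice to swerving into the pedestrians.
print(choose_maneuver(options).name)                        # brake hard, stay in lane
print(choose_maneuver(options, occupant_weight=5.0).name)   # swerve toward the curb
```

Notice that nothing in the math decides the ethics. Change one number and the car makes the opposite choice, which is exactly why the question of who sets those numbers is so contentious.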
Read more about the self-driving car moral dilemma here.