easysurfer
Give me a museum and I'll fill it. (Picasso) Give me a forum ...
Saw a good discussion on the morning news about self-driving cars and the ethical decisions they'll have to make.
Can self-driving cars be programmed to make ethical decisions? - CBS News
Makes me think about Star Trek in "The Wrath of Khan" where Spock and Kirk weigh the needs of the many vs. the needs of the one.
Even more, if the decisions of ethical questions are programmed into self-driving cars in the future, doesn't that take away freedom of choice which is a big part of what makes us human? Do we want to give up that choice? Food for thought.
Self-driving cars offer many promises for an improved experience on the road. They can increase traffic efficiency, reduce pollution, and eliminate up to 90 percent of traffic accidents, developers claim. Unlike human drivers, they don't get distracted, they don't fall asleep or text behind the wheel, and they don't drive drunk.
But as autonomous cars move closer to hitting the roads, a new ethical dilemma is coming to light - how will they make potentially life-or-death decisions that are instinctive to human beings? "You're also going to have to program ethics in a way that society hasn't dealt with."