
Thread: What moral decisions should driverless cars make?

  1. #1
    Veteran Member Southern Dad's Avatar
    Joined
    Feb 2015
    Posts
    32,124
    Thanks
    6743

    From
    A Month Away

    What moral decisions should driverless cars make?

    Listening to some archived TED Talks, I came across this one. Talk about a very interesting situation. Picture yourself in a driverless car when a situation occurs in which an accident is unavoidable. Should the car decide to cause your death in order to save a crosswalk full of people? Sound far-fetched? Picture an autonomous vehicle driving down an interstate when a drunk driver enters the interstate headed the wrong way. There isn't enough time to stop. Does the autonomous car go off-road, risking your life; hit the oncoming car head-on; or take out the car in the lane next to you?

    What moral decisions should driverless cars make? | Iyad Rahwan (TED Talks)

    All of a sudden, the car experiences mechanical failure and is unable to stop. If the car continues, it will crash into a bunch of pedestrians crossing the street, but the car may swerve, hitting one bystander, killing them to save the pedestrians. What should the car do, and who should decide? What if instead the car could swerve into a wall, crashing and killing you, the passenger, in order to save those pedestrians? This scenario is inspired by the trolley problem, which was invented by philosophers a few decades ago to think about ethics.

    Now, the way we think about this problem matters. We may, for example, not think about it at all. We may say this scenario is unrealistic, incredibly unlikely, or just silly. But I think this criticism misses the point, because it takes the scenario too literally. Of course no accident is going to look like this; no accident has two or three options where everybody dies somehow. Instead, the car is going to calculate something like the probability of hitting a certain group of people. If you swerve in one direction versus another, you might slightly increase the risk to passengers or other drivers versus pedestrians. It's going to be a more complex calculation, but it's still going to involve trade-offs, and trade-offs often require ethics.

    We might say then, "Well, let's not worry about this. Let's wait until the technology is fully ready and 100 percent safe." Suppose that we can indeed eliminate 90 percent of those accidents, or even 99 percent, in the next 10 years. What if eliminating the last one percent of accidents requires 50 more years of research? Should we not adopt the technology? That's 60 million people dead in car accidents if we maintain the current rate (roughly 1.2 million road deaths per year worldwide, times 50 years). So the point is, waiting for full safety is also a choice, and it also involves trade-offs.
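    The trade-off calculation the talk describes can be made concrete. Below is a minimal Python sketch of probability-weighted harm minimization; the maneuver names, collision probabilities, and casualty counts are all invented for illustration, not taken from any real system.

        # Minimal sketch of the probability-weighted trade-off described above.
        # All maneuver names, probabilities, and casualty counts are hypothetical.

        options = {
            # maneuver: (probability of a collision, expected casualties if it happens)
            "stay_in_lane": (0.95, 3),  # e.g. the pedestrians ahead
            "swerve_right": (0.40, 1),  # e.g. a single bystander
            "swerve_left":  (0.70, 2),  # e.g. oncoming traffic
            "brake_hard":   (0.60, 1),  # e.g. the passenger, or a rear-end hit
        }

        def expected_harm(maneuver):
            p_collision, casualties = options[maneuver]
            return p_collision * casualties

        # Pick the maneuver with the lowest expected harm.
        best = min(options, key=expected_harm)
        print(best, expected_harm(best))

    Note that the ethics live in the numbers: weighting the passenger's life differently from a pedestrian's changes which maneuver wins, which is exactly the trade-off the talk is pointing at.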

  2. #2
    Spock of Vulcan Ian Jeffrey's Avatar
    Joined
    Mar 2013
    Posts
    50,498
    Thanks
    23920

    From
    Vulcan
    I am not sure a driverless car would be able to make moral decisions or value judgments.
    Thanks from Friday13 and Rasselas

  3. #3
    Veteran Member DebateDrone's Avatar
    Joined
    Jul 2014
    Posts
    31,258
    Thanks
    26426

    From
    SWUSA
    It should make the same kinds of decisions a human driver does.

    If my vehicle is going to have a head-on collision, then my car should choose to move to the right to avoid the initial impact.

    Moving to the right does not guarantee I will not cause another accident, but that is not because of AI; that is because I am traveling among other cars.

    My car should never go left, as it would be crossing the median into oncoming traffic.

    If there is a bus stop full of kids on the right that my car hits in avoiding the head-on collision, those kids were there at that particular moment whether I had AI or not.

    No system or human can foresee every possibility or outcome of a split-second decision.
    Last edited by DebateDrone; 31st October 2017 at 08:11 AM.

  4. #4
    Veteran Member bajisima's Avatar
    Joined
    Mar 2012
    Posts
    41,688
    Thanks
    24510

    From
    New Hampshire
    Quote Originally Posted by Ian Jeffrey View Post
    I am not sure a driverless car would be able to make moral decisions or value judgments.
    No, but it could make logical ones. Like the scenario above: artificial intelligence might decide that sacrificing one is better than hitting several.
    Thanks from Southern Dad

  5. #5
    Veteran Member bajisima's Avatar
    Joined
    Mar 2012
    Posts
    41,688
    Thanks
    24510

    From
    New Hampshire
    Quote Originally Posted by DebateDrone View Post
    It should make the same kinds of decisions a human driver does.

    If my vehicle is going to have a head-on collision, then my car should choose to move to the right to avoid the initial impact.

    Moving to the right does not guarantee I will not cause another accident, but that is not because of AI; that is because I am traveling among other cars.

    My car should never go left, as it would be crossing the median into oncoming traffic.

    If there is a bus stop full of kids on the right that my car hits in avoiding the head-on collision, those kids were there at that particular moment whether I had AI or not.

    No system or human can foresee every possibility or outcome of a split-second decision.
    Agree, except that an autonomous vehicle has computers and sensors all processing at once; a human can't do that. When we see an oncoming car we might swerve right, not knowing what's there. An autonomous vehicle knows what's there; the sensors have already recorded it. So it could make a move elsewhere that a human might not, or even slam on the brakes and risk a rear-end collision instead.
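    Put in code, the point is that the car can filter its escape routes against sensor data it already holds, rather than swerving blind. A hypothetical Python sketch (all field names, thresholds, and values are invented for illustration):

        # Sketch of sensor-informed maneuver selection: the vehicle already
        # knows what occupies each escape path before the emergency begins.
        # All field names, thresholds, and values are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class SensorState:
            right_shoulder_clear: bool  # lidar/radar: anything on the right shoulder?
            left_lane_oncoming: bool    # is the left lane carrying oncoming traffic?
            follower_distance_m: float  # gap to the vehicle behind us, in meters

        def choose_maneuver(s: SensorState) -> str:
            # Unlike a human swerving blind, the car rules out options it
            # already knows are occupied, as described in the post above.
            if s.right_shoulder_clear:
                return "swerve_right"
            if s.follower_distance_m > 30.0:  # enough gap behind to brake hard
                return "brake_hard"
            if not s.left_lane_oncoming:
                return "swerve_left"
            return "brake_hard"  # least-bad default: shed speed before impact

        print(choose_maneuver(SensorState(True, True, 12.0)))  # -> swerve_right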
    Thanks from Ian Jeffrey

  6. #6
    Nuisance Factor boontito's Avatar
    Joined
    Jan 2008
    Posts
    86,272
    Thanks
    61450

    From
    out of nowhere!
    We had this discussion at work a while ago. It could be as simple as evaluating risk: the computer detects three people stepping off the curb in front of you, the program evaluates the scenario, concludes one death is better than three, and swerves you into a rock wall.

    There's your death panel right there, Sarah Palin.
    Thanks from bajisima

  7. #7
    Veteran Member bajisima's Avatar
    Joined
    Mar 2012
    Posts
    41,688
    Thanks
    24510

    From
    New Hampshire
    Quote Originally Posted by boontito View Post
    We had this discussion at work a while ago. It could be as simple as evaluating risk: the computer detects three people stepping off the curb in front of you, the program evaluates the scenario, concludes one death is better than three, and swerves you into a rock wall.

    There's your death panel right there, Sarah Palin.
    I would have to imagine that's likely. A computer can't decide a moral issue, nor can it think "your" life is any more important than anyone else's. So it will just use logic: one death is better than three, that sort of thing.

  8. #8
    Spock of Vulcan Ian Jeffrey's Avatar
    Joined
    Mar 2013
    Posts
    50,498
    Thanks
    23920

    From
    Vulcan
    Quote Originally Posted by bajisima View Post
    No, but it could make logical ones. Like the scenario above: artificial intelligence might decide that sacrificing one is better than hitting several.
    Maybe, though even that is a value judgment.

    But it is one a court would make, too. In law school we studied a case where a person was operating a train with failed brakes. The choice was to go one way and hit six people, or the other way and hit one person. The train operator was not held liable for the one person's death, because he had no option that would have saved everyone, and he chose the course that spared the six at the cost of the one. (I cannot find any of my notes on that case, and my old text is in a box.)
    Thanks from boontito, bajisima and Friday13

  9. #9
    Veteran Member bajisima's Avatar
    Joined
    Mar 2012
    Posts
    41,688
    Thanks
    24510

    From
    New Hampshire
    Quote Originally Posted by Ian Jeffrey View Post
    Maybe, though even that is a value judgment.

    But it is one a court would make, too. In law school we studied a case where a person was operating a train with failed brakes. The choice was to go one way and hit six people, or the other way and hit one person. The train operator was not held liable for the one person's death, because he had no option that would have saved everyone, and he chose the course that spared the six at the cost of the one. (I cannot find any of my notes on that case, and my old text is in a box.)
    True. I think the biggest mental obstacle for humans will be that it's out of our hands; the car will decide. I have known people who swerved or made a maneuver knowing they had a baby or small child in the back: they put themselves more at risk to keep the child from taking the full brunt. A vehicle won't think like that. It will just decide, not realizing the driver might consider the children more important than themselves.

  10. #10
    Veteran Member bmanmcfly's Avatar
    Joined
    Oct 2014
    Posts
    13,847
    Thanks
    2297

    From
    C-A-N-A-D-A-Eh
    Quote Originally Posted by bajisima View Post
    Agree, except that an autonomous vehicle has computers and sensors all processing at once; a human can't do that. When we see an oncoming car we might swerve right, not knowing what's there. An autonomous vehicle knows what's there; the sensors have already recorded it. So it could make a move elsewhere that a human might not, or even slam on the brakes and risk a rear-end collision instead.
    Or the software might lock up, having to process too much in too little time.
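    That worry is a real constraint in real-time systems, and the standard answer is a hard time budget with a safe default rather than a lock-up. A hypothetical Python sketch of the pattern (the budget, options, and scoring are all invented for illustration):

        # Sketch of an "anytime" planner with a hard time budget: if evaluating
        # options takes too long, fall back to a safe default (hard braking)
        # instead of locking up. The budget and options are hypothetical.

        import time

        TIME_BUDGET_S = 0.05  # e.g. 50 ms to commit to a maneuver

        def plan(candidates, evaluate):
            deadline = time.monotonic() + TIME_BUDGET_S
            best, best_score = "brake_hard", float("inf")  # safe default
            for option in candidates:
                if time.monotonic() >= deadline:
                    break  # out of time: go with the best option found so far
                score = evaluate(option)  # lower score = less expected harm
                if score < best_score:
                    best, best_score = option, score
            return best

        # Usage: even if `evaluate` is slow, plan() always returns a maneuver.
        print(plan(["stay", "swerve_right", "swerve_left"], lambda o: len(o)))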
