Page 7 of 11 (Results 61 to 70 of 103)

Thread: What moral decisions should driverless cars make?

  1. #61 Southern Dad
    Quote Originally Posted by Pragmatist View Post
    I don't think the driverless car is going to do any swerving. It's going to stay in its legal lane of traffic and do what it can to stop. If it runs over a few children, that's a tragedy, but it won't be at fault, just as if you were driving. If it swerves and runs over a lady on the sidewalk, then it's clearly the software's fault that the lady was killed. Who deserves to die more, the kids who should not have stepped into the street or the lady who was doing nothing wrong?
    You missed the entire point of the TED Talk and the discussion; it completely went over your head. The point is that the computer makes the decision about whose life to save and whose to sacrifice. Self-driving cars already have the ability to swerve and leave the roadway to avoid an accident.

  2. #62 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    I agree that driverless cars, once fully implemented, would save many lives each year. But this thread is about the tradeoff where the car decides that your life should be sacrificed to save other lives.
    I think what would save more lives would be breathalyzers in all cars: you have to blow before starting, or the car won't start.

  3. #63 Southern Dad
    Quote Originally Posted by Pragmatist View Post
    I think what would save more lives would be breathalyzers in all cars: you have to blow before starting, or the car won't start.
    I see, so someone who doesn't drink, like myself, would have to purchase this device and pay to have it added to my car? What about marijuana? Will it check for THC, too?

  4. #64 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    No doubt it would be safer, but what if your car sacrificed your life to save others?
    I would say they are designed never to do that. If the thing is not going to consider me more important than anything else, then I sure as shit am not using it.

  5. #65 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    You missed the point, Amelia. The car in the scenario is about to hit some pedestrians or turn into something that will most likely kill you, the driver and owner of the car. Maybe the pedestrians darted out from between buildings or something. If you were driving, you would be faced with the same decision: turn into the pole or hit the pedestrians. But the point is that the car, using its intelligence, might reason that killing you is the lesser of the evils.
    Nope, it will plow into the pedestrians. It's not designed to start breaking the law to save anyone, or it then becomes the one at fault. If it cannot stop, then neither could you. Don't play in the street, because someone's computer is not going to save you.

  6. #66 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    Okay, what if it were a group of Trump supporters wearing Make America Great Again hats and they darted out? Sacrifice your life for theirs? The scenario is that the car makes the decision, taking it out of your hands.
    No, the car slams on the brakes, and with any luck at all it will be too late.

  7. #67 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    I agree, the programming could make your life worth less than a pedestrian's to the computer.
    It won't break the law to save a pedestrian, or the software will be worthless. The first time it swerves and the result is an accident, no one will use it. It's not going to be able to tell the difference between a child and a poodle; it will do its best to stop, and if it can't, it will be the fault of the poodle or the child.

  8. #68 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    I'm not trolling. I tried to pick something that I thought you might consider less important than your life. Let's say it was a group of KKK members who stepped out into the street. The computer determines there are six of them and only one of you. It crashes into a pole, killing you because you are the lesser loss of life. Make sense?
    And who is going to use it if you are telling me that when a bunch of assholes break the law and jump into the street in front of me, it might decide that their lives are more important than mine?
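    To make the rule being argued about concrete, here is a minimal sketch in Python of the head-count logic described in the quoted scenario: compare the expected deaths on each available path and take the path with the fewest. The function name, path names, and numbers are hypothetical illustrations for this thread, not any manufacturer's actual code.

[CODE]
# Illustrative sketch only (assumed names and numbers, not any vendor's real logic)
# of the "lesser loss of life" rule from the quoted scenario: compare the expected
# deaths on each candidate path and pick the path with the fewest.

def choose_path(paths):
    """Return the candidate path with the fewest expected deaths, occupant included."""
    return min(paths, key=lambda path: path["expected_deaths"])

# Post #68's numbers: six people in the road versus one occupant in the car.
candidates = [
    {"name": "stay_in_lane_and_brake", "expected_deaths": 6},  # hits the group
    {"name": "swerve_into_pole", "expected_deaths": 1},        # kills the occupant
]

print(choose_path(candidates)["name"])  # prints: swerve_into_pole
[/CODE]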

  9. #69 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    The computer is able to determine the best course of action to take. The point is that action might not be best for the owner or driver.
    I think it will take the best course of action for the guy in the car in all instances, or people won't buy it. That may include running over some pedestrians.

  10. #70 Pragmatist
    Quote Originally Posted by Southern Dad View Post
    The swerve off the road that kills the driver happens because the computer determines that the pedestrians are the higher safety priority. It is a quandary that programmers would be faced with figuring out. Do you kill one, even if it happens to be the driver, to save a dozen?
    Nope, the law-abiding citizen is not going to be sacrificed; the computer will remain within the law and try to stop. If it can't, we have roadkill, and it won't be the computer's fault. If your daughter were walking down the sidewalk and a car jumped the curb and ran her over because some kids made the mistake of playing in the street, would you have a case against the software manufacturer? You're damn straight you would. If it were your child playing in the street who was killed, would you have a case then? Nope. The software is going to be designed to NOT be at fault, and jumping curbs to save pedestrians is not an option.
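    For contrast, the policy argued for throughout this page can be sketched as a simple rule: stay in the legal lane, brake as hard as possible, and never leave the roadway. This too is only an illustrative sketch; the class and function names are invented for this example and do not come from any real autonomous-driving stack.

[CODE]
# Illustrative sketch only (invented names) of the rule-based policy described in
# this post: stay in the legal lane, brake hard, and never swerve onto the sidewalk,
# whatever ends up in the road.

from dataclasses import dataclass

@dataclass
class Perception:
    obstacle_in_lane: bool   # pedestrians, animals, or debris ahead in the lane
    can_stop_in_time: bool   # judged from current speed and remaining distance

def decide(p: Perception) -> str:
    if not p.obstacle_in_lane:
        return "continue"
    # Braking is the only permitted response; leaving the lane is never considered,
    # so legal fault stays with whoever entered the roadway unlawfully.
    return "brake_hard" if p.can_stop_in_time else "brake_hard_and_accept_collision"

# The scenario from the thread: people step into the road too late to stop for.
print(decide(Perception(obstacle_in_lane=True, can_stop_in_time=False)))
[/CODE]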

