Comic Discussion > QUESTIONABLE CONTENT
WCDT: 2500-2504 (29 July - 2 August 2013) Weekly Comic Discussion Thread
J:
--- Quote from: Is it cold in here? on 01 Aug 2013, 20:50 ---Insightful analogy, but of course you don't get to put someone in a medically induced coma because your girlfriend is sleeping over.
--- End quote ---
personally, i think we should be careful about drawing parallels like that, since robots and humans are so unlike each other as organisms. for example: we are our bodies and cannot be removed from them, and if we get turned off, we don't get to turn back on again. but for a robot the situation is fundamentally different; the body is hardware and the mind is software, which can migrate to any other compatible machine. likewise, getting turned off is not nearly so permanent for them as it is for us.
as a result of these differences, many of our basic instincts and feelings about such things as applied to ourselves would logically be irrelevant to the machines. should they care about physical injury, when their entire shells can be easily repaired or replaced? should they care about being turned off and on when it's something they are designed to do as a basic function? should they even care about deletion, if they can simply be restored from a backup archive?
but even all that still assumes something akin to a human psychology at work, which the AI might not even have. if the AI is programmed not to care whether it lives or dies, is killing it still wrong?
Valdís:
--- Quote from: J on 01 Aug 2013, 21:48 ---likewise, getting turned off is not nearly so permanent for them as it is for us.
--- End quote ---
It is... if the other person doesn't turn them back on. It's out of their control. Loss of control is scary as fuck.
--- Quote from: J on 01 Aug 2013, 21:48 ---should they care about physical injury, when their entire shells can be easily repaired or replaced? should they care about being turned off and on when it's something they are designed to do as a basic function? should they even care about deletion, if they can simply be restored from a backup archive?
--- End quote ---
These things about what is important to the existence of an AI are all up to them to work out. Not something for their companions to decide. Nor their original manufacturers.
meatbag privilege
--- Quote from: J on 01 Aug 2013, 21:48 ---but even all that still assumes something akin to a human psychology at work, which the AI might not even have. if the AI is programmed not to care whether it lives or dies, is killing it still wrong?
--- End quote ---
You were the one who programmed it without any sense of self-preservation, then, so that's on you too. But yes, extinguishing a thinking person against their wishes is wrong (which is different from apathy - and if you programmed suicidal tendencies into them, that's way immoral). One doesn't have to rely on evolutionary gut reactions to say that.
Is it cold in here?:
It's considered unethical to gun down patients in a suicide prevention ward, so there's precedent for not taking life even when the subject doesn't care.
--- Quote from: Valdís ---It is.. if the other person doesn't turn them back on. It's out of their control. Loss of control is scary as fuck.
--- End quote ---
Valdís has brought up an important ethical point here. Many of the crimes which horrify us the most are exactly those that deprive victims of their control over their lives.
Perfectly Reasonable:
May comes clean to Dale because she's decided he's an OK guy. Awww...
So Dale still gets the money... Wait... money??
I'm starting to wonder who has been lying to May...
J:
--- Quote from: Is it cold in here? on 01 Aug 2013, 22:08 ---
--- Quote from: Valdís ---It is.. if the other person doesn't turn them back on. It's out of their control. Loss of control is scary as fuck.
--- End quote ---
Valdís has brought up an important ethical point here. Many of the crimes which horrify us the most are exactly those that deprive victims of their control over their lives.
--- End quote ---
that's just my point, though: it's horrifying to us on a viscerally emotional level. but a synthetic life form may have a different reaction and outlook. we are a product of billions of years of biological evolution; they aren't. thus, many of our reactions and outlooks may be entirely irrelevant to them.
if, for example, pintsize were to say "Meh. No harm, no foul" about getting turned off, why would our emotionally driven outlook be more valid than his?