WCDT: 2500-2504 (29 July - 2 August 2013) Weekly Comic Discussion Thread
Is it cold in here?:
--- Quote from: J on 01 Aug 2013, 23:56 ---
--- Quote from: Valdís on 01 Aug 2013, 23:00 ---Basically I think Momo would introduce your junk to her bokken for suggesting it'd be fine for humans to do such things. As people, they aren't hypothetical or abstract concepts; they're their own characters.
--- End quote ---
i'm not. i'm asking whether anthropocentric moral sensibilities are still relevant for life-forms which are so completely different from us by nature.
Edit for clarity: a better way of putting that is to say that i'm asking whether our sense of morality is too anthropocentric to be usefully applied when dealing with synthetic life.
--- End quote ---
Which is a fascinating question.
It's a question that could be sidestepped by asking them how they want to be treated.
--- Quote from: the imaginary address to the UN ---The small beige box replied: “I would like to be granted civil rights. And a small glass of champagne, if you please.”
--- End quote ---
The answer to "How do you want to be treated?" will -- gasp! -- vary from one individual to the next.
If they have any motivations at all they'll have the equivalent of a desire for self-preservation, just to ensure they're alive to fulfill whatever other goals they have.
rschill:
--- Quote from: Valdís on 01 Aug 2013, 17:42 ---Well.. when I turn off my laptop it's often just going into sleep-mode, so maybe it was more of a time-out? Don't remember the particular strip.
Still immoral, though, even with the intent to unpause him in a bit. I mean, you can't just drug a human into passing out if they're annoying you, so no reason why you should get to do that to the faculties of an AI.
Friggin' meatbag privilege.
--- End quote ---
I'd rather be shut off for a bit than be stuffed in the freezer, duct-taped to the wall, or subjected to an owl attack during a blackout drunk.
The Pintsize zone is different. What other character could get away with half of what he does and still be accepted?
Barmymoo:
--- Quote from: wiserd on 02 Aug 2013, 00:51 ---Is it just me, or does May really not seem female? I mean, I'm not taking issue with the portrayal, since gender is likely a rather arbitrary construct for AI. But so far, Pintsize has seemed very 'male' and Momo has seemed very 'female.'
--- End quote ---
Maybe you just need to re-evaluate your gender constructs.
J:
--- Quote from: Is it cold in here? on 02 Aug 2013, 01:19 ---
--- Quote from: J on 01 Aug 2013, 23:56 ---
--- Quote from: Valdís on 01 Aug 2013, 23:00 ---Basically I think Momo would introduce your junk to her bokken for suggesting it'd be fine for humans to do such things. As people, they aren't hypothetical or abstract concepts; they're their own characters.
--- End quote ---
i'm not. i'm asking whether anthropocentric moral sensibilities are still relevant for life-forms which are so completely different from us by nature.
Edit for clarity: a better way of putting that is to say that i'm asking whether our sense of morality is too anthropocentric to be usefully applied when dealing with synthetic life.
--- End quote ---
Which is a fascinating question.
It's a question that could be sidestepped by asking them how they want to be treated.
--- Quote from: the imaginary address to the UN ---The small beige box replied: “I would like to be granted civil rights. And a small glass of champagne, if you please.”
--- End quote ---
The answer to "How do you want to be treated?" will -- gasp! -- vary from one individual to the next.
If they have any motivations at all they'll have the equivalent of a desire for self-preservation, just to ensure they're alive to fulfill whatever other goals they have.
--- End quote ---
in that case i'd like to be worshiped as an infallible and handsome god, if you please.
in (semi)seriousness though, 'civil rights' is an extremely broad term, and most of the things we consider 'civil rights' are themselves built around our own anthropic desires and needs.
is the 'right to life' relevant to an organism with no need for self-preservation? say, a robot built specifically for a task which is now complete. or is the right to personal autonomy relevant to an organism which only wants to follow orders? and if following orders is an intrinsic 'psychological' need, then is it unethical not to give it any?
and then there's the dualistic hardware/software independence of the AIs, which has no apparent parallel among humans at all. marten might not own pintsize, but he does own the shell that pintsize runs on, which he pays to power and maintain. does that ownership give him authority to choose the amount of RAM to be installed, or where it should be stored, or when to turn it on & off?
Tulpa:
80-90% of women are secretly trees.