Comic Discussion > QUESTIONABLE CONTENT
WCDT strips 3671 to 3675 (5th to 9th February 2018)
SpanielBear:
Comic!
A moment's silence, please, for the many Bothans who died to bring Faye this revelation.
Alas, their sacrifice was in vain.
TheEvilDog:
I felt a great disturbance in the Forum, as if millions of shippers suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened.
tut21:
--- Quote from: SpanielBear on 07 Feb 2018, 19:52 ---Comic!
A moment's silence, please, for the many Bothans who died to bring Faye this revelation.
Alas, their sacrifice was in vain.
--- End quote ---
That's okay. The important part is that Faye briefly considered the idea for the first time. It will return.
Aenno:
--- Quote ---They teach themselves, and by doing so are able to do things that merely programmed AIs cannot.
--- End quote ---
So that just means Bubbles put this program into herself. I'm just using the cybernetic terms the AIs themselves speak in, but I'm entirely okay with a "who taught Bubbles about arousal?"
--- Quote ---We have every reason to suppose that the AIs in QC would teach themselves arousal, some by looking at porn and some by reading novels.
--- End quote ---
Look into porn or a novel. What do you see?
1. A vague description of an inner emotional state that can't really be translated even to another human - it's the qualia problem; we can't directly transfer sensory information. But, as we all have human bodies, we can safely assume the common reader/watcher has felt arousal at least once.
2. A physical, and quite often very detailed, description of physiological effects. To learn about those, it's actually better to look into a medical book.
What exactly can be learned from such materials by a person who has never experienced arousal? Have you ever tried to explain to a child, with words and images, how an orgasm feels?
--- Quote ---Philosophy is an incredibly frustrating degree).
--- End quote ---
I know, right? :)
--- Quote ---Reasonably one could assume one could also do the same with a robotic mind- find the string of code that indicates arousal.
--- End quote ---
...and once that's achieved, we have uploading - a means of translating a human mind into a robotic chassis. Which is directly stated to be an impossible feat (at least so far) in the QC universe.
--- Quote ---I can only compare it to what I feel when I use that word, and react accordingly. Unless something happens that shows I am making a mistake, any differences just wont be apparent.
--- End quote ---
And that's the thing with us both being humans - we have a similar base to start with. You have skin, I have skin, our skins are composed of basically the same chemicals, and they have basically the same structure. When you say "I have a burn and now it's painful", I have actual reason to believe your experience is somehow close to mine. It doesn't work every time, but empirically it gives humans enough common ground to understand each other.
Imagine we have an AI put into a toaster. It has never had nociceptors, an organic brain, or a cortisol reaction. When it speaks of pain, I can't know what it means, and I haven't any reason to believe it means something like MY pain, because I know how pain works and I know the toaster lacks the parts for it. That doesn't make it lesser, but it does make it alien.
That's why I'm absolutely okay with AIs developing higher or abstract emotions. Love, happiness, suffering, sadness, aesthetic sense, religious sense - sure. Fear, loss, anger, sympathy, loneliness, affection - sure. I do believe those kinds of things are based on self-reflection and introspection.
--- Quote ---When she talks about anger, sadness or arousal, she demonstrates human-like behaviours that correlate with those words.
--- End quote ---
I believe she never talked about arousal. ;)
--- Quote ---So we are stuck with treating them like metal people, which, to be fair, in universe is how they want to be treated. Just because two minds are made differently tells us nothing about what it's like to have either of them, if functionally they are identical.
--- End quote ---
And so it goes - it's exactly the problem that bothers me like hell, so much so that I even registered here on the forum to write a big post about it. :)
SpanielBear:
I get that. I guess the reason it doesn't bother me as much is that while it's true we humans share biology, there are so many cultural differences between our experiences that running into the qualia problem is pretty inevitable. I didn't use the BDSM example in my last post for no reason; narrowing feelings and emotions down to just one sensation is incredibly complex, maybe impossible. Take anger. The word is descriptive, but the feeling could be overwhelming rage, the manifestation of stress, or the result of not having eaten all day. And that's assuming we can even identify our own emotions with any accuracy. The end result is that we have to make these assumptions and form incomplete pictures of others, and as long as it works we muddle through.
Let's go back to your toaster. Which is not a sentence I thought I'd ever write, but never mind. If the toaster reports feeling pain, you are right that it almost certainly is experiencing different qualia to you. But the question then becomes, what is it trying to say? Because assuming it is an intelligent toaster, it has the same knowledge you do- it knows you feel things differently than it does. So when it complains about pain, what reaction does it hope to get? Why say that?
My assumption is that although the feeling is different, the function is similar. Whatever it is feeling is intense discomfort and requires immediate attention. As it is distinct from human pain, it could signal it differently by using a different word: "I am feeling Gubrily". But what's the point of saying that to an English-speaking human? The toaster gains nothing by the distinction. But if it has knowledge of humans and English, it could see that saying "I am feeling pain" would achieve its goal- the human could help make the sensation go away.
Bubbles has needs which are functionally emotional, and is operating in a human society where those needs are only going to be met if she can communicate them. The exact qualia she experiences are ultimately irrelevant, as long as the communication works between her and those she associates with. On a wider level, AIs can make that communication as subtle or as explicit in their programming as they like- so they can use their words, or also program in non-verbal communication cues that work just as well. Human language is there, so while they are with us they have no reason not to use it.
I don't think you're wrong to find this bothersome. Humanity doesn't have an alien alternative point of reference in this way, and it is fascinating to think about what that relationship would be like. But for myself, I find the short-hand sufficient. The AIs pass the Turing test well enough that they don't trigger the uncanny valley for me. But I think that is ultimately subjective.