Comic Discussion > QUESTIONABLE CONTENT
Something bothering me a lot
Is it cold in here?:
I think Bubbles is sincere by nature. When she teases, it's about something unimportant, not about something important like romantic feelings.
Aenno:
Again, I don't think Dora actually accused Marten of conscious lying. Or Claire, when they spoke about Pamela. Marten is definitely sincere by nature himself.
I'm trying to say that "All the evidence is that QC AI people feel emotions as genuinely as we do" isn't the best recommendation. We humans are known for our astonishing ability to deceive ourselves.
JoeCovenant:
--- Quote from: Aenno on 07 Feb 2018, 18:01 ---Again, I don't think Dora actually accused Marten of conscious lying. Or Claire, when they spoke about Pamela. Marten is definitely sincere by nature himself.
I'm trying to say that "All the evidence is that QC AI people feel emotions as genuinely as we do" isn't the best recommendation. We humans are known for our astonishing ability to deceive ourselves.
--- End quote ---
The whole point of this entire thing is this.
AIs are JUST like us, on the whole.
That's what you have to relate to.
If you CAN'T, then you are going to miss a lot of the strip's enjoyment.
Aenno:
--- Quote from: ckridge on 08 Feb 2018, 08:32 ---From these passages, I gather that you mean that the object of arousal may be entirely socially determined, since it does not seem possible that some section of DNA codes for attraction to chairs, no matter how curvaceous, cozy, plushy, and compliant; but that the sensation of arousal is 100% biological. That narrows down the field of argument a lot.
Let me propose a thought-experiment: Suppose someone goes to their doctor and says "I'm sexually dysfunctional. I desire my spouse intensely, but my body can't respond properly. The frustration is killing me." The doctor hooks the patient up to some instruments and directs them to think longingly of their spouse, and says "No, you are mistaken. Your erectile tissue is not tumescent when you think about your spouse, and since arousal is 100% biological, that means you aren't feeling desire. There is no problem here." Would the doctor's response be correct? If not, and if arousal is 100% biological, why not?
--- End quote ---
Ok, let's do this thought experiment first. Doctor, patient, body can't respond.
What does the patient mean by "body can't respond," if not "Doctor, my mind desires sex with my spouse, but I can't get sexually aroused by him/her"? Yes, that's exactly the problem: desire isn't 100% biological, sexual arousal is 100% biological, there is a region where they don't overlap, and the patient's problem is being stuck in that region.
Actually, it's quite a common problem.
And yes, in a way it's possible to try to retrain arousal by social means. It's very hard (and you're more likely to create a neurosis), but in principle it's the same as with appetite.
--- Quote from: ckridge on 08 Feb 2018, 08:32 ---You argue from analogy here, writing that since humans have bodies analogous to our own, we have better reason to believe that they have sensations like our own than we would have for believing that robots did, regardless of what robots claimed. This argument is invalid. We decided that those neural structures correspond to those sensations by asking humans what they felt and then seeing what neural structures are activated when they say they feel that way. The fundamental evidence was the assertion of a feeling. The neural structure's involvement in that feeling was deduced on the basis of the assertion. Denying someone else's assertion that they feel that way because they haven't got the neural structure would be disregarding equally good evidence for no good reason.
--- End quote ---
Not exactly.
We can have no information about the concrete neural mechanism, but when we're talking about another human, we can safely assume:
1. Ze has essentially the same physiology.
2. Ze has essentially the same origin.
If we reject these assumptions, the assertion of a feeling is rejected as well. That has actually happened more often in human history than I would ever have wanted - some higher emotions being denied on the grounds of a belief in fundamental difference.
Actually, here's a simple experiment. Go to http://www.cleverbot.com/ and ask "Are you sick?". There is a good probability it will answer "Yes I am" (it happened at least twice with me just now). That's an assertion of a feeling. Would you believe it?
--- Quote ---AIs in this universe are largely self-programming. They have learning programs and built-in goals, both quite flexible.
--- End quote ---
Mild correction - I don't think they actually have built-in goals. There is a moment when Marten says something like "I wonder why no robot around here does what it was designed to," and when Winslow objects that he does, Marten asks, "What were you designed for?" Winslow couldn't answer.
Also, as I see it, the AIs in QC are into self-determination and free will. I can't see how they would agree to built-in goals.
--- Quote ---AIs who are interested in associating with humans put on bodies for this purpose. Their bodies have automatic stress reactions producing simple, powerful mental events that are analogous to but not identical with ones humans have under similar circumstances. Just as with humans, these simple, powerful sensations are capable of a very wide set of possible interpretations depending on the context and on what part of the human sociocultural psychosexual matrix the robot has become embedded in. The same basic sensations may be experienced as fear, sadness, anger, pleasurable excitement, arousal, drunkenness, desire, or any combination of these depending on circumstance and on whom the robot has learned to be.
--- End quote ---
There are some real problems in this reasoning.
1. Robot bodies, as has been declared more than once, are built by humans.
2. We at least know about drunkenness: it is a conscious effort on the AI's part. In a nutshell, an AI becomes drunk when it wants to be drunk. It's not a body reaction (and actually it shouldn't be - robots are normally far more detached from their bodies than humans are), but a command from the mind to the body. The same should apply to sex - sure, we can emulate a reaction, but it would have to be a conscious effort by the AI. The AI would have to decide it wants to, download or write an app, set the app up (to get discriminating reactions), and launch it.
It's possible, why not; I said so before. But that means Bubbles already decided she wants sex with Faye (as part of the social experience, I believe), downloaded an app, set it up on Faye, launched it, and is now torturing herself - and it's a conscious decision on her part.
ckridge:
The question before us: "How do AIs have uncontrolled emotional responses resembling human sexual arousal in every detail?"
I have attempted to give an account of how that could plausibly come about. Because we are discussing a work in progress, every part of my account could turn out to be wrong at any point, but that is not a problem. All I have to establish here is that Jeph isn't writing absurdities, not which particular non-absurd thing he is writing.
Now, to address your comments.
>desire isn't 100% biological, sexual arousal is 100% biological
If you want to distinguish desire from arousal, that is fine with me. In that case, let us say that the emotional response resembling human sexual arousal is desire, and that it is similar in that it aches and different in that it does not involve the hardening of erectile tissue and so on. That is a complete answer to your question.
>Go to http://www.cleverbot.com/ and ask "Are you sick?". There is a good probability it will answer "Yes I am" (it happened at least twice with me just now). That's an assertion of a feeling. Would you believe it?
You are right that I would not. Let us say then that assertions about feelings are evidence of feelings only when made by a creature that can pass Turing tests as often as humans can. Let us add that it becomes well-nigh incontrovertible evidence when the creature can pass a Turing test and you can discover things about your own feelings by discussing their feelings with them and discovering that you have felt the same way without ever noticing.
>I don't think they actually have built-in goals.
Conceded, if you find it plausible that you could have a learning program that didn't start with some kind of goals, however mutable. We can leave this question up to the programmers and to evolution.
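This point - that a learning program has to start with *some* objective, however mutable - is a real one in machine learning. As a purely illustrative sketch (nothing to do with how Jeph's fictional AIs work), here is a minimal hill-climbing learner in Python; `goal` is my hypothetical stand-in for a built-in objective. Strip the goal function out and the loop has no way to prefer one behavior over another, so no learning can happen:

```python
import random

def learn(goal, steps=1000):
    """Minimal hill-climbing learner: nudge a parameter at random
    and keep the change only when the built-in goal improves.
    Without some goal function, "improves" is undefined and the
    loop cannot learn anything at all."""
    x = 0.0
    best = goal(x)
    for _ in range(steps):
        candidate = x + random.uniform(-0.1, 0.1)
        score = goal(candidate)
        if score > best:  # the goal decides what counts as progress
            x, best = candidate, score
    return x

# The goal can be mutable (swapped out between runs or even mid-run),
# but at every moment the learner needs *some* goal to select against.
result = learn(lambda x: -(x - 3.0) ** 2)  # goal: get x close to 3
```

The goal being flexible, as ckridge says, changes nothing about the structure: the selection step still needs something to select for.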
> Robot bodies, as has been declared more than once, are built by humans.
I don't see how this is an objection. Humans build machines with automatic responses to stress all the time.
> An AI becomes drunk when it wants to be drunk
That AIs sometimes voluntarily induce dizziness, slurred speech, and balance problems and interpret them as pleasure does not mean that they only occur voluntarily. Humans both induce them voluntarily for pleasure and suffer them as the result of fatigue, fever, anoxia, poisoning, and any number of other causes. This is not a problem for my argument that AIs can interpret a relatively small number of automatic physical stress responses in a large variety of ways, but rather supports it.
>But that means Bubbles already decided she wants sex with Faye (as part of the social experience, I believe), downloaded an app, set it up on Faye, launched it, and is now torturing herself - and it's a conscious decision on her part.
No, what Bubbles is experiencing is involuntary desire. (Note that it is involuntary, not uncontrolled. She is maintaining better control of it than most of us could.) Desire is involuntary, whereas sex is both voluntary and a skill. Bubbles voluntarily learned to be human enough to feel desire, but having done so has no choice about feeling it. She could no doubt change bodies or download an app to dull the ache, but of course she will no more do that than any of us would castrate ourselves or take psychotropic medications under similar circumstances, because at this point the desire feels like an essential part of who she is.
To answer your question: AIs do not, after all, have uncontrolled emotional responses resembling human sexual arousal in every detail, but they do sometimes have involuntary desires that closely resemble human sexual arousal, or human sexual desire, in many of their external emotional displays, though not in ways that require a flesh body.