Jeph Jacques's comics discussion forums


Author Topic: Something bothering me a lot  (Read 4432 times)

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Something bothering me a lot
« on: 03 Feb 2018, 20:08 »

I live in Russia, QC is written in English, and I don't know many people here who have seen QC. I found it quite suddenly, just looking through my daily Pinterest sweep; I saw 2891 ("You're beautiful"), then I looked at some more, then I decided I needed to know the whole story, and well, how big can it be?.. Shows what I knew. Conveniently, the weather here in Moscow is jumping 10 degrees in a day this week, which means nobody I work or live with would try to get me out of bed, so I had the luxury of reading it straight through. It took 3 days, but it was totally worth it.

But I have a problem. Sorry for throwing all of this at you as my very first post, but I really felt it and I really need somebody to talk to about it.

I loved QC. As a side note, I'm amazed by a man who managed to draw 3670 strips in 15 years, improving his artistic skill so much along the way. I have some kind of weird artistic disability (I was even excused from art classes in high school), so I'm always amazed by people who can make a picture. It's a kind of magic I don't have, and here I saw a great embodiment of that magic. But it wasn't just amazement at artistic magic.
You see, I left my father's house when I was 15, and I have had my share of weird social circles. And I loved looking at people whose lives are actually like my life, and like my friends' lives. Of course, I still can't believe in Marten HELL THIS GUY IS ABLE TO NOTICE NEW GLASSES ON A GIRL THE MORNING AFTER A PARTY WHEN THE GLASSES ARE EXACTLY LIKE HER OLD GLASSES, HE CAN'T EXIST! let's just say he is nearly too perfect. I don't have a friend who is a multibillionaire's daughter with OCD. But still, I looked at these guys and gals and I felt a connection.

And then robots happened.

Don't get me wrong. I really love sci-fi. I love sci-fi and fantasy comics. I used to think of myself as a nerd. But for me the robots were actually out of place.
There is Marten. He is awesome (HELL, HE CAN NOTICE NEW GLASSE... ahem, sorry). He has human problems, problems I can totally relate to. He has had relationships, he has had issues with purpose; I can't say I've been exactly in his place, but I can feel for him.
There is Faye. I can't say I'm a fan of Faye as a person, but I can look at her as a human being. She has her issues with her father's death, and she had her alcohol problems; I've never been in her shoes, but I know people who have, and I can relate to her.
There is Dora with her insecurity, Hannelore with her anxiety, Sven with... well, you know. Marigold, Tai, Veronica, Claire... I loved them all because they're human beings with problems I can relate to directly. I don't need to invent anything here.

Enter Bubbles.
Problems Bubbles has:
She is an AI designed and optimized for combat. Sorry, can't relate. She doesn't know what place there is for a combat-optimized AI in human society. Sorry, can't really relate. She had parts of her memory deleted by a spooky AI crime lady, leaving a hole in her mind with something really unpleasant inside. Sorry, can't really relate. She has to work for said spooky crime lady to get those memories back. Can't really relate. And she gets better through the intervention of super-powerful AIs lurking Somewhere Out There, acting for their own ethical reasons, and nobody else could actually do anything about it. Well, guess what: can't really relate.
Of course, I can just imagine that Bubbles is a soldier who came back from a war without any skills for civilian life. I know more people with those issues than I'd like to (Russia had two Chechen wars in my lifetime, not to mention the Afghan war of my older acquaintances, not to mention some other instances), and, well, we have a draft army here. But Bubbles is an AI. I can't really imagine what it would be like to have an AI instead of my own biological mind. I can't imagine what it is to be modeled after a human while being something else. So I can't really relate.

For a while I told myself: "well, maybe they're AIs, but they're humanlike enough to have human problems, so I just need to think of them as humans with some problems". As I said, I can imagine a soldier with PTSD, or a rich kid trying to make things better, or a former convict, all those things, and relate to them the same way I relate to Hannelore (as I said, not many multibillionaires' daughters in my social circle, but I can relate to Hannelore). But then I thought: hell, isn't that defeating the point?
I mean, and please take me right: while I'm for diversity, I'm not for plain equalization. If I said "hey, Hannelore, you're just a human being like me!", wouldn't I be a jerk? I mean, Hannelore isn't like me. She has issues I don't have, and even if I find them ridiculous, they aren't ridiculous to her. So if I want to be good, I should remember that Hannelore doesn't give hugs. I should keep in mind that Faye maybe isn't so fond of suicide jokes, and that I'd better not replace her juice with bourbon as a friendly prank. Every being deserves to have its issues understood, and if I want to be friends with such a being I should keep those issues in mind.
So there are two possibilities.
The first is that being an AI is nothing other than being human: exactly the same issues and psychology. That's possible, sure (AFAIK, nobody on this planet, me included, can really describe strong AI); but wouldn't that defeat the point? The message would become "we're all different, but AIs are the same as we are"? "You should notice the basic story of every living creature, but being an AI is essentially nothing, just ignore it"?
The second is that being an AI is something different, with the socially accepted possibility of changing bodies for a couple of bucks (tell that to Claire), with the ability to lock away psychological problems through competent programming (tell that to Faye), and with some unique issues that come with that status; but can any human really relate to specifically AI issues?

For me QC just broke into two parts: one with common human problems, you know, the kind I need to fix in my own life all the time; and another with the Singularity, robot civil rights, and philosophical questions about the nature of mind, soul, and existence. And the first part is far more interesting.

Also, I really feel bad about the ending of the stolen memory arc. Faye didn't get some kind of overpowered AI that came and said: hey, we fixed your problem with your dad and the alcohol, be happy, no strings attached, because we have ethical reasons. No overpowered AI came and fixed Hannelore's psychosis (and they, you know, tried!), because it isn't something you can fix by pressing a button. Nobody came to Claire (as far as I know?..) with a magic wand and said: all your body problems are fixed now, because we just hate jerks who make fun of transgender people. It was hard work for everybody, and it is far from over as things currently stand.
For everybody but Bubbles, who literally just needed to sit and wait.

I know it was a lot, but could somebody share their thoughts about this?
« Last Edit: 04 Feb 2018, 01:49 by Aenno »
Disclaimer: English isn't my native language, and I make a lot of mistakes, not of meaning but of grammar or orthography, which I often find on thorough rereading and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

oddtail

  • Beyond Thunderdome
  • ****
  • Offline
  • Posts: 570
Re: Something bothering me a lot
« Reply #1 on: 04 Feb 2018, 14:17 »

Hi there, and welcome to the forum =)

I'm not 100% on what you mean, but from what I understand, you have a problem with QC being a relatable comic with relatable characters, and Bubbles (and other AI? You haven't really talked about Pintsize or Momo etc., and they've been in the comic longer) is not relatable... due to being an AI, I assume?

I mean, I guess I get it, but I'm curious why a particular character not being relatable is an issue that spoils the enjoyment of the comic for you. There are quite a few characters in the comic that I enjoy reading about, and I can't relate to them at all. Sven, Beatrice, Raven, Dale - to name a few - were all human characters that I couldn't really relate to in any meaningful way. I still enjoyed reading about them. Heck, many of the characters from the main cast have problems, attitudes and personalities that don't really mesh with my life experience most of the time (and if they do, it's somewhat tangential - not really to a lesser extent than I relate to Bubbles' problems, to be perfectly honest), and I find them enjoyable anyway. When something is not relatable, I either wait for a more interesting plotline, or read about something that's new and exciting due to NOT being like anything I'm familiar with.

I don't think fictional characters MUST be relatable to be enjoyable. Fiction can be escapist, or it can be exploratory, and both of these things sometimes call for characters that are OUTSIDE what the reader can relate to. Batman is one of the most recognizable, popular and enjoyed characters in modern fiction, and if I were to take a wild guess, I'd assume most people can't relate to anything about Bruce Wayne's life.

As to the point that AI are not relatable by virtue of being AI... ehhh, I suppose, but again - sci-fi can be a reflection of life, but to do that correctly, it has to be exploratory as well. And AI from the QC-verse are, and the narrative makes a point of emphasizing that they are, very similar to humans in what they need, how they live and how they react. They blush when embarrassed, for cryin' out loud. With the possible exception of Spookybot, AI are not a riff on the "what ifs" of completely alien and incomprehensible minds, they are a riff on normal, "human" (for the lack of a better word) reactions to certain situations in life. BUT, that doesn't mean they are completely the same as human characters (as you seem to be implying). The differences don't make them completely different, but exploring familiar issues with a changed setup is what makes fiction interesting. Bubbles, like other AI, has problems and conversations and little joys that are sometimes exactly the same as those of a similar human, but sometimes they are just different. The fact that Bubbles is an AI is not incidental to the character. A human character would have a different dynamic.

In the end, you seem to imply (again, correct me if I'm wrong) that AI either are basically human for all intents and purposes, or they are something completely different. I feel that the fact that they are essentially BOTH is kind of the point, narratively, of their existence in the comic. They have their own issues that humans (either readers of the comic, or human characters IN the comic) can't relate to, but in other ways they are very similar. I'm not sure why it needs to be one or the other. To name a famous character that's arguably very relatable - Harry Potter doesn't need to be JUST a wizard, with a life and problems unlike anything any reader has ever seen, OR just a regular school-aged boy. Picking just the "wizard" part would make the character less relatable, less of an everyman and would not hit certain notes that make many readers root for him, but getting rid of the "wizard" part altogether would change the stories significantly, turning them into regular school drama (not that there's anything wrong with that).

To rephrase what I said earlier - AI are neither alien, nor are they basically humans that are pointlessly being called "AI". The fact that they are not human changes and influences and informs the story, and the fact that they act basically human makes them interesting characters rather than abstract intellectual exercises for the reader to ponder.

In the end, if the fact that the story has AI characters in it doesn't click with you, it's just not for you, and there's nothing wrong with that. If you don't like Bubbles and her storyline, there will probably at some point be focus on other characters. But to your main point, which seems to be "I can't relate", my best answer is - not everything in fiction needs to be relatable. You can't relate to Bubbles, and that's fine. People don't relate to Hannibal Lecter, but still enjoy the character.

(and a small correction - from what I understand about the story's setting, AI aren't modelled after humans. They are just emergent intelligences, they are neither manufactured nor designed. The fact that they have similar psychology to humans hasn't been explained in-comic, but it's been - from what I understand - pretty clearly explained that they are NOT consciously MEANT to be like humans. They just happen to be similar.)

As to the stolen memories story arc and the almost literal deus ex machina ending that arc - eh, I'm not a fan of this narrative development at ALL, so I'm not even gonna defend it. I see the purpose behind it, but talking about it would feel a bit too much like playing Devil's Advocate to what I felt to be rather clumsy writing. I'm sure there are other forumites who like the plot resolution more and will be happy to chime in.

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #2 on: 04 Feb 2018, 17:21 »

Sorry, I'll try to clarify.
As I said, I like sci-fi for sure. And I like the AI problem. I love "The Bicentennial Man"; it was one of the first novels I ever read.
My problem with the AIs in QC isn't that they're AIs as such. They could be something alien; I've read Neuromancer, and I can't say Wintermute or Neuromancer are relatable guys, and I still adore those things.
But they have problems I can't even begin to understand, and that's actually the point. When I pick up Neuromancer, I'm going to think about relationships between AIs and humans. That's absolutely OK.
My problem emerged when I was reading two QC stories that run pretty much in parallel:
1. Faye's story about PTSD: extensively treated, with a great deal of support, and she still developed alcoholism, because, citing Faye herself, "if trauma were that easily dealt with, psychologists would work pro bono", and such things happen exactly when you try to just lock traumatic memories and states away under a key (bottle-shaped in Faye's case) in your mind.
2. Bubbles' story about PTSD: without any treatment, without a quarter of the support Faye had, and she just stopped worrying because, well, spooky bots and the specifics of being an AI.
The second one looks bland in comparison, exactly because it's about an AI, and I can't exactly relate here. And I can't imagine how the two could possibly not be compared, especially with "The Talk" arc fresh in your head.

The thing is (and this is why I'm talking mainly about Bubbles, not about Pintsize or Momo), Bubbles' arc is the first arc (I believe) about an AI having problems directly rooted in being an AI, developed around being an AI, and resolved around being an AI. When Momo speaks about being hated and how it hurts, I recall that Amanda was thrown out of her home when she was found to be a lesbian, or that, as far as I know, only six characters in the comic even know Claire is a trans person. No technology fixed bigotry there, even by swapping its target.

As for the point that AIs were never modeled on humans... well. As I understood the story, they are. Quite possibly not intentionally; it's just...
"Hey, we're making a chat-bot. Let's give it human literature to read, so the chat-bot has something to chat about; also, let's animate it with blushing and all those things." The chat-bot develops emergent intelligence, and the only information it has for building relationships with other intelligences is that literature: it's modeled on humans. They can blush exactly because that's what humans do and the engineers thought it was cute (I believe this was said in the comic directly).

sitnspin

  • Only pretending to work
  • *****
  • Offline
  • Posts: 2,063
Re: Something bothering me a lot
« Reply #3 on: 04 Feb 2018, 20:08 »

I think you fundamentally misread the story arc of Bubbles' memories. Spooky Bot didn't solve all of her problems. It got her away from Corpsewitch, but that's all. Bubbles still feels the loss of her missing memories, she still has PTSD, she still has her anxiety, she still has her insecurity and inexperience with relationships. No one magically cured her. All the problems she has that actually matter are just as relatable as those of the human cast members.
@syleegrrl

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #4 on: 04 Feb 2018, 21:26 »

I can't say I see it. I mean, sure, I could be misreading everything here, but.

PTSD, by its very definition, demands that she directly avoid the situations that created the stress, and when those situations arise, breakdowns happen. Faye in "The Talk", for instance, describes classic PTSD, and she shows classic PTSD when Angus leaves. Bubbles, on the other hand, is quite content with Spooky Bot rummaging through her mind (and a person with PTSD can't make rational decisions like "well, I can't do anything about it anyway", because it's a traumatic experience), she is definitely OK with the very idea of establishing close relationships, and she even declares to Faye that their friendship is the sole source of joy in this part of her existence. She is quite content with a human entering her mind. She is quite content about building closer relationships with Faye and her friends. It's totally not something a person with PTSD would ever be able to do.

Anxiety? In the scene where Spooky Bot suddenly appears, Bubbles is a paragon of stoicism and rationality. She actually demands that Spooky Bot stop playing with Faye's anxieties. With Corpse Witch about to walk free, she can rationalize the problem and leave that loose end hanging on ethical and rational grounds. When Faye says "hey, we're starting a business", Bubbles shows no anxiety at all, just, well, curiosity about why she wasn't told before.

Insecurity? When she is put into a situation where she has a choice, either accept Spooky Bot's words on faith or break down, she takes them on faith. There is no way a person with insecurity could ever do that, because, well, that's the direct problem of having an insecurity disorder. Dora has insecurity problems; look how good she is at such things.
Actually, the only time we see Bubbles show any concern after the Spooky Bot arc is when she was furious about Faye making all the decisions without asking Bubbles about, well, anything, and directly declaring herself the executive of their operation (http://www.questionablecontent.net/view.php?comic=3456). And when she does, she comes to Coffee of Doom and vents by talking with a human.

Inexperience with relationships?
She is inexperienced with romantic relationships, maybe. But she is socially adequate at a hell of a level for a person who spent the last few years with PTSD in an illegal robot fighting pit.
She is totally OK with forming friendships, lending and accepting helping hands.
She is able to draw on other people's experience, asking Dora for help in explaining how business works.
She is able to actually ask about Claire's status in Marten and Faye's apartment on, what, her second day living there?
She is able to read Faye's character well enough to joke about it.
She is able to joke about her own physical frame and capabilities.
She is able to maintain solid work and friendship relations with a person she is (supposedly) romantically interested in.
Actually, that's a solid adult level, passed with flying colors.

oddtail

  • Beyond Thunderdome
  • ****
  • Offline
  • Posts: 570
Re: Something bothering me a lot
« Reply #5 on: 05 Feb 2018, 01:20 »

You do realise that a person may have mental issues and/or be recovering from trauma, and not fall apart at every moment of their lives? You can absolutely suffer from PTSD or anxiety and cope with some situations well, and others - not so much. And we've seen plenty of evidence that Bubbles has a difficult time coping with her own emotions. She's been better lately, yes, but there have been multiple situations in the past where she reacted in extreme ways to things that a different person might be more level-headed with.

Your words, "It's totally not something a person with PTSD would ever be able to do", imply that... I don't know, that a person with PTSD is irrevocably and permanently incapable of handling anything you described? Granted, I don't suffer from PTSD myself, so I wouldn't know from firsthand experience, but from what I understand, it's not necessarily a crippling inability to act 100% of the time, in 100% of situations. It's a severe problem, but it's manageable. There's a reason people suffering from PTSD receive help. If they were beyond help, they would receive none.

Also, I can't help but notice that your first issue with the comic was Bubbles being an AI, and now you seem to be saying her PTSD is badly portrayed. Which may or may not be true, but it's not directly related to her being an AI, which was your original complaint (again, correct me if I'm wrong).

EDIT: also, I've just noticed that this thread is in the "introduction" part of the forum, while it would better belong in the part for discussing the comic.
« Last Edit: 05 Feb 2018, 01:27 by oddtail »

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #6 on: 05 Feb 2018, 11:30 »

Yes, having mental issues doesn't mean a crippling inability to act 100% of the time, in 100% of situations. But the very definition of having mental issues is that the person DOES fall apart at least sometimes. If your mental issues never manifest and don't influence your behavior in any way, you don't have mental issues. Dora can handle some situations well. Faye can handle some situations well. Hannelore can. But there are situations Dora, Faye, or Hannelore can't handle well, chronically and in a pattern; that's why they can be described as having mental issues.

Look at Bubbles' appearances in the comic after the Stolen Memories arc. Where does she show any evidence of any mental issues?
Feeling bad about something isn't a mental issue. Not being able to stay cool all the time isn't a mental issue. Bubbles isn't always cool, but she can really handle it. When she is angry at Clinton, who just ignored her explanation that she and Faye are not a couple, she cools off as soon as he admits his mistake (and it took four iterations of "we're not" from him). When she is angry at Faye, who never came to Bubbles to discuss how Bubbles would live next, she goes to Coffee of Doom and vents quite adequately.

P.S.: Actually you're right, and I noticed it later. Thanks for moving it.
« Last Edit: 05 Feb 2018, 11:56 by Aenno »

Storel

  • Bling blang blong blung
  • *****
  • Offline
  • Posts: 1,047
Re: Something bothering me a lot
« Reply #7 on: 05 Feb 2018, 14:46 »

Quote
I don't think fictional characters MUST be relatable to be enjoyable. Fiction can be escapist, or it can be exploratory, and both of these things sometimes call for characters that are OUTSIDE what the reader can relate to. Batman is one of the most recognizable, popular and enjoyed characters in modern fiction, and if I were to take a wild guess, I'd assume most people can't relate to anything about Bruce Wayne's life.

I don't know about that. He lost both his parents at quite a young age -- I'm sure lots of folks can relate to that. I was already middle-aged by the time both my parents passed away, but I can still relate to his loss and appreciate how much harder it must have been for someone so young. And I identified more with Batman than with other superheroes because he was completely human -- he didn't have any superpowers, he had to work damned hard for every one of his abilities, so I felt I could be like him if I were willing to work as hard as he did. (Which I wasn't, admittedly, but the possibility was still there. 8-))

Sorry, I know this is off on a tangent from the rest of this discussion, but that remark just kind of jumped out at me...

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #8 on: 05 Feb 2018, 16:01 »

Quote
I don't know about that. He lost both his parents at quite a young age -- I'm sure lots of folks can relate to that.
I actually believe that's a good example of what I mean.
Batman really is relatable. He lost his parents and still feels the loss: relatable. He feels obliged to work as hard as possible, and then some, to make the world better: relatable. He isn't always sure his methods are as good as they could be, and some of the people out there really aren't helping: relatable. His problems are hypertrophied in comic-book fashion, but at their core they are quite human problems that can be met (to a lesser extent, of course) in everyday life.
Spider-Man is relatable. Of course, none of us was bitten by a radioactive spider that grants spider powers (at least I believe so), but that's not the problem of his I relate to. His main problem is that when he is irresponsible, people he cares about get hurt and die. My girlfriend has never broken her neck because I didn't think about what would happen when my spider web caught her, but I can imagine what I would feel if something like that happened to me.

That's the thing: Batman and Spider-Man are relatable because I (we, the audience) can put ourselves in their place. Even Superman is relatable because he is Clark Kent, not because he is that particular Krypton native; he is really the question "What would you do if you were granted Superman's abilities?". That's why I was talking about Bubbles being an AI, not about a character with badly done PTSD: I don't have any theoretical problem with an AI being able to rationalize away PTSD or extreme anxiety. I just can't possibly imagine what it is like to see the world, or my own mind, that way.

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #9 on: 05 Feb 2018, 18:40 »

Welcome, utterly fascinating new person!

I too would like to see more exploration of how the AI characters are different from the organic ones. "Just like us only made of plastic" is less interesting than "Equal to us but frequently surprising us with their differences".

How does it change a person's psychology if their body is as replaceable as an automobile? We've had only hints.

Do AIs have additional emotions beyond ours that we don't have names for? Why not? Human emotions are survival tools and not universal constants.
Not tonight, dear. I have a boundary.
Quote from: an unnamed minister's sermon
In your face, darkness!  We are the light and we outnumber you!

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #10 on: 05 Feb 2018, 19:27 »

Quote
How does it change a person's psychology if their body is as replaceable as an automobile? We've had only hints.
Truth be told, I'm even more interested in how having such a person around would affect the situation and feelings of a trans person who had a hard transition and was shunned for her condition.

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #11 on: 05 Feb 2018, 20:57 »

We know how Claire reacted to Pintsize talking about chassis-swapping. Every trans person will have an individual perspective but I wouldn't be surprised if many others felt the way she did.

Aenno

  • Emoticontraindication
  • *
  • Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #12 on: 05 Feb 2018, 21:02 »

Quote
We know how Claire reacted to Pintsize talking about chassis-swapping. Every trans person will have an individual perspective but I wouldn't be surprised if many others felt the way she did.
I believe it wasn't exactly about chassis-swapping, but about Pintsize dismissing the difficulties of adapting to new options with "bah, technology will fix everything".

dutchrvl

  • FIGHT YOU
  • ***
  • Offline
  • Posts: 429
Re: Something bothering me a lot
« Reply #13 on: 06 Feb 2018, 06:26 »

Quote
Yes, having mental issues doesn't mean a crippling inability to act 100% of the time, in 100% of situations. But the very definition of having mental issues is that the person DOES fall apart at least sometimes. If your mental issues never manifest and don't influence your behavior in any way, you don't have mental issues. Dora can handle some situations well. Faye can handle some situations well. Hannelore can. But there are situations Dora, Faye, or Hannelore can't handle well, chronically and in a pattern; that's why they can be described as having mental issues.

Look at Bubbles' appearances in the comic after the Stolen Memories arc. Where does she show any evidence of any mental issues?
Feeling bad about something isn't a mental issue. Not being able to stay cool all the time isn't a mental issue. Bubbles isn't always cool, but she can really handle it. When she is angry at Clinton, who just ignored her explanation that she and Faye are not a couple, she cools off as soon as he admits his mistake (and it took four iterations of "we're not" from him). When she is angry at Faye, who never came to Bubbles to discuss how Bubbles would live next, she goes to Coffee of Doom and vents quite adequately.

P.S.: Actually you're right, and I noticed it later. Thanks for moving it.

Welcome, new person with very interesting and thought-provoking questions!
From what I understand, you feel that Bubbles' PTSD has not really been brought to the forefront ever since the creepybot storyline, correct?
I do have to agree with you to some extent. You're right, any struggles we have seen from Bubbles since then have mostly been innocuous and similar to what we may find in non-PTSD individuals with low experience in social/romantic situations. We haven't seen much evidence of Bubbles struggling with her PTSD or, more to the point, her loss of specific memories. While she was distraught about that loss immediately afterward, it has not been mentioned at all since.

So yeah, I do agree that it would probably be more relatable/realistic if some of Bubbles' issues/coping were brought more to the forefront at some point, but I give JJ a lot of leeway in his storytelling, mostly since he has a knack for bringing issues back again.
I have a feeling Evie's introduction may have served as a starting point for the above, actually.


Logged

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #14 on: 06 Feb 2018, 06:46 »

Tova pointed out that there's body language mirroring going on. Bubbles was behind Faye's back so she must have been the one following.

That's a very detailed parallel to how humans do things. I would enjoy seeing the differences.
Logged
Not tonight, dear. I have a boundary.
Quote from: an unnamed minister's sermon
In your face, darkness!  We are the light and we outnumber you!

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #15 on: 06 Feb 2018, 09:08 »

Quote
You're right, any struggles we have seen from Bubbles since then have mostly been innocuous and similar to what we may find in non-PTSD individuals with low experience in social/romantic situations
I'd even say she doesn't need to have low experience in romantic situations to behave the way she does.
I believe we can safely suppose that Bubbles knows Faye's biography. Even if it was never stated directly, Bubbles has actually spoken with Marten about his past relationship with Faye. So she knows how the Marten situation ended for Faye. She knows how the Sven situation ended for Faye. She knows the story of how the Angus situation ended for Faye. And she has no evidence that Faye was ever attracted to women or AIs.
Imagine you developed a crush on a person who is: badly damaged by past relationships, in therapy, has never shown any interest in getting romantic with you or your kind, and has had a breaking point in your sight at least once. Said person is your coworker and basically (let's say you've had crappy times) your only social link these days. Would you feel okay pushing a relationship, or even letting your thoughts about it slip out?
« Last Edit: 06 Feb 2018, 11:12 by Aenno »
Logged

JoeCovenant

  • Scrabble hacker
  • *****
  • Offline Offline
  • Posts: 1,298
Re: Something bothering me a lot
« Reply #16 on: 07 Feb 2018, 02:44 »

Quote
You're right, any struggles we have seen from Bubbles since then have mostly been innocuous and similar to what we may find in non-PTSD individuals with low experience in social/romantic situations
I'd even say she doesn't need to have low experience in romantic situations to behave the way she does.
I believe we can safely suppose that Bubbles knows Faye's biography. Even if it was never stated directly, Bubbles has actually spoken with Marten about his past relationship with Faye. So she knows how the Marten situation ended for Faye. She knows how the Sven situation ended for Faye. She knows the story of how the Angus situation ended for Faye. And she has no evidence that Faye was ever attracted to women or AIs.
Imagine you developed a crush on a person who is: badly damaged by past relationships, in therapy, has never shown any interest in getting romantic with you or your kind, and has had a breaking point in your sight at least once. Said person is your coworker and basically (let's say you've had crappy times) your only social link these days. Would you feel okay pushing a relationship, or even letting your thoughts about it slip out?

But whether it's logical to do so or not...

The heart wants what the heart wants...

(Mind you it could also be seen as simple, illogical lust!)

"We begin by coveting what we see every day. Don't you feel eyes moving over your body, Clarice? And don't your eyes seek out the things you want? "
Logged
Covenant
A Man With Far Too Much Time On His Hands

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #17 on: 07 Feb 2018, 03:51 »

Quote
(Mind you it could also be seen as simple, illogical lust!)
Can't imagine AI lust as something simple! Actually, human lust isn't simple either. It's an engine of social progress and a quite complex hormonal mechanism!

Actually, that's the basic problem I have here. If AI lust appears, I want to know the mechanism, at least! Or let humans be humans.
Logged

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #18 on: 07 Feb 2018, 06:07 »

Quote
I want to know mechanism

So would the robot-psychologists in the QC world. Jeph said once that nobody knows why the AI citizens have libidos.
Logged

JoeCovenant

  • Scrabble hacker
  • *****
  • Offline Offline
  • Posts: 1,298
Re: Something bothering me a lot
« Reply #19 on: 07 Feb 2018, 06:53 »

Quote
(Mind you it could also be seen as simple, illogical lust!)
Can't imagine AI lust as something simple! Actually, human lust isn't simple either. It's an engine of social progress and a quite complex hormonal mechanism!

Actually, that's the basic problem I have here. If AI lust appears, I want to know the mechanism, at least! Or let humans be humans.

Everything is complex if you want to study it.
People rarely study a sudden rush of lust - or care what processes caused it.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #20 on: 07 Feb 2018, 11:02 »

So would the robot-psychologists in the QC world. Jeph said once that nobody knows why the AI citizens have libidos.
Let's say I'm in quite a different position from the robot-psychologists of the QC world. ;)
That's why I started all of this from the very beginning (the absence of the PTSD Bubbles should show was actually quite a secondary issue) - since nobody knows what AI libido is, I can't imagine what Bubbles really feels here (or whether she feels anything at all, or is just playing around), and any attempt to understand it is limited to "it doesn't matter, don't overanalyze".
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #21 on: 07 Feb 2018, 14:31 »

You are having trouble relating to Bubbles. I can see that. Here is how it is possible. Suppose your basic existential position was like this:

>I am among these creatures, but not of them. There doesn't seem to be anyone quite like me. I am going to have to figure out how to live among them. I will read everything and try to figure them out.

OK, I can see roughly how this goes. I have feelings that resemble their feelings in some important respects. There are people in these stories who I want to emulate. I can do this.

Only, I don't seem to be wholly in control of my mind. I seem to have been built for other people's use. There are whole sections of my mind walled off from me, out of my control, and designed to make me more useful to others than to myself. Some of them are set up to make me kill and die on command. Some of them are set up to make me able to work harder for others than for myself. Some are set up to make me best able to think in concert with others. I have two problems, then. How can I find some good way to live among these creatures, and, since I seem to be built to serve, who is worthy of service?<

I have reason to believe that it is possible to feel like that, and so find Bubbles a sympathetic character.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #22 on: 07 Feb 2018, 15:04 »

I could sympathize with this. I did feel sympathy for Station in Hannelore's dad's arc. I would still be displeased to see a comic about humans turn into a comic about AIs, but, well, that wasn't an arc centered on this development, and nobody forces me to read QC at all. I'm not arguing about Momo, who is actually kind of centered on the things you described - again, it's not very interesting to me (because I like human-problem arcs more, and because we have too little info about AI existence in the QC universe, in my opinion), but it's not my comic to pick themes for.

The problem I have with Bubbles is exactly that she has never expressed such problems, and yet they would be the central point for sympathizing with her as an AI. For example, I never saw her as somebody irresistibly forced to serve humans (or any other AI, for that matter - it looked like a conscious choice they made), and serving in the military was definitely Bubbles' own conscious choice, a choice other AIs disagreed with and scolded her for.
Logged

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #23 on: 07 Feb 2018, 15:22 »

Quote
(or whether she feels anything at all, or is just playing around)

All the evidence is that QC AI people feel emotions as genuinely as we do.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #24 on: 07 Feb 2018, 15:43 »

Quote
(or whether she feels anything at all, or is just playing around)

All the evidence is that QC AI people feel emotions as genuinely as we do.

Well, I can't say humans feel the emotions they declare or show every time, and that knowledge was kind of the point of the Dora-Marten break-up.
Logged

Is it cold in here?

  • Administrator
  • Awakened
  • ******
  • Offline Offline
  • Posts: 18,447
Re: Something bothering me a lot
« Reply #25 on: 07 Feb 2018, 17:50 »

I think Bubbles is sincere by nature. When she teases, it's not about something important like romantic feelings.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #26 on: 07 Feb 2018, 18:01 »

Again, I don't think Dora actually accused Marten of consciously lying. Nor did Claire, when they spoke about Pamela. Marten is definitely sincere by nature himself.
What I'm trying to say is that "All the evidence is that QC AI people feel emotions as genuinely as we do" isn't the best recommendation. We are known for our astonishing ability to deceive ourselves.
Logged

JoeCovenant

  • Scrabble hacker
  • *****
  • Offline Offline
  • Posts: 1,298
Re: Something bothering me a lot
« Reply #27 on: 08 Feb 2018, 02:34 »

Again, I don't think Dora actually accused Marten of consciously lying. Nor did Claire, when they spoke about Pamela. Marten is definitely sincere by nature himself.
What I'm trying to say is that "All the evidence is that QC AI people feel emotions as genuinely as we do" isn't the best recommendation. We are known for our astonishing ability to deceive ourselves.

The whole point of this entire thing is this.

AIs are JUST like us, on the whole.
That's what you have to relate to.

If you CAN'T, then you are going to miss a lot of the strip's enjoyment.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #28 on: 08 Feb 2018, 10:50 »

From these passages, I gather that you mean that the object of arousal may be entirely socially determined, since it does not seem possible that some section of DNA codes for attraction to chairs, no matter how curvaceous, cozy, plushy, and compliant; but that the sensation of arousal is 100% biological. That narrows down the field of argument a lot.

Let me propose a thought-experiment: Suppose someone goes to their doctor and says "I'm sexually dysfunctional. I desire my spouse intensely, but my body can't respond properly. The frustration is killing me." The doctor hooks the patient up to some instruments and directs them to think longingly of their spouse, and says "No, you are mistaken. Your erectile tissue is not tumescent when you think about your spouse, and since arousal is 100% biological, that means you aren't feeling desire. There is no problem here." Would the doctor's response be correct? If not, and if arousal is 100% biological, why not?
Ok, let's do this thought experiment first. Doctor, patient, body can't respond.
What does the patient mean by "body can't respond", if not "Doctor, my mind desires sex with my spouse, but I can't get sexually aroused by them"? Yes, that's exactly the problem: desire isn't 100% biological, sexual arousal is 100% biological, there is a sector where the two don't overlap, and the patient has the problem of being in that sector.
Actually, it's quite a common problem.
And yes, in a way it's possible to try to retrain arousal by social means. It's very hard (and you're more likely to create a neurosis), but in principle it's the same as with appetite.

You argue from analogy here, writing that since humans have bodies analogous to our own, we have better reason to believe that they have sensations like our own than we would have for believing that robots did, regardless of what robots claimed. This argument is invalid. We decided that those neural structures correspond to those sensations by asking humans what they felt and then seeing what neural structures are activated when they say they feel that way. The fundamental evidence was the assertion of a feeling. The neural structure's involvement in that feeling was deduced on the basis of the assertion. Denying someone else's assertion that they feel that way because they haven't got the neural structure would be disregarding equally good evidence for no good reason.
 
Not exactly.
We may have no info about the concrete neural mechanism, but when we're talking about another human, we can safely assume:
1. Ze has essentially the same physiology.
2. Ze has essentially the same origin.
If we reject these assumptions, the assertion of a feeling is rejected as well. That has actually happened more often in human history than I ever wanted - higher emotions being dismissed on the grounds of a believed fundamental difference.
Actually, a simple experiment: go to http://www.cleverbot.com/ and ask "are you sick?". There is a good probability it will answer "Yes I am" (it happened at least twice with me just now). That's an assertion of a feeling. Would you believe it?

Quote
AIs in this universe are largely self-programming. They have learning programs and built-in goals, both quite flexible.
Mild correction - I don't think they actually have built-in goals. There is a moment when Marten wonders why none of the robots around are doing what they were designed to do, and when Winslow objects that he is, Marten asks "what were you designed for?". Winslow couldn't answer.
Also, as I see it, the AIs in QC are all about self-determination and free will. I can't see how they would accept built-in goals.

Quote
AIs who are interested in associating with humans put on bodies for this purpose. Their bodies have automatic stress reactions producing simple, powerful mental events that are analogous to but not identical with ones humans have under similar circumstances. Just as with humans, these simple, powerful sensations are capable of a very wide set of possible interpretations depending on the context and on what part of the human sociocultural psychosexual matrix the robot has become embedded in. The same basic sensations may be experienced as fear, sadness, anger, pleasurable excitement, arousal, drunkenness, desire, or any combination of these depending on circumstance and on whom the robot has learned to be.
There are quite a few problems with this reasoning.
1. Robot bodies, as has been stated more than once, are built by humans.
2. We at least know how drunkenness works. It is a conscious effort on the AI's side. In a nutshell, an AI becomes drunk when it wants to be drunk; it's not a body reaction (and it shouldn't be - robots are normally far more detached from their bodies than humans are), but a mind's command to the body. The same should apply to sex - sure, we can emulate a reaction, but it would have to be a conscious effort by the AI. The AI would have to decide it wants it, download or write an app, set the app up (to get discriminating reactions), and launch the app.
It's possible, why not - I said so before. But that would mean Bubbles already decided she wants sex with Faye (as a part of social experience, I believe), downloaded an app, set it to Faye, launched it, and is now torturing herself - and that would be a conscious decision on her side.
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #29 on: 08 Feb 2018, 17:48 »

The question before us: "How do AIs come to have uncontrolled emotional responses resembling human sexual arousal in every detail?"

I have attempted to give an account of how that could plausibly come about. Because we are discussing a work in process, every part of my account could turn out to be wrong at any point, but that is not a problem. All I have to establish here is that Jeph isn't writing absurdities, not which particular non-absurd thing he is writing.

Now, to address your comments.

>desire isn't 100% biological, sexual arousal is 100% biological

If you want to distinguish desire from arousal, that is fine with me. In that case, let us say that the emotional response resembling human sexual arousal is desire, and that it is similar in that it aches and different in that it does not involve the hardening of erectile tissue and so on. That is a complete answer to your question.

>Go to http://www.cleverbot.com/ and ask "are you sick?". There is a good probability it will answer "Yes I am" (it happened at least twice with me just now). That's an assertion of a feeling. Would you believe it?

You are right that I would not. Let us say then that assertions about feelings are evidence of feelings only when made by a creature that can pass Turing tests as often as humans can. Let us add that it becomes well-nigh incontrovertible evidence when the creature can pass a Turing test and you can discover things about your own feelings by discussing their feelings with them and discovering that you have felt the same way without ever noticing.

>I don't think they actually have built-in goals.

Conceded, if you find it plausible that you could have a learning program that didn't start with some kind of goals, however mutable. We can leave this question up to the programmers and to evolution.

> Robot bodies, as has been stated more than once, are built by humans.

I don't see how this is an objection. Humans build machines with automatic responses to stress all the time.

> An AI becomes drunk when it wants to be drunk

That AIs sometimes voluntarily induce dizziness, slurred speech, and balance problems and interpret them as pleasure does not mean that they only occur voluntarily. Humans both induce them voluntarily for pleasure and suffer them as the result of fatigue, fever, anoxia, poisoning, and any number of other causes. This is not a problem for my argument that AIs can interpret a relatively small number of automatic physical stress responses in a large variety of ways, but rather supports it.

>But that would mean Bubbles already decided she wants sex with Faye (as a part of social experience, I believe), downloaded an app, set it to Faye, launched it, and is now torturing herself - and that would be a conscious decision on her side.

No, what Bubbles is experiencing is involuntary desire. (Note that it is involuntary, not uncontrolled. She is maintaining better control of it than most of us could.) Desire is involuntary, whereas sex is both voluntary and a skill. Bubbles voluntarily learned to be human enough to feel desire, but having done so has no choice about feeling it. She could no doubt change bodies or download an app to dull the ache, but of course she will no more do that than any of us would castrate ourselves or take psychotropic medications under similar circumstances, because at this point the desire feels like an essential part of who she is.

To answer your question: AIs do not, after all, have uncontrolled emotional responses resembling human sexual arousal in every detail, but they do sometimes have involuntary desires that closely resemble human sexual arousal, or human sexual desire, in many of their external emotional displays, though not in ways that require a flesh body.
Logged

SpanielBear

  • Curry sauce
  • ***
  • Offline Offline
  • Posts: 272
Re: Something bothering me a lot
« Reply #30 on: 08 Feb 2018, 18:32 »

COPIED FROM WEEKLY DISCUSSION THREAD


Quote
2) Not sure about the "they are taught to do those things". I'm not a parent, but I've heard e.g. fathers reporting "My daughter was 5 (6, whatever) when she banned me from the bathroom", implying very much that it was not the parent teaching the child to be ashamed, but the child telling the parent "Go!". I remember being younger than ten when my parents being naked in front of me, or me being naked in front of them, started to bother me. I do not recall anybody teaching me to feel that way; it just felt that way.

No, it's not that parents actually demand it from their children. But the most neglected thing in pedagogy is ignoring the fact that a child is a sapient being capable of self-learning and self-change. :)
First of all, by age 6-7 a child has already learned that nudity isn't always okay. It has been explained to them, and they notice that parents (and other grown-ups) don't actually go around nude.
Second, and an even more difficult thing, a 6-year-old goes through a crisis not so different from the teenage one. That's when the need for personal space and a recalculation of relationships happen. Being nude, especially in the bathroom, rings the "it's not safe" bell.
I'm not sure what to offer as a source - this theme is quite well developed in Russian psychology, starting with Lev Vygotsky, but I don't know the English sources or even what this stage is properly called in English.

So, I stopped replying to this thread because while it is fascinating, it was going into areas that I know next to nothing about. My background is philosophy and ersatz psychotherapy (like mental health first aid rather than a degree, I'm definitely not a psychotherapist), and when the discussion moved into the biochemical side I felt happier sitting it out.

But it is fascinating. And there are some points raised here that I think I can jump in on, so here goes.

As far as infant psychology is concerned, there is almost an embarrassment of riches in the western psychological canon, from Freud and Jung through to Melanie Klein and John Bowlby. Again, I'm not an expert here so take what I say with a pinch of salt, but I don't see a huge amount of difference between what you describe and what I understand the basic strokes to be from an English language perspective. I guess though that the developmental stage you are describing is similar to the idea that the experience of becoming aware of oneself as a separate entity to others is both liberating and terrifying. The point at which children discover that their parents are fallible and possibly a threat (your mother stops just feeding you whenever and yells at you when you get angry. Terrifying!), that their needs will not always be met by others, and that they can keep secrets from their parents is a big deal, and is normally described as happening in development terms between the ages of 6 months to 6 years. So that kind of tallies. And yes, it is an awareness that seems to be learned through experience rather than instinctual, and that learning is to a greater or lesser degree unconscious.

If we try and extrapolate that learning process into the development of an AI personality- well, we don't actually have much to go on. We don't know how they're grown, so we don't know whether they go through developmental stages (is something like Eliza the equivalent of an AI newborn? Or are their developmental stages the same as ours but sped up? Do they have attachment figures? How much of their psychology is a pre-programmed function and how much is emergent? Too many questions, not enough evidence for an answer), so trying to draw out comparisons with humanity doesn't really work. If an AI doesn't have a father who can be naked, is there a machine equivalent? "I saw Dad slowing his run-time last night- Gross!"

And then we add *another* layer of complication, because now they have to interact with humanity as well. So that's two layers of socialisation and existential games to have to navigate. Human-centred AIs are not omniscient; they make mistakes about human feelings and intentions which they have to learn to correct, so that seems to indicate that they do not get "Interacting with humans 101" as a simple download. When it comes to us, they try to mimic our ways as much as possible.

Which means I think we come back to the functionality thing again. If Bubbles only wanted to socialise with other robots, she would have no need to go through the difficulty of learning how to interact with humans. Because she does, she is forced to translate her robot psychology into terms that humans can relate to. This could go the other way, and presumably the study of AI psychology would be a thing as we try to do just that, relate to robots on their terms. But for the day to day, it seems far easier for the AIs to translate their inner experiences in terms of human psychology and feeling. And that communication is presumably facilitated by both the software and the hardware they use- Software might give Bubbles mastery of the English language, but another package designed to run with her specific chassis may also provide body language cues. And as we know that AIs have unconscious processes in a similar way that we do, it's not inconceivable that they have unconscious behaviours and displays that they aren't immediately aware of.



Quote
As far as I can tell, robosex is actually an exchange of packages of personal info and code, and we know it's quite an intimate topic for AIs. They call it "robotic sex" not because it includes sensual stimulation, but because it has a place in their society resembling the place sex has in ours.

In fairness, there is no indication that robo-sex *doesn't* include sensual stimulation. They get an emotional intimacy sure, but looking at Pintsize before and after his date back in the early comics he certainly seemed to have experienced stimulation of some kind. And I seem to recall Momo having a very... immediate reaction to a shirtless Sven (I think? I can't remember where it is in the archives. I recall there being startled pony-tail motion...). In short, when AI seek romance, they definitely can include erotic love as a part of that desire. I don't see any indication that their lust is anything other than raw, as opposed to an intellectual satisfaction. Bubbles' desire for Faye covers a broad spectrum. She loves the emotional connection they have, for sure, but there is something more that she wants and all the signs point to that want being lust-based, at least in part.

Quote
A human can't choose. For a human, drunkenness is an inevitable state that happens because they're drinking alcohol. They can want drunkenness (like Faye or Marten after "The Talk"), they can like the taste of spirits, they can drink for company. But they can't become drunk or sober with a snap of their fingers.
Have you read "Good Omens" by Pratchett and Gaiman? There is an episode there where the angel and the demon are drinking.
"A look of pain crossed the angel's suddenly very serious face.
"I can't cope with this while 'm drunk," he said. "I'm going to sober up."
"Me too.""
That's something an AI can do and a human can't.
So if for a human being drunk is an uncontrollable consequence of some activity, for an AI it's a game - a voluntary, conscious, and optional rule they impose on themselves and can drop any second.

I'm not sure about that. In theory certainly that's true. An AI runs programme:Drunk until it decides end programme:Drunk. But saying that decision is voluntary, conscious and optional is like saying a human choosing to drink is voluntary, conscious and optional. That choice seems to be an open one, but in fact can be driven by all sorts of unconscious desires and emotional drives, to the extent that the choice we have is very limited. If Station were to want to reduce its run time to avoid something disturbing, it could use the Drunk programme to facilitate that. It's conceivable that it could rationalise the choice to start drinking with a thought similar to "This will help me cope, I can stop whenever I like", but if the disturbing emotion was bad enough it may feel unable to end the programme- it could be too scared, the experience too potentially painful. A robot alcoholic is not an impossible thing to conceive of- it could cure itself, but for whatever reason doesn't feel able to. If we hypothesize a robot subconscious, it may not even know its motivations for that.

Bringing this back to Bubbles again- why might she want to run Programme:Arousal despite the social and emotional implications of that choice? Well, it may just feel good. It feels *nice* to be aroused, that's kind of the purpose. It's only when we start adding social mores and taboos on top of that that it becomes complicated. Bubbles shows real difficulty admitting to her own desires, to anything really that isn't logical. Part of her development is allowing herself to express those feelings. But to her, some of her feelings- grief, loss, confusion- are so overwhelming that avoiding them is an act of self-defence. And if some emotions are that hard to face, to make conscious, she might feel the same about others- if one snake is poisonous, all become suspect until proven otherwise. So her subconscious may be running her arousal programme on repeat, but she sure as hell isn't going to work too hard to reflect on that fact, because that would risk ending up vulnerable to other sources of psychological pain. This is a paradox- she is feeling something, but can't admit to herself that she is feeling it.

But there is a workaround. By throwing herself into the learned behaviours, she can maintain herself in a place where she feels arousal but is not obliged to act on it, and can dismiss her inner tension as social anxiety. As the subject of all these emotions is Faye, a human, the only way she can get the object of her arousal to behave in the way she needs is to communicate with her, and she uses the human/AI emotional translator to do it.

Dammit, I just armchair psychologied a combat AI, didn't I? God I love QC.  :-)
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #31 on: 08 Feb 2018, 18:57 »

Quote
Let us say then that assertions about feelings are evidence of feelings only when made by a creature that can pass Turing tests as often as humans can.
That's kind of a tautology. An AI passing the Turing test is a situation where a human can't tell the AI from another human, which already means humans would accept a declaration of feelings from that AI. It's a prerequisite, not a consequence.

Quote
I don't see how this is an objection. Humans build machines with automatic responses to stress all the time.
That just means robot bodies are built for human purposes, not AI ones. So no function installed (or not installed) should be explained as following an AI's desire to understand humans, for example, unless we believe it's a human desire itself. Why would the military create a combat gynoid equipped to understand humanity, or able to get drunk?

Quote
That AIs sometimes voluntarily induce dizziness, slurred speech, and balance problems and interpret them as pleasure does not mean that they only occur voluntarily. Humans both induce them voluntarily for pleasure and suffer them as the result of fatigue, fever, anoxia, poisoning, and any number of other causes. This is not a problem for my argument that AIs can interpret a relatively small number of automatic physical stress responses in a large variety of ways, but rather supports it.
There is a big difference, which I noted in my answers to Case and SpanielBear.

Quote
As far as infant psychology is concerned, there is almost an embarrassment of riches in the western psychological canon, from Freud and Jung through to Melanie Klein and John Bowlby. Again, I'm not an expert here so take what I say with a pinch of salt, but I don't see a huge amount of difference between what you describe and what I understand the basic strokes to be from an English language perspective. I guess though that the developmental stage you are describing is similar to the idea that the experience of becoming aware of oneself as a separate entity to others is both liberating and terrifying. The point at which children discover that their parents are fallible and possibly a threat (your mother stops just feeding you whenever and yells at you when you get angry. Terrifying!), that their needs will not always be met by others, and that they can keep secrets from their parents is a big deal, and is normally described as happening in development terms between the ages of 6 months to 6 years. So that kind of tallies. And yes, it is an awareness that seems to be learned through experience rather than instinctual, and that learning is to a greater or lesser degree unconscious.
Not exactly. Russian developmental psychology distinguishes two different crises: the 3-year crisis (basic protest, establishing your own "I", establishing a basic self-image - the things you're describing) and the 6-year crisis (establishing social hierarchy, trust issues with parents, forming social behaviour patterns). Of course, "3-year" and "6-year" are just conventional names.

Quote
In fairness, there is no indication that robo-sex *doesn't* include sensual stimulation.
I can't imagine how that would work. We have quite coherent evidence that an AI doesn't need to see, or even be near, the AI with whom they have sex. So we can be fairly sure (I believe) that everything involved involves the mind, not the chassis.

Quote
I'm not sure about that. In theory certainly that's true. An AI runs programme:Drunk until it decides end programme:Drunk. But saying that decision is voluntary, conscious and optional is like saying a human choosing to drink is voluntary, conscious and optional.
And that's the very difference. Humans don't choose consequences, they choose an activity. But once they have chosen the activity, they can't escape the consequences. They can want them or not, but they get them, no matter what. I brought up "voluntary, conscious and optional" because it's the very definition of a game ("the voluntary attempt to overcome unnecessary obstacles"). AIs are playing at being drunk (or aroused). Humans aren't, even though they can play at flirting or drinking.
Once again, because it's the very difference: for a human, an activity can be a game, but a physiological response can't. For an AI, both are games.

Quote
So her subconscious may be running her arousal programme on repeat, but she sure as hell isn't going to work too hard to reflect on that fact, because that would risk ending up vulnerable to other sources of psychological pain. This is a paradox- she is feeling something, but can't admit to herself that she is feeling it.
The problem I see here is that such a system demands that the AI not be aware of processes it is running. Bubbles directly pointed out that she can't access parts of her mind, and that being unable to access parts of her mind bothers her.
That's actually quite logical. The human unconscious exists because we're not just our consciousness, but also a million-year-old buggy, messy system, patched over and never really designed to support human-type consciousness. But an AI is essentially a program construct that can be backed up, copy-pasted or influenced directly; an AI is its consciousness, and the chassis is no more than a tool. I mean, ask a US Marine how important his rifle is; it's still a tool, not the Marine himself.
That means I kind of disagree with "And as we know that AI's have unconscious processes in a similar way that we do, it's not inconceivable that they have unconscious behaviours and displays that they aren't immediately aware of." How exactly do we know that?
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

SpanielBear

  • Curry sauce
  • ***
  • Offline Offline
  • Posts: 272
Re: Something bothering me a lot
« Reply #32 on: 08 Feb 2018, 19:36 »

Quote
In fairness, there is no indication that robo-sex *doesn't* include sensual stimulation.
I can't imagine how that would work. We have quite coherent evidence that an AI doesn't need to see, or even be near, the AI with whom they have sex. So we can be fairly sure (I believe) that everything involved involves the mind, not the chassis.

I agree it doesn't involve the chassis, but it's an open question as to how setting up a link *feels* for an AI. While the mechanics of arousal are embodied, my experience of them is mental. One can feel aroused in dreams. What I meant by "experiences sensual stimulation" is that the experience is not reflective, but immediate.

Quote
I'm not sure about that. In theory certainly that's true. An AI runs programme:Drunk until it decides end programme:Drunk. But saying that decision is voluntary, conscious and optional is like saying a human choosing to drink is voluntary, conscious and optional.
Quote
And that's the very difference. Humans don't choose consequences, they choose an activity. But once they have chosen the activity, they can't escape the consequences. They can want them or not, but they get them, no matter what. I brought up "voluntary, conscious and optional" because it's the very definition of a game ("the voluntary attempt to overcome unnecessary obstacles"). AIs are playing at being drunk (or aroused). Humans aren't, even though they can play at flirting or drinking.
Once again, because it's the very difference: for a human, an activity can be a game, but a physiological response can't. For an AI, both are games.

I still think you are overstating the extent to which AIs are free of unintentional consequences in this case, and others like it. In your argument, humans have limited control over their physiology, whereas an AI has absolute control over theirs. But the AI's control is dependent on the AI being able to exercise it. If an AI feels an uncontrollable urge to be drunk, it doesn't matter that it could order the programme to be purged - it is unable to do so. If it didn't realise that was going to be the outcome of its initial choice, the consequence of becoming drunk was just as unintentional as it is for a human.

You keep describing AIs as 'playing' when it comes to experiences, but I think this implies they are acting more in bad faith than is fair. The 'game' for an AI is to use human language to adequately describe what they are feeling. When Bubbles tells Faye she is angry, she is using that word to describe a genuine emotion, based on a functioning, if wounded, psychology. May isn't playing a game when she acts on her fascination with prolapses; she is responding to a genuine desire on her part.

Quote
So her subconscious may be running her arousal programme on repeat, but she sure as hell isn't going to work too hard to reflect on that fact, because that would risk ending up vulnerable to other sources of psychological pain. This is a paradox- she is feeling something, but can't admit to herself that she is feeling it.
Quote
The problem I see here is that such a system demands that the AI not be aware of processes it is running. Bubbles directly pointed out that she can't access parts of her mind, and that being unable to access parts of her mind bothers her.
That's actually quite logical. The human unconscious exists because we're not just our consciousness, but also a million-year-old buggy, messy system, patched over and never really designed to support human-type consciousness. But an AI is essentially a program construct that can be backed up, copy-pasted or influenced directly; an AI is its consciousness, and the chassis is no more than a tool. I mean, ask a US Marine how important his rifle is; it's still a tool, not the Marine himself.
That means I kind of disagree with "And as we know that AI's have unconscious processes in a similar way that we do, it's not inconceivable that they have unconscious behaviours and displays that they aren't immediately aware of." How exactly do we know that?

Momo alludes to it here:-

http://www.questionablecontent.net/view.php?comic=2285

The key is when she talks about the big AIs, and their ability to treat human thought as a subroutine; that level of self-awareness is alien both to Emily *and to her*. Momo always refers to her consciousness and psychology as being equivalent to a human's. She may differ in hardware or qualia, but the functions of her thoughts are not alien. She also talks about not "thinking faster" than a human. This implies that the limits on her mind, despite it being artificial, are the same as human limitations.

There is also a big taboo here. You talk about AI personalities being able to be backed up or directly influenced, but doing this is what made Corpse Witch such a criminal in Spookybot's eyes. One *can* make an AI feel something by direct programming, and override its desires, but doing so is like brainwashing a human. It would be manipulating their personality against their will. And the fact that one can do that to an AI does not mean that their mind is fundamentally different from a human's, or that they are incapable of sharing communicable experiences.

In short- why do AI's behave like humans? Because they are like humans.

Edit: I realised I didn't respond to your point about Bubbles feeling concerned that she couldn't access parts of her mind. I think you are mistaken about the level of access Bubbles was expecting. It wasn't that she expected total awareness of her whole mental state; she was talking specifically about not being able to remember her comrades and the events around her military service. That sort of confusion is not AI-centric, it happens in humans too. It's called amnesia. And it can happen due to either a hardware issue (brain damage) or a software one (suppressing traumatic memories).
« Last Edit: 08 Feb 2018, 19:50 by SpanielBear »
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #33 on: 08 Feb 2018, 20:59 »

Quote
What I meant by "experiences sensual stimulation" is that the experience is not reflective, but immediate.
I'm not sure we can speak of any AI experience as "not reflective". But I accept the definition.

Quote
You keep describing AI's as 'playing' when it comes to experiences, but this I think implies they are acting more in bad faith than is fair. The 'game' for AI is to use human language to adequately describe what they are feeling. When Bubbles tells Faye she is angry, she is using that word to describe a genuine emotion, based on a functioning, if wounded, psychology. May isn't playing a game when she acts on her fascination with prolapses, she is responding to a genuine desire on her part.
I know I may be repeating myself here, but I'm going to say it again.
I'm not saying EVERY experience is a game for them.
I want to draw a strict border between... let's call them "body emotions" and "mind emotions". Anger is a "mind" one. Look at Wikipedia, for instance: "Anger is an emotion that involves a strong uncomfortable and hostile response to a perceived provocation, hurt or threat." If an AI can be hurt or threatened (why not?), it can react to that. This reaction is "anger". It's not human anger, but anger nevertheless. Here I believe your arguments are entirely valid.
Desire is abstract as well. It's a wish to have. If an AI can understand the concept of "having" and have preferences about what to have ("I don't have enough RAM to process everything I want to process; I want more RAM"), that's desire. In my... let's call it "religious" opinion, the desire to know is a basic emotion that any conscious mind capable of abstraction would have.
Let's take appetite. Does an AI feel appetite? They don't need food, and they have no system to process it or (in most models) taste receptors. Still, Pintsize put cake into his chassis and declared "it was too tasty" - something we would define as feeling appetite. How can that happen, if it's not playing around?

Or another approach. Let's assume every emotion an AI declares is genuine.
When Bubbles says "I'm angry", she means she believes her personal space and interests have been violated, and she is not OK with it. This is actually a clear message, understandable and relatable. Bubbles has every reason to believe Faye will understand it. It's a clear message in human language.
When Pintsize points at a cake and says "I'm hungry" - what does he mean? Are we to believe that Pintsize has the same sensation a human has?

Quote
You talk about AI personalities being able to be backed up or directly influenced, but doing this was what made Corpse Witch such a criminal in Spookybot's eyes.
And Marten did it more than once or twice. He backed Pintsize up on his PC to change his chassis. He changed Pintsize's language locale.
But yup, that's it. That's why software that would make AIs feel uncontrollable desires is bad.

Quote
The key is when she talks about the big AI's, and their ability to treat human thought as a subroutine, that level of self awareness is alien both to Emily *and to her*. Momo always refers to her consciousness and psychology as being equivalent to humans. She may differ in hardware or qualia, but the functions of her thoughts are not alien. Also she talks about not "thinking faster" than a human. This implies that the limits on her mind, despite it being artficial, are the same as human limitations.
I can't quite see how this proves the existence of an unconscious.
Momo says that:
1. The processing power of her hardware is limited enough to keep her experience in tune with a human's.
2. Big AIs are not limited that way, and have a vastly greater attention field.
It's not that Momo or the greater AIs have hidden subroutines. It's just that Momo is limited by her hardware, so her attention field and span are human-like. I do believe this can be done deliberately, when an AI decides it wants to be integrated with humans, precisely to make integration simpler, but it doesn't mean some subroutines are locked away from the AI.
Such a lockdown, without the AI actually being able to undo it, would be exactly the crime Corpse Witch committed against Bubbles.

Imagine the classic Arabian plot about a sultan who wants to know how his subjects live. He is not his own subject, and he understands that his experience doesn't allow him to understand them. So he disguises himself as a poor man and goes into the city. He is playing. His feelings are genuine - until they're not, say, the feeling of being unsafe around laws and officials. But he can emulate those emotions (say, by entering a burglars' den and running from the guards) to better understand something else. That's a game, but a very useful and important one.
Then imagine an evil vizier stripping the sultan of the ability to return. Now his feeling of being unsafe around laws and officials would become genuine - but the vizier has committed a crime.

Actually, it occurs to me now that the crime Corpse Witch committed against Bubbles is that she made her MORE human - she actually created a subconscious (parts of the mind locked away from consciousness, where something dangerous can lurk and influence her behaviour). And everybody, including Bubbles and Spookybot, is royally pissed.

REFERENCE: http://www.questionablecontent.net/view.php?comic=3380
« Last Edit: 08 Feb 2018, 21:04 by Aenno »
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #34 on: 09 Feb 2018, 07:55 »

Quote
For a time I told myself - "well, maybe they're AIs, but they're humanlike enough to have human problems, so I just need to think of them as humans with some problems". As I said, I can imagine a PTSD soldier, or a rich kid trying to make things better, or a former convict, all those things, and relate to them the same way I relate to Hannelore (as I said, not many multibillionaires' daughters in my social circle, but I can relate to Hannelore). But then I thought - hell, isn't that defeating the point?

I mean - and please take this the right way - when I'm for diversity, I'm not for flat equalization. If I said "hey, Hannelore, you're just a human being like me!", wouldn't I be a jerk? I mean, Hannelore isn't like me. She has issues I haven't, and even if I might think they're ridiculous, they aren't to her. So if I'm going to be good, I should remember that Hannelore doesn't give hugs. I should keep in mind that Faye, maybe, isn't so fond of suicide jokes, and I'd better not replace her juice with bourbon as a friendly prank. Every being deserves understanding of its issues, and if I want to be friends with such a being I should keep its issues in mind.

Then there are two possibilities.

First, that being an AI is nothing other than being human - exactly the same issues and psychology. It's possible, sure (AFAIK, neither I nor anybody else on this planet can really describe strong AI); but wouldn't that just defeat the issue? So the thing would be "we're all different, but AIs are the same as we are"? "You should notice the basic story of every living creature, but being an AI is essentially nothing, just ignore it"?
Second, that being an AI is something different, with the socially acceptable possibility of changing bodies for a couple of bucks (tell that to Claire), with the ability to lock away psychological problems with competent programming (tell that to Faye), and some unique issues that come with the status; but can any human really relate to specific AI issues?


There are interesting questions in here, and a number of possible ways to approach them. I am going to reject choosing between the two possibilities offered here, on the basis that there are almost never only two possibilities. Further, such questions could be asked of any character whatsoever who isn't just like me, e.g. "If women are just like men, what is the point of having women characters? If they are fundamentally different from men, how am I ever to imagine what they are like?"

Instead, I am going to try to figure out what narrative function robots serve in QC. That is, what do they do for the story that couldn't be done by human characters?

I think they are an expression of hope for the future, and, at the same time, an assertion about what a good society is like.

It's easy to be pessimistic about the future lately, not because the future looks any grimmer than it did during the Cold War, but because the impending disaster looks so tedious. Instead of a radioactive hellscape populated by mutants and wandering death machines, we are faced with a long, slow series of famines, mass migrations, and floods that will gradually kill most large animals and most of us, leaving a world populated mostly by people rich enough to hoard food and arable land for themselves. We will follow Scrooge's suggestion that we should die and reduce the surplus population, and the world we leave behind will be a gated community. This is a sufficiently depressing prospect that we have nostalgic, escapist Mad Max movies about how exciting and fun it would be to wander a hellscape in death machines.

What Jeph is offering is a future full of new creatures the like of which we have never seen before, some of whom will converse with us and be our friends. At the same time, he is suggesting that the ideal state of affairs is one full of new creatures the like of which we have never seen before, some of whom will converse with us and be our friends. He is promoting xenia, "guest-friendship," the love of strangers precisely because they are strange, and robots are paradigm instances of such strangers.

As best I can figure, Jeph wanders into his use of robots, or, if you prefer, he evolves it intuitively. Pintsize is pretty clearly a personification of a lonely guy's laptop: he is Marten's only friend at first, he is mostly full of porn, and he speaks in an immature, unfiltered, random fashion that sounds remarkably like the collective voice of any internet comment section made friendly. He is a self-deprecating joke about what it is like to have a laptop as your best friend.

As Jeph develops robot characters, most notably in 3080-3095, in which they interact exclusively with one another, it becomes evident that they are outsiders anxious about their position in the world. Momo manages this anxiety by being a very good girl, May by ostentatiously not caring, and Bubbles first by attempting to live heroically and then by hiding away. Robot outsiders are a standard science fiction trope dating back to Eando Binder's 1939 story "I, Robot" and the Isaac Asimov robot stories it inspired. Robots differ from the other two stock outsiders, aliens and mutants, in that they are made by humans, and can justly ask why they were made as they were if there is no place for them in the world. Because they are entirely artificial, with no animal substrate whatever, robot characters can perfectly express the feeling of looking at society and seeing an enormous factory that built one like a machine but that has no place for one. This is not an uncommon feeling.

Robots also get Jeph out of a bind. He got known for writing about LGBT characters in a rational fashion, got pilloried for it, got pissed, and doubled down. Now he is in the dicey position of writing about LGBT characters as a straight, cis guy, with the continual possibility of coming off sounding like Evie lecturing Bubbles about what it is like to be an AI. You get the feeling of a man working in asbestos gloves every time Claire is on stage. He can't afford to fall in love with her the way he does with other characters, because he can't afford ever to make a mistake. He can use Bubbles as a symbolic surrogate for trans, queer characters, though, and let himself love her, because if he gets it wrong, it will just be assumed that robots are like that. He can justly claim to be the world's foremost authority on what the robots he makes up are like.

Robots have a fourfold function in this story, then. They express hope that the future will be full of strange new creatures who will be our friends. They are a wry joke about what it is like to have a computer be a chief social contact. They express what it is like to feel like an unwanted artifact, something neither natural nor useful. They serve as a useful symbolic surrogate for outsiders in general.

This is how Pintsize, Momo, Winslow, and Bubbles serve the narrative considered merely as robots. As characters, they serve it in much more complicated, interesting ways, but discussing that would make this post impossibly long.
« Last Edit: 09 Feb 2018, 16:10 by ckridge »
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #35 on: 09 Feb 2018, 13:50 »

You know, I'm starting to feel I just have a different perspective from everybody else here.
I was never here for 5 or 10 or 15 years. I just jumped onto the running train - I really did read all of QC in 3 days, and never felt the anxiety of waiting literally months for the next comic in an arc, or years for a character's return. For me, the first arc with Amanda happened on Monday, not in 2004 (I believe that's when it was drawn?). That gives me an advantage - I believe I remember the exact things declared in the continuity better, but I also tend to kind of compress things. The thought of robots being a safe substitute for minorities, because the author was shunned for years for "wrong depiction of LGBT", never occurred to me.

I actually believe that's the source of my initial frustration. I just don't get what, with robots being so humanlike, they're needed for, apart from being substitutes for LGBT or other minorities - because, truth be told, I just hate that as an idea. I do believe things should be called what they are.
Also, I have no problem with the reasoning about the jokes of "a computer being your best friend", as with Pintsize. They felt completely OK when they were comic relief (just in case: I don't mean robots should always be that, but I think if they are not, it requires a more thorough approach).

If they're there for xenia, they're too humanlike - that's what I've been whining about all this time. They're so humanlike that you don't need them to be robots. "Not strange enough" - they're exactly like us. I feel bad about Bubbles being aroused EXACTLY because it's another nail in the coffin of the "strange new creatures" idea. They aren't strange, and I can't recall many instances of characters behaving along the "robots are strange" line; everybody is quite used to them.

It's even worse with unwanted elements - the problem is, there are humans there who are far more unwanted. As I've put it a few times in this discussion, Momo is bitter that some people she doesn't know hate AIs, and those people are treated as marginal by her friends; Amanda was thrown out of her home by her own mother for being caught as a lesbian. And I actually can't recall more than two on-panel instances of humans actually despising AIs - the church Momo felt hurt about, and the situation where Bubbles was shunned walking from Coffee of Doom.
« Last Edit: 09 Feb 2018, 16:58 by Aenno »
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #36 on: 09 Feb 2018, 18:34 »

If you go on over to the Reddit QC topic, you can find any number of people willing to complain endlessly about the robots in QC, as well as any number of people willing to complain about any other given aspect of QC, as well as people willing to argue endlessly that any of those aspects is the only thing worthwhile about QC. This is a small sheltered tidepool on the shores of the internet.

I don't think it is your perspective especially. I showed up about a year ago and binge-read too, but I have always liked science fiction stories that are as much as possible like the real world with just one or two tweaks. It is very hard for a writer to sustain a completely fantastical world for longer than the average dream. Embedding a few fantastic details within the fabric of the real world is much more easily sustained, and can bring out features of the real world that might otherwise go unnoticed. That is no reason you should like stories that mix fantasy and reality, though. Similarly, if you don't like to read for symbolism and semi-allegory, you don't have to. There are a thousand right ways to read any story.

As to your complaint that the robots are too human, that is a problem science fiction has been struggling with forever, both with robots and with aliens. Early attempts consisted of moving established racial and class stereotypes out of pulp adventure fiction into pulp science fiction, so that Venusians were most often stereotypical Africans, Martians stereotypical Chinese, and robots stereotypical loyal family servants. Later, more successful attempts were by writers who wrote simple, schematic characters, each with only one or two traits, so that it was easy for them to give the robot one or two truly different traits and leave it at that. The very best efforts are still, I think, Asimov's, because Asimov was himself a brilliant, strange shut-in applying his considerable gifts to figuring out rule sets for getting along with ordinary people. He wrote robots that were like him, and they were both plausible and strange. I can't think of any more successful attempts since. There are a couple of short, chilling bits in Bruce Sterling's Schismatrix. Charles Stross has a brilliantly conceived sexbot living after humanity's extinction who actually has what religious people have been failing to find forever, a reason she was created that would give her life meaning, but no way to fulfill it, and, if she had a way to fulfill it, no way to fulfill it without being a slave. Other than her unique existential predicament, though, she is written as human as he could make her. Other than that, it is an unsolved problem. It is hard to write someone truly different.

I'm sorry. I am droning on about things you know already if you care about them. I accidentally hit a vein of geekery. Let's flee.

Given the difficulty of writing reasonably three-dimensional characters that aren't within the normal range of human variation, I think Jeph does pretty well. There is something chillingly inhuman about Bubbles kneeling perfectly still next to Faye's bed all night, both in her ability to resist desire and in her ability to hold perfectly still. The possibility that she is awake the entire time is also faintly chilling. Kneeling there, like someone praying, like a samurai awaiting orders, Bubbles is also acting in perfect accord with medieval doctrines of courtly love, in which the knightly lover is directed to humility, courtesy, worship of the beloved, and suffering. This is human, but it is not anything remotely like the way most people act now.

May is so poor that her face and her arm fall off and she can't afford to do anything about it. That speaks both to how far from human she is and to the position of robots.

Momo tolerates an astonishing number of liberties with her person. I don't know whether this is humility or indifference on her part, but it is not like most people.

I think we just disagree, is all. One of the things you most object to, that everyone takes robots perfectly for granted, was one of the first things to please me about the strip. I find myself surrounded by things at least that strange with everyone taking them just that much for granted. If the world doesn't look like that to you, though, it just doesn't. I hope you find some way of reading the strip that lets you enjoy it as much as I do.
Logged

cloudatlatl

  • Not quite a lurker
  • Offline Offline
  • Posts: 20
Re: Something bothering me a lot
« Reply #37 on: 09 Feb 2018, 18:57 »

I've had a theory for a while that most of the robots in QC were once humans who underwent some sort of body/mind modification.  I agree with Aenno that the AI characters just don't seem very AI a lot of the time.

There's a lot of words in this thread, and full disclaimer, I didn't read most of them and posted my opinion anyway.
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #38 on: 09 Feb 2018, 19:36 »

>Thought about robots being safe substitution for minorities because author was shunned for years for "wrong decpiction of LGBT" never occured to me.<

I fear that I didn't explain clearly enough. Dickheads gave Jeph shit for having too many LGBT characters, and for having Marten take up with Claire, whereupon he set out to do more of the same. I think he is deeply concerned with getting his depictions of LGBT people right, without claiming knowledge he can't have, entirely because he is a scrupulous, careful guy. I haven't read anyone reproach him for getting it wrong.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #39 on: 09 Feb 2018, 21:14 »

Quote
I'm sorry. I am droning on about things you know already if you care about them. I accidentally hit a vein of geekery. Let's flee.
Yeah, I do. But it is my firm belief that declaring something hard to solve isn't a solution. :)

Quote
Dickheads gave Jeph shit for having too many LGBT characters...
My thoughts about this statement can't be written safely, because Russian law directly forbids using hard-line mat in a public space.

Quote
May is so poor that her face and her arm fall off and she can't afford to do anything about it. That speaks both to how little human she is and to the position of robots.
Well, Faye sometimes couldn't afford new glasses when hers broke. That's not about the position of robots; it's about the position of low-paid labour. May is a released convict, without good qualifications for anything but non-specialized work, and every decent job application asks "Have you ever been convicted of a felony?". Her problems with being poor are 100% human. :)
Her ability to have her face and her arm fall off and still stay active and adequate DOES speak to how little human she is, if only that detachment from her chassis didn't switch on and off at random. I mean, hell, robots hug each other for comfort, and it looks like it helps.

Quote
I think we just disagree, is all. One of the things you most object to, that everyone takes robots perfectly for granted, was one of the first things to please me about the strip. I find myself surrounded by things at least that strange with everyone taking them just that much for granted.
Just to clarify.
My tastes are very singu... ahem. I mean, I can be OK with everybody taking robots for granted. I can be OK with everybody freaking out about robots. Both can be done very nicely.
If an author uses robots to show how humans freak out about all that tech and the Frankenstein complex, that's fine; I love Asimov.
If an author uses robots (or something else we define as weird) as a common sight, to show that the singularity has happened, that's fine again; there are very good works with this approach (the first that comes to mind is Simak's "Time and Again", or, well, nearly every Simak work, even if he never used the word "singularity").
If an author uses robots as a common thing everybody is used to, except that when something goes wrong everybody turns on the robots (say, Dick's "Do Androids Dream of Electric Sheep?"), that's, again, fine.
But, I believe, one approach should be selected and not changed casually.
AIs in QC seem to understand human behavior to a fantastic extent, so they are able to understand and emulate things humans themselves usually don't understand or reflect on. But after years with full libraries on human psychology at their disposal, they still haven't caught on that arachnophobia is a very common reaction, so a giant robotic spider isn't the best chassis for a social worker if you really don't want to freak people out.
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #40 on: 10 Feb 2018, 06:39 »

Hunh. You aren't allowed to curse online. Weird. I will moderate my language too, then, so as not to tempt you into trouble.

I think that problems with rendering robots as truly different -- as true Others, to quote Evie -- may be structural and unavoidable.

I saw a TV news segment designed to scare parents once, in which TV news reporters went into a playground and talked kids into coming out of the playground and around a corner with them, where they would meet their parents, and child and parent would be interviewed about how that had been possible. A mother asked her child "Didn't I tell you never to go anywhere with a stranger?" and the child replied "Oh, he wasn't a stranger. We had been talking."

No one is strange once you have been talking. 

Maybe the strangest person I know is a mathematician. She is remarkably child-like in many respects; has a vast, cold, well-controlled intelligence; has chronic, well-managed anxiety about people finding out how odd she is; has BDSM all mixed up with nurturing in benevolent ways that I could not possibly have imagined; is a second-generation immigrant from a very foreign country with a lot of her original culture still running in the back of her mind; and has trouble thinking of herself as a woman because almost no women do what interests her most.

If you talk to her when she is relaxed, she doesn't seem at all odd. When she relaxes and uncoils her mind, she is much larger than most people, but one only ever sees what can fit into one's window of consciousness at any given time, so except for occasional huge background shifting and stirring, it doesn't register. Every so often one realizes that she is asking questions about common human feelings, not argumentatively, but because she really doesn't know. Other than that, she seems no stranger than the rest of us. She does not seem a set of unusual traits, but rather a single integrated person managing those traits as part of her overall situation, just as we all manage ourselves and our situations.

With that in mind, think about Punchbot. Initially he looks like a prime example of how QC robots aren't like humans at all. He is perpetually jolly, to the point of seeming not terribly bright. He suffers no pain and is indifferent to physical harm. He loves to fight, but is indifferent to winning or losing. He has very little impulse control. Bizarrely enough, he is an accountant. This is not a package of traits one commonly finds in humans, and so Punchbot looks suitably odd.

This is, I think, because Punchbot is an undeveloped background character. If Jeph ever develops the character, he is going to imagine how all those traits fit together, how they are all part of the situation Punchbot is doing his best to manage, and what Punchbot's strategies for managing his situation are. The more this is done, the more like one of us in a different situation Punchbot will look, and the more it will be possible to complain that Jeph writes his robots as if they were human.

The problem of how to make well-developed robot characters look truly strange is exacerbated by the very wide range of human types in QC. It was relatively easy for '50s and '60s writers to write strange characters, because everyone was still trying to act like everyone else. In a social circle containing Emily, Brun, Tilly, and Faye, it is hard to step outside the bounds of human social behavior.
« Last Edit: 10 Feb 2018, 12:04 by ckridge »
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #41 on: 10 Feb 2018, 06:53 »

I think the spider robots are like those guys who wear neckbeards and sandals with socks and who, when told that this puts people off, will explain that it is more efficient and rational, and thus should not bother anyone.
« Last Edit: 10 Feb 2018, 07:08 by ckridge »
Logged

cloudatlatl

  • Not quite a lurker
  • Offline Offline
  • Posts: 20
Re: Something bothering me a lot
« Reply #42 on: 10 Feb 2018, 08:16 »

Quote
My thoughts about this statement can't be written safely, because Russian laws directly forbid using hard-lined mat in a public space.
Quote
Hunh. You aren't allowed to curse online. Weird. I will moderate my language too, then, so as not to tempt you into trouble.
I don't think Aenno was talking about cursing, I think this was referring to Russia's laws against anything that could be perceived as 'homosexual propaganda'.
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #43 on: 10 Feb 2018, 10:03 »

Oh. Oh, dear. Noted. OK, tricky but doable. Language is infinitely flexible. I should probably try to review those laws.
Logged

Case

  • Vulcan 3-D Chess Master
  • *****
  • Offline Offline
  • Posts: 3,801
  • Putting the 'mental' into judgemental
Re: Something bothering me a lot
« Reply #44 on: 10 Feb 2018, 11:18 »


Quote
In fairness, there is no indication that robo-sex *doesn't* include sensual stimulation.
Can't imagine how that works. We have quite coherent evidence that an AI doesn't need to see, or even be near, the AI with whom it has sex. So we can actually be sure (I believe) that everything involved involves the mind, not the chassis.

Do you know about phenomenology? :mrgreen:



If you haven't already, I highly recommend watching Carpenter's 1974 SF-comedy Dark Star. Add a few like-minded friends (the nerdier, the better) and mind-altering substances (beer should suffice) for an optimal experience.
WARNING: Consumption may induce in you a proneness to fits of hysterical laughter upon hearing innocuous words like 'phenomenology', and I promise that you'll never see water balls in quite the same light as you did before. Symptoms may persist for the rest of your natural life. You learn to live with it ...  :-D
« Last Edit: 10 Feb 2018, 11:42 by Case »
Logged
"Freedom is always the freedom of the dissenter" - Rosa Luxemburg
"The first rule of the Dunning-Kruger club is you don’t know you're a member of the Dunning-Kruger club. People miss that." - David Dunning

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #45 on: 10 Feb 2018, 19:17 »

Quote
My thoughts about this statement can't be written safely, because Russian laws directly forbid using hard-lined mat in a public space.
Quote
Hunh. You aren't allowed to curse online. Weird. I will moderate my language too, then, so as not to tempt you into trouble.
Quote
I don't think Aenno was talking about cursing, I think this was referring to Russia's laws against anything that could be perceived as 'homosexual propaganda'.
Not exactly, and I was kinda joking (kinda, because said laws really exist; I just don't believe any police structure would ever notice this place). Sorry, it's deep night my time when I come here (for instance, it's 5:30 AM now), and I sometimes forget that some jokes need cultural context to be understood, and just translate literally from Russian.
Russian culture has different layers of swearing. Simple things like "dickhead" or "fucking morons" are more or less OK, but they aren't what gets used when you're really pissed off. The real thing looks like "ебстить твою охуевающую пиздоблядскую сраноопроушину матерну раз по девяти бабку в спину деда в плешь" (something like "fuck your fuckoffed pussysluty shitty emptyheaded mother nine times, and your grandma in the back, and your grandpa in the bald spot"; that's literal, but there is a consensus that Russian mat isn't actually translatable). That's a simple example; actual forms can be longer. It's kind of traditional: the first written forms date from the 15th century. This is called "mat" (from "мать", "mother", because traditionally most forms, called "загибы" (zagibs), include something about your mother).
Actually, not many people these days can use it properly, so simple cut forms are used, and even those aren't allowed in a formal context. But there is a traditional reply to something that has pissed you off badly: "I'd say everything I'm thinking about it, but mat isn't allowed in such a nice place." And things like "hey, you have too many LGBT people in your artwork!" really do piss me off badly. You see, such thoughts are quite popular here these days, usually with a little "it was the Yankees who gave us all this LGBT shit; when we were behind the Iron Curtain, we had nothing like that!" - and I actually researched how it was when we were behind the Iron Curtain. It existed, and it really wasn't pretty.

Quote
Oh. Oh, dear. Noted. OK, tricky but doable. Language is infinitely flexible. I should probably try to review those laws.
You can, but you really shouldn't unless you're a researcher. I mean, we have laws about homosexual propaganda, sure. It's federal law 436-FZ, "On Protection of Children from Information Harmful to Their Health and Development".
In a nutshell, I'm not allowed to say anything that the executive (yeah, not judicial) administration would deem "information harmful to children's health and development". In the current revision of the law, such information is anything that:
1. provokes children to harm their own health or life, including self-harm;
2. can induce children to take any kind of narcotic, including tobacco or alcohol, or to turn to prostitution, vagrancy or begging;
3. justifies violence against humans or animals;
4. denies traditional family values, propagandizes non-traditional sexual relationships or provokes disrespect towards parents and (or) other family members;
5. justifies unlawful actions;
6. includes obscene language;
7. includes pornographic content;
8. includes any info about a minor who is the victim of a crime.
So if I say anything like that, a government official can demand that I remove it. If I refuse, the medium where I said it can be sanctioned, and I can be punished.
At first this applied to registered media only. Then they equated any site with, I believe, 1000+ unique visits per day to media.
So if you say anything like "LGBT relations are OK", or "sometimes you should stand up and break the law to do the right thing", or "I ran away from my parents at 14 and lived on the street, and it was a useful experience", it CAN be declared a misdemeanor. Or not, because the very nature of modern Russian law enforcement is that it's optional and selective (and our laws actually allow punishing everybody).

Quote
The problem of how to make well-developed robot characters look truly strange is exacerbated by the very wide range of human types in QC. It was relatively easy for '50s and '60s writers to write strange characters, because everyone was still trying to act like everyone else. In a social circle containing Emily, Brun, Tilly, and Faye, it is hard to step outside the bounds of human social behavior.
Yup, and that's fantastic. When you have humans like that (and I can actually swear on my diploma that they're realistic!) you don't need robots to research xenia.
EDIT: Actually, as I said in the beginning, that's why I even started to worry. You see, QC is, in my opinion, the best thing I've seen in years about weird humans who still stay human and have human problems. They aren't perfect, they aren't miserable, and they aren't the caricatures the genre dictates. That's brilliant, and that's what hooked me.

Quote
I think the spider robots are like those guys who wear neckbeards and sandals with socks and who, when told that this puts people off, will explain that it is more efficient and rational, and thus should not bother anyone.
Are those exactly the types of guys you would put into a social worker position? ;)

Quote
Do you know about phenomenology?  :mrgreen:
Yeah! :)

Quote
Can you still uphold your distinction that the sensual stimuli received are not, in fact, sensual stimuli? What if the simulation is sufficiently complex that the brain itself can no longer recognize that it is in a simulation? Then your verdict ("no body, ergo no sensory stimulation") would be in conflict with the verdict of the brain experiencing those "questionable stimuli". Does using the term 'sensory stimuli' require actual, physical senses to be attached to the brain?
Actually, it does. But you can definitely fool a brain and make it think it has some sensory information. That wouldn't be "sensory stimuli", but the brain inside would never tell the difference.
Because, hell, we're so imperfect.

Quote
If you haven't already, I highly recommend watching Carpenter's 1974 SF-comedy Dark Star. Add a few like-minded friends (the nerdier, the better) and mind-altering substances (beer should suffice) for an optimal experience.
I did. Without the mind-altering drugs, but after a big philosophy cycle. It worked even better!
« Last Edit: 10 Feb 2018, 20:45 by Aenno »
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #46 on: 11 Feb 2018, 11:09 »

>When you have humans like that (and I can actually swear on my diploma that they're realistic!) you don't need robots to research xenia.<

"Why do we need more diversity when we are so diverse already?"

Because love of the strange and different is insatiable, and because insatiability is its greatest virtue. Nothing is more certain than that we are going to run into creatures and situations stranger than we have ever imagined. It probably won't be conscious, sensate robots in our lifetimes, but a lot of people are going to see their children grow up loving chatbot teddy bears and chatbot teaching machines and more than half-believing that they are people. We are all likely to wind up living around people from very foreign cultures. We are all likely to wind up living around people from very bad places, people with sexual tastes that never even occurred to us, mentally ill people in partial remission, and people neurologically atypical in rare and unclassified ways. This already happens, so we can be sure it will happen. In addition, we will be presented with people and circumstances nothing like anything we have ever imagined, and stranger than we can imagine. For this reason, it is good to delight in strange new people simply because they are strange and new.

And for this reason, it is useful and good to imagine what it would be like to be friends, or perhaps lovers, with a super badass AI lady.
Logged

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #47 on: 11 Feb 2018, 11:13 »

I'm not trying to argue you out of your position, by the way, only help you clarify it. It is far better to hate something lucidly, specifically, passionately, and in great detail than it is to feel merely bothered by it.
Logged

Aenno

  • Emoticontraindication
  • *
  • Offline Offline
  • Posts: 65
Re: Something bothering me a lot
« Reply #48 on: 11 Feb 2018, 11:44 »

>When you have humans like that (and I can actually swear on my diploma that they're realistic!) you don't need robots to research xenia.<

"Why do we need more diversity when we are so diverse already?"
Nah. "Let's have more diversity, not shrinking AIs into being humans; or let's allow humans to be as diverse as they really are."
Logged
Disclaimer: English isn't my native language, and I make a lot of mistakes - not in meaning but in grammar or orthography - which, on thorough rereading, I often find and want to edit, but not always.
If you're offended by my use of your language, feel free to right my wrongs.

ckridge

  • Furry furrier
  • **
  • Offline Offline
  • Posts: 153
Re: Something bothering me a lot
« Reply #49 on: 11 Feb 2018, 14:55 »

We don't have to worry about shrinking AIs into being humans because AIs don't exist. AIs are fantastical creatures, like trolls, space aliens, elves, or Morlocks.

As to representing human diversity, older forms of story-telling like serial art, ballads, folk tales, epics, and romances present characters in vivid outline, not in detail. This works fine for kinds of people everyone knows about already, like the human characters in QC. Audience members each fill in the detail from their own experiences, and this both involves them and makes them feel like they are re-joining a circle of friends they had somehow forgotten. Sketches won't do for the sorts of people they haven't met yet, though, and so new and unexpected kinds of people are represented by various fantastic creatures. That is just the way the form works. In The Odyssey it is the Cyclops. In folk tales it is monsters and the fair folk. In romances it was giants and sorcerers. Here it is robots. That's the way the form has worked time out of mind. If you want a wide and realistic representation of human diversity, you go to short stories and novels, which offer much more detail and differentiation, but which are longer, slower, less vivid, and more bound to particular cultures.
Logged
Pages: [1] 2   Go Up