Robots and love
Carl-E:
OK, I posted in the WCDT before I saw this thread... it was a response to someone who thought Momo was being manipulative.
--- Quote ---Re: Robotic love.
The argument that love expressed right after an extravagant gift can't be genuine doesn't hold, especially with humans. The gift may or may not be manipulative, but often such a gift is an expression of love from the gifter, and so will elicit such a reaction from the giftee ("I don't have anything like this to give you in return to express my love, so I'll just have to tell you how I feel"). The fact is that, spoken or not, Momo has loved Mari since we first met them. She cares for and about her in the most fundamental of ways. And this extravagant gift has shown Momo that Marigold considers her so much more than a housekeeping, advice-giving robot - that Mari cares for Momo as well, something that may not have been evident in the past.
OK, all that being said - that's the human side of things. The premise in this comic is that, somehow, human emotions are in these AIs, for better or worse. Momo "bonded" to Marigold, and now it's clear Mari has bonded back.
What happens when a lover enters the picture for Marigold? Especially a human one? Jealous Momo? We've seen some of that from Pintsize. Or is she one of those that cares enough about her human to let it go? This really complicates things!
We're not entering new territory, really - but I think we are seeing the beginning of a beautiful friendship.
--- End quote ---
Fact is, we're dealing with an AI for whom the singularity has hit. The robots have gained sentience, and seem to have also acquired human feelings in the process. We'll never know (unless such a fictional occurrence takes place in our world) how a sentience would respond to emotions, or even whether those emotions are "real". Certainly physical pain would need receptors (Marvin's "terrible pain in all the diodes down my left side" in Hitchhiker's was probably hypochondria), but given emotions, there would certainly be emotional pain. How would AnthroPCs deal with the loss of their human after 70-80 years of companionship? Bradbury (I think) dealt with this in the Electric Grandmother story, but with AnthroPCs, the emotions seem to run deeper.
What do y'all think?
Is it cold in here?:
Neurons can fire, not fire, send impulses to other neurons, and change their sensitivity to input. All their activity is some combination of the above. Can machines like us, built from neural networks, love?
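To make that concrete: the whole repertoire fits in a few lines of code. Here's a toy sketch in Python (all names invented for illustration, not from any real neuroscience library) of a single artificial neuron doing exactly those things - summing its inputs, firing or not firing, and changing its sensitivity when it fires (a crude Hebbian-style update):

class Neuron:
    """A toy neuron: fires or doesn't, and changes its input sensitivity."""

    def __init__(self, n_inputs, threshold=1.0, learning_rate=0.1):
        self.weights = [0.5] * n_inputs  # sensitivity to each input line
        self.threshold = threshold
        self.learning_rate = learning_rate

    def step(self, inputs):
        # Fire (output 1) if the weighted input sum crosses the threshold.
        total = sum(w * x for w, x in zip(self.weights, inputs))
        fired = 1 if total >= self.threshold else 0
        if fired:
            # Crude Hebbian plasticity: inputs that were active when the
            # neuron fired become more influential next time.
            self.weights = [w + self.learning_rate * x
                            for w, x in zip(self.weights, inputs)]
        return fired

n = Neuron(n_inputs=3)
print(n.step([1, 0, 1]))  # 1: it fires, and the two active inputs strengthen
print(n.weights)          # [0.6, 0.5, 0.6]

Whether piling up a few billion of these adds up to love is, of course, exactly the question.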
http://en.wikipedia.org/wiki/Vitalism#Foundations_of_chemistry. Chemists used to believe there was some magic principle unique to organic molecules that made them different from inorganic molecules, and that they could never be synthesized from non-living ingredients.
Random Al Yousir:
The way I see it, the big stumbling block is our idea of maths.
Take life. Building and maintaining a living organism is achieved by executing the genetic code. There's a math behind it, which we might discover one day (assuming we manage not to kill ourselves along the way).
Once we have an understanding of the maths behind life, we might be able to discover the maths behind sentience (although there's no way I could know what I'm talking about, I assume that proficiency in the maths of life is a necessary prerequisite for understanding the maths of sentience, the same way that proficiency in the multiplication tables is a necessary prerequisite for understanding category theory).
But I won't hold my breath. From Euclid to Frege took, what? 3600 years?
Edit: According to Wikipedia, it's 2200 years. Huh, wish I could get hold of a proper debugger for my brain.
Skewbrow:
I would think that the AI breakthrough that has happened in the QCverse is related to machine learning. I admit to being clueless about where the Realverse is in machine learning research, but it seems clear that we're behind. It is certainly a prerequisite to any kind of AI singularity taking place (I don't believe in an explosive-speed AI singularity, because I think that even an AI would need quite a bit of time to learn things).
But robotic love? Well, I believe love is a by-product of evolution (a very beautiful one at that), so I am undecided about the possibility of robotic love. Maybe the AnthroPCs have at least learned to emulate feelings well enough that Marigold won't be able to tell the difference? I mean, that is good enough for all the purposes of the present story line.
And Random Al Yousir put the ball on the tee for me. I get your point, but category theory is quite often called "general nonsense" or "abstract nonsense" (most likely you knew this). It is mostly about abstracting/generalizing for the sake of generalization itself. It gives us some useful concepts and saves a bit of work, in that the same theorems don't need to be proven with the exact same ideas over and over again, but it doesn't really say much about any more natural area of mathematics.
Orbert:
--- Quote from: snubnose on 01 Sep 2011, 06:58 ---Thats why Assimovs three laws of robotics do not include any reference to feelings:
--- End quote ---
I've seen Dr. Asimov's name misspelled that way intentionally by folks who don't like him, but I'm assuming it was a typo here.
Doesn't matter. The Three Laws do not apply here, and there's no reason to think that they do. Dr. Asimov came up with the Three Laws to define how robots in his particular fictional universe behave; they are at the core of their programming and cannot be overridden. But nowhere have we been given any reason to believe that AnthroPCs in the QCverse follow those same laws.
Also, I've noticed that nowhere have we discussed the difference between a robot and an android. The simplest definition I can think of is that robots have "simpler" programming: the ability to respond to external stimuli, but not really "think for themselves". A Roomba is a robot. The machine that puts doors on cars at the factory is a robot. Androids have AI and are meant to mimic human behavior, to think for themselves, to infer, to guess, to go beyond their programming. Sci-fi writers over the years have played around with where exactly that programming and self-programming becomes sophisticated enough to mimic emotions, but they've wisely left the question of whether or not these emotions are "real" out of the equation.
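For what it's worth, here's a toy sketch in Python of that distinction (all names invented for illustration): the "robot" maps a stimulus straight to a canned response, while the "android" can rewrite its own rules from experience, i.e. go beyond its initial programming.

class Robot:
    # Fixed stimulus -> response table; nothing here ever changes.
    RULES = {"obstacle": "turn", "dirt": "vacuum", "clear": "forward"}

    def react(self, stimulus):
        return self.RULES.get(stimulus, "idle")

class Android(Robot):
    def __init__(self):
        self.rules = dict(self.RULES)  # starts from its programming...

    def react(self, stimulus):
        return self.rules.get(stimulus, "idle")

    def learn(self, stimulus, better_response):
        # ...but can rewrite its own rules based on experience.
        self.rules[stimulus] = better_response

momo = Android()
momo.learn("sad owner", "comfort")  # a response no Roomba will ever acquire
print(momo.react("sad owner"))      # -> "comfort"

Where along that line mimicking emotions begins is, as I said, the part the writers wisely leave open.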
Jeph seems to have dodged the question of whether Pintsize, Winslow, and Momo are robots or androids by dubbing them AnthroPCs. I would say that they are not robots, but androids. They are sentient, they think for themselves and mimic human behavior, including acting based on horniness (Pintsize anyway) or a sense of humor. Momo demonstrated desire (she wanted a new chassis) and that she cares for Marigold (she wants her to be happy). I don't think it's a huge jump, if it's a jump at all, to believe that they have feelings, or at the very least are capable of developing them.