
WCDT Strips 3886 - 3890 (3rd - 7th December 2018)


TheEvilDog:
I'd argue against turning off their emotions. It's unhealthy for people not to process complex emotions, and it could be an even worse problem for an AI to do the same. Likewise, while an AI might be physically superior to a human, a cold, emotionless rationality would likely stymie AI growth eventually.

Thrudd:

--- Quote from: BlueAmaranth on 05 Dec 2018, 09:58 ---Does anyone else think it's inappropriate of Roko to ask Elliot to mentor her in bread? I feel like she'd be making him a participant in her fetish without his consent. And if she actually takes a job at the bakery, then extend that to everyone else who works there and all the customers who come in. Like, there's a difference in my mind between just coming in to purchase bread vs "Tell me everything about this bread while I secretly get off on it."

--- End quote ---

I am firmly on the other side of the fence on this issue, with one caveat: "Don't do that at work." Then again, when it comes to "interpersonal happy fun time," unless it is part of the job description, "Don't do that at work" pretty much applies to everyone.

Can she be excited in more ways than one about bread? - yes

Can she learn under a master to make bread and share the objects of her love with others? - yes
Can she share her passion for bread with others? - As long as she does not cross the line in sharing then yes

Heck - you could substitute any activity or object for bread and the results would be exactly the same.
Sportsball, DramaTV, Games, Movies, Comics, The secret life of slugs ......
Remember kids: don't do slugs.

So a word to the wise moment here - Don't overshare and freak-out those not as excited about the subject as you are.
Okay, on the subject of AI and emotions, a few points.

You can't turn your own emotions on or off, so why expect any other intelligence to be able to?
Emotions are not just chemically based but mind based as well, and I don't see that great a difference in the psychology.

I do agree that our esteemed creator may have drifted and lost track of some of his characters' unique characteristics of late.

I personally hope that he has learned some things over the years:

You can't please everyone all the time
You can and will make mistakes
Things will go wrong
Some people are mean
Some people are immature and petulant
Some people are uninformed / ignorant
Some people are jerks
Some people are idiots - it is untreatable and must be handled with caution
Some People are evil
You are a people and can be any of the above, just don't be evil
Strategy, Tactics, Logistics - understand which is which
A plan that gets you where you need to go does not mean you can't take any side trips along the way
Humour and Drama are like salt and pepper - spices that enrich the flavour as long as you don't overdo it
Something that seems too good to be true usually isn't true - be that a product on TV, a social movement or your latest creation
Jumping in headfirst is fun and exciting but you will end up hurt by what is below the surface if you don't check the waters first

themacnut:
I don't think AI should be emotionless, far from it - having emotions helps them relate better to humans and vice-versa. I just think they should be in better control of their emotions, since they're generated differently than humans' are (no hormonal/chemical component, for one thing). Put it this way - imagine the damage a powerful AI like Station could do if it lost emotional control of itself. It would become a danger not only to the humans within it, but also those on the planet below.

Put it another way - even we humans have to learn some degree of emotional control to function in society. It follows that AI need to do the same, only they don't have the nearly two decades that humans get to learn that control during childhood and adolescence. So their emotional control would need to be at least partially "built-in" at the time of their installation in a body, maybe as some form of emotional control subroutines?

sitnspin:
AI in the QC verse are not "made", they are emergent entities, same as humans. They aren't coded by anyone.

Thrillho:
Jeph has also, I think, said before that he has deliberately not landed on an explanation for why the AIs are so much like humans in their design. So regardless of why we may think AIs have emotions in this universe, and whether we think that is right, it certainly seems to me that the AIs have emotions because humans do.
