WCDT: 2465-2469 (10-14 June, 2013) Weekly Comic Discussion Thread

Zebediah:
Well, I've often argued that a sentient AI would have profoundly different motivations than we do, simply because of the hardware differences. Our base motivations are the result of millions of years of evolution selecting for the traits that best kept us alive and best enabled us to reproduce in the environment we were in at the time. Our intelligence hasn't completely freed us from the instincts we evolved back on the African savannah or before.

AIs are the product of a vastly different evolutionary experience - they are the end result of decades of humans trying to build a better computational device. So an AI's driving motivation might well wind up being to process the most data, or to provide the most useful information (to us or to another AI). They wouldn't have a sex drive as such, because biological reproduction isn't an issue. Nor would they have any particular desire to dominate others, or to amass wealth. What would they want? More processing power and more data to process, hence more bandwidth. Beyond that it's anyone's guess what they might do.

Valdís:
But how does that change a single thing about how the causal reality for both of us works? Just because their reasons can have fundamentally different causes doesn't mean they don't work in the same way.

I don't see how a particular conscious animal's or machine's psychology would have any possible impact on that whatsoever. I mean, there are plenty of people and animals already with very different approaches to all kinds of things. It's not like I was speaking as though the statement I made is only valid if the beliefs were formulated within a "human 22-year-old Nordic trans woman's brain with issues". :-P

pwhodges:

--- Quote from: Shjade on 10 Jun 2013, 12:44 ---Belief is a choice, a decision, a conscious frame of mind. It's not something built into you.
--- End quote ---

Ways of thinking can be trained into you at an early age, and the memories of that training do not remain conscious. Some beliefs can be such a fundamental part of your early upbringing that you don't even have the awareness of them necessary to question them (unless you gain it anew as a result of later critical thinking); such beliefs can just as well be described as built in.

Loki:
Wouldn't an AI that we recognize as "human" be human-like by virtue of human bias? If a human evaluates whether a robot is sapient (and thus whether it is an AI), they cannot help but apply human standards to it, such as "does this subject care about others?" and so on. Other criteria, such as "does this subject have arsuajwry?", would not apply, because humans don't have a concept of "arsuajwry" (I just created the term by typing semi-randomly on my keyboard).

I fear I am not doing a good job of explaining what I mean. Basically, an Artificial Intelligence designed by humans would be considered "human" because it is similar to a human intelligence. Thus, those we consider AIs would think like us to some degree. There may very well be subjects out there in the QC-verse who are intelligent in their own way, but we wouldn't recognize them as such, because we are but human.

For a more succinct view on this, compare Randall Munroe:

[xkcd comic image]
Warning - while you were typing a human intelligence has weighed in on the matter. You may wish to review your existence.

Tobimaro:
I had to vote for Battle Spatula because of this (I'm not a big fan of Ranma 1/2, but I thought of...):

[spoiler image]
