Comic Discussion > QUESTIONABLE CONTENT
WCDT Strips 3461-3465 (17-21 April 2017)
pwhodges:
My wife marked the crucial change in our relationship as the moment she let me leave my toothbrush in her bathroom.
ElsaStegosaurus:
--- Quote from: OldGoat on 20 Apr 2017, 20:51 ---Hmmm....a budding Claire/Bubbles friendship. When Claire tells her that she's transitioned genders, will that make Bubbles realize, "I don't HAVE to occupy this combat 'droid chassis"?
--- End quote ---
btw: http://www.questionablecontent.net/view.php?comic=3337
JoeCovenant:
--- Quote from: Case on 20 Apr 2017, 09:02 ---
--- Quote from: JoeCovenant on 20 Apr 2017, 08:32 ---
--- Quote from: oddtail on 20 Apr 2017, 04:20 ---
--- Quote from: JoeCovenant on 20 Apr 2017, 02:33 ---Yeah, but Bubbles' memories literally ARE files, as mentioned above and shown in the strip, cos Bubbles is a computer, not a person.
--- End quote ---
OK, this is nitpicky, I admit, but Bubbles is a computer, not a *human*. It's well established in the QCverse that AIs are actually conscious, intelligent, and have individuality. They also have rights as citizens.
So Bubbles is very much a person, unambiguously so (in the legal, moral, and mental sense). A non-human person, but still.
--- End quote ---
Oh I don't disagree with a single word of that!
Luckily it doesn't impinge on my argument! :)
"Person" or not, their memories are handled as a machine's.
--- End quote ---
Uhmmmh - I don't recall evidence for that. In fact, Bubbles' explanation of how QC-verse AIs function suggests that AI memory recording and storage is significantly more complex. Furthermore, I recall a discussion in Jaron Lanier's "You Are Not a Gadget" where he explains that the entire paradigm of files (and nested folders) used by contemporary operating systems was a design choice that first became popular and later became 'locked in', as he calls it. It is in no way necessary to organize even our conventional Turing machines' OSes that way - so why should it be necessary for something on an entirely different level of complexity and capability? (*)
For example: Recall those stories about a savant who is taken on a helicopter trip over the roofs of Paris, and who later paints stunningly detailed pictures of precisely that aerial view of Paris? No 'normal' human being can do that - not because our brains and senses cannot (they can), but because our memory recording filters for important information (but 'important' is a choice - a choice that can become ... a narrative).
We recall: "Paris from above", "Roofs", "Shingles", "Up in the air", "Loud" (Choppers are loud!), "Anxious" (Fear of falling) -> The things that our brain deemed important (also important for our survival) at the time.
He recalls the position of every fucking window he's seen (no kidding, they actually compared the drawings to photographs taken during the trip) ... But this guy, for all his amazing abilities, can't tie his own shoelaces (literally, not metaphorically) - there's no question which mode of memory encoding grants an evolutionary advantage.
(*) By 'lock-in' Lanier means design choices that become central to the functioning of whole swathes of human infrastructure not necessarily because they are good choices, but because so many subsequent applications crucially rely on them for their operation. Examples include the tunnel diameter of London's Underground, which permits only a certain class of rolling stock, or the MIDI protocol - oftentimes, those choices were 'proof of concept' designs that impose significant limitations later on. Lanier theorizes that this is compounded in computer science because of conventional programs' 'brittleness' - a single misplaced digit or character can make the entire operation impossible, or - if the coder has not implemented good error handling - even shut down the entire system, including other users' access to services.
--- End quote ---
Why does 'complexity' alter the fact that AIs are still machines?
And machines (as we know them - and we have no evidence to the contrary) do not 'forget' data unless programmed (or made) to do so.
I completely understand and agree with arguing for the *personhood* of the AIs in this universe.
But that does not alter the fact that they are not biological entities, and so the same arguments can't be used to compare them with the way we 'keen, neat humans' work.
To use your example above, the only evidence we have to date (Emily in Bubbles' mind) is that AIs are like those savants you discuss: they have total recall unless acted upon by an outside influence... That, of course, leaves aside degradation of storage systems etc... "The centre cannot hold..." and all that.
But unless Bubbles is VERY old, surely the AI version of dementia hasn't kicked in?
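Purely as an illustration of the distinction being argued here - lossy, salience-filtered human encoding versus machine storage where forgetting is an explicit act - the two might be sketched like this (all names are hypothetical; this is a cartoon of the argument, not a claim about how QC's AIs actually work):

```python
# Toy sketch: two "memory" strategies for the same event.
# Hypothetical names throughout; illustrative only.

RAW_EVENT = {
    "view": "Paris rooftops from a helicopter",
    "windows_seen": list(range(10_000)),  # stand-in for exhaustive detail
    "noise_db": 95,
    "emotion": "anxious",
}

def narrative_encode(event):
    """Human-style lossy encoding: keep only what was salient."""
    return {
        "gist": event["view"],
        "loud": event["noise_db"] > 85,
        "felt": event["emotion"],
    }  # the 10,000 windows are simply never stored

class VerbatimStore:
    """Machine-style storage: everything persists until explicitly deleted."""
    def __init__(self):
        self._memories = []

    def record(self, event):
        self._memories.append(dict(event))  # total recall by default

    def delete(self, index):
        del self._memories[index]           # forgetting must be an act

    def recall(self):
        return self._memories

human_memory = narrative_encode(RAW_EVENT)
machine = VerbatimStore()
machine.record(RAW_EVENT)

print(human_memory)                              # gist, loudness, feeling
print(len(machine.recall()[0]["windows_seen"]))  # every window, still there
machine.delete(0)
print(machine.recall())                          # [] - gone only once deleted
```

The design point under debate is exactly the last line: in the verbatim store, nothing ever disappears on its own; deletion has to be invoked.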
Tova:
--- Quote from: JoeCovenant on 21 Apr 2017, 02:34 ---Why does 'complexity' alter the fact that AIs are still machines?
And machines (as we know them - and we have no evidence to the contrary) do not 'forget' data unless programmed (or made) to do so.
I completely understand and agree with arguing for the *personhood* of the AIs in this universe.
But that does not alter the fact that they are not biological entities, and so the same arguments can't be used to compare them with the way we 'keen, neat humans' work.
To use your example above, the only evidence we have to date (Emily in Bubbles' mind) is that AIs are like those savants you discuss: they have total recall unless acted upon by an outside influence... That, of course, leaves aside degradation of storage systems etc... "The centre cannot hold..." and all that.
But unless Bubbles is VERY old, surely the AI version of dementia hasn't kicked in?
--- End quote ---
I think that the key phrase in your post is "(machines) as we know them."
AIs in QC are not machines as we know them. What's more, they are an emergent species, and thus may well share some characteristics with biological species. But, as you say, we have no real evidence, so we are left to speculate.
I am not here to categorically state that you are wrong, merely to ask why you seem so confident in your opinion that AIs have perfect recall.
jheartney:
A pair of couples sharing a two-bedroom apartment is hardly unusual for twenty-somethings. The fact that Bubbles won't (for the most part) need the bathroom helps a bit, and assuming the electricity account is in the tenants' name, the mole-people landlord shouldn't care that much.
Eventually the logic of the situation will force some adjustments. Either Marten/Claire eventually vacate to their own place, or Faye/Bubbles do. Or everyone could move to a bigger place once they get tired of the cramped quarters. For the sake of the comic, the vacating couple could end up in another apartment in the same building if one opens up (maybe Juicy will get a job in another state).