
WCDT 4176-4180 (13th - 17th, January 2020)


Aenno:
I'd like to point out a few things.

1. Parole conditions are individual. There are (usually) some mandatory/common clauses, but nothing actually prevents the parole board from defining specific clauses for a particular case.
2. The "rare occurrence of an unembodied AI needing a body" still happens often enough for a special body-assignment department with a dedicated budget committee to exist.
3. An "unembodied AI" doesn't exist in a vacuum. It still has to run on a server, that server has to belong to somebody, and somebody has to pay for the juice and the machine resources. AIs living loose in the network are generally believed not to exist. In practical terms, there is no such thing as an unembodied AI; there are only AIs whose body is a server. Or a toaster. So no, there is no reason to believe that living on a server would cost nothing at all.
4. May explicitly says that working as a human companion, or at least within society ("proving a marked decrease in sociopathic tendencies"), is part of the parole deal.
5. May explicitly explains that there is a standard procedure for her situation ($100 to get started and a halfway house).

rtmq0227:
An interesting thought on May's initial crime: if she's so averse to being disembodied, perhaps the initial crime WAS one of desperation. 

Imagine you are a socially/emotionally-immature entity (something like a person who's 5-12 years old) who has only ever been disembodied.  Something happens or changes or manifests and you develop a sudden aversion to being disembodied.  You don't have a chassis to go back to; you've never even had one.  Maybe you don't know anyone who's had one, as your peer group is OTHER disembodied AIs.  You have no real notion of how to go about acquiring a chassis.  You're a financial AI, likely under some social or professional pressure to remain a financial AI, so becoming a companion doesn't occur to you or seem like an option.  You conclude that the only way to acquire a chassis is to embezzle the money.  You figure you're clever and sneaky and no one will know until your plan is complete.  You're also immature, and you figure this embezzlement plan is just as easy/risky for 10K as it is for 10M, so you decide that if you're going to pick your own body, why not be an AWESOME FIGHTER JET?

May's admitted she has impulse control issues.  Keeping that in mind, I think the above scenario is plausible.  It could be that her aversion to being disembodied started before robot jail, and she just hasn't felt comfortable discussing it yet.  At which point, you have to consider the obligations owed to AI who are created for a purpose (to be a banking AI, or a soldier, or an assembly arm, etc.) who develop some psychological aversion to their intended role.  Do they have the right to transition if they don't have the means?  Is their original commissioner responsible for their well-being?  How can it be considered ethical to bring an intelligent/self-aware entity into the world if there are no systems in place to handle the eventuality that some of them reject their intended purpose?

Or is it like grinding for loot?  You keep creating new AIs until you have enough that are predisposed to whatever purpose you need them for.  What happens to the rest?  Presumably job placement programs and the companion program.  "Play the law of averages to your advantage and let the rest be companion AIs" sounds like something a company or bureaucracy would come up with.  It might also explain why there aren't more considerations for cases like May's.

Aenno:
IIRC, AIs aren't created for a purpose. They just emerge, and nobody knows exactly how (at least that's the common knowledge; it could be wrong). So placing obligations on the human whose banking expert system just became an AI today and decided it wants to be a robotic spider to scare humans (er, a social worker) isn't exactly fair.

By the way, there is the curious case of Bubbles. She wasn't created as a soldier; she emerged somehow and applied for military service (which is actually an important point in her development). So she was granted a powerful, top-secret robotic combat body. Then she was discharged. Why was she allowed to keep the body? It's as if a tank driver were allowed to take the tank he'd been driving home with him after discharge.

Essentially, I believe humans just don't have an SOP for such cases and resolve them ad hoc.

Cornelius:
That would be one explanation, and possibly a better one than we've had so far, as it readily accommodates inconsistencies.

Is it cold in here?:

--- Quote from: notsocool ---the government employee explicitly says that disembodied AI ex-convicts usually do not choose to be embodied. This means that May's situation is directly a result of her own choices - hers is "a niche case of a niche case". And with respect, I cannot imagine that many of the jobs that human ex-convicts do cannot be done by a disembodied AI, such as all kinds of industrial work, some service positions, etc.

--- End quote ---

Your points are getting more and more interesting!

The mention of "service positions" jarred loose the thought that May could do things like call center work from a server farm, if the terms of her parole permit it. May would be a lousy customer service rep, but it's not up to the taxpayers to make up for that.

Roko is missing an option. There is such a thing as petitioning a court to revise the terms of parole. Some of the restrictions on May's options could be lifted; there is good reason to.

A variable we don't know about is whether she's even allowed to purchase her own body. Human parolees are allowed to change clothes but forbidden to move without notice or permission. Which is the closer analogy? As mentioned upthread, if it were just a matter of spending her own money, she could wind up as a felon in possession of a weapon.
