WCDT 4176-4180 (13th - 17th, January 2020)
notsocool:
--- Quote from: Cornelius on 17 Jan 2020, 05:32 ---An interesting question that, if I'm not mistaken, hasn't been dealt with in universe: is there such a thing as a natural death for AI? I.e. not through accident - as the Crushbot incident could have been, had Roko not had a reinforced core. If AI should be inextricably linked to their substrate, then that could wear out. On the other hand, we've seen Pintsize being backed up, and I seem to remember Momo being transferred by data cable?
--- End quote ---
This is a very interesting discussion of what constitutes an AI "self", but that's a philosophical question that belongs in a different forum. If an AI can be backed up, and if they can transfer their consciousness over a cable, it is clear that an AI is not hardware but software - they're the information stored on the computer (or on the computer inside an AI body). Of course, you'd still have to take care of your AI core, because if you smash it without a backup, the software stored on it is gone with it.
JoeCovenant:
--- Quote from: notsocool on 17 Jan 2020, 03:58 ---
Sure! But do you think it should be new and be of good quality?
--- End quote ---
Doesn't have to be new.
And only needs to be decent quality.
We don't know how *new* May's is - but we DO know it is not of decent quality.
--- Quote from: notsocool on 17 Jan 2020, 03:58 ---This is not my point. I am all for social welfare. My point is, should we give a parolee medicare, shelters and free clothes if law-abiding people were not entitled to these things? I want parolees to be treated as well as people with no criminal record. But what you are suggesting is that we treat them better than people with no criminal record!
--- End quote ---
Personally, I think what he is suggesting is that people who fall outside the "norms" of society often need a helping hand to get back into them.
In a nutshell, treat them with basic human decency.
Is it cold in here?:
--- Quote from: Aenno ---it's a social problem that she was able to commit her crime in the first place.
--- End quote ---
That brings up an interesting question. Is QC AI code transparent enough that poor impulse control could show on a diagnostic readout? We know there's a maturity scale. Why was she trusted with $750 million?
@notsocool, I completely agree with you that it would be unfair to issue May a Momo-class chassis. I don't remember anyone advocating it. I liked your rent to own idea. Combine that with a basic second-hand model, analogous to a car with hand-cranked windows and 80,000 miles but which someone is still willing to put a warranty on. What would you think of that? To me it looks like a good compromise among compassion, practicality, and the interests of the taxpayers.
Here's another angle. If they'd refused to issue her a body at all, that would have been one thing. Isn't knowingly putting her into a defective one torture?
You made yet another sound point, that if May's parole conditions interfere with earning an honest living they can and should be changed. That is hard to do without a lawyer. Maybe Roko should consider the option of pounding the pavement to find a pro bono attorney to modify the conditions (not that we actually know what they are, since "digital work" was not defined).
Aenno:
--- Quote ---No. The "digital work" referred to is explicitly explained to be the renting out of processor power the way Pintsize does. May explains that she is not allowed to do this.
--- End quote ---
No, those are different clauses. Well, I believe one is contained in the other - the first is a subset of the second.
There is strip 3828, where May explains she can't rent out her processor power: "if you commit massive bank fraud they don't let you plug your processors for cash anymore". That was an answer to the question "can you do it", in the specific context of how Pintsize makes money.
But there is also strip 4031, where, speaking with May's parole officer, Roko says directly: "she can't do digital work because of the probation rules she's so diligently following".
--- Quote ---This is as clear as can be that there are AI offenders who are disembodied, and continue to be disembodied after release. May's parole conditions should be the same as theirs. This is the "ridiculous" part of my statement: if May is somehow being treated differently from other disembodied AI, that is ridiculous, and they should have her parole conditions revised.
--- End quote ---
Let the US Department of Justice answer this: "The Commission always considers the individual's situation and may waive this or any other standard requirement if it sees fit to do so. On the other hand, special requirements may be added and must be met before release."
Actually, the parole conditions in May's case are a) very reasonable and sensible, and b) working. She is a naughty goblin, but she is quite a likeable naughty goblin. She follows the rules, she has an honest (even if shitty) job, she stays out of trouble, she controls herself, she empathizes, she accepts and offers apologies. She could be an advertisement for the parole system. It's not the parole conditions she has a problem with - because, actually, even if she were allowed to rent out her processors or do some kind of digital off-site work, her hardware just isn't stable enough. Anything can break at any time. Including the power systems supporting her AI core, by the way.
--- Quote ---There is no way that a regular server would cost more than the exact same computer PLUS arms, legs, and a face.
--- End quote ---
Why the hell should it be the *same* computer? From everything we've seen in the comic, post-Singularity AI bodies are specialized systems built to contain an AI and to be operated by it at a basic level. A server with such limited functionality would be very impractical.
Still, that's not exactly my point. My point is that in any case it's obvious (May is living proof) that, under current regulations, the government is obliged to provide a released AI with some kind of hardware; it's quite possible there are conditions attached, but May's situation obviously meets them. If that weren't the case, May's request would simply have been denied. So a regulation describing this almost certainly exists, because there is no way a US government worker would do it at his own risk without any supporting regulation. He would be fired if he did.
And if this obligation exists (think about SNAP, or Section 8 - federal programs providing a basic standard of living for people who can't afford it; by the way, one of the conservative criticisms of Section 8 is "hey, this means the problems of low-income people will spread to the suburbs!", which is hilarious), then there are, or at least should be, standards. You can't hand out spoiled food as part of SNAP, or provide housing that doesn't meet the warranty of habitability under Section 8.
If such a standard doesn't exist, it should be introduced.
--- Quote ---May is not a prisoner, she is a parolee, and the government is not responsible for her.
--- End quote ---
In a legal or a moral sense?
Actually, the government is responsible in both senses. The government, as the parole system, is responsible for May's behavior, obliged to impose restrictions on her and to lock her up if she is a threat to society. In a moral sense, when you take power over somebody, you automatically become responsible for them, in proportion to the power taken. At least, that's my deepest conviction.
--- Quote ---Human parolees do not get those things.
--- End quote ---
Human parolees in the US are able to get SNAP, are able (in most states) to apply for a Section 8 voucher (again, conservatives hate it), and they can apply for low-income assistance as long as they're not actually in prison. At least as far as I checked, they can do all of this in MA.
And the parole system is supposed to help. Let the US DoJ speak again: "Parole has a three-fold purpose: (1) through the assistance of the United States Probation Officer, a parolee may obtain help with problems concerning employment, residence, finances, or other personal problems which often trouble a person trying to adjust to life upon release from prison; (2) parole protects society because it helps former prisoners get established in the community and thus prevents many situations in which they might commit a new offense; and (3) parole prevents needless imprisonment of those who are not likely to commit further crime and who meet the criteria for parole."
You see, I can't shake the thought that when you say "I want parolees to be treated as well as people with no criminal record. But what you are suggesting is that we treat them better than people with no criminal record!", you're actually saying "I want parolees to be treated as well as people with no criminal record, but only AFTER people with no criminal record".
--- Quote ---That brings up an interesting question. Is QC AI code transparent enough that poor impulse control could show on a diagnostic readout? We know there's a maturity scale. Why was she trusted with $750 million?
--- End quote ---
I believe she was trusted with the $750 million as a non-sapient banking expert system with a teen-girl avatar for staff amusement, and then the AI emerged.
immortalfrieza:
--- Quote from: Cornelius on 17 Jan 2020, 05:32 ---An interesting question that, if I'm not mistaken, hasn't been dealt with in universe: is there such a thing as a natural death for AI? I.e. not through accident - as the Crushbot incident could have been, had Roko not had a reinforced core. If AI should be inextricably linked to their substrate, then that could wear out. On the other hand, we've seen Pintsize being backed up, and I seem to remember Momo being transferred by data cable?
--- End quote ---
Barring corruption or complete destruction of their code without a backup, an AI can live as long as there is hardware capable of containing it - which was brought up pretty early in the comic, when Pintsize asked that exact question.