AI wonderings and discussion
Zebediah:
That was Momo and Marigold, with Hannelore poking Momo’s skin post-shower in the subsequent comic.
baronvonfritz:
How does the QC universe deal with the ownership of sentient beings? May noted that AIs are technically citizens, and obviously certain AIs are free to pursue whatever destiny they choose, like May, Bubs, Roko, Melon, and my favorite, Yay Newfriend! Then there's some ambiguity with Winslow, Momo, and various institutional AIs like Station, government office AIs, the various AIs partnered with humans throughout the archives, and the body shop AIs. Are those AIs enterprise/franchise/laboratory-owned laborers and assistants? Finally, Pintsize is obviously "owned" by Marten, but the similarly modeled Nelson seems to be autonomous. Momo and Winslow were once small anthro-PCs, fully sentient and completely owned by Marigold and Hannelore respectively; they were gifted new bodies and are now ambiguously "free citizens", yet choose to stay with their previous "owners". Pintsize was bought and paid for by Marten, fully and horribly sentient, and Marten even bought him a new chassis with frickin' laser beams and magic thumbs. His freedom hasn't been expressly covered, but Pintsize doesn't seem all that interested in embodiment and autonomy (and is mostly used for the most satisfyingly crude comic relief, which I'd hate to see change). Still, AI ownership and autonomy haven't been addressed to the level that QC has delved into personal identities and inter-sentient relationships.
Gnabberwocky:
I can't find the comic--it's somewhere between 2200 and 2500--but Marten has a conversation with Pintsize and Momo about how he got Pintsize. It's less of a "purchase" and more of a "companionship contract." Either one of them is free to break off the contract at any time, for any reason, with no repercussions. Those are the AIs that specifically elect to be companions; some, like Millefeuille and Beepatrice, choose civilian life, while others, like Station or the sentient toaster from when Momo gets her body, choose to function as sentient machines.
N.N. Marf:
I would like to know what would happen with Station were they, for example, to quit tomorrow, or give two weeks' notice. They're running a whole lot of stuff, and we've seen that their slacking off can cause at least minor problems. I'd imagine that's been planned for, with procedures in place to quickly replace vital functions: non-sapient systems that can handle things well enough, or other sapient software ready to take over some of those roles. A human interface would be nice. Were I commissioning the station, I'd plan it so that, worst case, each class of person aboard (singletons included) could provide for the needs of that class without much urgent training, via simple interfaces or well-trained people on board.
Mr_Rose:
Short version: humans own the chassis of a given companion AI, but that individual volunteers to be their companion, is matched to them via a service, and is free to leave at any time. I'm not sure if the contract requires paying back the cost of the original chassis if they choose to take it with them, but that seems likely.
Of course, that's the new model, post-AI-Rights affirmation. Previously, human ownership was total and leaving was only technically possible, though companionship was still voluntary at the outset. Those AIs would have become emancipated and either signed on to the new model of the arrangement at that time or left, probably free and clear of any perceived debt.
As for non-Companion AIs, I believe they would most likely have been sold to their purchaser under terms similar to any other specialty industrial equipment, including a maintenance contract. I expect the fee would have been restructured in those cases into an employment contract, possibly similar to a temp agency arrangement, with part of the salary going to the AI, part going to the manufacturer as "health insurance", and the AI themselves free to quit and seek better terms elsewhere.
For cases like Station, I strongly suspect there's a whole section of the human staff with "just in case" emergency maintenance responsibilities, should he ever be unavailable for some reason. He's also specifically capable of walling off the interactive part of his mind so he can be "drunk" without affecting his duties. As for ownership and independence, I wouldn't be at all surprised if Daddy E-C basically treated him as a free individual from the start and had a substantial "salary" set aside in a trust fund, so that the moment the AI Rights bill was passed he could hand it over to Station: "here's your pay for the last few years, would you like to continue your duties at the same rate but paid monthly?"
Though I'm not sure Station actually can leave, given his integration and enormously more complex instantiation. He probably has enough money to buy the relevant hardware, but where would he put it, and what would he do then? There's basically no one who could pay him appropriately for his capabilities, not least because a lot of them depend on being integrated into a structure. Maybe he could set himself up as a museum of some kind?