Comic Discussion > QUESTIONABLE CONTENT
WCDT 22-26 August 2011 (1996-2000)
stoutfiles:
--- Quote from: gangler on 22 Aug 2011, 16:41 ---The entire concept of an AI within this context is that it's not as simple as "if(stabbed == true) emulate_pain();". It's developed the full complexities of the human mind. It reads from a script the same way you do. It is in essence a sentient being.
--- End quote ---
There are two ways a program can become sentient.
1) It is written that way. The human mind's storage capacity is projected to be around 500-1000 TB, but for QC's sake we'll say 97 TB. That could be plausible given the amount of memory we supposedly don't use, whereas the PC would use all of its memory. It has a database full of facts and emotions, and calls upon that database based on the environment, perfectly emulating a human.
2) It is programmed with a starter program that writes new programs based on the environment. Same idea, except even more like a human: it constantly adapts and alters itself based on the environment, uploading information to and deleting it from its database.
Both are just like a human, but entirely artificial: non-organic parts assembled perfectly to make something that only appears organic. I will never see this as real, although I guess others might. It's like owning a Furby... how advanced must my Furby's program be before it gets upgraded from a toy to "as important as a human"? My answer is that it will always be A.I., always artificial.
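The difference between those two routes can be sketched in a few lines of toy Python. All class and method names here are invented for illustration; this is just the contrast between a fixed stimulus-response lookup and a program that rewrites its own database, not a claim about how real AI works:

```python
class ScriptedAI:
    """Route 1: a fixed database of stimulus -> response pairs, set at creation."""
    def __init__(self, database):
        self.database = dict(database)  # never changes after construction

    def react(self, stimulus):
        return self.database.get(stimulus, "no scripted response")


class AdaptiveAI:
    """Route 2: starts from a seed database and alters itself from experience."""
    def __init__(self, seed):
        self.database = dict(seed)

    def react(self, stimulus):
        return self.database.get(stimulus, "no learned response")

    def learn(self, stimulus, response):
        self.database[stimulus] = response  # uploads new information

    def forget(self, stimulus):
        self.database.pop(stimulus, None)   # deletes information


scripted = ScriptedAI({"stabbed": "emulate_pain"})
adaptive = AdaptiveAI({"stabbed": "emulate_pain"})

# Only the adaptive program can respond to something outside its original script.
adaptive.learn("complimented", "emulate_joy")
print(scripted.react("complimented"))  # no scripted response
print(adaptive.react("complimented"))  # emulate_joy
```

Route 2 is the stronger analogy to a human mind in the argument above, since the program's behavior after a year depends on what it has experienced, not just on what it shipped with.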
--- Quote from: gangler on 22 Aug 2011, 16:41 ---Finding that particular function and removing it from the overall program would be no less intricate a task than finding the specific neural synapses within the human brain that are responsible for your desiring to eat something crunchy on occasion. Most likely you couldn't remove that without damaging the overall structure in some similar and lasting way as well.
--- End quote ---
We're making pretty good progress on understanding the human brain; the only thing holding us back is its complexity. Given enough time and money we will understand it one day, and finding specific functions in order to remove them will be possible.
--- Quote from: Boradis on 22 Aug 2011, 15:10 ---
Now I'll actually talk about the plot of the strip. Presumably Marigold's intention was to buy Momo a $30k chassis on this fine morning. Momo seems to have misinterpreted for reasons which are entirely a mystery to me, but I have to ask.... Marigold was (and perhaps still is...) gonna drop 30 large on a new chassis, even though it was implied in the last strip that she would have difficulty in affording it? I really hope this is purely because she cares for Momo and would like her to be happy.
If it turns out it's because she just needs her goddamn games that badly, I'm going to be disappointed. I said last week that while there's nothing especially wrong with being a habitual gamer, a three-day break is something she ought to be able to cope with. I went for about ten days not that long ago without finding much time to game. I think maybe I got in about three hours in all that time... I wasn't jonesing for my next fix, ya know?
--- End quote ---
If that happens I won't know whether to laugh out loud or be depressed that this comic has jumped the shark. This comic is, as far as I know, supposed to be about the characters: the living, breathing ones that I can relate to in some ways. The A.I.s change the whole mood of the comic and raise so many questions that bringing them to the forefront just causes a world of logic problems.
wrwight:
I seem to recall Flynn typing out a conversation with Tron, or maybe it was Bradley, or maybe Flynn was typing to Clu; I don't know, it's been a little while. In any case, it seemed similar to the MCP in that respect. It's true, though, that all of the "programs" in Tron seemed more like AI, so it was harder to make the distinction.
Also, re: wall of text,
I think the newspost takes care of your first argument, "Artificial intelligences are created in a virtual environment, where they are stored in a 'creche' of other AIs in their generation. When bootstrapped to self-awareness, they are given a choice of function- commercial use (AnthroPCs), military, scientific, etc, or allowed to subsume in the global meta-AI. If they choose to go into 'retail' they are allowed to choose a self-identity and are shipped to a reputable 'dealer' (such as Idoru, in today's strip) where they are put up for 'sale.'" so the original chassis is the AnthroPC's, not the owner's if I interpret that correctly.
Your second argument, I would go with 3. Combining the salesrobot's attitude with the ideas presented in the newspost, it seems the most logical of the possibilities.
Next, I would guess there is some code governing upgrades, like a modification to the original contract, so that if/when the relationship is terminated, the owner gets a refund for the current chassis, not just the original. That being said, I think the aesthetics would be something that both parties would probably have to agree to, since on the one hand, I'm putting lots of cash into this that I might not ever see again, and on the other, the robot is going to have to live with the decision.
Here, you change it up a bit, and I don't have any idea what laws, social or actual, would exist regarding this, though I have doubts that there are actual laws against it, considering what Hannelore's father sent her, though that same mini-arc shows some of the social ideas regarding romantic relationships with robots.
Finally, I don't think she was going to buy Momo a $30k chassis; I think they were there to shop for a new one, to see if maybe there was one she liked that Marigold could afford. Maybe that's because that's something my parents would do. "We can't get you the brand new car you wanted, but let's go to the used lot and see if there's something you like" sort of thing (my parents weren't quite that generous when I got my license, but the idea stands).
EDIT: It's amazing what a missing bracket will do
gangler:
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---I certainly think Jeph's stance on AI creates a million problems for the story. How do you buy something that is a sentient being? The newspost implies that you don't, you simply visit the shop in order to present yourself to the AI available and form a relationship. Someone still has to buy the hardware. What if after a year or three, the AI decides it doesn't want to continue in a relationship with you? It might even be justified in doing so. Suppose you're not a very nice person?
--- End quote ---
Breaks it off and heads out on its own.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---Problem is, the AI is currently inhabiting a pile of hardware that you paid for. If it wants to leave, does it get to keep the chassis? I sincerely hope not. While the AI itself might be a free sentient being, the hardware remains the property of whoever bought it. We already established last week that AnthroPC chassis (Sony ones at least...) are an expensive investment. So if it gets to leave without the chassis, what form does it take? A wandering USB stick? Does it release itself into the internet in the form of pure data?
--- End quote ---
Probably customary to pay the buyer back for the chassis, or just going back to the store and waiting for someone new as pure data is also an option. That's really a matter of social etiquette though. I'm sure some leave without saying a word and keep the chassis. I'm sure some owners don't particularly mind if they keep it. The dividing of belongings after a shared life has always been a sketchy issue whether it's moving out of the parents place, divorce, or just trying to figure out who gets which tv after a couple roommates split ways.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---Then we have the flipside of the coin. What if you want to get rid of the AnthroPC? It is usually possible for us to terminate relationships that we have with others. If you tell your girlfriend that you want to break up with her, ultimately she doesn't get any choice in the matter. She can (if she should wish) try to change your mind, but if one person chooses to dissolve a relationship, then that's all it takes. Same applies to our relationships with friends. I suppose it should be the same with AnthroPCs, but Leda's reaction seems to indicate that this is not seen as acceptable. I can therefore see three possibilities.
--- End quote ---
Either party can break it off; it says so right in the newspost. Bringing the AnthroPC over to the store without telling it what's happening, and then returning it as if it were a commodity, would still be an incredibly tasteless and callous act, though.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---snip
--- End quote ---
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---What about upgrades? If your AnthroPC desires an upgrade or a whole new chassis that you can't afford, it seems eminently reasonable to deny it to them. We do much the same with children, although their technological desires tend to be external and not internal. What if the AnthroPC doesn't agree with you on your choice of chassis? They seem to come in a wide range of aesthetic styles. If I'm the one paying, I expect to choose the style and colour. Is that fair to the AnthroPC? What if I chose a Sony KawaiiPC chassis because it appeals to me in ways that the AI might not approve of? I hasten to add that I have no such leanings, but some ronery otaku certainly would.
--- End quote ---
That's just a matter of social conduct. If the AnthroPC really wants its own way, it'll get a job like the rest of us. Momo was considering doing just that last week.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---Since there are apparently life-size humanoid chassis (Leda seems to be one) then that raises an obvious question. It may be rather a delicate topic, but it's an obvious implication of all that Jeph has written on the topic (in text and comic form) to wonder what about romance? Momo-tan appears to have a crush on Sven, and Winslow has certain tender feelings for Hannelore. If she should obtain a life-size chassis, then presumably it would be OK for her to pursue this. After all, we seem to be treating the AI as people, here. I can only presume that it would be forbidden for people to force... sexual behaviour on their AnthroPCs, but what if they engage in such behaviour voluntarily? Some of them seem to like their human companions a great deal, and... arguably it should be their choice.
--- End quote ---
Two sentient beings pursuing love together. I don't see where the problem comes from. The rape laws already in place would be sufficient.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---If AnthroPCs have the status of pets or children, that neatly takes care of any questions arising out of that from a legal perspective. In practice though, I dare to say that some people might ignore the legality if both parties are willing. Momo and Leda both have fingers, and really that is all you need for some kind of sexual interaction. Certainly for all their artificial nature and lack of any more than a superficial resemblance to human genitals, vibrating sex toys have remained enduringly popular. How much more popular an AnthroPC which with the right upgrades might conceivably be able to vibrate, and with which you can have an actual relationship? Even if the manufacturers do not fill the gap in the market for sexual upgrades, no doubt some entrepreneur would.
--- End quote ---
I don't get what you're driving at. They custom-make their bodies according to their own priorities and the priorities of the party paying for it. Sure, I see no reason why vibration wouldn't be a function some would invest in. If romance happens, it happens. As long as everyone is consenting, then it's all good.
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---snip
--- End quote ---
Boradis:
--- Quote from: TheBiscuit on 22 Aug 2011, 17:09 ---
2) The AnthroPC has the status of a child. It is your legal responsibility to care for them OR to make other provisions for them which would be analogous to adoption. Mistreatment and/or abandonment of the child is governed by applicable laws and subject to penalties.
--- End quote ---
This seems to me the most reasonable legal position, given what little Jeph has said about this SF element. An AI in an AnthroPC chassis is a perpetual fish out of water, or a squirrel under water if you remember Sandy Cheeks from SpongeBob.
They simply don't exist outside their chassis, no matter how durable it may be, and they're still an electronic entity in a medium which does not support them. As such, the human they bond with is literally the parent to a "soul" whose well-being they are responsible for.
And given how each AnthroPC's persona so strongly reflects its owner's inner self, it seems the nurturing of these parents is pretty influential on their development:
* Momo is literally an anime character.
* Winslow is a starkly clean iPod (just like EVE from "WALL-E").
* Pintsize is a pulsating mass of every sexual fetish imaginable.
Yes, I just called Marten a pervert extraordinaire. Mark my words, when he finally comes out to himself he'll eclipse Tai, Pintsize and his mom with the breadth and depth of his debauchery.
Although that's a risky narrative twist to take. Audiences tend to accept kinky women and robots, but find kinky men dangerous.
gangler:
I like that angle, but I don't think Pintsize is properly reflective of Marten in that way. He seems more akin to a latchkey kid than anything else; he probably was given a lot of freedom. He does seem to just generally do his own thing, and Marten makes only minimal effort to stop even his most outlandish behavior, generally just coming in at the end of the day and cleaning up Pintsize's messes.