Comic Discussion > QUESTIONABLE CONTENT
WCDT strips 4211 to 4215 (2nd to 6th March 2020)
Gus_Smedstad:
Beeps and Milli must KEEP SUMMER SAFE.
Which is the other way the robot apocalypse happens, aside from the more straightforward Skynet scenario.
BenRG:
The First Law of Robotics
"A robot shall not harm nor by inaction allow harm to come to a human."
Beepatrice thinks:
"Oh dear, that human really is rather large and muscular. I don't think that I thought this through, so time to find a way to backpedal before there's violence!"
That's why the QC synthetics aren't Asimovian robots: they're far more aware of themselves and their own safety, and will actually experience second thoughts. They're far more sentient than Three Laws minds can be because they don't seem to have anything like hard-coded directives. That said, this does support something I said in an earlier post: some synthetics seem to have a deep, personal need to be around humans, to be useful to them and be there for them. Roko's fully-stocked kitchen, Momo watching Sam sleep, and this sudden friendship all seem to point to such a psychological drive.
Now, am I the only one who thinks that Beeps' charge ended with her bouncing off Elliot's chest and back into Millefeuille's arms?
TRenn:
--- Quote from: sitnspin on 04 Mar 2020, 09:53 ---Three wheeled vehicles, especially ones with the third wheel in the front, are more unstable than two-wheeled ones.
--- End quote ---
Jeremy Clarkson can confirm.
Gus_Smedstad:
--- Quote from: BenRG on 05 Mar 2020, 23:27 ---They're far more sentient than Three Laws minds can be because they don't seem to have anything like hard-coded directives.
--- End quote ---
I agree that QC’s synthetics aren’t Three Laws Safe, but I think this statement about “Three Laws minds being less sentient” is highly debatable.
It’s easy to get the impression that Three Laws robots are inherently mentally shallow, because most of Asimov’s stories depicted them as barely sentient. I think that’s more about Asimov and his writing than an inescapable consequence of the Three Laws.
I’d argue that humans have lots of hardwired directives, and we aren’t noticeably less sentient as a result. The way humans (most humans, not all) obsess over sex and reproduction isn’t exactly rational, for example. You could say a Three Laws robot, while constrained in certain areas that humans are not, is also not constrained by many of the things that routinely constrain humans.
For an interesting treatment of the subject, there's Ian Tregillis and his clockwork automatons. They're governed by a whole hierarchy of laws, far more than three. They have Asimov's laws, but also other laws that are even higher priority than "no robot may harm a human or through inaction allow them to be harmed." For example, there's a technology-protection law, which states that a robot (or clakker, as they're called) must protect the trade secrets of clakker technology, even to the point of murder.
These laws are described as “geases,” and they’re compulsions that are enforced by pain. The books are told from the point of view of clakkers. They’re definitely fully sentient, but constantly whipped into narrow bands of behavior by the mental pain of those geases.
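That "hierarchy of laws, far more than three" idea is basically a priority-ordered rule check. Here's a toy sketch of how such a hierarchy could work, with the highest-priority directive checked first; all the directive names and the action format are invented for illustration, not anything from the books:

```python
# Toy sketch (not from Tregillis's books): a priority-ordered directive
# hierarchy, checked highest-priority first. Every name here is invented.

# Each directive is (name, predicate); the predicate returns True when
# the proposed action would violate it. Order = priority.
DIRECTIVES = [
    ("protect trade secrets", lambda a: a.get("reveals_secrets", False)),
    ("do not harm humans",    lambda a: a.get("harms_human", False)),
    ("obey human orders",     lambda a: a.get("disobeys_order", False)),
    ("protect own existence", lambda a: a.get("self_destructive", False)),
]

def first_violation(action):
    """Return the highest-priority directive the action violates, or None."""
    for name, violates in DIRECTIVES:
        if violates(action):
            return name
    return None
```

The point of the ordering is that the secrecy law sits above the no-harm law, so an action that both harms a human and protects the secret trips the secrecy directive first, which is how a clakker can be compelled to kill.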
Meander:
--- Quote from: BenRG on 05 Mar 2020, 23:27 ---
Now, am I the only one who thinks that Beeps' charge ended with her bouncing off Elliot's chest and back into Millefeuille's arms?
--- End quote ---
You are not the only one, and I would love a bonus panel on this.