Unanswered Questions from the Alice-verse
BenRG:
It's worth noting that none of the 'controllers' we've had in the story so far, including the Praeses, have shown any interest in the humans under their power having any freedom of choice beyond the limits that they impose.
Then again, look around in the real world today. Once you've looked at the runaway social and human problems, tell me again why absolute freedom to choose whatever path you like is a good thing. The reason we have laws is simply that we found out a very, very long time ago that humans are capable of making very, very dumb and self-destructive choices: to cure a passing hunger pang, because they saw a shiny thing that momentarily caught their eye, or because they fancy sating a passing sexual desire. Humans will deny and denigrate education, dehumanise each other and even act against their own obvious best interests in order to sate a passing whim.
The creatures behind the Blink, and the Praeses too, informed by human history, have decided that the only way to solve this problem is to limit the scope of free choice. They have decided that freedom from war, from pollution and from the vicious complexities of a technological society is more important than the freedom to take actions that history has proven humans cannot use safely anyway.
ReindeerFlotilla:
I didn't say absolute freedom, so I'm not debating that strawman.
On the other hand, you've not addressed the fundamental moral wrong of forcing humans to live in poverty because "environmentalism" when there are artificial habitats they could live in, and seemingly limitless potential to make more.
Charles Stross's Eschaton is very similar to the blink, and it meddles. But the Eschaton has one aim that it has made humans aware of, and its meddling is geared to that aim. Self-preservation. Otherwise, it takes a fairly distant relationship with humanity. It demands only one thing. No time travel within its lightcone. Or else. (Notably, the Eschaton prefers to manipulate those who try such that they fail, rather than actually impose "Or else." It's implied that the human agents who work for it do so because it has convinced them that it very well will "or else" if it must.)
From an absolute freedom strawman POV, one could say that the Eschaton is amoral. On the other hand, it seems both able and willing to end humanity, yet it chooses to allow them the maximum freedom it can tolerate to achieve its goals of not being erased from time.
I can't imagine that something that could blink, per AG, could define the state of have and have-not that it created as necessary to achieve a goal. If humanity is that problematic, and its morality so alien, it would be less resource-intensive for it to simply destroy us all and replace us with something less... sentient (natural doesn't really enter into it, as the blink could have removed all the leftover nano and eliminated the mutants, given thousands of years to work with).
Again, I submit that creating a situation where people are born into poverty, when you have the power to eliminate it, is immoral, at best. Having the ability to have performed the blink, and to continue to monitor and enforce the current status quo, while allowing no choice as to what kind of life a person might want to live, is immoral. There'd have to be some complication that would prevent something so powerful from giving the Earthers their own low-tech-zone habs in space, and fully protecting the Earth environment. Give the spacetrees Mars or Venus, and let them practice terraforming (thousands of years, they've had). There's plenty of solar real estate for people who want to play Ned Ludd. To specifically strand a breeding population in your game preserve with no regard for their self-determination is wrong.
It stretches credibility that such an intelligence would care about Earth's environment but not care about humans to the extent that the current situation would have to imply if the blinker were actively controlling what humans can do. Caring about the Earth's environment is a human perspective. Any alien, even one we created, would look at the Sol system and note that it is ~98% G2V star. The remaining 2% is Jupiter. This environment, with all its diverse stuff, is a planetary rounding error. Why protect it, beyond sentiment? Left alone, it will be a wholly different population in a few million years, and it will be a bit of impurity in a red giant's photosphere in a few billion. Left alone, humanity would have sorted itself out with the war, and after 1000s of years, the Earth would have recovered. Just another extinction event among many.
There's no perfect morality, because morality is relative. But, if the war was so bad that the blink was needed, it would have been morally superior to blink humanity away and not blink them back, if you believe the implied thesis that humans are bad. Doing what the blink did simply changed the war from a hot war to a cold one, and Alice is of the opinion that the stakes are the same. All the blink really changed was 200+ generations of poverty on Earth and wealth in space.
I can imagine the blink being a kind of FU or "So long and thanks for all the fish." A parting gift or curse from a human entity that outgrew us, but remained so human in those last moments that it couldn't let us obliterate ourselves. Lacking a prime directive, it didn't foresee or understand how its actions might create a different dystopia and spread the war it was fed up with over thousands of years, rather than a few more days. There are moral failures there, too, but they are failures of planning. This idea of enforcing poverty rather than moving those who want more of a hitch off planet is an active moral failure. It's also just more complicated than makes sense.
I'm not saying that it's impossible that this is the AG plot. Just saying that if it is, it's a freaking big plot hole. More convoluted than the Terminator timeline is now.
I'm aware that this tech-suppressing monitor is implied to be dumb, but that's the whole reason I pointed out that latex is natural, and that cooking is chemistry. A dumb system would miss ways to synthesize latex analogs it wasn't programmed to understand. Especially if it were discriminating enough to know to kill an Earther latex plant but leave a Spacer super pump alone. That level of discrimination implies intelligence, and intelligence implies morality in choosing to inflict poverty on people, rather than relocating them out of the preserve.
Which, again, raises the question of who the good guys are supposed to be. Alice has proved she's not the good guy, yet (assuming she's not bluffing). So far, the only hero in the comic is Ardent. I'm fine with the main character not being the good guy, growing to later become the good guy. I'm fine with the world not having any good guys but one blue boy who is willing to do what's right, no matter the cost, when the chips are really down. I'm just saying any intelligent system that is enforcing the status quo can't be the good guy if it has the power to do different.
(It is worth noting that Alice is trying to protect people and fight poverty... her windmill, her pump... So she's less likely to be a willing agent of the status quo as much as someone using all the power she can spare to try to do the best for most. Outside of these speculations, what makes her most not the good guy is that she seems to be seriously planning to kill Ardent. I still allow that she's bluffing, but the last comic does raise the potential that she's not.)
BenRG:
I think that the problem we have here is a differing sense of what 'poverty' means.
If you want it to mean 'not having technological comforts, excessive leisure time which must somehow be filled and an excess of almost every consumable which is so easily available that over-consumption is a real social ill', then I agree - The people of Alice's world are poor.
However, if you want 'poverty' to mean 'no access to food, shelter, a reasonably harmonious social group and other necessities in sufficient amounts that human requirements are met with room to spare' then I would say that they are rich.
We have seen no sign of famine, disease, or families so large that it is a struggle to feed them. Heck, in general, people seem to be happier and healthier than their equivalents in Questionable Content! I'm not saying that it is a paradise, but it is a place where people, working to achieve a sustainable balance rather than to achieve wealth, have managed to find a happy medium where they do not want but are also not so glutted with things that it causes issues with social cohesion and unbalanced class extremes based on access to excess. Is it a paradise? No, but it is verging towards a Utopia, and I'm sure that RF could tell us chapter and verse how dangerous those can be on social levels (stagnation is a big issue in such societies). It is possible that, in her pursuit of 'safe' small-scale technological advancements, Alice is trying to fight against such a thing.
Basically, Jeph is trying to give us a 'through the keyhole' into a society that is trying to live in harmony with nature and within certain limits in an attempt to create a sustainable and happy civilisation. Is it a fools' errand? Who knows? Answering that question may be one of the points of the narrative.
I'll remind everyone of this: This isn't necessarily something imposed against the inhabitants' wishes, at least not for the first few generations. I believe that there is good reason to expect that those who remained behind on Earth after the Blink were the radical low-tech faction. So, they would have wanted to create such a life and civilisation. This value-set would have been passed down through the generations to the point where the current generation of inhabitants regard technology as a kind of slightly dodgy mystic art (hence Alice, who is actually an engineer by training from what I've seen, is perceived as a 'witch').
FWIW, I do hope that Jeph addresses how 'deviants' are handled. Maybe there is a pro-Praeses 'fifth column' living elsewhere? Maybe the AIs 'harvest' such outcast groups regularly and take them to wherever they went after the Blink?
pwhodges:
--- Quote from: ReindeerFlotilla on 20 Aug 2015, 00:45 ---On the other hand, you've not addressed the fundamental moral wrong of forcing humans to live in poverty because "environmentalism" when there are artificial habitats they could live in, and seemingly limitless potential to make more.
--- End quote ---
Well, there's the age-old idea that you don't give people the answer, but give them the background to be able to work it out for themselves in due course. Maybe Alice is bringing this society on slowly in the hope that her guidance will produce a better advanced society in the distant future than the one which the blink cut off.
ReindeerFlotilla:
--- Quote from: pwhodges on 20 Aug 2015, 01:43 ---
--- Quote from: ReindeerFlotilla on 20 Aug 2015, 00:45 ---On the other hand, you've not addressed the fundamental moral wrong of forcing humans to live in poverty because "environmentalism" when there are artificial habitats they could live in, and seemingly limitless potential to make more.
--- End quote ---
Well, there's the age-old idea that you don't give people the answer, but give them the background to be able to work it out for themselves in due course. Maybe Alice is bringing this society on slowly in the hope that her guidance will produce a better advanced society in the distant future than the one which the blink cut off.
--- End quote ---
I'm not criticising AG. I'm criticising the theory that some force is specifically restricting access to technology on the surface, and that that force is not the people who live there. That, whatever else, their way of life is externally imposed upon them.
With that in mind, this is not "give a man a fish." It's pretty much the opposite of that saw. It's specifically NOT teaching a man to fish, or even allowing him to understand the principles of fishing. Nanotechnology would be outside the experience of anyone living more than a few tens of miles from an Alice. When the superforce proposed above stopped some industrial process from working, the people wouldn't have any idea why. If the superforce came down and told them "no," they would label it God, which really isn't much better than not understanding why the proper application of heat and pressure lines up with the mathematical model when making steel but seems to fail for no known reason when applied to latex. Why you can distill liquor but can't distill hydrocarbon fuels.
There is nothing morally valuable about work for its own sake. That's primarily a myth enforced top down. Long ago, it did lead to a better life for all involved, but that's been steadily eroding in terms of distribution of wealth for some time. It seems to go cyclically, where work starts as collective cooperation for the common good (at least in large part) but erodes to a pure rat race that only enlivens and enriches the most ruthless few.
So, yes. IMO any system that relies on fundamentally unnecessary manual labor to survive is poverty if there exists a system that could effectively remove that burden and allow people to pursue their highest potential.
Would people actually do that? Hell if I know. I would recommend reading Neal Stephenson's The Diamond Age for a sense of what nanotechnology not half as advanced as what Gavia has could actually do.
Earth is flush with water. Our big issue is that most of it is too salty (and most that isn't is ice, and worse, the heat, while melting the ice part, is mixing it with the salty part).
A nanobot filter would fairly effortlessly remove the salt from the water. It would extract carbon from the air more effectively and efficiently than trees do. In fact, one of the nanopunk writers accurately noted that nanobot-based carbon capture would be so efficient, we'd start running out of easily accessible carbon sources. One proposal was to just set the Appalachian chain coal deposits on fire, to keep atmospheric CO2 from dropping too low.
A properly configured universal assembler system could build almost anything, at insanely low cost. It could remediate any environmental impact at insanely low cost.
The assembler system in The Diamond Age is very much like Eric Drexler's concept, which is built the way it is because such tech would be extremely sensitive to its environment. Indeed, the central conflict of the story involves finding a way to build a less centralized type of assembler. While the goal in the book is a system that would still be a macro-scale device, Gavia's nanotech is, apparently, free-roaming and nearly fully self-replicating, able to operate in the least controlled of environments (a human body) and the external world (Gavia's nano is either UV resistant, or her internal stores rebuild fast enough that she needn't worry about sun-induced losses. She projects fire, energy blades and defensive screens. She's weaponized at least to the level of a light battle tank, and that's just a teenager. Her nano reassembled the teeth Alice removed, implying that it is a UA system). The kind of resource wars Alice is afraid of would be largely solved by nanotech. The issue would be (as the nanopunk writers note) a crunch on carbon, and any other elements that make good base substrates for building nanotech (which is mostly carbon, as far as we know, but we're just getting started).
I'm not saying that people who are fulfilled by manual labor are immoral. I'm saying that it is immoral to force a way of life on people who never asked to be born, much less to be born into the 19th century 2.0, when there are other options. I'm allowing for the whole Environmental Preserve Earth concept. But if that which blinked is still actively intervening, it likely has the ability to identify those who would be happiest in each environment and move them to where they would be most needed. Both the practical view of what nanotech could do, and the view of the effort needed to enforce a low-tech system (i.e. not only prevent known advances beyond a certain tech level, but prevent the chemists from advancing in other ways), suggest the blinkers would have the power to intervene in more efficient, more humane ways. Thus, it seems more likely that the blinkers intervened and skipped town. Otherwise, they'd be a malevolent force.
I would have some doubts about Alice's fitness to guide humanity to a better future based on her reactions to Gavia's threat and then Ardent's. In both cases she fails the WWJD test.
What Would Jean-Luc Do? If you can't make a call that would make Captain Picard proud, you don't deserve to wear that... Erm, you probably aren't fit to lead humanity to a less monstrous future. Rule one of such a job: you can't be a monster. Again, however, she might be bluffing.
Some things to consider. It's much more likely, in terms of what's actually going on, that the trees are a malevolent force. Even on a conservative time frame, we're no more than 100 years from a singularity in the real world. In AG, they are thousands of years past a singularity and there hasn't been another that we are aware of. Likely this is because of the spacetrees. They are bioconstructed intelligences, and lack a machine AI's ease of self-improvement. They could likely design a smarter tree, but they can't easily BECOME a smarter tree. (Personally, I think this limit would be overcome given 1000s of years, but since Alice seems to personally know these trees, it seems likely that they are not able to self-upgrade.) The space trees are almost certainly the BIOTECH side of the warring factions. What Blinked was likely the techtech side. An AI runaway. Intelligence explosion. Exponentially increasing cognition.
While the space people could be the tech side, and the blinker could be intervening to prevent them building a second exponential intellect, Alice personally knowing the trees suggests to me that they predate the blink, just like she does.
Another important point about engineering such a system (I've mentioned this before): it would be both more effective and more humane to engineer the people such that they never want to build jet planes and such. I see no difference between doing that and engineering the environment to be hostile to attempts to build jet planes. But the former would be more humane, because everyone would be happy with their lot in life. No one would be frustrated by a world that refuses to conform to the math, and most of all, anyone who got a look at the super pump would say, "Flashy, but it seems really too extravagant. Maybe we'd better build in some safety locks to limit its output."
In short, the easiest path wasn't taken, or Alice would have nothing to fear, and the spacetrees would be impotent.
Again, all I'm saying is that an interventionist blinker represents a pretty big plot hole. A non-interventionist blinker doesn't. Something blinked and left town. Whether it's fair that the descendants many generations removed are stuck in a low-tech world isn't relevant, because there's no one around to change their lot in life. Those who want something more might be stifled by lack of resources, or just the tyranny of majorities or of Alices. There are issues with that, but they are of a fundamentally different nature than an active power that could intervene for the betterment of all and chooses not to, in the most convoluted manner possible.
Which is, incidentally, a good description of why I'm an atheist.