Yeah, can't really mourn for Pate. Alice's actions (assuming she actually has killed him here) are more revenge than justice, but honestly, dude knew what Church could do, didn't bother to tell him to rein it in, and in fact seemed almost pleased when Ellie was killed. He could easily have said "Church, no violence. Let me lay out my case and we'll see if they're willing to cooperate." Instead, he made it a game, waited and let Church respond with maximum violence, likely as an example to force compliance. Dude's a monster.
(There's a difference between cheering Pate's fate and cheering Alice being the one doing it. He might have deserved what he got... but her dishing it out still says something about her, and it's not necessarily something very nice.)
Anyway, I don't even fully disagree with what people are saying above, but there are a couple things that just bother me about the solution "whoever" came up with.
- There's a concept in American law (so obviously this isn't some universal moral absolute, but I think it's a good idea) that if some fundamental right has to be violated because the state has a compelling interest, then that violation should be done in the least restrictive way. So: we need to prevent a future war. What's the "least restrictive" way to do it? Is it ensuring that humankind never achieves the level of technology it did last time? Is it ensuring that no new beings like Alice can ever be created? Is there a better way? If a "cage" is required, there are many different kinds of cages.
- I think we'd all agree that a nation preventing its people from ever being able to leave is violating those people's rights. The fact that no "escape valve" was built into this cage (like allowing people like Pate who chafe at the restrictions to maybe go live in a Praeses world) is just bad design, because it will inevitably lead to malcontents who could, theoretically, threaten the status quo.
- We all live in "cages" of a sort, but we also have the ability and right to question those cages, to try to change them when we realize that they are unfair or toxic. There doesn't seem to be any such recourse for people in Alice's world. Again: this will inevitably lead to malcontents.
- There's no guarantee that humankind would make the same mistakes again if it did achieve high technology. Sure, it's a risk, but there are also risks associated with the current solution: humankind stuck in stagnation is at risk from the next massive asteroid or superbug, and is inevitably doomed when the sun finally expands. Even the Praeses, at least those we've seen, sit no farther out than low Earth orbit; there's no indication we've ventured any farther than that. Enforced stagnation might well prevent war, but it also forces all of our eggs to remain in one basket, and that's terrible for the long-term survival chances of humanity. There's no guarantee that even the Praeses would be able to move enough of us off-world in time for us to survive - and even if they did manage to move themselves and their inhabitants away, that would still mean abandoning everyone remaining on the surface to a slow death.
Basically, I think "whoever" acted in a very understandable way in a terrible, horrible situation, and acted to prevent that horror from ever recurring, but in doing so, never bothered to consider that there might be other ways of accomplishing that goal that aren't so restrictive. It's basically the same thing as with the Reapers from Mass Effect:
"AI will inevitably turn on its creators - the solution is to KILL EVERYTHING WITH TECHNOLOGY so that never happens." No one stopped to question whether the initial assertion was even necessarily true, let alone whether the solution was the best one to accomplish the stated goal.