Thanks very much, Mark. In summary: A) Any beauty must come from the technology, not my writing; B) I agree the tooling needs a methodical description; C) I mean that extinction of the network is barred, not that of the local nodes; D) I clarify "essence of humanity" and ask you to re-evaluate; E) I agree about the value of life, but think it is implied in the supreme value of reason; F) I ask you to explain your concern about the principle of freedom; and G) I clarify my use of Habermas's principle D, and ask if you still see a problem.
A. I'm very pleased you see beauty in it; I wanted to share that, above all. I think it comes from the technology (theory, design), because I've no talent as a writer. It was actually that recognition that led me to work on recombinant text, and eventually the rest.
B. I agree the core inventions of the guideway internals (recombinant text, transitive voting, vote pipes) would benefit from a simple, methodical description. Above all, I felt I needed a description in terms of the theoretical requirement of maximizing freedom. Then space constraints took over and prevented me from adding any other clarifying viewpoints. I think it's as Robert says: I just "cover too much ground", too much of it novel. It needs more structure in the delivery, as you suggest, and therefore a bigger delivery van.
C. I agree there's no assurance against haphazard, local violence. So a collision of colonists at a target star might, as you suggest, escalate beyond reason and destroy a stellar civilization (an event that we already fear today). But the larger network of civilizations would continue to exist, slowly expanding among the stars. This is the only assurance. (I should clarify this in the text.)
D. Humanity is the only rational being we know, objectively speaking. Maybe we'll never know another. Still, I write about the future of rational being as a whole, not humanity in particular (see p. 2 where I cite Kant). By "essence of humanity" (p. 1), I really mean "essence of rational being". With this clarification, do you still see a danger in speaking of essence?
E. You don't so much attack the supreme value of reason (P2) as defend the value of life, which leaves me room to agree. I think life is implied by reason. Reason cannot reproduce and maintain itself outside a social space, which in turn depends on a population of reproducing individuals (life). This is the necessary physical form of rational being. Further, I think rational being as such is bound to respect and honour its own cause within life at large: "To see a world in a grain of sand", as Blake says, "and a heaven in a wild flower". I feel, therefore, that we should be patient with ourselves and give reason the necessary time to work. *
F. Here I don't understand you, Mark. How does a principle of maximizing personal freedom (compatible with equal freedoms for all, p. 2) contribute to slavery, or any other unfreedom? I admit that a principle doesn't guarantee its practice, but neither does it undermine it.
G. Habermas's discourse principle (D, p. 3), when applied to laws, is in fact a principle of democracy within a theory of democracy. So when you say it's "contradictory to the principle of democracy", I take it you mean contradicted by the practice of democracy. In other words, you are pointing to the existence of actual laws that are "invalid" when judged according to D. But this should come as no surprise. Few (least of all legislators) would suppose that all laws are de facto right laws. Some are struck down as illegal, some are plainly wrong by any standard, and most others are flawed in ways that would be unacceptable if only we had the resources to fix them (which we rarely do). What D is telling us, at least in practical terms, is the direction each such law would have to take (again, according to the theory) in order to move toward validity. Do you still see a problem here?
Mike
* Which reminds me of William Canning's beautifully structured, unnarrated film: Temples of Time (1973). Play at 480HQ for the best sound. Watch closely during the wild-flower sequence.