You make excellent points in your essay, slicing and dicing the issues along lines of culture and belief. You speak to both the complexity of the issues and the problem that the answers are unknown, or at least not universally agreed upon. I very much agree that
"Neither professorships, nor peer-reviewed papers, nor practical experience, nor official status, nor any other credentials are consistently reliable guides to who is right and who is wrong."
"Closure is almost certainly an illusion, or a sign that one has become a partisan, now to be discounted by the other side."
For example, you state that global warming, if not quickly addressed, will cause such havoc that "it is questionable whether humanity will be able to adapt", thus apparently committing yourself fully to one of the beliefs that may or may not be right.
Babel is here. You show that it goes far deeper than language. And since there is no universally agreed-upon answer, many believe we make the best of it with local autonomy: many experiments conducted in parallel, with the Internet and other communications counted on to spread news of both local successes and local failures.
You mention "globalized intellectual elites" as if this culture were somehow different from others. But the reality, more or less closed communications in which political correctness forbids even the discussion of taboo topics, and special interests rooted in the fact that most such elites are funded by the tax dollars of "mere" citizens, suggests that these elites deserve no greater consideration than other special interests.
It's a complex situation. You discuss structured fora and crowdsourcing (Wikipedia, etc.), but note that "it is not making decisions for humanity." Yet you also note that scientists and academics often opine outside their expertise, and frequently disagree with and disrespect each other's views.
I live on a ranch a half hour from Stanford, and I associate both with "globalized intellectual elites" and with farmers, ranchers, and local folk, and I find plenty of "low information voters" in both groups.
Your discussion of AI as a possible solution is qualified by two caveats: (1) it probably won't live up to technology cultists' expectations, and (2) it might become a target of distrust, fear, and hatred. As I agree with Lorraine Ford, above, and don't believe such systems will become "sentient", I don't worry overly about the consequences if they did. Eliza has not come very far in fifty years. I certainly agree with you that autonomous weapon systems are bad.
Your question about "who will control the technology" is a good one. Do I want Google or an IRS-wielding government to control it? The devil or the deep blue sea?
You end up, as far as I can see, agreeing with me and with Sabine Hossenfelder in the hope that
"Ordinary citizens can use personal AI systems to find answers to the questions they ask..." [Emphasis on they.]
So, in the end, I think we see much the same problems and hope for much the same answer.
Thanks for outlining the problems and suggesting a hopeful solution.
Edwin Eugene Klingman