"Through the ingenuity and hard work of thousands of physicists, we have learned that all matter and energy in the universe is composed of interacting quantum fields, and we can in principle predict their behavior to great accuracy using the standard model of particle physics". Nothing more far from reality! Quantum field theory deals with interactions only approximately, isn't valid for full bound states, only superficially consider relativistic effects, is incompatible with quantum mechanics, is incompatible with thermodynamics,...
"Sure, there are a few phenomena that are outside the scope of current physics, such as what happens in the very early universe or near the singularity of a black hole, but, on the scales relevant to human life, we have a pretty complete understanding of all the relevant constituents of matter and fundamental laws". This isn't true. There are lots of issues in "the scales relevant to human life" that require an extension or reformulation of the foundations of physics. Several extensions of quantum mechanics and quantum field theory are at use in laboratories.
"In principle, we could use fundamental physics to predict with the greatest possible accuracy what will happen in any given situation, including those relevant to chemistry and biology, and even in those sciences that deal with the human mind, such as neuroscience, psychology, and sociology. I say "in principle" because those calculations would involve an impossibly detailed description of the initial conditions of the system being
studied, as well as infeasible computational power. It would be essentially impossible to identify and model a biological system directly in terms of its constituent quantum fields." This is deliciously ingenious; quantum field theory cannot even describe a relativistic bound state in a small molecule, much less a whole biological system. The non-reducibility of biology to quantum field theory is not a question of computational power but a consequence of quantum field theory's being based on approximations: biological processes cannot be described as scattering processes in an infinite volume between minus infinite time and plus infinite time.
Indeed, established scientific theories aren't social constructs. The theories used to design a computer or a tire wouldn't work if they were "merely social constructs". The theories work because they adequately describe reality. Airplanes wouldn't fly, as you correctly mention.
The controversy over the experimental status of relativity wasn't the reason why Einstein never won a Nobel Prize for relativity; rather, members of the Nobel committee knew that most of relativity had been pioneered by Poincaré and Lorentz, so the committee awarded Einstein the prize for the photoelectric effect instead.
That physicists such as Bohr or Heisenberg jumped to (invalid) conclusions about the nature of reality based on scant evidence was already noted by their contemporaries: Planck, Einstein, Schrödinger,... That the Copenhagen view was accepted by the majority of physicists for decades and continues to be very popular is an interesting case for historians and sociologists of science.
"For example, the theory that is identical to current physics, but also posits that there are green aliens hiding on the dark side of the moon that are completely undetectable because they do not interact in any way with ordinary matter, is compatible with current evidence, but we would not want to call it scientific". Because it isn't, scientific hypothesis must be testable.
I see no reason why we would not want to call the name of the latest celebrity's baby "knowledge", or the fact that a paper is being cited by its own authors for no other reason than to increase their citation count. I wouldn't call the first scientific knowledge, and for the second I would evaluate the actual content of the paper, rather than the ethics behind the authors' actions, before classifying it as valid knowledge or not.
I don't think we can simply assume that a knowledge network would be scale-free. Many networks earlier reported to be scale-free later failed to pass further statistical analysis, or the analysis cast doubt on such claims.
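This kind of claim can be checked rather than assumed. A minimal sketch, assuming the third-party powerlaw package (which implements the Clauset-Shalizi-Newman method) and using a synthetic graph in place of real bibliometric data:

```python
import networkx as nx
import powerlaw  # Alstott et al.'s implementation of the Clauset-Shalizi-Newman method

# Synthetic degree sequence; a real study would use the observed
# knowledge network's degree sequence instead.
G = nx.barabasi_albert_graph(n=10_000, m=3, seed=0)
degrees = [d for _, d in G.degree()]

# Fit a discrete power law to the tail of the distribution.
fit = powerlaw.Fit(degrees, discrete=True)
print("alpha =", fit.power_law.alpha, "xmin =", fit.power_law.xmin)

# Likelihood-ratio test: is a power law actually favoured over a
# lognormal alternative? R > 0 favours the power law; p indicates
# whether the sign of R is statistically meaningful.
R, p = fit.distribution_compare("power_law", "lognormal")
print(f"R = {R:.2f}, p = {p:.3f}")
```

On many empirical networks this test favours the lognormal or leaves the comparison inconclusive, which is exactly why "scale-free" should be a conclusion, not a premise.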
I'm not sure the proposed mechanism for the generation of knowledge is correct. My complaint is about the part describing how analogue "raw experience" nodes are replaced by some higher-abstraction node, reducing the number of links and the complexity of the network. Whereas this mechanism is valid for (part of) descriptive knowledge, it isn't valid for fundamental knowledge, where we ask "why", and often the answer to one "why" introduces a number of new "why"s, which then need to be answered. This repeated process generates an explosion of knowledge and the subsequent specialization into disciplines or branches.
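To make the contrast concrete, here is a toy model of my own construction (not the book's mechanism, and not a formalization of anything beyond this paragraph): abstraction contracts raw-experience nodes into one, while each answered "why" spawns new "why"s:

```python
import networkx as nx

# Mechanism 1: abstraction compresses the network. Several raw-experience
# nodes are replaced by a single higher-abstraction node that inherits
# their external links, so the node and edge counts shrink.
G = nx.Graph()
raw = [f"raw_{i}" for i in range(5)]
G.add_edges_from((r, "context") for r in raw)
G.add_node("abstraction")
for r in raw:
    for nbr in list(G.neighbors(r)):
        if nbr != "abstraction":
            G.add_edge("abstraction", nbr)
    G.remove_node(r)
print("after abstraction:", G.number_of_nodes(), "nodes")  # 2

# Mechanism 2: asking "why" expands the network. If answering one
# question raises k new questions, the open frontier grows geometrically.
k, depth = 3, 4
W = nx.balanced_tree(k, depth)
frontier = [n for n in W.nodes if W.degree(n) == 1]
print("open questions at the frontier:", len(frontier))  # 3**4 = 81
```

The first mechanism shrinks the network; the second blows it up, which is the asymmetry the paragraph above points to.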
I don't think the real world always imposes itself on the raw-experience nodes. That experience can be incorrect, which is the very reason for the development of sophisticated scientific methodologies to acquire 'experience' from observations and experiments.
"The existence of hubs ensures that the six degrees of separation property holds, so that it is possible to get from any two specialized disciplines to a common ground of knowledge in a relatively short number of steps." But not all the steps are the same; two disciplines can be separated by two steps, but those steps can be much more difficult than the five steps separating other pair of disciplines.
This is one of the reasons why I don't use a network approach to represent knowledge; I use a chemical approach. This chemical approach also shows that human knowledge has a hierarchical structure, with fundamental knowledge at one extreme, descriptive knowledge at the other, and both synthetic and analytic routes linking the creation of new knowledge items and the analysis of existing ones.
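Since no formalism for this chemical approach is given here, the following is a purely speculative sketch of how such an analogy might be encoded; every name, the level scale, and the example items are my own guesses, not the approach itself:

```python
from dataclasses import dataclass

@dataclass
class KnowledgeItem:
    name: str
    level: float  # hypothetical scale: 0.0 = purely descriptive ... 1.0 = fully fundamental

@dataclass
class Route:
    kind: str  # "synthesis" or "analysis"
    inputs: list[KnowledgeItem]
    outputs: list[KnowledgeItem]

# Synthesis: combining items into a more fundamental one, in loose
# analogy with a chemical reaction A + B -> C.
kepler = KnowledgeItem("Kepler's laws", level=0.3)
galileo = KnowledgeItem("Galilean free fall", level=0.3)
newton = KnowledgeItem("Newtonian gravitation", level=0.7)
synthesis = Route("synthesis", [kepler, galileo], [newton])

# Analysis: decomposing an existing item back into its "constituents".
analysis = Route("analysis", [newton], [kepler, galileo])
```

The point of the encoding is that routes, like reactions, are directional transformations between items at different levels of the hierarchy, rather than undirected links in a graph.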