Hi. Your big point (the amount of steering we can hope to do is more like swimming in a river than altering its course) is a good one which I pretty much take for granted, so you won't get any argument from me there. Instead, I'll nitpick about ems.
Much as I like the general idea ("Humans 3.0" in my essay) I have yet to see a convincing description of the technology. A couple of things jumped out at me while reading your essay:
"Ems would often spin off copies to do short term tasks and then end when those tasks are done."
How? A straight copy would be indistinguishable from the original. So you'd have two identical individuals, each thinking that the other one should just do its assignment and then have the good taste to commit suicide. If you propose to modify the copy to behave as expected, you lose the proposed advantage of ems over other approaches to AI: not needing to understand their inner workings in detail. Or is your proposal to create the copy in a torture chamber, suitably equipped to compel obedience and then to kill it upon task completion?
"Ems would split by speed into status castes, with faster ems being higher status."
Why would any em willingly run slower than technically possible? Would you willingly take a drug which slows you down? You mention gender imbalances as a possible cause, but if copying is so trivial, the gender in low supply could just duplicate itself.
Another speed-related problem: are your ems running on general-purpose computers? The implied trivial ease of copying suggests as much. Have you tried to estimate the computational requirements for emulating even a single human brain in real time? What technology do you suggest will be used to run trillions of them within a century, as you claim?
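To make the scale of the problem concrete, here is a crude back-of-envelope estimate along the lines I have in mind. Every parameter is an assumption (neuron and synapse counts are common rough figures; the firing rate and the cost per synaptic event are guesses), and a faithful emulation, one that models dendrites, neuromodulators, or sub-spike dynamics, could easily need several orders of magnitude more:

```python
# Back-of-envelope: compute needed to run ONE brain emulation in real time,
# using a simple spiking-network model. All parameters are rough assumptions.

NEURONS = 8.6e10             # ~86 billion neurons (common rough estimate)
SYNAPSES_PER_NEURON = 7e3    # ~7,000 synapses per neuron (assumed)
AVG_FIRING_HZ = 1.0          # average firing rate in Hz (assumed; estimates vary widely)
FLOPS_PER_SYNAPTIC_EVENT = 10  # ops to process one spike at one synapse (assumed)

synaptic_events_per_sec = NEURONS * SYNAPSES_PER_NEURON * AVG_FIRING_HZ
flops_per_em = synaptic_events_per_sec * FLOPS_PER_SYNAPTIC_EVENT

print(f"Synaptic events per second: {synaptic_events_per_sec:.2e}")
print(f"FLOPS for one real-time em: {flops_per_em:.2e}")

# How many ems would a hypothetical 1-petaFLOPS machine support?
PETAFLOPS_MACHINE = 1e15
print(f"Ems per petaFLOPS machine: {PETAFLOPS_MACHINE / flops_per_em:.2f}")
```

Even with these charitable assumptions the answer comes out to several petaFLOPS per em, before accounting for the virtual worlds the ems live in, which is why the question of substrate cannot be waved away.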
When I inventoried computing technologies currently on the horizon, the only realistic candidate I found was neuromorphic hardware. It is far from obvious that it could be used for the kind of general-purpose computing needed to run virtual worlds, and if it can't, ems will have a big speed problem.