Hello Cristinel
Your philosophical essay would benefit from considering utilitarianism (the greatest good for the greatest number). You would also be well placed to consider the limits of utilitarianism.
You see humanity as the measure of all things, perhaps even if replicated. If humans are good, aren't more humans better? (Until we hit limits.) Consider two poles of future projections: existential risk and singularity. Actualization of existential risk (i.e., extinction of humanity) reduces utility to zero. Some singularities do not increase human numbers, but some versions increase them immensely. For example, Lewis [1] estimates that there is enough material in the asteroid belt to build habitats for 10,000,000,000,000,000 people (10,000 x 1 trillion), which is probably an overestimate. If we assume that every human life has its share of good, that is a lot of utility. Metzger et al [2] show a possible method for constructing these habitats. Armstrong and Sandberg [3] show a possible method for settling not only the asteroid belt, or even the galaxy, but thousands of galaxies.
These are all forms of singularity and require exponential growth, and exponential growth sometimes hits limits. Nevertheless, even if we assign these scenarios a fairly low probability, they still have a humongous expected value (probability times value), specifically in terms of human lives. That suggests they are worth at least some attempt to make them happen.
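To make the expected-value arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The population figure is Lewis's asteroid-belt estimate [1]; the probabilities are purely illustrative assumptions, not estimates from my essay.

# Expected value = probability times value, counted here in human lives.
habitat_population = 10_000 * 1_000_000_000_000   # Lewis [1]: 10,000 x 1 trillion = 10**16 people
p_settlement = 1e-3                                # assumed probability of success, for illustration only

expected_lives = p_settlement * habitat_population
print(f"Settlement branch expected value: {expected_lives:.1e} lives")    # 1.0e+13

# The extinction branch contributes zero utility no matter how probable it is.
p_extinction = 0.1                                 # also an illustrative assumption
print(f"Extinction branch expected value: {p_extinction * 0:.1f} lives")  # 0.0

Even with a probability as low as 1 in 1,000, the settlement branch still dominates on raw expected value, which is exactly what makes the portfolio question below hard.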
However, how much effort? It seems wrong to put all of our eggs in the basket of settling the universe, so that there is nothing left if that doesn't work, even if the expected utility for that branch comes out greater. What do you think? How should we configure our portfolio of investments in our future? This is the next step in my thoughts, so I could use help.
I deliberately stop with Armstrong in my essay, to avoid the issue of whether artificial humans should count in utilitarian calculations. Anderberg's essay in this contest [4] does not hesitate to go there, so his utilitarian calculations are potentially higher than mine. I like his cute formula, integrating utility to result in a smiley face.
[1] John Lewis, Mining the Sky: Untold Riches from the Asteroids, Comets, and Planets, Perseus Publishing, 1997, p. 194.
[2] Philip Metzger et al, "Affordable, Rapid Bootstrapping of Space Industry and Solar System Civilization," Journal of Aerospace Engineering, April 2012.
[3] Stuart Armstrong & Anders Sandberg, "Eternity in six hours: Intergalactic spreading of intelligent life and sharpening the Fermi paradox," Acta Astronautica, Aug-Sept 2013.
[4] Tommy Anderberg, "A Future Brighter than 100 Trillion Suns," FQXi essay contest.