Carlo,
Great essay. Entropy counts the number of ways microstates can be rearranged while preserving a given macrostate. The information content of a state occurring with probability P_n is I_n = -log(P_n), and averaging this over the distribution gives
S = k sum_n P_n I_n = -k sum_n P_n log(P_n).
For P_n = 1/N this reduces exactly to the standard result S = k log(Ω), with Ω = N equiprobable microstates. In this case the system with equiprobable microstates is at maximum entropy. A universal maximum entropy is hard to define, but a black hole is probably as close as we can get: its entropy is set by the maximum number of degrees of freedom, or qubits, on the event horizon or holographic surface bounding a volume.
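As a quick numerical check of the formulas above (a sketch, not anything from the essay; the helper name and k = 1 units are my own choices):

```python
import math

def shannon_entropy(probs, k=1.0):
    """S = -k * sum_n P_n * log(P_n), skipping zero-probability states."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

N = 8
uniform = [1.0 / N] * N
S_uniform = shannon_entropy(uniform)
# for equiprobable microstates the sum collapses to k*log(N), i.e. k*log(Ω)
assert abs(S_uniform - math.log(N)) < 1e-12

# any non-uniform distribution over the same N states has strictly lower entropy
skewed = [0.5] + [0.5 / (N - 1)] * (N - 1)
assert shannon_entropy(skewed) < S_uniform
```

This also illustrates the claim that the equiprobable distribution is the maximum-entropy one: skewing any weight away from 1/N lowers S.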
A system at maximum entropy can only be assessed to have that entropy if some measurement is performed, say of temperature via T = ∂E/∂S, which necessitates coupling some probe to the system. This means there is another system, with its own Hamiltonian, that couples to the thermal system of interest. The coupling changes the entropy, and in the adiabatic limit this change obeys δS ≪ S. This would then imply that a relative entropy exists, corresponding to the small change in entropy.
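The last point can be illustrated numerically (my own sketch; the sinusoidal perturbation and coupling strength eps are illustrative assumptions, not from the essay). A weak probe that slightly perturbs an equiprobable state produces an entropy change δS ≪ S, and that change is exactly the relative entropy D(p||q) of the perturbed state with respect to the uniform one:

```python
import math

def shannon_entropy(p, k=1.0):
    # S = -k * sum_n P_n * log(P_n)
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p||q) = sum_n P_n * log(P_n / Q_n)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

N = 100
q = [1.0 / N] * N   # maximum-entropy (equiprobable) state
eps = 1e-3          # hypothetical weak probe-coupling strength
p = [(1.0 + eps * math.sin(2 * math.pi * n / N)) / N for n in range(N)]
total = sum(p)
p = [pi / total for pi in p]  # renormalize after the perturbation

dS = shannon_entropy(q) - shannon_entropy(p)  # entropy lost to the probe
# adiabatic regime: the change is tiny compared to the entropy itself
assert 0 < dS < 1e-4 * shannon_entropy(q)
# relative to the uniform state, D(p||q) equals that entropy change
assert abs(relative_entropy(p, q) - dS) < 1e-9
```

The identity used in the last line is just D(p||uniform) = log N - S(p), which is the δS in question when q is the equiprobable state.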
Cheers LC