We were talking about how the universe is or is not like a computer, and I had mentioned 't Hooft's model of a black hole from his paper Dimensional Reduction in Quantum Gravity. I have no idea if it's a correct model or not, but it does illustrate a good point about the higher orders of entropy.
Essentially, the black hole's event horizon is made up of N spin-like Boolean degrees of freedom (bits), where N is related to the number of distinct microscopic states by N = ln(number of distinct states)/ln(2). Barring some topology differences, this means that the black hole's event horizon is effectively the same thing as an N-bit integer, which also has 2^N distinct states. Of course, the probabilities of the distinct states in both the black hole model and the integer model are all assumed to be the same -- 1/(2^N) -- which is why the calculation of N is so simple (no summing is required; the answer is already known).
To be clear: the measure N is the first-order entropy (in binary units). In the black hole model it's the von Neumann entropy; in the integer model it's the Shannon entropy. They're effectively measuring the same thing -- the base-2 logarithm of the number of distinct equiprobable states.
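To make the first-order calculation concrete, here's a quick Python sketch (my own illustration, not anything from 't Hooft's paper): for M equiprobable states the Shannon sum collapses to log2(M), so a horizon modelled as 2^N equally likely microstates carries exactly N bits of first-order entropy.

```python
import math

def shannon_entropy_bits(probs):
    """First-order Shannon entropy in binary units (bits)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 10                         # pretend the horizon holds 10 Boolean degrees of freedom
M = 2 ** N                     # number of distinct microstates
uniform = [1 / M] * M          # every state assumed equally likely

print(shannon_entropy_bits(uniform))   # 10.0 -- the full Shannon sum
print(math.log2(M))                    # 10.0 -- no summing required when states are equiprobable
```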
This is not the end of the story, though, because 't Hooft goes on in his paper to describe the beginnings of a cellular automaton rule that would govern the evolution from state to state. He also uses the word data a whole lot. Bonus.
What does his assumption that a cellular automaton rule is required immediately tell me, without even looking into its technical details? It tells me that he assumes the higher orders of entropy are likely NOT maximal. Look at it from the opposite point of view: if the evolution from one state to the next were entirely random at all orders, then all possible combinations of time-adjacent states (pairs, triples, etc.) would have equal probability, and so the entropy would be maximal at every order. This kind of randomness is the definitive anti-rule, and you would not need a cellular automaton rule for it to occur -- for each bit you could simply ignore all of the neighbouring bits (again, which bits count as neighbours depends on the topology) and just randomly flip the bit. That's the definitive anti-cellular automaton: an update that pays no attention to the neighbours at all.
It seems to me that determinism in this model would be indicated by less-than-maximal entropy at some or all of the higher orders. I don't think he states it quite like that, but it seems to be true.
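Here's a rough Python sketch of that contrast (entirely a toy of my own, not 't Hooft's actual rule): a small ring of bits is updated either by a deterministic neighbour-dependent rule (each bit becomes the XOR of its two neighbours) or by the random anti-rule that ignores the neighbours. The empirical entropy of time-adjacent state pairs comes out at roughly n bits for the deterministic rule (the successor state adds no new information), and climbs toward the maximal 2n bits for the anti-rule; the same comparison extends to triples and longer blocks.

```python
import math
import random
from collections import Counter

n = 8                          # bits on the ring; 2^8 = 256 distinct states

def bit(s, i):
    return (s >> (i % n)) & 1

def ca_step(s):
    """Deterministic toy rule: each bit becomes the XOR of its two neighbours."""
    out = 0
    for i in range(n):
        out |= (bit(s, i - 1) ^ bit(s, i + 1)) << i
    return out

def random_step(_):
    """The anti-rule: ignore the neighbours entirely and pick a fresh random state."""
    return random.getrandbits(n)

def pair_entropy(step, samples=200_000):
    """Empirical entropy (bits) of time-adjacent state pairs (s_t, s_{t+1}),
    with s_t drawn uniformly from all 2^n states."""
    counts = Counter()
    for _ in range(samples):
        s = random.getrandbits(n)
        counts[(s, step(s))] += 1
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print("deterministic rule:", pair_entropy(ca_step))      # ~8 bits: second order far from maximal
print("random anti-rule:  ", pair_entropy(random_step))  # ~16 bits: maximal, minus small sampling bias
```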
Of course, entropy is information.