Thank you for a very interesting and well-written essay. Right off the bat, you definitely deserve the prize for best opening sentence: not only because of its topicality, but also because it immediately changes the playing field. Intentionality is usually thought of as something intrinsic and private, whereas you (rightly, I think) point out that collectives may well be said to possess intentionality themselves. That's an important point in its own right, and it's just the first sentence!
In the following, you provide a very clear and systematic discussion of what sort of properties are necessary to ascribe intentionality (or perhaps I should say 'purposefulness') to a system, building up from the most (informationally) simple ones, like single insects, to more complex, collective entities. In doing so, I think you clear up lots of muddle-headed thinking that often surrounds these issues.
There are times, however, when you seem to conflate the philosopher's 'intentionality'---roughly, the capacity of mental states to be about, refer to, or be directed at something in the world---with 'intentionality' as purposeful behavior. For instance, you talk about 'desire' at a couple of points.
Desire, to philosophers, is a kind of intentional state: roughly, a mental state is intentional if its content can be given by a 'that'-clause, e.g. that the sky is blue, that there is money in the bank, and so on. What follows the 'that' is the intentional content of the thought; what precedes it is typically the propositional attitude towards that content: 'Steve believes that the sky is blue', 'I desire that there be money in the bank'.
Desires and beliefs have a special role regarding both the philosopher's intentionality and intentional action: for the latter, the right kind of desire has to combine with the right kind of belief. For instance, I might grab for an apple (intentional action) if I think there's an apple on the table (belief) and I want to eat that apple (desire); if either of these conditions fails, I won't make a grab for it.
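To make the conjunctive structure explicit, here is a toy sketch in Python; the predicate names are my own illustrative placeholders, not anything from your essay:

    # Toy sketch of the belief-desire model of intentional action.
    # The predicate names are illustrative placeholders, not the
    # essay's formalism.
    def intends_to_grab(believes_apple_on_table: bool,
                        desires_to_eat_apple: bool) -> bool:
        # Action requires the right belief *and* the right desire;
        # either one alone does not suffice.
        return believes_apple_on_table and desires_to_eat_apple

    assert intends_to_grab(True, True)       # belief + desire -> action
    assert not intends_to_grab(True, False)  # belief without desire
    assert not intends_to_grab(False, True)  # desire without belief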
You take a different route, characterizing intentional acts not in terms of difficult-to-access mental states, but rather in terms of objectively accessible properties of a system: its informational complexity, and the information gathered by the system that's relevant to a given task.
In a sense, this is complementary to the usual way one thinks about intentionality---whatever task is selected yields the 'desire' part, while whatever information is gathered is constitutive of the 'belief' about the world.
I'm not sure this doesn't lead to ambiguities, however: the same behavior may have wildly different motivations. Say I grab for an apple because I want to eat it, while my wife grabs for it because she thinks it's rotten and wants to throw it away.
Furthermore, it seems to me that you treat 'informational complexity' as essentially an extensive quantity, i.e. if a bee has a given level of complexity, then a swarm must necessarily be more complex. But complexity is somewhat surprising in that the combination of complex things may itself be a very simple thing. One can formalize this in terms of algorithmic complexity: it's easy to specify the set of 'all n-bit strings', yet any given n-bit string might be maximally complex, i.e. impossible to describe in a way that's shorter than n bits.
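To see this concretely, here is a small Python sketch that uses compressed size as a very crude proxy for description length (true Kolmogorov complexity is uncomputable, so this only illustrates the asymmetry, it doesn't measure it):

    import os
    import zlib

    # Crude proxy for description length: size after zlib compression.
    # (Kolmogorov complexity itself is uncomputable; this is only an
    # illustration of the idea.)
    def approx_description_length(data: bytes) -> int:
        return len(zlib.compress(data, 9))

    n = 4096  # bytes, i.e. a 32768-bit string
    random_string = os.urandom(n)  # a typical string: essentially incompressible
    set_spec = b"the set of all 32768-bit strings"  # the whole *set* in one line

    print(approx_description_length(random_string))  # close to (or above) 4096
    print(approx_description_length(set_spec))       # a few dozen bytes

A typical random string admits no description much shorter than itself, while the set containing every such string is specified in one short sentence.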
All in all, I think it's a good approach to try and systematize the study of intentional action in the way you do, perhaps without worrying too much about how this sort of intentionality actually works. That way, one can get some real work done without getting stuck in the definitional quibbles that sometimes mar this sort of discussion.