Tuesday, August 5, 2008

Relative Entropy

"Workin' on a mystery,
Goin' wherever it leads"

[ASIDE: This may come to naught, but I get something from forcing myself to try and explain what I'm thinking, especially if I am hung up on a particular idea. I'm also violating my unwritten 'one post a day' rule, but I'm trying to capture these thoughts while they are still semi-coherent. Bear with me...]

We'll pretend for the moment that I didn't just google the phrase 'relative entropy' and discover that it refers to something that I'm fairly sure I don't understand. :)

Today I had a moment of clarity (yes, 'clarity' is relative) while reading Teleportation: The Impossible Leap, by David Darling (2005). In the chapter called 'Dataverse', Darling gives a clear and coherent description of both Shannon entropy and thermodynamic entropy, and relates the two to each other. Somewhere in this description, I get sidetracked by the following line of thought...

Darling uses the analogy of a jar of black and white marbles. When all the white marbles are at the bottom of the jar and all the black marbles are at the top, the jar is in a state of low entropy. When the marbles appear to be randomly mixed, the jar has a larger value of entropy. So far I'm on board with this, until I start to wonder...

What happens if we use a jar of red and green marbles instead? And what happens if the person who is looking at the jar of marbles is color-blind? To that person, there is no discernible difference between the 'ordered' state of marbles (where all the red marbles are on top and all the green marbles are on the bottom) and the 'chaotic' state of marbles (where red and green marbles are randomly mixed together). (If you want to nitpick about the red/green colorblind analogy, look here for additional examples of how the amount of information available changes depending on the tools one has for accessing it.) "Compared with low entropy states, high-entropy states contain very little information." At this point there is a demonstrable difference in the amount of information available, depending on the perceptual limitations of the person who is looking at the marbles.
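To make that last point concrete for myself, here's a rough Python sketch (my own toy example, not anything from Darling) that treats the 'information available' as the Shannon entropy of whatever categories the observer can actually distinguish:

    import math
    from collections import Counter

    def shannon_entropy(observations):
        """Shannon entropy (in bits) of the categories an observer reports."""
        counts = Counter(observations)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # The same jar of 4 red and 4 green marbles...
    jar = ["red"] * 4 + ["green"] * 4

    # ...seen by someone who can tell the two colors apart:
    print(shannon_entropy(jar))                    # 1.0 bit per marble

    # ...and by a red/green color-blind observer, who reports only one category:
    print(shannon_entropy(["marble"] * len(jar)))  # zero bits (prints -0.0)

Same jar, same marbles, but the number of bits you can extract depends on the categories you bring to it.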

"A highly chaotic state, which is far more likely to occur than an orderly one, corresponds to a large value of entropy." What defines 'chaos', as opposed to 'order'? There is 'order' if I am able to identify a pattern within the information. The ease with which I can identify the pattern(s) within any given set of information can vary, can change over time, and appears to be dependent on my existing knowledge structures. If I have a well-developed filter for processing a specific type of information, then that particular data set appears more 'ordered' to me than it does to someone who has no experience with that type of data.

Hmm... Time to cross-check our sources. ;) "Entropy is the number of different microstates that correspond to the same macrostate." I guess the egg illustration didn't make it into the online edition of this article, but the analogy was basically "There are more ways for an egg to be broken than for it to be unbroken, therefore the state of 'broken-ness' has a higher entropy and the system will favor a transition from unbroken to broken over a transition from broken to unbroken." I'm very tempted to point out that broken/unbroken is a completely arbitrary way to divide a set of perceptual experiences. It may be a highly logical way to classify eggs, because the utility of an egg differs depending on whether it is broken or unbroken, but the classification is still a purely arbitrary cognitive tool. I have managed to bring this argument back to object classifications and boundaries, which are my pet hypotheses right now for how we will identify dynamics to explain classical physics in five dimensions. :) What I really need now is a good analogy to drive home this point... (thinking) (thinking)

Some things you just can't do without coffee.
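In lieu of a proper analogy, here's another rough Python sketch (again, my own toy example, not from Darling or the article) that counts microstates for the marble jar and assigns each macrostate a Boltzmann-style entropy, S = ln W, with the constant dropped. What I want it to show is that the entropy value you get depends entirely on how you choose to group microstates into macrostates, i.e. on the classification you impose:

    import math
    from itertools import permutations

    # Every distinct way to stack 4 black (B) and 4 white (W) marbles in 8 slots,
    # reading from the bottom of the jar (index 0) to the top (index 7).
    microstates = set(permutations("BBBBWWWW"))   # 70 distinct arrangements

    def entropy_by_macrostate(classify):
        """Boltzmann-style entropy S = ln W for each macrostate, where W is
        the number of microstates that the classification lumps together."""
        counts = {}
        for state in microstates:
            label = classify(state)
            counts[label] = counts.get(label, 0) + 1
        return {label: math.log(w) for label, w in counts.items()}

    # Classification 1: 'sorted' (all white on the bottom) vs. everything else
    by_sortedness = lambda s: "sorted" if s == tuple("WWWWBBBB") else "mixed"

    # Classification 2: just count how many white marbles sit in the top half
    by_top_half = lambda s: str(s[4:].count("W")) + " white in top half"

    print(entropy_by_macrostate(by_sortedness))  # 'mixed': ln 69 ~ 4.23, 'sorted': ln 1 = 0
    print(entropy_by_macrostate(by_top_half))    # five macrostates, five entropy values

Broken/unbroken for eggs, sorted/mixed for marbles: the microstate count (and hence the entropy) only exists once you've decided which distinctions count, which is exactly the part that feels arbitrary, or at least observer-dependent, to me.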

So I'll end this post by transcribing some of the notes I jotted on a piece of paper. Perhaps in the morning, I'll see this argument in a different light. Or perhaps someday I'll be able to link the idea of relative entropy with the thermodynamics of neural information processing. If entropy really is relative in this sense, it should show up as corresponding differences in the energy expended during cognitive processing...

Transcribed from notes:
  • Can entropy be defined by object relationships, classifications or categories?
  • Do relationship(s) to existing knowledge determine whether a state is perceived to have 'low' or 'high' entropy?
  • One's ability to extract or identify information is based on existing knowledge structures and paradigm recognition patterns.
  • The same state can give different amounts of information depending on the predispositions or perspectives of the person accessing it. Does that change its entropy value?
  • Is entropy relative? Is entropy a measure of mis/match to existing knowledge? Can entropy be mitigated or altered by using different sets of pattern classifications?
