
Friday, April 07, 2006

New York Times Book Review: 'Programming the Universe,' by Seth Lloyd

From the review:
"Broadly speaking, entropy is the amount of disorder and information in a system. Take, for example, a fresh, unshuffled deck of cards. In that state it has low entropy and contains little information. Just two pieces of data (the hierarchy of suits and the relative ranks of the cards) tell you where to find every card in the deck without looking. Give it a good shuffle and look again. The deck has a lot of entropy and a lot of information. If you want to locate a particular card, you have to hunt through the entire deck. There is only one perfectly ordered state but about 10^68 disordered ones, which is why you will never, ever accidentally shuffle the deck back into its original order."
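The review's figure is easy to check: a deck of 52 distinct cards admits 52! orderings, and 52! is a 68-digit number, about 8 × 10^67. A quick sketch:

```python
from math import factorial

# A standard deck has 52 distinct cards, so the number of
# possible orderings is 52! (52 factorial).
orderings = factorial(52)

# 52! is a 68-digit number, roughly 8 x 10^67 -- hence the
# review's "about 10^68 disordered ones".
print(orderings)
print(len(str(orderings)))  # number of digits
```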
This definition of entropy is one that we hear quite often. Entropy is the amount of disorder or randomness. What does that really mean? Frankly, I haven't a clue. What I do have is the belief that others do not have a clue either. They have equations and they speak with authority, but in the end, what does randomness mean?

Take the deck of cards, for example: the only way one can say that a new deck of cards is ordered is if one knows what an ordered deck looks like--- before you look at the deck. If you think a deck is ordered when the red and black suits alternate, then that arrangement would be an ordered deck, and a new deck would not be ordered, because its suits are grouped together. And if you counted in some way other than how we are taught to count, a different arrangement of the cards would, by definition, be the ordered one.

In short, entropy as it is defined here, and this is how it is often defined, requires one to have outside information with which to measure "randomness." This information cannot come from the cards; it is external to the system. Thus, what one observer sees as order may not be order at all to someone else. Is entropy then dependent on the observer? Perhaps that is what is really happening: each observer computes an entropy, and different observers may compute different entropies.
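The point about observer-dependence can be made concrete. Here is a hypothetical sketch (the two "order" tests and their names are my own illustration, not anything from the book or review): the same fresh-pack deck counts as ordered under one observer's convention and disordered under another's.

```python
# Hypothetical illustration: whether a deck is "ordered" depends on
# which convention the observer brings to it.
RANKS = list(range(1, 14))
SUITS = ["spades", "hearts", "clubs", "diamonds"]

# Fresh-pack order: each suit grouped together, ranks ascending.
new_deck = [(rank, suit) for suit in SUITS for rank in RANKS]

def ordered_by_suit(deck):
    # Observer A: "ordered" means all cards of each suit are contiguous.
    runs = []
    for _, suit in deck:
        if not runs or runs[-1] != suit:
            runs.append(suit)
    return len(runs) == len(SUITS)

def ordered_by_color(deck):
    # Observer B: "ordered" means red and black strictly alternate.
    def color(suit):
        return "red" if suit in ("hearts", "diamonds") else "black"
    return all(color(deck[i][1]) != color(deck[i + 1][1])
               for i in range(len(deck) - 1))

print(ordered_by_suit(new_deck))   # True: perfectly ordered to observer A
print(ordered_by_color(new_deck))  # False: disordered to observer B
```

The deck hasn't changed between the two calls; only the observer's definition of "ordered" has.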

What I know for certain is that I don't know what random means, I don't know what ordered means and I certainly don't know what entropy means.

Perhaps you could help me out?

1 comment:

David said...

This is actually my point. What says that something is random? It is, as the quoted review says, that original, unshuffled deck. To know how to find a card is to know what the order is before you look: to know that a 3 follows a 2. That knowledge, however, is outside the deck itself. Thus, randomness depends both on the deck and on outside knowledge beyond the deck.

Consequently, to recognize a sequence of cards as random is to know what an unshuffled deck is and what a shuffled deck is. That information is beyond the deck.

So, if you have a sequence of numbers, say, how can one say whether the sequence is random if one does not have any outside information? That's my point, and I think it's subtle and overlooked.
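The comment's worry about number sequences can be illustrated with a sketch (my own example, not the commenter's): a sequence that is completely determined by a tiny seeded program, yet looks unremarkable to an observer who lacks that outside information.

```python
import random

# Hypothetical illustration: this entire 1000-digit sequence is fixed
# by one piece of outside information -- the seed 42. Without knowing
# that, a naive statistical look reveals nothing suspicious.
rng = random.Random(42)
seq = [rng.randint(0, 9) for _ in range(1000)]

# A simple frequency count: each digit turns up roughly 100 times,
# just as it would for a "truly" random sequence.
counts = [seq.count(d) for d in range(10)]
print(counts)
```

Whether we call `seq` random seems to hinge on whether we know about the seed, which is exactly the kind of knowledge external to the sequence itself.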