October 11, 2010

Entropy for Dummies

The entropy of a distribution is the expected amount of information you get from a single sample of it. You can think of it as "how surprised are you, on average, when you see a sample from that distribution".

An unbiased coin toss has one bit of entropy, since both outcomes are equally likely, but a toss of a coin with heads on both sides has entropy 0, since you're never surprised by its result.
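
As a sanity check, here's a tiny Python sketch (the entropy function below is my own, not taken from any particular library) that computes the entropy of those two coins:

    import math

    def entropy(probs):
        # Shannon entropy in bits: the average "surprise" of one sample.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(entropy([1.0, 0.0]))  # two-headed coin: 0.0 bits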

The entropy can also be seen as a constant (log₂ of the number of possible outcomes) minus the KL divergence of the distribution from the uniform distribution (and this divergence, as you know, is the number of bits you "save" per sample when you use a code based on the actual distribution instead of one based on the uniform distribution).
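
To make that concrete, here's a small sketch (again with my own helper functions and a made-up example distribution p over three outcomes) checking that the entropy equals log₂(n) minus the KL divergence from the uniform distribution:

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def kl_divergence(p, q):
        # Bits saved per sample by coding with p instead of q.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.7, 0.2, 0.1]              # an arbitrary distribution over n = 3 outcomes
    uniform = [1 / 3] * len(p)

    # H(p) = log2(n) - D_KL(p || uniform)
    print(entropy(p))                                     # ~1.157 bits
    print(math.log2(len(p)) - kl_divergence(p, uniform))  # same value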

It's best to get used to it by seeing how it works in theorems and algorithms, and for this I recommend David MacKay's book; you can get the PDF from here.

Source.

3 comments:

  1. Hello, my name is Madison.
I am taking a course online and it asked me to write a few paragraphs on entropy... which I know nothing about and don't understand. I'll paste the directions, and hopefully you can help me out!

    Take a look at the following uses of the term entropy by several historical figures: Anton Chekhov considered by some as being one of the greatest short story writers said, "Only entropy comes easy." Vaclav Havel, playwright and the first president of the Czech Republic said, "Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy." Rex Stout, an American crime writer said, "The minute those two little particles inside a woman's womb have joined together, billions of decisions have been made. A thing like that has to come from entropy." Choose one of the quotes above and discuss how the use of the word entropy is similar or different than your current understanding of the term and how it applies to the study of chemistry.

    If you can help that'd be great, thank you!

  2. In effect, entropy describes how much information you can expect to get from something. I tried to look up the definition of entropy in chemistry, and as far as I understand, it is mostly related to thermodynamic systems. For example, if a system is well ordered, so that you can easily tell what's going on in it, little information is needed to describe the system, and hence it has low entropy. But if a system is more chaotic and you need lots and lots of information to describe it, it has high entropy.

    According to the second law of thermodynamics, the entropy of a system always increases or remains constant; it never decreases without extra work being done.

    I hope this gives some idea of what entropy means in chemistry.

    Some references:

    http://www.shodor.org/unchem/advanced/thermo/#entropy
    http://chemistry.about.com/od/chemistryglossary/a/entropydef.htm
    http://en.wikipedia.org/wiki/Entropy

  3. That's a big help, thanks!
