Entropy

Reading about mutual information between amino acids in protein-protein interaction pairs, and got distracted:
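(The passage refers to an equation that didn't survive the copy-paste; presumably it's the standard definition of Shannon entropy, H(X) = −Σ_x P_X(x) log₂ P_X(x), in bits.)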

Although not particularly obvious from this equation, H(X) has a very concrete interpretation: Suppose x is chosen randomly from the distribution P_X(x), and someone who knows the distribution P_X(x) is asked to guess which x was chosen by asking only yes/no questions. If the guesser uses the optimal question-asking strategy, which is to divide the probability in half on each guess by asking questions like “is x greater than x_0?”, then the average number of yes/no questions it takes to guess x lies between H(X) and H(X)+1 (Cover and Thomas, 1991). This gives quantitative meaning to “uncertainty”: it is the number of yes/no questions it takes to guess a random variable, given knowledge of the underlying distribution and taking the optimal question-asking strategy.

Scholarpedia: Mutual information
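As a quick sanity check of that bound, here's a minimal Python sketch (mine, not from the Scholarpedia article): it computes the entropy of a small example distribution and the expected number of yes/no questions under an optimal strategy, realized here as a Huffman code, whose expected codeword length L is known to satisfy H(X) ≤ L < H(X) + 1.

    import heapq
    import math

    def entropy(probs):
        """Shannon entropy in bits: H(X) = -sum_x p(x) log2 p(x)."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def expected_questions(probs):
        """Expected number of yes/no questions under a Huffman strategy.

        The expected codeword length of a Huffman code equals the sum,
        over all internal nodes of the code tree, of the probability
        mass beneath them.
        """
        heap = [(p, i) for i, p in enumerate(probs)]  # int tiebreaker for equal probs
        heapq.heapify(heap)
        next_id = len(probs)
        total = 0.0
        while len(heap) > 1:
            p1, _ = heapq.heappop(heap)
            p2, _ = heapq.heappop(heap)
            total += p1 + p2  # this merge adds one question for every outcome below it
            heapq.heappush(heap, (p1 + p2, next_id))
            next_id += 1
        return total

    # An arbitrary example distribution, chosen only for illustration.
    probs = [0.4, 0.3, 0.2, 0.1]
    H = entropy(probs)
    L = expected_questions(probs)
    print(f"H(X) = {H:.3f} bits, expected questions = {L:.3f}")
    assert H <= L < H + 1  # the Cover & Thomas bound quoted above

On this distribution it prints H(X) = 1.846 bits against 1.900 expected questions, comfortably inside the bound.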


One thought on “Entropy”

  1. For some reason, reading and re-reading this reminded me of this Far Side comic from back in the day. The search terms to actually find this image on Google probably lie between H(X) and H(X)+1 as well.

    here
