Problem statement: Given a probability
distribution, how much information do the variables contain?
Information
The information (entropy) of a
random variable X is defined as the negative sum of p log p over all
outcomes: H(X) = - sum_x p(x) log p(x).
It represents the uncertainty of
the variable.
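The definition above can be sketched directly in code. This is a minimal illustration (the function name and example distributions are not from the original notes); it computes the entropy of a discrete distribution in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.
    Terms with p == 0 contribute nothing (0 * log 0 = 0 by convention)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))     # → 1.0
# A uniform choice among four outcomes: 2 bits.
print(entropy([0.25] * 4))     # → 2.0
```

Note that skewed distributions give less than the uniform maximum, matching the intuition that a more predictable variable carries less uncertainty.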
Mutual information
The mutual information
of two variables quantifies how much information one
variable contains about the other. It is therefore defined as the
decrease in the uncertainty of one variable from knowing the other. In
probabilistic terms, it is the decrease in entropy obtained by conditioning:
I(X;Y) = H(X) - H(X|Y).
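The entropy-decrease definition can be sketched from a joint distribution using the equivalent identity I(X;Y) = H(X) + H(Y) - H(X,Y). This is an illustrative sketch (the function names and example tables are assumptions, not from the notes):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), equivalent to H(X) - H(X|Y).
    `joint` is a 2-D list with joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# Two independent fair coins: knowing one says nothing about the other.
indep = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(indep))  # → 0.0
# X = Y (perfectly correlated coins): knowing one removes all uncertainty.
equal = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(equal))  # → 1.0
```

The two extremes bracket the definition: independence gives zero mutual information, and perfect dependence gives the full entropy of either variable.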