Interactive Tutorials


3. Information Theory 


Problem statement: Given a probability distribution, what information do the variables contain?


Information

The information content (entropy) of a random variable is defined as the sum over all its probabilities of -p·log p: H(X) = -Σx p(x)·log p(x).
It represents the uncertainty of the variable: a uniform distribution has maximal entropy, a deterministic one has zero.
Applet:
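
For readers without the applet, a minimal Python sketch of this computation follows; the example distributions are made up for illustration and do not come from the applet.

```python
import math

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum of p(x)·log p(x), in bits by default.

    Outcomes with p(x) = 0 contribute nothing (0·log 0 is taken as 0).
    """
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin has maximal uncertainty (1 bit); a biased coin has less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47
```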



Mutual information

The mutual information of two variables defines how much information one variable contains about the other. It is therefore defined as the decrease in the uncertainty of one variable obtained by knowing the other. In probabilistic terms, it is the decrease in entropy obtained by conditioning: I(X;Y) = H(X) - H(X|Y).
Applet:
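
Again as a sketch (the joint distributions below are illustrative, not taken from the applet), the equivalent formula I(X;Y) = Σx,y p(x,y)·log( p(x,y) / (p(x)·p(y)) ) can be computed directly from a joint distribution:

```python
import math

def mutual_information(joint, base=2):
    """I(X;Y) = sum over x,y of p(x,y)·log( p(x,y) / (p(x)·p(y)) ), in bits.

    `joint` is a 2-D list of joint probabilities p(x, y).
    """
    px = [sum(row) for row in joint]        # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
    return sum(
        pxy * math.log(pxy / (px[i] * py[j]), base)
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0  # zero-probability cells contribute nothing
    )

# Perfectly correlated binary variables share 1 bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Independent variables share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```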


last updated: January 24th, 2006 by Jan Lemeire
Parallel Computing Lab, VUB