Information Analysis

  

Problem statement: Given a probability distribution, what information do the variables contain?


Data Analysis

The theory of causality is based on the conditional independencies among the variables. We use mutual information as a form-independent measure of dependency, in contrast with the correlation coefficient, which only measures the association of linearly correlated variables. Since the definition of information is based on probability distributions, these must be estimated from the observed data. Kernel Density Estimation offers a modern technique to do this.
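To illustrate the definition underlying this measure, the following sketch computes the mutual information I(X;Y) of two discrete variables directly from their joint probability table. This is only an illustration of the formula, not the lab's KDE-based estimator; the class and method names are our own.

```java
// Illustrative sketch (hypothetical, not part of the lab's code):
// mutual information of two discrete variables from a joint probability table.
public class MutualInformation {

    // I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
    static double mutualInformation(double[][] joint) {
        int nx = joint.length, ny = joint[0].length;
        double[] px = new double[nx]; // marginal distribution of X
        double[] py = new double[ny]; // marginal distribution of Y
        for (int i = 0; i < nx; i++)
            for (int j = 0; j < ny; j++) {
                px[i] += joint[i][j];
                py[j] += joint[i][j];
            }
        double mi = 0.0;
        for (int i = 0; i < nx; i++)
            for (int j = 0; j < ny; j++)
                if (joint[i][j] > 0.0) // terms with p(x,y) = 0 contribute nothing
                    mi += joint[i][j]
                        * (Math.log(joint[i][j] / (px[i] * py[j])) / Math.log(2));
        return mi;
    }

    public static void main(String[] args) {
        // Independent variables: p(x,y) = p(x)p(y), so I(X;Y) = 0.
        double[][] independent = { {0.25, 0.25}, {0.25, 0.25} };
        // Perfectly dependent binary variables: I(X;Y) = 1 bit.
        double[][] dependent = { {0.5, 0.0}, {0.0, 0.5} };
        System.out.println("I(independent) = " + mutualInformation(independent));
        System.out.println("I(dependent)   = " + mutualInformation(dependent));
    }
}
```

Note that, unlike the correlation coefficient, this quantity is zero exactly when the variables are independent, whatever the form of their dependency.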


 

This code is under active development; please excuse us if some inconvenient 'features' appear. Feel free to annoy our young responsible developer with complaints.


Alternatively, run it as a standalone application: download the jar file and launch the module with java -Xbootclasspath/p:causalLearningWithKde.jar be.ac.vub.tw.statistics.DataAnalysisPanel (under Windows: Start => Run => execute cmd to open a command shell, then go to the download directory with cd <directory>).

      More information: Jan Lemeire (jan.lemeire@vub.ac.be)
   
  Interactive tutorial


last updated: February 28th, 2006 by Jan Lemeire
Parallel Computing Lab, VUB