Mutual information: a universal measure of statistical dependence

Kinney, J. B. (2004) Mutual information: a universal measure of statistical dependence. Biomedical Computation Review, 10 (2). p. 33.

URL: http://www.bcr.org/content/mutual-information-univ...

Abstract

A deluge of data is transforming science and industry. Many hope that this massive flux of information will reveal new vistas of insight and understanding, but extracting knowledge from Big Data requires appropriate statistical tools. Often, very little can be assumed about the types of patterns lurking in large data sets. In these cases it is important to use statistical methods that do not make strong assumptions about the relationships one hopes to identify and measure. In this tutorial we consider the specific problem of quantifying how strongly two variables depend on one another. Even for data sets containing thousands of different variables, assessing such pairwise relationships remains an important analysis task. Yet despite the simplicity of this problem and how frequently it is encountered in practice, the best way of solving it has not been settled.
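The quantity the tutorial discusses, the mutual information I(X;Y) = Σ p(x,y) log₂[p(x,y) / (p(x)p(y))], can be estimated from paired samples in a few lines. Below is a minimal sketch using a naive histogram (plug-in) estimator with NumPy; the function name, bin count, and test data are illustrative assumptions, not part of the original article, and this simple estimator is known to be biased upward for finite samples.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Estimate mutual information I(X;Y) in bits from paired samples,
    using a simple 2D histogram (naive plug-in estimator; hypothetical
    helper, not from the original article)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()      # empirical joint distribution p(x, y)
    px = pxy.sum(axis=1)           # marginal p(x)
    py = pxy.sum(axis=0)           # marginal p(y)
    outer = np.outer(px, py)       # independence model p(x) * p(y)
    nonzero = pxy > 0              # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / outer[nonzero])))

# Strong dependence yields large MI; independent samples give a small
# value (near zero, up to the estimator's finite-sample bias).
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
mi_dependent = mutual_information(x, x + 0.1 * rng.normal(size=10_000))
mi_independent = mutual_information(x, rng.normal(size=10_000))
```

Note that, unlike a correlation coefficient, this estimate is sensitive to nonlinear as well as linear dependence, which is the sense in which mutual information is a "universal" measure.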

Item Type: Paper
Subjects: bioinformatics > computational biology
bioinformatics > computational biology > statistical analysis
CSHL Authors:
Communities: CSHL labs > Kinney lab
Depositing User: Matt Covey
Date: 2004
Date Deposited: 30 Apr 2015 19:41
Last Modified: 30 Apr 2015 19:41
URI: http://repository.cshl.edu/id/eprint/31368
