twoway.entropy {truecluster}                                R Documentation
Description

These functions calculate bivariate information-theoretic measures such as joint entropy, conditional entropy, and mutual information.
Usage

twoway.entropy(p, grain=c("Totals","Margins","Cells"))
print.twoway.entropy(x, ...)
plot.twoway.entropy(x, ...)
Arguments

p: matrix of two-way probabilities (or frequencies)
x: an object of class twoway.entropy
grain: one of c("Totals","Margins","Cells"), determining the granularity of the return value
...: further arguments passed to plot and print
Value

An object of class twoway.entropy with components

Totals: always returned: a list with components H (joint entropy), Ha (row entropy), Hab (row conditional entropy given columns), Hb (column entropy), Hba (column conditional entropy given rows), and Im (mutual information); see the sketch below for the standard definitions

Margins: returned unless grain="Totals": pa (row probabilities), ha (row entropies), hab (columnwise conditional entropies), pb (column probabilities), hb (column entropies), hba (rowwise conditional entropies)

Cells: returned if grain="Cells": p (joint probabilities), pab (columnwise conditional probabilities), pba (rowwise conditional probabilities), h (joint entropies), hab (columnwise conditional entropies), hba (rowwise conditional entropies)
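The Totals components follow the usual information-theoretic identities. As an informal reference only (this is not the package's implementation, and the log base used by the package is not documented here), a minimal sketch of the standard definitions in bits, with illustrative variable names, is:

# Minimal sketch (not the package implementation) of the standard
# definitions behind the Totals components; log base 2 (bits) is
# assumed here, following MacKay (2003).
plogp <- function(x) ifelse(x > 0, -x * log2(x), 0)  # 0*log(0) taken as 0
pj <- matrix(c(0.4, 0.1, 0.1, 0.4), 2)  # illustrative joint probabilities
pj <- pj / sum(pj)                      # normalize (needed for frequencies)
pa <- rowSums(pj)                       # row marginals
pb <- colSums(pj)                       # column marginals
H   <- sum(plogp(pj))                   # joint entropy H(A,B)
Ha  <- sum(plogp(pa))                   # row entropy H(A)
Hb  <- sum(plogp(pb))                   # column entropy H(B)
Hab <- H - Hb                           # H(A|B): row entropy given columns
Hba <- H - Ha                           # H(B|A): column entropy given rows
Im  <- Ha + Hb - H                      # mutual information I(A;B)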
Author(s)

Jens Oehlschlägel
References

MacKay, David J.C. (2003). Information Theory, Inference, and Learning Algorithms (chapter 8). Cambridge University Press.
See Also

shannon.information, dist.entropy, Kullback.Leibler, log
Examples

# Exercise 8.6 from MacKay (2003)
# 1/Inf yields the zero cells of the joint distribution
p <- 1/matrix(c(8,16,16,4,16,8,16,Inf,32,32,16,Inf,32,32,16,Inf), 4)
twoway.entropy(p)
twoway.entropy(p, grain="Cells")
str(twoway.entropy(p, grain="Cells"))
plot(twoway.entropy(p), main="Plot from MacKay (2003), chapter 8.1")
# Highly correlated data: discretize by rounding, then cross-tabulate
# to obtain two-way frequencies
x <- rnorm(100)
y <- x + rnorm(100, sd=0.1)
a <- round(x)
b <- round(y)
p <- table(a, b)
plot(twoway.entropy(p), main=paste("correlation=", round(cor(x,y), 2)))
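# Added illustration (not from the original examples): with
# (near-)independent variables the mutual information should be close
# to zero; the exact values depend on the random sample.
x2 <- rnorm(100)
y2 <- rnorm(100)
p2 <- table(round(x2), round(y2))
plot(twoway.entropy(p2), main="near-independent data")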