How do I represent conditional entropy between 3 (or more) variables in terms of joint entropy?

by Zac   Last Updated January 15, 2018 00:19 AM

For two random variables $X$ and $Y$, the conditional entropy $H(Y|X)$ can be written in terms of the joint entropy $H(X,Y)$ as:

$H(Y|X) = H(X,Y) - H(X)$
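As a quick sanity check, here is a minimal numerical sketch of that identity, assuming a made-up $2 \times 2$ joint pmf (the numbers have no significance):

```python
import numpy as np

# Toy joint pmf p(x, y); rows index X, columns index Y (values made up for illustration)
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy)              # joint entropy H(X,Y)
H_x = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)

# Conditional entropy straight from the definition: -sum p(x,y) log2 p(y|x)
p_y_given_x = p_xy / p_xy.sum(axis=1, keepdims=True)
H_y_given_x = -np.sum(p_xy * np.log2(p_y_given_x))

print(H_y_given_x, H_xy - H_x)    # both come out to about 0.846 bits
```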

Following page 21 of Cover & Thomas, the chain rule (Theorem 2.5.1) gives:

$H(X_1,X_2,...,X_n) = \sum_{i=1}^{n} H(X_i|X_{i-1},...,X_1)$
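Written out for $n = 3$, for example, this reads:

$H(X_1,X_2,X_3) = H(X_1) + H(X_2|X_1) + H(X_3|X_1,X_2)$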

Unfortunately, this (and all the other material I have seen) shows the joint entropy in terms of a sum of conditional entropies.

I want to go the other way round and represent a conditional entropy using only joint (and marginal) entropies.

How do I disentangle the sum term to find, e.g., $H(Y|X_1,X_2,X_3)$?
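To make the target concrete: the two-variable identity above suggests the analogous form

$H(Y|X_1,X_2,X_3) = H(X_1,X_2,X_3,Y) - H(X_1,X_2,X_3)$

and the sketch below checks a plug-in estimate of exactly that difference on made-up discrete data (the variables and sample size are arbitrary, purely for illustration):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
# Made-up binary variables; y is deterministic given (x1, x2, x3), so H(Y|X1,X2,X3) should be 0
x1, x2, x3 = (rng.integers(0, 2, 10_000) for _ in range(3))
y = (x1 ^ x2) | x3

def joint_entropy(*cols):
    """Plug-in estimate (in bits) of the joint entropy of discrete sample columns."""
    counts = Counter(zip(*cols))
    p = np.array(list(counts.values())) / len(cols[0])
    return -np.sum(p * np.log2(p))

# Conditional entropy expressed as a difference of joint entropies only
H_cond = joint_entropy(x1, x2, x3, y) - joint_entropy(x1, x2, x3)
print(H_cond)   # ~0, as expected for a deterministic y
```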


