# How do I represent conditional entropy between 3 (or more) variables in terms of joint entropy?

by Zac   Last Updated January 15, 2018 00:19 AM

For two random variables $X$ and $Y$, the conditional entropy $H(Y|X)$ can be given in terms of the joint entropy $H(X,Y)$ by:

$H(Y|X) = H(X,Y) - H(X)$
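As a quick sanity check of this identity, here is a minimal numerical sketch, assuming a hypothetical 2×2 joint distribution (any valid joint would do). It compares $H(X,Y) - H(X)$ against the direct definition $H(Y|X) = -\sum_{x,y} p(x,y)\log_2 p(y|x)$:

```python
import numpy as np

# Hypothetical joint distribution p(x, y); rows index x, columns index y.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy)             # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))  # marginal entropy H(X)

# Conditional entropy via the identity H(Y|X) = H(X,Y) - H(X)
H_y_given_x = H_xy - H_x

# Direct computation: H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x)
p_y_given_x = p_xy / p_xy.sum(axis=1, keepdims=True)
H_direct = -np.sum(p_xy * np.log2(p_y_given_x))

assert np.isclose(H_y_given_x, H_direct)
```

Both computations agree, confirming the rearranged identity.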

Following page 21 of Cover & Thomas, the Chain Rule gives theorem 2.5.1:

$H(X_1,X_2,\ldots,X_n) = \sum_{i=1}^n H(X_i \mid X_{i-1},\ldots,X_1)$
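Note that each conditional term in the chain rule can itself be rewritten as a difference of joint entropies, $H(X_i \mid X_{i-1},\ldots,X_1) = H(X_1,\ldots,X_i) - H(X_1,\ldots,X_{i-1})$, so the sum telescopes. A minimal sketch verifying this for three hypothetical binary variables (random joint distribution, names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint distribution over three binary variables X1, X2, X3.
p = rng.random((2, 2, 2))
p /= p.sum()

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Left-hand side of the chain rule: joint entropy H(X1, X2, X3)
lhs = entropy(p)

# Right-hand side: H(X1) + H(X2|X1) + H(X3|X1,X2), with each
# conditional term expanded as a difference of joint entropies.
H1 = entropy(p.sum(axis=(1, 2)))   # H(X1)
H12 = entropy(p.sum(axis=2))       # H(X1, X2)
rhs = H1 + (H12 - H1) + (lhs - H12)

assert np.isclose(lhs, rhs)
```

The telescoping on the right-hand side is exactly what makes it possible to express any single conditional term using joint entropies alone.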

Unfortunately, this (and all the other material I have seen) shows the joint entropy in terms of a sum of conditional entropies.

I want to go the other way round, and represent a conditional entropy using only the joint/single entropies.

How do I disentangle the sum to find, e.g., $H(Y \mid X_1,X_2,X_3)$?
