by Zac
Last Updated January 15, 2018 00:19 AM

For two random variables $X$ and $Y$, the conditional entropy $H(Y|X)$ can be given in terms of the joint entropy $H(X,Y)$ by:

$H(Y|X) = H(X,Y) - H(X)$

Following page 21 of Cover & Thomas, the chain rule gives theorem 2.5.1:

$H(X_1,X_2,\ldots,X_n) = \sum_{i=1}^n H(X_i \mid X_{i-1},\ldots,X_1)$

Unfortunately, this (and all the other material I have seen) shows the joint entropy in terms of a sum of conditional entropies.

I want to go the other way round, and represent a conditional entropy using only the joint/single entropies.

How do I disentangle the sum term to find e.g. $H(Y \mid X_1,X_2,X_3)$?

You can always group random variables together and treat them as single variables.

So if we group $X_1, X_2, X_3$ together, you can do: $$ H(Y,X_1,X_2,X_3)=H(X_1,X_2,X_3) + H(Y|X_1,X_2,X_3) $$

Therefore by rearranging you get: $$ H(Y|X_1,X_2,X_3)=H(Y,X_1,X_2,X_3) - H(X_1,X_2,X_3) $$

which is what you suggested.

I should also note that if you continue to group variables together and keep applying the chain rule for only two (groups of) variables, you will eventually recover the result of theorem 2.5.1.
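As a sanity check, the identity $H(Y \mid X_1,X_2,X_3) = H(Y,X_1,X_2,X_3) - H(X_1,X_2,X_3)$ can be verified numerically. Here is a minimal sketch using a randomly generated joint distribution over four binary variables; the helper names (`joint_entropy`, `marginalize`, `cond_entropy_direct`) are my own, not from Cover & Thomas:

```python
import itertools
import math
import random

def joint_entropy(p):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginalize(p, keep):
    """Marginal distribution over the coordinates listed in `keep`."""
    out = {}
    for outcome, q in p.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + q
    return out

def cond_entropy_direct(p, target, given):
    """H(target | given) computed straight from the definition,
    i.e. the expected entropy of target over the conditioning values."""
    p_given = marginalize(p, given)
    h = 0.0
    for g, qg in p_given.items():
        cond = {}
        for outcome, q in p.items():
            if tuple(outcome[i] for i in given) == g:
                key = tuple(outcome[i] for i in target)
                cond[key] = cond.get(key, 0.0) + q / qg
        h += qg * joint_entropy(cond)
    return h

# Random joint distribution over (Y, X1, X2, X3), each binary.
random.seed(0)
outcomes = list(itertools.product([0, 1], repeat=4))
weights = [random.random() for _ in outcomes]
total = sum(weights)
p = {o: w / total for o, w in zip(outcomes, weights)}

# Identity: H(Y | X1,X2,X3) = H(Y,X1,X2,X3) - H(X1,X2,X3)
lhs = cond_entropy_direct(p, target=(0,), given=(1, 2, 3))
rhs = joint_entropy(p) - joint_entropy(marginalize(p, (1, 2, 3)))
print(abs(lhs - rhs) < 1e-12)
```

The left-hand side uses only the definition of conditional entropy, while the right-hand side uses only joint/marginal entropies, so agreement between the two is exactly the rearranged chain rule above.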
