I have a question regarding the definition of entropy as the expected value of the random variable $\log \frac{1}{p(X)}$:
$H(X) = E \log \frac{1}{p(X)}$,
where $X$ is drawn according to the probability mass function $p(x)$.
The problem is that I still don't understand two things:
1) How is this formula derived from the original formula for entropy,
$H(X) = - \sum_{x \in X} p(x) \log p(x)$.
2) Even without knowing how to derive the second formula, what is the meaning of $p(X)$? Can you show how to find the entropy of a single toss of a fair die using the second formula?
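For concreteness, here is a small Python sketch of what I understand so far (assuming the expectation in the second formula is just the $p$-weighted average of $\log_2 \frac{1}{p(x)}$), computing both formulas for a fair six-sided die:

```python
import math

# Fair six-sided die: p(x) = 1/6 for every face x in {1, ..., 6}.
p = {x: 1 / 6 for x in range(1, 7)}

# First formula: H(X) = -sum_x p(x) log2 p(x)
H1 = -sum(px * math.log2(px) for px in p.values())

# Second formula: H(X) = E[log2(1/p(X))], i.e. the random variable
# log2(1/p(X)) averaged over the distribution of X.
H2 = sum(px * math.log2(1 / px) for px in p.values())

print(H1, H2)  # both should equal log2(6), about 2.585 bits
```

If this sketch is right, the two sums are term-by-term identical, since $\log \frac{1}{p(x)} = -\log p(x)$, but I would like to understand why the second form is written as an expectation.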
Appreciate your help!