6

I am computing the entropy of the uniform distribution on $[0,1]$:

$$ H(X) = -\int_0^1 \frac{1}{1-0} \log\left(\frac{1}{1-0}\right) dx = -\int_0^1 \log 1 \, dx = 0 $$

Does that mean that $X$ has zero entropy?
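For what it's worth, here is a quick numerical check (a sketch assuming SciPy, whose `entropy()` for continuous distributions returns the differential entropy in nats):

```python
from scipy.stats import uniform

# Differential entropy of Uniform[0, 1]; SciPy reports it in nats.
# For Uniform[loc, loc + scale] it equals ln(scale), here ln(1) = 0.
print(uniform(loc=0, scale=1).entropy())  # 0.0
```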

cactus314
  • 24,438

3 Answers

10

A random number uniformly distributed on $[0,1]$ (actually, on any interval of positive measure) has infinite entropy, if we are speaking of the Shannon entropy $H(X)$, which corresponds to the average information content of each occurrence of the variable. Indeed, in a single real number on the interval $[0,1]$ I can encode all the information of Wikipedia, math.stackexchange.com, and more.
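To make this concrete, here is a small simulation sketch (mine, not from the original answer, assuming NumPy): quantize uniform samples to $k$ binary digits and estimate the Shannon entropy of the quantized variable; it comes out close to $k$ bits, growing without bound as the precision increases.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=1_000_000)

for k in (2, 4, 8, 16):
    # Quantize to 2**k equal-width bins: the discretized variable is
    # uniform on {0, ..., 2**k - 1}, so its Shannon entropy is k bits.
    bins = np.floor(x * 2**k).astype(np.int64)
    p = np.bincount(bins, minlength=2**k) / len(bins)
    H = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    print(k, round(H, 3))  # estimated entropy is approximately k bits
```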

The differential entropy $h(X)$ is another thing. It is not a true entropy (it can be zero or negative), and among other things it depends on the scale: the differential entropy of the height of humans, say, gives different values if I measure heights in centimeters or in inches, which is rather ridiculous.
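A quick illustration of that scale dependence (my own sketch, again assuming SciPy): the same uniform spread of heights, expressed in centimeters versus inches, yields differential entropies that differ by exactly $\ln 2.54$.

```python
import numpy as np
from scipy.stats import uniform

# The same physical quantity, measured in two units:
# Uniform[0, 100] cm is the same variable as Uniform[0, 100/2.54] inches.
h_cm = uniform(loc=0, scale=100).entropy()         # ln(100)
h_in = uniform(loc=0, scale=100 / 2.54).entropy()  # ln(100/2.54)

print(h_cm - h_in, np.log(2.54))  # both ~0.9322: h shifted by ln(2.54)
```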

So, yes, the differential entropy of your variable is zero. But (because the differential entropy is not the Shannon entropy) that means nothing special; in particular, it does not mean that the variable has no uncertainty, like a constant. Actually, the differential entropy of a constant variable (which would correspond to a Dirac delta density) is $-\infty$.
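One way to see that $-\infty$ (a minimal sketch, assuming only NumPy): a uniform density on $[0,\varepsilon]$ approaches a Dirac delta as $\varepsilon \to 0$, and its differential entropy $\ln \varepsilon$ diverges to $-\infty$.

```python
import numpy as np

# Differential entropy of Uniform[0, eps] is ln(eps).
# As eps -> 0 the density concentrates toward a Dirac delta
# and the differential entropy diverges to -infinity.
for eps in (1.0, 1e-1, 1e-3, 1e-6):
    print(f"eps = {eps:g}:  h = {np.log(eps):.3f} nats")
```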

leonbloy
  • 63,430
0

I think you're right.

Shannon's definition of the entropy of a continuous random variable $X$ is $$ H(X) = -E[\ln f(X)] $$ where $f(x)$ is its probability density function. The probability density function for the uniform distribution on $[0,1]$ is $f(x)=1$ if $x \in [0,1]$ and $0$ otherwise. Of course $$ H(X) = -\int_0^1 \ln(1) \, f(x) \, dx = 0 $$ in this case. It even makes sense in a physical system: if the temperature of a metal bar is equal everywhere, then there is no heating or cooling anywhere in that system.
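As a symbolic sanity check (a sketch assuming SymPy), the integral above evaluates to $0$, and for a uniform density on $[0,b]$ it gives $\ln b$ in general:

```python
import sympy as sp

x, b = sp.symbols('x b', positive=True)

# h = -integral of f(x) * ln(f(x)) dx, with f(x) = 1/b on [0, b].
f = 1 / b
h = sp.integrate(-f * sp.log(f), (x, 0, b))
print(sp.simplify(h))  # log(b) in general
print(h.subs(b, 1))    # 0 -- the case computed above
```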

Eric Fisher
  • 1,313
  • 1
    But Shannon's entropy is only defined for discrete r.v.'s. The above is a (not always meaningful) continuous analogue, differential entropy, which loses many nice properties of Shannon entropy: it can be negative, it depends on the scaling, etc. – Clement C. Dec 05 '17 at 22:47
  • 1
    Yes, this is exactly what my comment says. – Clement C. Dec 05 '17 at 22:59
  • Yes, thank you. Your comment is correct. Sorry for my temporary inattention. – Eric Fisher Dec 05 '17 at 23:00
-1

Using the comment of Clement C., divide the interval into $n$ subintervals, each of probability $\frac{1}{n}$. Then $H_n(X)=\sum_{k=1}^{n} -\frac{1}{n}\log_2\left(\frac{1}{n}\right)=\log_2(n)$, which grows without bound as $n \to \infty$, so that $H(X)=\infty$.
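A direct numerical check of this sum (my sketch): the finite entropy $H_n(X)$ matches $\log_2 n$ for each $n$ and grows without bound.

```python
import math

# Entropy of X discretized into n equally likely subintervals:
# H_n(X) = sum of n terms, each (1/n) * log2(n), which is log2(n)
# exactly and diverges as n -> infinity.
for n in (2, 10, 1000, 10**6):
    H_n = sum(-(1 / n) * math.log2(1 / n) for _ in range(n))
    print(n, H_n, math.log2(n))
```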