Questions tagged [entropy]

This tag is for questions about mathematical entropy. If you have a question about thermodynamic entropy, ask on Physics Stack Exchange or Chemistry Stack Exchange instead.

1617 questions
25
votes
1 answer

Entropy of a uniform distribution

The entropy of a uniform distribution is $\ln(b-a)$. With $a=0$ and $b=1$ this reduces to zero. How come there is no uncertainty?
log2
  • 263
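A quick numerical check of the formula (a minimal Python sketch, assuming SciPy is available): the differential entropy of $\mathrm{Uniform}(a,b)$ is $\ln(b-a)$, which is $0$ for $(a,b)=(0,1)$ and negative for narrower intervals, because differential entropy, unlike discrete entropy, has no lower bound of zero.

```python
import numpy as np
from scipy.stats import uniform

# Differential entropy of Uniform(a, b) is ln(b - a), in nats.
for a, b in [(0.0, 1.0), (0.0, 0.5), (0.0, 2.0)]:
    closed_form = np.log(b - a)
    via_scipy = uniform(loc=a, scale=b - a).entropy()  # same quantity, computed by SciPy
    print(f"Uniform({a}, {b}): ln(b-a) = {closed_form:+.4f}, scipy = {via_scipy:+.4f}")
```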
9
votes
4 answers

Understanding conditional entropy intuitively $H[Y|X=x]$ vs $H[Y|X]$

I was trying to understand conditional entropy better. The part that was confusing me was exactly the difference between $H[Y|X=x]$ and $H[Y|X]$. $E[Y|X=x]$ makes some sense to me intuitively because it's just the average unpredictability (i.e.…
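A minimal numeric illustration (Python sketch; the joint distribution below is made up for the example): $H[Y|X=x]$ is the entropy of a single conditional distribution $p(y|x)$ and depends on the particular $x$, while $H[Y|X]$ is the $p(x)$-weighted average of those numbers.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])
px = pxy.sum(axis=1)

for x in range(len(px)):
    print(f"H[Y|X={x}] = {H(pxy[x] / px[x]):.4f} bits")   # depends on the particular x

# H[Y|X] averages H[Y|X=x] over x with weights p(x).
h_y_given_X = sum(px[x] * H(pxy[x] / px[x]) for x in range(len(px)))
print(f"H[Y|X]   = {h_y_given_X:.4f} bits")
```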
7
votes
1 answer

Correct algorithm for Shannon entropy with R

Shannon entropy is defined by: $H(X) = -\sum_{i} {P(x_i) \log_b P(x_i)}$, where $b$ could be $e$, 2, or 10 (nat, bit, dit, respectively). My interpretation of the formula is: $H(X)$ is equal to the negative sum of: probability of $x_i$ multiplied by…
Tommaso
  • 249
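For reference, the formula translates directly from counts into code; here is a minimal sketch in Python rather than R (the sample string is arbitrary), computing the empirical entropy in a chosen base.

```python
import math
from collections import Counter

def shannon_entropy(samples, base=2):
    """Empirical Shannon entropy H(X) = -sum_i p(x_i) * log_b p(x_i)."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

print(shannon_entropy("aabbbbcccc"))           # in bits (base 2)
print(shannon_entropy("aabbbbcccc", math.e))   # in nats (base e)
```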
7
votes
1 answer

Definition of the Entropy

I have a question regarding the definition of entropy as the expected value of the random variable $\log \frac{1}{p(X)}$: $H(X) = E \log \frac{1}{p(X)}$, where $X$ is drawn according to the probability mass function $p(x)$. The problem is I am still…
com
  • 5,612
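A small sanity check of that definition (Python sketch; the pmf is made up): the expectation $E\log\frac{1}{p(X)} = \sum_x p(x)\log\frac{1}{p(x)}$ agrees with the usual $-\sum_x p(x)\log p(x)$, and a Monte Carlo average of $\log\frac{1}{p(X)}$ over draws of $X$ converges to the same number.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pmf on the support {0, 1, 2}.
p = np.array([0.5, 0.3, 0.2])

# H(X) = E[log(1/p(X))], written out as a sum over the support (nats).
h_exact = np.sum(p * np.log(1.0 / p))

# Monte Carlo: draw X ~ p and average the random variable log(1/p(X)).
x = rng.choice(len(p), size=200_000, p=p)
h_mc = np.mean(np.log(1.0 / p[x]))

print(f"exact: {h_exact:.4f} nats, Monte Carlo: {h_mc:.4f} nats")
```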
7
votes
1 answer

Maximum Entropy with bounded constraints

Assume we have the problem of estimating the probabilities $\{p_1,p_2,p_3\}$ subject to: $$0 \le p_1 \le 0.5$$ $$0.2 \le p_2 \le 0.6$$ $$0.3 \le p_3 \le 0.4$$ with only the natural constraint $p_1+p_2+p_3=1$. I found two compelling arguments using…
sheppa28
  • 929
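The constrained maximum-entropy problem itself is easy to explore numerically (a sketch, not one of the arguments alluded to in the question): maximize $-\sum_i p_i \log p_i$ over the box constraints with the sum fixed to one, e.g. via scipy.optimize.minimize.

```python
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    # Minimizing the negative entropy maximizes H(p) = -sum_i p_i log p_i.
    return np.sum(p * np.log(p))

bounds = [(1e-9, 0.5), (0.2, 0.6), (0.3, 0.4)]              # the box constraints
cons = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0}]   # p1 + p2 + p3 = 1

res = minimize(neg_entropy, x0=[0.3, 0.35, 0.35], bounds=bounds, constraints=cons)
print("argmax:", np.round(res.x, 4), " entropy (nats):", -res.fun)
```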
7
votes
0 answers

Approximation of Shannon entropy by trigonometric functions

Define Shannon entropy by $$I(p) = -p \log_2 p$$ Numerical experimentation shows that $\sin(\pi p)^{1-1/e}$ is a good approximation to $I(p) + I(1-p)$ on $[0,1],$ never differing by more than 3.3%. A little more experimentation shows that the…
rde
  • 71
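The quoted 3.3% figure is easy to reproduce on a grid (Python sketch; the bound is read here as the maximum absolute gap, relative to the peak value 1 of the two-sided entropy):

```python
import numpy as np

p = np.linspace(1e-9, 1 - 1e-9, 200_001)
two_sided = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # I(p) + I(1-p)
approx = np.sin(np.pi * p) ** (1 - 1 / np.e)

gap = np.abs(approx - two_sided)
print(f"max |approx - entropy| on (0,1): {gap.max():.4f}")   # roughly 0.033
```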
4
votes
1 answer

Why the $p$ factor in $p \log p$ in the entropy formula?

It is explained that the log in the entropy formula is there for additivity of information. But why the $p$ factor? It seems redundant, since we already have $p$ in the $\log p$!
Val
  • 1
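One way to put it in a line (a standard observation, not specific to this question): the information content of an outcome $x$ is $\log \frac{1}{p(x)}$, and the entropy is the average information over outcomes, so the extra $p(x)$ is the weight in that average rather than part of the information itself:
$$H(X) = E\left[\log \frac{1}{p(X)}\right] = \sum_x \underbrace{p(x)}_{\text{weight of outcome }x}\,\underbrace{\log \frac{1}{p(x)}}_{\text{information of outcome }x}.$$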
3
votes
1 answer

Entropy of the upper and lower bits of a square number

Consider a uniformly random number $x<2^n$. Let $H_{\star}(n)$ denote the (base-$2$ Shannon) entropy of the first $n$ bits of $x^2$, and let $H^{\star}(n)$ denote the entropy of the rest of the bits of $x^2$. In other words, writing $x^2 = q2^n + r$…
BHT
  • 2,215
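For small $n$ both entropies can be computed by brute force (Python sketch; H_low and H_high here correspond to the $r$ and $q$ parts of the decomposition $x^2 = q2^n + r$ in the question):

```python
import math
from collections import Counter

def H(counter, total):
    """Shannon entropy in bits of an empirical distribution given by a Counter."""
    return -sum((c / total) * math.log2(c / total) for c in counter.values())

n = 10
N = 2 ** n
low, high = Counter(), Counter()
for x in range(N):                # x uniform on {0, ..., 2^n - 1}
    q, r = divmod(x * x, N)       # x^2 = q * 2^n + r
    low[r] += 1                   # low-order n bits of x^2
    high[q] += 1                  # remaining high-order bits of x^2

print(f"n = {n}:  H_low = {H(low, N):.3f} bits,  H_high = {H(high, N):.3f} bits")
```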
3
votes
2 answers

Calculate entropy with n values

I am trying to solve a quiz question that asks the following. The variable $X$ can take the values $1,2,3,...,n$ with probabilities $\frac{1}{2^1}, \frac{1}{2^2},\frac{1}{2^3},...,\frac{1}{2^n}$. How can I calculate the entropy of $X$? Don't I have to know all…
Favolas
  • 803
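A quick check of the sum (Python sketch; note that, as stated, the listed probabilities add up to $1-2^{-n}$ rather than exactly 1): each term contributes $-\frac{1}{2^i}\log_2\frac{1}{2^i} = \frac{i}{2^i}$, and $\sum_{i\ge 1} i/2^i = 2$, so the entropy approaches 2 bits as $n$ grows.

```python
import math

def H(probs):
    """Shannon entropy in bits of the given probability weights."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

for n in (3, 5, 10, 20, 50):
    probs = [2.0 ** -i for i in range(1, n + 1)]   # 1/2, 1/4, ..., 1/2^n
    print(f"n = {n:2d}:  H = {H(probs):.6f} bits   (probabilities sum to {sum(probs):.6f})")
```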
3
votes
3 answers

Proving entropy inequalities

I have two functions whose functional forms are of the Shannon entropy type. The functions are: $$f_1 = x \log x + (1-x)\log (1-x)+y \log y + (1-y)\log (1-y)$$ and $$f_2 = xy \log (xy) + (1-xy)\log (1-xy).$$ Here both $x,y$ are less than 1. Question: I…
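Before attempting a proof, the direction of any inequality between $f_1$ and $f_2$ can be checked numerically on a grid (Python sketch; this only suggests the sign of $f_2 - f_1$, it proves nothing):

```python
import numpy as np

def f1(x, y):
    return (x * np.log(x) + (1 - x) * np.log(1 - x)
            + y * np.log(y) + (1 - y) * np.log(1 - y))

def f2(x, y):
    return x * y * np.log(x * y) + (1 - x * y) * np.log(1 - x * y)

g = np.linspace(1e-4, 1 - 1e-4, 500)
X, Y = np.meshgrid(g, g)
diff = f2(X, Y) - f1(X, Y)
print("min of f2 - f1 on the grid:", diff.min())   # a non-negative minimum would suggest f1 <= f2
```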
3
votes
3 answers

Minimum number of bits required to store the order of a deck of cards

Assume I have a shuffled deck of cards (52 cards, all normal, no jokers). I'd like to record the order in my computer in such a way that the ordering requires the fewest bits (I'm not counting lookup tables etc. as part of the deal, just the ordering…
Joe
  • 422
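The information-theoretic floor is a one-liner (Python sketch): a uniformly random ordering of 52 distinct cards has entropy $\log_2 52! \approx 225.6$ bits, so any fixed-length encoding needs at least 226 bits.

```python
import math

orderings = math.factorial(52)   # number of possible deck orders
bits = math.log2(orderings)      # entropy of a uniformly random order, in bits
print(f"log2(52!) = {bits:.2f}")
print(f"a fixed-length code therefore needs at least {math.ceil(bits)} bits")
```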
2
votes
1 answer

Relation between entropy and compressibility of a file

Suppose I have an ordered list of bytes (the hexdump of some object file), and wish to calculate the information entropy of this file. My understanding is that I can calculate this as $$ \sum_{n=0}^{n=255} -p_n \log_{256}(p_n) $$ where $p_n =…
Eric
  • 21
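That formula translates directly into code (Python sketch; the file path handling is illustrative). With base-256 logarithms the result lies in $[0,1]$: values near 1 mean the byte frequencies alone leave little room for compression, though an order-0 byte model says nothing about longer-range structure.

```python
import math
import sys
from collections import Counter

def byte_entropy(path):
    """Entropy of the file's byte distribution, with log base 256 so the result is in [0, 1]."""
    data = open(path, "rb").read()
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log(c / n, 256) for c in counts.values())

# e.g.  python byte_entropy.py some_object_file.o
print(byte_entropy(sys.argv[1]))
```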
2
votes
0 answers

Does the conditional entropy H(X | (Y,Z)) equal H((X | Y) | Z)?

$X, Y, Z$ are random variables, of course. I have seen this treated as equal sometimes, but I need confirmation. I tried using $H(A,B) = H(A|B) + H(B)$: $H(X | (Y,Z)) = H(X,Y,Z) - H(Y,Z)$ and $H(Y,Z) = H(Y|Z) + H(Z)$, so $H(X | (Y,Z)) = H(X,Y,Z) - H(Y|Z)$…
Phlipp
  • 31
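The chain-rule identity used in the attempt, $H(X | (Y,Z)) = H(X,Y,Z) - H(Y,Z)$, can be verified numerically on a random joint pmf (Python sketch; it does not address the non-standard notation $H((X|Y)|Z)$):

```python
import numpy as np

rng = np.random.default_rng(1)

def H(p):
    """Entropy in bits of a probability array (zero entries ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Random joint pmf over (X, Y, Z) with 2 x 3 x 2 outcomes.
pxyz = rng.random((2, 3, 2))
pxyz /= pxyz.sum()
pyz = pxyz.sum(axis=0)

# H(X|Y,Z) computed directly as the p(y,z)-weighted average of H(X | Y=y, Z=z).
direct = sum(pyz[y, z] * H(pxyz[:, y, z] / pyz[y, z])
             for y in range(3) for z in range(2))

# The same quantity via the chain rule H(X,Y,Z) - H(Y,Z).
via_chain_rule = H(pxyz) - H(pyz)

print(direct, via_chain_rule)   # agree up to floating-point error
```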
2
votes
0 answers

Entropy of a measure-preserving transformation

Let $T:X \rightarrow X$ be a measure-preserving transformation of the probability space $(X,B,\mu)$ ($X$ is a topological space). If $K\subset X$ is a compact and invariant set, show that $h_{\mu}(T|_K)=h_{\mu}(T)$. Here $h_{\mu}(\cdot)$ denotes the entropy…