
I am working on a problem of modelling a rubber molecule as a one-dimensional chain consisting of $N=N_{+}+N_{-}$ links, where $N_{+}$ points in the positive $x$-direction a distance $a$ and $N_{-}$ points in the negative $x$-direction a distance $a$.

It is trivial to prove that:

$$L = a(N_{+} - N_{-})$$

Where $L$ is the overall length of the rubber molecule. We can also see that the number of ways of arranging the links to achieve a length $L$ is given by:

$$\Omega(L)=\frac{N!}{N_{+}!N_{-}!}=\binom{N}{N_{+}}$$

However, I am asked to prove that the entropy (defined by $S = k_{B}\ln(\Omega(L))$) can be written approximately as:

$$S\approx N k_{B}\left[\ln(2)-\frac{L^{2}}{2N^{2} a^{2}}\right]\tag{1}$$

By Stirling's approximation we have:

$$S \approx k_{B}\left[-N\left(\frac{N_{+}}{N}\ln\left(\frac{N_{+}}{N}\right)+\left(1-\frac{N_{+}}{N}\right)\ln\left(1-\frac{N_{+}}{N}\right)\right)\right]$$

But I cannot see a way of relating this to the approximation for entropy given. I tried a Taylor expansion of the logarithms but the algebra quickly became messy and didn't provide anything which looked like $(1)$.
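For what it's worth, the claimed approximation $(1)$ can be checked numerically before attempting a proof. A minimal sketch, with units chosen so that $k_B = a = 1$ (the particular values of $N$ and $N_+$ below are arbitrary illustrations, assumed only to satisfy $L \ll Na$):

```python
from math import lgamma, log

def ln_omega(N, N_plus):
    """Exact ln Omega(L) = ln C(N, N_plus), via log-gamma to avoid huge factorials."""
    return lgamma(N + 1) - lgamma(N_plus + 1) - lgamma(N - N_plus + 1)

# Illustrative values (assumed): k_B = 1, a = 1, so L = N_+ - N_-.
N = 10_000
N_plus = 5_100
L = 2 * N_plus - N                           # here L/(aN) = 0.02, well inside L << Na

exact = ln_omega(N, N_plus)                  # S / k_B, exact
approx = N * (log(2) - L**2 / (2 * N**2))    # right-hand side of (1)

print(exact, approx)
```

The two numbers agree to a fraction of a percent; the residual is an $O(\ln N)$ Stirling correction, negligible next to the $O(N)$ terms kept in $(1)$.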

Thomas Russell
  • isn't that answered here? http://math.stackexchange.com/questions/235962/asymptotics-of-binomial-coefficients-and-the-entropy-function – leonbloy Nov 28 '14 at 14:03
  • @leonbloy That simply provides me with the formula in my last line of working; I am unsure how to relate it to $(1)$. – Thomas Russell Nov 28 '14 at 14:05
  • This is the normal approximation to the binomial distribution; maybe the details in Wikipedia's page on the de Moivre-Laplace theorem will meet your needs. –  Nov 28 '14 at 14:26
  • And, if memory serves, there's a simpler (but less rigorous) development in Keith Ball's book Strange Curves, Counting Rabbits, and other Mathematical Explorations. Ball is a gifted expositor, so this might be worth checking if the computation leaves you cold. –  Nov 28 '14 at 14:28

1 Answer


$$ S = k_B \ln{N \choose N_+} \approx k_B N \,H\left(p \right)$$

where $H(p)$ is the binary entropy function (in nats) and $$p=\frac{N_+}{N}=\frac{1}{2}\left(1+\frac{L}{aN}\right)$$

Now, if we make the (additional) assumption $\frac{L}{aN} \ll 1$, we can Taylor-expand $H(p)$ around $p=1/2$, so that $$H(p)\approx \ln 2 - \frac{(1-2p)^2}{2}= \ln 2 - \frac{L^2}{2 (aN)^2} $$
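This quadratic expansion is easy to sanity-check numerically; a sketch (the value $x = L/(aN) = 0.02$ is an arbitrary small illustration):

```python
from math import log

def H(p):
    """Binary entropy function in nats."""
    return -p * log(p) - (1 - p) * log(1 - p)

x = 0.02                    # x = L/(aN), assumed small
p = 0.5 * (1 + x)           # p = N_+/N
taylor = log(2) - x**2 / 2  # second-order expansion of H around p = 1/2

print(H(p), taylor)
```

The next term in the expansion is $-x^4/12$, so for $x = 0.02$ the two values differ only at the eighth decimal place.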

Alternatively, as pointed out in the comments, one could use the de Moivre–Laplace (CLT) approximation of the binomial distribution by a Gaussian, so $$2^{-N}{N \choose N_+} \approx \sqrt{\frac{2}{ \pi N}} \exp{\left(-\frac{2(N_+-N/2)^2}{N}\right)}$$

$$ \ln {N \choose N_+} \approx N \left(\ln 2 - 2\left(\frac{N_+}{N}-\frac{1}{2}\right)^2\right) = N \left( \ln 2 - \frac{L^2}{2 (aN)^2}\right)$$
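The de Moivre–Laplace route can be checked the same way. A sketch, comparing the exact $\ln\!\left(2^{-N}\binom{N}{N_+}\right)$ against the log of the Gaussian density (the values of $N$ and $N_+$ are illustrative assumptions, with $a = 1$ so that $N_+ - N/2 = L/2$):

```python
from math import lgamma, log, pi, sqrt

# Illustrative values (assumed), with a = 1.
N, N_plus = 10_000, 5_100

# Left side: ln of 2^-N * C(N, N_+), computed exactly via log-gamma.
lhs = lgamma(N + 1) - lgamma(N_plus + 1) - lgamma(N - N_plus + 1) - N * log(2)

# Right side: ln of the de Moivre-Laplace Gaussian approximation.
rhs = log(sqrt(2 / (pi * N))) - 2 * (N_plus - N / 2) ** 2 / N

print(lhs, rhs)
```

Both sides come out around $-6.83$ here, matching to a few parts in $10^4$.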

leonbloy