
I'm reading about the statistical definition of entropy, which says

$$S=-k_B\sum_ip_i\ln p_i,\tag1$$

where $k_B$ is Boltzmann's constant, and $p_i$ is the probability of the $i$th state being occupied. But in classical mechanics there's always an uncountable set of states with a given energy, e.g. all the states visited between times $t_0$ and $t_1$ for given initial conditions.

Shouldn't $(1)$ be changed to an integral instead of a discrete sum? Why is it (seemingly) never done?

Ruslan
  • To anyone who would answer this, please do not stop at "we need a cell size in phase space". There are few if any places in physics where absolute values of entropy matter; it's almost always a ratio of entropies, in which case any such "cell size" (usually claimed to come from $\hbar$) disappears. – DanielSank Feb 03 '16 at 20:35

2 Answers


Let us start with an example, called Langevin paramagnetism, where the magnetic moment is described classically, as a vector in three dimensions. Call $\vec\mu$ this moment, $\vec B$ the magnetic induction, and $\theta$ the angle between $\vec\mu$ and $\vec B$. The probability density of the angle $\theta$ is $\rho(\theta)=\frac{1}{\mathcal Z}\exp(\beta\mu B\cos\theta)$, where $\mathcal Z$ is the partition function and can be seen as a normalization constant. Indeed we have (using $x=\cos\theta$) $$\mathcal Z=\int_0^\pi\exp(\beta\mu B\cos\theta)\sin\theta\,\mathrm d\theta =\int_{-1}^1\exp(\beta\mu B x)\,\mathrm dx=2\frac{\sinh(\beta\mu B)}{\beta\mu B}.$$ The average energy is $$E=-\frac{\partial \ln\mathcal Z}{\partial \beta}=-\mu B\left(\coth(\beta\mu B)-\frac1{\beta\mu B}\right).$$ The entropy is $$\begin{split}S&=-k_{\text B}\int_0^\pi \rho(\theta)\ln\rho(\theta)\,\sin\theta\,\mathrm d\theta=-k_{\text B}\frac1{\mathcal Z} \int_{-1}^1\exp(\beta \mu B x)\left(\beta\mu Bx-\ln\mathcal Z\right)\mathrm dx\\ &=k_{\text B}\ln\mathcal Z-k_{\text B}\beta\mu B\left(\coth(\beta\mu B)-\frac{1}{\beta \mu B}\right)=k_{\text B}\ln {\mathcal Z}+\frac{E}{T}. \end{split}$$ If one defines (as usual) the free energy $F$ as $F=-k_{\text B}T\ln\mathcal Z$, the last expression is the familiar relation $F=E-TS$. This shows that the entropy formula remains valid with the continuous expression $S=-k_{\text B}\int\rho\ln\rho$.
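These closed forms are easy to verify numerically. Here is a minimal sketch of such a check (my own, not part of the model), in units where $k_{\text B}=\mu=B=1$ so that $\beta=1/T$; the value $\beta=0.7$ is an arbitrary test point:

```python
import numpy as np
from scipy.integrate import quad

beta = 0.7  # inverse temperature 1/T, arbitrary test value

# Partition function Z = integral of exp(b cos t) sin t dt over [0, pi],
# checked against the closed form 2 sinh(b)/b derived above
Z, _ = quad(lambda th: np.exp(beta * np.cos(th)) * np.sin(th), 0.0, np.pi)
assert np.isclose(Z, 2.0 * np.sinh(beta) / beta)

# Average energy E = -d(ln Z)/d(beta) = -(coth(beta) - 1/beta)
E = -(1.0 / np.tanh(beta) - 1.0 / beta)

# Entropy S = -integral of rho ln(rho) sin t dt, with rho = exp(b cos t)/Z
rho = lambda th: np.exp(beta * np.cos(th)) / Z
S, _ = quad(lambda th: -rho(th) * np.log(rho(th)) * np.sin(th), 0.0, np.pi)

# The relation derived above: S = ln Z + beta*E, i.e. F = E - T S with F = -T ln Z
assert np.isclose(S, np.log(Z) + beta * E)
print(f"Z = {Z:.6f}, E = {E:.6f}, S = {S:.6f}")
```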

In this model no underlying discretization is needed, but when one uses the definition of $S$ as an integral, the argument of the logarithm may carry a physical unit (for instance when $\rho$ is a density over positions or momenta), which hints that such an entropy is only defined up to an additive constant. Moreover, discretizing the values of $\theta$ into segments of width $\pi/N$, we can define $p_n=\rho\left(n\frac\pi N\right)\frac\pi N$. The statistical entropy becomes $$S=-k_{\text B}\sum_n p_n\ln p_n=-k_{\text B}\sum_n \rho\left(n\frac\pi N\right)\frac \pi N\left[\ln\rho\left(n\frac\pi N\right)+\ln\frac\pi N\right].$$ This is a Riemann sum plus a constant that diverges as $N\to\infty$: $$S=-k_{\text B}\int\rho\ln\rho\;-\;k_{\text B}\lim_{N\to\infty}\ln\frac\pi N.$$ Since usually only variations of entropy are used, this infinite constant is irrelevant: one may forget about it and use the entropy defined by the integral.
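This splitting can also be seen numerically. The sketch below (my own, with $k_{\text B}=1$) works in the variable $x=\cos\theta$, whose density $\rho(x)=e^{\beta x}/\mathcal Z$ is normalized on $[-1,1]$ so no $\sin\theta$ weight is needed: the discrete entropy $-\sum_n p_n\ln p_n$ grows like $-\ln\Delta x$ as the cells shrink, while adding $\ln\Delta x$ back recovers a finite limit, the integral $-\int\rho\ln\rho\,\mathrm dx$:

```python
import numpy as np

beta = 0.7
Z = 2.0 * np.sinh(beta) / beta       # normalizes rho(x) = exp(beta*x)/Z on [-1, 1]
rho = lambda x: np.exp(beta * x) / Z

for N in (100, 10_000, 1_000_000):
    dx = 2.0 / N                              # cell width (2/N here, pi/N in the text)
    x = -1.0 + (np.arange(N) + 0.5) * dx      # cell midpoints
    p = rho(x) * dx                           # discrete probabilities p_n
    S_disc = -np.sum(p * np.log(p))           # diverges like -ln(dx) as N grows
    print(N, S_disc, S_disc + np.log(dx))     # last column tends to -int rho ln rho dx
```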

Tom-Tom
  • What is $\beta$ here? – Ruslan Feb 04 '16 at 08:56
  • 1
    $\beta=1/k_{\text B}T$. This is used in almost all textbooks and publications. – Tom-Tom Feb 04 '16 at 09:46
  • It might be obvious, but how do you get past the last equality sign in the equation after "The entropy is"? I can't seem to figure out why this holds: $$E=k_BT-\frac{\mu B}{k_B}\coth\left(\frac{\mu B}{k_BT}\right)$$ if the energy was supposed to be $E=\mu B\cos\theta$, as seen in your definition of $\rho(\theta)$. – Ruslan Feb 05 '16 at 19:12
  • The energy of a position is $-\mu B \cos\theta$; the average energy is given by $E$, computed just above the formula. The last equality is obtained by remarking that the expression of the average energy $E$ appears in the expression of $S$ (and one uses $k_{\text B}\beta=\frac1T$). I have corrected the sign in the exponentials; this does not change anything in the following. Note that in your comment, the energy should be written as $k_{\text B}T-\mu B\coth\left(\beta\mu B\right)$. – Tom-Tom Feb 06 '16 at 10:31
  • Ah, got it, thanks. I seem to almost understand your answer, but a few things still remain unclear: 1. Why did you choose the probability the way you did? The Boltzmann distribution seems to have the total energy in the exponent, while you only use the potential energy. 2. In the first expression for the entropy, why do you include the $\sin\theta$ weight as a multiplier of the first $\rho$ but not of the one inside the $\ln$? 3. Did you mean moment instead of momentum? – Ruslan Feb 06 '16 at 11:37
  • Yes, I meant the magnetic moment. 1. This is a very simple model for a magnetic moment. You can add a kinetic energy if you want, but that will not change anything, because it's an independent degree of freedom: you will only get ${\cal Z}={\cal Z}_{\text{magnetic}}{\cal Z}_{\text{kinetic}}$. 2. The $\sin\theta$ is not part of the function $\rho$ but is required when integrating over $\theta$ because we are in three dimensions. The integration element is actually $\sin\theta\,\mathrm d\theta\,\mathrm d\varphi$, but nothing depends on $\varphi$ and we can integrate it out. – Tom-Tom Feb 07 '16 at 21:08