We are all familiar with the Boltzmann-Gibbs-Shannon entropy formula:

$H_{\text{BGS}} = -\sum_{k}p_{k}\log{p_{k}}$

In information theory, this can be interpreted as the expectation value of the "surprise."
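To make that concrete, here is a minimal numerical sketch (my own illustration, not from any library's API) of the entropy as the probability-weighted average of the surprise $-\log{p_k}$:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon/BGS entropy as the expected surprise E[-log p]."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # p_k = 0 terms contribute nothing (0 log 0 := 0)
    return np.sum(p * (-np.log(p)))

print(shannon_entropy([0.5, 0.5]))  # fair coin: log 2 ~ 0.6931
print(shannon_entropy([0.9, 0.1]))  # biased coin: lower entropy, ~0.3251
```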

Using the maximum entropy principle, one can derive the microcanonical, canonical, and grand canonical ensembles from this expression. However, Bose-Einstein and Fermi-Dirac statistics cannot be derived from it directly. Instead, we need the following entropy expressions:

$H_{\text{FD}} = -\sum_{k}(1-p_{k})\log{(1-p_{k})} - \sum_{k}p_{k}\log{p_{k}}$

$H_{\text{BE}} = \sum_{k}(p_{k}+1)\log{(p_{k}+1)} - \sum_{k}p_{k}\log{p_{k}}$
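As a sanity check (my own sketch, assuming the standard maximum-entropy setup with Lagrange multipliers $\beta$ and $\beta\mu$ for the mean energy and particle number), maximizing these expressions term by term via $\partial H/\partial p_k = \beta(\varepsilon_k - \mu)$ does reproduce the familiar distributions:

$\frac{\partial H_{\text{FD}}}{\partial p_k} = \log{\frac{1-p_k}{p_k}} = \beta(\varepsilon_k - \mu) \implies p_k = \frac{1}{e^{\beta(\varepsilon_k - \mu)} + 1}$

$\frac{\partial H_{\text{BE}}}{\partial p_k} = \log{\frac{p_k+1}{p_k}} = \beta(\varepsilon_k - \mu) \implies p_k = \frac{1}{e^{\beta(\varepsilon_k - \mu)} - 1}$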

Interestingly, both of these expressions include an additional term. It caught my attention that we have a +1 and a -1, similar to the adjustments in the Bose-Einstein and Fermi-Dirac distributions. I would like to understand the significance and physical meaning behind these terms. Is there an intuitive way to interpret them?

  • There was a misprint in $H_{\text{FD}}$. – Gec May 21 '23 at 08:00
  • Something along these lines was asked before here on PSE several times, namely whether or not the FD or BE distributions are really probability distributions... – Tobias Fünke May 21 '23 at 08:16
  • @TobiasFünke I don't think this is a duplicate of those questions, as mine is more about the interpretation of the entropy from an information point of view. – IchVerlore May 22 '23 at 14:03
  • Well, but as the answer correctly points out, the $p_k$ in the latter two cases are not probabilities, so this is not really the Shannon entropy (as a functional of probability distributions). – Tobias Fünke May 22 '23 at 14:07
  • @TobiasFünke yes, I wrote my comment before actually reading the answer. I now see how this could be seen as a duplicate (maybe more of a $\textit{soft}$ duplicate). – IchVerlore May 22 '23 at 14:25
  • @TobiasFünke I'm not sure I understand the duplication discussion. What is a duplicate in this case? My answer, question, or answer and question combined? And what should be done if something has been duplicated? I didn't do a search on SE. I wrote what was in my head about this question at that moment. I don't want to, but I'm ready to delete my answer if it repeats someone else's answer or discussion. – Gec May 22 '23 at 14:57
  • @Gec I don't have time right now to search, but there are for sure several questions regarding whether or not the $p_k$ (here) of the FD or BE distributions are probabilities. You've answered it again in the context of this question, but the concept is the same. I just mentioned it because the question actually boils down to this fact... In any case, you do not have to delete your answer. In the very worst case, the question is marked and closed as a duplicate, but this does not affect your answer: closed means, for example, that no one else can write an answer to this question. – Tobias Fünke May 22 '23 at 15:17
  • @TobiasFünke, thank you for the explanation. – Gec May 22 '23 at 15:38

1 Answer


The similarity between $H_{\text{BGS}}$ and $H_{\text{FD}}$, $H_{\text{BE}}$ is only formal. $H_{\text{BGS}}$ is the entropy of a multiparticle system: $k$ enumerates the states of the multiparticle system, and $p_k$ are the probabilities of these states. $H_{\text{FD}}$ and $H_{\text{BE}}$ express the entropy of quantum ideal gases: there, $k$ enumerates single-particle states, and $p_k$ are the average numbers of particles in these states, not probabilities. To emphasize that the $p_k$ are not probabilities in the latter case, note that in an ideal Bose-Einstein gas, $p_k$ can be greater than $1$.
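A quick numerical illustration of that last point (a minimal sketch using the standard BE occupation formula; the parameter values are arbitrary):

```python
import numpy as np

def be_occupation(eps, mu, beta):
    """Bose-Einstein mean occupation <n_k> = 1 / (exp(beta*(eps - mu)) - 1)."""
    return 1.0 / (np.exp(beta * (eps - mu)) - 1.0)

# Illustrative values only: for a low-lying level this "p_k" exceeds 1,
# so it cannot be a probability.
print(be_occupation(eps=1.0, mu=0.0, beta=0.5))  # ~1.54
```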

Quantum ideal gases become classical in the limit $p_k\ll 1$. In this limit, $H_{\text{FD}}$ and $H_{\text{BE}}$ formally reduce to $H_{\text{BGS}}$, while the $p_k$ are still not probabilities but average particle numbers. Thus, the additional terms in $H_{\text{FD}}$ and $H_{\text{BE}}$ are related to the quantum statistics of indistinguishable particles.
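To make the limit explicit, a first-order expansion (a quick sketch of the step left implicit above) gives, for $p_k \ll 1$,

$-(1-p_k)\log{(1-p_k)} \approx p_k, \qquad (p_k+1)\log{(p_k+1)} \approx p_k,$

so both $H_{\text{FD}}$ and $H_{\text{BE}}$ tend to $\sum_k (p_k - p_k\log{p_k})$, where the $-\sum_k p_k\log{p_k}$ term dominates because $|\log{p_k}| \gg 1$.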

Gec