
Jaynes wrote a paper called "Information Theory and Statistical Mechanics" (1957). I tried to read it but found it somewhat hard to follow, and I suspect that's because it's one of the earlier works on the subject; a modern text might be more refined, since we've had time to work things out. On the other hand, the physics textbooks I've read don't explicitly discuss information-theoretic concepts like Shannon entropy.

Is there a good modern introduction to the information theoretic view of entropy?

Note: this is not a duplicate. All of the answers to the proposed duplicate were also written by Jaynes, and at roughly the same time.
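(For context, the Shannon entropy mentioned above is H(p) = -Σᵢ pᵢ log₂ pᵢ for a discrete distribution. A minimal sketch in Python, purely illustrative and not part of the original question:)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the usual
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A biased coin is less uncertain, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```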

    Hi user83014. Welcome to Phys.SE. Please don't repost a closed question in a new entry. Instead, you are supposed to edit the original question within the original entry. – Qmechanic Feb 21 '20 at 14:57
  • You could try to read the quite recent publication "Entropy? Honest!" by Tommaso Toffoli, a professor at Boston University. https://www.mdpi.com/1099-4300/18/7/247 I think it is very well written and considers a lot of different aspects but I am not sure if it is a good introduction... – 2b-t Feb 21 '20 at 15:26
  • Ahh, this is super confusing. The OP had posted a near-identical version of this question previously. I've flagged this one for the mods to delete - the other one should probably be reopened once that's done. – N. Virgo Feb 21 '20 at 16:33
  • And now even more confusing, because someone else edited this question's text into a somewhat related old question, https://physics.stackexchange.com/q/233180/ . So now we have three questions with basically the exact same text. – N. Virgo Feb 21 '20 at 16:35

0 Answers