Understanding Shannon entropy: (1) variability within a distribution
Published 2020-07-27
* show that it represents the variability of the elements within a distribution — how different they are from each other (a general characterization that works in all disciplines)
* show that this variability is measured in terms of the minimum number of questions needed to identify an element in the distribution (link to information theory)
* show that this is related to the logarithm of the number of permutations over large sequences (link to combinatorics)
* show that it is not in general coordinate independent (and that the KL divergence does not fix this)
* show that it is coordinate independent on physical state spaces - classical phase space and quantum Hilbert space (that is why those spaces are important in physics)
* show the link between the Shannon entropy and the Boltzmann, Gibbs and von Neumann entropies (link to physics)
Most of these ideas are from our paper:
arxiv.org/abs/1912.02012
which is part of our bigger project Assumptions of Physics:
assumptionsofphysics.org/
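The first two bullet points above can be illustrated with a short numerical sketch (Python; the helper name and the example distributions are my own, not from the video):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy in bits (base 2): the average number of
    yes/no questions needed to identify an element drawn from probs."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximal variability,
# exactly 2 yes/no questions needed on average.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Peaked distribution: less variability, fewer questions on average.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The second distribution admits an optimal questioning strategy ("is it the first element?", then "the second?", …) whose average length is exactly the entropy, because every probability is a power of 1/2.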
All Comments (21)
-
DISCLAIMER: If you see ads, these are put in by YouTube. I do not get any money from them, YouTube does. I'd like to turn them off, but it seems it's out of my control!
-
Finally a great video on the topic. Thank you, I have been searching for this for a long time.
-
Great video. I am looking forward to watching the rest of the videos on information theory.
-
Incredibly clear and insightful explanation, thank you!
-
Wonderful explanation. I struggled for a day to fully comprehend how the logarithmic part of the entropy formula works when the probabilities are not exactly powers of some base b. The third property in this explanation made my day. Thank you so much.
-
Fantastic explanation, thanks a lot 🙏🏽
-
What a clean explanation!
-
It's the first time I hear that there are different types/definitions of entropy... I always found entropy to be a challenging concept in itself; I think studying these different definitions might actually help me understand the concept better. Thanks!
-
Thank you very much. Excellent
-
Great stuff.
-
wow.. this was an incredible video! Entropy is something that I am always finding out new things about.. harder to understand than quantum mechanics if I am being honest..
-
ohhh i hear about shannon entropy being described as "information", "uncertainty", "surprise", and yeah they sound vague. the sources i got these from are mostly popular science content creators, meant to attract and introduce people to the field... so valuable to have deeper, meatier content like this too
-
Please continue to make content.
-
Thank you so much. It really brightened my imagination a lot! But actually, how is it related in the context of information theory? Could you make a video, or just explain it using an example, please?
-
excellent explanation.
-
I'm trying to make examples: suppose we have a set O of three types of elements • × Δ, with 4 elements of each type. If we have other sets A, B, C with: A = { 8• , 2× , 2Δ }, B = { 2• , 8× , 2Δ }, C = { 2• , 2× , 8Δ }, then will all of A, B, and C have the same Shannon entropy, and will all of them have Shannon entropy larger than the original set O?
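A quick numerical check of the scenario in the comment above (a sketch in Python; my own helper). A, B, and C do share the same entropy, since they are permutations of the same proportions — but note that the uniform set O actually has the *larger* entropy, because equal proportions maximize variability:

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of the empirical distribution of a multiset."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

O = [4, 4, 4]   # original set: 4 of each symbol
A = [8, 2, 2]   # the three skewed sets
B = [2, 8, 2]
C = [2, 2, 8]

print(shannon_entropy(O))                                         # log2(3) ≈ 1.585 bits
print(shannon_entropy(A), shannon_entropy(B), shannon_entropy(C)) # ≈ 1.252 bits each
```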
-
Please make a video on How to use Shannon entropy to detect land-use change.
-
5:12 The z-score helps for grouping numbers independently of their values in the set. But if the data is continuous, it doesn't work.
-
what I don't understand is how come we're using the full H(pi) when breaking down the Shannon entropy into its subtypes. Why are we doing H(rk) = H(pi) + pa·H(qj) instead of only using the portion of H(pi) which is not also included in H(qj)?
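The grouping identity the comment asks about uses the full H(pi) because each conditional entropy H(qj) is already weighted by its group probability pa, so nothing is counted twice: the identity H(r) = H(p) + Σa pa·H(q|a) holds exactly. A minimal numerical check (Python; the example numbers are my own):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Full distribution r over four outcomes, split into two groups
# (first two outcomes vs. last two).
r = [0.2, 0.3, 0.25, 0.25]
p = [0.5, 0.5]                    # probability of each group
q = [[0.4, 0.6], [0.5, 0.5]]     # conditional distributions within each group

lhs = H(r)
rhs = H(p) + sum(pa * H(qa) for pa, qa in zip(p, q))
print(lhs, rhs)  # the two sides agree
```

The weight pa ensures each within-group entropy contributes only in proportion to how often that group occurs, which is why the full group-level entropy H(p) appears rather than some "remainder" of it.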
-
Where can I read up on entropy in the style of how you presented it?