Scripts inspired by the classical information theory sections of Ed Witten's review paper "A Mini-Introduction To Information Theory": https://arxiv.org/pdf/1805.11965.pdf
Toy_AvgMessage.py
creates an animation (rendered with ManimGL) of the information gain from observing the average message (based on letter frequencies) as a function of message length
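The underlying idea is Shannon's observation that a typical N-letter message drawn from fixed letter frequencies carries about N*H bits, where H = -sum_i p_i log2 p_i is the entropy per letter. A minimal sketch of that scaling, using a made-up four-letter frequency table (not the frequencies used by the script):

```python
import numpy as np

# Hypothetical letter frequencies for a four-letter alphabet (illustration only)
freqs = np.array([0.50, 0.25, 0.15, 0.10])

# Shannon entropy per letter, H = -sum p_i log2 p_i, in bits
H = -np.sum(freqs * np.log2(freqs))

# Information gained from a typical message grows linearly with its length N
for N in (1, 10, 100):
    print(f"N = {N:3d} letters -> ~{N * H:.2f} bits")
```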
text_entropy.py
calculates the Shannon information entropy of an arbitrary input text
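A minimal sketch of the same kind of calculation, estimating the entropy in bits per character from empirical character frequencies (the function name text_entropy is chosen here for illustration and need not match the script):

```python
from collections import Counter
from math import log2

def text_entropy(text: str) -> float:
    """Shannon entropy in bits per character, from empirical character frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(text_entropy("hello information theory"))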
conditional_entropy.py
calculates the conditional entropy, joint entropy, and mutual information for a joint probability matrix. An example inspired by Sec. 2.2 of the paper is provided
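These quantities are related by H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch using those identities, with an illustrative 2x2 joint probability matrix (not the Sec. 2.2 example):

```python
import numpy as np

def entropies(p_xy: np.ndarray):
    """Joint entropy H(X,Y), conditional entropy H(X|Y), and mutual information I(X;Y),
    all in bits, from a joint probability matrix p_xy[i, j] = P(X=i, Y=j)."""
    p_xy = p_xy / p_xy.sum()          # normalise, just in case
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)

    def H(p):
        p = p[p > 0]                  # ignore zero-probability entries
        return -np.sum(p * np.log2(p))

    H_xy = H(p_xy.ravel())
    H_x_given_y = H_xy - H(p_y)       # H(X|Y) = H(X,Y) - H(Y)
    I_xy = H(p_x) + H(p_y) - H_xy     # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return H_xy, H_x_given_y, I_xy

p = np.array([[0.25, 0.25],
              [0.00, 0.50]])
print(entropies(p))
```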
relative_entropy.py
calculates the relative entropy (Kullback-Leibler divergence) between two normal distributions with different variances
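For two Gaussians the relative entropy has a closed form, D(P||Q) = ln(sigma_Q/sigma_P) + (sigma_P^2 + (mu_P - mu_Q)^2) / (2 sigma_Q^2) - 1/2, in nats. A minimal sketch of that formula (the function name kl_normal is chosen for illustration; the script may instead evaluate the integral numerically):

```python
import numpy as np

def kl_normal(mu1, sigma1, mu2, sigma2):
    """Relative entropy D(P||Q) in nats for P = N(mu1, sigma1^2), Q = N(mu2, sigma2^2),
    using the closed-form expression for two Gaussians."""
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

# Same mean, different variances; note the asymmetry D(P||Q) != D(Q||P)
print(kl_normal(0.0, 1.0, 0.0, 2.0))   # ~0.318 nats
print(kl_normal(0.0, 2.0, 0.0, 1.0))   # ~0.807 nats
```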