
anishgoel1/witten_entropy


This repository is inspired by the classical information theory sections of Edward Witten's review paper: https://arxiv.org/pdf/1805.11965.pdf

Toy_AvgMessage.py creates an animation (using manimgl) of the information gained from observing the average message (based on letter frequencies) as a function of the message length $N$
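
For context, a minimal sketch of the quantity being animated (not the manim code itself): for $N$ independent letters drawn with frequencies $p_i$, the information gained from a typical message grows as $N \cdot H$, with $H = -\sum_i p_i \log_2 p_i$. The alphabet and frequencies below are illustrative, not necessarily those used in the script.

```python
import math

def per_letter_entropy(freqs) -> float:
    """Shannon entropy H (bits per letter) of a letter-frequency distribution."""
    return -sum(p * math.log2(p) for p in freqs if p > 0)

# Hypothetical 4-letter alphabet; replace with real letter frequencies as needed.
freqs = [0.5, 0.25, 0.125, 0.125]
H = per_letter_entropy(freqs)
for N in (1, 10, 100):
    print(f"N = {N:3d}: information gain ~ {N * H:.2f} bits")
```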

text_entropy.py calculates the information entropy of any text
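
As a reference for the kind of calculation involved (a sketch; the actual script's interface may differ), the Shannon entropy of a text's empirical character distribution is $H = -\sum_i p_i \log_2 p_i$:

```python
import math
from collections import Counter

def text_entropy(text: str) -> float:
    """Shannon entropy (bits per character) of the empirical
    character distribution of `text`."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    print(text_entropy("the quick brown fox jumps over the lazy dog"))
```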

conditional_entropy.py calculates the conditional entropy, joint entropy, and mutual information for a probability matrix. An example inspired by Sec. 2.2 of the paper is provided
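
For reference, a sketch of these quantities under the assumption that the probability matrix is a joint distribution $p(x, y)$ (the script's interface and example may differ):

```python
import numpy as np

def entropies(p_xy: np.ndarray):
    """Joint entropy H(X,Y), conditional entropy H(X|Y), and mutual
    information I(X;Y), all in bits, from a joint probability matrix."""
    p_xy = p_xy / p_xy.sum()                      # normalise, just in case
    nz = p_xy[p_xy > 0]
    h_xy = -np.sum(nz * np.log2(nz))              # H(X,Y)
    p_x = p_xy.sum(axis=1)                        # marginal over rows
    p_y = p_xy.sum(axis=0)                        # marginal over columns
    h_x = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))
    h_y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))
    h_x_given_y = h_xy - h_y                      # H(X|Y) = H(X,Y) - H(Y)
    mi = h_x + h_y - h_xy                         # I(X;Y)
    return h_xy, h_x_given_y, mi

if __name__ == "__main__":
    # Hypothetical 2x2 joint distribution, not the example from Sec. 2.2.
    p = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
    print(entropies(p))
```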

relative_entropy.py calculates the relative entropy (Kullback-Leibler divergence) between two normal distributions with different variances
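
For two normal distributions the relative entropy has a closed form. A minimal sketch using that expression (the script may instead evaluate the defining integral numerically; the parameter values are illustrative):

```python
import math

def kl_normal(mu0: float, sigma0: float, mu1: float, sigma1: float) -> float:
    """Relative entropy D(P || Q) in nats, for P = N(mu0, sigma0^2)
    and Q = N(mu1, sigma1^2), via the closed-form expression."""
    return (math.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2)
            - 0.5)

if __name__ == "__main__":
    # Two zero-mean normals with different variances.
    print(kl_normal(0.0, 1.0, 0.0, 2.0))
```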
