Commit

Update comments.
markrogoyski committed Dec 19, 2016
1 parent 0a3166a commit 4fe5220
Showing 1 changed file with 10 additions and 2 deletions.
src/InformationTheory/Entropy.php
@@ -7,8 +7,16 @@
 /**
  * Functions dealing with information entropy in the field of statistical information theory.
  *
- * - Bhattacharyya distance
- * - Kullback-Leibler divergence
+ * - Entropy:
+ *   - Shannon entropy (bits)
+ *   - Shannon entropy (nats)
+ *   - Shannon entropy (harts)
+ *   - Cross entropy
+ * - Distances and divergences
+ *   - Bhattacharyya distance
+ *   - Kullback-Leibler divergence
+ *   - Hellinger distance
+ *   - Jensen-Shannon divergence
  *
  * In information theory, entropy is the expected value (average) of the information contained in each message.
  *
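To make the docblock's definition concrete, here is a minimal standalone sketch of Shannon entropy in bits, computed as the expected information content H(X) = -Σ p(x) log₂ p(x). The function name shannonEntropyBits and the plain-array input are illustrative assumptions for this sketch, not the Entropy class's actual API.

<?php
// Minimal illustrative sketch (assumed helper, not MathPHP's API):
// Shannon entropy in bits, H(X) = -Σ p(x) log₂ p(x),
// i.e. the expected information content per message.

/**
 * @param float[] $p Probability distribution whose values sum to 1
 * @return float Entropy in bits
 */
function shannonEntropyBits(array $p): float
{
    $H = 0.0;
    foreach ($p as $px) {
        if ($px > 0) {                // convention: 0 · log 0 = 0
            $H -= $px * log($px, 2);  // log base 2 gives units of bits
        }
    }
    return $H;
}

// A fair coin has two equally likely outcomes, so its entropy is exactly 1 bit.
echo shannonEntropyBits([0.5, 0.5]);  // 1

Using a different logarithm base only rescales the result: the natural log gives nats and base 10 gives harts, matching the unit variants listed in the docblock.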
