docs/src/index.md (+5 -1)
@@ -40,7 +40,11 @@ Thus, any of the implemented [probabilities estimators](@ref probabilities_estim
These names are commonplace, and so in Entropies.jl we provide convenience functions like [`entropy_wavelet`](@ref). However, note that these functions are nothing more than two-line wrappers that call [`entropy`](@ref) with the appropriate [`ProbabilitiesEstimator`](@ref).
-There are only a few exceptions to this rule, which are quantities that are able to compute Shannon entropies via alternate means, without explicitly computing some probability distributions. These are `IndirectEntropy` instances, such as [`Kraskov`](@ref).
+In addition to `ProbabilitiesEstimators`, we also provide [`EntropyEstimator`](@ref)s,
+which compute entropies via alternate means, without explicitly computing some
+probability distribution. For example, the [`Kraskov`](@ref) estimator computes Shannon
+entropy via a nearest neighbor algorithm, while the [`Zhu`](@ref) estimator computes
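
For orientation, here is a minimal, illustrative sketch of the two call styles this passage describes. The names `x`, `h_wavelet`, `h_kraskov`, and `entropy_wavelet_sketch` are hypothetical, and the estimator-first argument order of `entropy` is an assumption that may differ between package versions; consult the Entropies.jl docstrings for the authoritative signatures.

```julia
using Entropies

x = rand(1000)  # an arbitrary sample time series

# A convenience function such as `entropy_wavelet` is described above as a
# thin wrapper: it simply calls `entropy` with a suitable
# ProbabilitiesEstimator. (`entropy_wavelet_sketch` is a hypothetical
# stand-in, not the package's own function.)
entropy_wavelet_sketch(x) = entropy(WaveletOverlap(), x)
h_wavelet = entropy_wavelet_sketch(x)

# An EntropyEstimator such as Kraskov skips the explicit probability
# distribution and estimates Shannon entropy from nearest-neighbor distances.
h_kraskov = entropy(Kraskov(k = 3), x)
```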