Approximate entropy #72
Conversation
Codecov Report
Coverage Diff

|          | main   | #72    | +/-    |
|----------|--------|--------|--------|
| Coverage | 79.65% | 80.56% | +0.91% |
| Files    | 33     | 34     | +1     |
| Lines    | 747    | 772    | +25    |
| Hits     | 595    | 622    | +27    |
| Misses   | 152    | 150    | -2     |
@Datseris As far as I can see, this should be ready to go now. The docs are here. I've:
Shouldn't there be a small
This is the standard for Julia 1.3 and onwards.
@Datseris I've now added:
docs/src/complexity_measures.md (Outdated)
## Convenience functions

We provide a few convenience functions for widely used "entropy-like" complexity measures, such as "approximate entropy". Other arbitrary specialized convenience functions can easily be defined in a couple lines of code.

```@docs
approx_entropy
```
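To make the "couple lines of code" remark concrete, here is a generic, self-contained Julia sketch of the pattern: a short, descriptively named wrapper that delegates to a more general routine. Both function names are hypothetical and not part of this package's API.

```julia
# Hypothetical illustration only; neither function is part of this package.
# The "general routine": plain Shannon entropy of a probability vector.
shannon_entropy(p::AbstractVector{<:Real}) = -sum(x -> x > 0 ? x * log(x) : 0.0, p)

# The "convenience function": a one-liner that normalizes counts and delegates.
count_entropy(counts::AbstractVector{<:Integer}) = shannon_entropy(counts ./ sum(counts))

count_entropy([10, 5, 5])  # ≈ 1.04
```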
Well now I am confused. I thought we agreed that we do this table thingy and therefore do not provide convenience functions anymore.
> Well now I am confused. I thought we agreed that we do this table thingy and therefore do not provide convenience functions anymore.
We agreed on having exactly two convenience functions for complexity measures too, for educational purposes, like we do for entropy. Those would be sample entropy and approximate entropy, because they are the most widely used ones. Citing yourself in #134:
> Okay, let's keep `sample_entropy` and `approximate_entropy` as the only "convenience complexity functions".
> Remaining complexity measures go in the table.
I made this comment before the existence of the table was on the table, however (pun not intended). If we have a table, which satisfies the "search engine" argument, do we need these convenience methods? Or we could have a docs section that discusses the library design principles; I have one in Agents.jl as well.

Okay, leave the "convenience method" here. Once we have the table, I'll do a PR that proposes my section.
What is this PR?
This PR is an implementation of the approximate entropy (ApEn) algorithm from Pincus (1991), a widely used measure of the regularity of time series (for a summary, see Delgado-Bonal & Marshak, 2019).
Note: the sample entropy (PR #71) is a proposed improvement of the approximate entropy.
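For context, a minimal, self-contained sketch of the Pincus (1991) definition is given below. It is a naive O(N²) reference written for illustration; it is not the implementation added in this PR, and the `apen_naive` name and default keywords are assumptions.

```julia
using Statistics

# Naive reference sketch of approximate entropy (Pincus, 1991):
# ApEn(m, r) = Φᵐ(r) - Φᵐ⁺¹(r), where Φᵏ(r) is the average of log Cᵢᵏ(r)
# and Cᵢᵏ(r) is the fraction of k-length templates within Chebyshev
# distance r of template i (self-matches included, as in the original paper).
function apen_naive(x::AbstractVector{<:Real}; m::Int = 2, r::Real = 0.2 * std(x))
    function ϕ(k)
        n = length(x) - k + 1
        templates = [view(x, i:i+k-1) for i in 1:n]
        s = 0.0
        for ti in templates
            c = count(tj -> maximum(abs.(ti .- tj)) <= r, templates) / n
            s += log(c)
        end
        return s / n
    end
    return ϕ(m) - ϕ(m + 1)
end

apen_naive(sin.(0.1 .* (1:500)))  # regular signal → low ApEn
```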
Implementation
A simple tree search is used to locate within-radius-`r` neighbors. During testing, I found this approach to be more than an order of magnitude faster, both in terms of runtime and allocations, than using a naive loop-based approach.
Tests
Tests are generic, as the original paper lacks short, concise examples to test against. However, in the doc example I reproduce the Hénon map example in Pincus (1991) with reasonable accuracy, so I think the implementation should be correct.
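To make the Hénon-map test setting reproducible, here is a small data-generation sketch with the classical parameters a = 1.4, b = 0.3. The exact call to `approx_entropy` depends on the signature introduced in this PR, so it is only indicated in a comment.

```julia
# Hénon map, classical parameters; the initial condition settles onto the attractor.
function henon_x(n; a = 1.4, b = 0.3, x0 = 0.1, y0 = 0.1)
    x, y = x0, y0
    out = Vector{Float64}(undef, n)
    for i in 1:n
        x, y = 1 - a * x^2 + y, b * x
        out[i] = x
    end
    return out
end

ts = henon_x(3000)
# apen = approx_entropy(ts; ...)   # keyword names/defaults depend on the PR's API
```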
References
Pincus, S. M. (1991). Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences, 88(6), 2297-2301.
Delgado-Bonal, A., & Marshak, A. (2019). Approximate entropy and sample entropy: A comprehensive tutorial. Entropy, 21(6), 541.