AntroPy: entropy and complexity of (EEG) time-series in Python
==============================================================

raphaelvallat, updated 2023-03-20 15:45:23

.. -*- mode: rst -*-

|

.. figure:: https://github.com/raphaelvallat/antropy/blob/master/docs/pictures/logo.png?raw=true
   :align: center

AntroPy is a Python 3 package providing several time-efficient algorithms for computing the complexity of time-series. It can be used, for example, to extract features from EEG signals.

Documentation
-------------

• `Link to documentation <https://raphaelvallat.com/antropy/build/html/index.html>`_

Installation
------------

AntroPy can be installed with pip:

.. code-block:: shell

    pip install antropy

or with conda:

.. code-block:: shell

    conda config --add channels conda-forge
    conda config --set channel_priority strict
    conda install antropy

Dependencies
------------

• `numpy <https://numpy.org/>`_
• `scipy <https://www.scipy.org/>`_
• `scikit-learn <https://scikit-learn.org/>`_
• `numba <http://numba.pydata.org/>`_
• `stochastic <https://github.com/crflynn/stochastic>`_

Functions
---------

Entropy
~~~~~~~

.. code-block:: python

    import numpy as np
    import antropy as ant
    np.random.seed(1234567)
    x = np.random.normal(size=3000)
    # Permutation entropy
    print(ant.perm_entropy(x, normalize=True))
    # Spectral entropy
    print(ant.spectral_entropy(x, sf=100, method='welch', normalize=True))
    # Singular value decomposition entropy
    print(ant.svd_entropy(x, normalize=True))
    # Approximate entropy
    print(ant.app_entropy(x))
    # Sample entropy
    print(ant.sample_entropy(x))
    # Hjorth mobility and complexity
    print(ant.hjorth_params(x))
    # Number of zero-crossings
    print(ant.num_zerocross(x))
    # Lempel-Ziv complexity
    print(ant.lziv_complexity('01111000011001', normalize=True))

.. parsed-literal::

    0.9995371694290871
    0.9940882825422431
    0.9999110978316078
    2.015221318528564
    2.198595813245399
    (1.4313385010057378, 1.215335712274099)
    1531
    1.3597696150205727
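For readers curious what the first of these numbers represents, here is a from-scratch sketch of normalized permutation entropy, written only for illustration (it is not AntroPy's implementation, which is Numba-accelerated): embed the signal into ordinal patterns of ``order`` consecutive samples, then take the Shannon entropy of the pattern distribution.

```python
import math
import numpy as np

def perm_entropy_sketch(x, order=3, delay=1, normalize=True):
    """Illustrative permutation entropy (not AntroPy's code)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Delay-embedding matrix: each row holds `order` samples of the signal.
    emb = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    # The ordinal pattern of a row is the permutation that sorts it.
    patterns = emb.argsort(axis=1)
    # Relative frequency of each distinct ordinal pattern.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    if normalize:
        pe /= np.log2(math.factorial(order))  # max entropy is log2(order!)
    return pe
```

A strictly monotonic signal produces a single ordinal pattern and therefore zero entropy, while white noise uses all patterns nearly uniformly and approaches 1.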

Fractal dimension
~~~~~~~~~~~~~~~~~

.. code-block:: python

    # Petrosian fractal dimension
    print(ant.petrosian_fd(x))
    # Katz fractal dimension
    print(ant.katz_fd(x))
    # Higuchi fractal dimension
    print(ant.higuchi_fd(x))
    # Detrended fluctuation analysis
    print(ant.detrended_fluctuation(x))

.. parsed-literal::

    1.0310643385753608
    5.954272156665926
    2.005040632258251
    0.47903505674073327
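As a concrete illustration of one of these metrics, the Katz fractal dimension can be written out directly from its definition; the sketch below is for understanding only and is not AntroPy's implementation:

```python
import numpy as np

def katz_fd_sketch(x):
    """Katz fractal dimension: FD = log10(n) / (log10(n) + log10(d / L)),
    where L is the total curve length, d the maximum distance from the
    first point, and n = L / (mean step) = number of steps."""
    x = np.asarray(x, dtype=float)
    steps = np.abs(np.diff(x))
    L = steps.sum()                # total length of the waveform
    d = np.abs(x - x[0]).max()     # furthest excursion from the start
    n = L / steps.mean()           # equals len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))
```

For a straight line, d equals L, so the formula collapses to 1 (the dimension of a smooth curve); for noisy signals, L greatly exceeds d and the dimension rises above 1.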

Execution time
~~~~~~~~~~~~~~

Here are some benchmarks computed on a MacBook Pro (2020).

.. code-block:: python

    import numpy as np
    import antropy as ant
    np.random.seed(1234567)
    x = np.random.rand(1000)
    # Entropy
    %timeit ant.perm_entropy(x)
    %timeit ant.spectral_entropy(x, sf=100)
    %timeit ant.svd_entropy(x)
    %timeit ant.app_entropy(x)  # Slow
    %timeit ant.sample_entropy(x)  # Numba
    # Fractal dimension
    %timeit ant.petrosian_fd(x)
    %timeit ant.katz_fd(x)
    %timeit ant.higuchi_fd(x)  # Numba
    %timeit ant.detrended_fluctuation(x)  # Numba

.. parsed-literal::

    106 µs ± 5.49 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
    138 µs ± 3.53 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
    40.7 µs ± 303 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
    2.44 ms ± 134 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
    2.21 ms ± 35.4 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
    23.5 µs ± 695 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
    40.1 µs ± 2.09 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
    13.7 µs ± 251 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)
    315 µs ± 10.7 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
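Note that ``%timeit`` is an IPython/Jupyter magic and will not run in a plain Python script. The standard-library ``timeit`` module gives equivalent numbers; the sketch below uses ``np.std`` as a stand-in feature function so that it runs even without antropy installed (swap in e.g. ``ant.perm_entropy`` for a real benchmark):

```python
import timeit
import numpy as np

rng = np.random.default_rng(1234567)
x = rng.random(1000)

# Stand-in feature; replace with e.g. ant.perm_entropy for real numbers.
feature = np.std

n_loops = 1000
total = timeit.timeit(lambda: feature(x), number=n_loops)
print(f"{total / n_loops * 1e6:.2f} µs per loop ({n_loops} loops)")
```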

Development
-----------

AntroPy was created and is maintained by `Raphael Vallat <https://raphaelvallat.com>`_. Contributions are more than welcome, so feel free to contact me, open an issue, or submit a pull request!

To see the code or report a bug, please visit the `GitHub repository <https://github.com/raphaelvallat/antropy>`_.

Note that this program is provided with NO WARRANTY OF ANY KIND. Always double check the results.

Acknowledgement
---------------

Several functions of AntroPy were adapted from:

• MNE-features: https://github.com/mne-tools/mne-features
• pyEntropy: https://github.com/nikdon/pyEntropy
• pyrem: https://github.com/gilestrolab/pyrem
• nolds: https://github.com/CSchoel/nolds

All the credit goes to the authors of these excellent packages.

Issues
------

Errors in sample_entropy and higuchi_fd on a small vector
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

opened on 2023-02-04 09:23:31 by fingoldo

Hi, is this supposed to happen?

.. code-block:: python

    import antropy as ant

    print(ant.sample_entropy([-1, 2, 1, 3, 3]))

.. parsed-literal::

    C:\ProgramData\Anaconda3\lib\site-packages\antropy\entropy.py in sample_entropy(x, order, metric)
        663     x = np.asarray(x, dtype=np.float64)
        664     if metric == "chebyshev" and x.size < 5000:
    --> 665         return _numba_sampen(x, order=order, r=(0.2 * x.std(ddof=0)))
        666     else:
        667         phi = _app_samp_entropy(x, order=order, metric=metric, approximate=False)

    IndexError: getitem out of range

.. code-block:: python

    print(ant.higuchi_fd(x))

.. parsed-literal::

    C:\ProgramData\Anaconda3\lib\site-packages\antropy\fractal.py in higuchi_fd(x, kmax)
        297     x = np.asarray(x, dtype=np.float64)
        298     kmax = int(kmax)
    --> 299     return _higuchi_fd(x, kmax)
        300
        301

    ZeroDivisionError: division by zero
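Until the underlying bug is fixed, a simple workaround is to guard feature calls against vectors that are too short. The helper below is purely hypothetical (``safe_feature`` and the ``min_len=10`` threshold are illustrative choices, not part of antropy):

```python
import numpy as np

def safe_feature(func, x, min_len=10, default=np.nan):
    """Hypothetical guard: return `default` instead of calling `func`
    on a vector shorter than `min_len` (an arbitrary threshold)."""
    x = np.asarray(x, dtype=float)
    if x.size < min_len:
        return default
    return func(x)
```

With this guard, a call like ``safe_feature(ant.sample_entropy, [-1, 2, 1, 3, 3])`` would return NaN instead of raising.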

Modify the entropy functions to support vectorized computation
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

opened on 2021-12-27 23:08:21 by cheliu-computation

Hi, I have used your package to process ECG signals and it achieved good results in classifying different heart diseases. Thanks a lot!

However, these functions can currently only handle one-dimensional signals. May I try to modify the code so that it can process data like ``sklearn.preprocessing.scale(X, axis=xx)``? That would make it more efficient for large arrays, because we would not need an explicit for loop or similar.

My email is [email protected], welcome to discuss with me!
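In the meantime, a 2-D array can already be processed row by row with ``np.apply_along_axis``. Note that this is only a convenience, not true vectorization: the loop still runs in Python, one call per channel. Here ``np.std`` stands in for any antropy feature function:

```python
import numpy as np

# 4 channels x 500 samples, e.g. a short multichannel EEG epoch.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 500))

# Apply a 1-D feature function along the samples axis of each channel.
feature = np.std  # stand-in for e.g. ant.perm_entropy
feats = np.apply_along_axis(feature, 1, X)
print(feats.shape)  # one feature value per channel
```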

Allow users to pass signal in frequency domain in spectral entropy
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

opened on 2021-04-01 23:04:41 by raphaelvallat

Currently, antropy.spectral_entropy only accepts `x` in the time domain. We should add `freqs=None` and `psd=None` as optional inputs so that users can calculate the spectral entropy of a pre-computed power spectrum. We should also add an example of how to calculate the spectral entropy from a multitaper power spectrum.
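Pending that change, the spectral entropy of a pre-computed PSD can be obtained directly from its definition using only numpy/scipy. This is a sketch: ``spectral_entropy_from_psd`` is an illustrative helper, not an antropy function:

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy_from_psd(psd, normalize=True):
    """Shannon entropy of a power spectral density, normalized to [0, 1]."""
    psd = np.asarray(psd, dtype=float)
    p = psd / psd.sum()     # treat the PSD as a probability distribution
    p = p[p > 0]            # drop zero bins to avoid log(0)
    se = -np.sum(p * np.log2(p))
    if normalize:
        se /= np.log2(len(psd))  # max entropy for a flat spectrum
    return se

rng = np.random.default_rng(1234567)
x = rng.normal(size=3000)
freqs, psd = welch(x, fs=100)   # pre-computed Welch power spectrum
print(spectral_entropy_from_psd(psd))
```

White noise has a nearly flat spectrum, so the normalized value is close to 1; a spectrum concentrated in a single bin gives 0.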

Releases
--------

v0.1.5 2022-12-17 18:28:46
~~~~~~~~~~~~~~~~~~~~~~~~~~

This is a minor release.

What's Changed
^^^^^^^^^^^^^^

• Handle the limit of p = 0 in p log2 p by @jftsang in https://github.com/raphaelvallat/antropy/pull/3
• Correlation between entropy/FD metrics for data traces from Hodgkin-Huxley model by @antelk in https://github.com/raphaelvallat/antropy/pull/5
• Fix docstrings and rerun by @antelk in https://github.com/raphaelvallat/antropy/pull/7
• Improve performance in `_xlog2x` by @jftsang in https://github.com/raphaelvallat/antropy/pull/8
• Prevent invalid operations in xlogx by @guiweber in https://github.com/raphaelvallat/antropy/pull/11
• Allow readonly arrays for higuchi_fd by @jvdd in https://github.com/raphaelvallat/antropy/pull/13
• modify the _embed function to fit the 2d input by @cheliu-computation in https://github.com/raphaelvallat/antropy/pull/15
• Fixed division by zero in linear regression function (with test) by @Arritmic in https://github.com/raphaelvallat/antropy/pull/21
• Add conda install instructions by @raphaelvallat in https://github.com/raphaelvallat/antropy/pull/19
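Two of the PRs above concern the ``p * log2(p)`` term that appears in every Shannon-entropy computation: it must be defined as 0 at ``p = 0``, and a naive implementation emits invalid-operation warnings. The helper below is an illustrative, numerically safe version, modeled on (but not identical to) antropy's internal ``_xlog2x``:

```python
import numpy as np

def xlog2x(x):
    """Compute x * log2(x) element-wise, defining 0 * log2(0) = 0
    (the limit used in Shannon entropy)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)      # zeros cover the x == 0 case directly
    nz = x > 0
    out[nz] = x[nz] * np.log2(x[nz])  # only evaluate log where it is defined
    return out
```

Evaluating only the strictly positive entries avoids both the ``log(0)`` warning and the ``0 * -inf = nan`` result of the naive formula.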

New Contributors
^^^^^^^^^^^^^^^^

• @jftsang made their first contribution in https://github.com/raphaelvallat/antropy/pull/3
• @antelk made their first contribution in https://github.com/raphaelvallat/antropy/pull/5
• @guiweber made their first contribution in https://github.com/raphaelvallat/antropy/pull/11
• @jvdd made their first contribution in https://github.com/raphaelvallat/antropy/pull/13
• @cheliu-computation made their first contribution in https://github.com/raphaelvallat/antropy/pull/15
• @Arritmic made their first contribution in https://github.com/raphaelvallat/antropy/pull/21
• @raphaelvallat made their first contribution in https://github.com/raphaelvallat/antropy/pull/19

Full Changelog: https://github.com/raphaelvallat/antropy/compare/v0.1.4...v0.1.5

v0.1.4 2021-04-01 22:24:20
~~~~~~~~~~~~~~~~~~~~~~~~~~

This new release includes a faster implementation of the LZ complexity algorithm (see https://github.com/raphaelvallat/antropy/pull/1).
