Abstract
Nonlinear dynamics play an important role in the analysis of signals. A popular, readily interpretable nonlinear measure is Permutation Entropy. It has recently been extended to the analysis of graph signals, providing a framework for the nonlinear analysis of data sampled on irregular domains. Here, we introduce a continuous version of Permutation Entropy, extend it to the graph domain, and develop an ordinal activation function akin to the activation functions used in neural networks. This is a step towards Ordinal Deep Learning, a potentially effective and very recently posited concept. We also formally extend ordinal contrasts to the graph domain. Continuous versions of ordinal contrasts of length 3 are also introduced and their advantage is shown in experiments. We further integrate specific contrasts for the analysis of images and show that they generalize well to the graph domain, allowing images, represented as graph signals, to be mapped onto a plane similar to the entropy-complexity plane. Applications to synthetic data, including fractal patterns and popular nonlinear maps, and to real-life MRI data demonstrate the validity of these novel extensions and their potential benefits over the state of the art. By extending very recent concepts related to permutation entropy to the graph domain, we expect to accelerate the development of further graph-based entropy methods, enabling nonlinear analysis of a broader range of data and establishing relationships with emerging ideas in data science.
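For orientation, the sketch below illustrates the classical Bandt-Pompe Permutation Entropy for a one-dimensional signal, i.e. the baseline measure that the work above generalizes to continuous and graph settings. It is a minimal illustration only, not the paper's graph or continuous extension; the function name and the parameters `m` (embedding dimension) and `tau` (time delay) are our own choices for this example.

```python
import math
import numpy as np


def permutation_entropy(x, m=3, tau=1, normalize=True):
    """Classical (Bandt-Pompe) permutation entropy of a 1-D signal.

    Each length-m delay vector is mapped to the ordinal pattern given by the
    argsort of its values; the Shannon entropy of the resulting pattern
    histogram is returned, normalized by log(m!) when `normalize` is True.
    """
    x = np.asarray(x, dtype=float)
    n_vectors = len(x) - (m - 1) * tau
    if n_vectors <= 0:
        raise ValueError("signal too short for the chosen m and tau")

    counts = {}
    for i in range(n_vectors):
        window = x[i:i + m * tau:tau]                 # delay vector of length m
        pattern = tuple(np.argsort(window, kind="stable"))
        counts[pattern] = counts.get(pattern, 0) + 1

    probs = np.array(list(counts.values()), dtype=float) / n_vectors
    h = -np.sum(probs * np.log(probs))
    return h / math.log(math.factorial(m)) if normalize else h


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(5000)      # white noise: value near 1 (all patterns equally likely)
    ramp = np.arange(5000, dtype=float)    # monotone ramp: value near 0 (a single pattern)
    print(permutation_entropy(noise, m=3))
    print(permutation_entropy(ramp, m=3))
```

The two test signals show the interpretability the abstract refers to: fully irregular data spread probability mass over all ordinal patterns (entropy close to 1 after normalization), while deterministic monotone data concentrate it on one pattern (entropy close to 0).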
| Original language | English |
| --- | --- |
| Publisher | ArXiv |
| DOIs | |
| Publication status | Published - 10 Jul 2024 |
Projects
1 Finished

- Nonlinear analysis and modelling of multivariate signals on networks
  1/11/20 → 31/10/23
  Project: Research