Abstract
The need to study how different levels of brain organization interact has led to an ever-growing use of concurrent measures of neural signals with different recording modalities [1, 2]. However, statistical methods that fully account for the markedly different statistics of the individual modalities and their intricate dependencies are currently lacking.
We developed a framework based on vine copulas [3, 4] with mixed margins to model multivariate data that are partly discrete, such as neural spike counts, and partly continuous, such as local field potentials. The vine copula approach allowed us to derive efficient methods for likelihood calculation, inference and sampling with quadratic complexity in the number of modeled elements. We combined these methods by means of Monte Carlo integration to obtain unbiased estimates of entropy and mutual information. To test our methods, we generated artificial data from parametric multivariate models. We also generated artificial spike counts and local field potentials from biologically realistic network models using the VERTEX simulator [5].
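The abstract does not include an implementation, but a minimal sketch may help make the idea concrete. The snippet below illustrates only the bivariate special case: a discrete spike count (Poisson margin) and a continuous LFP value (Gaussian margin) coupled by a Gaussian copula, with the mutual information estimated by Monte Carlo integration. The margin families, the copula family and the parameter values (`rho`, `mu_spk`, etc.) are illustrative assumptions, not the authors' model.

```python
# A minimal sketch (not the authors' implementation): Monte Carlo estimation of
# mutual information between a discrete spike count and a continuous LFP value,
# coupled by a Gaussian copula with assumed parameters.
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(0)
rho, mu_spk, lfp_mean, lfp_sd = 0.6, 4.0, 0.0, 1.0   # assumed parameters
n = 200_000                                           # Monte Carlo samples

# Sample from the copula in normal-score space, then push through the margins.
z_v = rng.standard_normal(n)                          # latent score of the LFP
z_u = rho * z_v + np.sqrt(1 - rho**2) * rng.standard_normal(n)
y = lfp_mean + lfp_sd * z_v                           # continuous LFP margin
x = poisson.ppf(norm.cdf(z_u), mu_spk)                # discrete spike-count margin

# Conditional probability P(X = x | Y = y) under the Gaussian copula: the copula
# conditions the normal score of the spike count on the normal score of the LFP.
hi = norm.cdf((norm.ppf(poisson.cdf(x, mu_spk)) - rho * z_v) / np.sqrt(1 - rho**2))
lo = norm.cdf((norm.ppf(poisson.cdf(x - 1, mu_spk)) - rho * z_v) / np.sqrt(1 - rho**2))

# MI(X; Y) = E[ log P(X = x | Y = y) - log P(X = x) ], estimated by the sample mean.
mi_nats = np.mean(np.log(hi - lo) - poisson.logpmf(x, mu_spk))
print(f"Estimated mutual information: {mi_nats:.3f} nats")
```

In the full framework the copula would be a vine over many spike-count and LFP dimensions rather than a single bivariate Gaussian copula, but the same ingredients appear: quantile transforms for the mixed margins, copula-based conditional probabilities for the discrete components, and Monte Carlo averaging of log-density ratios.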
We applied our methods to these data and showed that our new approach provides a model fit that is significantly better than that of corresponding independent models. Moreover, we demonstrate that mutual information estimates of fully continuous and mixed independent models can differ strongly from those of our proposed model, which is faithful to the statistics of the margins and their dependencies.
Our framework presents the prospect of an improved analysis of neural data recorded simultaneously at different scales and from different modalities. Our models can also be used to construct Bayes-optimal decoders in brain-machine interfaces that benefit from concurrent recordings of various modalities.
| Original language | English |
| --- | --- |
| Number of pages | 1 |
| DOIs | |
| Publication status | Published - 23 Sept 2016 |
| Event | Bernstein Conference 2016 - Duration: 21 Sept 2016 → 23 Sept 2016 |
Conference

| Conference | Bernstein Conference 2016 |
| --- | --- |
| Period | 21/09/16 → 23/09/16 |