Optimal Encoding in Stochastic Latent-Variable Models

Michael E. Rule, Martino Sorbaro, Matthias H. Hennig

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

In this work we explore encoding strategies learned by statistical models of sensory coding in noisy spiking networks. Early stages of sensory communication in neural systems can be viewed as encoding channels in the information-theoretic sense. However, neural populations face constraints not commonly considered in communications theory. Using restricted Boltzmann machines as a model of sensory encoding, we find that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli with varying information content. Mirroring variability suppression observed in sensory systems, informative stimuli are encoded with high precision, at the cost of more variable responses to frequent, hence less informative stimuli. Curiously, we also find that statistical criticality in the neural population code emerges at model sizes where the input statistics are well captured. These phenomena have well-defined thermodynamic interpretations, and we discuss their connection to prevailing theories of coding and statistical criticality in neural populations.
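The abstract describes restricted Boltzmann machines as the model of stochastic sensory encoding: visible units represent the stimulus and binary hidden units form a noisy latent code, sampled via block Gibbs updates. As a minimal illustrative sketch (the network sizes, random weights, and variable names below are assumptions, not the paper's actual model or training setup), the encode/decode cycle of a Bernoulli-Bernoulli RBM can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and untrained random weights (assumptions, not the
# paper's model): visible units v carry the stimulus, hidden units h
# form the stochastic latent code.
n_visible, n_hidden = 6, 4
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    """Encode: sample the stochastic latent code h given stimulus v."""
    p = sigmoid(c + v @ W)
    return (rng.random(p.shape) < p).astype(float), p

def sample_visible(h):
    """Decode: sample a stimulus reconstruction given latent code h."""
    p = sigmoid(b + h @ W.T)
    return (rng.random(p.shape) < p).astype(float), p

def energy(v, h):
    """RBM energy E(v, h) = -b.v - c.h - v.W.h, defining the joint
    distribution p(v, h) proportional to exp(-E(v, h))."""
    return -(b @ v + c @ h + v @ W @ h)

# One block Gibbs step: encode a binary stimulus, then decode it. The
# sampling noise in h is what makes the code stochastic.
v0 = rng.integers(0, 2, n_visible).astype(float)
h0, _ = sample_hidden(v0)
v1, _ = sample_visible(h0)
print(energy(v0, h0))
```

Repeating the encode/decode step yields a Markov chain whose stationary distribution is the model's joint over stimuli and codes; the precision/robustness trade-off discussed in the abstract concerns how a trained W allocates this sampling variability across stimuli.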
Original language: English
Article number: 714
Number of pages: 17
Issue number: 7
Publication status: Published - 28 Jun 2020

Keywords / Materials (for Non-textual outputs)

  • information theory
  • encoding
  • neural networks
  • sensory systems
