Personal profile

Research Interests

Computational neuroscience and analysis of neurophysiological data.
Specifically, our work is currently in the following areas:
* Function, plasticity and homeostasis at the sub-cellular and
cellular levels
* Stability and function in networks of neurons
* Analysis of high-density multielectrode recordings


2001-2005  PhD in Computational Neuroscience
 University of Stirling
 Thesis Title: The Role of Non-Linearities in Visual Perception studied with a Computational Model of the Vertebrate Retina
1994-2000  German Diplom in Physics
 Ruhr-Universität Bochum, Germany


Dr Hennig employs statistical, computational and mathematical modelling to investigate stability and processing in neuronal circuits. With a background in physics, Dr Hennig moved into neuroscience during his PhD, investigating models of the vertebrate retina. Supported by an MRC Career Development Award, his group has recently produced algorithms for primary data analysis of extracellular recordings, and has uncovered principles of circuit remodelling in chronic recordings from large neuronal networks. Current projects include population coding in the retina, data-driven investigation of cortical circuits in rodent models of neurodevelopmental disorders, and development of efficient inference methods for large-scale neuronal circuit analysis.

Education/Academic qualification

Computational Neuroscience, Doctor of Philosophy (PhD), The Role of Non-Linearities in Visual Perception studied with a Computational Model of the Vertebrate Retina, University of Stirling


  • RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry
  • QA75 Electronic computers. Computer science


