Novel analytical and data-sharing tools for rich neuronal activity datasets obtained with a 4,096-electrode array

  • Hennig, Matthias (Principal Investigator)
  • Eglen, Stephen (Sponsor)
  • Sernagor, Evelyne (Sponsor)

Project Details

Description

In order to decipher information processing in neural networks, it is imperative to understand neural connectivity between individual neurons as well as global network communication. To achieve this goal, neural activity must be studied in very large cell populations using high spatiotemporal resolution recording tools, resulting in the generation of rich datasets that necessitate the development of powerful computational analytical tools. The aims of this proposal were:

1. To acquire rich datasets of neural activity in an intact neural network, the retina. Both spontaneous activity and responses to light have been recorded from the retinal ganglion cell (RGC) layer in neonatal mouse retinas using the novel Active Pixel Sensor (APS) multielectrode array (MEA), which consists of 4,096 electrodes at near-cellular resolution (21 um x 21 um electrodes, 42 um centre-to-centre separation). All channels, integrated on an active area of 2.7 mm x 2.7 mm, can be sampled synchronously at a temporal resolution (7.8 kHz) high enough to detect single spikes reliably.

2. To refine existing computational tools, developed to analyse activity from standard commercial 60-channel MEAs, and optimize them for data acquired with the APS MEA. These include standard methods of spike train analysis such as spike statistics and correlation analysis.

3. Most importantly, to develop new analytical tools adapted to the large datasets acquired from multi-channel devices integrating several thousands of channels. These tools are based on statistical modelling of signal variability, enabling us to obtain information on the structure and dynamics of the neural network. They are applicable to a range of preparations, including in vivo recordings.

4. To share the novel retinal data acquired with the APS MEA system, together with the newly developed code and existing tools, by deploying them on the CARMEN (Code Analysis, Repository and Modelling for e-Neuroscience) portal, a data-sharing facility developed in the UK with EPSRC funding and currently funded by the BBSRC.
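Aim 2 mentions standard spike-train statistics and correlation analysis. As a minimal, hedged sketch of what such measures involve (the function names, bin size, and toy data below are illustrative assumptions, not the project's actual code), spike trains can be binned into counts, from which a variability measure such as the Fano factor and pairwise Pearson correlations follow directly:

```python
import numpy as np

def bin_spikes(spike_times, t_stop, bin_size=0.1):
    """Bin spike times (in seconds) into counts per time bin."""
    n_bins = int(np.ceil(t_stop / bin_size))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_stop))
    return counts

def fano_factor(counts):
    """Variance-to-mean ratio of spike counts; ~1 for Poisson-like firing."""
    counts = np.asarray(counts, dtype=float)
    m = counts.mean()
    return float(counts.var() / m) if m > 0 else float("nan")

def pairwise_correlation(times_a, times_b, t_stop, bin_size=0.1):
    """Pearson correlation between two binned spike trains."""
    a = bin_spikes(times_a, t_stop, bin_size)
    b = bin_spikes(times_b, t_stop, bin_size)
    return float(np.corrcoef(a, b)[0, 1])
```

For example, two identical toy trains `[0.05, 0.45, 0.85]` binned over one second give a correlation of 1.0, while perfectly regular counts such as `[5, 5, 5, 5]` give a Fano factor of 0. On a 4,096-channel recording, the pairwise step would be run over all channel pairs rather than a single pair.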

Layman's description

Understanding brain function is one of the biggest scientific challenges of the 21st century. The human brain contains about 100 billion neurons, mutually connected in a vast network. One approach to understanding brain function is to record its electrical activity under different conditions, or whilst the network is developing during early life. In recent years, this has been achieved using multi-electrode arrays (MEAs), which simultaneously record activity from a small population of neurons using 60-100 electrodes spaced over 1 to 4 mm^2.
Our grant aimed to evaluate the feasibility of using a much larger MEA, the Active Pixel Sensor (APS) array, developed by Luca Berdondini's group (Italian Institute of Technology, Genova). It has 4,096 electrodes, densely covering about 7 mm^2 of tissue. We proposed to record activity from the intact retina, the light-sensing neural tissue at the back of the eye. The retina was an ideal choice for this project, given previous experimental experience with this system, and our current understanding of the development and functioning of the retina.
There were two main challenges before starting the work:
1. Would the APS system reliably detect neuronal activity in the intact retina? We would be the first to independently assess its performance.
2. The APS system has nearly 70 times as many electrodes as the current commercial arrays. Would our analysis algorithms cope with these much larger data sets, and how would we interpret these new recordings?

Key findings

1. The APS equipment was successfully installed at Newcastle. We have found the APS array to be an excellent system for recording neuronal activity from the developing retina at near-cellular resolution. The system has shown us how spontaneous patterns of neuronal firing (i.e. when the retina is not stimulated) change during normal development in the mouse. The spatiotemporal patterns of activity are much richer than those observed with earlier generations of commercial MEAs.
We focused on the mouse because of its ready availability and the wide range of mice with genetic mutations. Recordings were therefore made both from normal (wild-type) mice and from the CRX mouse, a model of retinal dystrophy.
In addition, the APS system was also able to reliably record retinal activity in response to light stimulation. This was not an original aim of the grant, as it was unclear how well the system would work. However, the pilot data collected during this project have been highly encouraging.
2. We have generated new computational tools to handle the large datasets generated by the APS system. Our current methods for analysing activity patterns have been updated to handle these new datasets. Furthermore, new computational methods have been developed to investigate the richer dynamic activity patterns that to date only the APS system has been able to record. These methods include detecting action potentials from the array, classifying individual firing patterns, and inferring how neurons might connect to each other.
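The detection step mentioned above is not spelled out here, but a simple threshold-crossing spike detector of the kind commonly applied to extracellular MEA recordings can be sketched as follows. This is a generic sketch, not the project's actual algorithm; the function name, the MAD-based noise estimate, and the parameter defaults are illustrative assumptions (the 7.8 kHz sampling rate matches the APS system described above):

```python
import numpy as np

def detect_spikes(trace, fs, thresh_sd=5.0, refractory_ms=2.0):
    """Detect negative threshold crossings in one channel's voltage trace.

    The threshold is a multiple of a robust noise estimate (median
    absolute deviation scaled to standard deviation); a refractory
    period prevents one spike from being counted twice.
    """
    trace = np.asarray(trace, dtype=float)
    noise_sd = np.median(np.abs(trace)) / 0.6745  # robust SD estimate
    thresh = -thresh_sd * noise_sd
    refractory = int(fs * refractory_ms / 1000.0)
    spikes, last = [], -refractory
    for i, v in enumerate(trace):
        if v < thresh and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes
```

On a synthetic trace of low-amplitude background noise with two large negative deflections, the detector returns the sample indices of the deflections. For a full 4,096-channel recording, per-channel detection like this would typically be followed by merging events that neighbouring electrodes pick up from the same cell.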
The scientific findings from this study have been presented at international scientific conferences, and we are currently preparing several manuscripts for publication. In line with BBSRC data sharing policy, we will publicly share our datasets through the CARMEN website (http://portal.carmen.org.uk). Some analysis algorithms are already available through CARMEN, and further tools will be added in the future.
Status: Finished
Effective start/end date: 15/10/10 - 14/03/12

Funding

  • BBSRC: £10,583.00
