Optimal Learning Rules for Discrete Synapses

A. B. Barrett, M. C. W. van Rossum

Research output: Contribution to journal › Article › peer-review

Abstract

There is evidence that biological synapses have a limited number of discrete weight states. Memory storage with such synapses behaves quite differently from synapses with unbounded, continuous weights, as old memories are automatically overwritten by new memories. Consequently, there has been substantial discussion about how this affects learning and storage capacity. In this paper, we calculate the storage capacity of discrete, bounded synapses in terms of Shannon information. We use this to optimize the learning rules and investigate how the maximum information capacity depends on the number of synapses, the number of synaptic states, and the coding sparseness. Below a certain critical number of synapses per neuron (comparable to numbers found in biology), we find that storage is similar to unbounded, continuous synapses. Hence, discrete synapses do not necessarily have lower storage capacity.
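The overwriting effect described in the abstract can be illustrated with a minimal sketch (not the paper's actual model or learning rule; all parameters here are illustrative assumptions): binary synapses stochastically adopt each newly stored pattern, so the trace of an older memory decays geometrically as later memories are written over it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_syn = 10_000   # number of binary synapses (illustrative size)
p_switch = 0.5   # probability a synapse adopts each new pattern's state

# Synaptic weights and a sequence of random binary (+/-1) patterns.
# Each new pattern stochastically overwrites the current weights.
weights = rng.choice([-1, 1], size=n_syn)
first = rng.choice([-1, 1], size=n_syn)  # the memory we track

overlaps = []
pattern = first
for t in range(20):
    flip = rng.random(n_syn) < p_switch
    weights = np.where(flip, pattern, weights)
    # Normalized overlap with the first stored memory.
    overlaps.append(weights @ first / n_syn)
    pattern = rng.choice([-1, 1], size=n_syn)  # next memory to store

# The first memory's trace decays roughly as p_switch*(1-p_switch)**age,
# i.e. old memories are automatically overwritten by new ones.
print([round(o, 3) for o in overlaps])
```

This decay is what caps the useful memory lifetime of bounded, discrete synapses and motivates optimizing the transition probabilities (the learning rule) to maximize the Shannon information retained.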
Original language: English
Article number: e1000230
Pages (from-to): 1-7
Number of pages: 7
Journal: PLoS Computational Biology
Volume: 4
Issue number: 11
DOIs
Publication status: Published - Nov 2008

Keywords

  • synapses
  • learning
  • memory
  • neurons
  • coding mechanisms
  • cellular neuroscience
  • neuronal dendrites
  • synaptic plasticity

