Abstract
Brain-inspired, inherently parallel computation has been shown to excel at tasks where the intrinsically serial von Neumann architecture struggles. This has led to vast efforts aimed at developing bio-inspired electronics, most notably in the guise of artificial neural networks (ANNs). However, ANNs are only one possible substrate upon which computation can be carried out, with their configuration determining what computational function is performed. In this work we show how Bayesian inference, a fundamental computational function, can be carried out using arrays of memristive devices, demonstrating computation that operates directly on probability distributions as inputs and outputs. Our approach bypasses the need to map the Bayesian computation onto an ANN (or any other) substrate: computation is carried out by simply applying the input distributions and letting Ohm’s law converge the voltages within the system to the correct answer. We present the fundamental circuit blocks that enable this style of computation, examine how memristor non-idealities affect the quality of computation, and demonstrate a ‘Bayesian learning machine’ performing a simple task with no need for any digital arithmetic-logic operations.
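As a rough illustration of the style of computation the abstract describes, below is a minimal numerical sketch, assuming an idealized crossbar model in which Ohm’s law and Kirchhoff’s current law yield column currents proportional to unnormalised posteriors. The conductance ranges, variable names, and the software normalisation step are illustrative assumptions, not the paper’s actual circuit blocks.

```python
import numpy as np

# Hypothetical toy model (assumed, not the paper's circuit):
# G[i, j] ~ likelihood P(observation i | hypothesis j), encoded as a conductance (S)
# v[i]    ~ input voltage encoding the evidence weight on observation i

rng = np.random.default_rng(0)

n_obs, n_hyp = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(n_obs, n_hyp))  # conductances; range is an assumption
v = np.array([1.0, 0.0, 0.0, 0.0])                # one-hot evidence: observation 0 occurred

# Column currents I_j = sum_i v_i * G_ij:
# Kirchhoff's law sums the Ohmic currents flowing into each column.
I = v @ G

# Normalisation, done numerically here; in the hardware the abstract describes,
# the analogous step would emerge from the circuit itself settling under Ohm's law.
posterior = I / I.sum()
print(posterior)
```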
| Original language | Undefined/Unknown |
| --- | --- |
| Publication status | Published - 2017 |
| Event | NIPS 2017: 31st Conference on Neural Information Processing Systems - Long Beach, California, United States |
| Duration | 4 Dec 2017 → 9 Dec 2017 |
| Internet addresses | https://nips.cc/, https://nips.cc/Conferences/2017 |
Conference
| Conference | NIPS 2017 |
| --- | --- |
| Abbreviated title | NIPS 2017 |
| Country/Territory | United States |
| City | Long Beach, California |
| Period | 4/12/17 → 9/12/17 |
| Internet address | https://nips.cc/ |