Hardware-Level Bayesian Inference

Alex Serb, Edoardo Manino, Ioannis Messaris, Long Tran-Thanh, Themis Prodromakis

Research output: Contribution to conference › Paper (peer-reviewed)

Abstract

Brain-inspired, inherently parallel computation has proven to excel at tasks where the intrinsically serial von Neumann architecture struggles. This has motivated extensive efforts to develop bio-inspired electronics, most notably in the guise of artificial neural networks (ANNs). However, ANNs are only one possible substrate upon which computation can be carried out, with their configuration determining which computational function is performed. In this work we show how Bayesian inference, a fundamental computational function, can be carried out using arrays of memristive devices, demonstrating computation that operates directly on probability distributions as inputs and outputs. Our approach bypasses the need to map the Bayesian computation onto an ANN (or any other) substrate: computation is carried out simply by providing the input distributions and letting Ohm’s law converge the voltages within the system to the correct answer. We present the fundamental circuit blocks that enable this style of computation, examine how memristor non-idealities affect the quality of the computation, and exemplify a ‘Bayesian learning machine’ performing a simple task with no need for any digital arithmetic-logic operations.
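
As a rough illustration of how Ohm’s law alone can perform inference arithmetic, recall Millman’s theorem: a node driven by voltage sources V_i through conductances G_i settles at V = (Σ G_i V_i) / (Σ G_i), which has the same form as the precision-weighted average produced by Bayesian fusion of independent Gaussian estimates (posterior mean = (Σ μ_i/σ_i²) / (Σ 1/σ_i²)). The Python sketch below is a hypothetical numerical analogy under that assumption, not the authors’ circuit; the function names and the Gaussian-fusion framing are illustrative only.

    import numpy as np

    # Millman's theorem: the steady-state voltage at a node fed by sources
    # V_i through conductances G_i is V = sum(G_i * V_i) / sum(G_i).
    def node_voltage(voltages, conductances):
        voltages = np.asarray(voltages, dtype=float)
        conductances = np.asarray(conductances, dtype=float)
        return conductances @ voltages / conductances.sum()

    # Bayesian fusion of independent Gaussian estimates of one quantity:
    # posterior mean is the precision-weighted mean -- identical arithmetic.
    def gaussian_fusion(means, variances):
        precisions = 1.0 / np.asarray(variances, dtype=float)
        return node_voltage(means, precisions)

    # Two noisy 'sensor' readings encoded as source voltages; memristor
    # conductances play the role of precisions (1/variance).
    means, variances = [1.0, 3.0], [0.5, 2.0]
    print(gaussian_fusion(means, variances))   # 1.4
    print(node_voltage(means, [2.0, 0.5]))     # same node voltage: 1.4

In this picture, encoding each estimate’s precision as a memristor conductance would let the array node settle at the fused mean with no digital arithmetic, which is the spirit of the approach the abstract describes.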
Publication status: Published - 2017
Event: NIPS 2017: 31st Conference on Neural Information Processing Systems, Long Beach, California, United States
Duration: 4 Dec 2017 – 9 Dec 2017
https://nips.cc/
https://nips.cc/Conferences/2017

Conference

Conference: NIPS 2017
Abbreviated title: NIPS 2017
Country/Territory: United States
City: Long Beach, California
Period: 4/12/17 – 9/12/17
Internet address: https://nips.cc/
