Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond

Miguel-Angel M. Lucero, Rafael-Michael Karampatsis, Enrique B. Gallardo, Vaishak Belle

Research output: Contribution to journal › Article › peer-review

Abstract

In their seminal book, Minsky and Papert defined the perceptron as a limited implementation of what they called "parallel machines." They showed that some binary Boolean functions, including XOR, cannot be computed by a single-layer perceptron, because it can learn only linearly separable functions. In this work, we propose a new, more powerful implementation of such parallel machines. This new mathematical tool is defined using analytic sinusoids, instead of linear combinations, to form an analytic signal representation of the function we want to learn. We show that this reformulated parallel mechanism can learn, with a single layer, any non-linear k-ary Boolean function. Finally, as an example of its practical applications, we show that it outperforms the single-hidden-layer multilayer perceptron on both Boolean function learning and image classification tasks, while also being faster and requiring fewer parameters.
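
As a rough illustration of the idea (an assumption for exposition, not the paper's exact parameterization), the Python sketch below replaces a perceptron's linear combination with a fixed bank of sinusoids cos(pi * <w, x>), one per frequency vector w in {0,1}^k, and fits only the amplitudes by least squares. This single layer recovers XOR exactly, which a linear perceptron cannot.

    # Minimal sketch: a single layer of sinusoids with fixed frequencies
    # and learnable amplitudes, fit by least squares. Illustrative only;
    # the basis choice cos(pi * <w, x>) is an assumption of this sketch.
    import itertools
    import numpy as np

    k = 2                                                     # arity of the Boolean function
    X = np.array(list(itertools.product([0, 1], repeat=k)))  # all 2^k Boolean inputs
    W = X.copy()                                              # one fixed frequency vector per basis signal

    # Sinusoidal design matrix: Phi[i, j] = cos(pi * <w_j, x_i>)
    Phi = np.cos(np.pi * (X @ W.T))

    y_xor = np.array([0, 1, 1, 0])                            # XOR truth table, not linearly separable
    alpha, *_ = np.linalg.lstsq(Phi, y_xor, rcond=None)       # fit the learnable amplitudes

    print(np.round(Phi @ alpha, 6))                           # approx. [0. 1. 1. 0.]: XOR recovered

In this sketch the 2^k-by-2^k design matrix is a Hadamard-type matrix and hence invertible, so the amplitudes of any k-ary Boolean function are uniquely determined by its truth table; this mirrors the identifiability theme in the title.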

Original language: English
Article number: 770254
Number of pages: 20
Journal: Frontiers in Artificial Intelligence
Volume: 5
DOIs
Publication status: Published - 2 Jun 2022

Keywords

  • signal perceptron
  • perceptron
  • learning function spaces
  • parallel machines
  • neural networks
