Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond

Miguel-Angel M. Lucero, Rafael-Michael Karampatsis, Enrique B. Gallardo, Vaishak Belle

Research output: Contribution to journal › Article › peer-review


In their seminal book, Minsky and Papert defined the perceptron as a limited implementation of what they called “parallel machines”.
They showed that some binary Boolean functions, including XOR, cannot be represented by a single-layer perceptron, because it is only capable of learning linearly separable functions.
In this work, we propose a new, more powerful implementation of such parallel machines.
This new mathematical machinery replaces linear combinations with analytic sinusoids, forming an analytic signal representation of the function to be learned.
We show that this reformulated parallel mechanism can learn not only the binary XOR function but, more generally, any k-ary Boolean function, in just a single layer.
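To make the idea concrete, here is a minimal, hypothetical sketch of the approach described in the abstract: a k-ary Boolean function is represented as a linear combination of sinusoidal basis signals, and the amplitudes are fitted in a single layer. The basis `cos(pi * w . x)` over frequency vectors `w` in `{0,1}^k` is an illustrative choice on our part; the paper's exact formulation of the signal perceptron may differ.

```python
import numpy as np
from itertools import product

def sinusoidal_design_matrix(X, k):
    """Evaluate 2^k sinusoidal basis signals cos(pi * w . x) on inputs X.

    Each frequency vector w in {0,1}^k yields one column (one sinusoid).
    This is an illustrative basis, not necessarily the paper's exact one.
    """
    freqs = np.array(list(product([0, 1], repeat=k)))
    return np.cos(np.pi * X @ freqs.T)

k = 2
X = np.array(list(product([0, 1], repeat=k)))   # all 2^k binary inputs
y_xor = np.array([x1 ^ x2 for x1, x2 in X])     # target: binary XOR

# Fit the amplitudes of the sinusoids in one "layer" (a linear solve).
Phi = sinusoidal_design_matrix(X, k)
alpha, *_ = np.linalg.lstsq(Phi, y_xor, rcond=None)

pred = np.round(Phi @ alpha).astype(int)
print(pred.tolist())  # → [0, 1, 1, 0], i.e. XOR is fitted exactly
```

Because the 2^k sinusoids evaluated on {0,1}^k form a full-rank (Walsh-Hadamard-like) matrix, the least-squares solve recovers any k-ary Boolean truth table exactly, which is the identifiability property the title refers to.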
Original language: English
Journal: Frontiers in Artificial Intelligence
Publication status: Accepted/In press - 28 Apr 2022


  • signal perceptron
  • perceptron
  • learning function spaces
  • parallel machines
  • neural networks


