Learning Continuous Semantic Representations of Symbolic Expressions

Miltiadis Allamanis, Pankajan Chanthirasegaran, Pushmeet Kohli, Charles Sutton

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Combining abstract, symbolic reasoning with continuous neural reasoning is a grand challenge of representation learning. As a step in this direction, we propose a new architecture, called neural equivalence networks, for the problem of learning continuous semantic representations of algebraic and logical expressions. These networks are trained to represent semantic equivalence, even of expressions that are syntactically very different. The challenge is that semantic representations must be computed in a syntax-directed manner, because semantics is compositional, but at the same time, small changes in syntax can lead to very large changes in semantics, which can be difficult for continuous neural architectures. We perform an exhaustive evaluation on the task of checking equivalence on a highly diverse class of symbolic algebraic and Boolean expression types, showing that our model significantly outperforms existing architectures.
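The syntax-directed computation described in the abstract can be illustrated with a minimal sketch: an expression's embedding is built bottom-up over its parse tree, with one combination function per operator. This is a hypothetical, untrained illustration of the general tree-structured encoding idea, not the authors' exact EqNet architecture; all names (`DIM`, `leaves`, `ops`, `embed`) are assumptions for the example.

```python
import numpy as np

# Illustrative sketch of syntax-directed (compositional) embedding,
# NOT the paper's exact EqNet: untrained random parameters, toy operators.
rng = np.random.default_rng(0)
DIM = 8  # hypothetical embedding dimension

# Hypothetical leaf embeddings for variables/constants.
leaves = {name: rng.standard_normal(DIM) for name in ("a", "b", "true")}
# One (untrained) weight matrix per operator symbol.
ops = {op: 0.1 * rng.standard_normal((DIM, 2 * DIM)) for op in ("and", "or")}

def embed(expr):
    """Recursively embed a nested tuple such as ("and", "a", "b")."""
    if isinstance(expr, str):
        v = leaves[expr]                       # leaf: lookup
    else:
        op, left, right = expr
        children = np.concatenate([embed(left), embed(right)])
        v = np.tanh(ops[op] @ children)        # combine children per operator
    return v / np.linalg.norm(v)               # unit-normalize each subtree

vec = embed(("and", "a", ("or", "b", "true")))
```

With trained parameters, embeddings of semantically equivalent but syntactically different expressions would be pushed close together; here the sketch only shows the compositional, bottom-up structure of the computation.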
Original language: English
Title of host publication: The 34th International Conference on Machine Learning (ICML 2017)
Place of publication: Sydney, Australia
Number of pages: 9
Publication status: Published - 11 Aug 2017
Event: 34th International Conference on Machine Learning (ICML), 2017 - Sydney, Australia
Duration: 6 Aug 2017 – 11 Aug 2017

Publication series

Name: Proceedings of Machine Learning Research
ISSN (electronic): 2640-3498


Conference: 34th International Conference on Machine Learning (ICML), 2017


