PAC + SMT

Ionela Mocanu, Vaishak Belle, Brendan Juba

Research output: Contribution to conference › Paper › peer-review


To deploy knowledge-based systems in the real world, the challenge of knowledge acquisition must be addressed. Knowledge engineering by hand is a daunting task, so machine learning has been widely proposed as an alternative. However, machine learning has difficulty acquiring rules that feature the kind of exceptions prevalent in real-world knowledge. Moreover, it is conjectured to be impossible to reliably learn representations featuring a desirable level of expressiveness. Works by Khardon and Roth and by Juba proposed solutions to such problems by learning to reason directly, bypassing the intractable step of producing an explicit representation of the learned knowledge. These works focused on Boolean, propositional logics. In this work, we consider such implicit learning to reason for arithmetic theories, including logics handled by satisfiability modulo theories (SMT) solvers. We show that for standard fragments of linear arithmetic, we can learn to reason efficiently. These results are consequences of a more general finding: we show that there is an efficient reduction from the learning-to-reason problem for a logic to any sound and complete solver for that logic.
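The reduction described in the abstract can be illustrated with a small, hypothetical sketch: the "implicit" learner never builds an explicit knowledge representation, but instead answers a query by handing each partial example to a solver oracle and accepting when the query is derivable from all but an epsilon fraction of them. The function names, the query encoding, and the toy interval solver below are illustrative assumptions, not the paper's actual construction; a real instantiation would plug in a sound and complete SMT solver for the linear-arithmetic fragment.

```python
def implicitly_entails(examples, query, solver, epsilon):
    # Sketch of implicit learning to reason (hypothetical interface):
    # accept the query if the solver derives it from at least a
    # (1 - epsilon) fraction of the partial examples. No explicit
    # learned representation is ever produced.
    witnessed = sum(1 for ex in examples if solver(ex, query))
    return witnessed / len(examples) >= 1 - epsilon

def interval_solver(partial, query):
    # Toy stand-in for a sound and complete linear-arithmetic solver.
    # A query (coeffs, bound) encodes sum(c_v * x_v) <= bound.
    # A masked variable makes the query undecidable from this example,
    # so the oracle conservatively answers False.
    coeffs, bound = query
    if any(v not in partial for v in coeffs):
        return False
    return sum(c * partial[v] for v, c in coeffs.items()) <= bound

# Partial examples over one real variable x; the last masks x entirely.
examples = [{"x": 1.0}, {"x": 2.0}, {"x": 3.0}, {}]
query = ({"x": 1.0}, 4.0)  # encodes "x <= 4"
print(implicitly_entails(examples, query, interval_solver, epsilon=0.25))
```

With epsilon = 0.25 the query is witnessed on 3 of 4 examples and is accepted; a stricter epsilon = 0.1 would reject it, reflecting the PAC-style tolerance for masked or exceptional examples.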
Original language: English
Number of pages: 11
Publication status: Published - 13 Dec 2019
Event: Knowledge Representation & Reasoning Meets Machine Learning: Workshop at NeurIPS'19 - Vancouver, Canada
Duration: 13 Dec 2019 - 13 Dec 2019


Workshop: Knowledge Representation & Reasoning Meets Machine Learning
Abbreviated title: KR2ML

