Edinburgh Research Explorer

HOUDINI: Lifelong Learning as Program Synthesis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  • Lazar Valkov
  • Dipak Chaudhari
  • Akash Srivastava
  • Charles Sutton
  • Swarat Chaudhuri

Open Access permissions: Open

Original language: English
Title of host publication: Thirty-second Conference on Neural Information Processing Systems (NIPS 2018)
Place of Publication: Montreal, Canada
Pages: 1-12
Number of pages: 12
Publication status: Published - 2018
Event: Thirty-second Conference on Neural Information Processing Systems - Montreal, Canada
Duration: 3 Dec 2018 - 8 Dec 2018
https://nips.cc/

Conference

Conference: Thirty-second Conference on Neural Information Processing Systems
Abbreviated title: NIPS 2018
Country: Canada
City: Montreal
Period: 3/12/18 - 8/12/18
Internet address: https://nips.cc/

Abstract

We present a neurosymbolic framework for the lifelong learning of algorithmic tasks that mix perception and procedural reasoning. Reusing high-level concepts across domains and learning complex procedures are key challenges in lifelong learning. We show that a program synthesis approach that combines gradient descent with combinatorial search over programs can be a more effective response to these challenges than purely neural methods. Our framework, called HOUDINI, represents neural networks as strongly typed, differentiable functional programs that use symbolic higher-order combinators to compose a library of neural functions. Our learning algorithm consists of: (1) a symbolic program synthesizer that performs a type-directed search over parameterized programs, and decides on the library functions to reuse, and the architectures to combine them, while learning a sequence of tasks; and (2) a neural module that trains these programs using stochastic gradient descent. We evaluate HOUDINI on three benchmarks that combine perception with the algorithmic tasks of counting, summing, and shortest-path computation. Our experiments show that HOUDINI transfers high-level concepts more effectively than traditional transfer learning and progressive neural networks, and that the typed representation of networks significantly accelerates the search.
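
The abstract outlines two cooperating components: a symbolic synthesizer that searches over typed combinator programs, and a neural module that trains the resulting programs by gradient descent. The following minimal sketch is a hypothetical illustration of the first component, not the authors' code; every name in it (programs_of, LIBRARY, map_c, fold_c, classify) is invented here, and the neural module is replaced by a fixed threshold stub where HOUDINI would use a differentiable network trained by SGD.

```python
from functools import reduce

# Types: base types plus function and list constructors, encoded as tuples.
def fn(arg, ret): return ("fn", arg, ret)
def lst(elem):    return ("list", elem)
IMG, INT = "img", "int"

# Typed library of primitives. "classify" stands in for a learned per-image
# neural module (here just a threshold); "add" is plain integer addition.
LIBRARY = {
    "classify": (fn(IMG, INT), lambda x: 1 if x > 0.5 else 0),
    "add":      (fn((INT, INT), INT), lambda a, b: a + b),
}

# Higher-order combinators for composing library functions.
def map_c(f):        return lambda xs: [f(x) for x in xs]
def fold_c(f, init): return lambda xs: reduce(f, xs, init)
def compose(g, f):   return lambda x: g(f(x))

def programs_of(goal, depth=3):
    """Yield (name, implementation) pairs whose type equals `goal`, built
    from LIBRARY entries and the combinators above (depth-bounded)."""
    if depth == 0:
        return
    for name, (ty, impl) in LIBRARY.items():
        if ty == goal:                      # a library function fits as-is
            yield name, impl
    if goal[0] == "fn" and isinstance(goal[1], tuple) and goal[1][0] == "list":
        elem, ret = goal[1][1], goal[2]
        if ret == INT:
            # fold: lift f : (int, elem) -> int to [elem] -> int, seed 0.
            for name, f in programs_of(fn((ret, elem), ret), depth - 1):
                yield f"fold({name}, 0)", fold_c(f, 0)
        # map then reduce: [elem] --map f--> [mid] --g--> ret.
        for mid in (INT, IMG):
            for n_f, f in programs_of(fn(elem, mid), depth - 1):
                for n_g, g in programs_of(fn(lst(mid), ret), depth - 1):
                    yield f"{n_g} . map({n_f})", compose(g, map_c(f))

# Task with goal type [img] -> int: count images whose score exceeds 0.5.
for name, prog in programs_of(fn(lst(IMG), INT)):
    print(name, "=>", prog([0.9, 0.2, 0.7]))
# prints: fold(add, 0) . map(classify) => 2
```

The type discipline is what prunes the search: only compositions whose types line up are ever enumerated, which is the mechanism the abstract credits with significantly accelerating synthesis. In the full system, each candidate program found this way would have its neural components trained end-to-end with stochastic gradient descent, and useful modules would be added to the library for reuse on later tasks.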

ID: 75123290