Bootstrapping compositional generalization with cache-and-reuse

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

People effectively reuse previously learned concepts to construct more complex concepts, and this reuse can sometimes lead to systematically different beliefs when the same evidence is processed in different orders. We model these phenomena with a novel Bayesian concept learning framework that incorporates adaptor grammars to enable a dynamic concept library enriched over time, allowing elements of earlier insights to be cached and later reused in a principled way. Our model accounts for distinctive curriculum-order and conceptual garden-pathing effects in compositional causal generalization that alternative models fail to capture: while people can successfully acquire a complex causal concept when they have an opportunity to cache a key sub-concept, simply reversing the presentation order of the same learning examples induces dramatic failures and leads people to form complex, ad hoc concepts instead. This work explains why information selection alone is not enough to teach complex concepts, and offers a computational account of how past experiences shape future conceptual discoveries.
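The cache-and-reuse mechanism described above can be illustrated with a toy sketch: a concept library where sub-concepts that have proven useful are cached, and the probability of reusing a cached sub-concept grows with how often it has been used before, loosely following the Chinese-restaurant-process caching that adaptor grammars employ. This is a minimal illustration under stated assumptions, not the authors' implementation; the names `ConceptLibrary`, `cache`, and `sample` are hypothetical.

```python
import random


class ConceptLibrary:
    """Toy cache-and-reuse library (illustrative sketch, not the paper's model).

    Cached concepts are reused with probability proportional to their past
    use count, CRP-style, so earlier insights bias later concept construction.
    """

    def __init__(self, alpha=1.0):
        self.alpha = alpha   # concentration: propensity to build fresh concepts
        self.counts = {}     # concept -> number of times it has been (re)used

    def cache(self, concept):
        """Store, or reinforce, a concept after it proves useful."""
        self.counts[concept] = self.counts.get(concept, 0) + 1

    def reuse_probability(self, concept):
        """CRP-style probability of drawing `concept` from the cache."""
        total = sum(self.counts.values())
        return self.counts.get(concept, 0) / (total + self.alpha)

    def sample(self, build_fresh, rng=random):
        """Reuse a cached concept, or fall back to constructing a new one."""
        total = sum(self.counts.values())
        if total and rng.random() < total / (total + self.alpha):
            concepts = list(self.counts)
            weights = [self.counts[c] for c in concepts]
            return rng.choices(concepts, weights=weights)[0]
        concept = build_fresh()   # compose a new concept from scratch
        self.cache(concept)       # new insight enters the library
        return concept
```

On this sketch, a curriculum that lets a key sub-concept be cached early makes it cheap to reuse later, whereas the reversed order forces fresh construction every time — a rough analogue of the order effects the abstract reports.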
Original language: English
Title of host publication: Proceedings of the Computational Cognitive Neuroscience Society Meeting 2023
Publication status: Accepted/In press - 31 May 2023
Event: Conference on Cognitive Computational Neuroscience 2023 - Oxford, United Kingdom
Duration: 24 Aug 2023 – 27 Aug 2023


Conference: Conference on Cognitive Computational Neuroscience 2023
Abbreviated title: CCN 2023
Country/Territory: United Kingdom

Keywords / Materials (for Non-textual outputs)

  • concept learning
  • compositional generalization
  • Bayesian-symbolic models
  • adaptor grammars
  • order effects
  • garden-pathing

