Compression: A lossless mechanism for learning complex structured relational representations

Ekaterina Shurkova, Leonidas A.A. Doumas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

People learn by both decomposing and combining concepts; most accounts of combination are either compositional or conjunctive. We augment the DORA model of representation learning to build new predicate representations by combining (or compressing) existing predicate representations (e.g., building a predicate a_b by combining predicates a and b). The resulting model learns structured relational representations from experience and then combines these relational concepts to form more complex, compressed concepts. We show that the resulting model provides an account of a category learning experiment in which categories are defined as novel combinations of relational concepts.
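The combination step described in the abstract (building a predicate a_b from existing predicates a and b) can be illustrated with a deliberately simplified sketch. This is not DORA's actual mechanism; it only assumes, for illustration, that a predicate is a named set of semantic features and that the compressed predicate retains every feature of both parents, which is what makes the combination lossless.

```python
# Toy illustration (not the DORA implementation): "compressing" two
# predicates into one combined predicate. A predicate is modelled here
# as a name plus a set of hypothetical semantic features.

def compress(pred_a, pred_b):
    """Combine two predicates into a compressed predicate a_b.

    The combined predicate keeps the union of both parents' features,
    so no information from either parent is discarded (lossless).
    """
    name = f"{pred_a['name']}_{pred_b['name']}"
    features = pred_a["features"] | pred_b["features"]
    return {"name": name, "features": features}

a = {"name": "a", "features": {"f1", "f2"}}
b = {"name": "b", "features": {"f2", "f3"}}

ab = compress(a, b)
# ab["name"] == "a_b"; ab["features"] == {"f1", "f2", "f3"}
```

Because the parent feature sets survive intact inside the combined predicate, the original predicates remain recoverable from the compressed one, in the spirit of the "lossless" claim in the title.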
Original language: English
Title of host publication: Proceedings of the 43rd Annual Meeting of the Cognitive Science Society
Subtitle of host publication: Comparative Cognition: Animal Minds, CogSci 2021
Editors: Tecumseh Fitch, Claus Lamm, Helmut Leder, Kristin Teßmar-Raible
Publisher: The Cognitive Science Society
Pages: 293-299
Number of pages: 7
Volume: 43
Publication status: Published - 29 Jul 2021
Event: 43rd Annual Meeting of the Cognitive Science Society: Comparative Cognition: Animal Minds, CogSci 2021 - Virtual, Online, Austria
Duration: 26 Jul 2021 to 29 Jul 2021

Publication series

Name: Proceedings of the Annual Meeting of the Cognitive Science Society
Publisher: The Cognitive Science Society
ISSN (Electronic): 1069-7977

Conference

Conference: 43rd Annual Meeting of the Cognitive Science Society: Comparative Cognition: Animal Minds, CogSci 2021
Country/Territory: Austria
City: Virtual, Online
Period: 26/07/21 to 29/07/21

Keywords / Materials (for Non-textual outputs)

  • chunking
  • comparison
  • compression
  • computational modeling
  • mapping
  • relational categorisation
  • symbolic-connectionist model
