Stimulus-independent neural coding of event semantics: Evidence from cross-sentence fMRI decoding

Aliff Asyraff, Rafael Lemarchand, Andres Tamm, Paul Hoffman

Research output: Contribution to journal › Article › peer-review

Abstract

Multivariate neuroimaging studies indicate that the brain represents word and object concepts in a format that readily generalises across stimuli. Here we investigated whether this was true for neural representations of simple events described using sentences. Participants viewed sentences describing four events in different ways. Multivariate classifiers were trained to discriminate the four events using a subset of sentences, allowing us to test generalisation to novel sentences. We found that neural patterns in a left-lateralised network of frontal, temporal and parietal regions discriminated events in a way that generalised successfully over changes in the syntactic and lexical properties of the sentences used to describe them. In contrast, decoding in visual areas was sentence-specific and failed to generalise to novel sentences. In the reverse analysis, we tested for decoding of syntactic and lexical structure, independent of the event being described. Regions displaying this coding were limited and largely fell outside the canonical semantic network. Our results indicate that a distributed neural network represents the meaning of event sentences in a way that is robust to changes in their structure and form. They suggest that the semantic system disregards the surface properties of stimuli in order to represent their underlying conceptual significance.
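The cross-sentence decoding logic described above — train a classifier on neural patterns evoked by some sentences describing each event, then test whether it generalises to held-out sentences — can be sketched with a toy nearest-centroid classifier. This is a minimal illustration only: the study used multivariate classifiers on real fMRI voxel patterns, whereas the "patterns", event count, variant count, and noise model here are synthetic assumptions.

```python
import random

random.seed(0)
N_VOXELS = 20    # hypothetical voxels per pattern
N_EVENTS = 4     # four described events, as in the study
N_VARIANTS = 3   # hypothetical sentence variants per event

# One base "neural pattern" per event; each sentence variant adds noise,
# a crude stand-in for lexical/syntactic variation across sentences.
base = [[random.gauss(0, 1) for _ in range(N_VOXELS)] for _ in range(N_EVENTS)]
patterns = [[[v + random.gauss(0, 0.1) for v in base[e]]
             for _ in range(N_VARIANTS)] for e in range(N_EVENTS)]

def centroid(vectors):
    # Mean pattern across the training sentences for one event.
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def nearest(x, centroids):
    # Classify a pattern as the event whose centroid is closest (squared Euclidean).
    sq = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(range(len(centroids)), key=lambda e: sq(x, centroids[e]))

# Train on sentence variants 0-1; test generalisation to the unseen variant 2.
train = [centroid(patterns[e][:2]) for e in range(N_EVENTS)]
accuracy = sum(nearest(patterns[e][2], train) == e
               for e in range(N_EVENTS)) / N_EVENTS
print(accuracy)  # 1.0: decoding generalises because event structure dominates noise
```

Above-chance accuracy on held-out sentences is the signature of stimulus-independent coding; sentence-specific coding (as the study found in visual areas) would collapse to chance (0.25 here) under this cross-validation scheme.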
Original language: English
Article number: 118073
Journal: NeuroImage
Volume: 236
Early online date: 18 Apr 2021
DOIs
Publication status: Published - 1 Aug 2021

Keywords

  • conceptual knowledge
  • MVPA
  • semantic cognition
  • sentences
