Orthogonality of Syntax and Semantics within Distributional Spaces

Mark Steedman, Jeffrey Mitchell

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


A recent distributional approach to word-analogy problems (Mikolov et al., 2013b) exploits interesting regularities in the structure of the space of representations. Investigating further, we find that performance on this task can also be related to orthogonality within the space. Explicitly designing such structure into a neural network model results in representations that decompose into orthogonal semantic and syntactic subspaces. We demonstrate that using word-order and morphological structure within English Wikipedia text to enable this decomposition can produce substantial improvements on semantic-similarity, POS-induction and word-analogy tasks.
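The vector-offset analogy method the abstract refers to (Mikolov et al., 2013b) answers "a is to b as c is to ?" by finding the vocabulary word whose vector is closest, by cosine similarity, to b − a + c. A minimal sketch with hypothetical toy embeddings (the 2-D vectors below are illustrative, not from the paper; their two axes stand in for the orthogonal semantic and syntactic subspaces the paper discusses):

```python
import numpy as np

# Hypothetical toy embeddings: axis 0 is a "royalty" dimension, axis 1 a
# "gender" dimension, illustrating two orthogonal subspaces.
vocab = {
    "king":   np.array([1.0,  1.0]),
    "queen":  np.array([1.0, -1.0]),
    "man":    np.array([0.0,  1.0]),
    "woman":  np.array([0.0, -1.0]),
    "person": np.array([0.2,  0.0]),
}

def solve_analogy(a, b, c, vocab):
    """Return the word d maximizing cos(d, b - a + c),
    i.e. 'a is to b as c is to d' via the vector-offset method."""
    target = vocab[b] - vocab[a] + vocab[c]
    best, best_sim = None, -np.inf
    for word, vec in vocab.items():
        if word in (a, b, c):   # exclude the query words themselves
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(solve_analogy("man", "king", "woman", vocab))  # -> queen
```

Here king − man + woman = [1, −1], which coincides with the "queen" vector: the semantic (royalty) component carries over unchanged while only the orthogonal gender component flips, which is the kind of regularity the paper relates to orthogonality in the representation space.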
Original language: English
Title of host publication: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics (ACL)
Publisher: Association for Computational Linguistics
Number of pages: 10
ISBN (Print): 978-1-941643-72-3
Publication status: Published - 2015


