Language Models Based on Semantic Composition

Jeff Mitchell, Mirella Lapata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we propose a novel statistical language model to capture long-range semantic dependencies. Specifically, we apply the concept of semantic composition to the problem of constructing predictive history representations for upcoming words. We also examine the influence of the underlying semantic space on the composition task by comparing spatial semantic representations against topic-based ones. When combined with a standard n-gram language model, the composition models yield reductions in perplexity over the n-gram model alone. We also obtain perplexity reductions when integrating our models with a structured language model.
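The paper itself specifies the composition functions and how they are integrated with the n-gram model. As a rough illustration only, the sketch below shows the general idea in Python: a history of word vectors is composed into a single representation (additive and multiplicative composition in the style of Mitchell and Lapata's earlier work), similarity to candidate word vectors is normalised into a probability, and that probability is linearly interpolated with an n-gram probability. All function names, the dot-product similarity, and the normalisation scheme here are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np

def compose(history_vecs, method="multiplicative"):
    # Fold the history word vectors into one representation.
    # Additive and multiplicative composition are two of the
    # simple vector-composition functions studied in this line
    # of work; the paper evaluates its own specific choices.
    h = history_vecs[0].copy()
    for v in history_vecs[1:]:
        h = h * v if method == "multiplicative" else h + v
    return h

def semantic_prob(history_vecs, vocab_vecs, word_id):
    # Illustrative only: score every vocabulary word by its
    # dot-product similarity to the composed history vector,
    # then normalise the scores into a distribution.
    h = compose(history_vecs)
    sims = vocab_vecs @ h                # similarity to each word
    sims = np.maximum(sims, 0.0) + 1e-12 # keep scores non-negative
    return sims[word_id] / sims.sum()

def interpolated_prob(p_ngram, p_semantic, lam=0.7):
    # Standard linear interpolation of two language-model
    # probabilities; lam is a hypothetical mixing weight.
    return lam * p_ngram + (1.0 - lam) * p_semantic

# Toy usage with random vectors (dimensions chosen arbitrarily):
rng = np.random.default_rng(0)
vocab = rng.random((1000, 50))           # one vector per word
history = [vocab[3], vocab[17], vocab[42]]
p_sem = semantic_prob(history, vocab, word_id=7)
print(interpolated_prob(p_ngram=0.01, p_semantic=p_sem))
```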
Original language: English
Title of host publication: Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing
Publisher: Association for Computational Linguistics
Pages: 430-439
Number of pages: 10
Publication status: Published - 2009
