Incorporating Intra-Query Term Dependencies in an Aspect Query Language Model

Dawei Song, Yanjie Shi, Peng Zhang, Qiang Huang, Udo Kruschwitz, Yuexian Hou, Bo Wang

Research output: Contribution to journal › Article › peer-review


Query language modeling based on relevance feedback has been widely applied to improve the effectiveness of information retrieval. However, intra-query term dependencies (i.e., the dependencies between different query terms and term combinations) have not yet been sufficiently addressed in the existing approaches. This article aims to investigate this issue within a comprehensive framework, namely the Aspect Query Language Model (AM). We propose to extend the AM with a hidden Markov model (HMM) structure to incorporate the intra-query term dependencies and learn the structure of a novel aspect HMM (AHMM) for query language modeling. In the proposed AHMM, the combinations of query terms are viewed as latent variables representing query aspects. They further form an ergodic HMM, where the dependencies between latent variables (nodes) are modeled as the transition probabilities. The segmented chunks from the feedback documents are treated as the observables of the HMM. The AHMM structure is then optimized through HMM parameter estimation, which yields the prior of the latent variables and the probability distribution of the observed chunks. Our extensive experiments on three large-scale Text REtrieval Conference (TREC) collections have shown that our method not only significantly outperforms a number of strong baselines in terms of both effectiveness and robustness but also achieves better results than the AM and another state-of-the-art approach, namely the latent concept expansion model.
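To make the ergodic-HMM view concrete, the following is a minimal sketch (not the paper's actual implementation) of the structure the abstract describes: hidden states stand for query aspects (combinations of query terms), observations are segmented chunks from feedback documents reduced to vocabulary ids, and a scaled forward pass scores a chunk sequence under the model. All names, sizes, and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_aspects = 3    # latent states, e.g. aspects {t1}, {t2}, {t1, t2} (illustrative)
vocab_size = 5   # distinct chunk types observed in the feedback documents

# Ergodic structure: every aspect may transition to every other aspect.
pi = np.full(n_aspects, 1.0 / n_aspects)                # prior over aspects
A = rng.dirichlet(np.ones(n_aspects), size=n_aspects)   # transition probabilities
B = rng.dirichlet(np.ones(vocab_size), size=n_aspects)  # chunk emission distributions

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observed chunk sequence via a scaled forward pass."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate through transitions, then emit
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

chunks = [0, 2, 2, 4, 1]                 # toy sequence of chunk ids
ll = forward_loglik(chunks, pi, A, B)
```

In a full implementation, the transition and emission parameters would be estimated from the feedback documents (e.g. with Baum–Welch) rather than drawn at random as they are here.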
Original language: English
Number of pages: 22
Journal: Computational Intelligence
Publication status: Published - 1 Oct 2014
