Learning Structural Kernels for Natural Language Processing

Daniel Beck, Trevor Cohn, Christian Hardmeier, Lucia Specia

Research output: Contribution to journal › Article › peer-review

Abstract

Structural kernels are a flexible learning paradigm that has been widely used in Natural Language Processing. However, the problem of model selection in kernel-based methods is usually overlooked. Previous approaches mostly rely on setting default values for kernel hyperparameters or using grid search, which is slow and coarse-grained. In contrast, Bayesian methods allow efficient model selection by maximizing the evidence on the training data through gradient-based methods. In this paper we show how to perform this in the context of structural kernels by using Gaussian Processes. Experimental results on tree kernels show that this procedure results in better prediction performance compared to hyperparameter optimization via grid search. The framework proposed in this paper can be adapted to other structures besides trees, e.g., strings and graphs, thereby extending the utility of kernel-based methods.
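
To make the procedure described in the abstract concrete, below is a minimal, hypothetical sketch of evidence (log marginal likelihood) maximization for Gaussian Process regression. It uses a simple RBF kernel over vectors as a stand-in for the paper's structural tree kernels, and the data, noise level, and optimizer settings are illustrative assumptions rather than the authors' implementation. In the paper's setting, the RBF kernel would be replaced by a tree kernel whose hyperparameters (e.g., decay parameters) are optimized in the same way.

import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X1, X2, lengthscale):
    # Squared-exponential kernel; a stand-in for a structural (tree) kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_evidence(params, X, y, noise=0.1):
    # Negative log marginal likelihood of GP regression:
    #   log p(y|X) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2*pi)
    lengthscale = np.exp(params[0])  # optimize in log space for positivity
    K = rbf_kernel(X, X, lengthscale) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    return (0.5 * y @ alpha
            + np.log(np.diag(L)).sum()          # 1/2 log|K|
            + 0.5 * len(X) * np.log(2 * np.pi))

# Toy data (hypothetical). Gradient-based optimization of the evidence
# replaces slow, coarse-grained grid search over kernel hyperparameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)
res = minimize(neg_log_evidence, x0=[0.0], args=(X, y), method="L-BFGS-B")
print("learned lengthscale:", np.exp(res.x[0]))
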
Original language: English
Pages (from-to): 461-473
Number of pages: 13
Journal: Transactions of the Association for Computational Linguistics
Volume: 3
Publication status: Published - 2015
