Learning Latent Trees with Stochastic Perturbations and Differentiable Dynamic Programming

Caio Corro, Ivan Titov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We treat projective dependency trees as latent variables in our probabilistic model and induce them in such a way as to be beneficial for a downstream task, without relying on any direct tree supervision. Our approach relies on Gumbel perturbations and differentiable dynamic programming. Unlike previous approaches to latent tree learning, we stochastically sample global structures and our parser is fully differentiable.

We illustrate its effectiveness on sentiment analysis and natural language inference tasks. We also study its properties on a synthetic structure induction task. Ablation studies emphasize the importance of both stochasticity and constraining latent structures to be projective trees.
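The core building block described in the abstract, Gumbel perturbations enabling differentiable stochastic selection, can be illustrated in miniature with the Gumbel-softmax trick over a flat categorical. This is a hedged sketch only: the paper's method samples global projective trees via differentiable dynamic programming, whereas the function names and the flat-categorical setting below are simplifications for illustration.

```python
import numpy as np

def sample_gumbel(shape, rng):
    """Draw standard Gumbel(0, 1) noise via -log(-log(U))."""
    u = rng.uniform(low=1e-9, high=1.0, size=shape)
    return -np.log(-np.log(u))

def gumbel_softmax(scores, rng, temperature=1.0):
    """Continuous, differentiable relaxation of sampling an argmax:
    perturb the scores with Gumbel noise, then apply a tempered softmax.
    As temperature -> 0 the output approaches a one-hot sample."""
    g = scores + sample_gumbel(scores.shape, rng)
    z = (g - g.max()) / temperature          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
scores = np.array([2.0, 1.0, 0.5])           # unnormalised scores (logits)
p = gumbel_softmax(scores, rng, temperature=0.5)  # soft "sample", sums to 1
```

Because the perturbation is injected as additive noise and the softmax is smooth, gradients flow through the sampling step, which is what allows end-to-end training without tree supervision in the full model.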

Original language: English
Title of host publication: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (long papers)
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Place of Publication: Florence, Italy
Publisher: ACL Anthology
Pages: 5508–5521
Number of pages: 14
Volume: 1
ISBN (Print): 978-1-950737-48-2
Publication status: E-pub ahead of print - 2 Aug 2019
Event: 57th Annual Meeting of the Association for Computational Linguistics - Fortezza da Basso, Florence, Italy
Duration: 28 Jul 2019 – 2 Aug 2019
Conference number: 57
http://www.acl2019.org/EN/index.xhtml

Conference

Conference: 57th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2019
Country/Territory: Italy
City: Florence
Period: 28/07/19 – 02/08/19
