We treat projective dependency trees as latent variables in our probabilistic model and induce them in such a way as to be beneficial for a downstream task, without relying on any direct tree supervision. Our approach relies on Gumbel perturbations and differentiable dynamic programming. Unlike previous approaches to latent tree learning, we stochastically sample global structures and our parser is fully differentiable.
We illustrate its effectiveness on sentiment analysis and natural language inference tasks. We also study its properties on a synthetic structure induction task. Ablation studies emphasize the importance of both stochasticity and constraining latent structures to be projective trees.
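The two ingredients named in the abstract can be sketched in isolation. Below is a minimal, illustrative Python example (not the paper's implementation): the Gumbel-max trick, where adding Gumbel noise to scores and taking an argmax yields an exact sample from the corresponding softmax distribution, and a differentiable dynamic program on a simple chain, where replacing Viterbi's max with logsumexp makes the recursion smooth in the scores. The paper applies the same ideas to the dynamic program over projective dependency trees; the chain here is only a stand-in, and all names (`sample_gumbel`, `soft_best_sequence`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gumbel(shape):
    # Standard Gumbel noise: -log(-log(U)) with U ~ Uniform(0, 1).
    u = rng.uniform(1e-9, 1.0, size=shape)
    return -np.log(-np.log(u))

# Gumbel-max trick: the argmax of the perturbed scores is an exact
# sample from the categorical distribution softmax(scores).
scores = np.array([1.0, 2.0, 0.5])
sample = int(np.argmax(scores + sample_gumbel(scores.shape)))

def logsumexp(x, axis=None):
    # Numerically stable log-sum-exp via pairwise reduction.
    return np.logaddexp.reduce(x, axis=axis)

def soft_best_sequence(emit, trans):
    # Forward recursion on a chain with emission scores emit (T, K)
    # and transition scores trans (K, K). Replacing Viterbi's max
    # with logsumexp yields the log-partition function, a smooth
    # (hence differentiable) function of all the scores.
    alpha = emit[0]                                   # (K,) at step 0
    for t in range(1, emit.shape[0]):
        alpha = logsumexp(alpha[:, None] + trans, axis=0) + emit[t]
    return logsumexp(alpha)
```

Because every operation above is differentiable (the argmax aside), gradients can flow from a downstream loss back through the dynamic program into the scoring model, which is what lets latent structures be trained without direct tree supervision.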
|Title of host publication||Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (Long Papers)|
|Editors||Anna Korhonen, David Traum, Lluís Màrquez|
|Place of Publication||Florence, Italy|
|Number of pages||14|
|Publication status||E-pub ahead of print - 2 Aug 2019|
|Event||57th Annual Meeting of the Association for Computational Linguistics - Fortezza da Basso, Florence, Italy|
Duration: 28 Jul 2019 → 2 Aug 2019
Conference number: 57
|Conference||57th Annual Meeting of the Association for Computational Linguistics|
|Abbreviated title||ACL 2019|
|Period||28/07/19 → 2/08/19|
|Title||Learning Latent Trees with Stochastic Perturbations and Differentiable Dynamic Programming|