Abstract
Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks. In this paper we develop Tree Long Short-Term Memory (TREELSTM), a neural network model based on LSTM, which is designed to predict a tree rather than a linear sequence. TREELSTM defines the probability of a sentence by estimating the generation probability of its dependency tree. At each time step, a node is generated based on the representation of the generated subtree. We further enhance the modeling power of TREELSTM by explicitly representing the correlations between left and right dependents. Applying our model to the MSR sentence completion challenge yields results beyond the current state of the art. We also report competitive results on dependency parsing reranking.
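The factorization the abstract describes can be sketched as follows; the notation here is illustrative and chosen for this summary, not taken from the paper:

```latex
% Illustrative sketch, assuming the tree is generated node by node:
% the probability of a sentence S is the generation probability of its
% dependency tree T(S), factorized over generation steps, where w_t is
% the node generated at step t and T_{<t} is the subtree built so far.
p(S) \;=\; p\bigl(T(S)\bigr) \;=\; \prod_{t=1}^{|T(S)|} p\bigl(w_t \mid T_{<t}\bigr)
```

Under this reading, each conditional $p(w_t \mid T_{<t})$ is computed from an LSTM-based representation of the partially generated subtree, with left and right dependents modeled so as to capture their correlations.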
Original language | English |
---|---|
Title of host publication | The 15th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
Publisher | Association for Computational Linguistics |
Pages | 310-320 |
Number of pages | 11 |
ISBN (Print) | 978-1-941643-91-4 |
Publication status | Published - Jun 2016 |
Event | 15th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, United States, 12 Jun 2016 → 17 Jun 2016. http://naacl.org/naacl-hlt-2016/ |
Conference
Conference | 15th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
---|---|
Abbreviated title | NAACL HLT 2016 |
Country | United States |
City | San Diego |
Period | 12/06/16 → 17/06/16 |
Internet address | http://naacl.org/naacl-hlt-2016/ |