Abstract
We introduce an LSTM-based method for dynamically integrating several word-prediction experts into a conditional language model that can perform well on several subtasks simultaneously. We illustrate this general approach with an application to dialogue, where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise information from a knowledge base, and show how the integration combines the strengths of the independent components. We hope that this focused contribution will draw attention to the benefits of using such mixtures of experts in NLP and, specifically, in dialogue systems.
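The core idea described above can be illustrated with a minimal sketch: two "experts" each output a next-word distribution, and a gating weight blends them into one mixture distribution. In the paper the gate is computed dynamically by an LSTM from the dialogue context; here it is a fixed scalar, and the vocabulary and expert probabilities are toy values invented purely for illustration, not taken from the paper.

```python
# Toy sketch of mixing two word-prediction experts (assumed values,
# not the paper's code or data).

vocab = ["hello", "paris", "capital", "<eos>"]

# Assumed toy outputs: a chat expert favours small talk,
# a question-answering expert favours factual tokens.
p_chat = [0.70, 0.05, 0.05, 0.20]
p_qa   = [0.05, 0.60, 0.30, 0.05]

def mixture(p_a, p_b, alpha):
    """Convex combination of two expert distributions.

    In the paper's setting, alpha would be produced per time step
    by an LSTM conditioned on the dialogue context.
    """
    return [alpha * a + (1.0 - alpha) * b for a, b in zip(p_a, p_b)]

# A factual question should push the gate toward the QA expert,
# so we use a small alpha (low weight on the chat expert).
p_mix = mixture(p_chat, p_qa, alpha=0.3)
best = vocab[max(range(len(p_mix)), key=p_mix.__getitem__)]
print(best)  # prints "paris"
```

Because the mixture is a convex combination, the result is itself a valid probability distribution, which is what lets the combined model be trained and decoded like any single conditional language model.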
| Field | Value |
| --- | --- |
| Original language | English |
| Title of host publication | Proceedings of the 1st Workshop on Representation Learning for NLP |
| Place of publication | Berlin, Germany |
| Publisher | ACL Anthology |
| Pages | 94-99 |
| Number of pages | 6 |
| DOIs | |
| Publication status | Published - 11 Aug 2016 |
| Event | 1st Workshop on Representation Learning for NLP, Berlin, Germany, 11 Aug 2016 → 11 Aug 2016, https://sites.google.com/site/repl4nlp2016/ |
Conference

| Field | Value |
| --- | --- |
| Conference | 1st Workshop on Representation Learning for NLP |
| Abbreviated title | RepL4NLP 2016 |
| Country/Territory | Germany |
| City | Berlin |
| Period | 11/08/16 → 11/08/16 |
| Internet address | https://sites.google.com/site/repl4nlp2016/ |