LSTM-based Mixture-of-Experts for Knowledge-Aware Dialogues

Phong Le, Marc Dymetman, Jean-Michel Renders

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

We introduce an LSTM-based method for dynamically integrating several word-prediction experts to obtain a conditional language model that can simultaneously be good at several subtasks. We illustrate this general approach with an application to dialogue, where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise information from a knowledge base, and show how the integration combines the strengths of the independent components. We hope that this focused contribution will draw attention to the benefits of using such mixtures of experts in NLP, and in dialogue systems specifically.
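
The abstract describes gating several word-prediction experts with a mixture weight computed by an LSTM over the dialogue context. As a rough illustration only (not the authors' implementation; the layer sizes, names, and PyTorch framing below are assumptions), a minimal sketch of such a gated two-expert mixture could look like this:

import torch
import torch.nn as nn

class GatedMixture(nn.Module):
    # Hypothetical module: an LSTM reads the context tokens and emits a
    # per-step scalar gate alpha that mixes two experts' next-word
    # distributions (e.g. a chat expert and a question-answering expert).
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.gate = nn.Linear(hidden_dim, 1)

    def forward(self, tokens, p_chat, p_qa):
        # tokens: (batch, seq); p_chat, p_qa: (batch, seq, vocab),
        # assumed to be the two experts' precomputed distributions.
        h, _ = self.lstm(self.embed(tokens))
        alpha = torch.sigmoid(self.gate(h))           # (batch, seq, 1)
        # Convex combination, so the result is still a distribution.
        return alpha * p_chat + (1.0 - alpha) * p_qa

# Toy usage with random stand-in expert outputs.
vocab = 1000
model = GatedMixture(vocab)
tokens = torch.randint(0, vocab, (2, 5))
p_chat = torch.softmax(torch.randn(2, 5, vocab), dim=-1)
p_qa = torch.softmax(torch.randn(2, 5, vocab), dim=-1)
p_mix = model(tokens, p_chat, p_qa)

Because the gate outputs a convex combination, the mixed output remains a valid probability distribution at every step; the paper's actual architecture and training procedure are described in the publication itself.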
Original language: English
Title of host publication: Proceedings of the 1st Workshop on Representation Learning for NLP
Place of Publication: Berlin, Germany
Publisher: ACL Anthology
Pages: 94-99
Number of pages: 6
DOIs:
Publication status: Published - 11 Aug 2016
Event: 1st Workshop on Representation Learning for NLP - Berlin, Germany
Duration: 11 Aug 2016 - 11 Aug 2016
https://sites.google.com/site/repl4nlp2016/

Conference

Conference: 1st Workshop on Representation Learning for NLP
Abbreviated title: RepL4NLP 2016
Country/Territory: Germany
City: Berlin
Period: 11/08/16 - 11/08/16
Internet address: https://sites.google.com/site/repl4nlp2016/
