Abstract
We propose a model for Chinese poem generation based on recurrent neural networks, which we argue is ideally suited to capturing poetic content and form. Our generator jointly performs content selection ("what to say") and surface realization ("how to say") by learning representations of individual characters and their combinations into one or more lines, as well as how these mutually reinforce and constrain each other. Poem lines are generated incrementally by taking into account the entire history of what has been generated so far, rather than the limited horizon imposed by the previous line or lexical n-grams. Experimental results show that our model outperforms competitive Chinese poetry generation systems using both automatic and manual evaluation methods.
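To make the incremental, history-conditioned generation described above concrete, below is a minimal sketch of the core idea: each poem line is generated character by character while conditioning on an RNN encoding of the entire history of previously generated lines, not just the immediately preceding one. This is not the authors' implementation; the class name, layer sizes, and toy vocabulary are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HistoryConditionedGenerator(nn.Module):
    """Sketch of an RNN generator conditioned on the full poem history."""

    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder RNN summarizes every character generated so far (all lines).
        self.history_rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Decoder RNN emits the next line one character at a time,
        # initialized from the history encoding.
        self.decoder_rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, history_ids, line_ids):
        # history_ids: (batch, history_len) -- all characters of lines 1..i-1
        # line_ids:    (batch, line_len)    -- characters of line i so far
        _, h = self.history_rnn(self.embed(history_ids))      # (1, batch, hidden)
        dec_out, _ = self.decoder_rnn(self.embed(line_ids), h)
        return self.out(dec_out)                               # (batch, line_len, vocab)

# Toy usage: score the next character of a line given the full poem history.
vocab_size = 50
model = HistoryConditionedGenerator(vocab_size)
history = torch.randint(0, vocab_size, (1, 14))   # e.g. two previous 7-character lines
line_so_far = torch.randint(0, vocab_size, (1, 3))
logits = model(history, line_so_far)
next_char_probs = torch.softmax(logits[:, -1], dim=-1)
print(next_char_probs.shape)  # torch.Size([1, 50])
```

In this sketch, feeding the decoder a hidden state derived from all earlier lines is what distinguishes history-conditioned generation from a model limited to the previous line or lexical n-grams.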
Original language | English
---|---
Title of host publication | Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Place of Publication | Doha, Qatar
Publisher | Association for Computational Linguistics
Pages | 670-680
Number of pages | 11
Publication status | Published - 1 Oct 2014