Learning to Generate Product Reviews from Attributes

Li Dong, Shaohan Huang, Furu Wei, Mirella Lapata, Ming Zhou, Ke Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Automatically generating product reviews is a meaningful yet under-studied task in sentiment analysis. Traditional natural language generation methods rely extensively on hand-crafted rules and predefined templates. This paper presents an attention-enhanced attribute-to-sequence model that generates product reviews from given attribute information, such as user, product, and rating. The attribute encoder learns to represent the input attributes as vectors, and the sequence decoder then generates reviews by conditioning its output on these vectors. We also introduce an attention mechanism to jointly generate reviews and align words with the input attributes. The proposed model is trained end-to-end to maximize the likelihood of the target product reviews given the attributes. We build a publicly available dataset for the review generation task by leveraging Amazon book reviews and their metadata. Experiments on this dataset show that our approach outperforms baseline methods and that the attention mechanism significantly improves the performance of our model.
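
The abstract describes the architecture only at a high level: an attribute encoder that embeds the discrete attributes, a sequence decoder conditioned on those vectors, and an attention mechanism over the attribute representations, trained end-to-end by maximum likelihood. The sketch below is one way such an attention-enhanced attribute-to-sequence model could be wired up in PyTorch; the class name Attr2Seq, the dimensions, and the additive-style attention scoring are illustrative assumptions rather than the authors' released implementation.

import torch
import torch.nn as nn

class Attr2Seq(nn.Module):
    """Minimal attribute-to-sequence sketch: embed discrete attributes
    (user, product, rating) and attend over them while decoding review tokens."""
    def __init__(self, n_users, n_products, n_ratings, vocab_size,
                 attr_dim=64, hidden_dim=128):
        super().__init__()
        # One embedding table per attribute; the three vectors form the
        # "memory" the decoder attends over.
        self.user_emb = nn.Embedding(n_users, attr_dim)
        self.prod_emb = nn.Embedding(n_products, attr_dim)
        self.rate_emb = nn.Embedding(n_ratings, attr_dim)
        self.word_emb = nn.Embedding(vocab_size, attr_dim)
        # Decoder LSTM whose initial state is computed from the attributes.
        self.init_h = nn.Linear(3 * attr_dim, hidden_dim)
        self.decoder = nn.LSTM(attr_dim, hidden_dim, batch_first=True)
        # Additive-style attention over the three attribute vectors.
        self.attn_score = nn.Linear(hidden_dim + attr_dim, 1)
        self.out = nn.Linear(hidden_dim + attr_dim, vocab_size)

    def forward(self, user, product, rating, review_in):
        # Encode attributes as a (batch, 3, attr_dim) memory.
        attrs = torch.stack([self.user_emb(user),
                             self.prod_emb(product),
                             self.rate_emb(rating)], dim=1)
        # Initialize the decoder state from the concatenated attribute vectors.
        h0 = torch.tanh(self.init_h(attrs.flatten(1))).unsqueeze(0)
        c0 = torch.zeros_like(h0)
        # Teacher-forced decoding over the gold review prefix.
        dec_out, _ = self.decoder(self.word_emb(review_in), (h0, c0))
        # Attention: score each attribute vector against each decoder state.
        B, T, H = dec_out.shape
        q = dec_out.unsqueeze(2).expand(B, T, 3, H)
        k = attrs.unsqueeze(1).expand(B, T, 3, attrs.size(-1))
        scores = self.attn_score(torch.cat([q, k], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)              # (B, T, 3)
        context = torch.einsum('bta,bad->btd', weights, attrs)
        # Predict the next token from the decoder state and attended context.
        return self.out(torch.cat([dec_out, context], dim=-1))

Training such a sketch would minimize token-level cross-entropy between the returned logits and the review shifted by one position, which corresponds to the maximum-likelihood objective stated in the abstract.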
Original language: English
Title of host publication: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Publisher: Association for Computational Linguistics
Pages: 623-632
Number of pages: 10
ISBN (Print): 978-1-945626-34-0
Publication status: Published - 7 Apr 2017
Event: 15th EACL 2017 Software Demonstrations - Valencia, Spain
Duration: 3 Apr 2017 - 7 Apr 2017
http://eacl2017.org/
http://eacl2017.org/index.php

Conference

Conference: 15th EACL 2017 Software Demonstrations
Abbreviated title: EACL 2017
Country/Territory: Spain
City: Valencia
Period: 3/04/17 - 7/04/17
Internet address: http://eacl2017.org/
