Language to logical form with neural attention

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Semantic parsing aims at mapping natural language to machine-interpretable meaning representations. Traditional approaches rely on high-quality lexicons, manually-built templates, and linguistic features which are either domain- or representation-specific. In this paper we present a general method based on an attention-enhanced encoder-decoder model. We encode input utterances into vector representations, and generate their logical forms by conditioning the output sequences or trees on the encoding vectors. Experimental results on four datasets show that our approach performs competitively without using hand-engineered features and is easy to adapt across domains and meaning representations.
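To make the abstract's architecture concrete, below is a minimal sketch of an attention-enhanced encoder-decoder in PyTorch: an utterance is encoded into a sequence of vectors, and each logical-form token is predicted conditioned on an attention-weighted combination of those vectors. This is an illustration only, not the paper's implementation; the class and parameter names (Seq2SeqWithAttention, dim) are hypothetical, simple dot-product attention is assumed, and the paper's tree-structured decoder is omitted.

import torch
import torch.nn as nn

class Seq2SeqWithAttention(nn.Module):
    """Minimal attention-enhanced encoder-decoder (illustrative sketch):
    encode the input utterance, then condition each output token on the
    encoder states via dot-product attention."""

    def __init__(self, src_vocab, tgt_vocab, dim=64):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.decoder = nn.LSTM(dim, dim, batch_first=True)
        # Merges each decoder state with its attention context vector.
        self.combine = nn.Linear(2 * dim, dim)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the utterance into a sequence of vector representations.
        enc_states, hidden = self.encoder(self.src_embed(src))
        # Decode the logical-form prefix, initialized from the encoder.
        dec_states, _ = self.decoder(self.tgt_embed(tgt), hidden)
        # Dot-product attention: score every encoder state for each step.
        scores = torch.bmm(dec_states, enc_states.transpose(1, 2))
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_states)
        # Condition the prediction on decoder state plus attention context.
        merged = torch.tanh(self.combine(torch.cat([dec_states, context], dim=-1)))
        return self.out(merged)  # logits over logical-form tokens

# Toy usage: a batch of 2 utterances (length 5) paired with
# logical-form token sequences (length 7).
model = Seq2SeqWithAttention(src_vocab=100, tgt_vocab=50)
src = torch.randint(0, 100, (2, 5))
tgt = torch.randint(0, 50, (2, 7))
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 7, 50])

Replacing the flat token-sequence decoder with a hierarchical one that emits subtrees is the main extension the paper proposes beyond this vanilla setup.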

Original language: English
Title of host publication: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 33-43
Number of pages: 11
Volume: 1
ISBN (Electronic): 978-1-945626-00-5
DOIs
Publication status: Published - 12 Aug 2016
Event: 54th Annual Meeting of the Association for Computational Linguistics - Berlin, Germany
Duration: 7 Aug 2016 - 12 Aug 2016
https://mirror.aclweb.org/acl2016/

Conference

Conference: 54th Annual Meeting of the Association for Computational Linguistics
Abbreviated title: ACL 2016
Country/Territory: Germany
City: Berlin
Period: 7/08/16 - 12/08/16
Internet address: https://mirror.aclweb.org/acl2016/
