Neural sentence generation from formal semantics

Kana Manome, Masashi Yoshikawa, Hitomi Yanaka, Pascual Martínez-Gómez, Koji Mineshima, Daisuke Bekki

Research output: Conference contribution

1 citation (Scopus)

Abstract

Sequence-to-sequence models have shown strong performance in a wide range of NLP tasks, yet their applications to sentence generation from logical representations are underdeveloped. In this paper, we present a sequence-to-sequence model for generating sentences from logical meaning representations based on event semantics. We use a semantic parsing system based on Combinatory Categorial Grammar (CCG) to obtain data annotated with logical formulas. We augment our sequence-to-sequence model with masking for predicates to constrain output sentences. We also propose a novel evaluation method for generation using Recognizing Textual Entailment (RTE). Combining parsing and generation, we test whether or not the output sentence entails the original text and vice versa. Experiments showed that our model outperformed a baseline with respect to both BLEU scores and accuracies in RTE.
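To illustrate the predicate-masking idea described in the abstract, the sketch below shows one way a decoder's output distribution could be restricted to content words whose predicates occur in the input logical formula (plus ordinary function words). This is a minimal illustration, not the authors' implementation; the function names (build_mask, masked_decode_step) and the toy vocabulary are hypothetical, and in the actual model the mask would be applied inside a trained encoder-decoder rather than to random logits.

```python
import numpy as np

def build_mask(vocab, formula_predicates, function_words):
    """Return a 0/1 mask over the vocabulary: 1 = token may be generated."""
    allowed = set(formula_predicates) | set(function_words)
    return np.array([1.0 if tok in allowed else 0.0 for tok in vocab])

def masked_decode_step(logits, mask):
    """Zero out disallowed tokens by setting their logits to -inf before softmax."""
    masked = np.where(mask > 0, logits, -np.inf)
    probs = np.exp(masked - masked.max())
    return probs / probs.sum()

# Toy example: only predicates from the input formula and function words survive.
vocab = ["a", "dog", "runs", "cat", "sleeps", "<eos>"]
mask = build_mask(vocab,
                  formula_predicates={"dog", "runs"},
                  function_words={"a", "<eos>"})
step_probs = masked_decode_step(np.random.randn(len(vocab)), mask)
print(dict(zip(vocab, step_probs.round(3))))  # "cat" and "sleeps" get probability 0
```

The RTE-based evaluation mentioned in the abstract is conceptually simpler: given the source sentence and the generated sentence, check entailment in both directions and count the output as correct only if both checks succeed.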

Original language: English
Host publication title: INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference
Publisher: Association for Computational Linguistics (ACL)
Pages: 408-414
Number of pages: 7
ISBN (electronic): 9781948087865
Publication status: Published - 2018
Externally published: Yes
Event: 11th International Natural Language Generation Conference, INLG 2018 - Tilburg, Netherlands
Duration: 5 Nov 2018 - 8 Nov 2018

Publication series

Name: INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference

Conference

Conference: 11th International Natural Language Generation Conference, INLG 2018
Country/Territory: Netherlands
City: Tilburg
Period: 5 Nov 2018 - 8 Nov 2018

ASJC Scopus subject areas

  • Software
