DSNNLG 2019 : 1st Workshop on Discourse Structure in Neural NLG
Call For Papers
November 1, Tokyo, Japan
This workshop will be held as part of the 12th International Conference on Natural Language Generation (INLG2019), which is supported by the ACL Special Interest Group on Natural Language Generation.
Neural methods for natural language generation (NNLG) arrived with much fanfare a few years ago and became the dominant approach in the E2E NLG Challenge (Dušek et al., 2018). While neural methods promise flexible, end-to-end trainable models, recent studies have revealed their inability to produce satisfactory output for longer or more complex texts (Wiseman et al., 2017), and their black-box nature makes them difficult to control, in contrast to traditional NLG architectures that use explicit representations of discourse structure and/or sentence planning operations (Reed et al., 2018). As a result, several papers have recently appeared (Puduppully et al., 2019; Moryossef et al., 2019; Balakrishnan et al., 2019; Oraby et al., 2019; Yi et al., 2019) that explore how to incorporate intermediate structures into NNLG or otherwise improve coherence and cohesion.
This workshop aims to encourage further research on improving discourse coherence and cohesion in NNLG, along with ways to make NNLG models easier to control. DSNNLG 2019 will be a one-day workshop, co-located with INLG 2019 in Tokyo, Japan. We plan to organize a mix of sessions, including talks by invited speakers, oral and poster presentations of submitted work, and a panel session.
We invite papers of a theoretical and experimental nature on discourse structure and controllability of neural models for natural language generation (NNLG). Relevant topics include (but are not limited to):
* Limits of current end-to-end NNLG
- Limits w.r.t. sentence planning and discourse structure
- Limits w.r.t. control and interpretability
* Methods for improving discourse coherence and cohesion in NNLG
- Handling discourse connectives
- Lexical choice, aggregation and generating referring expressions
- Avoiding unnecessary repetition or enhancing contextually appropriate diversity
* Representations and model structure in NNLG
- Use of more explicit guidance or structure in the input
- Use of intermediate structure
- End-to-end vs. modular systems
* Evaluation of discourse coherence and cohesion in NNLG
IMPORTANT DATES

Submission deadline: 4 September 2019
Notification of acceptance: 1 October 2019
Camera-ready papers due: 15 October 2019
Workshop date: 1 November 2019
SUBMISSIONS

We welcome three categories of regular papers:
1. Long papers are most appropriate for presenting substantial research results and must not exceed eight (8) pages of content (with any number of additional pages for references).
2. Short papers are more appropriate for presenting an ongoing research effort and must not exceed four (4) pages of content (with any number of additional pages for references).
3. Demo papers should be no more than two (2) pages in length, including references, and should describe implemented systems. Authors of demo papers should be willing to present a demo of their system during the workshop.
Submissions should follow the new ACL Author Guidelines (https://www.aclweb.org/adminwiki/index.php?title=ACL_Author_Guidelines) and policies for submission, review and citation, and be anonymized for double-blind reviewing. To avoid overtaxing limited reviewer resources, dual submission to other venues (conferences/workshops/journals) is not permitted, though cross-submissions are allowed (see below).
ACL 2019 offers both LaTeX style files (http://www.acl2019.org/medias/340-acl2019-latex.zip) and Microsoft Word templates (http://www.acl2019.org/medias/341-acl2019-word.zip). Papers should be submitted electronically through the START conference management system:
Regular papers will be included in the workshop proceedings and appear in the ACL Anthology.
We also welcome two categories of non-archival submissions:
1. Extended abstracts of no more than 2 pages are appropriate for preliminary but interesting unpublished ideas that would benefit from additional exposure and discussion.
2. Cross-submissions are appropriate for papers on relevant topics which have appeared in or been submitted to alternative venues (such as other NLP or ML conferences). Authors should indicate the alternative venue in the Acknowledgments section.
Selection of non-archival submissions will be determined solely by the organizing committee based on their relevance to the workshop aims. Review of non-archival submissions will not be anonymous. Selected submissions will be presented as posters at the workshop and will not appear in the proceedings.
ORGANIZERS

Anusha Balakrishnan, Facebook AI
Vera Demberg, Saarland University
Chandra Khatri, Uber AI
Abhinav Rastogi, Google AI
Donia Scott, Scott Rush Associates
Marilyn Walker, UC Santa Cruz
Michael White, Ohio State University / Facebook AI
PROGRAM COMMITTEE

Paul Crook, Facebook AI
Alessandra Cervone, University of Trento
Claire Gardent, French National Centre for Scientific Research (CNRS)
Behnam Hedayatnia, Amazon Alexa AI
Dave Howcroft, Saarland University
Emiel Krahmer, Tilburg University
Shereen Oraby, University of California - Santa Cruz
Cecile Paris, CSIRO
Owen Rambow, Elemental Cognition
Alexander Rush, Harvard University
Frank Schilder, Thomson Reuters
Rajen Subba, Facebook AI
David Winer, University of Utah
Sam Wiseman, Toyota Technological Institute at Chicago
Amir Zeldes, Georgetown University
Yi-Chia Wang, Uber AI
REFERENCES

Anusha Balakrishnan, Jinfeng Rao, Kartikeya Upasani, Michael White and Rajen Subba. 2019. Constrained Decoding for Neural NLG from Compositional Representations in Task-Oriented Dialogue. To appear in Proc. of ACL-19.
Ondřej Dušek, Jekaterina Novikova and Verena Rieser. 2018. Findings of the E2E NLG Challenge. In Proc. of INLG-18.
Amit Moryossef, Yoav Goldberg and Ido Dagan. 2019. Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation. To appear in Proc. of NAACL-19.
Ratish Puduppully, Li Dong and Mirella Lapata. 2019. Data-to-Text Generation with Content Selection and Planning. In Proc. of AAAI-19.
Shereen Oraby, Vrindavan Harrison and Marilyn Walker. 2019. Curate and Generate: A Corpus and Method for Joint Control of Semantics and Style in Neural NLG. To appear in Proc. of ACL-19.
Lena Reed, Shereen Oraby and Marilyn Walker. 2018. Can Neural Generators for Dialogue Learn Sentence Planning and Discourse Structuring? In Proc. of INLG-18.
Sam Wiseman, Stuart Shieber and Alexander Rush. 2017. Challenges in Data-to-Document Generation. In Proc. of EMNLP-17.
Sanghyun Yi, Rahul Goel, Chandra Khatri, Alessandra Cervone, Tagyoung Chung, Behnam Hedayatnia, Anu Venkatesh, Raefer Gabriel and Dilek Hakkani-Tur. 2019. Towards Coherent and Engaging Spoken Dialog Response Generation Using Automatic Conversation Evaluators.