CASK 2021: Conversational Agents with Structured Knowledge @ AKBC 2021
Call For Papers
Conversational agents are increasingly becoming part of our day-to-day lives on the phone, in the car, and in the living room, performing tasks on our behalf that often require interfacing with various forms of structured knowledge. The goal of this workshop is to shed light on, and encourage interdisciplinary research into, some of the open problems at the interface of dialog systems and structured knowledge (e.g., knowledge bases, tables, etc.), which are critical to the development of digital assistants like Google Home, Alexa, and Siri. Our hope is to help bridge the gap between academic and industry research in this area. Open questions include, but are not limited to:
How to enable more natural conversations about structured objects and their properties?
How to model the structured context of a conversation?
Can we develop dialog systems that can interoperate across arbitrary knowledge base schemas?
How to reason about, model, and make updates to an evolving world / knowledge base in a conversational setting?
How to characterize and respond to ambiguous references to entities?
How to address acoustic ambiguity in references to KB items in spoken queries?
How to segment user speech into meaningful utterances?
How do conversational agents interact with a situated KB (e.g., user devices)?
How to use knowledge encoded in pre-trained LMs in conversational agents?
Make sure to also check out the closely related AKBC workshop on personal knowledge graphs.
Workshop registration is included in AKBC 2021 registration, along with access to the main conference.
Call for Extended Abstracts
We invite the submission of extended abstracts to CASK describing (i) new unpublished or (ii) relevant recently published research in the field. The workshop will have no archival proceedings; accepted abstracts will be presented as lightning talks during the workshop and listed on the website.
To submit an abstract to CASK, please send an email to firstname.lastname@example.org with the subject line “[CASK ABSTRACT]: here-goes-your-title”. The email should include the following:
The extended abstract in PDF format (2 pages max, not including references), as an attachment. Please make sure to include the complete author list, along with their affiliation(s).
We encourage authors to cite all relevant and related work in the abstract; references do not count towards the 2-page limit.
The name and email of the author who will present the work as a lightning talk. We will reach out with the details of the lightning talk after acceptance.
(Optional) Full paper, if the authors have a preprint ready.
Submitted abstracts will undergo a lightweight review process to make sure they are relevant to the workshop, but no official reviews will be sent to authors. All accepted abstracts will be made available prior to the workshop, and the lightning talks will also be made accessible after the workshop.
If you require further assistance in participating in the workshop, please let us know and we’ll be in touch to discuss how we can best address your needs.
Important Dates
Abstract submission deadline: Sep 10
Notification of acceptance: Sep 17
Workshop: Oct 7
All deadlines are 11:59 pm UTC-12 (“anywhere on Earth”).
Invited Speakers
Dilek Hakkani-Tür, Amazon
Milica Gašić, Heinrich-Heine-Universität Düsseldorf
Yun-Nung (Vivian) Chen, National Taiwan University
Schedule
The CASK workshop will be held virtually on October 7th with the following schedule (all times in Pacific Time, UTC-7).
8:00-8:10 - Opening remarks
8:10-8:45 - Lightning talks
8:45-9:30 - Invited talk: Milica Gašić, Heinrich-Heine-Universität Düsseldorf
9:30-10:00 - Break
10:00-10:30 - Lightning talks
10:30-11:15 - Invited talk: Yun-Nung (Vivian) Chen, National Taiwan University
11:15-11:30 - Break
11:30-12:15 - Invited talk: Dilek Hakkani-Tür, Amazon
12:15-13:30 - Panel discussion
Organizers
Aditya Gupta, Google
Emily Pitler, Google
Mihir Kale, Google
Rahul Goel, Google
Shachi Paul, Google
Shyam Upadhyay, Google
Waleed Ammar, Google