Adapt-NLP 2021 : The Second Workshop on Domain Adaptation for NLP

Link: https://adapt-nlp.github.io/Adapt-NLP-2021/
 
When Apr 19, 2021 - Apr 20, 2021
Where virtual
Submission Deadline Jan 18, 2021
Notification Due Feb 18, 2021
 

Call For Papers

Overview

The growth in computational power and the rise of Deep Neural Networks
(DNNs) have revolutionized the field of Natural Language Processing
(NLP). The ability to collect massive datasets, combined with the
capacity to train large models on powerful GPUs, has yielded NLP-based
technology that was beyond imagination only a few years ago.

Unfortunately, this technology is still limited to a handful of
resource-rich languages and domains, because most NLP algorithms rely
on the fundamental assumption that the training and test sets are
drawn from the same underlying distribution. When the training and
test distributions do not match, a phenomenon known as domain shift,
such models are likely to suffer performance drops.
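
To make domain shift concrete, the toy sketch below (in Python, using
scikit-learn) trains a bag-of-words sentiment classifier on one domain
and evaluates it on another. The two tiny corpora and their labels are
invented purely for illustration and are not workshop data.

    # Minimal illustration of domain shift: a classifier fit on
    # source-domain text degrades on a target domain whose vocabulary
    # barely overlaps with the training data.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Source domain: movie reviews (1 = positive, 0 = negative).
    source_texts = [
        "a gripping plot and superb acting",
        "the film was dull and overlong",
        "a masterful, moving picture",
        "terrible script and wooden performances",
    ]
    source_labels = [1, 0, 1, 0]

    # Target domain: product reviews, with largely disjoint vocabulary.
    target_texts = [
        "battery life is excellent",
        "the charger broke after a week",
    ]
    target_labels = [1, 0]

    # The vocabulary is fit on the source domain only, so most
    # target-domain words are simply unseen at test time.
    vectorizer = CountVectorizer()
    X_source = vectorizer.fit_transform(source_texts)
    clf = LogisticRegression().fit(X_source, source_labels)

    print("source accuracy:", clf.score(X_source, source_labels))
    print("target accuracy:", clf.score(
        vectorizer.transform(target_texts), target_labels))

Because almost none of the target-domain words occur in the source
vocabulary, the target vectors are nearly all zeros and the classifier
falls back on its bias term.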

Despite the growing availability of heterogeneous data, many NLP
domains still lack the amounts of labeled data required to feed
data-hungry neural models, and in some domains and languages even
unlabeled data is scarce. As a result, domain adaptation, that is,
training an algorithm on annotated data from one or more source
domains and applying it to other target domains, is a fundamental
challenge that must be solved in order to make NLP technology
available for most world languages and textual domains.
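
As one concrete example of the kind of algorithm in scope, the sketch
below implements the classic feature-augmentation baseline for
supervised DA (Daume III, 2007), in which every feature vector is
copied into a shared block plus a domain-specific block, so a single
linear model can learn both domain-general and domain-specific
weights. The feature matrices here are random placeholders, not real
data.

    # Feature augmentation for supervised DA (Daume III, 2007):
    # source rows map to [x, x, 0], target rows to [x, 0, x].
    import numpy as np

    def augment(X, is_source):
        """Return the augmented design matrix for one domain."""
        zeros = np.zeros_like(X)
        blocks = [X, X, zeros] if is_source else [X, zeros, X]
        return np.hstack(blocks)

    # Placeholder data: plenty of labeled source examples, but only a
    # small labeled target sample.
    rng = np.random.default_rng(0)
    X_src, y_src = rng.random((100, 50)), rng.integers(0, 2, 100)
    X_tgt, y_tgt = rng.random((10, 50)), rng.integers(0, 2, 10)

    # Train any linear classifier on the stacked, augmented data: the
    # shared block captures what transfers across domains, and the
    # domain-specific blocks capture what does not.
    X_train = np.vstack([augment(X_src, True), augment(X_tgt, False)])
    y_train = np.concatenate([y_src, y_tgt])

At test time, target examples are augmented the same way, so the
model's predictions combine the shared and target-specific weights.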

Domain Adaptation (DA) is hence the focus of this workshop.
Particularly, the topics of the workshop include, but are not
restricted to:

- Novel DA algorithms addressing existing and new assumptions (e.g.,
assuming or not assuming unlabeled data from the source and target
domains, making certain assumptions about the differences between the
source and target domain distributions, etc.).
- Introducing and exploring novel or under-explored DA setups, aiming
towards realistic and applicable ones (e.g., one-to-many DA,
many-to-many DA, DA when the target domain is unknown when training on
the source domain, and source-free DA where just a source model is
available but there is no access to source data).
- Extending DA research to new domains and tasks through both novel
datasets and algorithmic approaches.
- Proposing novel zero-shot and few-shot algorithms and discussing
their relevance for DA.
- Exploring the similarities and differences between algorithmic
approaches to DA, cross-lingual, and cross-task learning.
- A conceptual discussion of the definitions of fundamental concepts
such as domain and transfer, as well as zero-shot and few-shot
learning.
- Novel approaches to the evaluation of DA methods under different
assumptions about data availability (e.g., evaluation without access
to labeled target-domain data, and even with only small amounts of
unlabeled target-domain data).
- Thorough empirical comparisons of existing DA methods on existing and
novel tasks, datasets, and setups.

Related Resources

CoNLL 2025   29th Conference on Computational Natural Language Learning
NLP4KGC 2025   4th NLP4KGC: Natural Language Processing for Knowledge Graph Construction
LDK 2025   Fifth Conference on Language, Data and Knowledge
TSD 2025   Twenty-eighth International Conference on Text, Speech and Dialogue
RANLP 2025   Recent Advances in Natural Language Processing
IEEE-Ei/Scopus-ITCC 2025   2025 5th International Conference on Information Technology and Cloud Computing (ITCC 2025)-EI Compendex
Slav-NLP 2025   The 10th Workshop on NLP for Slavic languages
NSLP 2025   2nd International Workshop on Natural Scientific Language Processing and Research Knowledge Graphs
CODI CRAC 2025   Joint Workshop on Computational Approaches to Discourse (CODI) and Computational Models of Reference, Anaphora and Coreference (CRAC)
ACL-SRW 2025   ACL 2025 Student Research Workshop