Adapt-NLP 2021 : The Second Workshop on Domain Adaptation for NLP

Link: https://adapt-nlp.github.io/Adapt-NLP-2021/
 
When Apr 19, 2021 - Apr 20, 2021
Where virtual
Submission Deadline Jan 18, 2021
Notification Due Feb 18, 2021
 

Call For Papers

Overview

The growth in computational power and the rise of Deep Neural Networks
(DNNs) have revolutionized the field of Natural Language Processing
(NLP). The ability to collect massive datasets and to train large
models on powerful GPUs has yielded NLP-based technology that was
beyond imagination only a few years ago.

Unfortunately, this technology is still limited to a handful of
resource-rich languages and domains, because most NLP algorithms rely
on the fundamental assumption that the training and test sets are
drawn from the same underlying distribution. When the training and
test distributions do not match, a phenomenon known as domain shift,
such models are likely to suffer performance drops.
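
To state the problem precisely (a standard formalization, added here
as a gloss on the call): let p_S(x, y) denote the source (training)
distribution and p_T(x, y) the target (test) distribution. Domain
shift is the setting where

    p_S(x, y) \neq p_T(x, y),

so a hypothesis h chosen to minimize the expected source loss
\mathbb{E}_{(x, y) \sim p_S}[\ell(h(x), y)] carries no guarantee on
the target risk \mathbb{E}_{(x, y) \sim p_T}[\ell(h(x), y)].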

Despite the growing availability of heterogeneous data, many NLP
domains still lack the amounts of labeled data required to feed
data-hungry neural models, and in some domains and languages even
unlabeled data is scarce. As a result, domain adaptation, that is,
training an algorithm on annotated data from one or more source
domains and applying it to other target domains, is a fundamental
challenge that must be solved in order to make NLP technology
available for most world languages and textual domains.
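
To make this setup concrete, here is a minimal Python sketch (with
hypothetical toy data; the domains and examples are ours, not the
workshop's) that trains a classifier on labeled source-domain text and
measures the accuracy gap on a shifted target domain:

# Minimal source-to-target setup (sketch, hypothetical toy data).
# Train on labeled source-domain text; evaluate unchanged on a target
# domain whose distribution differs; the accuracy gap reflects domain shift.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Source domain: product reviews. Target domain: tweets.
source_texts = ["great product, works well", "terrible, broke in a day",
                "excellent value for money", "awful quality, do not buy"]
source_labels = [1, 0, 1, 0]
target_texts = ["loving this so much rn", "ugh worst purchase ever smh"]
target_labels = [1, 0]

# Features and classifier are fit on the source domain only.
vectorizer = TfidfVectorizer()
X_src = vectorizer.fit_transform(source_texts)
clf = LogisticRegression().fit(X_src, source_labels)

# Applied as-is to the target domain; tokens unseen in the source
# vocabulary are silently dropped by transform(), one cause of degradation.
X_tgt = vectorizer.transform(target_texts)
print("source accuracy:", accuracy_score(source_labels, clf.predict(X_src)))
print("target accuracy:", accuracy_score(target_labels, clf.predict(X_tgt)))

A real DA method would go beyond this baseline, e.g. by also using
unlabeled target-domain data, which is exactly the space of
assumptions the topics below enumerate.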

Domain Adaptation (DA) is hence the focus of this workshop. In
particular, the topics of the workshop include, but are not restricted
to:

- Novel DA algorithms addressing existing and new assumptions (e.g.,
assuming or not assuming unlabeled data from the source and target
domains, making certain assumptions about the differences between the
source and target domain distributions, etc.).
- Introducing and exploring novel or under-explored DA setups, aiming
towards realistic and applicable ones (e.g., one-to-many DA,
many-to-many DA, DA where the target domain is unknown while training
on the source domain, and source-free DA, where only a source model is
available and there is no access to source data).
- Extending DA research to new domains and tasks through both novel
datasets and algorithmic approaches.
- Proposing novel zero-shot and few-shot algorithms and discussing
their relevance for DA.
- Exploring the similarities and differences between algorithmic
approaches to DA, cross-lingual learning, and cross-task learning.
- A conceptual discussion of the definitions of fundamental concepts
such as domain and transfer, as well as zero-shot and few-shot
learning.
- Novel approaches to the evaluation of DA methods under different
assumptions on data availability (e.g., evaluation without access to
labeled target-domain data, or even with only small amounts of
unlabeled target-domain data).
- Thorough empirical comparisons of existing DA methods on existing and
novel tasks, datasets, and setups.
