NML@ICDM 2025: Neverending Machine Learning Workshop during ICDM 2025
Link: https://sites.google.com/view/nml-icdm2025/home
Call for Papers
Overview:
The landscape of machine learning is evolving beyond the traditional paradigm in which models are trained and tested on stationary datasets. Real-world applications increasingly demand adaptability to changing data distributions, continuous learning of new tasks, and the ability to handle evolving environments. The Neverending Machine Learning (NML) workshop aims to explore and advance techniques that enable lifelong learning, adaptive modeling, and robustness in dynamic data scenarios.

Objective:
The workshop aims to bring together researchers and practitioners interested in advancing the capabilities of machine learning systems beyond traditional static datasets. Participants will explore cutting-edge research, share insights, and discuss challenges and opportunities in developing truly adaptive and evolving machine learning solutions.

Topics of Interest:
1. Continual and Lifelong Machine Learning: Techniques and algorithms that enable continual learning and adaptation over time, preserving knowledge while accommodating new data and tasks.
2. Learning from High-speed Data Streams: Methods for learning from continuously arriving data, where traditional batch learning and fully supervised approaches are impractical.
3. Machine Unlearning: Techniques to remove specific knowledge or biases from a model's learned representations, allowing both the removal of outdated knowledge and adaptation to the evolving nature of data (such as changing privacy or ethical considerations).
4. Test-time Adaptation: Approaches that enable models to adapt their behavior during inference based on the specific characteristics of the input data or the environment, such as the presence of concept drift.
5. Adaptive TinyML: Techniques and methodologies for implementing adaptive machine learning models on resource-constrained devices, enabling continuous learning and adaptation in edge computing scenarios.
6. Continual Multi-task Learning: Methods that allow models to learn multiple tasks simultaneously, leveraging shared knowledge and enhancing generalization.
7. Continual Transfer Learning: Approaches for transferring knowledge from one domain or task to another, improving learning efficiency and performance in new environments.
8. Continual Open-world Learning: Strategies to recognize and handle unknown classes or concepts during training and inference, ensuring models can operate effectively in open-world scenarios.
9. Out-of-Distribution Detection: Techniques to identify data samples that do not belong to the training distribution, which are crucial for maintaining model reliability and safety.
10. Few-shot Learning: Algorithms that learn new concepts from only a few labeled examples, mimicking human-like rapid learning, especially in continual and lifelong learning scenarios.

Format:
The workshop will feature keynote presentations by leading experts, contributed paper presentations, and interactive panel discussions. Participants will also be able to engage in hands-on sessions and collaborative activities designed to foster innovation and networking among attendees.

Target Audience:
Researchers, practitioners, and students working in machine learning, artificial intelligence, data science, and related fields are encouraged to participate. Participants with expertise or interest in lifelong learning, adaptive systems, and dynamic data environments will find the workshop particularly relevant.
Submission Guidelines:
Authors are invited to submit original research contributions or position papers addressing one or more of the workshop's topics. Submissions must be written in English and must not be concurrently submitted to, or previously published in, another venue. Submissions should adhere to the ICDM formatting and submission guidelines, i.e., they must use the IEEE 2-column format. For the regular paper track, submissions should not exceed 8 pages of content, plus an additional 2 pages for references. For the short paper track, submissions should be limited to a maximum of 4 pages of content, plus 1 extra page for references.

Important Dates:
Paper submission deadline: September 10, 2025
Notification of acceptance: October 7, 2025
Camera-ready deadline and copyright form: October 11, 2025
Workshop day: December 12, 2025
**All deadlines are at 11:59 PM AoE** unless otherwise stated.

In alignment with the ICDM 2025 reviewing scheme, all submissions will undergo triple-blind review by the Program Committee, which will evaluate technical quality, relevance to the conference scope, originality, significance, and clarity. All accepted papers will be presented as posters, with a select few chosen for oral presentations. A best paper award will be conferred. Accepted papers will be published in the IEEE ICDM 2025 Workshop proceedings (published by IEEE and EI-indexed). The link to the submission system will be available soon.

Organizers:
Bartosz Krawczyk, Center for Imaging Science, Rochester Institute of Technology, USA
Yi He, School of Computing, Data Science, and Physics, William & Mary, USA
Xin Jin, School of Software, Yunnan University, China
Michal Wozniak, Department of Systems and Computer Networks, Wroclaw University of Science and Technology, Poland

Contact Information:
For inquiries regarding the workshop, please contact bartosz.krawczyk@rit.edu and michal.wozniak@pwr.edu.pl.