MMM-SS-XR-MM 2026 : CfP MMM 2026 Special Session “Extended Reality and Multimedia Modelling (XR-MM)”
Link: https://mmm2026.cz/calls.html#special_sessions_xrmm
Call For Papers | |||||||||||
CfP Special Session “Extended Reality and Multimedia Modelling (XR-MM)”
The Extended Reality and Multimedia Modelling (XR-MM) special session at the Multimedia Modelling 2026 conference invites researchers, industry experts, and enthusiasts to explore the latest advances in extended reality (XR) and multimedia technologies. The session focuses on the development and integration of XR solutions with multimedia analysis, retrieval, and processing methods, emphasizing seamless and interactive experiences that transform the way we live, work, and interact with our surroundings.

Topics of Interest

The XR-MM 2026 special session will address the following key topics:

• Technologies for next-generation XR applications across all domains: exploring cutting-edge solutions in virtual reality (VR), augmented reality (AR), and mixed reality (MR) that push the boundaries of XR experiences.
  o Human factors in the design of XR interfaces and interactions
• Real-time 3D modeling and rendering: investigating innovative techniques for creating realistic and dynamic 3D models and environments, enabling high-quality visuals and interactions in XR applications.
  o 3D modeling and asset management for immersive XR experiences
  o Synthetic dataset generation and benchmarking for XR calibration and reconstruction
  o Learning-based calibration and pose estimation for multi-view XR systems with low overlap and sparse data
• AI for XR content creation: utilizing artificial intelligence and machine learning for content analysis, understanding, and retrieval to facilitate XR content generation.
  o Generative AI and foundation models for XR content creation and synthetic data generation
  o Multimedia analysis and AI-based approaches for media mining and adaptation in XR experiences
• AI-driven multimedia and XR integration: utilizing artificial intelligence and machine learning to enhance recognition and manipulation in XR environments, leading to more intuitive and engaging experiences.
  o Active object detection and real-time scene understanding from first- and third-person perspectives
  o Processing of egocentric multimedia datasets and streams for immersive XR environments
• Multisensory and multimodal interfaces and wearable technologies: investigating the latest advances in haptic feedback, gesture recognition, and sensory input/output devices that facilitate natural and immersive interactions with XR and multimedia content.
• Adaptive and interactive content delivery: developing methods for optimizing and personalizing multimedia content based on user preferences, context, and device capabilities, ensuring a seamless XR experience.
• Security and privacy aspects, and mitigations, for XR multimedia content

Organizers

• Claudio Vairo, CNR-ISTI, IT
• Imad H. Elhajj, AUB, LB
• Leonel Toledo, i2CAT, ES
• Dimitrios Zarpalas, CERTH, EL
• Georg Thallinger, JOANNEUM RESEARCH, AT