
MHMC 2016 : Multimodal Interaction in Industrial Human-Machine Communication


Link: http://etfa2016.org/etfa-2016/workshops
 
When: Sep 6, 2016
Where: Berlin
Submission Deadline: May 20, 2016
Notification Due: Jul 10, 2016
Final Version Due: Jul 30, 2016
Categories: human-computer interaction, industrial engineering, multimodal interaction
 

Call For Papers

MHMC 2016 – International Workshop on Multimodal Interaction in Industrial Human-Machine Communication



In connection with the 21st IEEE International Conference on Emerging Technologies and Factory Automation



September 6, 2016, Berlin



http://etfa2016.org/images/track-cfp/MHMC_CfP.pdf



Aims and Objectives

Nowadays, industrial environments are full of sophisticated computer-controlled machines. In addition, recent developments in pervasive and ubiquitous computing provide further support for advanced activity control. Even though the use of these technologies is very often entrusted to specialized workers, who have been purposely trained on complex equipment, easy and effective interaction is a key factor that can bring many benefits, from faster task completion to error prevention, reduced cognitive load and higher employee satisfaction.

Multimodal interaction means using “non-conventional” input and/or output tools and modalities to communicate with a device. The main purpose of multimodal interfaces is to combine multiple input modes, usually more “natural” than traditional input devices (such as touch, speech, hand gestures, head/body movements and eye gaze), with solutions in which different output modalities are used in a coordinated manner, such as visual displays (e.g. virtual and augmented reality), auditory cues (e.g. conversational agents) and haptic systems (e.g. force feedback controllers). Besides handling input fusion, multimodal interfaces can also handle output fission, in an essentially dynamic process. Sophisticated multimodal interfaces can integrate complementary modalities to get the most out of the strengths of each mode and to overcome their weaknesses. In addition, they can adapt to different environmental situations as well as to different user (sensory/motor) abilities.
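
As a purely illustrative sketch of the input-fusion idea mentioned above, the short Python example below pairs a spoken command with a pointing gesture to form a single machine command when the two events are close in time and sufficiently confident. The event types, thresholds and the fuse function are hypothetical and are not part of any system described in this call.

from dataclasses import dataclass
from typing import Optional

# Hypothetical input events; a real system would receive these from
# speech-recognition and gesture-tracking components.
@dataclass
class SpeechEvent:
    command: str          # e.g. "stop"
    timestamp: float      # seconds
    confidence: float     # 0..1

@dataclass
class GestureEvent:
    target: str           # e.g. the machine being pointed at
    timestamp: float
    confidence: float

def fuse(speech: SpeechEvent, gesture: GestureEvent,
         max_gap: float = 1.0, min_conf: float = 0.6) -> Optional[str]:
    """Late fusion: combine a spoken command with a pointing gesture
    if they occur close in time and both are confident enough."""
    if abs(speech.timestamp - gesture.timestamp) > max_gap:
        return None  # events too far apart to belong together
    if min(speech.confidence, gesture.confidence) < min_conf:
        return None  # reject uncertain interpretations
    return f"{speech.command} {gesture.target}"

# Example: "stop" spoken while pointing at conveyor 2
print(fuse(SpeechEvent("stop", 10.2, 0.9), GestureEvent("conveyor_2", 10.5, 0.8)))
# -> "stop conveyor_2"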

Although multimodal interaction is becoming more and more common in our everyday life, industrial applications are still rather few, in spite of their potential advantages. For example, a camera could allow a machine to be controlled through hand gesture commands, or the user might be monitored in order to detect potentially dangerous behaviors. On the output side, an augmented or virtual reality system could provide an equipment operator with sophisticated visual cues, while auditory/olfactory displays might serve as an additional alerting mechanism in risky environments. Besides being used in real working situations to increase the amount and quality of available information, augmented/virtual reality interaction can also be exploited to implement effective and safe training plans.



This workshop aims to gather works presenting different forms of multimodal interaction in industrial processes, equipment and settings, with a twofold purpose:

· Taking stock of the current state of multimodal systems for industrial applications.

· Being a showcase to demonstrate the potential of multimodal communication to those who have never considered its application in industrial settings.



Both proposals for novel applications and papers describing user studies are welcome.



Summary of topics
Topics of interest include, but are not limited to, the following:



1. Multimodal Input

· Vision-based input

· Speech input

· Tangible interfaces

· Motion tracking sensors

2. Multimodal Output

· Virtual Reality

· Augmented Reality

· Auditory displays

· Haptic (or tactile) interfaces

· Olfactory displays

3. Combination of “traditional” input and output modalities and multimodal solutions

Any form of integration of conventional input and output modalities (e.g. keyboard, mouse, buttons, standard LCD monitors, audiovisual content, etc.) with multimodal communication.



Important dates

Deadline for submission of workshop papers: May 20, 2016

Notification of acceptance of workshop papers: July 10, 2016

Deadline for submission of final workshop papers: July 30, 2016



Submission of papers

The working language of the workshop is English. Papers are limited to 8 double-column pages in a font no smaller than 10 points.

Manuscripts can be submitted here: http://etfa2016.org/2015-08-23-13-16-59/submit-papers.



Workshop Co-Chairs:

· Maria De Marsico, “Sapienza” University of Rome, Italy
Email: demarsico@di.uniroma1.it

· Giancarlo Iannizzotto, University of Messina, Italy
Email: giancarlo.iannizzotto@unime.it

· Marco Porta, University of Pavia, Italy
Email: marco.porta@unipv.it


More information can be found on the workshop page (http://etfa2016.org/etfa-2016/workshops) and on the conference website (http://etfa2016.org/).
