Call For Papers
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS Special Issue on
Structured Multi-output Learning: Modelling, Algorithm, Theory and Applications
Structured multi-output learning aims to predict multiple structured outputs for a given input, where the outputs involve structured objects such as sequences, strings, trees, lattices, or graphs, and the output values are characterized by diverse data types, such as binary, nominal, ordinal, and real-valued variables. Such problems arise in a variety of real-world applications, ranging from document classification, computer emulation, sensor network analysis, concept-based information retrieval, and human action/causal induction to video analysis, image annotation/retrieval, gene function prediction, and brain science. Owing to this breadth of applications, structured multi-output learning has also been widely explored in the machine learning community, in settings such as multi-label/multiclass classification, multi-target regression, multi-concept retrieval, hierarchical classification with class taxonomies, label sequence learning, sequence alignment learning, and supervised grammar learning.
The theoretical properties of existing structured multi-output learning approaches are still not well understood. Moreover, the emerging trends of ultrahigh input and output dimensionality, together with increasingly complex structured objects, pose formidable challenges for structured multi-output learning. It is therefore imperative to develop practical mechanisms and efficient optimization algorithms for large-scale applications. Deep neural networks have proven to be promising learning systems in a variety of applications, yet it remains non-trivial for practitioners to design novel deep neural networks and related learning systems that are appropriate for broader structured multi-output learning domains. Topics of interest for this special issue include, but are not limited to:
• Novel deep neural networks and related learning systems for structured multi-output learning tasks.
• Novel modelling approaches for structured multi-output learning problems.
• Statistical learning theory for multiple structured data.
• Large-scale optimization algorithms for learning multiple structured data.
• Sparse representation learning for large-scale multiple structured data.
• Active learning for structured multi-output learning problems.
• Online learning for structured multi-output learning problems.
• Metric learning for structured multi-output learning problems.
• Structured multi-output learning with noisy data.
• Structured multi-output learning with imbalanced data.
• New applications: 1) image annotation/retrieval; 2) video analysis; 3) document retrieval, classification, and graphical models; 4) natural language processing; 5) bioinformatics; 6) information retrieval; 7) graph embedding and network embedding; 8) other novel applications.
Important Dates
• 1 November 2018: Submission deadline.
• 25 February 2019: Reviewers’ comments to authors.
• 20 April 2019: Submission deadline for revisions.
• 25 June 2019: Final decisions to authors.
• August 2019: Publication date.
Guest Editors
• Weiwei Liu, University of New South Wales, Australia.
• Xiaobo Shen, Nanyang Technological University, Singapore.
• Yew-Soon Ong, Nanyang Technological University, Singapore.
• Ivor W. Tsang, University of Technology Sydney, Australia.
• Chen Gong, Nanjing University of Science and Technology, China.
• Vladimir Pavlovic, Rutgers University, USA.
Submission Instructions
• Read the Information for Authors at http://cis.ieee.org/tnnls.
• Submit your manuscript through the TNNLS webpage (http://mc.manuscriptcentral.com/tnnls) and follow the submission procedure. Please clearly indicate on the first page of the manuscript and in the cover letter that the manuscript is submitted to this special issue. Then send an email with the subject “TNNLS special issue submission” to the lead guest editor, Dr. Weiwei Liu (firstname.lastname@example.org), to notify us of your submission.
• Early submissions are welcome. We will start the review process as soon as we receive your contributions.