posted by user: claudiogalicchio

Reservoir Computing @ IJCNN 2020 : Challenges in Reservoir Computing - Special Session of IJCNN 2020


Link: https://sites.google.com/view/reservoir-computing-ijcnn2020/home
 
When Jul 19, 2020 - Jul 24, 2020
Where Glasgow, UK
Submission Deadline Jan 30, 2020
Notification Due Mar 15, 2020
Final Version Due Apr 15, 2020
Categories    neural networks   machine learning   echo state network   reservoir computing
 

Call For Papers

** Organizers **
Claudio Gallicchio (University of Pisa), Lukas Gonon (University of St. Gallen), Josef Teichmann (ETH Zurich), Juan-Pablo Ortega (University of St. Gallen, Switzerland and CNRS, France)

** Aim and Scope **
Reservoir Computing (RC) defines a class of recurrent neural systems in which the dynamical memory component is left untrained after initialization. Only a simple - typically linear - readout layer is adapted on a set of training examples, which allows the use of simple learning strategies. The overall approach has intriguing features that have attracted researchers over the last decade. First, it gives a refreshing perspective on the use of dynamical systems in machine learning for time-series data. Moreover, the resulting ease of implementation and fast training compared to fully trained architectures have made it greatly appealing for experimental use, mostly in academia.

Yet, at the current stage of neural networks/deep learning research, RC-based methods present several downsides that prevent extensive (e.g., industrial) application to Artificial-Intelligence-scale problems with human-level performance. One fundamental downside is that, in real-world applications, the training efficiency of RC risks vanishing completely, colliding with the complexity entailed by possibly gigantic reservoir spaces and by the cost-intensive hyper-parameter search often required to reach state-of-the-art results. The difficulty of effectively dealing with huge input-output spaces is a related, known RC issue that complicates matters further. Overcoming complexities of this kind represents a major challenge in RC research today.

On a different front, methodological, architectural and theoretical studies on RC have the potential both to develop a deeper understanding of the operation of (fading memory) dynamical neural systems and to foster the progress of their training algorithms. Besides, novel ways to control the organization of neural dynamics, as in the case of conceptors, can originate in RC and transfer to more general machine learning setups.
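The RC scheme described above - a fixed random recurrent part plus a linear readout fitted in closed form - can be sketched in a few lines. The following is a minimal, illustrative Echo State Network in NumPy; all names, sizes and hyper-parameters (reservoir size, spectral radius, ridge coefficient) are assumptions for the sketch, not prescriptions from this call.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100            # input and reservoir sizes (arbitrary choices)
spectral_radius = 0.9           # scaled below 1 to encourage fading memory

# Fixed (untrained) random weights: only the readout W_out is learned.
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return one state per step."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20, 500))
X = run_reservoir(u[:-1])       # reservoir states, shape (T, n_res)
y = u[1:]                       # next-step targets

# Linear readout trained in closed form via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = float(np.mean((pred - y) ** 2))   # training mean-squared error
```

The only trained parameters are in `W_out`, which is why ESN training reduces to a single linear solve; the hyper-parameter search over reservoir size, spectral radius and input scaling is exactly the cost the paragraph above identifies as a bottleneck.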
A further research-attractive dimension of RC systems is that they are inherently amenable to implementation in neuromorphic hardware. In this regard, photonic reservoirs are certainly among the most exciting possibilities to have emerged in recent years, promising both ultra-fast processing and very low energy consumption. However, designing fully optical RC networks for real-world applications still requires pursuing primary goals, such as implementing non-linear reservoirs with optical readout training.

This session intends to give new impetus to RC research within the international neural networks community. We therefore invite submissions on both the theoretical and application sides of RC. In particular, this session calls for novel, potentially groundbreaking contributions that specifically address open challenges in the RC field.

A list of relevant topics for this session includes, without being limited to, the following:

- Reservoir Computing for Artificial Intelligence problems (e.g., vision, natural language processing, etc.)
- Reservoir Computing methods for fully trained Recurrent Neural Networks (including hybrid approaches)
- Neuromorphic Reservoir Computing
- Novel Reservoir Computing architectures, models and training algorithms
- Theory of dynamical systems in neural networks, including stability of input-driven temporal embeddings
- Statistical Learning Theory of Reservoir Computing networks
- Ensemble learning and Reservoir Computing
- Advancements in Reservoir Computing models, e.g. Echo State Networks and Liquid State Machines
- Conceptors
- Deep Reservoir Computing
- Reservoir dimensionality reduction, efficient reservoir hyper-parameter search and learning
- New applications of Reservoir Computing

** Papers Submission **
Paper submission for this Special Session follows the same process as for the regular sessions of WCCI 2020. When submitting your paper, choose "Challenges in Reservoir Computing" as the (main) research topic (among the Special Sessions topics).

For further information and news in this regard, please refer to the WCCI 2020 website: https://wcci2020.org/submissions/

** Important Dates **
15 Jan 2020 Paper Submission Deadline
15 Mar 2020 Paper Acceptance Notification Date
15 Apr 2020 Final Paper Submission and Early Registration Deadline
19-24 July 2020 IEEE WCCI 2020, Glasgow, Scotland, UK
