
Random-Weights Neural Networks @ IWANN 2019: Special Session on Random-Weights Neural Networks


Link: http://iwann.uma.es/?page_id=290#SS08
 
When Jun 12, 2019 - Jun 14, 2019
Where Gran Canaria, Spain
Submission Deadline Feb 1, 2019
Notification Due Mar 18, 2019
Final Version Due Mar 26, 2019
Categories    neural networks, machine learning
 

Call For Papers

Random-weights Neural Networks denote a class of artificial neural models that employ randomization in both their architectural and their training design. Typically, the connections to the hidden layer(s) are left untrained after initialization, and only the output weights are adjusted through learning, usually by means of non-iterative methods. The extreme efficiency of the resulting training algorithms, together with their ease of implementation, has made the randomized approach to Neural Network design a widespread and popular methodology among both researchers and practitioners. From a theoretical perspective, randomization also enables an effective study of the inherent properties of various kinds of Neural Network architectures, even in the absence of (or prior to) training of the internal weight connections.

In the literature, the approach has been instantiated in several forms, both for feed-forward models (e.g., Random Vector Functional Link, Extreme Learning Machines, No-prop, and Stochastic Configuration Networks) and for recurrent architectures (e.g., Echo State Networks and Liquid State Machines). Moreover, the rise of the Deep Learning era in Machine Learning research has recently given further impulse to the study of hierarchically organized neural architectures with multiple random-weights components. In this regard, the potential of combining the advantages of deep architectures with the efficiency of randomized Neural Network approaches remains largely unexplored.
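To make the learning scheme concrete, the following is a minimal sketch in the Extreme Learning Machine style, assuming a toy regression task and hypothetical layer sizes (none of which come from the session description). The hidden weights are drawn once at random and frozen; the readout is obtained by a single ridge-regularized least-squares solve, i.e., non-iteratively.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy task: learn y = sin(x) from noisy samples.
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

    # Random hidden layer: weights and biases are drawn once and never trained.
    n_hidden = 100
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)  # fixed random features

    # Non-iterative readout: ridge-regularized least squares (one linear solve).
    lam = 1e-3
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

    # Prediction reuses the same fixed random features.
    X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
    y_pred = np.tanh(X_test @ W + b) @ beta

The only trained parameters are the readout weights beta, which is what makes training a single linear solve rather than an iterative gradient-based procedure.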

This session calls for contributions in the area of random-weights Neural Networks from all perspectives, from seminal works on breakthrough ideas to applications of consolidated learning methodologies. Topics of interest for this session include, but are not limited to, the following:

- Neural Networks with random weights
- Randomized algorithms for Neural Networks
- Non-iterative methods for learning
- Random Vector Functional Link, Extreme Learning Machines, No-prop, and Stochastic Configuration Networks
- Reservoir Computing, Echo State Networks, and Liquid State Machines (see the sketch after this list)
- Deep Neural Networks with Random Weights (e.g. Deep Extreme Learning Machines and Deep Echo State Networks)
- Theoretical analysis on advantages and downsides of randomized Neural Networks
- Comparisons with fully trained Neural Networks
- Real-world Applications
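As an illustration of the recurrent case listed above, here is a minimal Echo State Network sketch, assuming a toy one-step-ahead prediction task and hypothetical reservoir sizes (again, not taken from the session description). The recurrent reservoir weights are random and fixed, rescaled to a spectral radius below 1 (a common echo-state heuristic), and only the linear readout is fit.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical task: one-step-ahead prediction of a sine wave.
    T = 500
    u = np.sin(0.2 * np.arange(T + 1))
    inputs, targets = u[:-1], u[1:]

    # Random reservoir, rescaled so that its spectral radius is below 1.
    n_res = 200
    W_in = rng.uniform(-0.5, 0.5, size=n_res)
    W_res = rng.normal(size=(n_res, n_res))
    W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

    # Run the untrained reservoir and collect its states.
    states = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W_in * inputs[t] + W_res @ x)
        states[t] = x

    # Discard a washout period, then fit only the linear readout.
    washout = 100
    S, d = states[washout:], targets[washout:]
    lam = 1e-6
    W_out = np.linalg.solve(S.T @ S + lam * np.eye(n_res), S.T @ d)
    y_pred = states @ W_out

As in the feed-forward case, training reduces to a single regularized least-squares fit of the readout, while the recurrent dynamics themselves remain untrained.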
