AAAI22-W 2022 : AAAI-22 Workshop: Learning Network Architecture During Training

Link: https://www.cmu.edu/epp/patents/events/aaai22/
 
When: Feb 28, 2022
Where: Online
Submission Deadline: Nov 26, 2021
Notification Due: Dec 3, 2021
Final Version Due: Dec 31, 2021
Categories: machine learning, neural networks, neural architecture search
 

Call For Papers

A fundamental problem in the use of artificial neural networks is that the first step is to guess the network architecture. Hand-tuning a neural network's architecture is time-consuming and rarely yields an optimal result. Hyperparameters such as the number of layers, the number of nodes in each layer, the pattern of connectivity, and the presence and placement of elements such as memory cells, recurrent connections, and convolutional elements are all selected manually. If the architecture turns out to be inappropriate for the task, the user must repeatedly adjust it and retrain the network until an acceptable architecture is found.
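To make the problem concrete, here is a minimal sketch of this conventional workflow (in Python, assuming PyTorch; the function name and the layer sizes are illustrative only), in which depth and width must be fixed before training begins:

    # Minimal sketch of the conventional workflow (assuming PyTorch).
    # The depth and width below are manual guesses fixed before training.
    import torch.nn as nn

    def build_mlp(n_inputs, n_outputs, hidden_sizes=(128, 64)):
        """Build a fully connected network whose shape is guessed in advance."""
        layers, width = [], n_inputs
        for h in hidden_sizes:  # number and size of layers: chosen by hand
            layers += [nn.Linear(width, h), nn.ReLU()]
            width = h
        layers.append(nn.Linear(width, n_outputs))
        return nn.Sequential(*layers)

    # If (128, 64) turns out to be wrong for the task, the only remedy in
    # this workflow is to pick new sizes and retrain from scratch.
    model = build_mlp(n_inputs=20, n_outputs=2)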

There is now a great deal of interest in finding better alternatives to this scheme. Options include pruning a trained network or automatically training many networks. In this workshop we focus on a contrasting approach: learning the architecture during training. This topic encompasses forms of Neural Architecture Search (NAS) in which the performance of each candidate architecture, after some training, guides the selection of the next architecture to try. It also encompasses techniques that augment or alter a network as it is trained. An example of the latter is the Cascade Correlation algorithm, along with other methods that incrementally build or modify a neural network during training, as the problem at hand requires.
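As a rough illustration of the grow-during-training idea, the sketch below (Python/PyTorch; the names widen and train_phase and all sizes are hypothetical) widens a hidden layer whenever the training loss plateaus while preserving previously learned weights. It is not Fahlman and Lebiere's full Cascade-Correlation algorithm, which trains candidate units to maximize correlation with the residual error and then freezes their input weights; this is only a simplified stand-in for the general idea.

    # Simplified sketch of growing a network during training (assuming
    # PyTorch). Not full Cascade-Correlation: we merely widen one hidden
    # layer when the loss plateaus, keeping old weights intact.
    import torch
    import torch.nn as nn

    def widen(fc_in, fc_out, extra):
        """Return widened copies of two layers, preserving learned weights."""
        h = fc_in.out_features
        new_in = nn.Linear(fc_in.in_features, h + extra)
        new_out = nn.Linear(h + extra, fc_out.out_features)
        with torch.no_grad():
            new_in.weight[:h] = fc_in.weight       # keep old input weights
            new_in.bias[:h] = fc_in.bias
            new_out.weight[:, :h] = fc_out.weight  # keep old output weights
            new_out.weight[:, h:] *= 0.01          # new units start near-silent
            new_out.bias.copy_(fc_out.bias)
        return new_in, new_out

    def train_phase(fc_in, fc_out, x, y, steps=200, lr=1e-2):
        """Train both layers for a fixed budget; return the final loss."""
        params = list(fc_in.parameters()) + list(fc_out.parameters())
        opt = torch.optim.Adam(params, lr=lr)
        for _ in range(steps):
            loss = nn.functional.mse_loss(fc_out(torch.relu(fc_in(x))), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        return loss.item()

    # Toy regression data; grow the hidden layer whenever progress stalls.
    x, y = torch.randn(256, 20), torch.randn(256, 1)
    fc1, fc2 = nn.Linear(20, 2), nn.Linear(2, 1)
    prev = float("inf")
    for phase in range(5):
        cur = train_phase(fc1, fc2, x, y)
        print(f"phase {phase}: hidden units={fc1.out_features}, loss={cur:.4f}")
        if prev - cur < 1e-3:                # plateau: add two hidden units
            fc1, fc2 = widen(fc1, fc2, extra=2)
        prev = cur

Starting from a deliberately small network and growing it only when progress stalls is what lets the final architecture be determined by the problem rather than guessed in advance.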


Our goal is to build a stronger community of researchers exploring these methods, and to find synergies among these related approaches and alternatives. Eliminating the need to guess the right topology in advance of training is a prominent benefit of learning network architecture during training. Additional advantages are possible, including reduced computational resources to solve a problem, reduced time for the trained network to make predictions, smaller training-set requirements, and avoidance of “catastrophic forgetting.” We would especially like to highlight approaches that are qualitatively different from the currently popular, but computationally intensive, NAS methods.

As deep learning problems become increasingly complex, network sizes must grow, and other architectural decisions become critical to success. Suboptimal architectural decisions frequently cost the deep learning community substantial time and hardware resources. The growing popularity of NAS methods demonstrates the community’s hunger for better ways of choosing or evolving network architectures that are well matched to the problem at hand.

Topics include methods for learning network architecture during training, incremental construction of neural networks during training, and new performance benchmarks for such methods. Novel approaches, preliminary results, and works in progress are encouraged.


Please see the workshop web page at https://www.cmu.edu/epp/patents/events/aaai22/

Related Resources

IEEE COINS 2022   IEEE COINS 2022: Hybrid (3 days on-site | 2 days virtual)
Federated Learning in IOT Cybersecurity 2021   PeerJ Computer Science - Federated Learning for Cybersecurity in Internet of Things
AAAI 2021   35th AAAI Conference on Artificial Intelligence
ICML 2022   39th International Conference on Machine Learning
FAIML 2022   2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML 2022)
AAAI 2022   The Thirty-Sixth AAAI Conference on Artificial Intelligence
MLDM 2022   18th International Conference on Machine Learning and Data Mining
IJCNN 2023   International Joint Conference on Neural Networks
IEEE COINS 2022   Internet of Things IoT | Artificial Intelligence | Machine Learning | Big Data | Blockchain | Edge & Cloud Computing | Security | Embedded Systems
CFDSP 2022   2022 International Conference on Frontiers of Digital Signal Processing (CFDSP 2022)