
AAAI22-W 2022 : AAAI-22 Workshop: Learning Network Architecture During Training


Link: https://www.cmu.edu/epp/patents/events/aaai22/
 
When Feb 28, 2022 - Feb 28, 2022
Where online
Submission Deadline Nov 26, 2021
Notification Due Dec 3, 2021
Final Version Due Dec 31, 2021
Categories    machine learning   neural networks   neural architecture search
 

Call For Papers

A fundamental problem in the use of artificial neural networks is that the first step is to guess the network architecture. Fine-tuning a network architecture by hand is time-consuming and rarely optimal. Hyperparameters such as the number of layers, the number of nodes in each layer, the pattern of connectivity, and the presence and placement of elements such as memory cells, recurrent connections, and convolutional elements are all selected manually. If the architecture turns out to be inappropriate for the task, the user must repeatedly adjust it and retrain the network until an acceptable architecture has been obtained.
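To make this trial-and-error workflow concrete, the following minimal sketch runs the guess-adjust-retrain loop over a few hand-picked architectures, using scikit-learn's MLPClassifier on synthetic data purely for illustration. The candidate layer widths and the accuracy threshold are illustrative assumptions, not part of this call.

# A minimal sketch of the guess-adjust-retrain loop described above.
# The candidate layer widths and the 0.90 accuracy threshold are
# illustrative choices, not prescribed by the workshop.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Each "guess" is a full training run; if the result is unsatisfactory,
# the architecture is adjusted by hand and the network is retrained.
for hidden_layers in [(16,), (64,), (64, 32), (128, 64, 32)]:
    net = MLPClassifier(hidden_layer_sizes=hidden_layers, max_iter=500, random_state=0)
    net.fit(X_train, y_train)
    accuracy = net.score(X_val, y_val)
    print(f"{hidden_layers}: validation accuracy {accuracy:.3f}")
    if accuracy > 0.90:  # "acceptable architecture" reached (illustrative)
        break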

There is now a great deal of interest in finding better alternatives to this scheme. Options include pruning a trained network or automatically training many networks. In this workshop we focus on a contrasting approach: learning the architecture during training. This topic encompasses forms of Neural Architecture Search (NAS) in which the performance of each architecture, after some training, guides the selection of the next architecture to try. It also encompasses techniques that augment or alter a network as the network is trained. Examples of the latter include the Cascade Correlation algorithm and other methods that incrementally build or modify a neural network during training, as needed for the problem at hand.
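As a concrete illustration of altering a network during training, the toy sketch below is written in the spirit of Cascade Correlation, not as the algorithm as published: hidden units are added one at a time, each new unit sees the inputs and all previously added units, is fit to the current residual error and then frozen, and the output weights are re-solved after every addition. The candidate pool and correlation objective of the original algorithm are omitted, and all hyperparameters are illustrative assumptions.

# Toy sketch of growing a network during training (Cascade-Correlation-like).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])           # toy regression target

def solve_output_weights(features, target):
    """Least-squares fit of the output layer on the current feature set."""
    return np.linalg.lstsq(features, target, rcond=None)[0]

features = np.hstack([X, np.ones((len(X), 1))])          # inputs + bias column
w_out = solve_output_weights(features, y)

for step in range(10):                                   # grow up to 10 hidden units
    residual = y - features @ w_out
    if np.mean(residual ** 2) < 1e-3:                    # stop once the error is small
        break
    # Fit one new tanh unit to the (scaled) residual with a few gradient steps;
    # the unit sees the inputs and all earlier hidden units.
    target = residual / (np.abs(residual).max() + 1e-8)
    w_new = rng.normal(scale=0.5, size=features.shape[1])
    for _ in range(200):
        h = np.tanh(features @ w_new)
        grad = features.T @ ((h - target) * (1 - h ** 2)) / len(X)
        w_new -= 0.5 * grad
    # Freeze the unit and re-solve the output weights over the enlarged feature set.
    features = np.hstack([features, np.tanh(features @ w_new)[:, None]])
    w_out = solve_output_weights(features, y)
    print(f"hidden units: {step + 1}  train MSE: {np.mean((y - features @ w_out) ** 2):.4f}")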


Our goal is to build a stronger community of researchers exploring these methods, and to find synergies among these related approaches and alternatives. Eliminating the need to guess the right topology in advance of training is a prominent benefit of learning network architecture during training. Additional advantages are possible, including reduced computational resources to solve a problem, reduced time for the trained network to make predictions, smaller training-set requirements, and avoidance of “catastrophic forgetting.” We would especially like to highlight approaches that are qualitatively different from current popular, but computationally intensive, NAS methods.

As deep learning problems become increasingly complex, network sizes must increase and other architectural decisions become critical to success. The deep learning community must often confront serious time and hardware constraints from suboptimal architectural decisions. The growing popularity of NAS methods demonstrates the community’s hunger for better ways of choosing or evolving network architectures that are well-matched to the problem at hand.

Topics include methods for learning network architecture during training, including incrementally building neural networks during training, as well as new performance benchmarks for such methods. Novel approaches, preliminary results, and works in progress are encouraged.


Please see the workshop web page at https://www.cmu.edu/epp/patents/events/aaai22/

Related Resources

IEEE-Ei/Scopus-ITCC 2025   2025 5th International Conference on Information Technology and Cloud Computing (ITCC 2025)-EI Compendex
AAAI 2024   The 38th Annual AAAI Conference on Artificial Intelligence
AAAI 2025   The 39th Annual AAAI Conference on Artificial Intelligence
SPIE-Ei/Scopus-DMNLP 2025   2025 2nd International Conference on Data Mining and Natural Language Processing (DMNLP 2025)-EI Compendex&Scopus
21st AIAI 2025   21st (AIAI) Artificial Intelligence Applications and Innovations
From Data to Decision: Empowering Ecosys 2025   The International Society for Ecological Modelling Global Conference:
IEEE-Ei/Scopus-CNIOT 2025   2025 IEEE 6th International Conference on Computing, Networks and Internet of Things (CNIOT 2025) -EI Compendex
ICSTTE 2025   2025 3rd International Conference on SmartRail, Traffic and Transportation Engineering (ICSTTE 2025)
ACIJ 2024   Advanced Computing: An International Journal
Good-Data@AAAI 2025   AAAI 2025 Workshop on Preparing Good Data for Generative AI: Challenges and Approaches (Good-Data)