LR4SD 2019 : CFP Special Session on Learning Representations for Structured Data (LR4SD) @ IJCNN 2019
Call For Papers
** Apologies for cross-posting** CFP [Deadline extended]:
Special Session on "Learning Representations for Structured Data"
2019 International Joint Conference on Neural Networks (IJCNN)
July 14-19 2019, Budapest, Hungary
Paper submission: EXTENDED to 15 January 2019
Notification of acceptance: 28 February 2019
Aims and Scope:
Structured data, e.g. sequences, trees and graphs, are a natural representation for compound information made of atomic pieces of information (i.e. the nodes and their labels) and their intertwined relationships, represented by the edges (and their labels). Sequences are simple structures representing linear dependencies, such as in genomics and proteomics, or with time series data. Trees, on the other hand, make it possible to model hierarchical contexts and relationships, such as with natural language sentences, crystallographic structures, and images. Graphs are the most general and complex form of structured data, allowing the representation of networks of interacting elements, e.g. in social graphs or metabolomics, as well as data where topological variations influence the feature of interest, e.g. molecular compounds. Being able to process data in these rich structured forms provides a fundamental advantage when it comes to identifying data patterns suitable for predictive and/or explorative analyses. This has motivated a recent surge of interest in the machine learning community in the development of learning models for structured information.
On the other hand, the recent improvements in the predictive performance of machine learning methods are due to their ability, in contrast to traditional approaches, to learn a “good” representation for the task under consideration. Deep Learning techniques are nowadays widespread, since they make it possible to perform such representation learning in an end-to-end fashion. Nonetheless, representation learning is becoming of great importance in other areas as well, such as kernel-based and probabilistic models. It has also been shown that, when the data available for the task at hand is limited, it is still beneficial to resort to representations learned in an unsupervised fashion, or on different but related tasks.
This session focuses on learning representations for structured data such as sequences, trees, graphs, and relational data.
Topics that are of interest to this session include, but are not limited to:
- Probabilistic models for structured data
- Structured output generation (probabilistic models, variational autoencoders, adversarial training, …)
- Deep learning and representation learning for structures
- Learning with network data
- Recurrent, recursive and contextual models
- Reservoir computing and randomized neural networks for structures
- Kernels for structured data
- Relational deep learning
- Learning implicit representations
- Applications of adaptive structured data processing: e.g. Natural Language Processing, machine vision (e.g. point clouds as graphs), materials science, chemoinformatics, computational biology, social networks.
Submission:
- For paper submission guidelines, please visit https://www.ijcnn.org/paper-submission-guidelines
- For submissions, please select the single topic "S11. Learning Representations for Structured Data" from the "S. SPECIAL SESSIONS" category as the main research topic on https://ieee-cis.org/conferences/ijcnn2019/upload.php
Organizers:
- Davide Bacciu, University of Pisa
- Thomas Gärtner, University of Nottingham
- Nicolò Navarin, University of Padova
- Alessandro Sperduti, University of Padova
For any enquiries, please write to: bacciu [at] di.unipi.it or nnavarin [at] math.unipd.it