posted by organizer: stefan_t_kramer

DeCoDeML 2019 : First Workshop on Deep Continuous-Discrete Machine Learning @ ECML PKDD 2019


Link: https://sites.google.com/view/decodeml-workshop-2019/
 
When Sep 16, 2019 - Sep 16, 2019
Where Würzburg, Germany
Submission Deadline Jun 21, 2019
Notification Due Jul 19, 2019
Final Version Due Jul 26, 2019
 

Call For Papers


First Workshop on Deep Continuous-Discrete Machine Learning (DeCoDeML) @ ECML PKDD 2019

September 16, 2019, Würzburg, Germany

Co-located with ECML PKDD 2019, http://ecmlpkdd2019.org

Workshop site: https://sites.google.com/view/decodeml-workshop-2019/


Since the beginnings of machine learning – and indeed already hinted at in Alan Turing's groundbreaking 1950 paper "Computing Machinery and Intelligence" – two opposing approaches have been pursued: on the one hand, approaches that relate learning to knowledge and mostly use "discrete" formalisms of formal logic; on the other hand, approaches which, mostly motivated by biological models, investigate learning in artificial neural networks and predominantly use "continuous" methods from numerical optimization and statistics. The recent successes of deep learning can be attributed to the latter, "continuous" approach, and are currently opening up new opportunities for computers to "perceive" the world and to act, with far-reaching consequences for industry, science and society. The massive success in recognizing "continuous" patterns is the catalyst for a new enthusiasm for artificial intelligence methods. However, today's artificial neural networks are hardly suitable for learning and understanding "discrete" logical structures, and this is one of the major hurdles to further progress.

Accordingly, one of the biggest open problems is to clarify the connection between these two learning approaches (logical-discrete and neural-continuous). In particular, the role and benefits of prior knowledge need to be reassessed and clarified. The role of formal logic in ensuring sound reasoning must be related to perception through deep networks. Further, the question of how prior knowledge can be used to make the results of deep learning more stable, and to explain and justify them, needs to be discussed. The extraction of symbolic knowledge from networks is a topic that needs to be reexamined against the background of the successes of deep learning. Finally, it is an open question whether and how the principles responsible for the success of deep learning methods can be transferred to symbolic learning.


Workshop format:

This is a half-day workshop. We are aiming for a genuinely interactive workshop, and we believe a workshop is the right format because the topic is cutting-edge with much ongoing work. Note that the workshop focuses on basic research questions (continuous/discrete and learning/knowledge in the era of deep learning), not on their downstream consequences or the like. The workshop will consist of:

* Oral presentations of the accepted papers. Depending on the number of accepted papers, presentations will range from 10 to 20 minutes each.

* A panel: Open problems in deep continuous-discrete machine learning and how they can be addressed. How can the scientific community organize itself to contribute?

* An invited talk (tentative): we will try to include a longer invited talk by a well-recognized expert in the area.


Paper submission:

Authors should submit a PDF in Springer LNCS style via the workshop's EasyChair site: https://easychair.org/my/conference?conf=decodeml2019

We invite extended abstracts of two to three pages in Springer LNCS style on work in progress, finished work, published work, position statements, etc. Author names and affiliations should be included (reviewing is not blind).

All submissions will be reviewed by at least three PC members. Accepted papers will be published on the workshop webpage. We are also planning a topic in the "Machine Learning and Artificial Intelligence" section of the journal "Frontiers in Big Data". Further publication possibilities and future events will be discussed at the workshop.


Important Dates:

* Submission deadline: Friday, June 21, 2019 *** DEADLINE EXTENDED ***

* Acceptance notification: Friday, July 19, 2019

* Camera-ready deadline: Friday, July 26, 2019

* Workshop: Monday, September 16, 2019


PC members:

* Henrik Boström, KTH Royal Institute of Technology, Sweden

* Ines Dutra, Universidade do Porto, Portugal

* Eibe Frank, University of Waikato, New Zealand

* Johannes Fürnkranz, TU Darmstadt, Germany

* Iryna Gurevych, TU Darmstadt, Germany

* Visvanathan Ramesh, Goethe University Frankfurt am Main, Germany

* Bertil Schmidt, Johannes Gutenberg University Mainz, Germany

* Ivan Titov, University of Edinburgh, UK

* Jochen Triesch, Goethe University Frankfurt am Main, Germany

* Ivan Vulic, University of Cambridge, UK

* Michael Wand, Johannes Gutenberg University Mainz, Germany

* Gerson Zaverucha, Federal University of Rio de Janeiro, Brazil


Organizers and contact:

* Kristian Kersting (Technical University Darmstadt), kersting@cs.tu-darmstadt.de

* Stefan Kramer (Johannes Gutenberg University Mainz), kramer@informatik.uni-mainz.de
