NIPS DISCML 2017 : NIPS 2017 Workshop on Discrete Structures in Machine Learning (DISCML)

Link: http://www.discml.cc
 
When: Dec 8, 2017
Where: Long Beach, CA, USA
Submission Deadline: Oct 30, 2017
Categories: machine learning, discrete optimization
 

Call For Papers

============================================================

Call for Papers
DISCML -- 7th Workshop on Discrete Structures in Machine Learning at NIPS 2017 (Long Beach)

Dec 8, 2017
www.discml.cc

============================================================


Discrete optimization problems and combinatorial structures are ubiquitous in machine learning. They arise when predicting discrete labels with complex dependencies, in structured estimators, in learning with graphs, partitions, and permutations, and when selecting informative subsets of data or features.
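
As one concrete illustration, consider selecting an informative subset of data or features: for monotone submodular objectives, a simple greedy rule already comes with a (1 - 1/e) approximation guarantee (Nemhauser, Wolsey, and Fisher, 1978). The Python sketch below is purely illustrative and not part of the call; the function names and the toy coverage objective are our own.

    # Illustrative sketch: greedy subset selection, a canonical discrete
    # optimization problem in machine learning.
    def greedy_subset(ground_set, objective, budget):
        """Pick up to `budget` elements, each time adding the element with
        the largest marginal gain of the set function `objective`."""
        selected = set()
        for _ in range(budget):
            best, best_gain = None, 0.0
            for e in ground_set - selected:
                gain = objective(selected | {e}) - objective(selected)
                if gain > best_gain:
                    best, best_gain = e, gain
            if best is None:      # no element improves the objective; stop early
                break
            selected.add(best)
        return selected

    # Toy usage: maximum coverage, a standard monotone submodular objective.
    sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"d"}, 4: {"a", "c", "d"}}
    def coverage(S):
        return len(set().union(*(sets[i] for i in S))) if S else 0
    print(greedy_subset(set(sets), coverage, budget=2))   # e.g. {1, 4}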

What are efficient algorithms for handling such problems? Can we robustly solve them in the presence of noise? What about streaming or distributed settings? Which models are computationally tractable and rich enough for applications? What theoretical worst-case bounds can we show? What explains good performance in practice?

Such questions are the theme of the DISCML workshop, which aims to bring together theorists and practitioners to explore new applications, models, and algorithms, as well as mathematical properties and concepts that can aid learning with complex interactions and discrete structures.

We invite high-quality submissions that present recent results on discrete and combinatorial problems in machine learning, as well as submissions that discuss open problems or controversial questions and observations, e.g., theory that is still missing to explain why algorithms work well on certain instances but not in general, or illuminating worst-case examples. We also welcome descriptions of well-tested software and benchmarks.

Areas of interest include, but are not restricted to:
* discrete optimization in the context of deep learning
* bridging discrete and continuous optimization methods
* graph algorithms
* continuous relaxations (an illustrative sketch follows this list)
* learning and inference in discrete probabilistic models
* algorithms for large data (streaming, sketching, distributed)
* online learning
* new applications
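
As a sketch of the "continuous relaxations" topic (purely illustrative, not part of the call): a categorical choice can be relaxed to a point on the probability simplex with the Gumbel-softmax / Concrete trick (Jang et al., 2016; Maddison et al., 2016), one way discrete structure is combined with gradient-based deep learning. The function name and toy logits below are our own.

    # Illustrative sketch: a differentiable relaxation of a categorical variable.
    import numpy as np

    def gumbel_softmax_sample(logits, temperature=0.5, rng=None):
        """Return a point on the simplex approximating a one-hot sample from
        softmax(logits); lower temperature -> closer to a discrete choice."""
        rng = rng or np.random.default_rng()
        gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
        scores = (logits + gumbel) / temperature
        scores = scores - scores.max()        # for numerical stability
        expd = np.exp(scores)
        return expd / expd.sum()

    # Toy usage: relax a 3-way categorical variable with unnormalized scores.
    y = gumbel_softmax_sample(np.array([2.0, 0.5, -1.0]), temperature=0.3)
    print(y, y.argmax())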


Submissions:

Please send submissions in NIPS 2017 format (length max. 6 pages, non-anonymous) to submit@discml.cc

Submission deadline: October 30, 2017.


Organizers:
Jeff A. Bilmes (University of Washington, Seattle),
Stefanie Jegelka (MIT),
Amin Karbasi (Yale University),
Andreas Krause (ETH Zurich, Switzerland),
Yaron Singer (Harvard University)
