
NIPS SPARSE 2011 : NIPS Workshop on Sparse Representation and Low-rank Approximation


Link: http://www.cs.berkeley.edu/~ameet/sparse-low-rank-nips11/
 
When: Dec 16, 2011 - Dec 17, 2011
Where: Granada, Spain
Submission Deadline: Oct 14, 2011
Categories: machine learning
 

Call For Papers

Sparse representation and low-rank approximation are fundamental tools in fields as diverse as computer vision, computational biology, signal processing, natural language processing, and machine learning. Recent advances in sparse and low-rank modeling have led to increasingly concise descriptions of high-dimensional data, together with algorithms of provable performance and bounded complexity. Our workshop aims to survey recent work on sparsity and low-rank approximation and to provide a forum for open discussion of the key questions concerning these dimensionality reduction techniques. The workshop will be divided into two segments: a "sparsity segment" emphasizing sparse dictionary learning and a "low-rank segment" emphasizing scalability and large data.

The sparsity segment will be dedicated to learning sparse latent representations and dictionaries: decomposing a signal or a vector of observations as a sparse linear combination of basis vectors, atoms, or covariates is ubiquitous in machine learning and signal processing. Algorithms and theoretical analyses for obtaining these decompositions are now numerous. Learning the atoms or basis vectors directly from data has proven useful in several domains and is often approached from different viewpoints: (a) as a matrix factorization problem, potentially with constraints such as pointwise nonnegativity; (b) as a latent variable model that can be treated in a probabilistic and potentially Bayesian way, leading in particular to topic models; and (c) as dictionary learning, often with the goal of signal representation or restoration. The goal of this part of the workshop is to confront these various points of view and foster exchanges of ideas among the signal processing, statistics, machine learning, and applied mathematics communities.
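
The sparse decomposition described above can be illustrated with a minimal NumPy sketch (not tied to any particular workshop submission): given a fixed dictionary D, it recovers a sparse coefficient vector by solving the l1-regularized least-squares (lasso) problem with ISTA, a standard proximal-gradient method. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.05, n_iter=200):
    """Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 via ISTA
    (gradient step on the smooth part, then soft-thresholding)."""
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)           # gradient of the quadratic term
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
a_true = np.zeros(50)
a_true[[3, 17, 41]] = [1.5, -2.0, 1.0]     # signal uses only 3 of 50 atoms
x = D @ a_true
a_hat = ista_sparse_code(D, x)
print(np.flatnonzero(np.abs(a_hat) > 0.3))  # indices of the recovered atoms
```

In the matrix-factorization view (a), the dictionary D itself would also be optimized, alternating between sparse coding steps like this one and dictionary updates.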

The low-rank segment will explore the impact of low-rank methods for large-scale machine learning. Large datasets often take the form of matrices representing either a set of real-valued features for each datapoint or pairwise similarities between datapoints. Hence, modern learning problems face the daunting task of storing and operating on matrices with millions to billions of entries. An attractive solution to this problem involves working with low-rank approximations of the original matrix. Low-rank approximation is at the core of widely used algorithms such as Principal Component Analysis and Latent Semantic Indexing, and low-rank matrices appear in a variety of applications including lossy data compression, collaborative filtering, image processing, text analysis, matrix completion, robust matrix factorization, and metric learning. In this segment we aim to study new algorithms, recent theoretical advances, and large-scale empirical results, and, more broadly, we hope to identify additional interesting scenarios for the use of low-rank approximations in learning tasks.
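
As a small illustration of the storage savings motivating this segment (the numbers below are a toy assumption, not workshop material), the best rank-k approximation of a matrix under the Frobenius norm is given by its truncated SVD (the Eckart-Young theorem):

```python
import numpy as np

def low_rank_approx(A, k):
    """Best rank-k approximation of A in Frobenius norm, via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]     # keep only the top-k singular triplets

rng = np.random.default_rng(0)
# A 500x300 matrix that is rank 5 up to small noise
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 300))
A += 0.01 * rng.standard_normal(A.shape)

A5 = low_rank_approx(A, 5)
rel_err = np.linalg.norm(A - A5) / np.linalg.norm(A)
print(rel_err)  # near-perfect reconstruction despite the compression
# Rank-5 factors need 5*(500 + 300) + 5 numbers instead of 500*300 entries.
```

For the matrices with millions to billions of entries mentioned above, even this exact SVD becomes infeasible, which is precisely where the randomized and sampling-based approximations studied in this segment come in.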
