More constraints, smaller coresets: Constrained matrix approximation of sparse big data

Dan Feldman, Tamir Tassa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We suggest a generic data reduction technique with provable guarantees for computing the low-rank approximation of a matrix under some ℓz error, and for constrained factorizations such as the Non-negative Matrix Factorization (NMF). Our main algorithm reduces a given n × d matrix into a small, ε-dependent, weighted subset C of its rows (known as a coreset), whose size is independent of both n and d. We then prove that applying existing algorithms to the resulting coreset yields (1 + ε)-approximations for the original (large) input matrix. In particular, we provide the first linear-time approximation scheme (LTAS) for the rank-one NMF. The coreset C can be computed in parallel and using only one pass over a possibly unbounded stream of row vectors. In this sense, we improve the result in [4] (Best Paper of STOC 2013). Moreover, since C is a subset of these rows, its construction time, as well as its sparsity (number of non-zero entries) and the sparsity of the resulting low-rank approximation, depend on the maximum sparsity of an input row rather than on the actual dimension d. In this sense, we improve the result of Liberty [21] (Best Paper of KDD 2013) and answer affirmatively, and in a more general setting, his open question of computing such a coreset. We implemented our coreset construction and demonstrate it by turning Matlab's off-line NMF function, which requires the entire matrix in the memory of a single machine, into a streaming algorithm that runs in parallel on 64 machines on Amazon's cloud and returns a sparse NMF factorization. Source code is provided for reproducing the experiments and for integration with existing and future algorithms.
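
The following sketch is illustrative only and is not the paper's construction: it shows the general coreset pattern the abstract describes, namely reduce the rows to a small weighted subset, fit the factorization on that subset, and reuse the learned factor on the full matrix. It substitutes simple squared-norm row sampling for the paper's coreset and scikit-learn's NMF for the Matlab routine; the helper name build_row_coreset is hypothetical.

```python
# Minimal, illustrative sketch of the "solve on a small weighted row subset,
# then lift to the full matrix" pattern. NOT the paper's coreset construction:
# rows are sampled with probability proportional to their squared norms and
# reweighted, and a rank-1 NMF is fitted on the sample only.
import numpy as np
from scipy.sparse import diags, random as sparse_random
from sklearn.decomposition import NMF

def build_row_coreset(A, m, rng):
    """Sample m rows of sparse A with prob. ~ squared row norm, then reweight."""
    row_norms_sq = np.asarray(A.multiply(A).sum(axis=1)).ravel()
    probs = row_norms_sq / row_norms_sq.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=probs)
    # Scale row i by sqrt(1 / (m * p_i)) so the plain Frobenius error on the
    # coreset estimates the error on the full matrix in expectation.
    weights = 1.0 / (m * probs[idx])
    return diags(np.sqrt(weights)) @ A[idx]

rng = np.random.default_rng(0)
# Sparse non-negative test matrix (scipy's sparse random uses uniform [0, 1) values).
A = sparse_random(10_000, 500, density=0.01, format="csr", random_state=0)

C = build_row_coreset(A, m=200, rng=rng)                  # small weighted coreset of rows
model = NMF(n_components=1, init="nndsvda", max_iter=500)
model.fit(C.toarray())                                    # rank-1 NMF on the coreset only
H = model.components_                                     # shared 1 x d non-negative factor

W = model.transform(A.toarray())                          # non-negative coefficients for all rows
err = np.linalg.norm(A.toarray() - W @ H, "fro")
print(f"rank-1 NMF Frobenius error via the coreset sketch: {err:.3f}")
```

In the paper the reduction runs in one pass and in parallel over a stream of rows; the snippet above only illustrates the pattern on a single machine with an in-memory matrix.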

Original language: English
Title of host publication: KDD 2015 - Proceedings of the 21st ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Publisher: Association for Computing Machinery
Pages: 249-258
Number of pages: 10
ISBN (Electronic): 9781450336642
DOIs
State: Published - 10 Aug 2015
Event: 21st ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2015 - Sydney, Australia
Duration: 10 Aug 2015 - 13 Aug 2015

Publication series

Name: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Volume: 2015-August

Conference

Conference: 21st ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2015
Country/Territory: Australia
City: Sydney
Period: 10/08/15 - 13/08/15

Bibliographical note

Publisher Copyright: © 2015 ACM.

ASJC Scopus subject areas

  • Software
  • Information Systems
