Abstract
Least-mean squares (LMS) solvers such as Linear / Ridge / Lasso regression, SVD and Elastic-Net not only solve fundamental machine learning problems, but are also the building blocks in a variety of other methods, such as decision trees and matrix factorizations. We suggest an algorithm that gets a finite set of n d-dimensional real vectors and returns a weighted subset of d + 1 vectors whose weighted sum is exactly the same. The proof of Caratheodory's Theorem (1907) computes such a subset in O(n²d²) time and is thus not used in practice. Our algorithm computes this subset in O(nd) time, using O(log n) calls to Caratheodory's construction on small but "smart" subsets. This is based on a novel paradigm of fusion between different data summarization techniques, known as sketches and coresets. As an example application, we show how it can be used to boost the performance of existing LMS solvers, such as those in the scikit-learn library, by up to ×100. Generalization to streaming and distributed (big) data is trivial. Extensive experimental results and complete open source code are also provided.
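For context, below is a minimal NumPy sketch of the classic (slow) Caratheodory construction that the abstract's O(n²d²) figure refers to. It is not the paper's accelerated algorithm or its released code, and the function name `caratheodory` is ours: each iteration removes at least one point via a null-space step, which is what makes the naive construction quadratic in n.

```python
import numpy as np

def caratheodory(P, w):
    """Classic Caratheodory construction (illustrative sketch only).

    Given n points P (an n x d array) and positive weights w, returns
    at most d + 1 of the points with new non-negative weights that have
    the same weighted sum. Each loop iteration removes at least one
    point and costs one SVD, hence the O(n^2 d^2) total mentioned in
    the abstract.
    """
    P, w = np.asarray(P, dtype=float), np.asarray(w, dtype=float)
    while len(w) > P.shape[1] + 1:
        n = len(w)
        # Columns of A are the points with a 1 appended, so any alpha in
        # the null space of A satisfies sum(alpha_i * p_i) = 0 and
        # sum(alpha_i) = 0; moving w along -alpha preserves the sum.
        A = np.vstack([P.T, np.ones(n)])
        # A is (d+1) x n with n > d+1, so a null-space vector exists;
        # take the right singular vector of the smallest singular value.
        alpha = np.linalg.svd(A)[2][-1]
        if not np.any(alpha > 0):
            alpha = -alpha
        # Largest step gamma keeping all weights non-negative; the
        # minimizing weight hits exactly zero and its point is dropped.
        ratios = np.full(n, np.inf)
        pos = alpha > 0
        ratios[pos] = w[pos] / alpha[pos]
        j = np.argmin(ratios)
        w = w - ratios[j] * alpha
        w[j] = 0.0
        keep = w > 0
        P, w = P[keep], w[keep]
    return P, w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.standard_normal((1000, 5))
    w = np.full(1000, 1.0 / 1000)
    Q, u = caratheodory(P, w)
    print(Q.shape)                    # at most (6, 5), i.e. d + 1 points
    print(np.allclose(w @ P, u @ Q))  # weighted sum is preserved
```

Per the abstract, the paper's O(nd) algorithm avoids running this construction on all n points at once; instead it makes O(log n) calls to it on small, carefully chosen weighted subsets.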
| Original language | English |
|---|---|
| Journal | Advances in Neural Information Processing Systems |
| Volume | 32 |
| State | Published - 2019 |
| Event | 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019, Vancouver, Canada, 8–14 Dec 2019 |
Bibliographical note
Funding Information: We thank Rafi Dalla-Torre and Benjamin Lastmann from Samsung Research Israel for the fruitful debates and their useful review of our code.
Publisher Copyright:
© 2019 Neural information processing systems foundation. All rights reserved.
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing