Abstract
A coreset for a set of points is a small subset of weighted points that approximately preserves important properties of the original set. Specifically, if P is a set of points, Q is a set of queries, and f : P × Q → R is a cost function, then a set S ⊆ P with weights w : P → [0, ∞) is an ϵ-coreset for some parameter ϵ > 0 if ∑_{s∈S} w(s)f(s, q) is a (1 + ϵ) multiplicative approximation to ∑_{p∈P} f(p, q) for all q ∈ Q. Coresets are used to solve fundamental problems in machine learning under various big data models of computation. Many of the coresets suggested in the last decade used, or could have used, a general framework for constructing coresets whose size depends quadratically on the total sensitivity t. In this paper we improve this bound from O(t²) to O(t log t). Thus our results imply more space-efficient solutions to a number of problems, including projective clustering, k-line clustering, and subspace approximation. The main technical result is a generic reduction to the sample complexity of learning a class of functions with bounded VC dimension. We show that obtaining a (ν, α)-sample for this class of functions, with appropriate parameters ν and α, suffices to achieve space-efficient ϵ-coresets. Our result implies more efficient coreset constructions for a number of interesting problems in machine learning; we show applications to k-median/k-means, k-line clustering, j-subspace approximation, and the integer (j, k)-projective clustering problem.
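To make the sampling framework referenced above concrete, the sketch below shows generic sensitivity (importance) sampling, the kind of construction whose size bound the paper improves. It is illustrative only: the function name `sensitivity_coreset`, the NumPy interface, and the assumption that per-point sensitivity upper bounds are already available (computing them is problem-specific) are not taken from the paper.

```python
import numpy as np

def sensitivity_coreset(points, sens_upper, m, rng=None):
    """Illustrative sensitivity (importance) sampling sketch.

    points     : (n, d) array of input points P
    sens_upper : length-n array of upper bounds on each point's sensitivity
    m          : number of samples to draw (the coreset size)
    Returns (sample_points, weights) forming a weighted subset S of P.
    """
    rng = np.random.default_rng() if rng is None else rng
    t = sens_upper.sum()            # total sensitivity (upper bound) t
    probs = sens_upper / t          # sampling distribution proportional to sensitivity
    idx = rng.choice(len(points), size=m, replace=True, p=probs)
    # Inverse-probability weights make the weighted coreset cost an
    # unbiased estimator of the full cost for every query q:
    # E[ sum_{s in S} w(s) f(s, q) ] = sum_{p in P} f(p, q).
    weights = 1.0 / (m * probs[idx])
    return points[idx], weights
```

The weights are the standard inverse-probability weights, so the weighted cost is unbiased for every fixed query; the question the paper addresses is how large m must be for the (1 + ϵ) approximation to hold uniformly over all queries in Q.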
| Original language | English |
|---|---|
| Pages (from-to) | 948-963 |
| Number of pages | 16 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 157 |
| State | Published - 2021 |
| Event | 13th Asian Conference on Machine Learning, ACML 2021 - Virtual, Online. Duration: 17 Nov 2021 → 19 Nov 2021 |
Bibliographical note
Publisher Copyright: © 2021 V. Braverman, D. Feldman, H. Lang, A. Statman & S. Zhou.
Keywords
- Dimensionality reduction
- coresets
- sensitivity sampling
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability