TY - GEN
T1 - Efficient Coreset Constructions via Sensitivity Sampling
AU - Braverman, Vladimir
AU - Feldman, Dan
AU - Lang, Harry
AU - Statman, Adiel
AU - Zhou, Samson
PY - 2021/5/1
Y1 - 2021/5/1
AB - A coreset for a set of points is a small subset of weighted points that approximately preserves important properties of the original set. Specifically, if $P$ is a set of points, $Q$ is a set of queries, and $f : P \times Q \to \mathbb{R}$ is a cost function, then a set $S \subseteq P$ with weights $w : P \to [0, \infty)$ is an $\varepsilon$-coreset for some parameter $\varepsilon > 0$ if $\sum_{s \in S} w(s) f(s,q)$ is a $(1+\varepsilon)$ multiplicative approximation to $\sum_{p \in P} f(p,q)$ for all $q \in Q$. Coresets are used to solve fundamental problems in machine learning under various big data models of computation. Many of the coresets suggested in the recent decade used, or could have used, a general framework for constructing coresets whose size depends quadratically on the total sensitivity $t$. In this paper we improve this bound from $O(t^2)$ to $O(t \log t)$. Thus our results imply more space-efficient solutions to a number of problems, including projective clustering, $k$-line clustering, and subspace approximation. The main technical result is a generic reduction to the sample complexity of learning a class of functions with bounded VC dimension. We show that obtaining a $(\nu, \alpha)$-sample for this class of functions with appropriate parameters $\nu$ and $\alpha$ suffices to achieve space-efficient $\varepsilon$-coresets. Our result implies more efficient coreset constructions for a number of interesting problems in machine learning; we show applications to $k$-median/$k$-means, $k$-line clustering, $j$-subspace approximation, and the integer $(j,k)$-projective clustering problem.
M3 - Conference contribution
VL - 157
T3 - Proceedings of Machine Learning Research
SP - 948
EP - 963
BT - Proceedings of The 13th Asian Conference on Machine Learning
PB - PMLR
ER -