Training Gaussian mixture models at scale via coresets

Mario Lucic, Matthew Faulkner, Andreas Krause, Dan Feldman

Research output: Contribution to journal › Article › peer-review


How can we train a statistical mixture model on a massive data set? In this work we show how to construct coresets for mixtures of Gaussians. A coreset is a weighted subset of the data, which guarantees that models fitting the coreset also provide a good fit for the original data set. We show that, perhaps surprisingly, Gaussian mixtures admit coresets of size polynomial in dimension and the number of mixture components, while being independent of the data set size. Hence, one can harness computationally intensive algorithms to compute a good approximation on a significantly smaller data set. More importantly, such coresets can be efficiently constructed both in distributed and streaming settings and do not impose restrictions on the data generating process. Our results rely on a novel reduction of statistical estimation to problems in computational geometry and on new combinatorial complexity results for mixtures of Gaussians. Empirical evaluation on several real-world data sets suggests that our coreset-based approach enables a significant reduction in training time with negligible approximation error.
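To illustrate the coreset idea from the abstract, here is a minimal sketch of sensitivity-based importance sampling: rough cluster centers are found with a k-means++-style seeding pass, each point's sensitivity is upper-bounded by its relative contribution to the clustering cost, points are sampled proportionally to these bounds, and inverse-probability weights make the weighted subset an unbiased cost estimator. This is a simplified illustration under those assumptions, not the paper's exact construction (the function name `coreset_sketch` is hypothetical).

```python
import numpy as np

def coreset_sketch(X, k, m, rng=None):
    """Sample a weighted coreset of m points from X (n x d) for a
    k-component mixture via importance (sensitivity) sampling.

    Illustrative sketch only: sensitivities are approximated with a
    k-means++-style seeding pass, not the paper's construction.
    """
    rng = np.random.default_rng(rng)
    n = len(X)
    # k-means++-style seeding: pick k rough centers, each new center
    # drawn with probability proportional to squared distance from the
    # centers chosen so far.
    centers = [X[rng.integers(n)]]
    for _ in range(k - 1):
        d2 = np.min(((X[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])
    centers = np.asarray(centers)
    # Squared distance of each point to its nearest rough center.
    d2 = np.min(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    # Crude sensitivity upper bound: relative cost contribution plus a
    # uniform term, so no point has sampling probability zero.
    s = d2 / max(d2.sum(), 1e-12) + 1.0 / n
    p = s / s.sum()
    idx = rng.choice(n, size=m, replace=True, p=p)
    # Inverse-probability weights: the weighted coreset cost is an
    # unbiased estimator of the full-data cost.
    w = 1.0 / (m * p[idx])
    return X[idx], w
```

Note that the sum of the weights equals n in expectation, so the coreset stands in for the full data set in any weighted-likelihood fitting procedure (e.g. weighted EM), which is how the computationally intensive algorithm is then run on the much smaller set.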

Original language: English
Pages (from-to): 1-25
Number of pages: 25
Journal: Journal of Machine Learning Research
State: Published - 1 May 2018

Bibliographical note

Funding Information:
We thank Olivier Bachem for invaluable discussions, suggestions and comments. This research was partially supported by ONR grant N00014-09-1-1044, NSF grants CNS-0932392, IIS-0953413, DARPA MSEE grant FA8650-11-1-7156, and the Zurich Information Security Center.

Publisher Copyright:
© 2018 Mario Lucic, Matthew Faulkner, Andreas Krause, Dan Feldman.


Keywords
  • Coresets
  • Distributed computation
  • Gaussian mixture models
  • Streaming

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


  • Scalable training of mixture models via coresets

    Feldman, D., Faulkner, M. & Krause, A., 2011. Advances in Neural Information Processing Systems 24: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011. pp. 2142-2150

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review
