Data-dependent coresets for compressing neural networks with applications to generalization bounds

Cenk Baykal, Lucas Liebenwein, Igor Gilitschenski, Dan Feldman, Daniela Rus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We present an efficient coresets-based neural network compression algorithm that sparsifies the parameters of a trained fully-connected neural network in a manner that provably approximates the network's output. Our approach is based on an importance sampling scheme that judiciously defines a sampling distribution over the neural network parameters, and as a result, retains parameters of high importance while discarding redundant ones. We leverage a novel, empirical notion of sensitivity and extend traditional coreset constructions to the application of compressing parameters. Our theoretical analysis establishes guarantees on the size and accuracy of the resulting compressed network and gives rise to generalization bounds that may provide new insights into the generalization properties of neural networks. We demonstrate the practical effectiveness of our algorithm on a variety of neural network configurations and real-world data sets.
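The core idea in the abstract — an importance-sampling distribution over the weights of a trained network, built from an empirical notion of sensitivity — can be sketched for a single neuron as follows. This is a minimal illustration under assumed definitions, not the authors' implementation: the helper `sparsify_neuron` and the sensitivity formula (each weight's maximum relative contribution to the pre-activation over the data points) are hypothetical stand-ins for the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsify_neuron(w, X, m):
    """Hypothetical sketch: importance-sample m weights of one neuron.

    w : (d,) weight vector of a trained neuron
    X : (n, d) matrix of sample inputs to the neuron
    m : coreset size (number of i.i.d. samples drawn)

    Returns a sparse, reweighted vector w_hat whose inner product with
    inputs like those in X approximates that of w, in expectation.
    """
    # Assumed empirical sensitivity: each weight's maximum relative
    # contribution to the pre-activation magnitude across the data.
    contrib = np.abs(w[None, :] * X)              # (n, d)
    denom = np.maximum(contrib.sum(axis=1, keepdims=True), 1e-12)
    s = (contrib / denom).max(axis=0)             # (d,) sensitivities
    p = s / s.sum()                               # sampling distribution

    # Sample m indices with replacement, then reweight retained weights
    # so that E[w_hat] = w (an unbiased, Horvitz-Thompson-style estimate).
    idx = rng.choice(len(w), size=m, p=p)
    counts = np.bincount(idx, minlength=len(w))
    w_hat = np.zeros_like(w)
    kept = counts > 0
    w_hat[kept] = counts[kept] * w[kept] / (m * p[kept])
    return w_hat

# Usage: compress a 1000-weight neuron down to at most 100 retained weights.
d, n = 1000, 50
w = rng.normal(size=d)
X = rng.random((n, d))
w_hat = sparsify_neuron(w, X, m=100)
```

High-sensitivity weights are sampled often and kept near their original values, while low-sensitivity ones are likely discarded — mirroring the abstract's claim of retaining important parameters and dropping redundant ones.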

Original language: English
Title of host publication: International Conference on Learning Representations (ICLR) 2019
State: Published - 2019
Event: 7th International Conference on Learning Representations, ICLR 2019 - New Orleans, United States
Duration: 6 May 2019 – 9 May 2019

Conference

Conference: 7th International Conference on Learning Representations, ICLR 2019
Country/Territory: United States
City: New Orleans
Period: 6/05/19 – 9/05/19

Bibliographical note

Funding Information:
This research was supported in part by the National Science Foundation award IIS-1723943. We thank Brandon Araki and Kiran Vodrahalli for valuable discussions and helpful suggestions. We would also like to thank Kasper Green Larsen, Alexander Mathiasen, and Allan Gronlund for pointing out an error in an earlier formulation of Lemma 6.

Publisher Copyright:
© 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved.

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
