Data-Independent Structured Pruning of Neural Networks via Coresets

Ben Mussay, Dan Feldman, Samson Zhou, Vladimir Braverman, Margarita Osadchy

Research output: Contribution to journal › Article › peer-review

Abstract

Model compression is crucial for the deployment of neural networks on devices with limited computational and memory resources. Many different methods show comparable accuracy of the compressed model and similar compression rates. However, the majority of compression methods are based on heuristics and offer no worst-case guarantees on the tradeoff between the compression rate and the approximation error for an arbitrary new sample. We propose the first efficient structured pruning algorithm with a provable tradeoff between its compression rate and the approximation error for any future test sample. Our method is based on the coreset framework: it approximates the output of a layer of neurons/filters by a coreset of neurons/filters in the previous layer and discards the rest. We apply this framework in a layer-by-layer fashion from the bottom to the top. Unlike previous works, our coreset is data-independent, meaning that it provably guarantees the accuracy of the function for any input x ∈ ℝ^d, including an adversarial one.
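The abstract describes approximating a layer's output by a weighted coreset of neurons in the previous layer. A minimal sketch of this idea, for a two-layer network y = W2·ReLU(W1·x), is importance sampling of hidden neurons with data-independent weights (here the product of each neuron's incoming and outgoing weight norms, a simplified stand-in for the paper's sensitivity bound) and reweighting the kept neurons so the output is unbiased in expectation. The function name and scoring rule below are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def coreset_prune(W1, W2, k, seed=None):
    """Prune hidden neurons of y = W2 @ relu(W1 @ x) to a coreset of size k.

    Data-independent importance sampling sketch: each hidden neuron j is
    scored by the product of its input- and output-weight norms, sampled
    with probability proportional to that score, and reweighted by
    1 / (k * p_j) so the pruned output is unbiased in expectation.
    """
    rng = np.random.default_rng(seed)
    # score of hidden neuron j: ||W1[j, :]|| * ||W2[:, j]|| (no data needed)
    s = np.linalg.norm(W1, axis=1) * np.linalg.norm(W2, axis=0)
    p = s / s.sum()
    idx = rng.choice(len(s), size=k, replace=True, p=p)
    scale = 1.0 / (k * p[idx])           # importance-sampling reweighting
    W1_c = W1[idx]                       # (k, d): kept neurons' input weights
    W2_c = W2[:, idx] * scale            # (m, k): reweighted output weights
    return W1_c, W2_c

# Usage: prune a 64-neuron hidden layer down to a 32-neuron coreset.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((64, 10))
W2 = rng.standard_normal((5, 64))
W1_c, W2_c = coreset_prune(W1, W2, k=32, seed=0)

x = rng.standard_normal(10)
full = W2 @ np.maximum(W1 @ x, 0)        # original network output
approx = W2_c @ np.maximum(W1_c @ x, 0)  # coreset network output
```

Note that the reweighting is folded into `W2_c`, so the ReLU activations of the kept neurons are computed unchanged; only the readout layer is rescaled.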

Original language: English
Number of pages: 16
Journal: IEEE Transactions on Neural Networks and Learning Systems
State: Published - 2021

Bibliographical note

Publisher Copyright:
IEEE

Keywords

  • Approximation algorithms
  • Approximation error
  • Biological neural networks
  • Computer architecture
  • Coreset
  • Data models
  • Neurons
  • Training
  • model compression
  • network pruning
  • structured pruning

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

