Average-Case Information Complexity of Learning

Ido Nachum, Amir Yehudayoff

Research output: Contribution to journal › Conference article › peer-review

Abstract

How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension d? Previous works have shown that even for d = 1 the amount of information may be unbounded (it can tend to ∞ with the size of the universe). Can it be that all concepts in the class require leaking a large amount of information? We show that typically concepts do not require such leakage: there exists a proper learning algorithm that reveals O(d) bits of information for most concepts in the class. This result is a special case of a more general phenomenon we explore: if there is a low-information learner when the algorithm knows the underlying distribution on inputs, then there is a learner that reveals little information on an average concept without knowing the distribution on inputs.
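For orientation, a minimal worked definition, assuming the standard convention in this line of work (our addition; consult the paper for its exact definitions): the information revealed by a possibly randomized learner A on an input sample S = (X_1, c(X_1)), ..., (X_m, c(X_m)) for a target concept c is measured by the mutual information between the sample and the output hypothesis,

    I(S; A(S)) = H(A(S)) − H(A(S) | S).

Read this way, the main result states that for most concepts c in a class of VC-dimension d there is a proper learner A with I(S; A(S)) = O(d), even though for some concepts this quantity can grow with the size of the universe.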

Original language: English
Pages (from-to): 633-646
Number of pages: 14
Journal: Proceedings of Machine Learning Research
Volume: 98
State: Published - 2019
Externally published: Yes
Event: 30th International Conference on Algorithmic Learning Theory, ALT 2019 - Chicago, United States
Duration: 22 Mar 2019 – 24 Mar 2019

Bibliographical note

Publisher Copyright:
© 2019 Proceedings of Machine Learning Research. All rights reserved.

Keywords

  • Learning Theory
  • Information Theory

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
