Finite Littlestone Dimension Implies Finite Information Complexity

Aditya Pradeep, Ido Nachum, Michael Gastpar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We prove that every online learnable class of functions of Littlestone dimension d admits a learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. Generally, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in d. We also show there is room for improvement; for a canonical online learnable class, indicator functions of affine subspaces of dimension d, the information complexity can be upper bounded logarithmically in d.

Original language: English
Title of host publication: 2022 IEEE International Symposium on Information Theory, ISIT 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3055-3060
Number of pages: 6
ISBN (Electronic): 9781665421591
DOIs
State: Published - 2022
Externally published: Yes
Event: 2022 IEEE International Symposium on Information Theory, ISIT 2022 - Espoo, Finland
Duration: 26 Jun 2022 – 1 Jul 2022

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2022-June
ISSN (Print): 2157-8095

Conference

Conference: 2022 IEEE International Symposium on Information Theory, ISIT 2022
Country/Territory: Finland
City: Espoo
Period: 26/06/22 – 1/07/22

Bibliographical note

Publisher Copyright:
© 2022 IEEE.

Keywords

  • Littlestone dimension
  • Mutual information
  • PAC learning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
