Abstract
We prove that every online learnable class of functions of Littlestone dimension d admits a learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. In general, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in d. We also show there is room for improvement: for a canonical online learnable class, the indicator functions of affine subspaces of dimension d, the information complexity can be upper-bounded logarithmically in d.
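The central quantity above, the Littlestone dimension, has a simple recursive characterization for finite classes: it is the largest depth of a complete binary mistake tree that the class can shatter, computable as Ldim(H) = max over points x that split H of 1 + min(Ldim(H restricted to h(x)=0), Ldim(H restricted to h(x)=1)). The following is a minimal illustrative sketch of that recursion (not code from the paper; the function name `ldim` and the finite-class, finite-domain setting are assumptions for illustration):

```python
def ldim(H, X):
    """Littlestone dimension of a finite class H of binary functions on X.

    Each hypothesis h is a tuple with h[x] in {0, 1} for every point x in X.
    Implements the recursion Ldim(H) = max_x (1 + min(Ldim(H_x^0), Ldim(H_x^1))),
    taken over points x on which H disagrees.
    """
    if len(H) <= 1:
        return len(H) - 1  # -1 for the empty class, 0 for a singleton
    best = 0
    for x in X:
        H0 = [h for h in H if h[x] == 0]  # hypotheses labeling x with 0
        H1 = [h for h in H if h[x] == 1]  # hypotheses labeling x with 1
        if H0 and H1:  # x splits the class, so both subtrees are realizable
            best = max(best, 1 + min(ldim(H0, X), ldim(H1, X)))
    return best

# The class of all 2^2 binary functions on two points shatters a
# depth-2 mistake tree, so its Littlestone dimension is 2.
print(ldim([(0, 0), (0, 1), (1, 0), (1, 1)], range(2)))  # → 2
```

On the class of all 2^n binary functions over n points this recursion returns n, matching the worst-case mistake bound of online learning; an infinite Littlestone dimension corresponds to a class that is not online learnable.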
Original language | English |
---|---|
Title of host publication | 2022 IEEE International Symposium on Information Theory, ISIT 2022 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 3055-3060 |
Number of pages | 6 |
ISBN (Electronic) | 9781665421591 |
DOIs | |
State | Published - 2022 |
Externally published | Yes |
Event | 2022 IEEE International Symposium on Information Theory, ISIT 2022 - Espoo, Finland |
Duration | 26 Jun 2022 → 1 Jul 2022 |
Publication series
Name | IEEE International Symposium on Information Theory - Proceedings |
---|---|
Volume | 2022-June |
ISSN (Print) | 2157-8095 |
Conference
Conference | 2022 IEEE International Symposium on Information Theory, ISIT 2022 |
---|---|
Country/Territory | Finland |
City | Espoo |
Period | 26/06/22 → 1/07/22 |
Bibliographical note
Publisher Copyright: © 2022 IEEE.
Keywords
- Littlestone dimension
- Mutual information
- PAC learning
ASJC Scopus subject areas
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics