Abstract
We study the problems of learning and testing junta distributions on {-1, 1}^n with respect to the uniform distribution, where a distribution p is a k-junta if its probability mass function p(x) depends on a subset of at most k variables. The main contribution is an algorithm for finding relevant coordinates in a k-junta distribution with subcube conditioning (Bhattacharyya and Chakraborty, 2018; Canonne et al., 2019). We give two applications:

- An algorithm for learning k-junta distributions with Õ(k/ε²) log n + O(2^k/ε²) subcube conditioning queries, and
- An algorithm for testing k-junta distributions with Õ((k + √n)/ε²) subcube conditioning queries.

All our algorithms are optimal up to poly-logarithmic factors. Our results show that subcube conditioning, as a natural model for accessing high-dimensional distributions, enables significant savings in learning and testing junta distributions compared to the standard sampling model. This addresses an open question posed by Aliakbarpour et al. (2016).
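To make the access model concrete: a subcube conditioning query specifies a restriction that fixes some coordinates of {-1, 1}^n, and the oracle returns a sample from the distribution conditioned on agreeing with that restriction. The sketch below is a hypothetical simulator (not the paper's algorithm) of such an oracle for a k-junta distribution, assuming the distribution is uniform on the non-junta coordinates; the function name, argument names, and the rejection-sampling strategy are illustrative choices, not from the source.

```python
import random

def subcube_sample(junta_coords, junta_pmf, n, restriction):
    """Draw one sample from a k-junta distribution on {-1, 1}^n,
    conditioned on a subcube restriction.

    junta_coords: list of the <= k relevant coordinate indices
        (assumption: the distribution is uniform on all others).
    junta_pmf: dict mapping tuples in {-1, 1}^k to probabilities.
    restriction: dict {coordinate: +1 or -1}; unlisted coordinates
        are left free ("*" in the restriction).

    Uses rejection sampling on the junta coordinates, which is fine
    for small k; assumes the restriction is consistent with the
    support of junta_pmf (otherwise this loops forever).
    """
    while True:
        # Sample the junta coordinates from the junta's pmf.
        vals = random.choices(list(junta_pmf),
                              weights=list(junta_pmf.values()))[0]
        assignment = dict(zip(junta_coords, vals))
        # Accept only if the sampled junta values agree with the
        # restriction on every fixed junta coordinate.
        if all(assignment[i] == v for i, v in restriction.items()
               if i in assignment):
            break
    x = []
    for i in range(n):
        if i in assignment:
            x.append(assignment[i])        # relevant coordinate
        elif i in restriction:
            x.append(restriction[i])       # fixed by the subcube
        else:
            x.append(random.choice([-1, 1]))  # free coordinate, uniform
    return x
```

For example, for a 2-junta supported on the two points where coordinates 0 and 2 agree, conditioning on coordinate 0 being +1 forces coordinate 2 to +1 in every returned sample, which is the kind of correlation the paper's algorithms exploit to find relevant coordinates.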
| Original language | English |
|---|---|
| Pages (from-to) | 1060-1113 |
| Number of pages | 54 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 134 |
| State | Published - 2021 |
| Externally published | Yes |
| Event | 34th Conference on Learning Theory, COLT 2021, Boulder, United States, 15 Aug 2021 – 19 Aug 2021 |
Bibliographical note
Publisher Copyright: © 2021 X. Chen, R. Jayaram, A. Levi & E. Waingarten.
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability