Abstract
In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called COORDINATE-ASCENT+, achieves a ((e-1)/(2e-1) - ε)-approximation guarantee while performing O(n/ε) iterations, where the computational complexity of each iteration is roughly O(n/√ε + n log n) (here, n denotes the dimension of the optimization problem). We then propose COORDINATE-ASCENT++, which achieves the tight (1 - 1/e - ε)-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly O(n^3/ε^{2.5} + n^3 log n/ε^2) per iteration. However, the computation of each round of COORDINATE-ASCENT++ can be easily parallelized so that the computational cost per machine scales as O(n/√ε + n log n).
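The abstract states only the guarantees and complexities; for intuition, below is a minimal Python sketch of a vanilla greedy coordinate-ascent loop for maximizing a monotone continuous submodular function F over [0, 1]^n subject to a linear budget constraint ⟨c, x⟩ ≤ b. The discretization step, stopping rule, and example objective are illustrative assumptions, and this is not the paper's COORDINATE-ASCENT+ or COORDINATE-ASCENT++ procedure, which add further machinery to obtain the stated approximation ratios.

```python
import numpy as np

def coordinate_ascent(F, c, b, step=0.05, tol=1e-9):
    """Greedy coordinate ascent (illustrative sketch).

    Maximizes F(x) subject to c @ x <= b and 0 <= x <= 1 by repeatedly
    increasing the coordinate with the best marginal gain per unit cost.

    F    : monotone continuous submodular objective, F(x) -> float
    c    : strictly positive cost vector of length n
    b    : total budget
    step : coordinate increment per iteration (assumed discretization)
    """
    n = len(c)
    x = np.zeros(n)
    budget = float(b)
    while budget > tol:
        base = F(x)
        best = None  # (gain-per-cost ratio, coordinate, increment)
        for i in range(n):
            if x[i] >= 1.0 or c[i] > budget + tol:
                continue  # coordinate saturated or too expensive
            delta = min(step, 1.0 - x[i], budget / c[i])
            trial = x.copy()
            trial[i] += delta
            ratio = (F(trial) - base) / (c[i] * delta)
            if best is None or ratio > best[0]:
                best = (ratio, i, delta)
        if best is None or best[0] <= 0:
            break  # no coordinate improves the objective
        _, i, delta = best
        x[i] += delta
        budget -= c[i] * delta
    return x

if __name__ == "__main__":
    # Hypothetical usage with a coverage-style objective, which is
    # monotone and continuous submodular over [0, 1]^n.
    rng = np.random.default_rng(0)
    p = rng.uniform(0.2, 0.9, size=(5, 8))  # p[i, j]: prob. item i covers element j

    def F(x):
        return float(np.sum(1.0 - np.prod(1.0 - p * x[:, None], axis=0)))

    c = rng.uniform(0.5, 1.5, size=5)  # positive linear costs
    x = coordinate_ascent(F, c, b=2.0)
    print(x, F(x))
```

Per-unit-cost greedy selection is the standard heuristic for budgeted submodular problems; the paper's contribution is showing how coordinate-ascent variants of this idea attain constant-factor guarantees in the continuous setting.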
| Original language | English |
| --- | --- |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2020-December |
| State | Published - Dec 2020 |
| Event | 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online; Duration: 6 Dec 2020 → 12 Dec 2020 |
Bibliographical note
Funding Information: The work of Moran Feldman was supported in part by ISF grants no. 1357/16 and 459/20. Amin Karbasi is partially supported by NSF (IIS-1845032), ONR (N00014-19-1-2406), AFOSR (FA9550-18-1-0160), and TATA Sons Private Limited.
Publisher Copyright:
© 2020 Neural information processing systems foundation. All rights reserved.
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing