Abstract
We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method using adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method combines the efficiency and flexibility of block coordinate descent methods with the simplicity of primal-dual methods, while exploiting the structure of the separable convex-concave saddle point problem. It is capable of solving a wide range of machine learning applications, including robust principal component analysis, Lasso, and feature selection by group Lasso. Theoretically and empirically, we demonstrate significantly better performance than state-of-the-art methods in all these applications.
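To illustrate the setting described in the abstract, the following is a minimal sketch (not the paper's exact algorithm) of a generic stochastic block coordinate primal-dual iteration, applied to Lasso written as the saddle point problem min_x max_y ⟨y, Ax − b⟩ − ½‖y‖² + λ‖x‖₁. All function names, block partitioning, and step-size choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sbcd_primal_dual_lasso(A, b, lam, n_blocks=4, iters=5000, seed=0):
    """Illustrative stochastic block coordinate primal-dual sketch for
    min_x max_y <y, Ax - b> - 0.5||y||^2 + lam*||x||_1  (i.e., Lasso).
    This is a generic Chambolle-Pock-style scheme with randomly sampled
    primal blocks, NOT the specific adaptive method from the paper."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = np.linalg.norm(A, 2)        # spectral norm of the coupling matrix
    tau = sigma = 1.0 / L           # simple step sizes with tau*sigma*L^2 = 1
    x = np.zeros(n)
    y = np.zeros(m)
    blocks = np.array_split(np.arange(n), n_blocks)
    for _ in range(iters):
        j = rng.integers(n_blocks)  # sample one primal coordinate block
        idx = blocks[j]
        x_old = x[idx].copy()
        # primal prox-gradient step on the sampled block only
        g = A[:, idx].T @ y
        x[idx] = soft_threshold(x_old - tau * g, tau * lam)
        # extrapolated point (Chambolle-Pock style) for the dual step
        x_bar = x.copy()
        x_bar[idx] = 2.0 * x[idx] - x_old
        # dual prox step for f*(y) = 0.5||y||^2 + <b, y>
        y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
    return x
```

Because only one block of `x` is touched per iteration, the primal updates across blocks could in principle be distributed, which is the kind of parallel flexibility the abstract refers to.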
Original language | English |
---|---|
Title of host publication | Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence |
Publisher | AAAI Press |
Pages | 2429-2534 |
Number of pages | 106 |
Publication status | Published - Feb 2016 |
Event | Thirtieth AAAI Conference on Artificial Intelligence - Phoenix, United States |
Duration | 12 Feb 2016 → 17 Feb 2016 |
Internet address | https://www.aaai.org/Conferences/AAAI/aaai16.php |
Publication series
Name | AAAI'16 |
---|---|
Publisher | AAAI Press |
Conference
Conference | Thirtieth AAAI Conference on Artificial Intelligence |
---|---|
Abbreviated title | AAAI-16 |
Country/Territory | United States |
City | Phoenix |
Period | 12/02/16 → 17/02/16 |
Internet address | https://www.aaai.org/Conferences/AAAI/aaai16.php |