Stochastic Parallel Block Coordinate Descent for Large-scale Saddle Point Problems

Zhanxing Zhu, Amos J. Storkey

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method using adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method combines the efficiency and flexibility of block coordinate descent methods with the simplicity of primal-dual methods, while exploiting the structure of the separable convex-concave saddle point problem. It can solve a wide range of machine learning problems, including robust principal component analysis, Lasso, and feature selection by group Lasso. Theoretically and empirically, we demonstrate significantly better performance than state-of-the-art methods in all these applications.
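To make the problem class concrete, the following is a minimal, hypothetical sketch of a stochastic primal-dual block coordinate update applied to the saddle point reformulation of Lasso, min_x max_y <Ax − b, y> − ½||y||² + λ||x||₁. The fixed step sizes, uniform block sampling, and sequential updates here are generic placeholders for illustration only; they are not the adaptive primal-dual scheme or the parallel implementation proposed in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_pd_bcd_lasso(A, b, lam, n_blocks=10, n_iters=1000, seed=0):
    """Generic stochastic primal-dual block coordinate sketch for the
    Lasso saddle point  min_x max_y <Ax - b, y> - 0.5*||y||^2 + lam*||x||_1.
    Step sizes and sampling are simple placeholders, not the paper's
    adaptive primal-dual updates."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Conservative fixed step sizes from a spectral norm bound on A
    # (the paper instead adapts the primal-dual step sizes).
    L = np.linalg.norm(A, 2)
    tau, sigma = 1.0 / L, 1.0 / L
    x, y = np.zeros(n), np.zeros(m)
    blocks = np.array_split(np.arange(n), n_blocks)
    for _ in range(n_iters):
        # Sample one primal block uniformly at random; a parallel variant
        # would update several blocks concurrently.
        blk = blocks[rng.integers(n_blocks)]
        # Primal block update: proximal (soft-thresholding) step on the
        # sampled block, using the current dual variable.
        grad_blk = A[:, blk].T @ y
        x[blk] = soft_threshold(x[blk] - tau * grad_blk, tau * lam)
        # Dual update: proximal ascent step on <Ax - b, y> - 0.5*||y||^2.
        y = (y + sigma * (A @ x - b)) / (1.0 + sigma)
    return x
```

As a usage sketch, `stochastic_pd_bcd_lasso(A, b, lam=0.1)` on a random matrix `A` and vector `b` returns an approximate Lasso solution; the block structure is what allows the primal coordinates to be partitioned and updated in parallel, which is the setting the paper targets.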
Original language: English
Title of host publication: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence
Publisher: AAAI Press
Pages: 2429-2534
Number of pages: 106
Publication status: Published - Feb 2016
Event: Thirtieth AAAI Conference on Artificial Intelligence - Phoenix, United States
Duration: 12 Feb 2016 - 17 Feb 2016
https://www.aaai.org/Conferences/AAAI/aaai16.php

Publication series

Name: AAAI'16
Publisher: AAAI Press

Conference

Conference: Thirtieth AAAI Conference on Artificial Intelligence
Abbreviated title: AAAI-16
Country/Territory: United States
City: Phoenix
Period: 12/02/16 - 17/02/16
Internet address: https://www.aaai.org/Conferences/AAAI/aaai16.php

