Mini-Batch Primal and Dual Methods for SVMs

Martin Takáč, Avleen Bijral, Nathan Srebro, Peter Richtárik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We address the issue of using mini-batches in stochastic optimization of SVMs. We show that the same quantity, the spectral norm of the data, controls the parallelization speedup obtained for both primal stochastic subgradient descent (SGD) and stochastic dual coordinate ascent (SDCA) methods, and use it to derive novel variants of mini-batched SDCA. Our guarantees for both methods are expressed in terms of the original nonsmooth primal problem based on the hinge-loss.
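To make the primal side of the abstract concrete, below is a minimal Python sketch of a mini-batch subgradient step on the hinge-loss SVM objective (Pegasos-style step size). The function name, the step-size schedule, and the toy data are illustrative assumptions, not the paper's exact algorithm or its spectral-norm-dependent variants.

```python
import numpy as np

def minibatch_svm_sgd(X, y, lam=0.01, batch_size=16, epochs=10, seed=0):
    """Mini-batch subgradient descent for the hinge-loss SVM:
    min_w  lam/2 ||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>).
    Illustrative sketch only; not the paper's tuned mini-batch variant.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for _ in range(n // batch_size):
            t += 1
            idx = rng.choice(n, size=batch_size, replace=False)
            Xb, yb = X[idx], y[idx]
            margins = yb * (Xb @ w)
            active = margins < 1.0      # examples with a nonzero hinge subgradient
            eta = 1.0 / (lam * t)       # Pegasos-style step size (assumed schedule)
            grad = lam * w - (Xb[active].T @ yb[active]) / batch_size
            w -= eta * grad
    return w

if __name__ == "__main__":
    # Toy usage on synthetic linearly separable data.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ rng.normal(size=5))
    w = minibatch_svm_sgd(X, y)
    print(f"training accuracy: {np.mean(np.sign(X @ w) == y):.2f}")
```

The averaging by `batch_size` in the gradient is the point where mini-batching enters; the paper's analysis ties how large this batch can be (without hurting the convergence guarantee) to the spectral norm of the data.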
Original language: English
Title of host publication: JMLR Workshop and Conference Proceedings
Subtitle of host publication: Proceedings of the 30th International Conference on Machine Learning
Pages: 1022-1030
Volume: 28
Edition: 3
Publication status: Published - 2013
