AIDE: Fast and Communication Efficient Distributed Optimization

Sashank J. Reddi, Jakub Konečný, Peter Richtárik, Barnabás Póczós, Alex Smola

Research output: Working paper


In this paper, we present two new communication-efficient methods for distributed minimization of an average of functions. The first algorithm is an inexact variant of the DANE algorithm that allows any local algorithm to return an approximate solution to a local subproblem. We show that such a strategy does not significantly affect the theoretical guarantees of DANE. In fact, our approach can be viewed as a robustification strategy, since the method is substantially better behaved than DANE on data partitions arising in practice. It is well known that the DANE algorithm does not match the communication complexity lower bounds. To bridge this gap, we propose an accelerated variant of the first method, called AIDE, that not only matches the communication lower bounds but can also be implemented using a purely first-order oracle. Our empirical results show that AIDE is superior to other communication-efficient algorithms in settings that naturally arise in machine learning applications.
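The inexact-DANE idea described above can be illustrated with a minimal sketch. In each round, machines average their local gradients (one communication), then each machine *approximately* solves its DANE subproblem with a few local gradient steps rather than exactly, and the local solutions are averaged (a second communication). The function name `inexact_dane` and the parameters `mu`, `local_steps`, and `lr` are illustrative choices, not the paper's notation; the toy problem is a least-squares model split across simulated "machines".

```python
import numpy as np

def inexact_dane(As, bs, w0, rounds=20, mu=1.0, local_steps=25, lr=0.1):
    """Sketch of inexact DANE on f_i(w) = 1/(2 n_i) ||A_i w - b_i||^2."""
    w = w0.copy()
    m = len(As)
    for _ in range(rounds):
        # Communication round 1: average the local gradients at w.
        local_grads = [A.T @ (A @ w - b) / len(b) for A, b in zip(As, bs)]
        g = sum(local_grads) / m
        new_ws = []
        for (A, b), gi in zip(zip(As, bs), local_grads):
            # Each machine approximately solves its DANE subproblem
            #   min_v f_i(v) - (grad f_i(w) - g)^T v + (mu/2)||v - w||^2
            # with a few plain gradient steps -- the "inexact" part.
            v = w.copy()
            for _ in range(local_steps):
                grad_fi = A.T @ (A @ v - b) / len(b)
                v -= lr * (grad_fi - (gi - g) + mu * (v - w))
            new_ws.append(v)
        # Communication round 2: average the approximate local solutions.
        w = sum(new_ws) / m
    return w

# Toy usage: a noiseless linear model split across 4 simulated machines.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(5)
As = [rng.standard_normal((50, 5)) for _ in range(4)]
bs = [A @ w_true for A in As]
w0 = np.zeros(5)
w_hat = inexact_dane(As, bs, w0)
```

AIDE then wraps such an inner solver in a Catalyst-style acceleration loop (repeatedly solving a regularized proxy problem inexactly), which is what closes the gap to the communication lower bounds.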
Original language: English
Publication status: Published - 24 Aug 2016


  • math.OC
  • cs.LG
  • stat.ML


