TY - JOUR
T1 - Distributed Algorithms for Composite Optimization
T2 - Unified Framework and Convergence Analysis
AU - Xu, Jinming
AU - Tian, Ye
AU - Sun, Ying
AU - Scutari, Gesualdo
N1 - Funding Information:
Manuscript received March 4, 2020; revised November 15, 2020, March 7, 2021, and May 6, 2021; accepted May 19, 2021. Date of publication June 7, 2021; date of current version June 29, 2021. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Antti Tölli. The work of Jinming Xu was supported in part by NSF of China under Grants 62003302, 62088101, 61922058, and U1909207, and by the Fundamental Research Funds for the Central Universities (2020QNA5014). The work of Ye Tian, Ying Sun, and Gesualdo Scutari was supported by the USA NSF Grant CIF 1719205, the ARO Grant W911NF1810238, and the ONR Grant no. 1317827. This work was presented in part at IEEE CAMSAP 2019 [1] and IEEE CDC 2020 [2]. (Corresponding author: Gesualdo Scutari.) Jinming Xu is with the College of Control Science and Engineering, Zhejiang University, Hangzhou 310027, China (e-mail: jimmyxu@zju.edu.cn).
Publisher Copyright:
© 1991-2012 IEEE.
PY - 2021
Y1 - 2021
N2 - We study distributed composite optimization over networks: agents minimize the sum of smooth (strongly) convex functions (the agents' sum-utility) plus a nonsmooth (extended-valued) convex one. We propose a general unified algorithmic framework for this class of problems and provide a convergence analysis leveraging the theory of operator splitting. Distinguishing features of our scheme are: (i) when each agent's function is strongly convex, the algorithm converges at a linear rate whose dependence on the agents' functions and on the network topology is decoupled; (ii) when the objective function is convex (but not strongly convex), a similar decoupling to that in (i) is established for the coefficient of the proved sublinear rate, which also reveals the role of function heterogeneity in the convergence rate; (iii) the algorithm can adjust the ratio between the number of communications and computations to achieve a rate (in terms of computations) independent of the network connectivity; and (iv) a by-product of our analysis is a tuning recommendation for several existing (non-accelerated) distributed algorithms, yielding a provably faster (worst-case) convergence rate for the class of problems under consideration.
AB - We study distributed composite optimization over networks: agents minimize the sum of smooth (strongly) convex functions (the agents' sum-utility) plus a nonsmooth (extended-valued) convex one. We propose a general unified algorithmic framework for this class of problems and provide a convergence analysis leveraging the theory of operator splitting. Distinguishing features of our scheme are: (i) when each agent's function is strongly convex, the algorithm converges at a linear rate whose dependence on the agents' functions and on the network topology is decoupled; (ii) when the objective function is convex (but not strongly convex), a similar decoupling to that in (i) is established for the coefficient of the proved sublinear rate, which also reveals the role of function heterogeneity in the convergence rate; (iii) the algorithm can adjust the ratio between the number of communications and computations to achieve a rate (in terms of computations) independent of the network connectivity; and (iv) a by-product of our analysis is a tuning recommendation for several existing (non-accelerated) distributed algorithms, yielding a provably faster (worst-case) convergence rate for the class of problems under consideration.
UR - http://www.scopus.com/inward/record.url?scp=85110726737&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85110726737&partnerID=8YFLogxK
U2 - 10.1109/TSP.2021.3086579
DO - 10.1109/TSP.2021.3086579
M3 - Article
AN - SCOPUS:85110726737
SN - 1053-587X
VL - 69
SP - 3555
EP - 3570
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
M1 - 9447939
ER -