TY - JOUR

T1 - SI-ADMM

T2 - A Stochastic Inexact ADMM Framework for Stochastic Convex Programs

AU - Xie, Yue

AU - Shanbhag, Uday V.

N1 - Funding Information:
Manuscript received March 8, 2019; accepted July 6, 2019. Date of publication November 13, 2019; date of current version May 28, 2020. This work was supported in part by the NSF Awards CMMI-1538605 and CMMI-1246887 (CAREER). This paper was presented in part at the Winter Simulation Conference [1]. Recommended by Associate Editor S. S. Saab. (Corresponding author: Uday V. Shanbhag.) The authors are with the Industrial and Manufacturing Engineering, Pennsylvania State University, University Park, PA 16802 USA (e-mail: xieyue1990@gmail.com; udaybag@psu.edu).
Publisher Copyright:
© 1963-2012 IEEE.

PY - 2020/6

Y1 - 2020/6

N2 - We consider the structured stochastic convex program requiring the minimization of \mathbb {E}_\xi [\tilde{f}(x,\xi)]+\mathbb {E}_\xi [\tilde{g}(y,\xi)] subject to the constraint Ax + By = b. Motivated by the need for decentralized schemes, we propose a stochastic inexact alternating direction method of multipliers (SI-ADMM) framework where subproblems are solved inexactly via stochastic approximation schemes. Based on this framework, we prove the following: 1) under suitable assumptions on the associated batch-size of samples utilized at each iteration, the SI-ADMM scheme produces a sequence that converges to the unique solution almost surely; 2) if the number of gradient steps (or equivalently, the number of sampled gradients) utilized for solving the subproblems in each iteration increases at a geometric rate, the mean-squared error diminishes to zero at a prescribed geometric rate; and 3) the overall iteration complexity in terms of gradient steps (or equivalently samples) is found to be consistent with the canonical level of \mathcal {O}(1/\epsilon). Preliminary applications on LASSO and distributed regression suggest that the scheme performs well compared to its competitors.

AB - We consider the structured stochastic convex program requiring the minimization of \mathbb {E}_\xi [\tilde{f}(x,\xi)]+\mathbb {E}_\xi [\tilde{g}(y,\xi)] subject to the constraint Ax + By = b. Motivated by the need for decentralized schemes, we propose a stochastic inexact alternating direction method of multipliers (SI-ADMM) framework where subproblems are solved inexactly via stochastic approximation schemes. Based on this framework, we prove the following: 1) under suitable assumptions on the associated batch-size of samples utilized at each iteration, the SI-ADMM scheme produces a sequence that converges to the unique solution almost surely; 2) if the number of gradient steps (or equivalently, the number of sampled gradients) utilized for solving the subproblems in each iteration increases at a geometric rate, the mean-squared error diminishes to zero at a prescribed geometric rate; and 3) the overall iteration complexity in terms of gradient steps (or equivalently samples) is found to be consistent with the canonical level of \mathcal {O}(1/\epsilon). Preliminary applications on LASSO and distributed regression suggest that the scheme performs well compared to its competitors.

UR - http://www.scopus.com/inward/record.url?scp=85086049259&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85086049259&partnerID=8YFLogxK

U2 - 10.1109/TAC.2019.2953209

DO - 10.1109/TAC.2019.2953209

M3 - Article

AN - SCOPUS:85086049259

SN - 0018-9286

VL - 65

SP - 2355

EP - 2370

JO - IEEE Transactions on Automatic Control

JF - IEEE Transactions on Automatic Control

IS - 6

M1 - 8897605

ER -