Compositional Stochastic Average Gradient for Machine Learning and Related Applications

Tsung Yu Hsieh, Yasser EL-Manzalawy, Yiwei Sun, Vasant Honavar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Many machine learning and statistical inference problems require minimizing a composition of expected-value functions (CEVF). Of particular interest are the finite-sum versions of such compositional optimization problems (FS-CEVF). Compositional stochastic variance reduced gradient (C-SVRG) methods, which combine stochastic compositional gradient descent (SCGD) and stochastic variance reduced gradient descent (SVRG), are the state-of-the-art methods for FS-CEVF problems. We introduce compositional stochastic average gradient descent (C-SAG), a novel extension of the stochastic average gradient method (SAG), to minimize compositions of finite-sum functions. C-SAG, like SAG, estimates the gradient by incorporating memory of previous gradient information. We present theoretical analyses of C-SAG showing that, like C-SVRG, it achieves a linear convergence rate for strongly convex objective functions; however, C-SAG achieves lower oracle query complexity per iteration than C-SVRG. Finally, we present experimental results showing that C-SAG converges substantially faster than full gradient (FG) descent as well as C-SVRG.
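The abstract notes that C-SAG, like SAG, keeps a memory of previously computed per-example gradients and steps along their average. The paper's compositional variant is not fully specified in the abstract, so the following is only a minimal sketch of the classic SAG update (Schmidt, Le Roux, and Bach) on a plain finite-sum least-squares problem; the problem setup, function name, and step size are illustrative assumptions, not the authors' C-SAG algorithm.

```python
import numpy as np

def sag_least_squares(A, b, lr=0.01, iters=5000, seed=0):
    """Sketch of the classic SAG update on min_x (1/n) sum_i 0.5*(a_i^T x - b_i)^2.

    Illustrative only: C-SAG extends this gradient-memory idea to
    compositions of finite sums, which is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    grad_table = np.zeros((n, d))  # memory of the last gradient seen for each example
    grad_sum = np.zeros(d)         # running sum of the stored gradients
    for _ in range(iters):
        i = rng.integers(n)
        g = A[i] * (A[i] @ x - b[i])    # fresh gradient of example i
        grad_sum += g - grad_table[i]   # replace example i's old gradient in the sum
        grad_table[i] = g
        x -= lr * grad_sum / n          # step along the average of stored gradients
    return x
```

Only one fresh gradient is computed per iteration, yet the step uses information from all examples via the memory table; this is the trade of memory for oracle queries that the abstract contrasts with C-SVRG's periodic full-gradient recomputation.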

Original language: English (US)
Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2018 - 19th International Conference, Proceedings
Editors: Hujun Yin, Paulo Novais, David Camacho, Antonio J. Tallón-Ballesteros
Publisher: Springer Verlag
Pages: 740-752
Number of pages: 13
ISBN (Print): 9783030034924
DOIs: https://doi.org/10.1007/978-3-030-03493-1_77
State: Published - Jan 1 2018
Event: 19th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2018 - Madrid, Spain
Duration: Nov 21 2018 - Nov 23 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11314 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 19th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2018
Country: Spain
City: Madrid
Period: 11/21/18 - 11/23/18

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)


  • Cite this

    Hsieh, T. Y., EL-Manzalawy, Y., Sun, Y., & Honavar, V. (2018). Compositional Stochastic Average Gradient for Machine Learning and Related Applications. In H. Yin, P. Novais, D. Camacho, & A. J. Tallón-Ballesteros (Eds.), Intelligent Data Engineering and Automated Learning – IDEAL 2018 - 19th International Conference, Proceedings (pp. 740-752). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11314 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-03493-1_77