Alternating direction methods for latent variable Gaussian graphical model selection

Shiqian Ma, Lingzhou Xue, Hui Zou

Research output: Contribution to journal › Letter

43 Citations (Scopus)

Abstract

Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.
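The underlying optimization problem, minimizing -log det(S - L) + ⟨Σ̂, S - L⟩ + α‖S‖₁ + β tr(L) subject to L ⪰ 0, splits naturally into proximal steps. The sketch below is a generic three-block ADMM illustration of that splitting, not a reproduction of the paper's two algorithms; the function name `lvggm_admm` and the parameter defaults are hypothetical choices for the example.

```python
import numpy as np

def soft_threshold(X, t):
    """Elementwise soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def lvggm_admm(Sigma, alpha=0.1, beta=0.1, rho=1.0, n_iter=200):
    """Generic 3-block ADMM sketch (illustrative, not the paper's method) for
       min  -logdet(R) + <Sigma, R> + alpha*||S||_1 + beta*tr(L)
       s.t. R = S - L,  L positive semidefinite."""
    p = Sigma.shape[0]
    S = np.eye(p)
    L = np.zeros((p, p))
    U = np.zeros((p, p))  # scaled dual variable for R - S + L = 0
    for _ in range(n_iter):
        # R-update: prox of -logdet + <Sigma, .>, solved by eigendecomposition
        lam, Q = np.linalg.eigh(rho * (S - L - U) - Sigma)
        gamma = (lam + np.sqrt(lam**2 + 4.0 * rho)) / (2.0 * rho)
        R = (Q * gamma) @ Q.T            # always positive definite
        # S-update: prox of alpha*||.||_1 is soft-thresholding
        S = soft_threshold(R + L + U, alpha / rho)
        # L-update: shrink eigenvalues by beta/rho, project onto the PSD cone
        mu, P = np.linalg.eigh(S - R - U)
        L = (P * np.maximum(mu - beta / rho, 0.0)) @ P.T
        # dual ascent on the coupling constraint
        U = U + R - S + L
    return R, S, L
```

Each subproblem is closed-form: the R-step costs one eigendecomposition, the S-step is elementwise, and the L-step is another eigendecomposition, which is the "special structure" an alternating direction scheme exploits here.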

Original language: English (US)
Pages (from-to): 2172-2198
Number of pages: 27
Journal: Neural computation
Volume: 25
Issue number: 8
DOIs: 10.1162/NECO_a_00379
State: Published - Aug 7 2013

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

Cite this

@article{a7bd9c6820cd43eab8b5052591502e12,
title = "Alternating direction methods for latent variable Gaussian graphical model selection",
abstract = "Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.",
author = "Shiqian Ma and Lingzhou Xue and Hui Zou",
year = "2013",
month = "8",
day = "7",
doi = "10.1162/NECO_a_00379",
language = "English (US)",
volume = "25",
pages = "2172--2198",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "MIT Press Journals",
number = "8",

}

Alternating direction methods for latent variable Gaussian graphical model selection. / Ma, Shiqian; Xue, Lingzhou; Zou, Hui.

In: Neural computation, Vol. 25, No. 8, 07.08.2013, p. 2172-2198.

Research output: Contribution to journal › Letter

TY - JOUR

T1 - Alternating direction methods for latent variable Gaussian graphical model selection

AU - Ma, Shiqian

AU - Xue, Lingzhou

AU - Zou, Hui

PY - 2013/8/7

Y1 - 2013/8/7

N2 - Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.

AB - Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.

UR - http://www.scopus.com/inward/record.url?scp=84880972942&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84880972942&partnerID=8YFLogxK

U2 - 10.1162/NECO_a_00379

DO - 10.1162/NECO_a_00379

M3 - Letter

C2 - 23607561

AN - SCOPUS:84880972942

VL - 25

SP - 2172

EP - 2198

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 8

ER -