### Abstract

Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.
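The convex program referenced in the abstract is the latent variable graphical lasso: given the sample covariance `Sigma` of the observed variables, find a sparse matrix `S` and a low-rank positive semidefinite matrix `L` minimizing `tr(Sigma @ (S - L)) - logdet(S - L) + alpha*||S||_1 + beta*tr(L)` subject to `S - L` positive definite and `L` PSD. Below is a minimal NumPy sketch of one plausible three-block ADMM for this problem; the splitting `R = S - L`, the penalty parameter `mu`, and the update formulas are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: prox of tau * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def lvglasso_admm(Sigma, alpha, beta, mu=1.0, iters=200):
    """Sketch of ADMM for
         min <R, Sigma> - logdet(R) + alpha*||S||_1 + beta*tr(L)
         s.t. R = S - L,  L PSD
       (hedged illustration; not the paper's exact method)."""
    p = Sigma.shape[0]
    S = np.eye(p)
    L = np.zeros((p, p))
    Lam = np.zeros((p, p))          # dual variable for R = S - L
    for _ in range(iters):
        # R-update: closed form via eigendecomposition.
        # Stationarity gives R - (1/mu) R^{-1} = M, solved per eigenvalue.
        M = S - L - Lam / mu - Sigma / mu
        g, Q = np.linalg.eigh((M + M.T) / 2)
        r = (g + np.sqrt(g ** 2 + 4.0 / mu)) / 2.0   # positive root => R > 0
        R = (Q * r) @ Q.T
        # S-update: soft-thresholding (prox of the l1 term).
        S = soft_threshold(R + L + Lam / mu, alpha / mu)
        # L-update: shift eigenvalues by beta/mu, project onto the PSD cone.
        N = S - R - Lam / mu - (beta / mu) * np.eye(p)
        h, P = np.linalg.eigh((N + N.T) / 2)
        L = (P * np.maximum(h, 0.0)) @ P.T
        # Dual ascent on the consensus constraint R - S + L = 0.
        Lam = Lam + mu * (R - S + L)
    return R, S, L
```

Each subproblem has a closed-form solution (an eigendecomposition for `R` and `L`, elementwise shrinkage for `S`), which is what makes alternating direction schemes attractive at the scale the abstract reports.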

| Original language | English (US) |
|---|---|
| Pages (from-to) | 2172-2198 |
| Number of pages | 27 |
| Journal | Neural Computation |
| Volume | 25 |
| Issue number | 8 |
| DOIs | https://doi.org/10.1162/NECO_a_00379 |
| State | Published - Aug 7 2013 |


### All Science Journal Classification (ASJC) codes

- Arts and Humanities (miscellaneous)
- Cognitive Neuroscience

### Cite this

Ma, S., Xue, L., & Zou, H. (2013). Alternating direction methods for latent variable Gaussian graphical model selection. *Neural Computation*, *25*(8), 2172-2198. https://doi.org/10.1162/NECO_a_00379

Research output: Contribution to journal › Letter

TY - JOUR

T1 - Alternating direction methods for latent variable Gaussian graphical model selection

AU - Ma, Shiqian

AU - Xue, Lingzhou

AU - Zou, Hui

PY - 2013/8/7

Y1 - 2013/8/7

N2 - Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.

AB - Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. This convex optimization problem aims to estimate an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix from sample data. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving this problem. The first method is to apply the classic alternating direction method of multipliers to solve the problem as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large problems very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.

UR - http://www.scopus.com/inward/record.url?scp=84880972942&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84880972942&partnerID=8YFLogxK

U2 - 10.1162/NECO_a_00379

DO - 10.1162/NECO_a_00379

M3 - Letter

C2 - 23607561

AN - SCOPUS:84880972942

VL - 25

SP - 2172

EP - 2198

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 8

ER -