A first-order smoothed penalty method for compressed sensing

N. S. Aybat, G. Iyengar

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004; Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e.
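The general idea the abstract describes — replacing the constraint Ax = b with a penalty term and solving each penalized subproblem with a Nesterov-type first-order method — can be sketched as follows. This is an illustrative quadratic-penalty scheme with FISTA-style inner iterations, not the exact SPA algorithm from the paper; the parameters (`rho`, `rho_growth`, iteration counts) are made-up defaults for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def penalty_sparse_recovery(A, b, rho=1.0, rho_growth=10.0,
                            outer_iters=5, inner_iters=200):
    """Sketch of a penalty scheme for min ||x||_1 s.t. Ax = b.

    Each subproblem  min ||x||_1 + (rho/2) ||Ax - b||^2  is solved with
    an accelerated (Nesterov-type) proximal gradient method, and the
    penalty parameter rho is increased between subproblems so that the
    iterates become increasingly feasible.
    """
    m, n = A.shape
    x = np.zeros(n)
    # Spectral norm squared: Lipschitz constant of A^T(Ax - b).
    L0 = np.linalg.norm(A, 2) ** 2
    for _ in range(outer_iters):
        L = rho * L0                     # Lipschitz constant of the penalty gradient
        y, x_prev, t = x.copy(), x.copy(), 1.0
        for _ in range(inner_iters):
            grad = rho * (A.T @ (A @ y - b))
            x_new = soft_threshold(y - grad / L, 1.0 / L)
            # Nesterov momentum update.
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_new + ((t - 1.0) / t_new) * (x_new - x_prev)
            x_prev, t = x_new, t_new
        x = x_prev                       # warm-start the next subproblem
        rho *= rho_growth
    return x
```

Note that the only way the sketch touches A is through the products Ax and A^T y, which matches the efficiency condition stated in the abstract.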

Original language: English (US)
Pages (from-to): 287-313
Number of pages: 27
Journal: SIAM Journal on Optimization
Volume: 21
Issue number: 1
DOI: 10.1137/090762294
State: Published - May 30 2011


All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science

Cite this

@article{f74a7299a89a4c32ab0ff9f22f484d59,
title = "A first-order smoothed penalty method for compressed sensing",
abstract = "We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004; Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e.",
author = "Aybat, {N. S.} and G. Iyengar",
year = "2011",
month = "5",
day = "30",
doi = "10.1137/090762294",
language = "English (US)",
volume = "21",
pages = "287--313",
journal = "SIAM Journal on Optimization",
issn = "1052-6234",
publisher = "Society for Industrial and Applied Mathematics Publications",
number = "1",

}

A first-order smoothed penalty method for compressed sensing. / Aybat, N. S.; Iyengar, G.

In: SIAM Journal on Optimization, Vol. 21, No. 1, 30.05.2011, p. 287-313.

Research output: Contribution to journal › Article

TY - JOUR

T1 - A first-order smoothed penalty method for compressed sensing

AU - Aybat, N. S.

AU - Iyengar, G.

PY - 2011/5/30

Y1 - 2011/5/30

N2 - We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004; Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e.

AB - We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004; Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e.

UR - http://www.scopus.com/inward/record.url?scp=79957449870&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79957449870&partnerID=8YFLogxK

U2 - 10.1137/090762294

DO - 10.1137/090762294

M3 - Article

AN - SCOPUS:79957449870

VL - 21

SP - 287

EP - 313

JO - SIAM Journal on Optimization

JF - SIAM Journal on Optimization

SN - 1052-6234

IS - 1

ER -