### Abstract

We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e., their ℓ1-norm is within ε of the optimal value.
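The scheme described in the abstract — a sequence of penalized subproblems, each handled by an optimal first-order method that touches A only through the products Ax and A^T y — can be illustrated with a minimal sketch. This is not the authors' exact algorithm: the smoothing (a Huber approximation of the ℓ1-norm), the penalty schedule `penalties`, and all parameter values below are illustrative assumptions, and the inner solver is the standard accelerated gradient (FISTA-style) iteration rather than the specific variant analyzed in the paper.

```python
import numpy as np

def huber_grad(x, mu):
    # Gradient of the Huber smoothing of ||x||_1 with parameter mu:
    # x/mu on [-mu, mu], sign(x) outside.
    return np.clip(x / mu, -1.0, 1.0)

def spa_sketch(A, b, penalties=(1.0, 10.0, 100.0), smooth=1e-3,
               inner_iters=500):
    """Illustrative smoothed-penalty scheme: for each penalty weight rho,
    approximately minimize  huber_mu(x) + (rho/2)*||Ax - b||^2
    with Nesterov-type accelerated gradient steps, warm-starting each
    subproblem from the previous solution."""
    m, n = A.shape
    x = np.zeros(n)
    # ||A||_2^2; for large A this would be estimated by power iteration,
    # since only products with A and A^T are assumed to be cheap.
    L_A = np.linalg.norm(A, 2) ** 2
    for rho in penalties:
        L = 1.0 / smooth + rho * L_A        # Lipschitz constant of the gradient
        y, x_prev, t = x.copy(), x.copy(), 1.0
        for _ in range(inner_iters):
            grad = huber_grad(y, smooth) + rho * (A.T @ (A @ y - b))
            x = y - grad / L                # gradient step
            t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum step
            x_prev, t = x.copy(), t_next
    return x
```

Note that each inner iteration costs one product with A and one with A^T, which is the property the abstract highlights: no orthogonality of the rows of A is used anywhere.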

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 287-313 |
| Number of pages | 27 |
| Journal | SIAM Journal on Optimization |
| Volume | 21 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1137/090762294 |
| State | Published - May 30 2011 |

### All Science Journal Classification (ASJC) codes

- Software
- Theoretical Computer Science

### Cite this

Aybat, N. S., & Iyengar, G. (2011). A first-order smoothed penalty method for compressed sensing. *SIAM Journal on Optimization*, *21*(1), 287-313. https://doi.org/10.1137/090762294

Research output: Contribution to journal › Article

TY - JOUR

T1 - A first-order smoothed penalty method for compressed sensing

AU - Aybat, N. S.

AU - Iyengar, G.

PY - 2011/5/30

Y1 - 2011/5/30

N2 - We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e., their ℓ1-norm is within ε of the optimal value.

AB - We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e., their ℓ1-norm is within ε of the optimal value.

UR - http://www.scopus.com/inward/record.url?scp=79957449870&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79957449870&partnerID=8YFLogxK

U2 - 10.1137/090762294

DO - 10.1137/090762294

M3 - Article

AN - SCOPUS:79957449870

VL - 21

SP - 287

EP - 313

JO - SIAM Journal on Optimization

JF - SIAM Journal on Optimization

SN - 1052-6234

IS - 1

ER -