## Abstract

We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004; Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal.
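The abstract's high-level recipe — solve a sequence of penalized subproblems, each with a Nesterov-style accelerated first-order method, using only products with A and A^T — can be illustrated with a minimal sketch. This is not the authors' SPA (which smooths the penalty differently); it substitutes a standard FISTA-type accelerated proximal gradient solver for the subproblems min_x lam*||x||_1 + 0.5*||Ax - b||^2, shrinking the penalty weight `lam` between rounds to drive the iterates toward feasibility. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def penalized_l1_recover(A, b, lam0=1.0, n_outer=8, n_inner=200):
    """Hypothetical sketch, not the paper's SPA: solve a sequence of
    penalized subproblems  min_x lam*||x||_1 + 0.5*||Ax - b||^2
    with Nesterov-accelerated proximal gradient (FISTA), halving the
    penalty weight lam after each round.  Only A@x and A.T@y are used."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of x -> A.T@(A@x - b)
    x = np.zeros(n)
    lam = lam0
    for _ in range(n_outer):
        y, t = x.copy(), 1.0
        for _ in range(n_inner):
            grad = A.T @ (A @ y - b)
            x_new = soft_threshold(y - grad / L, lam / L)
            # Nesterov momentum update on the extrapolation point y.
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new
        lam *= 0.5  # tighten the penalty toward the constrained problem
    return x
```

On a random Gaussian A with a sparse ground-truth signal, the returned iterate is near-feasible (||Ax - b||_2 small) and close to the target, mirroring the ε-feasibility and ε-optimality guarantees the abstract describes for SPA.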

| Original language | English (US) |
|---|---|
| Pages (from-to) | 287-313 |
| Number of pages | 27 |
| Journal | SIAM Journal on Optimization |
| Volume | 21 |
| Issue number | 1 |
| DOIs | |
| State | Published - May 30 2011 |

## All Science Journal Classification (ASJC) codes

- Software
- Theoretical Computer Science