More accurate bidiagonal reduction for computing the singular value decomposition

Research output: Contribution to journal › Article

19 Citations (Scopus)

Abstract

Bidiagonal reduction is the preliminary stage for the fastest stable algorithms for computing the singular value decomposition (SVD) now available. However, the best-known error bounds on bidiagonal reduction methods on any matrix are of the form A + δA = UBV^T, ∥δA∥₂ ≤ ε_M f(m,n)∥A∥₂, where B is bidiagonal, U and V are orthogonal, ε_M is machine precision, and f(m,n) is a modestly growing function of the dimensions of A. A preprocessing technique analyzed by Higham [Linear Algebra Appl., 309 (2000), pp. 153-174] uses orthogonal factorization with column pivoting to obtain the factorization A = Q (C^T 0)^T P^T, where Q is orthogonal, C is lower triangular, and P is a permutation matrix. Bidiagonal reduction is applied to the resulting matrix C. To do that reduction, a new Givens-based bidiagonalization algorithm is proposed that produces a bidiagonal matrix B that satisfies C + δC = U(B + δB)V^T, where δB is bounded componentwise and δC satisfies a columnwise bound (based upon the growth of the lower right corner of C), with U and V orthogonal to nearly working precision. Once we have that reduction, there is a good menu of algorithms that obtain the singular values of the bidiagonal matrix B to relative accuracy, thus obtaining an SVD of C that can be much more accurate than that obtained from standard bidiagonal reduction procedures. The additional operations required over the standard bidiagonal reduction algorithm of Golub and Kahan [J. Soc. Indust. Appl. Math. Ser. B Numer. Anal., 2 (1965), pp. 205-224] are those for using Givens rotations instead of Householder transformations to compute the matrix V, and 2n³/3 flops to compute column norms.
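The preprocessing factorization described in the abstract can be sketched in Python (a minimal illustration with assumed shapes, using SciPy's column-pivoted QR rather than the paper's own routines; the Givens-based bidiagonalization itself is not reproduced here). Column-pivoted QR gives A = Q (C^T 0)^T P^T with C lower triangular, and the singular values of C equal those of A because Q and P are orthogonal:

```python
import numpy as np
from scipy.linalg import qr

# Illustrative sketch, not the paper's code: column-pivoted QR of a random
# m-by-n matrix A, yielding a lower triangular C whose singular values
# match those of A.
rng = np.random.default_rng(0)
m, n = 8, 5
A = rng.standard_normal((m, n))

# Column-pivoted QR: A[:, piv] = Q @ R with R upper triangular.
Q, R, piv = qr(A, pivoting=True)
C = R[:n, :n].T                     # lower triangular factor C = R^T

sv_C = np.linalg.svd(C, compute_uv=False)
sv_A = np.linalg.svd(A, compute_uv=False)
assert np.allclose(C, np.tril(C))   # C really is lower triangular
assert np.allclose(sv_C, sv_A)      # preprocessing preserves singular values
```

Any bidiagonal reduction applied to C then inherits this setup; the paper's contribution is the error bound obtained when that reduction uses Givens rotations.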

Original language: English (US)
Pages (from-to): 761-798
Number of pages: 38
Journal: SIAM Journal on Matrix Analysis and Applications
Volume: 23
Issue number: 3
DOI: 10.1137/S0895479898343541
State: Published - Jan 1 2002

Fingerprint

Singular value decomposition
Computing
Orthogonal Factorization
Householder Transformation
Givens Rotations
Permutation Matrix
Pivoting
Singular Values
Reduction Method
Linear algebra
Error Bounds
Preprocessing
Triangular
Factorization
Norm

All Science Journal Classification (ASJC) codes

  • Analysis

Cite this

@article{a97f032e486c446a9f921e6ae744a2c2,
title = "More accurate bidiagonal reduction for computing the singular value decomposition",
abstract = "Bidiagonal reduction is the preliminary stage for the fastest stable algorithms for computing the singular value decomposition (SVD) now available. However, the best-known error bounds on bidiagonal reduction methods on any matrix are of the form A + δA = UBV^T, ∥δA∥₂ ≤ ε_M f(m,n)∥A∥₂, where B is bidiagonal, U and V are orthogonal, ε_M is machine precision, and f(m,n) is a modestly growing function of the dimensions of A. A preprocessing technique analyzed by Higham [Linear Algebra Appl., 309 (2000), pp. 153-174] uses orthogonal factorization with column pivoting to obtain the factorization A = Q (C^T 0)^T P^T, where Q is orthogonal, C is lower triangular, and P is a permutation matrix. Bidiagonal reduction is applied to the resulting matrix C. To do that reduction, a new Givens-based bidiagonalization algorithm is proposed that produces a bidiagonal matrix B that satisfies C + δC = U(B + δB)V^T, where δB is bounded componentwise and δC satisfies a columnwise bound (based upon the growth of the lower right corner of C), with U and V orthogonal to nearly working precision. Once we have that reduction, there is a good menu of algorithms that obtain the singular values of the bidiagonal matrix B to relative accuracy, thus obtaining an SVD of C that can be much more accurate than that obtained from standard bidiagonal reduction procedures. The additional operations required over the standard bidiagonal reduction algorithm of Golub and Kahan [J. Soc. Indust. Appl. Math. Ser. B Numer. Anal., 2 (1965), pp. 205-224] are those for using Givens rotations instead of Householder transformations to compute the matrix V, and 2n³/3 flops to compute column norms.",
author = "Barlow, {Jesse L.}",
year = "2002",
month = "1",
day = "1",
doi = "10.1137/S0895479898343541",
language = "English (US)",
volume = "23",
pages = "761--798",
journal = "SIAM Journal on Matrix Analysis and Applications",
issn = "0895-4798",
publisher = "Society for Industrial and Applied Mathematics Publications",
number = "3",
}

More accurate bidiagonal reduction for computing the singular value decomposition. / Barlow, Jesse L.

In: SIAM Journal on Matrix Analysis and Applications, Vol. 23, No. 3, 01.01.2002, p. 761-798.

Research output: Contribution to journal › Article

TY - JOUR

T1 - More accurate bidiagonal reduction for computing the singular value decomposition

AU - Barlow, Jesse L.

PY - 2002/1/1

Y1 - 2002/1/1

N2 - Bidiagonal reduction is the preliminary stage for the fastest stable algorithms for computing the singular value decomposition (SVD) now available. However, the best-known error bounds on bidiagonal reduction methods on any matrix are of the form A + δA = UBV^T, ∥δA∥₂ ≤ ε_M f(m,n)∥A∥₂, where B is bidiagonal, U and V are orthogonal, ε_M is machine precision, and f(m,n) is a modestly growing function of the dimensions of A. A preprocessing technique analyzed by Higham [Linear Algebra Appl., 309 (2000), pp. 153-174] uses orthogonal factorization with column pivoting to obtain the factorization A = Q (C^T 0)^T P^T, where Q is orthogonal, C is lower triangular, and P is a permutation matrix. Bidiagonal reduction is applied to the resulting matrix C. To do that reduction, a new Givens-based bidiagonalization algorithm is proposed that produces a bidiagonal matrix B that satisfies C + δC = U(B + δB)V^T, where δB is bounded componentwise and δC satisfies a columnwise bound (based upon the growth of the lower right corner of C), with U and V orthogonal to nearly working precision. Once we have that reduction, there is a good menu of algorithms that obtain the singular values of the bidiagonal matrix B to relative accuracy, thus obtaining an SVD of C that can be much more accurate than that obtained from standard bidiagonal reduction procedures. The additional operations required over the standard bidiagonal reduction algorithm of Golub and Kahan [J. Soc. Indust. Appl. Math. Ser. B Numer. Anal., 2 (1965), pp. 205-224] are those for using Givens rotations instead of Householder transformations to compute the matrix V, and 2n³/3 flops to compute column norms.

AB - Bidiagonal reduction is the preliminary stage for the fastest stable algorithms for computing the singular value decomposition (SVD) now available. However, the best-known error bounds on bidiagonal reduction methods on any matrix are of the form A + δA = UBV^T, ∥δA∥₂ ≤ ε_M f(m,n)∥A∥₂, where B is bidiagonal, U and V are orthogonal, ε_M is machine precision, and f(m,n) is a modestly growing function of the dimensions of A. A preprocessing technique analyzed by Higham [Linear Algebra Appl., 309 (2000), pp. 153-174] uses orthogonal factorization with column pivoting to obtain the factorization A = Q (C^T 0)^T P^T, where Q is orthogonal, C is lower triangular, and P is a permutation matrix. Bidiagonal reduction is applied to the resulting matrix C. To do that reduction, a new Givens-based bidiagonalization algorithm is proposed that produces a bidiagonal matrix B that satisfies C + δC = U(B + δB)V^T, where δB is bounded componentwise and δC satisfies a columnwise bound (based upon the growth of the lower right corner of C), with U and V orthogonal to nearly working precision. Once we have that reduction, there is a good menu of algorithms that obtain the singular values of the bidiagonal matrix B to relative accuracy, thus obtaining an SVD of C that can be much more accurate than that obtained from standard bidiagonal reduction procedures. The additional operations required over the standard bidiagonal reduction algorithm of Golub and Kahan [J. Soc. Indust. Appl. Math. Ser. B Numer. Anal., 2 (1965), pp. 205-224] are those for using Givens rotations instead of Householder transformations to compute the matrix V, and 2n³/3 flops to compute column norms.

UR - http://www.scopus.com/inward/record.url?scp=0036018639&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0036018639&partnerID=8YFLogxK

U2 - 10.1137/S0895479898343541

DO - 10.1137/S0895479898343541

M3 - Article

AN - SCOPUS:0036018639

VL - 23

SP - 761

EP - 798

JO - SIAM Journal on Matrix Analysis and Applications

JF - SIAM Journal on Matrix Analysis and Applications

SN - 0895-4798

IS - 3

ER -