Improved bounds for the Nyström method with application to kernel classification

Rong Jin, Tianbao Yang, Mehrdad Mahdavi, Yu Feng Li, Zhi Hua Zhou

Research output: Contribution to journal › Article

21 Scopus citations

Abstract

We develop two approaches for analyzing the approximation error bound for the Nyström method, which approximates a positive semidefinite (PSD) matrix by sampling a small set of columns: one based on a concentration inequality for integral operators, and one based on random matrix theory. We show that the approximation error, measured in the spectral norm, can be improved from O(N/√m) to O(N/m^(1-ρ)) in the case of a large eigengap, where N is the total number of data points, m is the number of sampled data points, and ρ ∈ (0, 1/2) is a positive constant that characterizes the eigengap. When the eigenvalues of the kernel matrix follow a p-power law, our analysis based on random matrix theory further improves the bound to O(N/m^(p-1)) under an incoherence assumption. We present a kernel classification approach based on the Nyström method and derive its generalization performance using the improved bound. We show that when the eigenvalues of the kernel matrix follow a p-power law, we can reduce the number of support vectors to N^(2p/(p²-1)), which is sublinear in N when p > 1 + √2, without seriously sacrificing its generalization performance.

Original language: English (US)
Article number: 6547995
Pages (from-to): 6939-6949
Number of pages: 11
Journal: IEEE Transactions on Information Theory
Volume: 59
Issue number: 10
DOIs
State: Published - Sep 25 2013

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
