Convergence guarantees for kernel-based quadrature rules in misspecified settings

Motonobu Kanagawa, Bharath K. Sriperumbudur, Kenji Fukumizu

Research output: Contribution to journal › Conference article

16 Citations (Scopus)

Abstract

Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration, and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or where integrands are high dimensional. These rules are based on the assumption that the integrand has a certain degree of smoothness, expressed as the assumption that the integrand belongs to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is to prove that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.
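Kernel quadrature rules of the kind discussed above choose weights by matching the kernel mean embedding of the target measure on the given nodes. A minimal sketch, not the paper's specific construction: here the Brownian-motion kernel k(s, t) = min(s, t) and the uniform measure on [0, 1] are illustrative assumptions chosen because the embedding has a closed form.

```python
import numpy as np

# Toy kernel quadrature on [0, 1] with the Brownian-motion kernel
# k(s, t) = min(s, t), whose RKHS is a first-order Sobolev-type space.
# The weights solve K w = z, where z_i = ∫_0^1 k(t, x_i) dt = x_i - x_i^2/2
# is the kernel mean embedding of the uniform measure at node x_i.

def kernel_quadrature_weights(x):
    K = np.minimum.outer(x, x)   # Gram matrix K_ij = min(x_i, x_j)
    z = x - 0.5 * x ** 2         # closed-form embedding for Uniform[0, 1]
    return np.linalg.solve(K, z)

x = np.linspace(0.05, 0.95, 10)  # quadrature nodes
w = kernel_quadrature_weights(x)

f = lambda t: np.sin(np.pi * t)  # integrand; the true integral is 2/pi
estimate = w @ f(x)
```

The estimate w·f(x) equals the integral of the kernel interpolant of f, so its error is governed by how well the assumed RKHS matches the true smoothness of f, which is exactly the misspecification question the paper studies.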

Original language: English (US)
Pages (from-to): 3296-3304
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - Jan 1 2016
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: Dec 5 2016 - Dec 10 2016

Fingerprint

  • Hilbert spaces
  • Sobolev spaces
  • Learning systems
  • Statistics

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

@article{18942be707b649799fa442e253e7c12e,
title = "Convergence guarantees for kernel-based quadrature rules in misspecified settings",
abstract = "Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration, and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or where integrands are high dimensional. These rules are based on the assumption that the integrand has a certain degree of smoothness, which is expressed as that the integrand belongs to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is in proving that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.",
author = "Motonobu Kanagawa and Sriperumbudur, {Bharath K.} and Kenji Fukumizu",
year = "2016",
month = "1",
day = "1",
language = "English (US)",
pages = "3296--3304",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}

Convergence guarantees for kernel-based quadrature rules in misspecified settings. / Kanagawa, Motonobu; Sriperumbudur, Bharath K.; Fukumizu, Kenji.

In: Advances in Neural Information Processing Systems, 01.01.2016, p. 3296-3304.

Research output: Contribution to journal › Conference article

TY - JOUR
T1 - Convergence guarantees for kernel-based quadrature rules in misspecified settings
AU - Kanagawa, Motonobu
AU - Sriperumbudur, Bharath K.
AU - Fukumizu, Kenji
PY - 2016/1/1
Y1 - 2016/1/1
N2 - Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration, and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or where integrands are high dimensional. These rules are based on the assumption that the integrand has a certain degree of smoothness, which is expressed as that the integrand belongs to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is in proving that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.
AB - Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration, and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or where integrands are high dimensional. These rules are based on the assumption that the integrand has a certain degree of smoothness, which is expressed as that the integrand belongs to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is in proving that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.
UR - http://www.scopus.com/inward/record.url?scp=85019177231&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019177231&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85019177231
SP - 3296
EP - 3304
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
SN - 1049-5258
ER -