### Abstract

Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√n convergence rates in numerical integration and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or high dimensional. These rules rest on the assumption that the integrand has a certain degree of smoothness, expressed by requiring that the integrand belong to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black-box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is to prove that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.
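The mechanics of a kernel quadrature rule can be sketched briefly: given points x_1, …, x_n, the weights solve Kw = z, where K is the kernel Gram matrix and z is the kernel mean embedding of the target measure, and the integral is estimated as Σ w_i f(x_i). Below is a minimal sketch assuming a Gaussian kernel and the uniform measure on [0, 1]; the equispaced points, length-scale, and jitter are illustrative choices, not the paper's construction.

```python
import math

import numpy as np


def gaussian_kernel(x, y, ell):
    """Gram matrix of k(x, y) = exp(-(x - y)^2 / (2 ell^2))."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2.0 * ell ** 2))


def kernel_mean_uniform(x, ell):
    """Closed-form embedding z(x) = \\int_0^1 k(x, t) dt via the error function."""
    c = ell * math.sqrt(math.pi / 2.0)
    s = ell * math.sqrt(2.0)
    return np.array([c * (math.erf((1.0 - xi) / s) + math.erf(xi / s)) for xi in x])


def kernel_quadrature(f, n=20, ell=0.1, jitter=1e-8):
    """Estimate \\int_0^1 f(t) dt with kernel quadrature weights w = K^{-1} z."""
    x = np.linspace(0.0, 1.0, n)
    K = gaussian_kernel(x, x, ell) + jitter * np.eye(n)  # jitter for conditioning
    z = kernel_mean_uniform(x, ell)
    w = np.linalg.solve(K, z)  # weights that integrate the embedding exactly
    return w @ f(x)


# Smooth integrand: the rule converges much faster than Monte Carlo here.
est = kernel_quadrature(lambda t: t ** 2)
print(abs(est - 1.0 / 3.0))  # small error against the true integral 1/3
```

When the integrand is rougher than the RKHS assumes (the misspecified case studied in the paper), these weights still apply unchanged; the paper's result concerns how fast the resulting error decays.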

| Original language | English (US) |
|---|---|
| Pages (from-to) | 3296-3304 |
| Number of pages | 9 |
| Journal | Advances in Neural Information Processing Systems |
| State | Published - Jan 1 2016 |
| Event | 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain. Duration: Dec 5 2016 → Dec 10 2016 |

### All Science Journal Classification (ASJC) codes

- Computer Networks and Communications
- Information Systems
- Signal Processing

### Cite this

Kanagawa, M., Sriperumbudur, B. K., & Fukumizu, K. (2016). Convergence guarantees for kernel-based quadrature rules in misspecified settings. *Advances in Neural Information Processing Systems*, 3296-3304.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Convergence guarantees for kernel-based quadrature rules in misspecified settings

AU - Kanagawa, Motonobu

AU - Sriperumbudur, Bharath K.

AU - Fukumizu, Kenji

PY - 2016/1/1

Y1 - 2016/1/1

UR - http://www.scopus.com/inward/record.url?scp=85019177231&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85019177231&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85019177231

SP - 3296

EP - 3304

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -