Minimax estimation of maximum mean discrepancy with radial kernels

Ilya Tolstikhin, Bharath Kumar Sriperumbudur, Bernhard Schölkopf

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on ℝ^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications.
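For concreteness, the U-statistic estimator of squared MMD mentioned in the abstract can be sketched as follows. This is a generic illustration (not code from the paper), using the Gaussian kernel as one example of a radial kernel; the function names and the bandwidth parameter `sigma` are choices made here for the sketch.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Radial (Gaussian) kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    # X is (m, d), Y is (n, d); returns the (m, n) Gram matrix.
    diff = X[:, None, :] - Y[None, :, :]
    return np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2))

def mmd2_ustat(X, Y, sigma=1.0):
    # Unbiased U-statistic estimator of MMD^2 between samples X ~ P and Y ~ Q:
    # averages k over distinct pairs within X, distinct pairs within Y,
    # and all cross pairs, combining them as E[k_xx] + E[k_yy] - 2 E[k_xy].
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    # Exclude diagonal (i == j) terms so the estimator is unbiased.
    term_xx = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_yy = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2.0 * Kxy.mean()
```

On samples from the same distribution the estimate concentrates near zero, while samples from well-separated distributions give a clearly positive value; the paper's result is that no estimator can improve on the O(m^{-1/2} + n^{-1/2}) convergence of such estimators, up to kernel-dependent constants.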

Original language: English (US)
Pages (from-to): 1938-1946
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - 2016

Fingerprint

Hilbert spaces
Learning systems
Statistics
Testing

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

@article{67b5939068414ce28baee9b48dce70ee,
title = "Minimax estimation of maximum mean discrepancy with radial kernels",
abstract = "Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on R^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications.",
author = "Ilya Tolstikhin and Sriperumbudur, {Bharath Kumar} and Bernhard Sch{\"o}lkopf",
year = "2016",
language = "English (US)",
pages = "1938--1946",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}

Minimax estimation of maximum mean discrepancy with radial kernels. / Tolstikhin, Ilya; Sriperumbudur, Bharath Kumar; Schölkopf, Bernhard.

In: Advances in Neural Information Processing Systems, 2016, p. 1938-1946.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Minimax estimation of maximum mean discrepancy with radial kernels

AU - Tolstikhin, Ilya

AU - Sriperumbudur, Bharath Kumar

AU - Schölkopf, Bernhard

PY - 2016

Y1 - 2016

N2 - Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on R^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications.

AB - Maximum Mean Discrepancy (MMD) is a distance on the space of probability measures which has found numerous applications in machine learning and nonparametric testing. This distance is based on the notion of embedding probabilities in a reproducing kernel Hilbert space. In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on R^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications.

UR - http://www.scopus.com/inward/record.url?scp=85019172607&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85019172607&partnerID=8YFLogxK

M3 - Article

SP - 1938

EP - 1946

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -