Nonparametric Kullback-Leibler Divergence Estimation Using M-Spacing

Linyun He, Eunhye Song

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The entropy of a random variable with an unknown distribution function can be estimated nonparametrically by spacing methods when independent and identically distributed (i.i.d.) observations of the random variable are available. We extend the classical entropy estimator based on sample spacings to define an m-spacing estimator for the Kullback-Leibler (KL) divergence between two distributions with unknown distribution functions, given i.i.d. observations from each. The estimator can be applied to measure the discrepancy between real-world system output and simulation output, as well as between the outputs of two simulators. We show that, under mild conditions, the proposed estimator converges almost surely to the true KL divergence as the numbers of outputs collected from both systems increase, and we discuss the required choices of m and the simulation output sample size as functions of the real-world sample size. Additionally, we establish central limit theorems for the proposed estimator under appropriate scaling.
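For scalar outputs, the spacing-based constructions named in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's estimator: `vasicek_entropy` implements the classical Vasicek m-spacing entropy estimator the abstract builds on, and `mspacing_kl` combines the same m-spacings with the empirical CDF of the second sample to form one natural spacing-based KL estimate. The function names, the boundary-clamping convention, and the `eps` floor are assumptions of this sketch.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Classical m-spacing (Vasicek) entropy estimator for a 1-D i.i.d. sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(n)
    # Boundary convention (an assumption of this sketch): order statistics
    # outside [X_(1), X_(n)] are clamped to the sample extremes.
    hi = x[np.minimum(i + m, n - 1)]
    lo = x[np.maximum(i - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * (hi - lo))))

def mspacing_kl(x, y, m, eps=1e-12):
    """Spacing-based sketch of KL(P || Q) from x ~ P (size n) and y ~ Q (size N).

    Estimates log p via m-spacings of the sorted x-sample and log q via the
    empirical CDF of y over the same spacing windows; the spacing widths
    cancel, leaving a function of the empirical CDF increments alone.
    eps floors those increments to avoid log(0) in the tails.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    n, N = x.size, y.size
    i = np.arange(n)
    hi = x[np.minimum(i + m, n - 1)]
    lo = x[np.maximum(i - m, 0)]
    # Empirical CDF of y at the spacing endpoints, via binary search.
    dG = (np.searchsorted(y, hi, side="right")
          - np.searchsorted(y, lo, side="right")) / N
    return float(np.mean(np.log(2.0 * m / n) - np.log(np.maximum(dG, eps))))
```

For example, with X ~ N(0, 1) and Y ~ N(1, 1) the true KL divergence is 1/2, and for large samples with m growing roughly like the square root of the sample size the sketch recovers it approximately.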

Original language: English (US)
Title of host publication: 2021 Winter Simulation Conference, WSC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665433112
State: Published - 2021
Event: 2021 Winter Simulation Conference, WSC 2021 - Phoenix, United States
Duration: Dec 12, 2021 - Dec 15, 2021

Publication series

Name: Proceedings - Winter Simulation Conference
Volume: 2021-December
ISSN (Print): 0891-7736

Conference

Conference: 2021 Winter Simulation Conference, WSC 2021
Country/Territory: United States
City: Phoenix
Period: 12/12/21 - 12/15/21

All Science Journal Classification (ASJC) codes

  • Software
  • Modeling and Simulation
  • Computer Science Applications

