A simulated annealing based inexact oracle for Wasserstein loss minimization

Jianbo Ye, James Wang, Jia Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Learning under a Wasserstein loss, also known as Wasserstein loss minimization (WLM), is an emerging research topic for gaining insights from large sets of structured objects. Although conceptually simple, WLM problems are computationally challenging because they involve minimizing functions of quantities (namely, Wasserstein distances) that themselves require numerical algorithms to compute. In this paper, we introduce a stochastic approach based on simulated annealing for solving WLMs. In particular, we develop a Gibbs sampler that effectively and efficiently approximates the partial gradients of a sequence of Wasserstein losses. The new approach is numerically stable and amenable to warm starts, properties that are valuable for WLM problems, which often require multiple levels of iterations in which the oracle for the loss value and gradient is embedded. We apply the method to optimal transport with Coulomb cost and to the Wasserstein non-negative matrix factorization problem, and compare it with the existing method of entropy regularization.
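
Concretely, for discrete distributions $p$ and $q$ and a ground cost matrix $M$, the Wasserstein loss referred to above is the optimal-transport cost

W(p, q) \;=\; \min_{\Pi \ge 0,\ \Pi \mathbf{1} = p,\ \Pi^{\top} \mathbf{1} = q} \langle \Pi, M \rangle .

The sketch below is a minimal illustration of the annealing idea, not the authors' exact Gibbs-OT oracle: a random-scan Gibbs sampler on the transportation polytope targeting the Gibbs density proportional to exp(-⟨Π, M⟩/T), with the temperature T cooled geometrically so that samples concentrate near optimal plans. The function gibbs_ot_estimate and its parameters (T0, Tmin, n_sweeps) are hypothetical names introduced here, not taken from the paper.

import numpy as np

def gibbs_ot_estimate(M, p, q, n_sweeps=20000, T0=1.0, Tmin=1e-3, rng=None):
    """Simulated-annealing Gibbs sampler over couplings with marginals (p, q).

    Targets the Gibbs density ~ exp(-<Pi, M> / T) on the transportation
    polytope and cools T geometrically from T0 to Tmin; as T -> 0 the samples
    concentrate on optimal transport plans, so <Pi, M> approaches W(p, q).
    Didactic sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = M.shape
    Pi = np.outer(p, q)  # feasible starting plan: the independent coupling
    T, cooling = T0, (Tmin / T0) ** (1.0 / n_sweeps)
    for _ in range(n_sweeps):
        # Pick a 2x2 submatrix (i, i') x (j, j') and shift mass `delta` around
        # the cycle; both marginals stay fixed, and the transport cost changes
        # linearly: <Pi, M> increases by delta * c with c as below.
        i, ip = rng.choice(m, size=2, replace=False)
        j, jp = rng.choice(n, size=2, replace=False)
        c = M[i, j] + M[ip, jp] - M[i, jp] - M[ip, j]
        lo = -min(Pi[i, j], Pi[ip, jp])  # nonnegativity bounds on delta
        hi = min(Pi[i, jp], Pi[ip, j])
        # Exact conditional along this direction: a truncated exponential.
        delta = _trunc_exp(-c / T, lo, hi, rng)
        Pi[i, j] += delta; Pi[ip, jp] += delta
        Pi[i, jp] -= delta; Pi[ip, j] -= delta
        T *= cooling
    return float((Pi * M).sum()), Pi

def _trunc_exp(a, lo, hi, rng):
    """Sample x on [lo, hi] with density proportional to exp(a * x)."""
    s = a * (hi - lo)
    if abs(s) < 1e-12:  # nearly flat conditional: fall back to uniform
        return rng.uniform(lo, hi)
    if s > 30.0:        # mass piles up at hi; avoid overflow in expm1
        return max(lo, hi + np.log(rng.uniform()) / a)
    return lo + np.log1p(rng.uniform() * np.expm1(s)) / a  # inverse CDF

# Example usage on a small random problem.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.random((6, 6))
    p = np.full(6, 1 / 6)
    q = np.full(6, 1 / 6)
    loss, plan = gibbs_ot_estimate(M, p, q, rng=rng)
    print("estimated Wasserstein loss:", loss)

One way to read the warm-start claim in the abstract through this sketch: when the same loss is revisited with slightly updated inputs, the previous plan (rescaled if the marginals changed) can replace the independent coupling as the sampler's starting point, rather than restarting the chain from scratch.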

Original language: English (US)
Title of host publication: 34th International Conference on Machine Learning, ICML 2017
Publisher: International Machine Learning Society (IMLS)
Pages: 6005-6017
Number of pages: 13
ISBN (Electronic): 9781510855144
State: Published - Jan 1, 2017
Event: 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia
Duration: Aug 6, 2017 - Aug 11, 2017

Publication series

Name: 34th International Conference on Machine Learning, ICML 2017
Volume: 8

All Science Journal Classification (ASJC) codes

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software

Cite this

Ye, J., Wang, J., & Li, J. (2017). A simulated annealing based inexact oracle for Wasserstein loss minimization. In 34th International Conference on Machine Learning, ICML 2017 (pp. 6005-6017). (34th International Conference on Machine Learning, ICML 2017; Vol. 8). International Machine Learning Society (IMLS).
@inproceedings{2a60e653e2d64230a0e93c8812ed729f,
title = "A simulated annealing based inexact oracle for Wasserstein loss minimization",
author = "Jianbo Ye and James Wang and Jia Li",
year = "2017",
language = "English (US)",
series = "34th International Conference on Machine Learning, ICML 2017",
volume = "8",
publisher = "International Machine Learning Society (IMLS)",
pages = "6005--6017",
booktitle = "34th International Conference on Machine Learning, ICML 2017",
}
