Metric learning from relative comparisons by minimizing squared residual

Eric Yi Liu, Zhishan Guo, Xiang Zhang, Vladimir Jojic, Wei Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Citations (Scopus)

Abstract

Recent studies [1]-[5] have suggested using constraints in the form of relative distance comparisons to represent domain knowledge: d(a, b) < d(c, d) where d(·) is the distance function and a, b, c, d are data objects. Such constraints are readily available in many problems where pairwise constraints are not natural to obtain. In this paper we consider the problem of learning a Mahalanobis distance metric from supervision in the form of relative distance comparisons. We propose a simple, yet effective, algorithm that minimizes a convex objective function corresponding to the sum of squared residuals of constraints. We also extend our model and algorithm to promote sparsity in the learned metric matrix. Experimental results suggest that our method consistently outperforms existing methods in terms of clustering accuracy. Furthermore, the sparsity extension leads to more stable estimation when the dimension is high and only a small amount of supervision is given.
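
The following sketch is only an illustration of the approach described in the abstract, not the authors' implementation: it fits a Mahalanobis matrix M from relative comparisons d(a, b) < d(c, d) by gradient descent on the sum of squared residuals of violated constraints, with an optional L1 term standing in for the paper's sparsity extension. The function names, the squared-hinge treatment of residuals, the step size, and the projection onto the positive semidefinite cone are assumptions made for this example.

# Illustrative sketch (not the authors' code): learn a Mahalanobis matrix M
# from relative comparisons by minimizing the sum of squared residuals of
# violated constraints d(a, b) < d(c, d).
import numpy as np

def learn_metric(X, comparisons, n_iters=500, lr=1e-2, l1_weight=0.0):
    # X: (n, p) data matrix; comparisons: tuples (a, b, c, d) of row indices
    # encoding the relative constraint d(x_a, x_b) < d(x_c, x_d).
    # Returns a positive semidefinite M defining d_M(x, y) = (x - y)^T M (x - y).
    p = X.shape[1]
    M = np.eye(p)
    for _ in range(n_iters):
        grad = np.zeros((p, p))
        for a, b, c, d in comparisons:
            u = X[a] - X[b]
            v = X[c] - X[d]
            residual = u @ M @ u - v @ M @ v  # positive when the constraint is violated
            if residual > 0:
                # gradient of residual**2 with respect to M
                grad += 2.0 * residual * (np.outer(u, u) - np.outer(v, v))
        if l1_weight > 0:
            grad += l1_weight * np.sign(M)  # subgradient of the sparsity-promoting penalty
        M = M - lr * grad
        # Project back onto the PSD cone so M remains a valid metric matrix.
        eigval, eigvec = np.linalg.eigh(M)
        M = (eigvec * np.clip(eigval, 0.0, None)) @ eigvec.T
    return M

# Tiny synthetic usage example.
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 4))
comparisons = [(0, 1, 2, 3), (4, 5, 6, 7), (8, 9, 10, 11)]
M = learn_metric(X, comparisons, l1_weight=0.01)
print(np.round(M, 3))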

Original language: English (US)
Title of host publication: Proceedings - 12th IEEE International Conference on Data Mining, ICDM 2012
Pages: 978-983
Number of pages: 6
DOIs: https://doi.org/10.1109/ICDM.2012.38
State: Published - Dec 1, 2012
Event: 12th IEEE International Conference on Data Mining, ICDM 2012 - Brussels, Belgium
Duration: Dec 10, 2012 - Dec 13, 2012

Other

Other: 12th IEEE International Conference on Data Mining, ICDM 2012
Country: Belgium
City: Brussels
Period: 12/10/12 - 12/13/12

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Liu, E. Y., Guo, Z., Zhang, X., Jojic, V., & Wang, W. (2012). Metric learning from relative comparisons by minimizing squared residual. In Proceedings - 12th IEEE International Conference on Data Mining, ICDM 2012 (pp. 978-983). [6413822] https://doi.org/10.1109/ICDM.2012.38
@inproceedings{0453a451d4344c2ca20029926183ae9c,
title = "Metric learning from relative comparisons by minimizing squared residual",
author = "Liu, {Eric Yi} and Zhishan Guo and Xiang Zhang and Vladimir Jojic and Wei Wang",
year = "2012",
month = "12",
day = "1",
doi = "10.1109/ICDM.2012.38",
language = "English (US)",
isbn = "9780769549057",
pages = "978--983",
booktitle = "Proceedings - 12th IEEE International Conference on Data Mining, ICDM 2012",

}
