### Abstract

In this paper, we consider a smoothing-kernel-based classification rule and propose an algorithm for optimizing the performance of the rule by learning the bandwidth of the smoothing kernel along with a data-dependent distance metric. The data-dependent distance metric is obtained by learning a function that embeds an arbitrary metric space into a Euclidean space while minimizing an upper bound on the resubstitution estimate of the error probability of the kernel classification rule. By restricting this embedding function to a reproducing kernel Hilbert space, we reduce the problem to solving a semidefinite program and show the resulting kernel classification rule to be a variation of the k-nearest neighbor rule. We compare the performance of the kernel rule (using the learned data-dependent distance metric) to state-of-the-art distance metric learning algorithms (designed for k-nearest neighbor classification) on some benchmark datasets. The results show that the proposed rule achieves classification accuracy as good as, or better than, the other metric learning algorithms.
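The rule considered here is a smoothing-kernel classifier: each class votes at a test point with kernel weights that decay with distance, and the bandwidth controls the decay. As a rough illustration only, the following is a minimal sketch of such a rule with a fixed Gaussian kernel over plain Euclidean distance; the paper itself *learns* both the bandwidth and the underlying distance metric, which this sketch does not attempt. Function names and toy data are our own.

```python
import numpy as np

def kernel_classify(X_train, y_train, x, h):
    """Smoothing-kernel classification rule: predict the class whose
    kernel-weighted vote at the test point x is largest."""
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared Euclidean distances
    w = np.exp(-d2 / (2.0 * h ** 2))          # Gaussian smoothing kernel, bandwidth h
    classes = np.unique(y_train)
    votes = [w[y_train == c].sum() for c in classes]
    return classes[int(np.argmax(votes))]

# Toy data: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, 0, 1, 1])
print(kernel_classify(X, y, np.array([0.2, 0.1]), h=1.0))  # -> 0
print(kernel_classify(X, y, np.array([4.9, 5.2]), h=1.0))  # -> 1
```

With a very small bandwidth the nearest training point dominates the vote, which is one way to see the paper's observation that the learned rule behaves like a variant of the k-nearest neighbor rule.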

Original language | English (US)
---|---
Title of host publication | Proceedings of the 25th International Conference on Machine Learning
Pages | 1008-1015
Number of pages | 8
State | Published - Nov 26 2008
Event | 25th International Conference on Machine Learning - Helsinki, Finland. Duration: Jul 5 2008 → Jul 9 2008

### Publication series

Name | Proceedings of the 25th International Conference on Machine Learning
---|---

### Other

Other | 25th International Conference on Machine Learning
---|---
Country | Finland
City | Helsinki
Period | 7/5/08 → 7/9/08


### All Science Journal Classification (ASJC) codes

- Artificial Intelligence
- Human-Computer Interaction
- Software

### Cite this

Sriperumbudur, B. K., Lang, O. A., & Lanckriet, G. R. G. (2008). Metric embedding for kernel classification rules. In *Proceedings of the 25th International Conference on Machine Learning* (pp. 1008-1015).

**Metric embedding for kernel classification rules.** / Sriperumbudur, Bharath K.; Lang, Omer A.; Lanckriet, Gert R.G.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Metric embedding for kernel classification rules

AU - Sriperumbudur, Bharath K.

AU - Lang, Omer A.

AU - Lanckriet, Gert R.G.

PY - 2008/11/26

Y1 - 2008/11/26

N2 - In this paper, we consider a smoothing-kernel-based classification rule and propose an algorithm for optimizing the performance of the rule by learning the bandwidth of the smoothing kernel along with a data-dependent distance metric. The data-dependent distance metric is obtained by learning a function that embeds an arbitrary metric space into a Euclidean space while minimizing an upper bound on the resubstitution estimate of the error probability of the kernel classification rule. By restricting this embedding function to a reproducing kernel Hilbert space, we reduce the problem to solving a semidefinite program and show the resulting kernel classification rule to be a variation of the k-nearest neighbor rule. We compare the performance of the kernel rule (using the learned data-dependent distance metric) to state-of-the-art distance metric learning algorithms (designed for k-nearest neighbor classification) on some benchmark datasets. The results show that the proposed rule achieves classification accuracy as good as, or better than, the other metric learning algorithms.

AB - In this paper, we consider a smoothing-kernel-based classification rule and propose an algorithm for optimizing the performance of the rule by learning the bandwidth of the smoothing kernel along with a data-dependent distance metric. The data-dependent distance metric is obtained by learning a function that embeds an arbitrary metric space into a Euclidean space while minimizing an upper bound on the resubstitution estimate of the error probability of the kernel classification rule. By restricting this embedding function to a reproducing kernel Hilbert space, we reduce the problem to solving a semidefinite program and show the resulting kernel classification rule to be a variation of the k-nearest neighbor rule. We compare the performance of the kernel rule (using the learned data-dependent distance metric) to state-of-the-art distance metric learning algorithms (designed for k-nearest neighbor classification) on some benchmark datasets. The results show that the proposed rule achieves classification accuracy as good as, or better than, the other metric learning algorithms.

UR - http://www.scopus.com/inward/record.url?scp=56449095464&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=56449095464&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:56449095464

SN - 9781605582054

T3 - Proceedings of the 25th International Conference on Machine Learning

SP - 1008

EP - 1015

BT - Proceedings of the 25th International Conference on Machine Learning

ER -