Constrained empirical risk minimization framework for distance metric learning

Bibliographic Details
Main Authors: Bian, W.; Tao, D.
Format: Article in Journal/Newspaper
Language: English
Published: 2012
Subjects: DML
Online Access:http://hdl.handle.net/10453/22849
Description
Summary: Distance metric learning (DML) has received increasing attention in recent years. In this paper, we propose a constrained empirical risk minimization framework for DML. The framework enriches the state of the art in both its theoretical and its algorithmic aspects. Theoretically, we give a comprehensive generalization analysis by bounding the sample error and the approximation error with respect to the best model. Algorithmically, we derive an optimal gradient descent scheme using Nesterov's method and provide two example algorithms that use the logarithmic loss and the smoothed hinge loss, respectively. We evaluate the new framework in data classification and image retrieval experiments; the results show that it is competitive with representative DML algorithms, including Xing's method, the large margin nearest neighbor classifier, neighborhood component analysis, and regularized metric learning. © 2012 IEEE.
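The abstract names the ingredients (a constrained ERM objective over metrics, Nesterov's optimal gradient method, logarithmic and smoothed hinge losses) without stating them; the sketches below fill in one plausible instantiation and should not be read as the paper's exact formulation. Assuming pairwise constraints with labels y_k in {+1, -1} (similar/dissimilar pairs), a threshold b on the squared Mahalanobis distance, and a surrogate loss ell, the constrained ERM problem could take the form:

```latex
\min_{M \succeq 0} \;\; \frac{1}{n} \sum_{k=1}^{n}
  \ell\!\bigl( y_k \, [\, b - d_M(x_{i_k}, x_{j_k}) \,] \bigr),
\qquad
d_M(x_i, x_j) = (x_i - x_j)^{\top} M \, (x_i - x_j),
```

where the constraint M ⪰ 0 (positive semidefinite) keeps d_M a valid pseudometric. On this formulation, an "optimal gradient descent by Nesterov's method" can be sketched as accelerated projected gradient descent; the code below uses a Huber-style smoothed hinge and eigenvalue clipping for the PSD projection. All names and parameters (smoothed_hinge, nesterov_dml, the smoothing width gamma, the Lipschitz estimate L) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def smoothed_hinge(z, gamma=1.0):
    """Huber-style smoothing of the hinge loss (an assumption; the paper's
    exact smoothing may differ). Returns per-element loss and derivative."""
    loss = np.where(z >= 1, 0.0,
           np.where(z <= 1 - gamma, 1 - z - gamma / 2,
                    (1 - z) ** 2 / (2 * gamma)))
    grad = np.where(z >= 1, 0.0,
           np.where(z <= 1 - gamma, -1.0, -(1 - z) / gamma))
    return loss, grad

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone by clipping eigenvalues."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.maximum(w, 0)) @ V.T

def nesterov_dml(pairs, labels, b=1.0, L=10.0, iters=200):
    """Accelerated (FISTA-style) projected gradient descent for pairwise DML.

    pairs  : (n, 2, d) array of point pairs (x_i, x_j)
    labels : (n,) array, +1 for similar pairs, -1 for dissimilar
    b      : hypothetical threshold on the squared Mahalanobis distance
    L      : Lipschitz-constant estimate; the step size is 1/L
    """
    diffs = pairs[:, 0, :] - pairs[:, 1, :]              # x_i - x_j
    n, d = diffs.shape
    M = np.eye(d)                                        # current metric
    Y = M.copy()                                         # look-ahead point
    t = 1.0
    for _ in range(iters):
        dist = np.einsum('nd,de,ne->n', diffs, Y, diffs)  # d_Y(x_i, x_j)
        _, g = smoothed_hinge(labels * (b - dist))
        coef = -g * labels                               # chain rule through b - dist
        G = np.einsum('n,nd,ne->de', coef / n, diffs, diffs)
        M_next = project_psd(Y - G / L)                  # gradient step + projection
        t_next = (1 + np.sqrt(1 + 4 * t ** 2)) / 2       # Nesterov momentum schedule
        Y = M_next + ((t - 1) / t_next) * (M_next - M)
        M, t = M_next, t_next
    return M
```

Eigenvalue clipping is the standard Euclidean projection onto the PSD cone. Swapping smoothed_hinge for the gradient of a logistic (logarithmic) loss would yield the other example algorithm the abstract mentions, since both losses are smooth and the same accelerated scheme applies.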