Efficient Distance Metric Learning by Adaptive Sampling and Mini-Batch Stochastic Gradient Descent (SGD)

Distance metric learning (DML) is an important task that has found applications in many domains. The high computational cost of DML arises from the large number of variables to be determined and from the constraint that a distance metric must be a positive semi-definite (PSD) matrix. Although stochastic gradient descent (SGD) has been successfully applied to improve the efficiency of DML, it can still be computationally expensive, because ensuring that the solution is a PSD matrix requires projecting the updated distance metric onto the PSD cone at every iteration, an expensive operation. We address this challenge by developing two strategies within SGD, mini-batch and adaptive sampling, that effectively reduce the number of updates (i.e., projections onto the PSD cone). We also develop hybrid approaches that combine the strength of adaptive sampling with that of mini-batch online learning techniques to further improve the computational efficiency of SGD for DML. We prove theoretical guarantees for both the adaptive sampling and mini-batch based approaches, and we conduct an extensive empirical study to verify the effectiveness of the proposed algorithms for DML.
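The expensive step the abstract refers to is the projection onto the PSD cone, which requires a full eigendecomposition of the d x d metric. Below is a minimal sketch (not the authors' implementation) of how mini-batching reduces the number of such projections: gradients are accumulated over a batch and the metric is projected once per batch rather than once per pair. The hinge-style pairwise loss, the threshold `mu`, and all function and variable names are illustrative assumptions.

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone by clipping
    negative eigenvalues -- the O(d^3) step that SGD-based DML tries to avoid."""
    w, V = np.linalg.eigh(M)
    return (V * np.maximum(w, 0.0)) @ V.T

def minibatch_sgd_dml(pairs, labels, d, batch_size=50, eta=0.1, mu=2.0, epochs=5):
    """Mini-batch SGD for a Mahalanobis metric M (d x d, PSD).
    labels are +1 for similar pairs and -1 for dissimilar pairs.
    Per-pair loss (illustrative): max(0, 1 - y * (mu - dist)).
    The PSD projection runs once per mini-batch, not once per pair."""
    M = np.eye(d)
    for _ in range(epochs):
        for start in range(0, len(pairs), batch_size):
            grad = np.zeros((d, d))
            active = False
            for (x1, x2), y in zip(pairs[start:start + batch_size],
                                   labels[start:start + batch_size]):
                diff = x1 - x2
                dist = diff @ M @ diff  # squared Mahalanobis distance
                if y * (mu - dist) < 1.0:  # margin violated -> hinge is active
                    grad += y * np.outer(diff, diff)
                    active = True
            if active:  # skip both the update and the projection on clean batches
                M = project_psd(M - eta * grad / batch_size)
    return M
```

Adaptive sampling attacks the same bottleneck from a different angle: rather than updating on every violated pair, each candidate update is accepted only with some probability (e.g., tied to the size of its loss), so the projection runs only on accepted updates. The hybrid approaches mentioned in the abstract combine the two kinds of savings.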

Bibliographic Details
Main Authors: Qian, Qi, Jin, Rong, Yi, Jinfeng, Zhang, Lijun, Zhu, Shenghuo
Format: Report
Language: unknown
Published: arXiv 2013
Subjects: DML; Machine Learning (cs.LG)
Online Access: https://dx.doi.org/10.48550/arxiv.1304.1192
https://arxiv.org/abs/1304.1192
Institution: Open Polar
Collection: DataCite Metadata Store (German National Library of Science and Technology)
Rights: arXiv.org perpetual, non-exclusive license (http://arxiv.org/licenses/nonexclusive-distrib/1.0/)
DOI: https://doi.org/10.48550/arxiv.1304.1192