Efficient Stochastic Optimization for Low-Rank Distance Metric Learning

Although distance metric learning has been successfully applied to many real-world applications, learning a distance metric from large-scale and high-dimensional data remains a challenging problem. Due to the positive semidefinite (PSD) constraint, the per-iteration computational complexity of previous algorithms is at least O(d^2), where d is the dimensionality of the data. In this paper, we develop an efficient stochastic algorithm for a class of distance metric learning problems with nuclear norm regularization, referred to as low-rank DML. By utilizing the low-rank structure of the intermediate solutions and the stochastic gradients, the complexity of our algorithm has a linear dependence on the dimensionality d. The key idea is to maintain all the iterates in factorized representations and to construct stochastic gradients that are low-rank. In this way, the projection onto the PSD cone can be implemented efficiently by incremental SVD. Experimental results on several data sets validate the effectiveness and efficiency of our method.
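
To make the factorized update concrete, the following is a minimal Python sketch of a single stochastic step under the assumptions noted in its comments: a pairwise hinge loss, a rank-1 stochastic gradient z z^T with z = x_i - x_j, and a QR-style rank update followed by a small eigendecomposition standing in for incremental SVD. The function name low_rank_dml_step, the margin, and the numerical thresholds are illustrative choices, not taken from the paper.

import numpy as np

def low_rank_dml_step(V, s, x_i, x_j, y, eta, lam, margin=1.0):
    # The metric is kept factorized as M = V @ np.diag(s) @ V.T, where V is
    # d x r with r << d and (approximately) orthonormal columns, and s >= 0.
    # A sampled pair (x_i, x_j) with label y = +1 (similar) / -1 (dissimilar)
    # yields a rank-1 stochastic (sub)gradient g * z z^T with z = x_i - x_j.
    # NOTE: this is an illustrative sketch, not the authors' exact algorithm;
    # the pairwise hinge loss, the margin, and the QR + eigendecomposition
    # rank update (standing in for incremental SVD) are assumptions.
    z = x_i - x_j
    dist = float(z @ ((V * s) @ (V.T @ z)))        # z^T M z without forming M

    # pairwise hinge: similar pairs should satisfy dist <= margin,
    # dissimilar pairs dist >= margin
    g = y if y * (dist - margin) > 0 else 0.0      # coefficient of z z^T

    # rank-1 SGD update M <- M - eta * g * z z^T, done in factorized form:
    # split z into its component inside span(V) and the orthogonal remainder
    c = V.T @ z
    res = z - V @ c
    res_norm = np.linalg.norm(res)
    if res_norm > 1e-10:
        V_aug = np.hstack([V, (res / res_norm)[:, None]])
        c_aug = np.append(c, res_norm)
        s_aug = np.append(s, 0.0)
    else:
        V_aug, c_aug, s_aug = V, c, s

    # small core matrix absorbing the old spectrum and the rank-1 update
    S = np.diag(s_aug) - eta * g * np.outer(c_aug, c_aug)

    # nuclear-norm (trace) regularizer plus projection onto the PSD cone:
    # shrink eigenvalues by eta * lam, clip them at zero, and drop the
    # directions that vanish so the rank stays small
    w, Q = np.linalg.eigh(S)
    w = np.maximum(w - eta * lam, 0.0)
    keep = w > 1e-12
    return V_aug @ Q[:, keep], w[keep]

Each step touches the data only through products with V, so its cost is on the order of d*r plus r^3 rather than d^2, which is the linear dependence on the dimensionality d that the abstract refers to. In this sketch, V could be initialized with a single random unit-norm column and s with a single zero entry.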


Bibliographic Details
Published in: Proceedings of the AAAI Conference on Artificial Intelligence
Main Authors: Zhang, Jie, Zhang, Lijun
Format: Article in Journal/Newspaper
Language: English
Published: Association for the Advancement of Artificial Intelligence 2017
Subjects:
DML
Online Access: https://ojs.aaai.org/index.php/AAAI/article/view/10649
https://doi.org/10.1609/aaai.v31i1.10649
id ftjaaai:oai:ojs.aaai.org:article/10649
record_format openpolar
institution Open Polar
collection AAAI Publications (Association for the Advancement of Artificial Intelligence)
op_collection_id ftjaaai
language English
topic Stochastic Optimization
Low-Rank DML
Incremental SVD
spellingShingle Stochastic Optimization
Low-Rank DML
Incremental SVD
Zhang, Jie
Zhang, Lijun
Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
topic_facet Stochastic Optimization
Low-Rank DML
Incremental SVD
description Although distance metric learning has been successfully applied to many real-world applications, learning a distance metric from large-scale and high-dimensional data remains a challenging problem. Due to the positive semidefinite (PSD) constraint, the per-iteration computational complexity of previous algorithms is at least O(d^2), where d is the dimensionality of the data. In this paper, we develop an efficient stochastic algorithm for a class of distance metric learning problems with nuclear norm regularization, referred to as low-rank DML. By utilizing the low-rank structure of the intermediate solutions and the stochastic gradients, the complexity of our algorithm has a linear dependence on the dimensionality d. The key idea is to maintain all the iterates in factorized representations and to construct stochastic gradients that are low-rank. In this way, the projection onto the PSD cone can be implemented efficiently by incremental SVD. Experimental results on several data sets validate the effectiveness and efficiency of our method.
format Article in Journal/Newspaper
author Zhang, Jie
Zhang, Lijun
author_facet Zhang, Jie
Zhang, Lijun
author_sort Zhang, Jie
title Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
title_short Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
title_full Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
title_fullStr Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
title_full_unstemmed Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
title_sort efficient stochastic optimization for low-rank distance metric learning
publisher Association for the Advancement of Artificial Intelligence
publishDate 2017
url https://ojs.aaai.org/index.php/AAAI/article/view/10649
https://doi.org/10.1609/aaai.v31i1.10649
genre DML
genre_facet DML
op_source Proceedings of the AAAI Conference on Artificial Intelligence; Vol. 31 No. 1 (2017): Thirty-First AAAI Conference on Artificial Intelligence
2374-3468
2159-5399
op_relation https://ojs.aaai.org/index.php/AAAI/article/view/10649/10508
https://ojs.aaai.org/index.php/AAAI/article/view/10649
doi:10.1609/aaai.v31i1.10649
op_doi https://doi.org/10.1609/aaai.v31i1.10649
container_title Proceedings of the AAAI Conference on Artificial Intelligence
container_volume 31
container_issue 1
_version_ 1766397379003547648