Denoising Multi-Similarity Formulation: A Self-Paced Curriculum-Driven Approach for Robust Metric Learning

Deep Metric Learning (DML) is a family of techniques that aim to measure the similarity between objects through a neural network. Although the number of DML methods has increased rapidly in recent years, most previous studies cannot effectively handle noisy data, which commonly exists in practical applications and often leads to serious performance deterioration. To overcome this limitation, in this paper we build a connection between noisy samples and hard samples in the framework of self-paced learning, and propose a Balanced Self-Paced Metric Learning (BSPML) algorithm with a denoising multi-similarity formulation, where noisy samples are treated as extremely hard samples and adaptively excluded from model training by sample weighting. In particular, due to the pairwise relationship and a new balance regularization term, the sub-problem with respect to the sample weights is a nonconvex quadratic function. To solve this nonconvex quadratic problem efficiently, we propose a doubly stochastic projection coordinate gradient algorithm. Importantly, we theoretically prove convergence not only for the doubly stochastic projection coordinate gradient algorithm but also for our BSPML algorithm. Experimental results on several standard data sets demonstrate that our BSPML algorithm has better generalization ability and robustness than state-of-the-art robust DML approaches.
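The sketch below is only a minimal illustration of the general self-paced weighting idea the abstract alludes to: pairs with large loss (likely noisy) receive low or zero weight, and a weight can be refined with a projected coordinate step on a box constraint. The functions pairwise_losses, self_paced_weights, and projected_coordinate_step are hypothetical placeholders introduced here; they are not the paper's denoising multi-similarity loss, balance regularizer, or doubly stochastic projection coordinate gradient algorithm.

```python
# Illustrative sketch only, NOT the authors' BSPML algorithm.
import numpy as np

def pairwise_losses(embeddings, labels, margin=0.5):
    """Toy pairwise hinge-style loss per pair (placeholder for a multi-similarity loss)."""
    losses, pairs = [], []
    n = len(labels)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(embeddings[i] - embeddings[j])
            same = labels[i] == labels[j]
            # similar pairs should be close; dissimilar pairs at least `margin` apart
            losses.append(d if same else max(0.0, margin - d))
            pairs.append((i, j))
    return np.array(losses), pairs

def self_paced_weights(losses, lam):
    """Classic hard self-paced rule: keep easy pairs (loss < lam), drop hard/noisy ones.
    BSPML instead obtains the weights from a nonconvex quadratic sub-problem."""
    return (losses < lam).astype(float)

def projected_coordinate_step(w, grad, idx, step=0.1):
    """One coordinate-gradient update of a single weight, projected back onto [0, 1];
    a rough stand-in for the projection-coordinate-gradient idea."""
    w = w.copy()
    w[idx] = np.clip(w[idx] - step * grad[idx], 0.0, 1.0)
    return w

# Usage: random embeddings with random labels; pairs with large losses are down-weighted.
rng = np.random.default_rng(0)
emb = rng.normal(size=(20, 8))
lab = rng.integers(0, 2, size=20)
L, pairs = pairwise_losses(emb, lab)
w = self_paced_weights(L, lam=np.median(L))
# gradient of the weighted objective sum_i w_i * L_i w.r.t. w is simply L
w = projected_coordinate_step(w, grad=L, idx=int(rng.integers(len(w))))
weighted_loss = float(np.dot(w, L))
```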

Bibliographic Details
Published in: Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 37, No. 9 (AAAI-23 Technical Tracks 9), pp. 11183-11191
Main Authors: Zhang, Chenkang; Luo, Lei; Gu, Bin
Format: Journal Article
Language: English
Published: Association for the Advancement of Artificial Intelligence, 2023
ISSN: 2374-3468; 2159-5399
Subjects: ML: Representation Learning; ML: Deep Neural Network Algorithms; DML
Rights: Copyright (c) 2023 Association for the Advancement of Artificial Intelligence
Online Access:
https://ojs.aaai.org/index.php/AAAI/article/view/26324
https://doi.org/10.1609/aaai.v37i9.26324
Full text (PDF): https://ojs.aaai.org/index.php/AAAI/article/view/26324/26096