Multi Proxy Anchor Family Loss for Several Types of Gradients ...
The deep metric learning (DML) objective is to learn a neural network that maps data into an embedding space where similar data are near and dissimilar data are far. However, conventional proxy-based losses for DML have two problems: the gradient problem and the application to real-world datasets with multiple local centers.
Main Authors: | Saeki, Shozo; Kawahara, Minoru; Aman, Hirohisa |
---|---|
Format: | Text |
Language: | unknown |
Published: | arXiv, 2021 |
Subjects: | Computer Vision and Pattern Recognition (cs.CV) |
Online Access: | https://dx.doi.org/10.48550/arxiv.2110.03997 https://arxiv.org/abs/2110.03997 |
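The abstract describes proxy-based DML losses, in which each class is represented by one or more learnable proxy vectors and the loss pulls embeddings toward same-class proxies and pushes them from other proxies. As background, here is a minimal pure-Python sketch of the standard Proxy-Anchor loss (Kim et al., CVPR 2020) that the MPA family builds on; it is not the paper's MPA variants, and the similarity matrix and the `alpha`/`delta` defaults are illustrative assumptions.

```python
import math

def proxy_anchor_loss(sims, labels, proxy_labels, alpha=32.0, delta=0.1):
    """Sketch of the Proxy-Anchor loss (Kim et al., CVPR 2020).

    sims[j][p]   : similarity between embedding j and proxy p (e.g. cosine).
    labels[j]    : class label of embedding j.
    proxy_labels : class label of each proxy.

    L = (1/|P+|) * sum_{p in P+} log(1 + sum_{x: y_x = y_p} exp(-alpha * (s(x,p) - delta)))
      + (1/|P|)  * sum_{p}       log(1 + sum_{x: y_x != y_p} exp( alpha * (s(x,p) + delta)))
    where P+ is the set of proxies with at least one positive sample in the batch.
    """
    num_proxies = len(proxy_labels)
    # Proxies that have at least one positive embedding in this batch.
    pos_proxies = [p for p in range(num_proxies)
                   if any(lbl == proxy_labels[p] for lbl in labels)]

    pos_term = 0.0
    for p in pos_proxies:
        s = sum(math.exp(-alpha * (sims[j][p] - delta))
                for j, lbl in enumerate(labels) if lbl == proxy_labels[p])
        pos_term += math.log(1.0 + s)

    neg_term = 0.0
    for p in range(num_proxies):
        s = sum(math.exp(alpha * (sims[j][p] + delta))
                for j, lbl in enumerate(labels) if lbl != proxy_labels[p])
        neg_term += math.log(1.0 + s)

    return pos_term / max(len(pos_proxies), 1) + neg_term / num_proxies
```

With well-separated embeddings (each sample similar to its own class proxy, dissimilar to the other) the loss is near zero, while an inverted similarity matrix yields a large loss; this illustrates the anchor-style gradient behavior the abstract refers to.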
id |
ftdatacite:10.48550/arxiv.2110.03997 |
record_format |
openpolar |
institution |
Open Polar |
collection |
DataCite Metadata Store (German National Library of Science and Technology) |
op_collection_id |
ftdatacite |
language |
unknown |
topic |
Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences |
description |
The deep metric learning (DML) objective is to learn a neural network that maps data into an embedding space where similar data are near and dissimilar data are far. However, conventional proxy-based losses for DML have two problems: the gradient problem and the application to real-world datasets with multiple local centers. Additionally, the performance metrics of DML also have some issues with stability and flexibility. This paper proposes three multi-proxies anchor (MPA) family losses and a normalized discounted cumulative gain (nDCG@k) metric. This paper makes three contributions. (1) MPA-family losses can learn using a real-world dataset with multiple local centers. (2) MPA-family losses improve the training capacity of a neural network by solving the gradient problem. (3) MPA-family losses have data-wise or class-wise characteristics with respect to gradient generation. Finally, we demonstrate the effectiveness of MPA-family losses, which achieve higher accuracy on two datasets for ... |
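The proposed evaluation metric is a normalized discounted cumulative gain truncated at rank k. As a reference point, here is a minimal sketch of the generic nDCG@k computation; the paper's exact ranking protocol and relevance definition may differ, and the relevance lists below are hypothetical.

```python
import math

def dcg_at_k(rels, k):
    # DCG@k = sum_{i=1..k} rel_i / log2(i + 1), with 1-based ranks,
    # so higher-ranked relevant items contribute more.
    return sum(rel / math.log2(rank + 1)
               for rank, rel in enumerate(rels[:k], start=1))

def ndcg_at_k(rels, k):
    # Normalize by the DCG of the ideal (descending-relevance) ordering,
    # giving a score in [0, 1]; 1 means a perfect ranking.
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0
```

For example, a retrieval result whose top-k relevance list is `[1, 1, 0]` already matches the ideal ordering and scores 1.0, while `[0, 1, 1]` is penalized for placing the irrelevant item first.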
format |
Text |
author |
Saeki, Shozo; Kawahara, Minoru; Aman, Hirohisa |
author_sort |
Saeki, Shozo |
title |
Multi Proxy Anchor Family Loss for Several Types of Gradients ... |
publisher |
arXiv |
publishDate |
2021 |
url |
https://dx.doi.org/10.48550/arxiv.2110.03997 https://arxiv.org/abs/2110.03997 |
genre |
DML |
op_relation |
https://dx.doi.org/10.1016/j.cviu.2023.103654 |
op_rights |
arXiv.org perpetual, non-exclusive license http://arxiv.org/licenses/nonexclusive-distrib/1.0/ |
op_doi |
https://doi.org/10.48550/arxiv.2110.03997 https://doi.org/10.1016/j.cviu.2023.103654 |
_version_ |
1772181762324037632 |