Deep Metric Learning with Density Adaptivity
The problem of distance metric learning is mostly considered from the perspective of learning an embedding space in which the distances between pairs of examples correspond to a similarity metric. With the rise and success of Convolutional Neural Networks (CNNs), deep metric learning (DML) trains a network to learn a nonlinear transformation into the embedding space. Existing DML approaches often express the supervision by maximizing inter-class distance and minimizing intra-class variation. However, the results can suffer from overfitting, especially when the training examples of each class are embedded tightly together and the density of each class is very high. In this paper, we integrate density, i.e., a measure of data concentration in the representation, into the optimization of DML frameworks to adaptively balance inter-class similarity and intra-class variation by training the architecture in an end-to-end manner. Technically, the knowledge of density is employed as a regularizer, which is pluggable into any DML architecture with different objective functions such as contrastive loss, N-pair loss, and triplet loss. Extensive experiments on three public datasets consistently demonstrate clear improvements from amending three types of embedding with the density adaptivity. Most remarkably, our proposal increases Recall@1 from 67.95% to 77.62%, from 52.01% to 55.64%, and from 68.20% to 70.56% on the Cars196, CUB-200-2011, and Stanford Online Products datasets, respectively.
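The abstract describes the density regularizer only at a high level, so the following is a minimal, hypothetical PyTorch sketch of the general idea, not the authors' formulation: the `class_density` helper, the inverse-mean-distance density estimate, and the weight `lam` are all illustrative assumptions. It shows how such a term could be bolted onto a triplet loss; the same penalty could in principle be added to contrastive or N-pair objectives.

```python
import torch
import torch.nn.functional as F

def class_density(embeddings, labels):
    # Per-class "density" estimated (as an assumption, not the paper's
    # definition) as the inverse of the mean intra-class pairwise
    # distance: tighter clusters -> larger values.
    densities = []
    for c in labels.unique():
        pts = embeddings[labels == c]
        if pts.size(0) < 2:
            continue  # need at least a pair to measure concentration
        intra = torch.pdist(pts)  # all pairwise L2 distances within class c
        densities.append(1.0 / (intra.mean() + 1e-8))
    if not densities:
        return embeddings.new_zeros(())
    return torch.stack(densities).mean()

def triplet_loss_with_density(anchor, positive, negative,
                              embeddings, labels,
                              margin=0.2, lam=0.1):
    # Standard triplet loss plus a density penalty that discourages
    # collapsing each class into an overly tight cluster.
    trip = F.triplet_margin_loss(anchor, positive, negative, margin=margin)
    return trip + lam * class_density(embeddings, labels)

# Toy usage: 32 L2-normalized embeddings over 8 classes.
emb = F.normalize(torch.randn(32, 128), dim=1)
labels = torch.randint(0, 8, (32,))
a, p, n = emb[:8], emb[8:16], emb[16:24]
loss = triplet_loss_with_density(a, p, n, emb, labels)
```

Because the penalty is computed from the batch embeddings and labels alone, it stays independent of the base objective, which matches the paper's claim that the density term is pluggable into different DML losses.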
Main Authors: Li, Yehao; Yao, Ting; Pan, Yingwei; Chao, Hongyang; Mei, Tao
Format: Article in Journal/Newspaper (arXiv preprint; accepted by IEEE Transactions on Multimedia)
Language: English
Published: arXiv, 2019
Subjects: Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); FOS: Computer and information sciences
Rights: arXiv.org perpetual, non-exclusive license (http://arxiv.org/licenses/nonexclusive-distrib/1.0/)
Online Access: https://dx.doi.org/10.48550/arxiv.1909.03909 https://arxiv.org/abs/1909.03909