No Fuss Distance Metric Learning using Proxies
We address the problem of distance metric learning (DML), defined as learning a distance consistent with a notion of semantic similarity. Traditionally, supervision for this problem is expressed in the form of sets of points that follow an ordinal relationship -- an anchor point $x$ is similar to a...
Main Authors: | Movshovitz-Attias, Yair; Toshev, Alexander; Leung, Thomas K.; Ioffe, Sergey; Singh, Saurabh |
---|---|
Format: | Report |
Language: | unknown |
Published: | arXiv, 2017 |
Subjects: | Computer Vision and Pattern Recognition (cs.CV) |
Online Access: | https://dx.doi.org/10.48550/arxiv.1703.07464 https://arxiv.org/abs/1703.07464 |
id | ftdatacite:10.48550/arxiv.1703.07464 |
---|---|
record_format | openpolar |
spelling | ftdatacite:10.48550/arxiv.1703.07464 2023-05-15T16:02:04+02:00 No Fuss Distance Metric Learning using Proxies Movshovitz-Attias, Yair Toshev, Alexander Leung, Thomas K. Ioffe, Sergey Singh, Saurabh 2017 https://dx.doi.org/10.48550/arxiv.1703.07464 https://arxiv.org/abs/1703.07464 unknown arXiv arXiv.org perpetual, non-exclusive license http://arxiv.org/licenses/nonexclusive-distrib/1.0/ Computer Vision and Pattern Recognition cs.CV FOS Computer and information sciences Preprint Article article CreativeWork 2017 ftdatacite https://doi.org/10.48550/arxiv.1703.07464 2022-04-01T10:53:02Z We address the problem of distance metric learning (DML), defined as learning a distance consistent with a notion of semantic similarity. Traditionally, supervision for this problem is expressed in the form of sets of points that follow an ordinal relationship -- an anchor point $x$ is similar to a set of positive points $Y$ and dissimilar to a set of negative points $Z$ -- and a loss defined over these distances is minimized. While the specifics of the optimization differ, in this work we collectively call this type of supervision Triplets and all methods that follow this pattern Triplet-Based methods. These methods are challenging to optimize. A main issue is the need to find informative triplets, which is usually addressed with a variety of tricks such as increasing the batch size and hard or semi-hard triplet mining. Even with these tricks, the convergence rate of such methods is slow. In this paper we propose to optimize the triplet loss on a different space of triplets, consisting of an anchor data point and similar and dissimilar proxy points, which are learned as well. These proxies approximate the original data points, so that a triplet loss over the proxies is a tight upper bound of the original loss. This proxy-based loss is empirically better behaved. As a result, the proxy loss improves on state-of-the-art results for three standard zero-shot learning datasets by up to 15 percentage points, while converging three times as fast as other triplet-based losses. To be presented at ICCV 2017. Report DML DataCite Metadata Store (German National Library of Science and Technology) |
institution | Open Polar |
collection | DataCite Metadata Store (German National Library of Science and Technology) |
op_collection_id | ftdatacite |
language | unknown |
topic | Computer Vision and Pattern Recognition cs.CV FOS Computer and information sciences |
spellingShingle | Computer Vision and Pattern Recognition cs.CV FOS Computer and information sciences Movshovitz-Attias, Yair Toshev, Alexander Leung, Thomas K. Ioffe, Sergey Singh, Saurabh No Fuss Distance Metric Learning using Proxies |
topic_facet | Computer Vision and Pattern Recognition cs.CV FOS Computer and information sciences |
description | We address the problem of distance metric learning (DML), defined as learning a distance consistent with a notion of semantic similarity. Traditionally, supervision for this problem is expressed in the form of sets of points that follow an ordinal relationship -- an anchor point $x$ is similar to a set of positive points $Y$ and dissimilar to a set of negative points $Z$ -- and a loss defined over these distances is minimized. While the specifics of the optimization differ, in this work we collectively call this type of supervision Triplets and all methods that follow this pattern Triplet-Based methods. These methods are challenging to optimize. A main issue is the need to find informative triplets, which is usually addressed with a variety of tricks such as increasing the batch size and hard or semi-hard triplet mining. Even with these tricks, the convergence rate of such methods is slow. In this paper we propose to optimize the triplet loss on a different space of triplets, consisting of an anchor data point and similar and dissimilar proxy points, which are learned as well. These proxies approximate the original data points, so that a triplet loss over the proxies is a tight upper bound of the original loss. This proxy-based loss is empirically better behaved. As a result, the proxy loss improves on state-of-the-art results for three standard zero-shot learning datasets by up to 15 percentage points, while converging three times as fast as other triplet-based losses. To be presented at ICCV 2017. |
format | Report |
author | Movshovitz-Attias, Yair Toshev, Alexander Leung, Thomas K. Ioffe, Sergey Singh, Saurabh |
author_facet | Movshovitz-Attias, Yair Toshev, Alexander Leung, Thomas K. Ioffe, Sergey Singh, Saurabh |
author_sort | Movshovitz-Attias, Yair |
title | No Fuss Distance Metric Learning using Proxies |
title_short | No Fuss Distance Metric Learning using Proxies |
title_full | No Fuss Distance Metric Learning using Proxies |
title_fullStr | No Fuss Distance Metric Learning using Proxies |
title_full_unstemmed | No Fuss Distance Metric Learning using Proxies |
title_sort | no fuss distance metric learning using proxies |
publisher | arXiv |
publishDate | 2017 |
url | https://dx.doi.org/10.48550/arxiv.1703.07464 https://arxiv.org/abs/1703.07464 |
genre | DML |
genre_facet | DML |
op_rights | arXiv.org perpetual, non-exclusive license http://arxiv.org/licenses/nonexclusive-distrib/1.0/ |
op_doi | https://doi.org/10.48550/arxiv.1703.07464 |
_version_ | 1766397688749752320 |
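
The description field above sketches the method in enough detail to illustrate in code. Below is a minimal NumPy sketch, written for this page rather than taken from the paper, contrasting a standard triplet hinge loss with a Proxy-NCA-style loss over learned class proxies. The function names, shapes, margin value, and the one-proxy-per-class (static) assignment are illustrative assumptions; the paper also L2-normalizes embeddings and proxies, which is assumed to have happened upstream here.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.1):
    """Standard triplet hinge loss for one (anchor, positive, negative) triple.

    Training with this loss needs informative triples to make progress,
    which is why triplet-based methods lean on large batches and
    hard/semi-hard mining. The margin value here is illustrative.
    """
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(0.0, d_pos - d_neg + margin)

def proxy_nca_loss(x, proxies, y):
    """Proxy-NCA-style loss for a single anchor embedding.

    x       -- (D,) anchor embedding (assumed L2-normalized upstream)
    proxies -- (C, D) learned proxy matrix, one proxy per class (assumption:
               the static proxy assignment described in the abstract)
    y       -- integer class label of the anchor

    Computes -log( exp(-d(x, p_y)) / sum_{z != y} exp(-d(x, p_z)) )
    with d the squared Euclidean distance: the anchor is scored against
    every proxy, so no triplet sampling or mining is required.
    """
    d = np.sum((proxies - x) ** 2, axis=1)  # distance from x to each proxy
    d_neg = np.delete(d, y)                 # distances to negative proxies
    # numerically stable log-sum-exp over the negative proxies
    m = np.max(-d_neg)
    return d[y] + m + np.log(np.sum(np.exp(-d_neg - m)))

# Illustrative usage with random data.
rng = np.random.default_rng(0)
x = rng.normal(size=64)
proxies = rng.normal(size=(10, 64))
print(proxy_nca_loss(x, proxies, y=3))
```

In training, `proxies` would be a parameter tensor optimized jointly with the embedding network. Because every anchor is compared against all C proxies instead of mined data points, the batch-size and mining tricks the abstract describes become unnecessary, which is consistent with the faster convergence the authors report.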