S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning

Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot applications by learning generalizing embedding spaces, although recent work in DML has shown strong performance saturation across training objectives. However, generalization capacity is known to scale with the embedding space dimensionality. Unfortunately, high-dimensional embeddings also create higher retrieval cost for downstream applications. To remedy this, we propose Simultaneous Similarity-based Self-distillation (S2SD). S2SD extends DML with knowledge distillation from auxiliary, high-dimensional embedding and feature spaces to leverage complementary context during training, while retaining test-time cost and with negligible changes to the training time. Experiments and ablations across different objectives and standard benchmarks show S2SD offers notable improvements of up to 7% in Recall@1, while also setting a new state-of-the-art. Code available at https://github.com/MLforHealth/S2SD. (Accepted to ICML 2021.)
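The abstract describes distilling similarity structure from a high-dimensional auxiliary embedding (teacher) into the lower-dimensional test-time embedding (student). A minimal illustrative sketch of that idea in NumPy — not the authors' implementation; the function names, the plain softmax-over-similarities formulation, and the temperature default are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def similarity_distillation_loss(z_low, z_high, temperature=1.0):
    """Mean KL divergence between row-wise similarity distributions of a
    low-dimensional (student) and a high-dimensional auxiliary (teacher)
    embedding of the same batch."""
    # L2-normalize so dot products are cosine similarities.
    z_low = z_low / np.linalg.norm(z_low, axis=1, keepdims=True)
    z_high = z_high / np.linalg.norm(z_high, axis=1, keepdims=True)
    p = softmax(z_high @ z_high.T / temperature)  # teacher distribution
    q = softmax(z_low @ z_low.T / temperature)    # student distribution
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1)))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 512))  # high-dimensional auxiliary embedding
student = teacher[:, :64]            # low-dimensional test-time embedding
loss = similarity_distillation_loss(student, teacher)
```

In training, this loss would be added to the base DML objective so the cheap test-time embedding inherits the relational structure of the richer auxiliary space.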


Bibliographic Details
Main Authors: Roth, Karsten; Milbich, Timo; Ommer, Björn; Cohen, Joseph Paul; Ghassemi, Marzyeh
Format: Article in Journal/Newspaper
Language: unknown
Published: arXiv 2020
Subjects: DML
Online Access: https://dx.doi.org/10.48550/arxiv.2009.08348
https://arxiv.org/abs/2009.08348
institution Open Polar
collection DataCite Metadata Store (German National Library of Science and Technology)
topic Computer Vision and Pattern Recognition (cs.CV)
FOS: Computer and information sciences
op_rights arXiv.org perpetual, non-exclusive license
http://arxiv.org/licenses/nonexclusive-distrib/1.0/