Learning with Memory-based Virtual Classes for Deep Metric Learning

The core of deep metric learning (DML) is to learn visual similarity in a high-dimensional embedding space. One of the main challenges is to generalize from the seen classes of the training data to the unseen classes of the test data. Recent works have exploited past embeddings to increase the number of instances for the seen classes. Such methods improve performance through augmentation, but the strong focus on seen classes remains. This can be undesirable for DML, where training and test data contain entirely different classes. This work presents MemVir, a novel training strategy for DML. Unlike previous works, MemVir memorizes both embedding features and class weights and uses them as additional virtual classes. Exploiting virtual classes not only provides augmented information for training but also alleviates the strong focus on seen classes, improving generalization. Moreover, MemVir embeds the idea of curriculum learning by slowly adding virtual classes for a gradual increase in learning difficulty, which improves both training stability and final performance. MemVir can be applied to many existing loss functions without modification. Extensive experiments on standard benchmarks demonstrate the superiority of MemVir over state-of-the-art competitors. The code for MemVir is publicly available. (Accepted by ICCV 2021.)
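
To make the described mechanism concrete, here is a minimal PyTorch sketch written from the abstract alone, assuming a norm-softmax-style classification loss. For brevity it memorizes only class weights (the paper memorizes embedding features as well), and all names and hyperparameters (memory_size, warmup_steps, grow_every, the logit scale of 16) are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn.functional as F
from collections import deque


class MemVirSoftmaxLoss(torch.nn.Module):
    """Softmax-based DML loss augmented with memorized virtual classes.

    Each step, a detached snapshot of the class-weight matrix is pushed into
    a FIFO memory. Snapshots from past steps are appended to the logits as
    extra "virtual" classes, so the model must also separate current classes
    from stale versions of themselves, loosening the focus on seen classes.
    """

    def __init__(self, num_classes, embed_dim,
                 memory_size=3, warmup_steps=1000, grow_every=1000):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(num_classes, embed_dim) * 0.01)
        self.memory = deque(maxlen=memory_size)  # FIFO of past weight snapshots
        self.warmup_steps = warmup_steps         # steps before any virtual class is used
        self.grow_every = grow_every             # interval for adding one more snapshot
        self.step = 0

    def _num_active_snapshots(self):
        # Curriculum: no virtual classes during warm-up, then one more
        # snapshot's worth every `grow_every` steps, capped by memory size.
        if self.step < self.warmup_steps:
            return 0
        return min(len(self.memory),
                   1 + (self.step - self.warmup_steps) // self.grow_every)

    def forward(self, embeddings, labels):
        emb = F.normalize(embeddings, dim=1)
        w = F.normalize(self.weight, dim=1)
        logits = emb @ w.t()                     # cosine logits vs. real classes

        k = self._num_active_snapshots()
        for snapshot in list(self.memory)[:k]:   # oldest snapshots first
            v = F.normalize(snapshot, dim=1)
            logits = torch.cat([logits, emb @ v.t()], dim=1)

        # Labels only ever point at real classes, so virtual classes act
        # purely as extra negatives in the softmax denominator.
        loss = F.cross_entropy(logits * 16.0, labels)

        self.memory.append(self.weight.detach().clone())  # memorize current weights
        self.step += 1
        return loss

In training, this module would replace the plain classification head: given backbone embeddings for a batch, call loss = memvir(embeddings, labels) and backpropagate as usual. Because the snapshots are detached, gradients flow only through the current classes, matching the intuition that virtual classes serve as fixed extra negatives.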

Bibliographic Details
Main Authors: Ko, Byungsoo; Gu, Geonmo; Kim, Han-Gyu
Format: Preprint (arXiv article)
Language: English
Published: arXiv, 2021 (accepted by ICCV 2021)
Subjects: Computer Vision and Pattern Recognition (cs.CV); Information Retrieval (cs.IR); Machine Learning (cs.LG); FOS: Computer and information sciences
Online Access: https://dx.doi.org/10.48550/arxiv.2103.16940
https://arxiv.org/abs/2103.16940
License: arXiv.org perpetual, non-exclusive license
http://arxiv.org/licenses/nonexclusive-distrib/1.0/