Exponential Discriminative Metric Embedding in Deep Learning

With the remarkable success recently achieved by Convolutional Neural Networks (CNNs) in object recognition, deep learning is being widely used in the computer vision community. Deep Metric Learning (DML), which integrates deep learning with conventional metric learning, has set new records in many fields, especially in classification tasks. In this paper, we propose a replicable DML method, called Include and Exclude (IE) loss, which forces the distance between a sample and its designated class center to be smaller than the mean distance of this sample to the other class centers by a large margin in the exponential feature projection space. With the supervision of the IE loss, we can train CNNs to enhance intra-class compactness and inter-class separability, leading to substantial improvements on several public datasets ranging from object recognition to face verification. We conduct a comparative study of our algorithm against several typical DML methods on three kinds of networks with different capacities. Extensive experiments on three object recognition datasets and two face recognition datasets demonstrate that the IE loss consistently outperforms other mainstream DML methods and approaches state-of-the-art results.
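
The abstract describes the IE loss only in words; the precise objective is defined in the paper itself. Purely as an illustration of the include/exclude idea, a minimal PyTorch-style sketch might look like the following. The exp(-distance) projection, the hinge-style margin, and all names and parameters here are assumptions for illustration, not the paper's actual formulation.

import torch
import torch.nn.functional as F

def ie_style_loss(features, labels, centers, margin=1.0, scale=1.0):
    """
    features: (B, D) embedding vectors from the CNN
    labels:   (B,)   integer class labels
    centers:  (C, D) class centers (e.g., a learnable nn.Parameter)
    """
    # Squared Euclidean distance from every sample to every class center: shape (B, C).
    dists = torch.cdist(features, centers, p=2).pow(2)

    # Assumed exponential projection of distances into similarities in (0, 1].
    proj = torch.exp(-scale * dists)

    B, C = proj.shape
    # "Include" term: similarity to the sample's designated class center.
    own = proj[torch.arange(B), labels]
    # "Exclude" term: mean similarity to all other class centers.
    mask = F.one_hot(labels, num_classes=C).bool()
    others = proj.masked_fill(mask, 0.0).sum(dim=1) / (C - 1)

    # Hinge-style penalty: the own-center similarity should exceed the mean of the others by `margin`.
    return F.relu(margin - (own - others)).mean()

In such a sketch, `centers` would typically be a learnable parameter of shape (num_classes, embedding_dim) trained jointly with the network, in the spirit of center-loss-style supervision.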

Bibliographic Details
Main Authors: Wu, Bowen; Chen, Zhangling; Wang, Jun; Wu, Huaming
Format: Report
Language: English
Published: arXiv 2018
Subjects:
Computer Vision and Pattern Recognition (cs.CV)
Machine Learning (cs.LG)
Machine Learning (stat.ML)
FOS: Computer and information sciences
DML
Online Access:https://dx.doi.org/10.48550/arxiv.1803.02504
https://arxiv.org/abs/1803.02504
Institution: Open Polar
Collection: DataCite Metadata Store (German National Library of Science and Technology)
Rights: arXiv.org perpetual, non-exclusive license (http://arxiv.org/licenses/nonexclusive-distrib/1.0/)