MES-loss: Mutually Equidistant Separation Metric Learning Loss Function

Bibliographic Details
Published in: Pattern Recognition Letters
Main Authors: Boutaleb, Yasser Mohamed; Soladie, Catherine; Duong, Nam-Duong; Kacete, Amine; Royan, Jérôme; Seguier, Renaud
Other Authors: Institut de Recherche Technologique b-com (IRT b-com), Institut d'Électronique et des Technologies du numérique (IETR), Université de Rennes (UR)-Institut National des Sciences Appliquées - Rennes (INSA Rennes), Institut National des Sciences Appliquées (INSA)-CentraleSupélec-Centre National de la Recherche Scientifique (CNRS)-Nantes Université - pôle Sciences et technologie, Nantes Université (Nantes Univ), Immersive & Medical Technologies, Applications, Institut de Recherche Technologique b-com (IRT b-com)
Format: Article in Journal/Newspaper
Language: English
Published: HAL CCSD 2023
Subjects:
DML
Online Access: https://hal.science/hal-04122261
https://doi.org/10.1016/j.patrec.2023.06.005
Description
Summary: Deep metric learning has attracted much attention in recent years owing to its wide range of applications, such as clustering and image retrieval. Building on the success of deep learning (DL), many deep metric learning (DML) methods have been proposed. Neural networks (NNs) use DML loss functions to learn a mapping that embeds samples into a highly discriminative low-dimensional feature space, in which similarities between pairs of samples can be measured directly. Most existing methods try to boost the discriminative power of the NN by enhancing intra-class compactness in the high-level feature space, but they do not explicitly impose constraints that improve inter-class separation. In this paper we propose a new composite DML loss function that, in addition to enforcing intra-class compactness, explicitly imposes constraints that enforce inter-class separation by distributing the class centers so that they are mutually equidistant. The proposed DML loss function achieves state-of-the-art results for clustering and image-retrieval tasks on two real-world data sets.
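The composite idea described in the abstract — an intra-class compactness term plus a term that pushes class centers toward mutual equidistance — can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation: the choice of a center-loss-style compactness term, the variance-of-center-distances penalty, and the weighting parameter `alpha` are all assumptions made for illustration only.

```python
import numpy as np

def mes_style_loss(embeddings, labels, alpha=1.0):
    """Illustrative composite metric-learning loss (NOT the paper's
    exact MES-loss): intra-class compactness plus a penalty that drives
    all pairwise distances between class centers toward a common value,
    i.e. toward mutually equidistant centers."""
    classes = np.unique(labels)
    centers = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])

    # Intra-class compactness: mean squared distance of each sample
    # to its own class center (a center-loss-style term).
    intra = np.mean([
        np.sum((embeddings[labels == c] - centers[i]) ** 2, axis=1).mean()
        for i, c in enumerate(classes)
    ])

    # Inter-class separation: the variance of the pairwise center
    # distances is zero exactly when all centers are mutually
    # equidistant, so penalizing it spreads the centers evenly.
    k = len(classes)
    dists = [np.linalg.norm(centers[i] - centers[j])
             for i in range(k) for j in range(i + 1, k)]
    inter = np.var(dists) if len(dists) > 1 else 0.0

    return intra + alpha * inter
```

With three classes whose centers form an equilateral triangle, the penalty vanishes; with collinear, unevenly spaced centers it is strictly positive, which is the behavior the equidistance constraint is meant to induce.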