Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric ...

Bibliographic Details
Main Authors: Jiang, Xiruo, Yao, Yazhou, Dai, Xili, Shen, Fumin, Hua, Xian-Sheng, Shen, Heng-Tao
Format: Report
Language: English
Published: arXiv 2024
Subjects: Deep metric learning (DML)
Online Access: https://dx.doi.org/10.48550/arxiv.2407.03106
https://arxiv.org/abs/2407.03106
Description
Summary: Deep metric learning (DML) aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval. Prior literature predominantly focuses on pair-based and proxy-based methods to maximize inter-class discrepancy and minimize intra-class diversity. However, these methods tend to suffer from the collapse of the embedding space due to their over-reliance on label information. This leads to sub-optimal feature representation and inferior model performance. To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss. Specifically, our proposed loss primarily draws inspiration from the principle of Maximal Coding Rate Reduction. It promotes the sparseness of feature clusters in the embedding space to prevent collapse by maximizing the average coding rate of sample features or class proxies. Moreover, we integrate our proposed loss with pair-based and proxy-based methods, resulting in ...
Comment: accepted by IEEE Transactions on Multimedia
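
The coding-rate idea in the summary can be made concrete with a short sketch. The code below is a minimal PyTorch illustration, assuming the standard coding-rate definition from the Maximal Coding Rate Reduction literature, R(Z) = 1/2 logdet(I + d/(n eps^2) Z^T Z), applied to a batch of L2-normalized embeddings or class proxies. The function names (coding_rate, anti_collapse_loss), the eps and lam parameters, and the proxy_anchor_loss placeholder are illustrative assumptions, not the authors' implementation; the exact formulation in the paper may differ.

import torch

def coding_rate(Z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    # Average coding rate R(Z) = 1/2 * logdet(I + d / (n * eps^2) * Z^T Z),
    # where Z is an (n, d) matrix of L2-normalized embeddings or class proxies.
    n, d = Z.shape
    identity = torch.eye(d, device=Z.device, dtype=Z.dtype)
    scale = d / (n * eps ** 2)
    return 0.5 * torch.logdet(identity + scale * Z.T @ Z)

def anti_collapse_loss(Z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    # Negative coding rate: minimizing this term maximizes the volume spanned
    # by the embeddings, discouraging them from collapsing onto a few directions.
    return -coding_rate(Z, eps)

# Hypothetical usage: add the term to an existing pair- or proxy-based DML loss.
# embeddings = torch.nn.functional.normalize(model(images), dim=1)
# loss = proxy_anchor_loss(embeddings, labels) + lam * anti_collapse_loss(embeddings)

The logdet term grows with the number of directions the embeddings actually occupy, so driving it up counteracts the tendency of purely label-driven pair- and proxy-based losses to squeeze each class onto a single point or low-dimensional subspace, which is the collapse the summary describes.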