Deep Metric Learning with Chance Constraints


Bibliographic Details
Main Authors: Gurbuz, Yeti Z., Can, Ogul, Alatan, A. Aydin
Format: Article in Journal/Newspaper
Language: English
Published: arXiv 2022
Subjects: DML
Online Access: https://dx.doi.org/10.48550/arxiv.2209.09060
https://arxiv.org/abs/2209.09060
Description
Summary: Deep metric learning (DML) aims to minimize the empirical expected loss of pairwise intra-/inter-class proximity violations in the embedding space. We relate DML to the feasibility problem of finite chance constraints. We show that the minimizer of proxy-based DML satisfies certain chance constraints, and that the worst-case generalization performance of proxy-based methods can be characterized by the radius of the smallest ball around a class proxy that covers the entire domain of the corresponding class samples, suggesting that multiple proxies per class help performance. To provide a scalable algorithm as well as to exploit more proxies, we consider the chance constraints implied by the minimizers of proxy-based DML instances and reformulate DML as finding a feasible point in the intersection of such constraints, resulting in a problem that can be approximately solved by iterative projections. Simply put, we repeatedly train a regularized proxy-based loss and re-initialize the proxies with the embeddings of the deliberately ... Accepted as a conference paper at the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) 2024.
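
The summary above outlines an iterative-projection scheme: train a regularized proxy-based loss, re-seed the proxies from sample embeddings, and repeat. The following is a minimal PyTorch sketch of such a loop, not the authors' reference implementation; since the description here is truncated, the ProxyNCA-style loss, the squared-norm regularizer, the class-mean re-initialization rule, and the hyperparameters (outer_rounds, inner_epochs, reg_weight) are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    class ProxyNCALoss(torch.nn.Module):
        """A standard proxy-based DML loss with one learnable proxy per class (illustrative)."""
        def __init__(self, num_classes: int, dim: int):
            super().__init__()
            self.proxies = torch.nn.Parameter(torch.randn(num_classes, dim))

        def forward(self, embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
            # Cosine similarities between L2-normalized embeddings and proxies;
            # cross-entropy pulls each sample toward its class proxy.
            z = F.normalize(embeddings, dim=1)
            p = F.normalize(self.proxies, dim=1)
            return F.cross_entropy(z @ p.t(), labels)

    def reinitialize_proxies(loss_fn, model, loader, device):
        """Re-initialize each proxy with the mean embedding of its class samples
        (one plausible choice; the paper's exact selection rule is elided above)."""
        model.eval()
        sums = torch.zeros_like(loss_fn.proxies)
        counts = torch.zeros(sums.size(0), device=device)
        with torch.no_grad():
            for x, y in loader:
                y = y.to(device)
                z = F.normalize(model(x.to(device)), dim=1)
                sums.index_add_(0, y, z)
                counts.index_add_(0, y, torch.ones_like(y, dtype=sums.dtype, device=device))
        mask = counts > 0
        loss_fn.proxies.data[mask] = sums[mask] / counts[mask].unsqueeze(1)

    def train_iterative_projections(model, loader, num_classes, dim, device,
                                    outer_rounds=5, inner_epochs=10, reg_weight=1e-2):
        loss_fn = ProxyNCALoss(num_classes, dim).to(device)
        for _ in range(outer_rounds):  # each round acts as one approximate projection step
            opt = torch.optim.Adam(list(model.parameters()) + list(loss_fn.parameters()), lr=1e-4)
            for _ in range(inner_epochs):
                model.train()
                for x, y in loader:
                    z = model(x.to(device))
                    # Assumed regularizer standing in for the paper's regularization term.
                    loss = loss_fn(z, y.to(device)) + reg_weight * z.pow(2).mean()
                    opt.zero_grad()
                    loss.backward()
                    opt.step()
            # Re-seed the proxies from the current embeddings before the next round.
            reinitialize_proxies(loss_fn, model, loader, device)
        return model

In this sketch, each outer round trains the regularized proxy-based objective to (approximately) project onto the feasible set implied by the current proxies, and the proxy re-initialization supplies the next set of constraints, mirroring the feasibility-by-iterative-projections view described in the summary.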