Generalizable Embeddings with Cross-batch Metric Learning

Bibliographic Details
Main Authors: Yeti Z. Gurbuz, A. Aydın Alatan
Format: Article in Journal/Newspaper
Language: English
Published: IEEE 2023
Subjects: DML
Online Access: https://dx.doi.org/10.17023/pv10-4578
https://rc.signalprocessingsociety.org/conferences/icip-2023/spsicip23vid0152
Description
Summary: IEEE ICIP 2023, Hybrid Event, 8-11 October 2023, Kuala Lumpur, Malaysia. Global average pooling (GAP) is a popular component in deep metric learning (DML) for aggregating features. Its effectiveness is often attributed to treating each feature vector as a distinct semantic entity and GAP as a combination of them. Although substantiated, the algorithmic implications of this explanation for learning generalizable entities that represent unseen classes, a crucial goal of DML, remain unclear. To address this, we formulate GAP as a convex combination of learnable prototypes. We then show that prototype learning can be expressed as a recursive process that fits a linear predictor to a batch of samples. Building on that perspective, we consider two batches of disjoint classes at each iteration and regularize the learning by expressing the samples of one batch with the prototypes fitted to the other batch. We validate our approach on four popular DML benchmarks.
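
To make the pooling reinterpretation concrete, below is a minimal PyTorch sketch of pooling as a convex combination of learnable prototypes, in the spirit of the summary above. The prototype count, the softmax soft-assignment, and the names (PrototypePooling, num_prototypes) are illustrative assumptions, not the authors' exact construction.

import torch
import torch.nn as nn

class PrototypePooling(nn.Module):
    # Pools local features into one embedding that is, by construction,
    # a convex combination of learnable prototype vectors (sketch only).
    def __init__(self, dim: int, num_prototypes: int = 64):
        super().__init__()
        # One learnable prototype per assumed semantic entity.
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, dim))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, locations, dim) local features before pooling.
        sim = feats @ self.prototypes.t()    # (b, n, K) similarities
        assign = sim.softmax(dim=-1)         # convex weights per location
        recon = assign @ self.prototypes     # express features via prototypes
        return recon.mean(dim=1)             # average over locations, as in GAP

# Example: a flattened 7x7 feature map from a CNN backbone (hypothetical sizes).
pool = PrototypePooling(dim=512, num_prototypes=64)
feats = torch.randn(8, 49, 512)
emb = pool(feats)                            # (8, 512) pooled embeddings

Each output embedding lies in the convex hull of the prototypes, mirroring the reading of GAP above; the cross-batch regularizer would then express one batch's samples with prototypes fitted to a batch of disjoint classes.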