Unsupervised Metric Learning with Synthetic Examples

Bibliographic Details
Published in: Proceedings of the AAAI Conference on Artificial Intelligence
Main Authors: Dutta, Ujjal K.; Harandi, Mehrtash; Sekhar, C. Chandra
Other Authors: Conitzer, Vincent; Sha, Fei
Format: Other Non-Article Part of Journal/Newspaper
Language: English
Published: Association for the Advancement of Artificial Intelligence (AAAI), 2020
Subjects: DML
Online Access: https://research.monash.edu/en/publications/57ff31f8-f673-4448-8a1d-1a51ede2255d
https://doi.org/10.1609/aaai.v34i04.5795
https://researchmgt.monash.edu/ws/files/381951922/351816113_oa.pdf
http://www.scopus.com/inward/record.url?scp=85106445643&partnerID=8YFLogxK
Description
Summary: Distance Metric Learning (DML) involves learning an embedding that brings similar examples closer while pushing dissimilar ones apart. Existing DML approaches make use of class labels to generate constraints for metric learning. In this paper, we address the less-studied problem of learning a metric in an unsupervised manner. We do not make use of class labels; instead, we use unlabeled data to generate adversarial, synthetic constraints for learning a metric-inducing embedding. Since entropy is a measure of uncertainty, we minimize the entropy of a conditional probability distribution to learn the metric. Our stochastic formulation scales well to large datasets and performs competitively with existing metric learning methods.
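
A minimal sketch (not the authors' actual method) of the entropy-minimization idea in the abstract, assuming a linear embedding Z = X L (so the learned metric is the Mahalanobis form M = L L^T) and a softmax conditional probability p(j | i) over negative squared embedding distances, in the spirit of stochastic neighbor formulations. The dimensions, learning rate, and step count below are illustrative placeholders, and the paper's adversarial synthetic-constraint generation is omitted:

    # Hypothetical sketch of entropy-minimization metric learning;
    # NOT the authors' exact formulation.
    import torch

    torch.manual_seed(0)

    n, d, k = 200, 32, 16
    X = torch.randn(n, d)          # stand-in for unlabeled data

    # Linear embedding L: R^d -> R^k; induced metric M = L L^T.
    L = torch.randn(d, k, requires_grad=True)
    opt = torch.optim.Adam([L], lr=1e-2)

    for step in range(200):
        Z = X @ L
        # Squared pairwise distances, computed without a sqrt so the
        # gradient stays well defined on the zero diagonal.
        sq = (Z * Z).sum(dim=1)
        D = sq.unsqueeze(1) + sq.unsqueeze(0) - 2.0 * (Z @ Z.t())
        # Conditional p(j | i): softmax over negative distances,
        # excluding self-pairs.
        D = D.masked_fill(torch.eye(n, dtype=torch.bool), float("inf"))
        P = torch.softmax(-D, dim=1)
        # Mean entropy of the conditionals; minimizing it makes each
        # point's neighborhood distribution more peaked under the metric.
        H = -(P * (P + 1e-12).log()).sum(dim=1).mean()
        opt.zero_grad()
        H.backward()
        opt.step()
        if step % 50 == 0:
            print(f"step {step:3d}  mean entropy {H.item():.4f}")

Minimizing the mean conditional entropy drives each p(. | i) toward a peaked distribution, so the learned metric commits each point to a few confident neighbors; the paper additionally derives its constraints from adversarially generated synthetic examples, which this toy sketch leaves out.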