SoftTriple Loss: Deep Metric Learning Without Triplet Sampling

Distance metric learning (DML) aims to learn embeddings in which examples from the same class are closer than examples from different classes. It can be cast as an optimization problem with triplet constraints. Because the number of triplet constraints is vast, a sampling strategy is essential for DML. With the tremendous success of deep learning in classification, it has also been applied to DML. When learning embeddings with deep neural networks (DNNs), only a mini-batch of data is available at each iteration, so the triplet constraints have to be sampled within the mini-batch. Since a mini-batch cannot capture the neighborhood structure of the full data set well, the learned embeddings are sub-optimal. In contrast, optimizing the SoftMax loss, which is a classification loss, with a DNN shows superior performance on certain DML tasks. This inspires us to investigate the formulation of SoftMax. Our analysis shows that the SoftMax loss is equivalent to a smoothed triplet loss in which each class has a single center. In real-world data, one class can contain several local clusters rather than a single one, e.g., birds in different poses. We therefore propose the SoftTriple loss, which extends the SoftMax loss with multiple centers per class. Compared with conventional deep metric learning algorithms, optimizing the SoftTriple loss can learn the embeddings without the sampling phase, at the cost of mildly increasing the size of the last fully connected layer. Experiments on benchmark fine-grained data sets demonstrate the effectiveness of the proposed loss function.
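The multi-center idea in the abstract can be sketched concretely: each class keeps K centers, the similarity to a class is a softmax-weighted (relaxed max) combination of the similarities to its centers, and a margin softmax is applied on top. Below is a minimal single-example NumPy sketch under the assumption of unit-normalized embeddings and centers; the parameter names `la`, `gamma`, and `delta` mirror the paper's λ, γ, δ, and the code is an illustration, not the authors' implementation.

```python
import numpy as np

def softtriple_loss(x, W, y, la=20.0, gamma=0.1, delta=0.01):
    """SoftTriple loss for one example (illustrative sketch).

    x : (d,) unit-normalized embedding
    W : (C, K, d) unit-normalized centers, K per class
    y : int, ground-truth class index
    la, gamma, delta : scaling factor, center-softmax temperature, margin
    """
    s = W @ x                             # (C, K): similarity to every center
    p = np.exp(s / gamma)
    p /= p.sum(axis=1, keepdims=True)     # soft assignment over the K centers
    S = (p * s).sum(axis=1)               # (C,): relaxed max similarity per class
    logits = la * S
    logits[y] -= la * delta               # margin on the target class only
    m = logits.max()                      # numerically stable log-softmax
    return -(logits[y] - m - np.log(np.exp(logits - m).sum()))
```

With K = 1 the soft assignment is trivially 1 and the expression collapses to a margin SoftMax over one center per class, which is the equivalence to the smoothed triplet loss the abstract refers to.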

Bibliographic Details
Published in: 2019 IEEE/CVF International Conference on Computer Vision (ICCV)
Main Authors: Qian, Qi; Shang, Lei; Sun, Baigui; Hu, Juhua; Li, Hao; Jin, Rong
Format: Text
Language: unknown
Published: UW Tacoma Digital Commons, 2019
Subjects: DML
Online Access: https://digitalcommons.tacoma.uw.edu/tech_pub/366
https://doi.org/10.1109/ICCV.2019.00655
institution Open Polar
collection University of Washington: UW Tacoma Digital Commons
topic Measurement
task analysis
neural networks
principal component analysis
optimization
benchmark testing
data models
description Distance metric learning (DML) aims to learn embeddings in which examples from the same class are closer than examples from different classes. It can be cast as an optimization problem with triplet constraints. Because the number of triplet constraints is vast, a sampling strategy is essential for DML. With the tremendous success of deep learning in classification, it has also been applied to DML. When learning embeddings with deep neural networks (DNNs), only a mini-batch of data is available at each iteration, so the triplet constraints have to be sampled within the mini-batch. Since a mini-batch cannot capture the neighborhood structure of the full data set well, the learned embeddings are sub-optimal. In contrast, optimizing the SoftMax loss, which is a classification loss, with a DNN shows superior performance on certain DML tasks. This inspires us to investigate the formulation of SoftMax. Our analysis shows that the SoftMax loss is equivalent to a smoothed triplet loss in which each class has a single center. In real-world data, one class can contain several local clusters rather than a single one, e.g., birds in different poses. We therefore propose the SoftTriple loss, which extends the SoftMax loss with multiple centers per class. Compared with conventional deep metric learning algorithms, optimizing the SoftTriple loss can learn the embeddings without the sampling phase, at the cost of mildly increasing the size of the last fully connected layer. Experiments on benchmark fine-grained data sets demonstrate the effectiveness of the proposed loss function.
format Text
author Qian, Qi
Shang, Lei
Sun, Baigui
Hu, Juhua
Li, Hao
Jin, Rong
title SoftTriple Loss: Deep Metric Learning Without Triplet Sampling
publisher UW Tacoma Digital Commons
publishDate 2019
url https://digitalcommons.tacoma.uw.edu/tech_pub/366
https://doi.org/10.1109/ICCV.2019.00655
genre DML
op_source School of Engineering and Technology Publications
container_title 2019 IEEE/CVF International Conference on Computer Vision (ICCV)
container_start_page 6449
op_container_end_page 6457