Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings
Visual identification of individual animals that bear unique natural body markings is an important task in wildlife conservation. Photo databases of animal markings keep growing, and each new observation has to be matched against thousands of images. Existing photo-identification solutions impose constraints on image quality and on the appearance of the pattern of interest in the image; these constraints limit the use of photos from citizen scientists. We present a novel system for visual re-identification based on unique natural markings that is robust to occlusions and to viewpoint and illumination changes. We adapt methods developed for face re-identification and implement a deep convolutional neural network (CNN) to learn embeddings for images of natural markings. The distance between the learned embedding points provides a dissimilarity measure between the corresponding input images. The network is optimized with the triplet loss function and an online semi-hard triplet mining strategy. The proposed re-identification method is generic rather than species-specific. We evaluate the system on image databases of manta ray belly patterns and humpback whale flukes. To be of practical value and adopted by marine biologists, a re-identification system needs a top-10 accuracy of at least 95%; the proposed system achieves this standard.
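Matching each new observation against thousands of catalogued images amounts to nearest-neighbour retrieval in an embedding space, scored by top-k accuracy (the record cites a top-10 target of 95%). As a minimal illustration only — not the paper's implementation — here is a plain-Python sketch with hypothetical toy 2-D embeddings and string IDs:

```python
def dist2(a, b):
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def top_k_accuracy(query_embs, query_ids, db_embs, db_ids, k=10):
    """Fraction of queries whose true individual's ID appears among the
    k nearest database embeddings (ranked by squared Euclidean distance)."""
    hits = 0
    for q, qid in zip(query_embs, query_ids):
        ranked = sorted(range(len(db_embs)), key=lambda i: dist2(q, db_embs[i]))
        if qid in {db_ids[i] for i in ranked[:k]}:
            hits += 1
    return hits / len(query_embs)

# Hypothetical database of three individuals and one query photo of "a":
db_embs = [[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]]
db_ids = ["a", "b", "c"]
print(top_k_accuracy([[0.1, 0.0]], ["a"], db_embs, db_ids, k=1))  # → 1.0
```

In the actual system the embeddings would come from the trained CNN; the ranking and scoring logic is independent of the species or the network.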
Main Authors: | Moskvyak, Olga; Maire, Frederic; Armstrong, Asia O.; Dayoub, Feras; Baktashmotlagh, Mahsa |
---|---|
Format: | Report |
Language: | unknown |
Published: | arXiv, 2019 |
Subjects: | Computer Vision and Pattern Recognition (cs.CV) |
Online Access: | https://dx.doi.org/10.48550/arxiv.1902.10847 https://arxiv.org/abs/1902.10847 |
id |
ftdatacite:10.48550/arxiv.1902.10847 |
---|---|
record_format |
openpolar |
spelling |
ftdatacite:10.48550/arxiv.1902.10847 2023-05-15T16:36:05+02:00 Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings Moskvyak, Olga; Maire, Frederic; Armstrong, Asia O.; Dayoub, Feras; Baktashmotlagh, Mahsa 2019 https://dx.doi.org/10.48550/arxiv.1902.10847 https://arxiv.org/abs/1902.10847 unknown arXiv arXiv.org perpetual, non-exclusive license http://arxiv.org/licenses/nonexclusive-distrib/1.0/ Computer Vision and Pattern Recognition cs.CV FOS Computer and information sciences Preprint Article article CreativeWork 2019 ftdatacite https://doi.org/10.48550/arxiv.1902.10847 2022-04-01T08:50:40Z [abstract as given in the description field below] 12 pages, 15 figures Report Humpback Whale DataCite Metadata Store (German National Library of Science and Technology) |
institution |
Open Polar |
collection |
DataCite Metadata Store (German National Library of Science and Technology) |
op_collection_id |
ftdatacite |
language |
unknown |
topic |
Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences |
spellingShingle |
Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences; Moskvyak, Olga; Maire, Frederic; Armstrong, Asia O.; Dayoub, Feras; Baktashmotlagh, Mahsa; Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
topic_facet |
Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences |
description |
Visual identification of individual animals that bear unique natural body markings is an important task in wildlife conservation. Photo databases of animal markings keep growing, and each new observation has to be matched against thousands of images. Existing photo-identification solutions impose constraints on image quality and on the appearance of the pattern of interest in the image; these constraints limit the use of photos from citizen scientists. We present a novel system for visual re-identification based on unique natural markings that is robust to occlusions and to viewpoint and illumination changes. We adapt methods developed for face re-identification and implement a deep convolutional neural network (CNN) to learn embeddings for images of natural markings. The distance between the learned embedding points provides a dissimilarity measure between the corresponding input images. The network is optimized with the triplet loss function and an online semi-hard triplet mining strategy. The proposed re-identification method is generic rather than species-specific. We evaluate the system on image databases of manta ray belly patterns and humpback whale flukes. To be of practical value and adopted by marine biologists, a re-identification system needs a top-10 accuracy of at least 95%; the proposed system achieves this standard. 12 pages, 15 figures |
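The description names the triplet loss and online semi-hard mining. As a minimal sketch only — the paper's actual system mines triplets over CNN embeddings within a training batch — the two ideas can be illustrated in plain Python on hypothetical toy 2-D embeddings, where `anchor` and `positive` are two photos of the same individual and the candidates are photos of other individuals:

```python
def dist2(a, b):
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge form of the triplet loss: the anchor should be closer to the
    positive (same individual) than to the negative by at least `margin`."""
    return max(dist2(anchor, positive) - dist2(anchor, negative) + margin, 0.0)

def semi_hard_negative(anchor, positive, candidates, margin=0.2):
    """Semi-hard mining: among candidate negatives, keep those farther from
    the anchor than the positive but still inside the margin
    (d_ap < d_an < d_ap + margin), and return the hardest of them.
    Returns None when no candidate is semi-hard."""
    d_ap = dist2(anchor, positive)
    semi_hard = [n for n in candidates
                 if d_ap < dist2(anchor, n) < d_ap + margin]
    if not semi_hard:
        return None
    return min(semi_hard, key=lambda n: dist2(anchor, n))
```

Semi-hard negatives yield a small positive loss (informative gradients) without the training instability that the very hardest negatives can cause; an easy negative like `[1.0, 0.0]` against anchor `[0.0, 0.0]` and positive `[0.1, 0.0]` already satisfies the margin, so its loss is zero.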
format |
Report |
author |
Moskvyak, Olga; Maire, Frederic; Armstrong, Asia O.; Dayoub, Feras; Baktashmotlagh, Mahsa |
author_facet |
Moskvyak, Olga; Maire, Frederic; Armstrong, Asia O.; Dayoub, Feras; Baktashmotlagh, Mahsa |
author_sort |
Moskvyak, Olga |
title |
Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
title_short |
Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
title_full |
Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
title_fullStr |
Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
title_full_unstemmed |
Robust Re-identification of Manta Rays from Natural Markings by Learning Pose Invariant Embeddings |
title_sort |
robust re-identification of manta rays from natural markings by learning pose invariant embeddings |
publisher |
arXiv |
publishDate |
2019 |
url |
https://dx.doi.org/10.48550/arxiv.1902.10847 https://arxiv.org/abs/1902.10847 |
genre |
Humpback Whale |
genre_facet |
Humpback Whale |
op_rights |
arXiv.org perpetual, non-exclusive license http://arxiv.org/licenses/nonexclusive-distrib/1.0/ |
op_doi |
https://doi.org/10.48550/arxiv.1902.10847 |
_version_ |
1766026392046141440 |