Deep neural networks for automated detection of marine mammal species
The authors thank the Bureau of Ocean Energy Management for the funding of MARU deployments, Excelerate Energy Inc. for the funding of Autobuoy deployment, and Michael J. Weise of the US Office of Naval Research for support (N000141712867). Deep neural networks have advanced the field of detection and classification and allowed for effective identification of signals in challenging data sets. Numerous time-critical conservation needs may benefit from these methods. We developed and empirically studied a variety of deep neural networks to detect the vocalizations of endangered North Atlantic right whales (Eubalaena glacialis). We compared the performance of these deep architectures to that of traditional detection algorithms for the primary vocalization produced by this species, the upcall. We show that deep-learning architectures are capable of producing false-positive rates that are orders of magnitude lower than alternative algorithms while substantially increasing the ability to detect calls. We demonstrate that a deep neural network trained with recordings from a single geographic region recorded over a span of days is capable of generalizing well to data from multiple years and across the species' range, and that the low false-positive rate makes the output of the algorithm amenable to quality control for verification. The deep neural networks we developed are relatively easy to implement with existing software, and may provide new insights applicable to the conservation of endangered species.
| Published in: | Scientific Reports, vol. 10, no. 1, article 607 |
|---|---|
| Main authors: | Shiu, Yu; Palmer, Kaitlin; Roch, Marie; Fleishman, Erica; Liu, Xiaobai; Nosal, Eva-Marie; Helble, Tyler; Cholewiak, Danielle; Gillespie, Douglas Michael; Klinck, Holger |
| Other authors: | University of St Andrews. School of Biology; Sea Mammal Research Unit; Scottish Oceans Institute; Sound Tags Group; Bioacoustics group; Marine Alliance for Science & Technology Scotland |
| Format: | Article in Journal/Newspaper |
| Language: | English |
| Published: | 2020 |
| Subjects: | QH301 Biology; 3rd-DAS; SDG 14 - Life Below Water |
| Online access: | http://hdl.handle.net/10023/19305 https://doi.org/10.1038/s41598-020-57549-y |
institution |
Open Polar |
collection |
University of St Andrews: Digital Research Repository |
topic |
QH301 Biology; 3rd-DAS; SDG 14 - Life Below Water |
description |
The authors thank the Bureau of Ocean Energy Management for the funding of MARU deployments, Excelerate Energy Inc. for the funding of Autobuoy deployment, and Michael J. Weise of the US Office of Naval Research for support (N000141712867). Deep neural networks have advanced the field of detection and classification and allowed for effective identification of signals in challenging data sets. Numerous time-critical conservation needs may benefit from these methods. We developed and empirically studied a variety of deep neural networks to detect the vocalizations of endangered North Atlantic right whales (Eubalaena glacialis). We compared the performance of these deep architectures to that of traditional detection algorithms for the primary vocalization produced by this species, the upcall. We show that deep-learning architectures are capable of producing false-positive rates that are orders of magnitude lower than alternative algorithms while substantially increasing the ability to detect calls. We demonstrate that a deep neural network trained with recordings from a single geographic region recorded over a span of days is capable of generalizing well to data from multiple years and across the species' range, and that the low false-positive rate makes the output of the algorithm amenable to quality control for verification. The deep neural networks we developed are relatively easy to implement with existing software, and may provide new insights applicable to the conservation of endangered species. Publisher PDF. Peer reviewed. |
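The description above notes that such detectors are relatively easy to implement with existing software. As a purely illustrative, hypothetical sketch (not the deep architecture published in the article), the basic pipeline of a spectrogram-based upcall detector — spectrogram, convolution, pooling, score, threshold — can be mocked up in NumPy; the kernel, bias, and synthetic data here are invented for the demo:

```python
# Hypothetical sketch of a spectrogram-based "upcall" detector pipeline:
# spectrogram -> convolution -> ReLU -> global max pooling -> sigmoid score.
# This toy single-kernel model is NOT the network from the article; the
# kernel, bias, and synthetic data below are invented for illustration only.
import numpy as np

def conv2d_valid(x, k):
    """Naive 2-D valid cross-correlation of spectrogram x with kernel k."""
    kh, kw = k.shape
    out = np.empty((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def upcall_score(spec, kernel, bias):
    """Score one spectrogram window in (0, 1); higher means more upcall-like."""
    fmap = np.maximum(conv2d_valid(spec, kernel), 0.0)  # ReLU feature map
    pooled = fmap.max()                                 # global max pooling
    return 1.0 / (1.0 + np.exp(-(pooled + bias)))       # sigmoid squash

# A template that responds to a tone rising from a lower to a higher frequency bin.
kernel = np.full((2, 6), -0.25)
kernel[0, :3] = 1.0   # early part of the call, lower frequency row
kernel[1, 3:] = 1.0   # later part of the call, higher frequency row
bias = -3.0           # shifts the operating threshold of the sigmoid

rng = np.random.default_rng(0)
noise = rng.random((16, 16)) * 0.1   # 16 frequency bins x 16 time frames of background
upcall = noise.copy()
for t in range(12):                  # inject a rising tone: frequency grows with time
    upcall[4 + t // 3, 2 + t] = 1.0

print(upcall_score(noise, kernel, bias) < 0.5)    # background stays below threshold
print(upcall_score(upcall, kernel, bias) > 0.5)   # synthetic upcall exceeds it
```

In a real system the kernel weights would be learned from labeled spectrogram clips rather than hand-set, and many such kernels would be stacked into deep layers; the point here is only the shape of the scoring pipeline.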
author2 |
University of St Andrews. School of Biology University of St Andrews. Sea Mammal Research Unit University of St Andrews. Scottish Oceans Institute University of St Andrews. Sound Tags Group University of St Andrews. Bioacoustics group University of St Andrews. Marine Alliance for Science & Technology Scotland |
author |
Shiu, Yu Palmer, Kaitlin Roch, Marie Fleishman, Erica Liu, Xiaobai Nosal, Eva-Marie Helble, Tyler Cholewiak, Danielle Gillespie, Douglas Michael Klinck, Holger |
genre |
Eubalaena glacialis; North Atlantic |
op_relation |
Shiu, Y., Palmer, K., Roch, M., Fleishman, E., Liu, X., Nosal, E.-M., Helble, T., Cholewiak, D., Gillespie, D. M. & Klinck, H. (2020). 'Deep neural networks for automated detection of marine mammal species', Scientific Reports, vol. 10, 607. https://doi.org/10.1038/s41598-020-57549-y. ISSN: 2045-2322. PURE: 265793823. PURE UUID: f425a62a-030f-4d10-8b4f-cf40476980e8. ORCID: /0000-0001-9628-157X/work/67525783. Scopus: 85078027785. WOS: 000548341700001. http://hdl.handle.net/10023/19305 |
op_rights |
Copyright © 2020 The Author(s). Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
container_title |
Scientific Reports |
container_volume |
10 |
container_issue |
1 |