id ftstandrewserep:oai:research-repository.st-andrews.ac.uk:10023/19834
record_format openpolar
spelling ftstandrewserep:oai:research-repository.st-andrews.ac.uk:10023/19834 2023-07-02T03:33:52+02:00 Learning deep models from synthetic data for extracting dolphin whistle contours Li, Pu Liu, Xiaobai Palmer, Kaitlin Fleishman, Erica Gillespie, Douglas Michael Nosal, Eva-Marie Shiu, Yu Klinck, Holger Cholewiak, Danielle Helble, Tyler Roch, Marie University of St Andrews. School of Biology University of St Andrews. Sea Mammal Research Unit University of St Andrews. Scottish Oceans Institute University of St Andrews. Sound Tags Group University of St Andrews. Bioacoustics group University of St Andrews. Marine Alliance for Science & Technology Scotland 2020-04-21T15:30:05Z 10 application/pdf http://hdl.handle.net/10023/19834 https://doi.org/10.1109/IJCNN48605.2020.9206992 eng eng IEEE Computer Society 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings Proceedings of the International Joint Conference on Neural Networks Li, P, Liu, X, Palmer, K, Fleishman, E, Gillespie, D M, Nosal, E-M, Shiu, Y, Klinck, H, Cholewiak, D, Helble, T & Roch, M 2020, Learning deep models from synthetic data for extracting dolphin whistle contours. in 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings., 9206992, Proceedings of the International Joint Conference on Neural Networks, IEEE Computer Society, IEEE World Congress on Computational Intelligence (IEEE WCCI) - 2020 International Joint Conference on Neural Networks (IJCNN 2020), Glasgow, United Kingdom, 19/07/20. https://doi.org/10.1109/IJCNN48605.2020.9206992 conference 9781728169262 PURE: 267536251 PURE UUID: bbb67a66-ee56-4858-90d4-201454b839fb Scopus: 85093866240 WOS: 000626021403027 ORCID: /0000-0001-9628-157X/work/115631178 http://hdl.handle.net/10023/19834 https://doi.org/10.1109/IJCNN48605.2020.9206992 Copyright © 2020 IEEE. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author-created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://ieeexplore.ieee.org Whistle contour extraction Deep neural network Data synthesis Acoustic Odontocetes QA75 Electronic computers. Computer science QH301 Biology Software Artificial Intelligence 3rd-DAS QA75 QH301 Conference item 2020 ftstandrewserep https://doi.org/10.1109/IJCNN48605.2020.9206992 2023-06-13T18:29:40Z We present a learning-based method for extracting whistles of toothed whales (Odontoceti) in hydrophone recordings. Our method represents audio signals as time-frequency spectrograms and decomposes each spectrogram into a set of time-frequency patches. A deep neural network learns archetypical patterns (e.g., crossings, frequency modulated sweeps) from the spectrogram patches and predicts time-frequency peaks that are associated with whistles. We also developed a comprehensive method to synthesize training samples from background environments and train the network with minimal human annotation effort. We applied the proposed learn-from-synthesis method to a subset of the public Detection, Classification, Localization, and Density Estimation (DCLDE) 2011 workshop data to extract whistle confidence maps, which we then processed with an existing contour extractor to produce whistle annotations. The F1-score of our best synthesis method was 0.158 greater than our baseline whistle extraction algorithm (~25% improvement) when applied to common dolphin (Delphinus spp.) and bottlenose dolphin (Tursiops truncatus) whistles. Postprint Conference Object toothed whales University of St Andrews: Digital Research Repository 2020 International Joint Conference on Neural Networks (IJCNN) 1 10
institution Open Polar
collection University of St Andrews: Digital Research Repository
op_collection_id ftstandrewserep
language English
topic Whistle contour extraction
Deep neural network
Data synthesis
Acoustic
Odontocetes
QA75 Electronic computers. Computer science
QH301 Biology
Software
Artificial Intelligence
3rd-DAS
QA75
QH301
spellingShingle Whistle contour extraction
Deep neural network
Data synthesis
Acoustic
Odontocetes
QA75 Electronic computers. Computer science
QH301 Biology
Software
Artificial Intelligence
3rd-DAS
QA75
QH301
Li, Pu
Liu, Xiaobai
Palmer, Kaitlin
Fleishman, Erica
Gillespie, Douglas Michael
Nosal, Eva-Marie
Shiu, Yu
Klinck, Holger
Cholewiak, Danielle
Helble, Tyler
Roch, Marie
Learning deep models from synthetic data for extracting dolphin whistle contours
topic_facet Whistle contour extraction
Deep neural network
Data synthesis
Acoustic
Odontocetes
QA75 Electronic computers. Computer science
QH301 Biology
Software
Artificial Intelligence
3rd-DAS
QA75
QH301
description We present a learning-based method for extracting whistles of toothed whales (Odontoceti) in hydrophone recordings. Our method represents audio signals as time-frequency spectrograms and decomposes each spectrogram into a set of time-frequency patches. A deep neural network learns archetypical patterns (e.g., crossings, frequency modulated sweeps) from the spectrogram patches and predicts time-frequency peaks that are associated with whistles. We also developed a comprehensive method to synthesize training samples from background environments and train the network with minimal human annotation effort. We applied the proposed learn-from-synthesis method to a subset of the public Detection, Classification, Localization, and Density Estimation (DCLDE) 2011 workshop data to extract whistle confidence maps, which we then processed with an existing contour extractor to produce whistle annotations. The F1-score of our best synthesis method was 0.158 greater than our baseline whistle extraction algorithm (~25% improvement) when applied to common dolphin (Delphinus spp.) and bottlenose dolphin (Tursiops truncatus) whistles. Postprint
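The pipeline in the description above (spectrogram → time-frequency patches → per-patch network predictions → reassembled whistle confidence map) can be sketched roughly as follows. This is a minimal illustration only: the patch size is an assumption, and a trivial per-column peak picker stands in for the deep network, which in the actual work predicts whistle probabilities per time-frequency bin.

```python
import numpy as np

def split_into_patches(spec, patch_h=64, patch_w=64):
    """Tile a (freq x time) spectrogram into non-overlapping patches.

    Patch dimensions here are illustrative; the paper's actual patch
    size and any overlap are not reproduced.
    """
    H, W = spec.shape
    patches = []
    for i in range(0, H - patch_h + 1, patch_h):
        for j in range(0, W - patch_w + 1, patch_w):
            patches.append(((i, j), spec[i:i + patch_h, j:j + patch_w]))
    return patches

def toy_confidence(patch):
    """Stand-in for the deep network: flag the loudest frequency bin
    in each time column as a candidate whistle peak."""
    conf = np.zeros_like(patch)
    conf[np.argmax(patch, axis=0), np.arange(patch.shape[1])] = 1.0
    return conf

def confidence_map(spec, patch_h=64, patch_w=64):
    """Run the per-patch predictor and reassemble a full-size
    confidence map, which a contour extractor could then trace."""
    out = np.zeros_like(spec)
    for (i, j), p in split_into_patches(spec, patch_h, patch_w):
        out[i:i + p.shape[0], j:j + p.shape[1]] = toy_confidence(p)
    return out

spec = np.random.rand(128, 256)  # placeholder spectrogram magnitudes
cmap = confidence_map(spec)
print(cmap.shape)                # -> (128, 256), same shape as the input
```

In the paper the confidence map is produced by the trained network and then handed to an existing contour extractor; only the patch-tiling and map-reassembly structure is sketched here.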
author2 University of St Andrews. School of Biology
University of St Andrews. Sea Mammal Research Unit
University of St Andrews. Scottish Oceans Institute
University of St Andrews. Sound Tags Group
University of St Andrews. Bioacoustics group
University of St Andrews. Marine Alliance for Science & Technology Scotland
format Conference Object
author Li, Pu
Liu, Xiaobai
Palmer, Kaitlin
Fleishman, Erica
Gillespie, Douglas Michael
Nosal, Eva-Marie
Shiu, Yu
Klinck, Holger
Cholewiak, Danielle
Helble, Tyler
Roch, Marie
author_facet Li, Pu
Liu, Xiaobai
Palmer, Kaitlin
Fleishman, Erica
Gillespie, Douglas Michael
Nosal, Eva-Marie
Shiu, Yu
Klinck, Holger
Cholewiak, Danielle
Helble, Tyler
Roch, Marie
author_sort Li, Pu
title Learning deep models from synthetic data for extracting dolphin whistle contours
title_short Learning deep models from synthetic data for extracting dolphin whistle contours
title_full Learning deep models from synthetic data for extracting dolphin whistle contours
title_fullStr Learning deep models from synthetic data for extracting dolphin whistle contours
title_full_unstemmed Learning deep models from synthetic data for extracting dolphin whistle contours
title_sort learning deep models from synthetic data for extracting dolphin whistle contours
publisher IEEE Computer Society
publishDate 2020
url http://hdl.handle.net/10023/19834
https://doi.org/10.1109/IJCNN48605.2020.9206992
genre toothed whales
genre_facet toothed whales
op_relation 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Proceedings of the International Joint Conference on Neural Networks
Li, P, Liu, X, Palmer, K, Fleishman, E, Gillespie, D M, Nosal, E-M, Shiu, Y, Klinck, H, Cholewiak, D, Helble, T & Roch, M 2020, Learning deep models from synthetic data for extracting dolphin whistle contours. in 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings., 9206992, Proceedings of the International Joint Conference on Neural Networks, IEEE Computer Society, IEEE World Congress on Computational Intelligence (IEEE WCCI) - 2020 International Joint Conference on Neural Networks (IJCNN 2020), Glasgow, United Kingdom, 19/07/20. https://doi.org/10.1109/IJCNN48605.2020.9206992
conference
9781728169262
PURE: 267536251
PURE UUID: bbb67a66-ee56-4858-90d4-201454b839fb
Scopus: 85093866240
WOS: 000626021403027
ORCID: /0000-0001-9628-157X/work/115631178
http://hdl.handle.net/10023/19834
https://doi.org/10.1109/IJCNN48605.2020.9206992
op_rights Copyright © 2020 IEEE. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author-created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://ieeexplore.ieee.org
op_doi https://doi.org/10.1109/IJCNN48605.2020.9206992
container_title 2020 International Joint Conference on Neural Networks (IJCNN)
container_start_page 1
op_container_end_page 10
_version_ 1770273990481281024