Acoustic features as a tool to visualize and explore marine soundscapes: Applications illustrated using marine mammal Passive Acoustic Monitoring datasets

Bibliographic Details
Main Authors: Cominelli, Simone, Bellin, Nicolò, Brown, Carissa D., Lawson, Jack
Format: Other/Unknown Material
Language: unknown
Published: Zenodo 2024
Subjects:
Online Access: https://doi.org/10.5061/dryad.3bk3j9kn8
Description
Summary: Passive Acoustic Monitoring (PAM) is emerging as a solution for monitoring species and environmental change over large spatial and temporal scales. However, drawing rigorous conclusions from acoustic recordings is challenging, as there is no consensus on which approaches and indices are best suited for characterizing marine and terrestrial acoustic environments. Here, we describe the application of multiple machine-learning techniques to the analysis of a large PAM dataset. We combine pre-trained acoustic classification models (VGGish, NOAA & Google Humpback Whale Detector), dimensionality reduction (UMAP), and balanced random forest algorithms to demonstrate how machine-learned acoustic features capture different aspects of the marine environment. The UMAP dimensions derived from VGGish acoustic features performed well at separating marine mammal vocalizations by species and location. RF models trained on the acoustic features performed well for labelled sounds in the 8 kHz range; low- and high-frequency sounds, however, could not be classified with this approach. The workflow presented here shows how acoustic feature extraction, visualization, and analysis can establish a link between ecologically relevant information and PAM recordings at multiple scales. The datasets and scripts provided in this repository allow the results presented in the publication to be replicated.

Funding provided by: Memorial University of Newfoundland
Crossref Funder Registry ID: https://ror.org/04haebc03
Award Number:

Funding provided by: Fisheries and Oceans Canada
Crossref Funder Registry ID: https://ror.org/02qa1x782
Award Number:

Funding provided by: University of Parma
Crossref Funder Registry ID: https://ror.org/02k7wn190
Award Number:

Data acquisition and preparation
We collected all records available on the Watkins Marine Mammal Database website listed under the "all cuts" page. For each audio file in the WMD, the associated metadata included a label for the sound sources present in ...
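The feature-extraction and visualization steps described above (VGGish embeddings followed by a UMAP projection) can be sketched roughly as follows. This is a minimal illustration, not the repository's scripts: the directory name, the mean-pooling of frame-level embeddings into one vector per clip, and the UMAP settings are assumptions made for the example.

```python
# Illustrative sketch only: VGGish feature extraction + 2-D UMAP projection.
import glob
import numpy as np
import librosa               # audio loading / resampling
import tensorflow_hub as hub
import umap                  # from the umap-learn package

# Pre-trained VGGish: expects a mono float32 waveform sampled at 16 kHz and
# returns one 128-dimensional embedding per 0.96 s frame.
vggish = hub.load("https://tfhub.dev/google/vggish/1")

def clip_embedding(path):
    """One 128-d feature vector per clip (frame embeddings mean-pooled)."""
    waveform, _ = librosa.load(path, sr=16000, mono=True)
    frames = vggish(waveform).numpy()        # shape: (n_frames, 128)
    return frames.mean(axis=0)

# Hypothetical directory holding the prepared WMD "all cuts" recordings.
clip_paths = sorted(glob.glob("wmd_cuts/*.wav"))
features = np.vstack([clip_embedding(p) for p in clip_paths])   # (n_clips, 128)

# Non-linear projection of the acoustic features to two dimensions,
# e.g. for a scatter plot coloured by species or recording location.
reducer = umap.UMAP(n_components=2, random_state=42)
embedding_2d = reducer.fit_transform(features)                  # (n_clips, 2)
```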
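The balanced random forest step can be illustrated in the same spirit. The sketch below trains imbalanced-learn's BalancedRandomForestClassifier on placeholder features and labels; the synthetic data, the train/test split, and the hyperparameters are assumptions standing in for the labelled acoustic features and the settings used in the publication.

```python
# Illustrative sketch only: balanced random forest on clip-level features.
import numpy as np
from imblearn.ensemble import BalancedRandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Placeholder feature matrix and labels; in practice these would be the
# 128-d acoustic features and the sound-source labels from the WMD metadata.
rng = np.random.default_rng(42)
features = rng.normal(size=(300, 128))
labels = rng.choice(["low-frequency call", "whistle", "ambient noise"], size=300)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, stratify=labels, random_state=42)

# Each tree is grown on a bootstrap sample that is randomly under-sampled to
# balance the classes, which mitigates the class imbalance typical of PAM
# annotation sets.
clf = BalancedRandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```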