Variance-Reduced Stochastic Optimization for Efficient Inference of Hidden Markov Models

Hidden Markov models (HMMs) are popular models for identifying a finite number of latent states from sequential data. However, fitting them to large data sets can be computationally demanding because most likelihood maximization techniques require iterating through the entire underlying data set for every parameter update. We propose a novel optimization algorithm that updates the parameters of an HMM without iterating through the entire data set. Namely, we combine a partial E step with variance-reduced stochastic optimization within the M step. We prove the algorithm converges under certain regularity conditions. We test our algorithm empirically using a simulation study as well as a case study of kinematic data collected using suction-cup-attached biologgers from eight northern resident killer whales (Orcinus orca) off the western coast of Canada. In both, our algorithm converges in fewer epochs and to regions of higher likelihood compared to standard numerical optimization techniques. Our algorithm allows ...
23 pages, 7 figures. Code available at https://github.com/evsi8432/sublinear-HMM-inference
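To make the core idea in the abstract concrete, below is a minimal, illustrative Python/NumPy sketch of an SVRG-style, variance-reduced stochastic M step for the emission means of a two-state Gaussian-emission HMM. It is not the authors' implementation (see the GitHub repository linked above for that): the transition matrix and emission variances are held fixed, the partial E step described in the abstract is replaced by a full E step at each snapshot, and all names, step sizes, and batch sizes are illustrative assumptions.

# Illustrative sketch only: variance-reduced (SVRG-style) stochastic M step
# for the emission means of a Gaussian-emission HMM. Transition probabilities
# and emission variances are held fixed to keep the example short.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a 2-state Gaussian HMM with unit emission variance.
K, T = 2, 5000
true_means = np.array([-2.0, 2.0])
A = np.array([[0.95, 0.05], [0.05, 0.95]])            # fixed transition matrix
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(K, p=A[states[t - 1]])
y = true_means[states] + rng.normal(size=T)

def forward_backward(y, mu, A, pi):
    """Scaled forward-backward pass; returns the state posteriors gamma."""
    T = len(y)
    logB = -0.5 * (y[:, None] - mu[None, :]) ** 2      # log N(y_t | mu_k, 1), up to a constant
    B = np.exp(logB - logB.max(axis=1, keepdims=True))
    alpha = np.zeros((T, K)); beta = np.ones((T, K))
    alpha[0] = pi * B[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def grad_t(mu, t, gamma):
    """Per-observation gradient of the M-step objective w.r.t. the means."""
    return gamma[t] * (y[t] - mu)                      # d/dmu of gamma_t(k) log N(y_t | mu_k, 1)

# EM with a variance-reduced stochastic M step.
mu = np.array([-0.5, 0.5])                             # initial emission means
pi = np.full(K, 1.0 / K)
step, batch, inner = 0.05, 32, 200
for outer in range(20):
    gamma = forward_backward(y, mu, A, pi)             # full E step at the snapshot
    mu_snap = mu.copy()
    full_grad = np.mean([grad_t(mu_snap, t, gamma) for t in range(T)], axis=0)
    for _ in range(inner):                             # variance-reduced M step
        idx = rng.integers(T, size=batch)
        g_cur = np.mean([grad_t(mu, t, gamma) for t in idx], axis=0)
        g_snap = np.mean([grad_t(mu_snap, t, gamma) for t in idx], axis=0)
        mu += step * (g_cur - g_snap + full_grad)      # SVRG-corrected ascent step
print("estimated emission means:", np.sort(mu))        # should approach [-2, 2]

The SVRG correction (g_cur - g_snap + full_grad) keeps each minibatch update unbiased while shrinking its variance as mu approaches the snapshot, which is what lets the inner loop make many cheap parameter updates per pass over the data.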


Bibliographic Details
Main Authors: Sidrow, Evan, Heckman, Nancy, Bouchard-Côté, Alexandre, Fortune, Sarah M. E., Trites, Andrew W., Auger-Méthé, Marie
Format: Text
Language: unknown
Published: arXiv 2023
Subjects: Computation (stat.CO); FOS: Computer and information sciences
Online Access: https://dx.doi.org/10.48550/arxiv.2310.04620
https://arxiv.org/abs/2310.04620
institution Open Polar
collection DataCite
geographic Canada
genre Orca
Orcinus orca
op_relation https://dx.doi.org/10.1080/10618600.2024.2350476
op_rights Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
cc-by-4.0
op_doi https://doi.org/10.48550/arxiv.2310.04620
https://doi.org/10.1080/10618600.2024.2350476