VPRTempo: A Fast Temporally Encoded Spiking Neural Network for Visual Place Recognition

Spiking Neural Networks (SNNs) are at the forefront of neuromorphic computing thanks to their potential energy efficiency, low latencies, and capacity for continual learning. While these capabilities are well suited for robotics tasks, SNNs have seen limited adoption in this field thus far. This work introduces an SNN for Visual Place Recognition (VPR) that is both trainable within minutes and queryable in milliseconds, making it well suited for deployment on compute-constrained robotic systems. Our proposed system, VPRTempo, overcomes slow training and inference times using an abstracted SNN that trades biological realism for efficiency. VPRTempo employs a temporal code that determines the timing of a single spike based on a pixel's intensity, as opposed to prior SNNs that rely on rate coding to determine the number of spikes, improving spike efficiency by over 100%. VPRTempo is trained using Spike-Timing Dependent Plasticity and a supervised delta learning rule that enforces that each output spiking neuron responds to just a single place. We evaluate our system on the Nordland and Oxford RobotCar benchmark localization datasets, which include up to 27k places. We found that VPRTempo's accuracy is comparable to prior SNNs and the popular NetVLAD place recognition algorithm, while being several orders of magnitude faster and suitable for real-time deployment, with inference speeds over 50 Hz on CPU. VPRTempo could be integrated as a loop-closure component for online SLAM on resource-constrained systems such as space and underwater robots.


Bibliographic Details
Published in: 2024 IEEE International Conference on Robotics and Automation (ICRA)
Main Authors: Hines, Adam D., Stratton, Peter G., Milford, Michael, Fischer, Tobias
Format: Book Part
Language: unknown
Published: Institute of Electrical and Electronics Engineers Inc., 2024
Subjects:
Online Access: https://eprints.qut.edu.au/251879/
id ftqueensland:oai:eprints.qut.edu.au:251879
record_format openpolar
institution Open Polar
collection Queensland University of Technology: QUT ePrints
op_collection_id ftqueensland
language unknown
description Spiking Neural Networks (SNNs) are at the forefront of neuromorphic computing thanks to their potential energy efficiency, low latencies, and capacity for continual learning. While these capabilities are well suited for robotics tasks, SNNs have seen limited adoption in this field thus far. This work introduces an SNN for Visual Place Recognition (VPR) that is both trainable within minutes and queryable in milliseconds, making it well suited for deployment on compute-constrained robotic systems. Our proposed system, VPRTempo, overcomes slow training and inference times using an abstracted SNN that trades biological realism for efficiency. VPRTempo employs a temporal code that determines the timing of a single spike based on a pixel's intensity, as opposed to prior SNNs that rely on rate coding to determine the number of spikes, improving spike efficiency by over 100%. VPRTempo is trained using Spike-Timing Dependent Plasticity and a supervised delta learning rule that enforces that each output spiking neuron responds to just a single place. We evaluate our system on the Nordland and Oxford RobotCar benchmark localization datasets, which include up to 27k places. We found that VPRTempo's accuracy is comparable to prior SNNs and the popular NetVLAD place recognition algorithm, while being several orders of magnitude faster and suitable for real-time deployment, with inference speeds over 50 Hz on CPU. VPRTempo could be integrated as a loop-closure component for online SLAM on resource-constrained systems such as space and underwater robots.
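The abstract's key idea, a temporal code in which a pixel's intensity sets the timing of a single spike rather than a spike count, can be illustrated with a minimal sketch. This is not VPRTempo's actual implementation; the function name and the 100 ms encoding window are illustrative assumptions.

```python
import numpy as np

def intensity_to_spike_time(pixels: np.ndarray, window_ms: float = 100.0) -> np.ndarray:
    """Map pixel intensities in [0, 255] to single-spike latencies in [0, window_ms].

    Each input neuron fires exactly once: brighter pixels spike earlier.
    This contrasts with rate coding, where intensity sets the *number* of
    spikes emitted over the window, costing many more spikes per input.
    (Illustrative sketch only; parameters are assumptions, not VPRTempo's API.)
    """
    norm = np.clip(pixels, 0, 255) / 255.0  # normalize intensity to [0, 1]
    return (1.0 - norm) * window_ms          # intensity 255 -> 0 ms, 0 -> window_ms

# Example: a dark, mid-grey, and bright pixel
times = intensity_to_spike_time(np.array([0, 128, 255]))
```

Because every neuron emits at most one spike per image regardless of intensity, the total spike count is bounded by the number of input pixels, which is the intuition behind the reported spike-efficiency gain over rate coding.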
format Book Part
author Hines, Adam D.
Stratton, Peter G.
Milford, Michael
Fischer, Tobias
title VPRTempo: A Fast Temporally Encoded Spiking Neural Network for Visual Place Recognition
publisher Institute of Electrical and Electronics Engineers Inc.
publishDate 2024
url https://eprints.qut.edu.au/251879/
genre Nordland
genre_facet Nordland
op_source Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA)
op_relation https://eprints.qut.edu.au/251879/1/2309.10225v2.pdf
doi:10.1109/ICRA57147.2024.10610918
Hines, Adam D., Stratton, Peter G., Milford, Michael, & Fischer, Tobias (2024) VPRTempo: A Fast Temporally Encoded Spiking Neural Network for Visual Place Recognition. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA). Institute of Electrical and Electronics Engineers Inc., United States of America, pp. 10200-10207.
http://purl.org/au-research/grants/arc/FL210100156
https://eprints.qut.edu.au/251879/
Centre for Robotics; Faculty of Engineering; School of Electrical Engineering & Robotics
op_rights free_to_read
http://creativecommons.org/licenses/by-nc/4.0/
2024 IEEE
2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. It is a condition of access that users recognise and abide by the legal requirements associated with these rights. If you believe that this work infringes copyright please provide details by email to qut.copyright@qut.edu.au
op_doi https://doi.org/10.1109/ICRA57147.2024.10610918
container_title 2024 IEEE International Conference on Robotics and Automation (ICRA)
container_start_page 10200
op_container_end_page 10207
_version_ 1811641557272494080