Orca: Scalable Temporal Graph Neural Network Training with Theoretical Guarantees

Representation learning over dynamic graphs is critical for many real-world applications such as social network services and recommender systems. Temporal graph neural networks (T-GNNs) are powerful representation learning methods and have achieved remarkable effectiveness on continuous-time dynamic graphs. However, T-GNNs still suffer from high time complexity, which increases linearly with the number of timestamps and grows exponentially with the model depth, making them unable to scale to large dynamic graphs. To address these limitations, we propose Orca, a novel framework that accelerates T-GNN training by non-trivially caching and reusing intermediate embeddings. We design an optimal cache replacement algorithm, named MRU, under a practical cache limit. MRU not only improves the efficiency of training T-GNNs by maximizing the number of cache hits but also reduces the approximation errors by avoiding keeping and reusing extremely stale embeddings. Meanwhile, we develop a thorough theoretical analysis of the approximation error introduced by our reuse schemes and offer rigorous convergence guarantees. Extensive experiments have validated that Orca obtains two orders of magnitude speedup over state-of-the-art baselines while achieving higher precision on large dynamic graphs.
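The abstract's core idea is a fixed-capacity cache of intermediate embeddings: reuse an entry on a hit to skip recomputation, but refuse to reuse entries that have grown too stale. The sketch below illustrates only that general idea, not the paper's actual MRU algorithm; the class name `EmbeddingCache`, the `max_staleness` bound, and the evict-the-stalest policy are illustrative assumptions.

```python
class EmbeddingCache:
    """Illustrative fixed-capacity cache of intermediate node embeddings.

    Captures the behavior described in the abstract: cache hits avoid
    recomputation, and overly stale embeddings are never reused.
    This is a sketch, not the Orca paper's MRU algorithm.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}  # node_id -> (timestamp, embedding)

    def get(self, node_id, now, max_staleness):
        entry = self.store.get(node_id)
        if entry is None:
            return None  # cache miss: caller recomputes the embedding
        ts, emb = entry
        if now - ts > max_staleness:
            del self.store[node_id]  # too stale to reuse safely
            return None
        return emb  # cache hit: reuse, skipping recomputation

    def put(self, node_id, now, emb):
        if node_id not in self.store and len(self.store) >= self.capacity:
            # Evict the stalest entry (illustrative replacement policy).
            stalest = min(self.store, key=lambda k: self.store[k][0])
            del self.store[stalest]
        self.store[node_id] = (now, emb)
```

In a training loop, a miss (or a stale hit) would trigger a full forward pass over the node's temporal neighborhood, after which the fresh embedding is written back with `put`.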

Bibliographic Details
Published in: Proceedings of the ACM on Management of Data
Main Authors: Li, Yiming, Shen, Yanyan, Chen, Lei, Yuan, Mingxuan
Funding: Hong Kong ITC ITF, Hong Kong RGC AOE Project, Hong Kong RGC GRF Project, National Key Research and Development Program of China, Shanghai Municipal Science and Technology Major Project, National Science Foundation of China, Hong Kong RGC CRF Project, Guangdong Basic and Applied Basic Research Foundation, SJTU Global Strategic Partnership Fund, Hong Kong RGC Theme-based project, China NSFC, Microsoft Research Asia Collaborative Research Grant, HKUST-Webank joint research lab grant, HKUST Global Strategic Partnership Fund
Format: Article in Journal/Newspaper
Language: English
Published: Association for Computing Machinery (ACM) 2023
Online Access:http://dx.doi.org/10.1145/3588737
https://dl.acm.org/doi/pdf/10.1145/3588737
Source: Proceedings of the ACM on Management of Data, Volume 1, Issue 1, pp. 1–27
ISSN: 2836-6573
DOI: https://doi.org/10.1145/3588737