Mu2ReST: Multi-resolution Recursive Spatio-Temporal Transformer for Long-Term Prediction

Abstract

Long-term spatio-temporal prediction (LTSTP) across multiple resolutions plays a crucial role in planning and dispatching for smart city applications such as smart transportation and the smart grid. The Transformer, which has demonstrated superiority in capturing long-term dependencies, has recently been studied for spatio-temporal prediction. However, it is difficult to leverage both multi-resolution knowledge and spatio-temporal dependencies within a Transformer to aid LTSTP. The challenge typically lies in two issues: (1) efficiently fusing information across multiple resolutions usually demands elaborate and complicated modifications to the model, and (2) the long input sequences required for long-term prediction make concurrent space and time attention too costly to perform. To address these issues, we propose the Multi-resolution Recursive Spatio-Temporal Transformer (Mu2ReST). It implements a novel multi-resolution structure that predicts recursively from coarser to finer resolutions, showing that arduous modifications to the model are not the only way to leverage multi-resolution knowledge. It further uses a redesigned lightweight space-time attention to capture spatial and temporal dependencies concurrently. Experimental results on open and commercial urban datasets demonstrate that Mu2ReST outperforms existing methods on multi-resolution LTSTP tasks.
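
The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, under our own assumptions, of the two ideas the abstract names: a factorized space-then-time attention block (one plausible way to keep concurrent space-time attention lightweight) and a coarse-to-fine recursive forecasting loop. All class, function, and parameter names (FactorizedSpaceTimeAttention, coarse_to_fine_forecast, upsamplers, and so on) are hypothetical.

```python
# Hedged sketch, not the paper's code: factorized space/time attention and a
# coarse-to-fine recursive forecasting loop, loosely following the abstract.
import torch
import torch.nn as nn


class FactorizedSpaceTimeAttention(nn.Module):
    """Attend over space and time separately instead of over the joint
    (time x space) token sequence, which keeps each attention call small."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.spatial_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.temporal_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, space, dim)
        b, t, s, d = x.shape
        # Spatial attention: within each time step, locations attend to each other.
        xs = x.reshape(b * t, s, d)
        xs, _ = self.spatial_attn(xs, xs, xs)
        x = xs.reshape(b, t, s, d)
        # Temporal attention: within each location, time steps attend to each other.
        xt = x.permute(0, 2, 1, 3).reshape(b * s, t, d)
        xt, _ = self.temporal_attn(xt, xt, xt)
        return xt.reshape(b, s, t, d).permute(0, 2, 1, 3)


def coarse_to_fine_forecast(histories, predictors, upsamplers):
    """Recursive prediction from the coarsest to the finest resolution.

    histories, predictors: ordered from coarsest to finest resolution.
    upsamplers[i]: maps the forecast at resolution i to resolution i + 1.
    """
    pred = predictors[0](histories[0])
    for hist, predictor, up in zip(histories[1:], predictors[1:], upsamplers):
        # Condition the finer-resolution predictor on the coarser forecast.
        pred = predictor(torch.cat([hist, up(pred)], dim=-1))
    return pred


if __name__ == "__main__":
    block = FactorizedSpaceTimeAttention(dim=32)
    x = torch.randn(2, 12, 50, 32)  # (batch, time steps, locations, features)
    print(block(x).shape)           # torch.Size([2, 12, 50, 32])
```

The space-then-time factorization replaces one attention over all time-space tokens with two much smaller attentions, which is what makes capturing both dependencies concurrently affordable for long input sequences.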

Publication
Advances in Knowledge Discovery and Data Mining: 26th Pacific-Asia Conference, PAKDD 2022, Chengdu, China, May 16–19, 2022, Proceedings, Part I
Defu Cao
PhD Candidate
Yan Liu
Professor, Computer Science Department