DAM: Towards a foundation model for time series forecasting

Luke Darlow, Qiwen Deng, Ahmed Hassan, Martin Asenov, Rajkarn Singh, Artjom Joosen, Adam Barker, Amos Storkey

Research output: Contribution to conference › Poster › peer-review

Abstract / Description of output

It is challenging to scale time series forecasting models such that they forecast accurately for multiple distinct domains and datasets, all with potentially different underlying collection procedures (e.g., sample resolution), patterns (e.g., periodicity), and prediction requirements (e.g., reconstruction vs. forecasting). We call this general task universal forecasting. Existing methods usually assume that input data is regularly sampled, and they forecast to pre-determined horizons, resulting in failure to generalise outside the scope of their training. We propose the DAM, a neural model that takes randomly sampled histories and outputs an adjustable basis composition as a continuous function of time for forecasting to non-fixed horizons. It involves three key components: (1) a flexible approach for using randomly sampled histories from a long-tail distribution, which enables an efficient global perspective of the underlying temporal dynamics while retaining focus on the recent history; (2) a transformer backbone that is trained on these actively sampled histories to produce, as representational output, (3) the basis coefficients of a continuous function of time. We show that a single univariate DAM, trained on 25 time series datasets, either outperformed or closely matched existing SoTA models at multivariate long-term forecasting across 18 datasets, including 8 held out for zero-shot transfer, even though these models were trained to specialise for each dataset-horizon combination. This single DAM excels at zero-shot transfer and very-long-term forecasting, performs well at imputation, is interpretable via basis function composition and attention, can be tuned for different inference-cost requirements, and is robust to missing and irregularly sampled data by design.
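
To make the abstract's mechanism concrete, the following is a minimal sketch (not the authors' code) of its two central ideas: sampling history points from a long-tail distribution so the recent past dominates while distant context is still visible, and forecasting by evaluating a basis composition as a continuous function of time, so horizons need not be fixed in advance. The sinusoidal-plus-affine basis, the Pareto-shaped sampling, and all names (`sample_history_times`, `forecast`) are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of two ideas from the abstract (assumptions labelled):
# (1) long-tail sampling of history times, dense near the present and sparse
#     far back, and
# (2) a forecast expressed as a basis composition evaluated at arbitrary
#     continuous times, so any horizon (or past time, for imputation) works.
import numpy as np

def sample_history_times(now, n_points, scale=100.0, rng=None):
    """Draw history time points with long-tail offsets from `now`.
    Pareto-shaped tail (an assumption): many small offsets, few large ones."""
    rng = rng or np.random.default_rng()
    offsets = scale * (rng.pareto(a=1.5, size=n_points) + 1e-3)
    return now - offsets  # absolute times of the sampled history points

def forecast(t, coeffs, freqs):
    """Evaluate an affine + sinusoidal basis composition at continuous times.
    In the DAM, a transformer would output these coefficients; here they are
    fixed by hand for illustration."""
    b0, b1, w_sin, w_cos = coeffs
    t = np.asarray(t, dtype=float)
    out = b0 + b1 * t
    for f, ws, wc in zip(freqs, w_sin, w_cos):
        out += ws * np.sin(2 * np.pi * f * t) + wc * np.cos(2 * np.pi * f * t)
    return out

# Usage: the same coefficients serve any horizon, chosen freely at inference.
freqs = [1 / 24, 1 / (24 * 7)]               # e.g. daily and weekly periods
coeffs = (0.5, 0.001, [1.0, 0.3], [0.2, -0.1])
history_t = sample_history_times(now=1000.0, n_points=256)
y_hat = forecast(np.arange(1000, 1048), coeffs, freqs)  # a 48-step horizon
```

Because the output is a function of time rather than a fixed-length vector, the same model can also be queried at past time points, which is why imputation and irregularly sampled inputs come for free in this framing.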
Original language: English
Publication status: Published - 9 May 2024
Event: The Twelfth International Conference on Learning Representations - Vienna, Austria
Duration: 7 May 2024 - 11 May 2024
https://iclr.cc/

Conference

Conference: The Twelfth International Conference on Learning Representations
Abbreviated title: ICLR 2024
Country/Territory: Austria
City: Vienna
Period: 7/05/24 - 11/05/24
Internet address: https://iclr.cc/
