TEMPO: PROMPT-BASED GENERATIVE PRE-TRAINED TRANSFORMER FOR TIME SERIES FORECASTING
TEMPO is a novel framework that uses a prompt-based generative pre-trained transformer for time series forecasting. It learns time series representations by decomposing the data into trend, seasonal, and residual components and by introducing a prompt design that adapts the model to shifts in data distribution. TEMPO demonstrates superior zero-shot performance and handles multimodal inputs, highlighting its potential as a foundational model for diverse temporal phenomena.
Article Points:
1. TEMPO is a prompt-based GPT for time series forecasting.
2. It decomposes time series into trend, seasonal, and residual components (see the decomposition sketch after this list).
3. Prompt design facilitates distribution adaptation across diverse time series.
4. Achieves superior zero-shot performance on benchmark datasets.
5. Effectively leverages multimodal inputs, including textual information.
6. Paves the way for foundational models in time series forecasting.
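As a concrete illustration of the decomposition in point 2, the snippet below uses statsmodels' STL as a stand-in for the trend/seasonal/residual split; the synthetic weekly series and its parameters are illustrative assumptions, not TEMPO's actual preprocessing code.

```python
# Hedged sketch: additive STL decomposition of a synthetic series into the
# trend, seasonal, and residual components that TEMPO models separately.
import numpy as np
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
t = np.arange(365)
series = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 0.3, t.size)

result = STL(series, period=7).fit()
trend, seasonal, residual = result.trend, result.seasonal, result.resid

# The three components sum back to the original series; each one is later
# normalized, patched, and embedded as its own token stream.
assert np.allclose(series, trend + seasonal + residual)
```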
Core Idea
- Prompt-based GPT
- Generative Pre-trained Transformer

Key Inductive Biases
- Decomposition (Trend, Seasonal, Residual)
- Prompt Design (Distribution Adaptation); see the prompt-pool sketch below

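A rough picture of what prompt-based distribution adaptation can look like: a small pool of learnable soft prompts whose keys are matched against a summary of the input, with the best-matching prompts prepended to the token sequence. The pool size, prompt length, top-k retrieval, and mean-pooled query below are illustrative assumptions, not the paper's exact mechanism.

```python
# Hedged sketch of a soft-prompt pool for distribution adaptation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PromptPool(nn.Module):
    def __init__(self, pool_size=16, prompt_len=4, d_model=768, top_k=3):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(pool_size, d_model))
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, d_model))
        self.top_k = top_k

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        query = x.mean(dim=1)                    # summarize the input series
        sim = F.cosine_similarity(query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)
        idx = sim.topk(self.top_k, dim=-1).indices        # (batch, top_k)
        chosen = self.prompts[idx]               # (batch, top_k, prompt_len, d_model)
        chosen = chosen.flatten(1, 2)            # (batch, top_k * prompt_len, d_model)
        return torch.cat([chosen, x], dim=1)     # prepend prompts to the tokens

pool = PromptPool()
tokens = torch.randn(2, 24, 768)                 # e.g. 24 patch embeddings
print(pool(tokens).shape)                        # torch.Size([2, 36, 768])
```

Because only the prompt vectors and their keys are trained, inputs from different regimes can route to different prompts without modifying the backbone.
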
Architecture
- GPT-2 Backbone
- STL Decomposition
- Soft Prompting & LoRA; see the pipeline sketch at the end of this outline

Performance
- Superior Zero-Shot Forecasting
- Multimodal Input Handling
- State-of-the-art Results

Future Potential
- Foundational Model Framework
- Enhanced Numerical Reasoning
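
The Architecture items above can be tied together in a minimal end-to-end sketch: patch embeddings for one decomposed component, a learnable soft prompt prepended to the sequence, and a frozen GPT-2 backbone adapted with LoRA via the peft library. The patch length, prompt length, forecasting head, and LoRA hyperparameters are assumptions chosen for illustration, not TEMPO's published configuration.

```python
# Hedged sketch of a soft prompt + LoRA pipeline over a frozen GPT-2 backbone.
import torch
import torch.nn as nn
from transformers import GPT2Model
from peft import LoraConfig, get_peft_model

class TinyTempoSketch(nn.Module):
    def __init__(self, patch_len=16, n_prompts=4, horizon=24, d_model=768):
        super().__init__()
        backbone = GPT2Model.from_pretrained("gpt2")   # pretrained backbone, d_model = 768
        for p in backbone.parameters():                # freeze all backbone weights
            p.requires_grad = False
        lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.0,
                              target_modules=["c_attn"], fan_in_fan_out=True)
        self.backbone = get_peft_model(backbone, lora_cfg)   # only LoRA matrices train
        self.patch_embed = nn.Linear(patch_len, d_model)     # one patch -> one token
        self.soft_prompt = nn.Parameter(torch.randn(n_prompts, d_model) * 0.02)
        self.head = nn.Linear(d_model, horizon)              # forecast from last token
        self.patch_len = patch_len

    def forward(self, component):        # component: (batch, lookback), e.g. the trend
        b = component.size(0)
        patches = component.unfold(1, self.patch_len, self.patch_len)  # (b, n_patches, patch_len)
        tokens = self.patch_embed(patches)                             # (b, n_patches, d_model)
        prompt = self.soft_prompt.unsqueeze(0).expand(b, -1, -1)
        inputs = torch.cat([prompt, tokens], dim=1)
        hidden = self.backbone(inputs_embeds=inputs).last_hidden_state
        return self.head(hidden[:, -1])                                # (b, horizon)

model = TinyTempoSketch()
trend_window = torch.randn(2, 96)        # 96-step lookback for one component
print(model(trend_window).shape)         # torch.Size([2, 24])
```

Only the patch embedding, soft prompt, forecasting head, and LoRA matrices receive gradients, which is what keeps adaptation cheap relative to fully fine-tuning the backbone.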