Showing content from https://huggingface.co/papers/2211.14730 below:


A Time Series is Worth 64 Words: Long-term Forecasting with Transformers

Published on Nov 27, 2022

Abstract

PatchTST, a channel-independent patch-based Transformer model, enhances multivariate time series forecasting and self-supervised learning, showing superior performance and long-term forecasting accuracy.

AI-generated summary

We propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning. It is based on two key components: (i) segmentation of time series into subseries-level patches, which serve as input tokens to the Transformer; (ii) channel-independence, where each channel contains a single univariate time series and all series share the same embedding and Transformer weights. The patching design has a natural three-fold benefit: local semantic information is retained in the embedding; computation and memory usage of the attention maps are quadratically reduced for the same look-back window; and the model can attend to a longer history. Our channel-independent patch time series Transformer (PatchTST) significantly improves long-term forecasting accuracy compared with SOTA Transformer-based models. We also apply our model to self-supervised pre-training tasks and attain excellent fine-tuning performance, which outperforms supervised training on large datasets. Transferring the masked pre-trained representation from one dataset to others also produces SOTA forecasting accuracy. Code is available at: https://github.com/yuqinie98/PatchTST.
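The patching step described in (i) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a strided-window segmentation with a hypothetical `patchify` helper, using the patch length 16 and stride 8 reported in the paper.

```python
import numpy as np

def patchify(series, patch_len=16, stride=8):
    """Segment a univariate series into overlapping subseries-level patches.

    Each patch becomes one input token for the Transformer, so a look-back
    window of length L yields about (L - patch_len) / stride + 1 tokens
    instead of L, shrinking the attention map quadratically.
    """
    num_patches = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(num_patches)])

# A look-back window of 512 steps with patch_len=16 and stride=8
# yields 63 patch tokens; with end padding this becomes the 64
# "words" of the paper's title.
window = np.arange(512, dtype=float)
tokens = patchify(window)
print(tokens.shape)  # (63, 16)
```

Under channel-independence, each of the M channels of a multivariate series would be passed through this same patching and the shared Transformer weights separately.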

Community


Models citing this paper (3)

- ibm-granite/granite-timeseries-patchtst — Time Series Forecasting · Updated Aug 1, 2024 · 1.58k · 13
- ibm-research/patchtst-etth1-pretrain — Time Series Forecasting · Updated Nov 14, 2024 · 1.14k · 2
- chungimungi/PatchTST-2-input-channels — Updated Apr 16, 2024 · 2

Datasets citing this paper (0)

No dataset links this paper. Cite arxiv.org/abs/2211.14730 in a dataset README.md to link it from this page.

Spaces citing this paper (1)

Collections including this paper (2)
