Starred repositories
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
The Electricity Transformer dataset was collected to support further investigation of the long-sequence forecasting problem.
MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing
Official implementation of the ICML 2024 paper "Irregular Multivariate Time Series Forecasting: A Transformable Patching Graph Neural Networks Approach"
[IEEE IoT-J 2026] The official repository of the SegRNN paper: "Segment Recurrent Neural Network for Long-Term Time Series Forecasting." This work is developed by the Lab of Professor Weiwei Lin (l…
[AAAI 2025] Official Implementation of "HDT: Hierarchical Discrete Transformer for Multivariate Time Series Forecasting"
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
This repository contains a reading list of papers on Time Series Forecasting/Prediction (TSF) and Spatio-Temporal Forecasting/Prediction (STF). These papers are mainly categorized according to the …
Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415
N-BEATS is a neural-network-based model for univariate time series forecasting. N-BEATS is a ServiceNow Research project that was started at Element AI.
Code release for "TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis" (ICLR 2023), https://openreview.net/pdf?id=ju_Uqw384Oq
Code for the paper "Block Hankel Tensor ARIMA for Multiple Short Time Series Forecasting" (AAAI-20)
An attempt to use XGBoost for stock price prediction
Python code for the book Deep Learning (the "flower book"): mathematical derivations, principle analysis, and source-level implementations
Time series forecasting for individual household power prediction: ARIMA, XGBoost, RNN
An extension of XGBoost to probabilistic modelling
An attention-based CNN-LSTM and XGBoost hybrid model for stock prediction
Probabilistic prediction with XGBoost.
A trading algorithm using XGBoost
Source code for deep learning algorithms, aimed at beginners
Reformer, the efficient Transformer, in PyTorch
PyTorch implementation of Google's TFT (Temporal Fusion Transformer)
The official code for the paper "Multi-scale Temporal Fusion Transformer for Incomplete Vehicle Trajectory Prediction"