Haixu Wu
2020 – today

2024
- [c15] Yong Liu, Tengge Hu, Haoran Zhang, Haixu Wu, Shiyu Wang, Lintao Ma, Mingsheng Long: iTransformer: Inverted Transformers Are Effective for Time Series Forecasting. ICLR 2024
- [c14] Shiyu Wang, Haixu Wu, Xiaoming Shi, Tengge Hu, Huakun Luo, Lintao Ma, James Y. Zhang, Jun Zhou: TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting. ICLR 2024
- [c13] Jiaxiang Dong, Haixu Wu, Yuxuan Wang, Yunzhong Qiu, Li Zhang, Jianmin Wang, Mingsheng Long: TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling. ICML 2024
- [c12] Haixu Wu, Huakun Luo, Haowen Wang, Jianmin Wang, Mingsheng Long: Transolver: A Fast Transformer Solver for PDEs on General Geometries. ICML 2024
- [c11] Lanxiang Xing, Haixu Wu, Yuezhou Ma, Jianmin Wang, Mingsheng Long: HelmFluid: Learning Helmholtz Dynamics for Interpretable Fluid Prediction. ICML 2024
- [c10] Zhiyu Yao, Jian Wang, Haixu Wu, Jingdong Wang, Mingsheng Long: Mobile Attention: Mobile-Friendly Linear-Attention for Vision Transformers. ICML 2024
- [i22] Haixu Wu, Huakun Luo, Haowen Wang, Jianmin Wang, Mingsheng Long: Transolver: A Fast Transformer Solver for PDEs on General Geometries. CoRR abs/2402.02366 (2024)
- [i21] Qilong Ma, Haixu Wu, Lanxiang Xing, Jianmin Wang, Mingsheng Long: EuLagNet: Eulerian Fluid Prediction with Lagrangian Dynamics. CoRR abs/2402.02425 (2024)
- [i20] Jiaxiang Dong, Haixu Wu, Yuxuan Wang, Yunzhong Qiu, Li Zhang, Jianmin Wang, Mingsheng Long: TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling. CoRR abs/2402.02475 (2024)
- [i19] Yuxuan Wang, Haixu Wu, Jiaxiang Dong, Yong Liu, Yunzhong Qiu, Haoran Zhang, Jianmin Wang, Mingsheng Long: TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables. CoRR abs/2402.19072 (2024)
- [i18] Haixu Wu, Huakun Luo, Yuezhou Ma, Jianmin Wang, Mingsheng Long: RoPINN: Region Optimized Physics-Informed Neural Networks. CoRR abs/2405.14369 (2024)
- [i17] Shiyu Wang, Haixu Wu, Xiaoming Shi, Tengge Hu, Huakun Luo, Lintao Ma, James Y. Zhang, Jun Zhou: TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting. CoRR abs/2405.14616 (2024)
- [i16] Hang Zhou, Yuezhou Ma, Haixu Wu, Haowen Wang, Mingsheng Long: Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers. CoRR abs/2405.17527 (2024)
- [i15] Yuxuan Wang, Haixu Wu, Jiaxiang Dong, Yong Liu, Mingsheng Long, Jianmin Wang: Deep Time Series Models: A Comprehensive Survey and Benchmark. CoRR abs/2407.13278 (2024)
- [i14] Jiaxiang Dong, Haixu Wu, Yuxuan Wang, Li Zhang, Jianmin Wang, Mingsheng Long: Metadata Matters for Time Series: Informative Forecasting with Transformers. CoRR abs/2410.03806 (2024)

2023
- [j3] Haixu Wu, Hang Zhou, Mingsheng Long, Jianmin Wang: Interpretable weather forecasting for worldwide stations with a unified deep model. Nat. Mac. Intell. 5(6): 602-611 (2023)
- [j2] Yunbo Wang, Haixu Wu, Jianjin Zhang, Zhifeng Gao, Jianmin Wang, Philip S. Yu, Mingsheng Long: PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning. IEEE Trans. Pattern Anal. Mach. Intell. 45(2): 2208-2225 (2023)
- [j1] Zhiyu Yao, Yunbo Wang, Haixu Wu, Jianmin Wang, Mingsheng Long: ModeRNN: Harnessing Spatiotemporal Mode Collapse in Unsupervised Predictive Learning. IEEE Trans. Pattern Anal. Mach. Intell. 45(11): 13281-13296 (2023)
- [c9] Haixu Wu, Tengge Hu, Yong Liu, Hang Zhou, Jianmin Wang, Mingsheng Long: TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis. ICLR 2023
- [c8] Haixu Wu, Tengge Hu, Huakun Luo, Jianmin Wang, Mingsheng Long: Solving High-Dimensional PDEs with Latent Spectral Models. ICML 2023: 37417-37438
- [c7] Jiaxiang Dong, Haixu Wu, Haoran Zhang, Li Zhang, Jianmin Wang, Mingsheng Long: SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling. NeurIPS 2023
- [i13] Haixu Wu, Tengge Hu, Huakun Luo, Jianmin Wang, Mingsheng Long: Solving High-Dimensional PDEs with Latent Spectral Models. CoRR abs/2301.12664 (2023)
- [i12] Jiaxiang Dong, Haixu Wu, Haoran Zhang, Li Zhang, Jianmin Wang, Mingsheng Long: SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling. CoRR abs/2302.00861 (2023)
- [i11] Yong Liu, Tengge Hu, Haoran Zhang, Haixu Wu, Shiyu Wang, Lintao Ma, Mingsheng Long: iTransformer: Inverted Transformers Are Effective for Time Series Forecasting. CoRR abs/2310.06625 (2023)
- [i10] Lanxiang Xing, Haixu Wu, Yuezhou Ma, Jianmin Wang, Mingsheng Long: HelmSim: Learning Helmholtz Dynamics for Interpretable Fluid Simulation. CoRR abs/2310.10565 (2023)

2022
- [c6] Jiehui Xu, Haixu Wu, Jianmin Wang, Mingsheng Long: Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy. ICLR 2022
- [c5] Haixu Wu, Jialong Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long: Flowformer: Linearizing Transformers with Conservation Flows. ICML 2022: 24226-24242
- [c4] Jialong Wu, Haixu Wu, Zihan Qiu, Jianmin Wang, Mingsheng Long: Supported Policy Optimization for Offline Reinforcement Learning. NeurIPS 2022
- [c3] Yong Liu, Haixu Wu, Jianmin Wang, Mingsheng Long: Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting. NeurIPS 2022
- [i9] Jialong Wu, Haixu Wu, Zihan Qiu, Jianmin Wang, Mingsheng Long: Supported Policy Optimization for Offline Reinforcement Learning. CoRR abs/2202.06239 (2022)
- [i8] Haixu Wu, Jialong Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long: Flowformer: Linearizing Transformers with Conservation Flows. CoRR abs/2202.06258 (2022)
- [i7] Yong Liu, Haixu Wu, Jianmin Wang, Mingsheng Long: Non-stationary Transformers: Rethinking the Stationarity in Time Series Forecasting. CoRR abs/2205.14415 (2022)
- [i6] Haixu Wu, Tengge Hu, Yong Liu, Hang Zhou, Jianmin Wang, Mingsheng Long: TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis. CoRR abs/2210.02186 (2022)

2021
- [c2] Haixu Wu, Zhiyu Yao, Jianmin Wang, Mingsheng Long: MotionRNN: A Flexible Model for Video Prediction With Spacetime-Varying Motions. CVPR 2021: 15435-15444
- [c1] Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long: Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. NeurIPS 2021: 22419-22430
- [i5] Haixu Wu, Zhiyu Yao, Mingsheng Long, Jianmin Wang: MotionRNN: A Flexible Model for Video Prediction with Spacetime-Varying Motions. CoRR abs/2103.02243 (2021)
- [i4] Yunbo Wang, Haixu Wu, Jianjin Zhang, Zhifeng Gao, Jianmin Wang, Philip S. Yu, Mingsheng Long: PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning. CoRR abs/2103.09504 (2021)
- [i3] Haixu Wu, Jiehui Xu, Jianmin Wang, Mingsheng Long: Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. CoRR abs/2106.13008 (2021)
- [i2] Jiehui Xu, Haixu Wu, Jianmin Wang, Mingsheng Long: Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy. CoRR abs/2110.02642 (2021)
- [i1] Zhiyu Yao, Yunbo Wang, Haixu Wu, Jianmin Wang, Mingsheng Long: ModeRNN: Harnessing Spatiotemporal Mode Collapse in Unsupervised Predictive Learning. CoRR abs/2110.03882 (2021)
last updated on 2024-11-15 20:35 CET by the dblp team
all metadata released as open data under CC0 1.0 license