Stars
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!
(NeurIPS 2021) Revisiting Deep Learning Models for Tabular Data
Research on Tabular Deep Learning: Papers & Packages
[ICLR 2024] Official implementation of "TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting"
A 15TB Collection of Physics Simulation Datasets
Code and trained models for "Attentional Feature Fusion"
Official implementation of our ICML 2023 paper "LinSATNet: The Positive Linear Satisfiability Neural Networks".
Bayesian active learning library for research and industrial use cases.
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
GitHub page for: Graph Neural Networks for Multivariate Time Series Regression with Application to Seismic Data
PyTorch implementation of 'Weight Uncertainty in Neural Networks'
Implementation of Convolutional LSTM in PyTorch.
Supplementary material to reproduce "The Unreasonable Effectiveness of Deep Evidential Regression"
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you estimate, communicate and use uncertainty in machine learning model predictions.
Unofficial LaTeX template for the main body of a National Natural Science Foundation of China (NSFC) General Program proposal
Source Han Sans | 思源黑体 | 思源黑體 | 思源黑體 香港 | 源ノ角ゴシック | 본고딕
A simple and extensible library to create Bayesian Neural Network layers in PyTorch.
GB/T 7714-2015 BibTeX Style
Distributed Asynchronous Hyperparameter Optimization in Python
A playbook for systematically maximizing the performance of deep learning models.
An interactive book on deep learning. Much easy, so MXNet. Wow. [Straight Dope is growing up] Much of this content has been incorporated into the new Dive into Deep Learning book, available at …