Stars
Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
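For quick orientation, a minimal sketch of Ray's core task API; the `square` function and the bare local `ray.init()` are illustrative, not taken from the repo.

```python
# Parallelize a plain Python function across Ray workers (illustrative example).
import ray

ray.init()  # start a local Ray runtime

@ray.remote
def square(x):
    # runs as a distributed task in a Ray worker process
    return x * x

# launch tasks in parallel, then gather their results
futures = [square.remote(i) for i in range(8)]
print(ray.get(futures))  # [0, 1, 4, 9, 16, 25, 36, 49]
```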
A playbook for systematically maximizing the performance of deep learning models.
🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing.
Source Han Sans | 思源黑体 | 思源黑體 | 思源黑體 香港 | 源ノ角ゴシック | 본고딕
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Distributed Asynchronous Hyperparameter Optimization in Python
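For context, a minimal sketch of Hyperopt's `fmin`/TPE workflow; the quadratic `objective` is a stand-in for a real validation-loss function.

```python
# Minimize a toy objective over a single uniform search dimension with TPE.
from hyperopt import fmin, tpe, hp, Trials

def objective(x):
    # value to minimize; in practice this would train and evaluate a model
    return (x - 3.0) ** 2

trials = Trials()
best = fmin(
    fn=objective,
    space=hp.uniform("x", -10, 10),  # search space for the single parameter x
    algo=tpe.suggest,                # Tree-structured Parzen Estimator
    max_evals=100,
    trials=trials,
)
print(best)  # e.g. {'x': 2.99...}
```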
A deck tracker and deck manager for Hearthstone on Windows
An interactive book on deep learning. Much easy, so MXNet. Wow. [Straight Dope is growing up] Much of this content has been incorporated into the new Dive into Deep Learning book, available at …
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
Implementation of Convolutional LSTM in PyTorch.
Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
PyTorch implementations of Bayes by Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC, and more
Minimal, clean example of LSTM neural network training in Python, for learning purposes.
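That repo implements the LSTM cell from scratch; as a rough point of comparison, here is a toy training loop using `torch.nn.LSTM` instead. The shapes, the synthetic data, and the mean-of-sequence regression target are all illustrative assumptions.

```python
# Toy LSTM regression: predict the mean of a random sequence (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.LSTM(input_size=4, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)  # map the final hidden state to a scalar prediction
opt = torch.optim.Adam(list(model.parameters()) + list(head.parameters()), lr=1e-2)
loss_fn = nn.MSELoss()

x = torch.randn(32, 10, 4)            # (batch, seq_len, features)
y = x.mean(dim=(1, 2)).unsqueeze(1)   # target: mean of each sequence, (batch, 1)

for step in range(200):
    out, _ = model(x)                 # out: (batch, seq_len, hidden)
    pred = head(out[:, -1, :])        # use the last time step
    loss = loss_fn(pred, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```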
A 15TB Collection of Physics Simulation Datasets
[ICLR 2024] Official implementation of "TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting"
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Bayesian Convolutional Neural Network with Variational Inference based on Bayes by Backprop in PyTorch.
GB/T 7714-2015 BibTeX Style
Research on Tabular Deep Learning: Papers & Packages
Unofficial LaTeX template for the main body of a National Natural Science Foundation of China (NSFC) grant proposal (General Program)
A simple and extensible library for creating Bayesian neural network layers on top of PyTorch.
Bayesian active learning library for research and industrial use cases.
Code and trained models for "Attentional Feature Fusion"
This will contain my notes on research papers that I read.
Experiments used in "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning"
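A minimal sketch of the MC Dropout idea from that paper: keep dropout stochastic at prediction time and average several forward passes to get a predictive mean and an uncertainty estimate. The two-layer MLP and its sizes are assumptions for illustration, not code from the repo.

```python
# MC Dropout: multiple stochastic forward passes approximate a predictive distribution.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(8, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # stays stochastic during prediction below
    nn.Linear(64, 1),
)

def mc_predict(model, x, n_samples=50):
    model.train()  # keep dropout layers active (unlike model.eval())
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    # predictive mean and a simple per-output uncertainty estimate
    return samples.mean(dim=0), samples.std(dim=0)

x = torch.randn(16, 8)
mean, std = mc_predict(model, x)
```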