Stars
A Python package for probabilistic state space modeling with JAX
Burn is a next-generation tensor library and deep learning framework that doesn't compromise on flexibility, efficiency, or portability.
🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools
[CVPR 2025] Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
Lightning fast C++/CUDA neural network framework
A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
Code documentation written as code! How novel and totally my idea!
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation.
Scalable, portable, and distributed gradient boosting (GBDT, GBRT, or GBM) library for Python, R, Java, Scala, C++, and more. Runs on a single machine, Hadoop, Spark, Dask, Flink, and Dataflow.
RAPIDS Community Notebooks
State-of-the-art deep learning scripts organized by model: easy to train and deploy, with reproducible accuracy and performance on enterprise-grade infrastructure.
A PyTorch extension: tools for easy mixed-precision and distributed training in PyTorch
Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow
StyleGAN - Official TensorFlow Implementation
Code for the paper "Language Models are Unsupervised Multitask Learners"
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
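The "tiny scalar-valued autograd engine" entry above (micrograd) can be sketched in a few lines of plain Python. The `Value` class name and the `.grad`/`.backward()` interface follow micrograd's public API; the body below is an illustrative reconstruction under those assumptions, not the library's actual code:

```python
class Value:
    """A scalar that records the ops applied to it for backpropagation."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # chain-rule step for this node
        self._prev = set(_children)     # parents in the compute graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # product rule: d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# For c = a*b + a: dc/da = b + 1 = 4, dc/db = a = 2
a, b = Value(2.0), Value(3.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

PyTorch's autograd works the same way in spirit: each op records a local backward rule, and `backward()` replays them in reverse topological order.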