- University of Alberta
- Edmonton
- https://cjiang2.github.io/
Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training (a minimal usage sketch follows this list).
Models and examples built with TensorFlow
🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Streamlit — A faster way to build and share data apps.
Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
TensorFlow code and pre-trained models for BERT
🚀🚀 Train a 26M-parameter GPT completely from scratch in just 2 hours! 🌏
PyTorch Tutorial for Deep Learning Researchers
Python sample codes and textbook for robotics algorithms.
Generative Models by Stability AI
Open-sourced codes for MiniGPT-4 and MiniGPT-v2 (https://minigpt-4.github.io, https://minigpt-v2.github.io/)
Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single Transformer encoder, in PyTorch
Graph Neural Network Library for PyTorch
A Deep Learning based project for colorizing and restoring old images (and video!)
The interactive graphing library for Python ✨
State-of-the-Art Text Embeddings
PyTorch implementations of Generative Adversarial Networks.
OpenAI Baselines: high-quality implementations of reinforcement learning algorithms
Flet enables developers to easily build real-time web, mobile, and desktop apps in Python. No frontend experience required.
End-to-End Object Detection with Transformers
A very simple framework for state-of-the-art Natural Language Processing (NLP)
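For context, a minimal sketch of the kind of one-call inference the 🤗 Transformers project at the top of this list provides. The `pipeline` API and the "sentiment-analysis" task name are part of the library; the default checkpoint it downloads is chosen by the library and is not assumed here.

```python
# Minimal sketch, assuming the Hugging Face `transformers` package is installed
# and network access is available to fetch the task's default checkpoint.
from transformers import pipeline

# pipeline() builds a ready-to-use model + tokenizer for a named task.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts shaped like [{"label": ..., "score": ...}].
print(classifier("Training a small GPT from scratch is surprisingly fun."))
```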