Lists (22)
AIGC
AI-generated content other than LLMs: text-to-image, text-to-video, etc.
C++
Language guide and best practices
Dataset
Deep Learning
Foundations of deep learning
Embedding Database
Face
FoundationModel
Framework
Software frameworks and libraries, like RPC, Event-Driven Architecture, etc.
Game
License
Open-source licenses
LLM
Large language models, like LLaMA, GPT, etc.
MachineLearning
Prompt
PyTorch
PyTorch tools and tutorials
Radar
RL
Reinforcement learning
Robot Learning
Time Series
Tracking Algorithm
Ultrasound
Utilities
Tools and libraries
WiFi
Starred repositories
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Simple reinforcement learning tutorials by 莫烦Python (Chinese AI tutorials)
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with …
Home of StarCoder: fine-tuning & inference!
Phase 2 of the Chinese LLaMA-2 & Alpaca-2 large-model project, plus 64K ultra-long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
A concise but complete full-attention transformer with a set of promising experimental features from various papers
Simple examples to introduce PyTorch
📡 All You Need to Know About Deep Learning - A kick-starter
Aligning pretrained language models with instruction data generated by themselves.
Simple, online, and realtime tracking of multiple objects in a video sequence.
[ICCV 2023 Oral] Text-to-Image Diffusion Models are Zero-Shot Video Generators
Code examples in PyTorch and TensorFlow for CS230
FLOPs counter for neural networks in the PyTorch framework
LLaMA: Open and Efficient Foundation Language Models
The devkit of the nuScenes dataset.
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
A lightweight framework for building LLM-based agents
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight)
RetinaFace: Deep Face Detection Library for Python
[ICML 2023] SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models
An artificial intelligence platform for StarCraft II with large-scale distributed training and grandmaster-level agents.
A project for processing and rendering neural networks to gain insight into a model's architecture and parameters through a decluttered representation.
HOTA (and other) evaluation metrics for Multi-Object Tracking (MOT).
[Unofficial] PyTorch implementation of "Conformer: Convolution-augmented Transformer for Speech Recognition" (INTERSPEECH 2020)