- Microsoft Research
- China
- https://addf400.github.io/
- @HangboBao
Stars
The interactive graphing library for Python ✨
Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.
A powerful spider (web crawler) system in Python.
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
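The tiktoken entry names byte-pair encoding (BPE). As a rough illustration of the core idea — repeatedly fusing the most frequent adjacent token pair — here is a toy sketch in pure Python. This is not tiktoken's actual algorithm or API; the function name and the greedy single-merge step are illustrative assumptions.

```python
from collections import Counter

def byte_pair_merge_step(tokens):
    """One greedy BPE step: fuse the most frequent adjacent pair of tokens."""
    pairs = Counter(zip(tokens, tokens[1:]))
    if not pairs:
        return tokens
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == best:
            merged.append(tokens[i] + tokens[i + 1])  # fuse the pair
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

tokens = byte_pair_merge_step(list("aaabdaaabac"))
# The most frequent pair ("a", "a") is fused wherever it occurs.
```

Real BPE tokenizers run this merge loop to a fixed vocabulary size and then encode text against the learned merge table.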
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
End-to-End Object Detection with Transformers
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc.: papers, code, datasets, applications, and tutorials.
Ongoing research training transformer models at scale
Python package built to ease deep learning on graphs, on top of existing DL frameworks.
Open source code for AlphaFold 2.
An open source implementation of CLIP.
100+ Chinese word vectors: over a hundred sets of pretrained Chinese word embeddings.
Text- and image-to-video generation: CogVideoX (2024) and CogVideo (ICLR 2023).
An open-source NLP research library, built on PyTorch.
Official implementation of AnimateDiff.
From-scratch implementations of every algorithm in Li Hang's book "Statistical Learning Methods".
PyTorch package for the discrete VAE used for DALL·E.
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
This repository contains code examples for the Stanford course: TensorFlow for Deep Learning Research.
YOLOX is a high-performance anchor-free YOLO detector that exceeds YOLOv3–v5, with MegEngine, ONNX, TensorRT, ncnn, and OpenVINO support. Documentation: https://yolox.readthedocs.io/
Implementation of the Denoising Diffusion Probabilistic Model in PyTorch.
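The DDPM entry is built around the closed-form forward (noising) process q(x_t | x_0) = N(sqrt(ᾱ_t) x_0, (1 − ᾱ_t) I), where ᾱ_t is the cumulative product of 1 − β_t. A minimal dependency-free sketch of that sampling step (the function name and linear β schedule are illustrative assumptions, not the repository's API):

```python
import math
import random

def forward_diffuse(x0, t, betas):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,  eps ~ N(0, I).
    """
    alpha_bar = 1.0
    for beta in betas[: t + 1]:
        alpha_bar *= 1.0 - beta  # cumulative product of (1 - beta_t)
    eps = [random.gauss(0.0, 1.0) for _ in x0]  # standard Gaussian noise
    return [math.sqrt(alpha_bar) * x + math.sqrt(1.0 - alpha_bar) * e
            for x, e in zip(x0, eps)]

# Linear beta schedule from the DDPM paper: 1e-4 -> 0.02 over 1000 steps.
betas = [1e-4 + (0.02 - 1e-4) * i / 999 for i in range(1000)]
noisy = forward_diffuse([1.0, -0.5, 0.25], t=500, betas=betas)
```

Training then teaches a network to predict eps from x_t and t; sampling inverts the chain step by step.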
Hackable and optimized Transformers building blocks, supporting a composable construction.
TensorFlow-based neural network library
The leading native Python SSHv2 protocol library.
A PyTorch implementation of the Transformer model in "Attention is All You Need".
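The core operation of "Attention is All You Need" is scaled dot-product attention, softmax(QKᵀ/√d_k)V. A minimal pure-Python sketch of that formula on small lists (illustrative only — real implementations batch this over tensors):

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# A query aligned with the first key attends mostly to the first value.
out = scaled_dot_product_attention([[1.0, 0.0]],
                                   [[1.0, 0.0], [0.0, 1.0]],
                                   [[1.0, 0.0], [0.0, 1.0]])
```

Multi-head attention runs several of these in parallel on learned projections of Q, K, and V.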
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
Pretrained ConvNets for PyTorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResNetV2, Xception, DPN, etc.