Stars
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training.
Robust Speech Recognition via Large-Scale Weak Supervision
The simplest, fastest repository for training/finetuning medium-sized GPTs.
LlamaIndex is the leading document agent and OCR platform
DSPy: The framework for programming—not prompting—language models
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch.
Code and documentation to train Stanford's Alpaca models, and generate the data.
The open source AI engineering platform for agents, LLMs, and ML models. MLflow enables teams of all sizes to debug, evaluate, monitor, and optimize production-quality AI applications while control…
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RN…
An open source implementation of CLIP.
Databricks’ Dolly, a large language model trained on the Databricks Machine Learning Platform
Running large language models on a single GPU for throughput-oriented scenarios.
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
[ICLR 2024] Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters
Aligning pretrained language models with instruction data generated by themselves.
An API Client package to access the APIs for NBA.com
[NeurIPS 2023] Reflexion: Language Agents with Verbal Reinforcement Learning
An LLM trained only on data from certain time periods to reduce modern bias
Can you make a high-quality GIF from A to Z only by coding? Yes. Do you want to, though?
Scrapes KenPom via BeautifulSoup, finds the best and worst team for each number of losses, and writes the results to CSV.
Time-series exploration of atmospheric CO2 concentration