- MIT
- Cambridge, MA
- https://www.mit.edu/~hjian42/
- @hjian42
Stars
📚 Freely available programming books
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training.
🦜🔗 The platform for reliable agents.
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Robust Speech Recognition via Large-Scale Weak Supervision
A high-throughput and memory-efficient inference and serving engine for LLMs
scikit-learn: machine learning in Python
🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
The simplest, fastest repository for training/finetuning medium-sized GPTs.
LlamaIndex is the leading framework for building LLM-powered agents over your data.
Streamlit — A faster way to build and share data apps.
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthr…
PyTorch Tutorial for Deep Learning Researchers
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
DSPy: The framework for programming—not prompting—language models
Free and Open Source Enterprise Resource Planning (ERP)
Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.
An LLM-powered knowledge curation system that researches a topic and generates a full-length report with citations.
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Code for the paper "Language Models are Unsupervised Multitask Learners"
JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training