Stars
EleutherAI / nanoGPT-mup
Forked from karpathy/nanoGPT. The simplest, fastest repository for training/finetuning medium-sized GPTs.
Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TF and others)
pedrogengo / tiny-ai-client
Forked from piEsposito/tiny-ai-client. Tiny client for LLMs with vision and tool calling. As simple as it gets.
💎 Amber the programming language compiled to Bash
A simple, performant and scalable Jax LLM!
HiddenVM — Use any desktop OS without leaving a trace.
Karras et al. (2022) diffusion models for PyTorch
Implementation of Rotary Embeddings, from the RoFormer paper, in PyTorch
All pdfs of Victor Eijkhout's Art of HPC books and courses
Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
Official inference library for Mistral models
Understanding Deep Learning - Simon J.D. Prince
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, RAG. We also show you how to solve end-to-end problems using Llama mode…
modal-labs / llama-recipes
Forked from meta-llama/llama-cookbook. Examples and recipes for the Llama 2 model.
PixArt-α: Fast Training of Diffusion Transformer for Photorealistic Text-to-Image Synthesis
A Python toolbox for performing gradient-free optimization
Official Implementation for "Attend-and-Excite: Attention-Based Semantic Guidance for Text-to-Image Diffusion Models" (SIGGRAPH 2023)