Stars
Letta is the platform for building stateful agents: open AI with advanced memory that can learn and self-improve over time.
The code for the NeurIPS 2025 paper "A-MEM: Agentic Memory for LLM Agents"
Source code and demo for MemoryBank and SiliconFriend
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
Official repository for LongChat and LongEval
Benchmarking Chat Assistants on Long-Term Interactive Memory (ICLR 2025)
Implementation of our work, "Unlearning That Lasts: Utility-Preserving, Robust, and Almost Irreversible Forgetting in LLMs"
The official implementation of the paper "Mem-α: Learning Memory Construction via Reinforcement Learning"
Open-source code for the paper: Evaluating Memory in LLM Agents via Incremental Multi-Turn Interactions
[EMNLP 2025 Oral] MemoryOS is designed to provide a memory operating system for personalized AI agents.
Purdue-ISL / Leo
Forked from UsmanJafri/Leo
Artifacts accompanying the NSDI '24 paper: Leo: Online ML-based Traffic Classification at Multi-Terabit Line Rate.
[ICML2025, NeurIPS2025 Spotlight] Sparse VideoGen 1 & 2: Accelerating Video Diffusion Transformers with Sparse Attention
SoftVC VITS Singing Voice Conversion
[ICML2025] SpargeAttention: A training-free sparse attention that accelerates any model inference.
Amphion (/æmˈfaɪən/) is a toolkit for Audio, Music, and Speech Generation. Its purpose is to support reproducible research and help junior researchers and engineers get started in the field of audi…