ZeroGraph TypeScript is a minimalist LLM framework designed for AI Agent programming
ZeroGraph is a TypeScript implementation of PocketFlow (Python), created to advance agent-oriented LLM programming frameworks and the concepts behind them.
- Lightweight: Just 300 lines (10 KB). Zero bloat, zero dependencies, zero vendor lock-in.
- TypeScript Native: Full type safety and excellent IDE support.
- Agentic Coding: Let AI Agents (e.g., Cursor AI) build Agents—10x productivity boost!
Install with npm:

```bash
npm install @u0z/zero-graph
```

Or with yarn:

```bash
yarn add @u0z/zero-graph
```

Quick start:

```typescript
import { Node, Flow } from '@u0z/zero-graph';

// Define a simple node
class GreetingNode extends Node {
prep(shared: any): string {
return shared.name || 'World';
}
exec(name: string): string {
return `Hello, ${name}!`;
}
post(shared: any, prepRes: string, execRes: string): void {
shared.greeting = execRes;
}
}
// Create and run a flow
const flow = new Flow(new GreetingNode());
const shared = { name: 'TypeScript' };
flow.run(shared);
console.log(shared.greeting); // "Hello, TypeScript!"
```

A Node is the basic building block that handles simple tasks:

```typescript
class MyNode extends Node {
prep(shared: any): any {
// Prepare data from shared store
return shared.input;
}
exec(prepResult: any): any {
// Execute the main logic
return processData(prepResult);
}
post(shared: any, prepRes: any, execRes: any): string {
// Store result and return next action
shared.result = execRes;
return 'default';
}
}
```

A Flow orchestrates multiple nodes through actions:

```typescript
const nodeA = new NodeA();
const nodeB = new NodeB();
const nodeC = new NodeC();
// Connect nodes with actions
nodeA.next(nodeB, 'success');
nodeA.next(nodeC, 'error');
const flow = new Flow(nodeA);
flow.run(shared);
```
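To make the routing concrete, here is a minimal branching sketch that uses only the Node and Flow API shown above. The ParseNode, ReportNode, and FallbackNode classes are illustrative, not part of the library; the key point is that the string returned by post selects which next edge the flow follows.

```typescript
import { Node, Flow } from '@u0z/zero-graph';

// Illustrative only: post() returns 'success' or 'error',
// and the flow follows the matching .next() edge.
class ParseNode extends Node {
  prep(shared: any): string {
    return shared.raw;
  }
  exec(raw: string): number {
    return Number(raw);
  }
  post(shared: any, prepRes: string, execRes: number): string {
    shared.value = execRes;
    return Number.isNaN(execRes) ? 'error' : 'success';
  }
}

// Runs only when ParseNode returns 'success'.
class ReportNode extends Node {
  prep(shared: any): number {
    return shared.value;
  }
  exec(value: number): string {
    return `parsed: ${value}`;
  }
  post(shared: any, prepRes: number, execRes: string): void {
    shared.status = execRes;
  }
}

// Runs only when ParseNode returns 'error'.
class FallbackNode extends Node {
  prep(shared: any): string {
    return shared.raw;
  }
  exec(raw: string): string {
    return `could not parse "${raw}"`;
  }
  post(shared: any, prepRes: string, execRes: string): void {
    shared.status = execRes;
  }
}

const parse = new ParseNode();
parse.next(new ReportNode(), 'success');
parse.next(new FallbackNode(), 'error');

const branchingFlow = new Flow(parse);
const state: any = { raw: '42' };
branchingFlow.run(state);
console.log(state.status); // "parsed: 42"
```

Changing state.raw to a non-numeric string would route the flow through FallbackNode instead.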
BatchNode and BatchFlow process multiple items efficiently:

```typescript
class BatchProcessor extends BatchNode {
exec(item: any): any {
return processItem(item);
}
}
const batchFlow = new BatchFlow(new BatchProcessor());
```
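How items reach exec is not spelled out in the snippet above. The sketch below assumes ZeroGraph mirrors PocketFlow's batch contract, where prep returns the list of items, exec runs once per item, and post receives the array of per-item results, and that BatchFlow exposes the same run(shared) entry point as Flow. Treat it as an illustration of the intended shape rather than documented behavior.

```typescript
import { BatchNode, BatchFlow } from '@u0z/zero-graph';

// Assumption: prep() returns the items, exec() is called per item,
// and post() receives the array of per-item results (PocketFlow-style).
class SquareBatch extends BatchNode {
  prep(shared: any): number[] {
    return shared.numbers;
  }
  exec(item: number): number {
    return item * item;
  }
  post(shared: any, prepRes: number[], execRes: number[]): void {
    shared.squares = execRes;
  }
}

// Assumption: BatchFlow.run(shared) works like Flow.run(shared).
const batch = new BatchFlow(new SquareBatch());
const store: any = { numbers: [1, 2, 3] };
batch.run(store);
console.log(store.squares); // [1, 4, 9] under the assumed contract
```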
AsyncNode and AsyncFlow handle asynchronous operations:

```typescript
class AsyncProcessor extends AsyncNode {
async execAsync(input: any): Promise<any> {
return await apiCall(input);
}
}
const asyncFlow = new AsyncFlow(new AsyncProcessor());
await asyncFlow.runAsync(shared);
```

Check out the examples directory for comprehensive usage examples:
- Hello World - Basic node and flow usage
- Agent - Research agent with web search (a pattern sketch follows this list)
- Workflow - Multi-step content generation
- Batch Processing - Handle multiple items
- Async Operations - Asynchronous workflows
- RAG - Retrieval-augmented generation
- Multi-Agent - Multiple agents collaboration
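As referenced in the Agent example above, the agent pattern can be expressed with nothing more than nodes, actions, and a loop-back edge. The sketch below is a rough illustration under that assumption: DecideNode, SearchNode, AnswerNode, fakeLLMDecide, and fakeWebSearch are all placeholders (not part of ZeroGraph or its shipped examples), and the cycle back to the decision node assumes Flow supports action-driven loops as PocketFlow does.

```typescript
import { Node, Flow } from '@u0z/zero-graph';

// Placeholder helpers -- stand-ins for a real LLM call and web search.
const fakeLLMDecide = (question: string, notes: string[]): 'search' | 'answer' =>
  notes.length === 0 ? 'search' : 'answer';
const fakeWebSearch = (question: string): string => `stub result for "${question}"`;

// Decide whether more research is needed or the question can be answered.
class DecideNode extends Node {
  prep(shared: any): { question: string; notes: string[] } {
    return { question: shared.question, notes: shared.notes ?? [] };
  }
  exec(input: { question: string; notes: string[] }): 'search' | 'answer' {
    return fakeLLMDecide(input.question, input.notes);
  }
  post(shared: any, prepRes: any, execRes: 'search' | 'answer'): string {
    return execRes; // the returned action selects the next node
  }
}

// Gather more context, then loop back to the decision node.
class SearchNode extends Node {
  prep(shared: any): string {
    return shared.question;
  }
  exec(question: string): string {
    return fakeWebSearch(question);
  }
  post(shared: any, prepRes: string, execRes: string): string {
    shared.notes = [...(shared.notes ?? []), execRes];
    return 'decide';
  }
}

// Produce the final answer from the accumulated notes.
class AnswerNode extends Node {
  prep(shared: any): string[] {
    return shared.notes ?? [];
  }
  exec(notes: string[]): string {
    return `Answer based on ${notes.length} note(s).`;
  }
  post(shared: any, prepRes: string[], execRes: string): void {
    shared.answer = execRes;
  }
}

const decide = new DecideNode();
const search = new SearchNode();
decide.next(search, 'search');
decide.next(new AnswerNode(), 'answer');
search.next(decide, 'decide'); // loop back for another decision

const agentFlow = new Flow(decide);
const ctx: any = { question: 'What is ZeroGraph?' };
agentFlow.run(ctx);
console.log(ctx.answer);
```

In the real Agent example, the stubs would be replaced by actual LLM and web-search calls.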
Further documentation:

- Core Abstractions - Node, Flow, and Shared Store
- Design Patterns - Agent, Workflow, RAG, etc.
- API Reference - Complete API documentation
- Migration Guide - From Python to TypeScript
Current LLM frameworks are bloated... you only need 300 lines for an LLM framework!
| Framework | Lines | Size | TypeScript |
|---|---|---|---|
| LangChain | 405K | +166MB | ❌ |
| CrewAI | 18K | +173MB | ❌ |
| LangGraph | 37K | +51MB | ❌ |
| ZeroGraph | 300 | +10KB | ✅ |
We welcome contributions! Please see our Contributing Guide for details.
MIT License - see LICENSE file for details.