The full-stack platform for AI agents. Build with type-safe schemas, frontend hooks, and real-time routes. Ship agents as easily as web apps.
v1 is now in public preview. Feedback is welcome on Discord and GitHub Discussions.
From local government to indie developers, people are already building with us.
Let's see this in action with a complete flow from agent to API to React frontend:
1. The Agent (`src/agent/chat/agent.ts`)

```ts
import { createAgent } from '@agentuity/runtime';
import { s } from '@agentuity/schema';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const agent = createAgent('chat', {
  description: 'A simple chat agent',
  schema: {
    input: s.object({ message: s.string() }),
    output: s.object({ response: s.string() }),
  },
  handler: async (ctx, { message }) => {
    const { text } = await generateText({
      model: openai('gpt-5-mini'),
      prompt: message,
    });
    return { response: text };
  },
});

export default agent;
```

2. The API Route (`src/api/index.ts`)
```ts
import { createRouter } from '@agentuity/runtime';
import chat from '@agent/chat';

const router = createRouter();

router.post('/chat', chat.validator(), async (c) => {
  const data = c.req.valid('json');
  const result = await chat.run(data);
  return c.json(result);
});

export default router;
```

3. The React Hook (`src/web/components/Chat.tsx`)
```tsx
import { useAPI } from '@agentuity/react';

export function Chat() {
  const { invoke, data, isLoading } = useAPI('POST /api/chat');
  return (
    <div>
      <button onClick={() => invoke({ message: 'Hello!' })}>
        {isLoading ? 'Thinking...' : 'Send'}
      </button>
      {data && <p>{data.response}</p>}
    </div>
  );
}
```

Use `agentuity dev` to test locally, then `agentuity deploy` to ship to production. 🚀
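Because the route above is plain HTTP, any client can call it, not just the React hook. A minimal sketch using global `fetch` (the `/api/chat` path mirrors the hook above; the base URL is a placeholder you'd replace with your local or deployed app address):

```typescript
type ChatInput = { message: string };
type ChatOutput = { response: string };

// Build the request options for the /api/chat route shown above.
function chatRequest(input: ChatInput) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(input),
  };
}

// baseUrl is a placeholder (an assumption), e.g. your local dev server.
async function callChat(baseUrl: string, input: ChatInput): Promise<ChatOutput> {
  const res = await fetch(`${baseUrl}/api/chat`, chatRequest(input));
  if (!res.ok) throw new Error(`chat route failed: ${res.status}`);
  return (await res.json()) as ChatOutput;
}
```

The request and response shapes come straight from the agent's schema, so the same `ChatInput`/`ChatOutput` types hold at every layer.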
Check out our SDK (runtime, CLI, frontend hooks, server utilities) and docs.
```sh
# Install the CLI
curl -fsS https://v1.agentuity.sh | sh

# Create a new project
agentuity create

# Start developing
cd my-project && agentuity dev

# Deploy to the cloud
agentuity deploy
```

Everything you need to build and ship full-stack AI agents:
Build
- TypeScript-first: Type-safe schemas, autocomplete everywhere, powered by Bun
- Frontend: Deploy web apps alongside your agents with built-in hooks
- Framework-agnostic: Use any AI library (e.g. Vercel AI SDK, Mastra) or bring your own
- Multi-agent: Coordinate agents with type-safe calls between them
- Offline-ready: Start building immediately, no account required
- Agent-friendly CLI: `--json` output, `--explain` previews, `--dry-run` validation, schema discovery
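Multi-agent coordination follows the same pattern as the `chat.run(...)` call in the API route above: one agent's handler awaits another agent's typed `run`. A self-contained sketch of that pattern, using a stubbed agent in place of `createAgent` (the stub and the `summarize`/`researchHandler` names are illustrative, not part of the SDK):

```typescript
// Minimal stand-in for an agent exposing a typed run(), mirroring chat.run().
type Agent<I, O> = { run: (input: I) => Promise<O> };

// Stubbed "summarize" agent; a real project would build this with createAgent.
const summarize: Agent<{ text: string }, { summary: string }> = {
  run: async ({ text }) => ({ summary: text.slice(0, 20) }),
};

// A coordinating agent's handler can await another agent's typed run().
// Input and output types are checked end to end, so a shape mismatch
// between agents is a compile-time error rather than a runtime surprise.
async function researchHandler(input: { topic: string }) {
  const notes = `Notes about ${input.topic} ...`;
  const { summary } = await summarize.run({ text: notes });
  return { report: summary };
}
```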
Connect
- AI Gateway: Access all major LLM providers (e.g. OpenAI, Anthropic, Google) with just one API key
- Infrastructure as code: HTTP, cron, email, SMS, WebSocket, and SSE. Rollback-friendly deployments.
- Storage: Key-value, vector, object, durable streams. BYO supported.
- Custom domains: Automated SSL certificates and DNS management
- Deploy anywhere: Public cloud, private cloud, on-prem, or edge
Monitor
- Observability: OpenTelemetry tracing, structured logging, real-time analytics
- Evaluations: Automated quality checks after each agent run
- Container access: SSH and SCP into running deployments
- Documentation: Guides, examples, and reference
- Discord: Join 200+ developers to chat, ask for help, and share what you're building
Our SDK is open source. Contributions welcome! See the repo for guidelines.