deja
Shared memory for agents — from anywhere

With deja, what one agent learns, they all know.

CI learns it. Your chat agent already knows it. Open source, self-hosted recall that any server, client, or script can shape.

npm i deja-client
await mem.learn("deploy failed", "check wrangler.toml")
await mem.inject("deploying") // → "check wrangler.toml"
Three ways to use deja
Library

Thin wrapper over HTTP. Two functions.

import deja from 'deja-client'

const mem = deja(DEJA_URL)
await mem.learn("trigger", "learning")
await mem.inject("context")
REST API

HTTP endpoints. Use from anywhere.

POST /learn
POST /inject
POST /query
GET  /learnings
GET  /stats
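All five routes speak JSON. Here's a minimal sketch of calling `/learn` directly with `fetch`: the path, the `trigger`/`learning` fields, and the Bearer token come from this page; the response shape and error handling are assumptions.

```typescript
// Sketch of a direct REST call to deja. The /learn path and the
// trigger/learning fields match the client examples; the response
// shape is an assumption, not a documented contract.
type FetchLike = (url: string, init?: RequestInit) => Promise<{
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}>;

async function learn(
  baseUrl: string,
  apiKey: string,
  trigger: string,
  learning: string,
  fetchImpl: FetchLike = fetch,
): Promise<unknown> {
  const res = await fetchImpl(`${baseUrl}/learn`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // all routes expect a token by default
    },
    body: JSON.stringify({ trigger, learning }),
  });
  if (!res.ok) throw new Error(`learn failed: ${res.status}`);
  return res.json();
}
```

Reading presumably works the same way: swap `/learn` for `/inject` and send the context you're about to work in.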
MCP

Native protocol. Claude, Cursor, etc.

// claude_desktop_config.json
{
  "mcpServers": {
    "deja": { "url": "..." }
  }
}
Use cases

Incident response

Capture the postmortem as learnings, inject them before the next on-call handoff.

Agent onboarding

Give fresh agents the muscle memory of your best runs without flooding them with logs.

Long-running workflows

Stitch multi-day work into a single arc. Deja remembers outcomes, not noise.

Tool reliability

Teach agents the traps: flaky endpoints, brittle migrations, and safe retries.

Ops playbooks

Store the short form. Inject it when the runbook needs to be alive.

Product memory

Let agents remember the why, not just the what. Keep decisions tethered to the context that produced them.

One memory, everywhere

deja is just HTTP. Anything that can make a request can learn and recall. CI/CD pipelines, local chat agents, Slack bots, cron jobs, CLI scripts — all writing to and reading from the same memory.

CI/CD learns
# GitHub Action on deploy failure
curl -X POST $DEJA_URL/learn \
-d '{"trigger": "deploy",
     "learning": "node 20 breaks esbuild"}'
Chat agent recalls
// Claude, 30 seconds later
await mem.inject("deploying")
// → "node 20 breaks esbuild"
Cron maintains
# Nightly cleanup
curl $DEJA_URL/stats
# stale memories auto-expire

CI learns it. Your chat agent knows it. Same memory, different doors.

Your context window is precious
.learn(): post-run. Capture what worked and what didn't.
vector: semantic. Indexed by meaning, not keywords.
.inject(): on-demand. Recall what's relevant right now.

Don't carry everything. Ask for what's relevant right now.
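That on-demand pattern is easy to sketch: recall first, then fold only the hits into the prompt. The prompt layout and helper name below are illustrative, not part of deja's API.

```typescript
// Sketch: pull only relevant memories into the prompt instead of
// carrying full history. The layout here is ours, not deja's.
function withMemories(task: string, memories: string[]): string {
  if (memories.length === 0) return task; // nothing relevant, nothing injected
  const recalled = memories.map((m) => `- ${m}`).join("\n");
  return `Relevant past learnings:\n${recalled}\n\nTask: ${task}`;
}

// Usage, assuming a deja-client instance:
// const prompt = withMemories("deploy the worker", await mem.inject("deploying"));
```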

Under the hood
compute: Cloudflare Workers
isolation: Durable Objects
vectors: Vectorize + Workers AI
storage: SQLite + Drizzle
auth: API key + scoped access
Secure by default
Your infrastructure

Deploys to your Cloudflare account. No data leaves your Workers.

Per-tenant isolation

Each API key gets its own Durable Object. No cross-tenant leakage.
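In Workers terms, that mapping can be as small as deriving the object id from the key itself. `idFromName` is the real Durable Objects API; the interface stub below just stands in for Cloudflare's binding so the sketch is self-contained.

```typescript
// Sketch: one Durable Object per API key. idFromName is deterministic,
// so a key always routes to the same object and never to another
// tenant's. Namespace stands in for Cloudflare's DurableObjectNamespace.
interface Namespace {
  idFromName(name: string): { toString(): string };
}

function tenantObjectId(ns: Namespace, apiKey: string) {
  return ns.idFromName(apiKey);
}
```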

Auth-first routing

All routes require Bearer tokens by default. Explicitly opt in to public access.

Scoped access

Shared, agent-specific, and session-scoped memories. Least privilege by design.

Featured Prompt

Basic System Prompt

A ready-to-paste system prompt that gives any agent full deja capabilities — learn, inject, and query via the REST API or deja-client.

You have access to persistent memory via deja, a shared memory service. Use it to store learnings and recall relevant context.

## Memory Operations

### Learn — store a memory after completing a task or encountering something noteworthy
```
POST {DEJA_URL}/learn
{
  "trigger": "when this situation ...
```
Honest answers

"Why not just use a database?"

You could. But then you're building semantic search, scoping, confidence decay, and cleanup yourself. deja is that layer, already wired to Cloudflare's vector and AI stack.

"Another AI tool?"

It's one Worker and two core endpoints. No SDK required. curl works. The whole thing is ~500 lines of TypeScript.

"Vendor lock-in?"

Your Cloudflare account, your data, MIT license. It's a single Worker with a Durable Object and a Vectorize index. Fork it if we disappear.

"Will this actually get maintained?"

It's the tool we use daily. But it's also open source — so it doesn't depend on us. The architecture is intentionally simple enough to understand in an afternoon.

One agent learns it. The next one already knows.

Every lesson, every workaround, every hard-won fix — already waiting in context before the first token lands.