CLI tool for exploring AI agent chat data.
Install:

```sh
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync --all-extras --dev
uv pip install git+https://github.com/arclabs561/multiscale.git
uv pip install -e ../llm-helpers
```

Optional: rank-fusion and rank-refine (not on PyPI):

```sh
uv pip install -e ../rank-fusion/rank-fusion-python
uv pip install -e ../rank-refine/rank-refine-python
```

Quick start:

```sh
uv run ae chats
uv run ae convo <composer_id>
uv run ae find-solution "query"
uv run ae remember "query"
uv run ae ensure-indexed
```

Basic:
- `info` - Database info
- `chats` - List conversations
- `convo <id>` - Show conversation
- `keys [--prefix] [--like]` - List keys
- `show <key>` - Show key value
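
A minimal inspection pass might look like the sketch below; `<prefix>` and `<key>` are placeholders for values returned by `keys`, not literal arguments.

```sh
# Show database metadata
uv run ae info

# List stored keys matching a prefix (substitute a prefix from your database)
uv run ae keys --prefix <prefix>

# Dump the value stored under one of the keys listed above
uv run ae show <key>
```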
Search:
- `find-solution <query>` - Search conversation history
- `remember <query>` - Recall with optional LLM summarization
- `ensure-indexed` - Build indexes
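
A typical search pass builds the indexes first and then queries them; the query strings below are only examples, and LLM summarization in `remember` needs `OPENAI_API_KEY` set.

```sh
# Build (or refresh) the indexes; indexing is idempotent, so re-running is safe
uv run ae ensure-indexed

# Search conversation history
uv run ae find-solution "sqlite database is locked"

# Recall with optional LLM summarization
uv run ae remember "what did we decide about index caching"
```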
Analysis:
- `pairs <id>` - Extract QA pairs
- `multiscale` - Hierarchical summarization
- `design-coherence` - Organize design plans
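
For example, to extract QA pairs from one conversation (the `<id>` is a placeholder taken from the `chats` listing):

```sh
# Find a conversation id
uv run ae chats

# Extract question-answer pairs for that conversation
uv run ae pairs <id>
```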
See `uv run agent-explorer --help` for all commands.
Environment variables:

- `AGENT_TYPE` - Agent type (default: `cursor`)
- `AGENT_STATE_DB` / `CURSOR_STATE_DB` - Database path
- `AGENT_INDEX_JSONL` / `CURSOR_INDEX_JSONL` - Index path (default: `./cursor_index.jsonl`)
- `AGENT_VEC_DB` / `CURSOR_VEC_DB` - Vector DB path (default: `./cursor_vec.db`)
- `OPENAI_API_KEY` - Required for LLM features
- `OPENAI_MODEL` - Model (default: `gpt-4o-mini`)
- `OPENAI_EMBED_MODEL` - Embedding model (default: `text-embedding-3-small`)
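
A sketch of one possible setup; the database path and API key are placeholders, and the model names simply restate the defaults.

```sh
export AGENT_TYPE=cursor
export CURSOR_STATE_DB=/path/to/state.vscdb        # placeholder; point at the agent's state database
export CURSOR_INDEX_JSONL=./cursor_index.jsonl
export CURSOR_VEC_DB=./cursor_vec.db
export OPENAI_API_KEY=sk-...                       # required for LLM features
export OPENAI_MODEL=gpt-4o-mini
export OPENAI_EMBED_MODEL=text-embedding-3-small
```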
Features:

- Agent-agnostic design (Cursor implemented, extensible to Cline/Aider)
- Multi-source search (vector + sparse, rank-fusion optional)
- Conversation recall (LLM summarization optional)
- Key-value access to agent databases
- Hierarchical summarization
- Idempotent indexing with caching
- Read-only database access
Notes:

- Commands output JSON (pipe to `jq` for formatting)
- Close the agent before accessing its database
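
For example, pretty-printing works with any command's output; the `jq` filter here is the identity and assumes nothing about field names.

```sh
uv run ae chats | jq .
```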