Nova is a personal‑AI workspace that puts privacy first.
Quickstart on your computer (with Docker):

```shell
git clone https://github.com/AMairesse/Nova.git
cd Nova/docker
cp .env.example .env
docker compose up -d
```

The default username is `admin` and the default password is `changeme`.
Then you can create your first agent and start playing with it: see How to configure agents.
Optionally, you can run Nova with llama.cpp included. If you want llama.cpp as a default system provider available to all users, add the `docker-compose.add-llamacpp.yml` file:

```shell
docker compose -f docker-compose.yml -f docker-compose.add-llamacpp.yml up -d
```

See docker/README.md for more details.
Instead of sending every prompt to a remote model, Nova lets you decide – transparently and at run‑time – whether an agent should reason with a local LLM running on your own machine or delegate to a cloud model only when extra horsepower is really needed. The result is a flexible “best of both worlds” setup that keeps sensitive data on‑prem while still giving you access to state‑of‑the‑art capabilities when you want them.
- Agent‑centric workflow – Create smart assistants (agents) and equip them with “tools” that can be simple Python helpers, calendar utilities, HTTP/APIs or even other agents. Agents can chain or delegate work to one another, allowing complex reasoning paths.
- Bring‑your‑own models – Connect to OpenAI (or compatible providers like openrouter.ai) or Mistral if the task is public, but switch to local back‑ends such as Ollama, llama.cpp or LM Studio for anything confidential. Each provider is configured once and can be reused by multiple agents.
- Privacy by design – API keys and tokens are stored encrypted; only the minimal data required for a given call ever leaves your machine.
- Built‑in tools – Nova comes with a bunch of “built‑in” tools for common tasks, like CalDav calendar queries, web surfing, date management and more to come!
- Pluggable tools – Besides built‑in utilities, Nova can talk to external micro‑services through the open MCP protocol or any REST endpoint, so your agents keep growing with your needs.
- Human‑in‑the‑loop UI – A lightweight web interface lets you chat with agents, watch their progress in real time, and manage providers / agents / tools without touching code.
- Asynchronous calls – You can safely invoke agents from the UI, and they will run in the background so you can do other things at the same time.
- API available – You can easily ask a question to your default agent using the API.
In short, Nova aims to make “agents with autonomy, privacy and extensibility” a reality for everyday users – giving you powerful automation while keeping your data yours.
- ✅ Tool‑aware agents: Agents can invoke built‑in tools, remote REST/MCP services or even other agents.
- ✅ Local‑first LLM routing: Decide per‑agent which provider to use: OpenAI, Mistral, Ollama, LM Studio or any future backend. Local models are preferred for sensitive data; the switch is transparent for you.
- ✅ Live streaming: Tool calls and sub‑agent calls appear in real time so you can follow what happens under the hood; the agent's response is then streamed.
- ✅ Plug‑and‑play MCP client: Connect to any Model Context Protocol server, cache its tool catalogue and call remote tools with automatic input validation.
- ✅ Multilingual & i18n‑ready: All UI strings use Django translations (currently English only).
- ✅ Extensible by design: Drop in a Python module exposing a `get_functions()` map and it instantly becomes a multi‑function “built‑in” tool.
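As an illustration, a built‑in tool module might look like the sketch below. The exact shape of the map returned by `get_functions()` is an assumption based on the description above, not Nova's documented schema; the `days_until` helper is a hypothetical example.

```python
# Hypothetical Nova built-in tool module (assumed get_functions() schema).
from datetime import date


def days_until(iso_date: str) -> int:
    """Return the number of days from today until iso_date (YYYY-MM-DD)."""
    target = date.fromisoformat(iso_date)
    return (target - date.today()).days


def get_functions():
    # Map tool names to a callable plus a short description the agent
    # can use when deciding which tool to invoke.
    return {
        "days_until": {
            "callable": days_until,
            "description": "Days remaining until an ISO date (YYYY-MM-DD)",
        },
    }
```

Dropping such a module into `nova/tools/` would expose every entry of the map as a separate function of one built‑in tool.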
- Key Features
- Production Deployment (Docker)
- Development Setup (Docker)
- API
- Project Layout
- Roadmap
- Contributing
- License
- Acknowledgements
- Troubleshooting
This is the recommended way to run Nova. See the Docker README.md for details.
Development setup also uses Docker given the number of components involved.
See the Docker README.md for details.
A simple API is available to ask a question to your default agent.
- Get your token from the configuration screen.
- Send a POST request to the API endpoint with your question.
```shell
curl -H "Authorization: Token YOUR_TOKEN_HERE" \
     -H "Content-Type: application/json" \
     --data '{"question":"Who are you and what can you do?"}' \
     http://localhost:8080/api/ask/
```

- Method: POST
- Endpoint: `http://localhost:8080/api/ask/`
- Headers: `Authorization: Token YOUR_TOKEN_HERE` and `Content-Type: application/json`
- Request body: `{ "question": "Your question here" }`
- Response: The API returns a JSON object containing the agent's answer.

```json
{
  "question": "Who are you?",
  "answer": "I am your default agent. I can answer your questions and assist you with various tasks."
}
```

Notes:

- Replace `YOUR_TOKEN_HERE` with your actual token.
- If your token is invalid or missing, the API returns a 401 Unauthorized error.
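The same call can be made from Python using only the standard library. The endpoint, headers, and payload below come from the API description above; the function names are illustrative, not part of Nova.

```python
# Minimal Python client for Nova's /api/ask/ endpoint, equivalent to
# the curl example above. Standard library only; function names are
# illustrative, not part of Nova itself.
import json
import urllib.request


def build_request(question: str, token: str,
                  base_url: str = "http://localhost:8080") -> urllib.request.Request:
    """Build the POST request with token auth and a JSON body."""
    return urllib.request.Request(
        f"{base_url}/api/ask/",
        data=json.dumps({"question": question}).encode(),
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def ask_nova(question: str, token: str,
             base_url: str = "http://localhost:8080") -> dict:
    """Send the question and return the parsed JSON answer."""
    with urllib.request.urlopen(build_request(question, token, base_url)) as resp:
        return json.load(resp)
```

With a running instance and a valid token, `ask_nova("Who are you?", "YOUR_TOKEN_HERE")` returns the `{"question": ..., "answer": ...}` object shown above.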
```
Nova
├─ docker/           # Docker compose configuration for the project
├─ nova/
|  ├─ api/           # Minimal REST facade
|  ├─ mcp/           # Thin wrapper around FastMCP
|  ├─ migrations/    # Django model migration scripts
|  ├─ static/        # JS helpers (streaming, tool modal manager…)
|  ├─ templates/     # Django + Bootstrap 5 UI
|  ├─ tools/         # Built‑in tool modules (CalDav, agent wrapper…)
|  └─ views/         # Django views
├─ user_settings/    # Dedicated Django app for the user settings
```
- File management: add a file, receive a file as a result, file support for MCP tools, ...
- Add a scratchpad tool (acting like a memory for long tasks)
- Add a canvas tool (acting like a UI component for the agent to interact with the user)
- Better display for "thinking models"
Pull requests are welcome!
Nova is released under the MIT License – see LICENSE for details.
- Django – the rock‑solid web framework
- LangChain – agent & tool abstractions
- FastMCP – open protocol for tool servers
- Bootstrap 5 – sleek, responsive UI components
Made with ❤️ and a healthy concern for data privacy.
- Port conflicts: Ensure ports 80 (Nginx), 8000 (Daphne), and 5432 (PostgreSQL) are free. Stop conflicting services or edit `docker-compose.yml`.
- DB not ready: If the web container fails with DB errors, check the PostgreSQL logs (`docker compose logs db`). Increase healthcheck timeouts if needed.
- No superuser: Set `DJANGO_SUPERUSER_*` in `.env` and restart, or run `docker compose exec web python manage.py createsuperuser`.
- Ollama unreachable: Use `host.docker.internal` (Docker Desktop) or your machine's IP for the Base URL. Ensure Ollama runs on the host.
- Volumes lost data? Back up volumes with `docker volume ls` and tools like `docker-volume-backup`.
For more help, open an issue on GitHub.