Interactive LLM REPL - Navigate the vast ocean of AI conversations
Named after the Norse god of the sea and sailors, Njord guides you through that ocean with a powerful terminal-based interface for multiple AI providers.
- OpenAI: Latest models including o3-pro, o3, o4-mini, gpt-4.1 series with reasoning model support
- Anthropic: Claude 4 and 3.x models (Sonnet, Opus, Haiku) with thinking mode support
- Google Gemini: Gemini 2.5 Pro, Flash, and Flash Lite models
- Smart Model Detection: Automatic provider switching based on model selection
- Real-time Streaming: Live response streaming with proper interruption handling
- Thinking Mode: See AI reasoning process for supported Anthropic models
- Multi-line Input: Triple-backtick code blocks for complex prompts
- Smart Interruption: Ctrl-C handling with message queuing and retry logic
- Tab Completion: Intelligent command and parameter completion with hints
- Auto-saving: Sessions automatically saved when they contain AI interactions
- Session Operations: Save, load, fork, merge, and continue sessions
- Safe Loading: Load copies of sessions without modifying originals
- Recent Sessions: Quick access to recently used conversations
- Session Search: Full-text search across all saved sessions with highlighted excerpts
- Session Summarization: Generate AI-powered summaries of conversations for quick review
- Automatic Extraction: Code blocks automatically detected and numbered
- Universal Clipboard: Copy to system clipboard + OSC52 for SSH/terminal compatibility
- File Operations: Save code blocks directly to files
- Safe Execution: Execute bash, Python, and JavaScript with confirmation prompts
- Language Support: Syntax detection for multiple programming languages
- Colored Output: Syntax highlighting for code blocks and role-based message coloring
- Message History: Navigate conversation history with timestamps and metadata
- Command System: Comprehensive slash commands for all operations
- Input History: Arrow key navigation through previous inputs
- Status Display: Current model, provider, and configuration at startup
You'll need at least one API key from the supported providers:
- OpenAI: Get your API key from OpenAI Platform
- Anthropic: Get your API key from Anthropic Console
- Google Gemini: Get your API key from Google AI Studio
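If you are unsure which of these keys are already configured in your shell, a quick check might look like the following (a convenience sketch, not part of Njord itself; assumes bash):

```bash
# Report which provider keys are present in the current environment (bash indirect expansion)
for key in OPENAI_API_KEY ANTHROPIC_API_KEY GEMINI_API_KEY; do
  if [ -n "${!key}" ]; then
    echo "$key is set"
  else
    echo "$key is missing"
  fi
done
```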
Download the latest release from the releases page.
# Clone the repository
git clone https://github.com/yourusername/njord.git
cd njord
# Build the project
cargo build --release
# The binary will be at target/release/njord
If you have Rust/Cargo installed, you can build and install directly to your PATH:
# Clone and install in one step
git clone https://github.com/yourusername/njord.git
cd njord
cargo install --path .
# Or for a static binary (requires musl tools on Linux)
rustup target add x86_64-unknown-linux-musl
cargo install --path . --target x86_64-unknown-linux-musl
# Now njord should be available in your PATH
njord --help
Set your API keys as environment variables:
# Set at least one API key
export OPENAI_API_KEY="your-openai-key-here"
export ANTHROPIC_API_KEY="your-anthropic-key-here"
export GEMINI_API_KEY="your-gemini-key-here"
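These exports only last for the current shell session. One way to persist them is to append the exports to your shell profile (`~/.bashrc` is an assumption; adjust the file for your shell):

```bash
# Persist a key across shell sessions (example for bash; zsh users would use ~/.zshrc)
echo 'export OPENAI_API_KEY="your-openai-key-here"' >> ~/.bashrc
source ~/.bashrc
```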
Start Njord:
./njord
Or with command-line options:
# Start with a specific model
./njord --model gpt-4
# Start with custom temperature
./njord --temperature 0.9
# Load a saved session
./njord --load-session "my-session"
# Start fresh session
./njord --new-session
Model commands:
- `/models` - List all available models across providers
- `/model MODEL` - Switch to any model (auto-detects provider)
- `/status` - Show current provider, model, and configuration
Session commands:
- `/chat new` - Start fresh session
- `/chat save NAME` - Save current session
- `/chat load NAME` - Load safe copy of session
- `/chat continue [NAME]` - Resume most recent or named session
- `/chat fork NAME` - Save current session and start fresh
- `/chat merge NAME` - Merge another session into current
- `/chat list` - List all saved sessions with metadata
- `/chat recent` - Show recently used sessions
- `/chat delete NAME` - Delete saved session
- `/chat rename NEW_NAME [OLD_NAME]` - Rename sessions
- `/chat auto-rename [NAME]` - Auto-generate session titles using LLM
- `/chat auto-rename-all` - Bulk auto-rename all anonymous sessions
History commands:
- `/history` - Show full conversation with timestamps
- `/undo [N]` - Remove last N messages (default 1)
- `/goto N` - Jump to message N, removing later messages
- `/search TERM` - Search across all sessions with highlighted results
- `/summarize [NAME]` - Generate AI summary of session (defaults to current)
Code block commands:
- `/blocks` - List all code blocks in current session
- `/block N` - Display specific code block with syntax highlighting
- `/copy N` - Copy code block to clipboard (system + OSC52)
- `/save N FILENAME` - Save code block to file
- `/exec N` - Execute code block with safety confirmation
Configuration commands:
- `/system [PROMPT]` - Set/view/clear system prompt
- `/temp VALUE` - Set temperature (0.0-2.0, model-dependent)
- `/max-tokens N` - Set maximum response tokens
- `/thinking on|off` - Enable/disable thinking mode (Anthropic models)
- `/thinking-budget N` - Set thinking token budget
General commands:
- `/help` - Show all commands
- `/clear` - Clear terminal screen
- `/quit` - Exit Njord
- Multi-line Input: Start with `` ``` `` and end with `` ``` `` on its own line (see the example after this list)
- Smart Interruption: Use Ctrl-C to cancel requests - messages are queued for retry
- Tab Completion: Press Tab for command completion with helpful hints
- Universal Clipboard: `/copy` works in SSH sessions and all terminal types
- Session Safety: `/chat load` creates copies, originals remain unchanged
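For example, a multi-line prompt is entered by opening and closing the block with triple backticks; the prompt content below is purely illustrative:

````
```
Explain what this shell loop does and how to make it safer:
for f in *.log; do
  gzip "$f"
done
```
````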
- Rust 1.70+ (install from rustup.rs)
- Git
# Clone the repository
git clone https://github.com/yourusername/njord.git
cd njord
# Build in development mode
cargo build
# Build optimized release
cargo build --release
# Run directly with cargo
cargo run -- --help
For deployment or distribution, you can build a statically-linked binary with zero runtime dependencies:
# Install musl tools
sudo apt update
sudo apt install musl-tools musl-dev
# Add musl target
rustup target add x86_64-unknown-linux-musl
# Build static binary
cargo build --release --target x86_64-unknown-linux-musl
# Binary will be at target/x86_64-unknown-linux-musl/release/njord
# macOS (default build needs no extra target)
cargo build --release
# Windows
cargo build --release --target x86_64-pc-windows-msvc
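To confirm that the Linux musl binary above is fully static, standard tools like `file` and `ldd` can be used (a quick sanity check, not part of the project's own tooling):

```bash
# Inspect the musl build produced above
file target/x86_64-unknown-linux-musl/release/njord   # typically reports "statically linked"
ldd target/x86_64-unknown-linux-musl/release/njord    # typically reports "not a dynamic executable"
```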
# Run all tests
cargo test
# Run tests with output
cargo test -- --nocapture
# Run specific test
cargo test test_name
src/
├── main.rs # Application entry point
├── cli.rs # Command-line argument parsing
├── config.rs # Configuration management
├── repl.rs # Main REPL loop and logic
├── ui.rs # User interface and terminal handling
├── commands.rs # Command parsing and execution
├── session.rs # Chat session management
├── history.rs # Session persistence
└── providers/ # LLM provider implementations
├── mod.rs # Provider trait and factory
├── openai.rs # OpenAI API integration
├── anthropic.rs # Anthropic API integration
└── gemini.rs # Google Gemini API integration
# API Keys (at least one required)
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GEMINI_API_KEY="your-gemini-key"
# Optional defaults
export NJORD_DEFAULT_MODEL="gpt-4"
export NJORD_DEFAULT_TEMPERATURE="0.7"
Run `njord --help` for the full list of command-line options.
OpenAI models:
- `o3-pro` (latest reasoning model)
- `o3` (reasoning model)
- `o4-mini` (fast reasoning model)
- `o3-mini` (compact reasoning model)
- `o1-pro` (reasoning model)
- `o1` (reasoning model)
- `gpt-4.1` (latest chat model)
- `gpt-4o`
- `gpt-4.1-mini`
- `gpt-4o-mini`
- `gpt-4.1-nano`
Anthropic models:
- `claude-sonnet-4-20250514` (latest, supports thinking)
- `claude-opus-4-20250514` (supports thinking)
- `claude-3-7-sonnet-20250219` (supports thinking)
- `claude-3-5-sonnet-20241022`
- `claude-3-5-haiku-20241022`
- `claude-3-5-sonnet-20240620`
Google Gemini models:
- `gemini-2.5-pro`
- `gemini-2.5-flash`
- `gemini-2.5-flash-lite`
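Any of the model names above can be passed to the `--model` flag shown earlier, and the provider is detected automatically. A few illustrative invocations, assuming the documented flags can be combined (values and session name are examples only):

```bash
# OpenAI reasoning model
./njord --model o4-mini
# Anthropic model with a lower temperature
./njord --model claude-sonnet-4-20250514 --temperature 0.3
# Gemini model, resuming a previously saved session
./njord --model gemini-2.5-flash --load-session "my-session"
```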
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests if applicable (see the check commands after this list)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
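Before opening a pull request, a typical local check sequence for a Rust project looks like the following; `cargo fmt` and `cargo clippy` are standard Cargo tooling (clippy usually ships with rustup), though the project itself only documents `cargo test`:

```bash
cargo fmt      # format the code
cargo clippy   # lint for common mistakes
cargo test     # run the test suite
```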
This project is licensed under the ISC License - see the LICENSE file for details.
- Named after Njörðr, the Norse god associated with the sea, seafaring, wind, fishing, wealth, and crop fertility
- Built with Rust for performance and safety
- Uses rustls for pure Rust TLS implementation
- Terminal interface powered by rustyline
- Universal clipboard support via arboard and OSC52 escape sequences
- Developed with Aider - the entire project was collaboratively built using Aider and Claude-4-Sonnet
- Enhanced Retry Logic: Exponential backoff retry system with 5 attempts (0.5s to 16s backoff)
- Improved Test Isolation: Environment variable handling refactored for reliable testing
- Docker Support: Multi-stage Dockerfile with cargo-chef for efficient builds and static linking
- Rust Toolchain: Pinned to Rust 1.85.1 for consistent builds across environments
- Complete Code Management System: Extract, view, copy, save, and execute code blocks
- Universal Clipboard Integration: Works in SSH sessions and all terminal environments
- Advanced Session Operations: Fork, merge, continue, and safe loading of sessions
- Full-Text Search: Search across all sessions with intelligent excerpt highlighting
- Session Summarization: AI-powered summaries for quick conversation review
- Auto-Renaming System: LLM-generated session titles with bulk processing support
- Enhanced Tab Completion: Smart command completion with helpful hints
- Thinking Mode Support: See AI reasoning process for supported Anthropic models
- Robust Interruption Handling: Ctrl-C with message queuing and retry logic
- Professional Terminal UI: Syntax highlighting, colored output, and status displays
See ROADMAP.md for planned features and development phases.