Features 🔅
Beautiful & Ergonomic Diff Viewer: Review your code with syntax highlighting and leave comments
Smart Commit Messages: Generate conventional commit messages for your staged changes
Interactive Search: Find and explore commits using fuzzy search
Multiple AI Providers: Supports OpenAI, Claude, Groq, Ollama, and more
Flexible: Works with any git workflow
Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)
Getting Started 🔅
Prerequisites
Before you begin, ensure you have:
git installed on your system
fzf (optional) - Required for the lumen explain --list command
mdcat (optional) - Required for pretty output formatting
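To confirm these tools are available before running Lumen, a quick shell check like the one below works (a sketch assuming a POSIX shell and that the tools live on your PATH):
# Check that git and the optional helpers are installed
for tool in git fzf mdcat; do
  command -v "$tool" >/dev/null && echo "$tool: found" || echo "$tool: missing"
done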
Installation
Using Homebrew (MacOS and Linux)
brew install jnsahaj/lumen/lumen
Using Cargo
[!IMPORTANT]
cargo is the package manager for Rust and is installed automatically when you install Rust.
See the installation guide.
cargo install lumen
Configuration (for AI features)
If you want to use AI-powered features (explain, draft, list, operate), run the interactive setup:
lumen configure
This will guide you through:
Selecting an AI provider
Entering your API key (optional if using environment variable)
Specifying a custom model name (optional - press Enter to use the default)
The configuration is saved to ~/.config/lumen/lumen.config.json.
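You can inspect or hand-edit that file at any time; a full example of its fields is shown under Advanced Configuration below.
# View the saved settings (provider, model, api_key, etc.)
cat ~/.config/lumen/lumen.config.json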
[!NOTE]
The diff command works without any configuration - it's a standalone visual diff viewer.
Usage 🔅
Generate Commit Messages
Create meaningful commit messages for your staged changes:
# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"
# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"
Generate Git Commands
Ask Lumen to generate Git commands based on a natural language query:
lumen operate "squash the last 3 commits into 1 with the message 'squashed commit'"
# Output: git reset --soft HEAD~3 && git commit -m "squashed commit" [y/N]
The command will display an explanation of what the generated command does, show any warnings for potentially dangerous operations, and prompt for confirmation before execution.
Visual Diff Viewer
Launch an interactive side-by-side diff viewer in your terminal:
# View uncommitted changes
lumen diff
# View changes for a specific commit
lumen diff HEAD~1
# View changes between branches
lumen diff main..feature/A
# View changes in a GitHub Pull Request
lumen diff --pr 123 # (--pr is optional)
lumen diff https://github.com/owner/repo/pull/123
# Filter to specific files
lumen diff --file src/main.rs --file src/lib.rs
# Watch mode - auto-refresh on file changes
lumen diff --watch
# Stacked mode - review commits one by one
lumen diff main..feature --stacked
# Jump to a specific file on open
lumen diff --focus src/main.rs
Stacked Diff Mode
Review a range of commits one at a time with --stacked:
lumen diff main..feature --stacked
lumen diff HEAD~5..HEAD --stacked
This displays each commit individually, letting you navigate through them:
ctrl+h / ctrl+l: Previous / next commit
Click the ‹ / › arrows in the header
The header shows the current commit position, SHA, and message. Viewed files are tracked per commit, so your progress is preserved when navigating.
When viewing a PR, you can mark files as viewed (syncs with GitHub) using the space keybinding.
Theme Configuration
Customize the diff viewer colors with preset themes:
# Using CLI flag
lumen diff --theme dracula
# Using environment variable
LUMEN_THEME=catppuccin-mocha lumen diff
# Or set permanently in config file (~/.config/lumen/lumen.config.json)
{
"theme": "dracula"
}
Available themes:
Default (auto-detect): dark, light
Catppuccin: catppuccin-mocha, catppuccin-latte
Dracula: dracula
Nord: nord
One Dark: one-dark
Gruvbox: gruvbox-dark, gruvbox-light
Solarized: solarized-dark, solarized-light
Priority: CLI flag > config file > LUMEN_THEME env var > OS auto-detect.
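For example, if both are set, the CLI flag wins over the environment variable (a small sketch of the priority order; theme names are from the table above):
# --theme takes precedence over LUMEN_THEME, so the viewer uses dracula here
LUMEN_THEME=nord lumen diff --theme dracula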
Annotations
Add comments to hunks during code review:
i: Add/edit annotation on focused hunk
I: View all annotations (edit, delete, copy, or export)
Annotations can be copied to clipboard or exported to a file for sharing.
Keybindings
j / k or arrow keys: Navigate
{ / }: Jump between hunks
tab: Toggle sidebar
space: Mark file as viewed
e: Open file in editor
i / I: Add annotation / view all annotations
ctrl+h / ctrl+l: Previous/next commit (stacked mode)
?: Show all keybindings
Explain Changes
Understand what changed and why:
# Explain current changes in your working directory
lumen explain # All changes
lumen explain --staged # Only staged changes
# Explain specific commits
lumen explain HEAD # Latest commit
lumen explain abc123f # Specific commit
lumen explain HEAD~3..HEAD # Last 3 commits
lumen explain main..feature/A # Branch comparison
lumen explain main...feature/A # Branch comparison (merge base)
# Ask specific questions about changes
lumen explain --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"
# Interactive commit selection
lumen explain --list # Select commit interactively
Interactive Mode
# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen explain --list
# Deprecated: lumen list (use lumen explain --list instead)
Tips & Tricks
# Copy commit message to clipboard
lumen draft | pbcopy # macOS
lumen draft | xclip -selection c # Linux
# View the commit message and copy it
lumen draft | tee >(pbcopy)
# Open in your favorite editor
lumen draft | code -
# Directly commit using the generated message
lumen draft | git commit -F -
If you are using lazygit, you can add this to the user config:
customCommands:
  - key: '<c-l>'
    context: 'files'
    command: 'lumen draft | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
  - key: '<c-k>'
    context: 'files'
    command: 'lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
    prompts:
      - type: 'input'
        title: 'Context'
        key: 'Context'
AI Providers 🔅
Configure your preferred AI provider:
# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-5-mini" draft
# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-5-mini"
Supported Providers
Provider | API Key Required | Models
OpenAI (openai, default) | Yes | gpt-5.2, gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, gpt-4.1-mini, o4-mini (default: gpt-5-mini)
Claude (claude) | Yes | claude-sonnet-4-5-20250930, claude-opus-4-5-20251115, claude-haiku-4-5-20251015 (default: claude-sonnet-4-5-20250930)
Gemini (gemini) | Yes (free tier) | gemini-3-pro, gemini-3-flash-preview, gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite (default: gemini-2.5-flash)
Groq (groq) | Yes (free) | llama-3.3-70b-versatile, llama-3.1-8b-instant, meta-llama/llama-4-maverick-17b-128e-instruct, openai/gpt-oss-120b (default: llama-3.3-70b-versatile)
DeepSeek (deepseek) | Yes | deepseek-chat (V3.2), deepseek-reasoner (default: deepseek-chat)
xAI (xai) | Yes | grok-4, grok-4-mini, grok-4-mini-fast (default: grok-4-mini-fast)
OpenCode Zen (opencode-zen) | Yes | see list (default: claude-sonnet-4-5)
Ollama (ollama) | No (local) | see list (default: llama3.2)
OpenRouter (openrouter) | Yes | see list (default: anthropic/claude-sonnet-4.5)
Vercel AI Gateway (vercel) | Yes | see list (default: anthropic/claude-sonnet-4.5)
Advanced Configuration 🔅
Configuration File
Lumen supports configuration through a JSON file. You can place the configuration file in one of the following locations:
Project Root: Create a lumen.config.json file in your project's root directory.
Custom Path: Specify a custom path using the --config CLI option.
Global Configuration (Optional): Place a lumen.config.json file in your system's default configuration directory:
Linux/macOS: ~/.config/lumen/lumen.config.json
Windows: %USERPROFILE%\.config\lumen\lumen.config.json
Lumen will load configurations in the following order of priority:
CLI arguments (highest priority)
Configuration file specified by --config
Project root lumen.config.json
Global configuration file (lowest priority)
{
  "provider": "openai",
  "model": "gpt-5-mini",
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "theme": "catppuccin-mocha",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
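To point Lumen at a configuration file outside these default locations, pass the path with --config (the file name here is only an example):
# Load settings from a custom config file path (hypothetical path)
lumen --config ./configs/lumen.work.json draft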
Configuration Precedence
Options are applied in the following order (highest to lowest priority):
CLI Flags
Configuration File
Environment Variables
Default options
Example: Using different providers for different projects:
# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-5-mini"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
# Override per project using config file
{
"provider": "ollama",
"model": "llama3.2"
}
# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft
Contributors
Made with contrib.rocks.
Interested in Contributing?
Contributions are welcome! Please feel free to submit a Pull Request.