# cmdai

> 🚧 **Early Development Stage** - Architecture defined, core implementation in progress
cmdai converts natural language descriptions into safe POSIX shell commands using local LLMs. Built with Rust for blazing-fast performance, single-binary distribution, and safety-first design.
```console
$ cmdai "list all PDF files in Downloads folder larger than 10MB"
Generated command:
  find ~/Downloads -name "*.pdf" -size +10M -ls
Execute this command? (y/N) y
```
## Project Status
This project is in active early development. The architecture and module structure are in place, with implementation ongoing.
### Completed
- Core CLI structure with comprehensive argument parsing
- Modular architecture with trait-based backends
- Embedded model backend with MLX (Apple Silicon) and CPU variants
- Remote backend support (Ollama, vLLM) with automatic fallback
- Safety validation with pattern matching and risk assessment
- Configuration management with TOML support
- Interactive user confirmation flows
- Multiple output formats (JSON, YAML, Plain)
- Contract-based test structure with TDD methodology
- Multi-platform CI/CD pipeline
### In Progress
- Model downloading and caching system
- Advanced command execution engine
- Performance optimization
### Planned
- Multi-step goal completion
- Advanced context awareness
- Shell script generation
- Command history and learning
## Features (Planned & In Development)
- **Instant startup** - single binary with <100 ms cold start (target)
- **Local LLM inference** - optimized for Apple Silicon with MLX
- **Safety-first** - comprehensive command validation framework
- **Zero dependencies** - self-contained binary distribution
- **Multiple backends** - extensible backend system (MLX, vLLM, Ollama)
- **Smart caching** - Hugging Face model management
- **Cross-platform** - macOS, Linux, and Windows support
## Quick Start
### Prerequisites
- Rust 1.75+ with Cargo
- CMake (for model inference backends)
- macOS with Apple Silicon (optional, for GPU acceleration)
- Xcode (optional, for full MLX GPU support on Apple Silicon)
### Platform-Specific Setup

#### macOS (Recommended for Apple Silicon)
For complete macOS setup instructions including GPU acceleration, see macOS Setup Guide.
**Quick install:**

```bash
# Install Rust
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"

# Install CMake via Homebrew
brew install cmake

# Clone and build
git clone https://github.com/wildcard/cmdai.git
cd cmdai
cargo build --release

# Run
./target/release/cmdai "list all files"
```
**For GPU acceleration (Apple Silicon only):**

1. Install Xcode from the App Store (required for the Metal compiler)
2. Build with `cargo build --release --features embedded-mlx`
3. See the macOS Setup Guide for details

> **Note:** The default build uses a stub implementation that works immediately without Xcode. For production GPU acceleration, Xcode is required.
#### Linux

```bash
# Install Rust
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"

# Install dependencies (Ubuntu/Debian)
sudo apt-get update
sudo apt-get install cmake build-essential

# Clone and build
git clone https://github.com/wildcard/cmdai.git
cd cmdai
cargo build --release
```
#### Windows

```powershell
# Install Rust from https://rustup.rs
# Install CMake from https://cmake.org/download/

# Clone and build
git clone https://github.com/wildcard/cmdai.git
cd cmdai
cargo build --release
```
### Building from Source

```bash
# Clone the repository
git clone https://github.com/wildcard/cmdai.git
cd cmdai

# Build the project (uses CPU backend by default)
cargo build --release

# Run the CLI
./target/release/cmdai --version
```
### Development Commands

```bash
# Run tests
make test

# Format code
make fmt

# Run linter
make lint

# Build optimized binary
make build-release

# Run with debug logging
RUST_LOG=debug cargo run -- "your command"
```
## Usage

### Basic Syntax

```bash
cmdai [OPTIONS] <PROMPT>
```
### Examples

```bash
# Basic command generation
cmdai "list all files in the current directory"

# With specific shell
cmdai --shell zsh "find large files"

# JSON output for scripting
cmdai --output json "show disk usage"

# Adjust safety level
cmdai --safety permissive "clean temporary files"

# Auto-confirm dangerous commands
cmdai --confirm "remove old log files"

# Verbose mode with timing info
cmdai --verbose "search for Python files"
```
### CLI Options

| Option | Description | Status |
|---|---|---|
| `-s, --shell <SHELL>` | Target shell (bash, zsh, fish, sh, powershell, cmd) | ✅ Implemented |
| `--safety <LEVEL>` | Safety level (strict, moderate, permissive) | ✅ Implemented |
| `-o, --output <FORMAT>` | Output format (json, yaml, plain) | ✅ Implemented |
| `-y, --confirm` | Auto-confirm dangerous commands | ✅ Implemented |
| `-v, --verbose` | Enable verbose output with timing | ✅ Implemented |
| `-c, --config <FILE>` | Custom configuration file | ✅ Implemented |
| `--show-config` | Display current configuration | ✅ Implemented |
| `--auto` | Execute without confirmation | Planned |
| `--allow-dangerous` | Allow potentially dangerous commands | Planned |
### Examples (Target Functionality)

```bash
# Simple command generation
cmdai "compress all images in current directory"

# With specific backend
cmdai --backend mlx "find large log files"

# Verbose mode for debugging
cmdai --verbose "show disk usage"
```
## Architecture

### Module Structure
```text
cmdai/
├── src/
│   ├── main.rs          # CLI entry point
│   ├── backends/        # LLM backend implementations
│   │   ├── mod.rs       # Backend trait definition
│   │   ├── mlx.rs       # Apple Silicon MLX backend
│   │   ├── vllm.rs      # vLLM remote backend
│   │   └── ollama.rs    # Ollama local backend
│   ├── safety/          # Command validation
│   │   └── mod.rs       # Safety validator
│   ├── cache/           # Model caching
│   ├── config/          # Configuration management
│   ├── cli/             # CLI interface
│   ├── models/          # Data models
│   └── execution/       # Command execution
├── tests/               # Contract-based tests
└── specs/               # Project specifications
```
### Core Components
- **`CommandGenerator` trait** - unified interface for all LLM backends
- **`SafetyValidator`** - command validation and risk assessment
- **Backend system** - extensible architecture for multiple inference engines
- **Cache manager** - Hugging Face model management (planned)
### Backend Architecture
```rust
#[async_trait]
trait CommandGenerator {
    async fn generate_command(&self, request: &CommandRequest)
        -> Result<GeneratedCommand, GeneratorError>;

    async fn is_available(&self) -> bool;

    fn backend_info(&self) -> BackendInfo;
}
```
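To make the contract concrete, here is a simplified, synchronous sketch of a backend implementing this trait. The real trait is async (via `async_trait`), and the request/response types shown here are illustrative placeholders, not cmdai's actual definitions:

```rust
// Simplified, synchronous sketch of the backend contract.
// The real cmdai trait is async; these types are placeholders.

#[derive(Debug)]
struct CommandRequest {
    prompt: String,
}

#[derive(Debug)]
struct GeneratedCommand {
    command: String,
    explanation: String,
}

#[derive(Debug)]
struct BackendInfo {
    name: &'static str,
}

trait CommandGenerator {
    fn generate_command(&self, request: &CommandRequest) -> Result<GeneratedCommand, String>;
    fn is_available(&self) -> bool;
    fn backend_info(&self) -> BackendInfo;
}

/// A trivial mock backend, the kind of implementation used when
/// testing code that only depends on the trait.
struct MockBackend;

impl CommandGenerator for MockBackend {
    fn generate_command(&self, request: &CommandRequest) -> Result<GeneratedCommand, String> {
        Ok(GeneratedCommand {
            command: "ls -la".to_string(),
            explanation: format!("mock response for: {}", request.prompt),
        })
    }

    fn is_available(&self) -> bool {
        true
    }

    fn backend_info(&self) -> BackendInfo {
        BackendInfo { name: "mock" }
    }
}

fn main() {
    let backend = MockBackend;
    let request = CommandRequest { prompt: "list all files".to_string() };
    let result = backend.generate_command(&request).unwrap();
    println!("{}", result.command); // prints "ls -la"
}
```

Because the CLI only depends on the trait, swapping MLX, Ollama, or vLLM backends requires no changes to calling code.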
## Development

### Prerequisites
- Rust 1.75+
- Cargo
- Make (optional, for convenience commands)
- Docker (optional, for development container)
### Setup Development Environment

```bash
# Clone and enter the project
git clone https://github.com/wildcard/cmdai.git
cd cmdai

# Install dependencies and build
cargo build

# Run tests
cargo test

# Check formatting
cargo fmt -- --check

# Run clippy linter
cargo clippy -- -D warnings
```
### Backend Configuration
cmdai supports multiple inference backends with automatic fallback:
#### Embedded Backend (Default)
- MLX: Optimized for Apple Silicon Macs (M1/M2/M3)
- CPU: Cross-platform fallback using Candle framework
- Model: Qwen2.5-Coder-1.5B-Instruct (quantized)
- No external dependencies required
#### Remote Backends (Optional)

Configure in `~/.config/cmdai/config.toml`:
```toml
[backend]
primary = "embedded"  # or "ollama", "vllm"
enable_fallback = true

[backend.ollama]
base_url = "http://localhost:11434"
model_name = "codellama:7b"

[backend.vllm]
base_url = "http://localhost:8000"
model_name = "codellama/CodeLlama-7b-hf"
api_key = "optional-api-key"
```
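The fallback behavior that `enable_fallback` enables can be sketched as "try each configured backend in order, use the first that reports itself available". The following is an illustrative std-only sketch; the trait, type, and function names are assumptions, not cmdai's actual API:

```rust
// Illustrative sketch of backend fallback: probe each configured
// backend in priority order and select the first available one.
// Names and types are assumptions, not cmdai's actual API.

trait Backend {
    fn name(&self) -> &'static str;
    fn is_available(&self) -> bool;
}

struct Embedded;
struct Ollama;

impl Backend for Embedded {
    fn name(&self) -> &'static str { "embedded" }
    fn is_available(&self) -> bool { true } // always present: it is compiled in
}

impl Backend for Ollama {
    fn name(&self) -> &'static str { "ollama" }
    fn is_available(&self) -> bool { false } // e.g. local server not running
}

/// Return the first available backend, or None if all are down.
fn select_backend(backends: &[Box<dyn Backend>]) -> Option<&dyn Backend> {
    backends.iter().map(|b| b.as_ref()).find(|b| b.is_available())
}

fn main() {
    // Primary first, then fallbacks, mirroring `primary` + `enable_fallback`.
    let backends: Vec<Box<dyn Backend>> = vec![Box::new(Ollama), Box::new(Embedded)];
    if let Some(b) = select_backend(&backends) {
        println!("using backend: {}", b.name());
    }
}
```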
### Project Configuration
The project uses several configuration files:
- `Cargo.toml` - Rust dependencies and build configuration
- `~/.config/cmdai/config.toml` - user configuration
- `clippy.toml` - linter rules
- `rustfmt.toml` - code formatting rules
- `deny.toml` - dependency audit configuration
### Testing Strategy
The project uses contract-based testing:
- Unit tests for individual components
- Integration tests for backend implementations
- Contract tests to ensure trait compliance
- Property-based testing for safety validation
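The idea behind contract tests is to write the trait's expectations once and run them against every backend. A hedged sketch, using a simplified synchronous trait and placeholder names rather than cmdai's actual test code:

```rust
// Sketch of a contract test: one generic check, run against any
// backend implementation. Names are placeholders, not cmdai's
// actual test code.

trait CommandGenerator {
    fn generate_command(&self, prompt: &str) -> Result<String, String>;
    fn backend_name(&self) -> &'static str;
}

struct StubBackend;

impl CommandGenerator for StubBackend {
    fn generate_command(&self, _prompt: &str) -> Result<String, String> {
        Ok("echo hello".to_string())
    }

    fn backend_name(&self) -> &'static str { "stub" }
}

/// Contract shared by every backend: a successful generation returns
/// a non-empty command, and the backend identifies itself by name.
fn check_contract<G: CommandGenerator>(generator: &G) {
    let cmd = generator.generate_command("say hello").expect("generation failed");
    assert!(!cmd.is_empty(), "generated command must not be empty");
    assert!(!generator.backend_name().is_empty(), "backend must have a name");
}

fn main() {
    // The same check would be invoked for each real backend in turn.
    check_contract(&StubBackend);
    println!("contract ok");
}
```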
## Safety Features
cmdai includes comprehensive safety validation to prevent dangerous operations:
### Implemented Safety Checks

- ✅ System destruction patterns (`rm -rf /`, `rm -rf ~`)
- ✅ Fork bomb detection (`:(){:|:&};:`)
- ✅ Disk operations (`mkfs`, `dd if=/dev/zero`)
- ✅ Privilege escalation detection (`sudo su`, `chmod 777 /`)
- ✅ Critical path protection (`/bin`, `/usr`, `/etc`)
- ✅ Command validation and sanitization
### Risk Levels
- **Safe** (green) - normal operations, no confirmation needed
- **Moderate** (yellow) - requires user confirmation in strict mode
- **High** (orange) - requires confirmation in moderate mode
- **Critical** (red) - blocked in strict mode, requires explicit confirmation
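These levels map naturally onto an ordered enum, with the confirmation decision depending on the configured safety level. A hedged sketch of that mapping (the enum and function names are assumptions, not cmdai's actual types):

```rust
// Illustrative sketch: risk levels as an ordered enum, and the
// confirmation decision as a function of the configured safety level.
// Names are assumptions, not cmdai's actual types.

#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum RiskLevel {
    Safe,
    Moderate,
    High,
    Critical,
}

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum SafetyLevel {
    Strict,
    Moderate,
    Permissive,
}

/// Whether a command at `risk` needs user confirmation under `safety`.
/// Stricter safety levels confirm at lower risk thresholds.
fn needs_confirmation(risk: RiskLevel, safety: SafetyLevel) -> bool {
    match safety {
        SafetyLevel::Strict => risk >= RiskLevel::Moderate,
        SafetyLevel::Moderate => risk >= RiskLevel::High,
        SafetyLevel::Permissive => risk >= RiskLevel::Critical,
    }
}

fn main() {
    assert!(!needs_confirmation(RiskLevel::Safe, SafetyLevel::Strict));
    assert!(needs_confirmation(RiskLevel::Moderate, SafetyLevel::Strict));
    assert!(!needs_confirmation(RiskLevel::Moderate, SafetyLevel::Moderate));
    println!("risk-level sketch ok");
}
```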
### Safety Configuration

Configure safety levels in `~/.config/cmdai/config.toml`:
```toml
[safety]
enabled = true
level = "moderate"  # strict, moderate, or permissive
require_confirmation = true
custom_patterns = ["additional", "dangerous", "patterns"]
```
## Contributing
We welcome contributions! This is an early-stage project with many opportunities to contribute.
### Areas for Contribution

- Backend implementations
- Safety pattern definitions
- Test coverage expansion
- Documentation improvements
- Bug fixes and optimizations
### Getting Started

1. Fork the repository
2. Create a feature branch
3. Make your changes with tests
4. Ensure all tests pass
5. Submit a pull request
### Development Guidelines

- Follow Rust best practices
- Add tests for new functionality
- Update documentation as needed
- Use conventional commit messages
- Run `make check` before submitting
## License
This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0) - see the LICENSE file for details.
### License Summary

- ✅ Commercial use
- ✅ Modification
- ✅ Distribution
- ✅ Private use
- ⚠️ Network use requires source disclosure
- ⚠️ Derivatives must use the same license
- ⚠️ Changes must be documented
## Acknowledgments
- MLX - Apple's machine learning framework
- vLLM - High-performance LLM serving
- Ollama - Local LLM runtime
- Hugging Face - Model hosting and caching
- clap - Command-line argument parsing
## Support & Community
- **Bug reports:** GitHub Issues
- **Feature requests:** GitHub Discussions
- **Documentation:** see the `/specs` directory for detailed specifications
## Roadmap

### Phase 1: Core Structure (Current)
- CLI argument parsing
- Module architecture
- Backend trait system
- Basic command generation
### Phase 2: Safety & Validation
- Dangerous pattern detection
- POSIX compliance checking
- User confirmation workflows
- Risk assessment system
### Phase 3: Backend Integration
- vLLM HTTP API support
- Ollama local backend
- Response parsing
- Error handling
### Phase 4: MLX Optimization
- FFI bindings with cxx
- Metal Performance Shaders
- Unified memory handling
- Apple Silicon optimization
### Phase 5: Production Ready
- Comprehensive testing
- Performance optimization
- Binary distribution
- Package manager support
**Built with Rust | Safety First | Open Source**

> **Note:** This is an active development project. Features and APIs are subject to change. See the `specs` directory for detailed design documentation.