# AI Conversation Intelligence with Technical Context Preservation
A lightweight, fully local Python utility that captures AI-powered CLI chat logs with intelligent semantic processing and preserves the crucial technical "how" and "why" that future development sessions need. Features dual AI provider support (Claude/Gemini), automatic chat analysis with implementation details, and structured JSON backups that maintain technical continuity. Simple 3-step setup with professional-grade organization and no external platform dependencies.
For Developers and AI Contributors:
- Development Directives: `dev/dev_directives/general.md`
- Project Coordination: `dev/dev_stages/stages_workflow.md`
- Master Context Memory: `llm.txt`
- Current Initiative: bchat Vision
- Current Status: Transitioning from Phase 2 (Deep Context Engine) to BChat MCP development with structured stage progression.
- 🧠 Technical Context Preservation: Captures the crucial "how" and "why" that future development sessions need
- 🔒 Fully Local: No data leaves your machine
- ⚡ Simple Setup: Ready in 3 steps - clone, add API key, install
- 🌐 Universal Access: The `bchat` command works from anywhere in your workspace
- 🤖 AI-Smart: Intelligent chat analysis with implementation detail extraction
- 📦 Lightweight: Minimal dependencies, maximum functionality

- 🧠 Technical Context Intelligence: Preserves implementation details, code changes, and architectural decisions that enable seamless development continuity
- 🔍 Real-time Monitoring: Automatically watches and processes AI chat logs
- 🔧 Dual AI Providers: Choose between Claude or Gemini APIs for intelligent analysis
- 📊 Structured Data: Creates machine-readable JSON indexes with technical metadata and implementation tracking
- 📅 Daily Consolidation: Merges multiple chat files into organized single files with context preservation
- 🌐 Universal Access: The `bchat` command works from any directory in your workspace
- 💬 Multi-AI Support: Compatible with Claude Code, Gemini CLI, and extensible to other AI tools
- 🛡️ Resilient Architecture: Circuit breaker patterns, retry logic, and graceful error handling
- ⚡ Professional Organization: Clean workspace structure with essential files at root level
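To illustrate the resilient-architecture bullet, here is a minimal sketch of a retry-with-exponential-backoff helper. This is illustrative only: the function name and parameters are hypothetical, not bchat's actual API (bchat's real error handling lives in `core/src/chat_monitor.py`).

```python
import time

def call_with_retry(fn, attempts=3, base_delay=1.0):
    """Retry a flaky call (e.g. an AI API request) with exponential backoff.

    Illustrative sketch of the retry pattern; not bchat's actual code.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            # Last attempt: let the error propagate to the caller
            if attempt == attempts - 1:
                raise
            # Back off: 1s, 2s, 4s, ... before the next attempt
            time.sleep(base_delay * (2 ** attempt))
```

A circuit breaker extends this idea by refusing further calls for a cooldown period once repeated failures are observed, so a downed API does not stall the monitor.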
- Python 3.8+ (required)
- AI API Key (required for intelligent processing): Anthropic API key for Claude OR Google API key for Gemini
- Node.js 16+ (optional, for Gemini CLI integration)
- Git (for installation)
1. Clone the repository:

   ```bash
   git clone https://github.com/Nyrk0/bchat.git
   cd bchat
   ```

2. Configure your API key:

   ```bash
   cp .env.example .env
   # Edit .env and add your API key:
   #   For Claude (recommended): ANTHROPIC_API_KEY=your_anthropic_api_key_here
   #   For Gemini:               GOOGLE_API_KEY=your_google_api_key_here
   ```

3. Run the installer:

   ```bash
   ./install.sh
   ```

4. Start using bchat:

   ```bash
   # Check system status (works without API key)
   ./bchat --status

   # Backup and process chat conversation (requires API key)
   ./bchat

   # Use Gemini CLI with logging (requires API key)
   ./bchat -p "Explain quantum computing"
   ```

💡 Note: Basic commands like `--status` work immediately. Intelligent processing features require an API key configured in step 2.
Windows installation is not yet supported. Contributions for Windows installer scripts are highly welcomed! The core Python functionality should work on Windows with manual setup.
The `bchat` command works from anywhere in your workspace:

```bash
# From any directory - triggers chat backup/consolidation
bchat
```
- From AI CLI windows: Saves current AI conversation to structured logs
- From VSCode terminal: Consolidates recent chat activity
- From any location: Works globally across the workspace
```bash
bchat -p "Explain quantum computing"
bchat -a -p "Analyze this project structure"  # Include all files
bchat --help                                  # See all Gemini options
```
```bash
# Start monitoring system
./start

# Control monitoring (from ai-cli-chat-logger directory)
./rchat --help         # View all options
./runchat              # Alternative command (same as rchat)

# Manual consolidation
./rchat --consolidate
```
When you clone from GitHub:

```bash
cd /their/workspace/
git clone https://github.com/Nyrk0/bchat.git
```
You will get:

```
their-workspace/
├── their-existing-file.txt           # User's files (no conflict)
├── their-config.json                 # User's files (no conflict)
├── their-install.sh                  # User's files (no conflict)
└── bchat/                            # All bchat files contained here
    ├── README.md                     # Documentation
    ├── LICENSE                       # MIT License
    ├── CLAUDE.md                     # Claude Code instructions
    ├── bchat                         # Main executable
    ├── install.sh                    # Installation script
    ├── requirements.txt              # Python dependencies
    ├── .env.example                  # Environment template
    ├── bin/                          # All executable scripts
    │   ├── bchat-status              # System status checker
    │   ├── rchat                     # Chat monitor launcher
    │   ├── runchat                   # Alternative launcher
    │   └── start                     # Quick start script
    ├── config/                       # Configuration files
    │   ├── config.json               # Main config (Claude Sonnet 4 default)
    │   └── wrappers/
    │       ├── claude_wrapper.sh     # Claude CLI logging wrapper
    │       └── gemini_wrapper.sh     # Gemini CLI logging wrapper
    ├── core/                         # Python source code
    │   └── src/
    │       ├── chat_monitor.py       # Core monitoring system
    │       └── utils/
    │           └── path_manager.py   # Path resolution utilities
    ├── data/                         # Runtime data (created during use)
    │   ├── chats/                    # Chat logs and processed JSON
    │   │   ├── chat_index.json       # Searchable session index
    │   │   ├── context_summary.json  # Cross-session analysis
    │   │   ├── chat_log_*.json       # Individual session logs
    │   │   ├── claude_current_day_raw.log  # Raw Claude logs
    │   │   └── gemini_current_day_raw.log  # Raw Gemini logs
    │   └── logs/
    │       └── bchat.log             # System operation logs
    ├── dev/                          # Development tools
    │   ├── venv/                     # Virtual environment (created by install)
    │   └── dev_directives/
    │       └── general.md            # Development guidelines
    └── docs/                         # Complete documentation
        ├── user-guide.md             # User documentation
        ├── ai-integration.md         # AI integration guide
        ├── CHANGELOG.md              # Project history
        └── structure.md              # Workspace organization guide
```
Perfect Namespace Isolation: All bchat files are contained within the `bchat/` directory, preventing any conflicts with your existing files. You can have your own `install.sh`, `config.json`, etc. without any naming conflicts.
```bash
# API Keys (choose your preferred provider)
GOOGLE_API_KEY=your_google_api_key_here        # For Gemini provider
ANTHROPIC_API_KEY=your_anthropic_api_key_here  # For Claude provider

# Optional: Chat log retention (default: 90 days)
CHAT_LOG_RETENTION_DAYS=90

# Optional: Debug mode (default: false)
CHAT_MONITOR_DEBUG=false
```
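bchat loads these variables via python-dotenv (installed as a dependency). Conceptually, the optional settings resolve to their documented defaults like this; the function below is a stdlib-only sketch for illustration, not bchat's actual loader:

```python
import os

def env_settings(environ=os.environ):
    """Resolve the optional .env settings above, applying documented defaults.

    Illustrative only: bchat's real configuration loading may differ.
    """
    return {
        # Falls back to 90 days when CHAT_LOG_RETENTION_DAYS is unset
        "retention_days": int(environ.get("CHAT_LOG_RETENTION_DAYS", "90")),
        # Any casing of "true" enables debug mode; everything else disables it
        "debug": environ.get("CHAT_MONITOR_DEBUG", "false").lower() == "true",
    }
```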
```json
{
  "system": {
    "project_name": "your_project",
    "log_level": "INFO"
  },
  "api": {
    "provider": "gemini",
    "model": "gemini-2.5-flash",
    "claude": {
      "model": "claude-3-5-sonnet-20241022"
    }
  },
  "monitoring": {
    "enabled": true,
    "debounce_delay": 2.0,
    "triggers": ["bchat", "backup chat"]
  }
}
```
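To show how the `api` section drives provider selection, here is a minimal sketch that reads a config laid out as above and picks the matching model and API key. The function name is illustrative, not bchat's actual API; the real loader lives under `core/src`.

```python
import json
import os

def load_provider_settings(path="config/config.json"):
    """Pick the active provider's model and API key from a bchat-style config.

    Illustrative sketch assuming the config layout shown above.
    """
    with open(path) as f:
        cfg = json.load(f)
    api = cfg["api"]
    provider = api.get("provider", "gemini")
    if provider == "claude":
        model = api["claude"]["model"]
        key = os.getenv("ANTHROPIC_API_KEY")
    else:
        model = api["model"]
        key = os.getenv("GOOGLE_API_KEY")
    if not key:
        raise RuntimeError(f"No API key set for provider '{provider}'")
    return provider, model
```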
The Core Purpose: bchat solves the critical problem of technical context continuity in AI-assisted development sessions.
Traditional chat logging captures what was decided but loses the crucial how and why:
- ❌ Specific code changes and their locations
- ❌ Root cause analysis of issues
- ❌ System architecture understanding
- ❌ Implementation strategies and technical decisions
- ❌ Development stage progress and status

bchat preserves technical implementation context that future development sessions need:
- ✅ Code Change Tracking: Documents specific files modified and why
- ✅ Architecture Mapping: Captures component relationships and system understanding
- ✅ Stage Progress: Tracks development methodology progress (ST_00 → ST_01 → ST_02...)
- ✅ Issue Resolution: Preserves root cause analysis and solution implementation
- ✅ Technical Decisions: Documents the reasoning behind implementation choices
Our comprehensive system analysis reveals that while basic JSON processing works excellently, enhanced technical context capture is essential for development continuity. See detailed findings in `dev/dev_stages/ST_00/01-250808-audit_report.md`.
Key Discovery: Context continuity gaps were identified as a HIGH priority issue affecting development efficiency and technical knowledge preservation.
```bash
# Start a Claude Code session
claude

# After productive conversation, backup progress
bchat

# Continue in VSCode terminal and save work
bchat

# After significant progress
bchat

# System creates:
# ✅ chats/chat_backup_YYYY-MM-DD.md  # Human-readable
# ✅ chats/chat_index.json            # Machine-readable
# ✅ chats/context_summary.json       # Cross-session context
```
We welcome contributions! Here's how to get started:
- Fork the repository
- Create a feature branch: `git checkout -b feature/your-feature`
- Make your changes and test
- Commit with clear messages: `git commit -m "Add Windows installer support"`
- Push and create a pull request
- Windows installer script - Adapt `install.sh` for Windows/PowerShell
- Linux distribution testing - Test on Ubuntu, Debian, Fedora, etc.
- Performance optimizations - Async processing, caching
- Web dashboard - Browser interface for chat analytics
- Unit tests - Test coverage for all components
See CONTRIBUTING.md for detailed guidelines.
Installation fails:

```bash
# Check Python version
python3 --version  # Should be 3.8+

# Install dependencies manually
pip3 install watchdog google-generativeai python-dotenv
```
API errors:

```bash
# Verify API key
echo $GOOGLE_API_KEY | head -c 10

# Test API connection
python3 -c "import google.generativeai as genai; genai.configure(api_key='$GOOGLE_API_KEY'); print('API works')"
```
`bchat` not found:

```bash
# Re-run installer
./install.sh

# Check symlink
ls -la ../bchat
```
- 📚 Documentation: Check our docs/ directory
- 🐛 Bug Reports: Create an issue
- 💬 Discussions: GitHub Discussions
- 📧 Contact: Open an issue for questions
MIT License - see LICENSE file for details.
Copyright (c) 2025 Alex Walter Rettig Eglinton
- Claude Code - Official Claude CLI
- Gemini CLI - Google's Gemini CLI
- Watchdog - Python file monitoring
If this project helps you, please consider:
- ⭐ Starring the repository
- 🐛 Reporting bugs via Issues
- 💡 Suggesting features via Issues
- 🔧 Contributing code via Pull Requests
- 📢 Sharing with others who might find it useful
🚀 Ready to get started? Run `./install.sh` and you'll be monitoring AI conversations in under a minute!