A simple CLI tool that uses AI agents to generate daily work summaries from your GitHub activity.
It analyzes your recent GitHub activity and generates daily summaries like this:
Yesterday:
- Merged PR #142: Fix authentication bug in user login flow
- Reviewed PR #138: Add support for OAuth2 integration
- Opened issue #145: Memory leak in background task processor

Today:
- Continue work on OAuth2 integration testing
- Address memory leak issue in background processor
- Review pending PRs from team members
Uses a simple agent powered by Mozilla AI's any-agent framework with any-llm for LLM provider abstraction. The agent connects to GitHub via Model Context Protocol (MCP) to analyze your activity and generate summaries.
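Conceptually, this amounts to wiring an LLM and a GitHub MCP server into a single agent. The snippet below is a minimal, illustrative sketch of that idea; the any-agent class names (AnyAgent, AgentConfig, MCPStdio), the "tinyagent" framework choice, and the Docker-based GitHub MCP server command are assumptions, not dailysum's actual implementation.

```python
# Illustrative sketch only: the any-agent API surface and the MCP server
# launch command are assumptions; dailysum's real code may differ.
import os

from any_agent import AgentConfig, AnyAgent      # assumed any-agent imports
from any_agent.config import MCPStdio            # assumed MCP tool config class

# Example: run a GitHub MCP server over stdio via Docker (assumption).
github_mcp = MCPStdio(
    command="docker",
    args=[
        "run", "-i", "--rm",
        "-e", f"GITHUB_PERSONAL_ACCESS_TOKEN={os.environ['GITHUB_TOKEN']}",
        "ghcr.io/github/github-mcp-server",
    ],
)

agent = AnyAgent.create(
    "tinyagent",                                  # agent framework name (assumed)
    AgentConfig(
        model_id=os.environ.get("MODEL_ID", "openai/gpt-4o-mini"),
        instructions="Summarize yesterday's GitHub activity and plan today.",
        tools=[github_mcp],
    ),
)

trace = agent.run("What did I work on yesterday?")
print(trace.final_output)                         # attribute name assumed; see any-agent docs
```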
To use dailysum you'll need:
- Python 3.11 or newer
- A GitHub Personal Access Token
- API key for your chosen LLM provider (OpenAI, Anthropic, etc.)
Install from PyPI:

pip install dailysum

Or install from source:

git clone https://github.com/njbrake/dailysum.git
cd dailysum
pip install -e .

Run the initialization command and follow the prompts:
dailysum init

This will:
- Prompt for your GitHub token
- Let you choose your preferred LLM model
- Optionally set your company name
- Save configuration to ~/.config/dailysum/config.toml (see below)
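Because dailysum already requires Python 3.11+, you can peek at the saved file with the standard-library tomllib. The key names used below (model_id, github_token) are guesses inferred from the init prompts, not a documented schema.

```python
# Inspect the config written by `dailysum init`. Key names are assumptions
# based on the init prompts above, not dailysum's documented schema.
import tomllib
from pathlib import Path

config_path = Path.home() / ".config" / "dailysum" / "config.toml"
with config_path.open("rb") as f:      # tomllib requires binary mode
    config = tomllib.load(f)

print("model:", config.get("model_id", "openai/gpt-4o-mini"))
print("token set:", bool(config.get("github_token")))
```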
Alternatively, skip the config file and set these environment variables:
export GITHUB_TOKEN="ghp_your_github_token_here"
export MODEL_ID="openai/gpt-4o-mini" # Optional, defaults to gpt-4o-mini
export COMPANY="Your Company Name"   # Optional

Then run with the --use-env flag:
dailysum generate --use-env

To create a GitHub Personal Access Token:
- Go to GitHub Settings > Developer settings > Personal access tokens
- Click "Generate new token (classic)"
- Select scopes: repo, read:user, read:org, notifications
- Copy the generated token (you can verify it with the snippet below)
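To confirm the token works before handing it to dailysum, you can call the GitHub REST API directly; this is a plain /user request, nothing dailysum-specific.

```python
# Quick sanity check that a GitHub token is valid: GET /user returns
# your login when the Authorization header is accepted.
import json
import os
import urllib.request

req = urllib.request.Request(
    "https://api.github.com/user",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    },
)
with urllib.request.urlopen(req) as resp:
    print("Authenticated as:", json.load(resp)["login"])
```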
Supported model IDs include:
- OpenAI: openai/gpt-4o, openai/gpt-4o-mini, openai/gpt-3.5-turbo
- Anthropic: anthropic/claude-3-5-sonnet-20241022, anthropic/claude-3-haiku-20240307
- Mistral: mistral/mistral-large-latest, mistral/mistral-small-latest
- Google: google/gemini-1.5-pro, google/gemini-1.5-flash
See any-llm providers for the complete list.
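The provider prefix in the model ID is what any-llm uses to route the request, and each provider reads its own API key from the environment (OPENAI_API_KEY, ANTHROPIC_API_KEY, and so on). Here is a hedged sketch of that call, assuming any-llm's completion() interface mirrors the OpenAI chat format; check the any-llm docs for the exact signature.

```python
# Sketch of how a provider-prefixed model ID resolves through any-llm.
# The completion() call and response shape are assumptions; verify against
# the any-llm documentation before relying on them.
from any_llm import completion

response = completion(
    model="openai/gpt-4o-mini",   # "provider/model" -- needs OPENAI_API_KEY set
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)
```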
Everyday commands:

dailysum generate
dailysum config
dailysum generate --use-env

# Initialize with custom location
dailysum init --config-path /path/to/my/config.toml
# Generate using custom location
dailysum generate --config-path /path/to/my/config.toml

You can easily switch between models by updating your config:
# Use a faster, cheaper model
dailysum init --model-id "openai/gpt-4o-mini"
# Use a more powerful model for complex summaries
dailysum init --model-id "anthropic/claude-3-5-sonnet-20241022"

To set up a development environment, clone the repo and install the dev dependencies:

git clone https://github.com/njbrake/dailysum.git
cd dailysum
# Install with development dependencies
pip install -e ".[dev]"
# Set up pre-commit hooks
pre-commit install

Run the test suite:

pytest
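If you're adding a CLI feature, a test can drive the commands through Click's CliRunner without touching your real config. The import path dailysum.cli and the cli group name below are assumptions about the package layout, not the project's actual test code.

```python
# Hypothetical test sketch: exercises the CLI via Click's test runner.
# The import path (dailysum.cli:cli) is an assumption about the layout.
from click.testing import CliRunner

from dailysum.cli import cli  # assumed entry point


def test_help_runs_cleanly():
    runner = CliRunner()
    result = runner.invoke(cli, ["--help"])
    assert result.exit_code == 0
    assert "generate" in result.output  # the generate command should be listed
```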
# Format and lint
ruff format .
ruff check . --fix
# Type checking
mypy src/

Contributions are welcome. Areas of interest:
- New LLM provider support
- Summary templates and prompts
- Additional GitHub data sources
- CLI improvements
- Tests and documentation
See Contributing Guide for details.
Example output:

Your Daily Summary

Yesterday:
- Merged PR #234: Implement user authentication system
- Reviewed PR #231: Add Docker configuration
- Fixed critical bug in payment processing (Issue #189)
- Updated documentation for API endpoints

Today:
- Complete integration tests for authentication system
- Review pending PRs from team members
- Start work on user dashboard redesign
- Investigate performance issues in search functionality
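Terminal output like the summary above is rendered with Rich (credited in the acknowledgements below). A minimal sketch of printing a similar panel, not dailysum's actual rendering code:

```python
# Minimal Rich panel resembling dailysum's output; illustrative only.
from rich.console import Console
from rich.panel import Panel

summary = (
    "Yesterday:\n"
    "- Merged PR #234: Implement user authentication system\n\n"
    "Today:\n"
    "- Complete integration tests for authentication system"
)
Console().print(Panel(summary, title="Your Daily Summary"))
```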
Troubleshooting: if dailysum can't find a token or configuration, make sure you've either:
- Run dailysum init to set up a config file, or
- Set the GITHUB_TOKEN environment variable
If the agent dependencies are missing, install them:

pip install any-agent any-llm

If you hit GitHub API rate limits:
- Use a GitHub token (provides 5000 requests/hour vs 60 for unauthenticated)
- Consider running the tool less frequently
- The tool automatically respects rate limits and will wait if needed; you can check your remaining quota with the snippet below
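GitHub exposes your current quota through its rate-limit endpoint; the check below is plain REST and works independently of dailysum.

```python
# Check your remaining GitHub API quota. Works with or without a token;
# authenticated requests get the higher 5000/hour limit.
import json
import os
import urllib.request

headers = {"Accept": "application/vnd.github+json"}
token = os.environ.get("GITHUB_TOKEN")
if token:
    headers["Authorization"] = f"Bearer {token}"

req = urllib.request.Request("https://api.github.com/rate_limit", headers=headers)
with urllib.request.urlopen(req) as resp:
    core = json.load(resp)["resources"]["core"]
print(f"{core['remaining']}/{core['limit']} requests left; resets at {core['reset']}")
```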
MIT License - see LICENSE for details.
- any-llm - Unified LLM provider interface
- any-agent - Unified AI agent framework
- GitHub Copilot MCP - GitHub API access via Model Context Protocol
- Rich - Beautiful terminal output
- Click - Command-line interface framework
Built by Mozilla AI