shotgun is a streamlined desktop application for prompt engineering and context preparation. it provides an intuitive workflow for managing project context, crafting prompts, and applying ai-generated code changes.
- blazing fast indexing - concurrent file processing with 64 worker threads, persistent caching, and hash-based change detection
- project context generation - converts project files into structured text optimized for llm prompts with intelligent filtering
- multi-step workflow - guided process from context preparation to prompt composition and patch application
- local token counting - accurate offline token counting using tiktoken (cl100k_base encoding) - no api calls required
- file watching - real-time monitoring of project changes with automatic updates
- customizable rules - support for custom ignore patterns and prompt rules
- diff/patch support - parse and apply ai-generated code changes directly to your project
- dark/light themes - system-aware theming with manual toggle
- auto-updates - automatic update checking with one-click installer updates
- cross-platform - builds for windows, macos, and linux
```mermaid
flowchart LR
    A[1. prepare context] --> B[2. compose prompt]
    B --> C[3. execute prompt]
    C --> D[4. apply patch]
    style A fill:#3b82f6,stroke:#1d4ed8,color:#fff
    style B fill:#8b5cf6,stroke:#6d28d9,color:#fff
    style C fill:#10b981,stroke:#059669,color:#fff
    style D fill:#f59e0b,stroke:#d97706,color:#fff
```
- prepare context - select project files, configure exclusions, generate context
- compose prompt - craft your prompt with project context and custom rules
- execute prompt - copy prompt for use with your preferred llm
- apply patch - parse diff output and apply changes to project files
the project launcher is the first screen you see when opening shotgun. it provides:
- sidebar navigation - switch between projects list and about section
- search bar - quickly filter through your recent projects
- open button - browse and select a new project folder
- recent projects - list of previously opened projects with quick access
- version display - shows current app version in the sidebar
the main workspace appears after selecting a project folder:
- left sidebar - file tree with token counts per file, gitignore-aware filtering
- central panel - multi-step workflow area with horizontal stepper navigation
- file selection - checkboxes to include/exclude files from context generation
- token indicators - real-time token count display for selected files
- settings access - gear icon to configure ignore rules and preferences
the context preparation step allows fine-grained control:
- file tree expansion - expand/collapse directories to explore project structure
- bulk selection - select entire folders or individual files
- token aggregation - see total token count update as you select files
- ignore patterns - files matching gitignore or custom rules appear dimmed
- search/filter - filter files by name to quickly find what you need
the horizontal stepper guides you through the four-step workflow:
- step 1: prepare context - select files and generate structured context
- step 2: compose prompt - write your prompt with context injection
- step 3: execute prompt - copy the complete prompt for your llm
- step 4: apply patch - paste llm response and apply code changes
each step shows completion status and allows navigation back to previous steps.
download the latest release for your platform from the releases page.
| platform | recommended | portable |
|---|---|---|
| windows | -installer.exe (nsis) | -windows-amd64.exe |
| macos | -universal.dmg | -universal.zip |
| linux | -linux-amd64.deb | standalone binary |
```shell
# install wails cli
go install github.com/wailsapp/wails/v2/cmd/wails@latest

# clone repository
git clone https://github.com/ubranch/shotgun.git
cd shotgun

# install frontend dependencies
cd frontend && pnpm install && cd ..

# run in development mode with hot reload
wails dev

# build production executable
wails build
# output: build/bin/shotgun[.exe]
```

settings are stored in the os-specific config directory:
| platform | location |
|---|---|
| windows | %APPDATA%/shotgun/ |
| macos | ~/Library/Application Support/shotgun/ |
| linux | ~/.config/shotgun/ |
configurable options:
- custom ignore rules (gitignore syntax)
- custom prompt rules
- excluded directories
- gitignore integration toggle
```mermaid
block-beta
    columns 3
    block:frontend:3
        columns 3
        A["vue.js 3"] B["tailwind css"] C["vscode codicons"]
    end
    block:framework:3
        columns 1
        D["wails v2"]
    end
    block:backend:3
        columns 3
        E["go 1.24"] F["tiktoken-go"] G["fsnotify"]
    end
    style frontend fill:#10b981,stroke:#059669
    style framework fill:#3b82f6,stroke:#1d4ed8
    style backend fill:#8b5cf6,stroke:#6d28d9
```
| layer | technology |
|---|---|
| framework | wails v2 |
| backend | go 1.24 |
| frontend | vue.js 3 + vite |
| styling | tailwind css |
| icons | vscode codicons |
```mermaid
graph TD
    subgraph root["shotgun/"]
        A[app.go] --> |core logic| B[main.go]
        C[split_diff.go] --> |diff parsing| A
        subgraph frontend["frontend/src/"]
            D[App.vue]
            E[MainLayout.vue]
            F[LeftSidebar.vue]
            G[CentralPanel.vue]
            subgraph steps["steps/"]
                H[Step1PrepareContext]
                I[Step2ComposePrompt]
                J[Step3ExecutePrompt]
                K[Step4ApplyPatch]
            end
            subgraph launcher["launcher/"]
                L[ProjectLauncher]
                M[RecentProjectList]
            end
        end
    end
    style root fill:#1e293b,stroke:#475569,color:#fff
    style frontend fill:#065f46,stroke:#059669,color:#fff
    style steps fill:#7c3aed,stroke:#8b5cf6,color:#fff
    style launcher fill:#b45309,stroke:#f59e0b,color:#fff
```
| method | description |
|---|---|
| SelectDirectory() | open native directory picker |
| ListFiles(path) | list files with gitignore awareness |
| RequestShotgunContextGeneration(...) | generate project context async |
| CountTokensLocal(text) | count tokens locally using tiktoken (offline) |
| Get/SetCustomIgnoreRules() | manage custom ignore patterns |
| Get/SetCustomPromptRules() | manage custom prompt rules |
| StartFileWatcher(path) | start monitoring directory |
| StopFileWatcher() | stop file monitoring |
| CheckForUpdates() | check github for new releases |
| DownloadAndApplyUpdate() | download and launch installer |
| ResetApplication() | reset to initial state |
backend (go)

- github.com/wailsapp/wails/v2 - desktop framework
- github.com/pkoukk/tiktoken-go - offline token counting (cl100k_base encoding)
- github.com/creativeprojects/go-selfupdate - auto-update from github releases
- github.com/cespare/xxhash/v2 - fast hashing for token cache invalidation
- github.com/fsnotify/fsnotify - file system watching
- github.com/sabhiram/go-gitignore - gitignore parsing
- github.com/karrick/godirwalk - lock-free directory traversal (10x faster than filepath.Walk)
- github.com/adrg/xdg - xdg base directory spec

frontend (node)

- vue ^3.5 - ui framework
- highlight.js - syntax highlighting
- tailwindcss - utility-first css
- @vscode/codicons - icon set
```mermaid
flowchart LR
    subgraph input["file system"]
        A[project files]
    end
    subgraph processing["concurrent processing"]
        B[64 file workers]
        C[32 dir scanners]
        D[buffer pools]
    end
    subgraph caching["smart caching"]
        E[xxhash fingerprint]
        F[gob serialization]
        G[persistent index]
    end
    A --> B
    A --> C
    B --> D
    C --> D
    D --> E
    E --> F
    F --> G
    style input fill:#ef4444,stroke:#dc2626
    style processing fill:#3b82f6,stroke:#1d4ed8
    style caching fill:#10b981,stroke:#059669
```
| optimization | description |
|---|---|
| worker pools | 64 concurrent file readers, 32 directory scanners |
| directory traversal | godirwalk for lock-free, allocation-efficient scanning |
| hashing | xxhash for blazing fast content fingerprinting |
| memory mapping | mmap for files >64kb reduces memory copies |
| buffer pools | tiered pools (4kb/64kb/1mb) minimize gc pressure |
| persistent cache | gob-serialized index survives restarts |
files are only re-processed when content changes (hash-based invalidation). unchanged files load instantly from cache.
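the invalidation idea fits in a few lines; this sketch uses the standard library's fnv hash in place of the app's xxhash so it stays dependency-free:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// cache maps a file path to the hash of its last-processed content.
var cache = map[string]uint64{}

// fingerprint hashes file content (fnv-64a here; the app uses xxhash).
func fingerprint(content []byte) uint64 {
	h := fnv.New64a()
	h.Write(content)
	return h.Sum64()
}

// needsReprocess reports whether the file's content changed since the
// last run, updating the cache as a side effect.
func needsReprocess(path string, content []byte) bool {
	fp := fingerprint(content)
	if cache[path] == fp {
		return false // unchanged: serve from cache
	}
	cache[path] = fp
	return true
}

func main() {
	fmt.Println(needsReprocess("a.go", []byte("v1"))) // true  (first sight)
	fmt.Println(needsReprocess("a.go", []byte("v1"))) // false (unchanged)
	fmt.Println(needsReprocess("a.go", []byte("v2"))) // true  (content changed)
}
```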
shotgun uses tiktoken-go for fully offline token counting - no api calls, no rate limits, no costs.
| property | value |
|---|---|
| encoding | cl100k_base (gpt-4, claude compatible) |
| processing | fully local, no network required |
| caching | token counts cached per file hash |
| performance | concurrent counting with worker pools |
token counts are displayed per-file in the file tree and aggregated for the entire context, helping you stay within llm context limits.
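the concurrent counting pattern looks roughly like this; `countTokens` below is a hypothetical stand-in for the real tiktoken-go cl100k_base call, so the aggregation structure, not the counts, is the point:

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// countTokens is a placeholder for tiktoken's cl100k_base encoder;
// here it just counts whitespace-separated words.
func countTokens(text string) int {
	return len(strings.Fields(text))
}

// totalTokens fans file contents out to a fixed pool of workers and
// sums the per-file counts, mirroring the app's worker-pool design.
func totalTokens(files []string, workers int) int {
	jobs := make(chan string)
	var wg sync.WaitGroup
	var mu sync.Mutex
	total := 0
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for text := range jobs {
				n := countTokens(text)
				mu.Lock()
				total += n
				mu.Unlock()
			}
		}()
	}
	for _, f := range files {
		jobs <- f
	}
	close(jobs)
	wg.Wait()
	return total
}

func main() {
	files := []string{"package main", "func main() {}", "one two three"}
	fmt.Println(totalTokens(files, 4)) // 2 + 3 + 3 = 8
}
```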
| resource | limit |
|---|---|
| max output size | 50 mb |
| max file read size | 500 mb |
| binary files | auto-detected and excluded |
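binary detection is commonly done with a nul-byte heuristic on the first few kilobytes of a file; a sketch of that approach (the app's actual detector may differ):

```go
package main

import (
	"bytes"
	"fmt"
)

// isBinary reports whether content looks binary, using the common
// heuristic of scanning the first 8 kb for a nul byte.
func isBinary(content []byte) bool {
	const sniffLen = 8192
	if len(content) > sniffLen {
		content = content[:sniffLen]
	}
	return bytes.IndexByte(content, 0) != -1
}

func main() {
	fmt.Println(isBinary([]byte("package main\n")))       // false
	fmt.Println(isBinary([]byte{0x7f, 'E', 'L', 'F', 0})) // true
}
```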
mit
inspirebek - ubranch@usa.com