# SlowQL

Catch expensive SQL before it hits production.
Static, offline SQL analyzer that prevents performance regressions, security risks, correctness bugs, and cloud‑cost traps — with a polished terminal experience.
## Table of contents

- Overview
- Why teams adopt SlowQL
- Key features
- Installation
- Quick start
- CLI usage
- Keyboard navigation
- Exports
- Rule coverage (examples)
- How it works
- Integrations (CI / pre-commit / Docker)
- Performance & privacy
- Roadmap
- Contributing
- License
- FAQ
## Overview

SlowQL is a static SQL analyzer and linter for performance, security, cost, and correctness. It scans SQL text to catch issues early — no database connection required — and presents the results in a premium, modern terminal dashboard.
Designed for data engineering and product teams shipping SQL daily on PostgreSQL, MySQL, SQLite, SQL Server, Snowflake, BigQuery, Redshift (pattern coverage varies by rule).
## Why teams adopt SlowQL

- Reduce cloud costs by preventing full-table scans, deep OFFSET pagination, and heavy JSON work.
- Prevent disasters by blocking `UPDATE`/`DELETE` without `WHERE`.
- Shorten reviews with deterministic guidance and actionable fixes.
- Keep environments safe by detecting dynamic SQL concatenation, plaintext secrets, and PII exposure patterns.
- Produce credible, shareable HTML/CSV/JSON reports for leadership and CI/CD.
## Key features

- Broad rule catalog across PERFORMANCE, COST, SECURITY, CORRECTNESS, RELIABILITY, QUALITY.
- Health score (0–100), severity distribution, and impact zones.
- Premium terminal dashboard:
  - Health gauge
  - Severity × Dimension heat map
  - System detection capabilities (rules per dimension vs findings)
  - Issue frequency spectrum
  - Detailed issues table with real Impact & Fix
  - Recommended action protocols generated from actual fixes
- Arrow-key menus (↑/↓ + Enter, q/Esc) for:
  - Input mode (Compose | Paste | File | Compare)
  - Quick actions (Export | Analyze more | Exit)
  - Export selection (JSON | HTML | CSV | All)
- Exports: JSON (machine-readable), HTML (neon single page), CSV (flat rows)
- Non-interactive CI mode for pipelines
## Installation

Python 3.9+

Recommended:

```bash
pipx install slowql
```

Standard:

```bash
pip install slowql
```

Optional for arrow-key menus:

```bash
pip install readchar
```

## Quick start

Analyze a file:

```bash
slowql --input-file queries.sql
```

Export immediately:

```bash
slowql --input-file queries.sql --export html csv
```

Compose or paste (arrow‑key menus appear when --mode=auto in a TTY):

```bash
slowql --mode auto
```

Compare two queries:

```bash
slowql --compare
```

Non‑interactive (CI):

```bash
slowql --non-interactive --input-file queries.sql --export json
```

## CLI usage

```text
usage: slowql [-h] [--input-file INPUT_FILE] [--mode {auto,paste,compose}] [--no-cache] [--compare]
              [--export [{html,csv,json} ...]] [--out OUT] [--verbose] [--no-intro] [--fast]
              [--duration DURATION] [--non-interactive]
              [file]

Input Options:
  file                            Input SQL file (optional positional)
  --input-file                    Read SQL from file
  --mode {auto,paste,compose}     Editor mode (auto chooses compose on TTY)

Analysis Options:
  --no-cache                      Disable query result caching
  --compare                       Enable query comparison mode

Output Options:
  --export [{html,csv,json} ...]  Auto-export formats after each analysis
  --out OUT                       Output directory for exports
  --verbose                       Enable verbose analyzer output

UI Options:
  --no-intro                      Skip intro animation
  --fast                          Fast mode: minimal animations
  --duration DURATION             Intro animation duration (seconds)
  --non-interactive               Non-interactive mode for CI/CD
```
## Keyboard navigation

- Menus: ↑/↓ to move, Enter to select, q/Esc to cancel
- Quick Actions: Export Report | Analyze More | Exit
- Export Options: JSON | HTML | CSV | All
- Input Mode (auto): Compose | Paste | File | Compare | Cancel

If `readchar` isn’t available or the terminal isn’t interactive, menus fall back to a numeric prompt.
## Exports

- JSON: full machine‑readable payload
- HTML: shareable, dark neon single‑page report
- CSV: `severity,rule_id,dimension,message,impact,fix,location`

Write exports automatically with `--export`:

```bash
slowql --input-file queries.sql --export html csv
```

Or choose formats via the arrow‑key Export menu. Reports are written to `./reports` by default (customize with `--out`).
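The documented CSV column order is enough to post-process reports with the standard library. A minimal sketch that filters CRITICAL rows (the sample rows and rule IDs below are made up for illustration; only the column layout comes from this README):

```python
import csv
import io

# Column order matches the CSV export documented above:
# severity,rule_id,dimension,message,impact,fix,location
SAMPLE = """severity,rule_id,dimension,message,impact,fix,location
CRITICAL,SQL001,CORRECTNESS,DELETE without WHERE,Data loss,Add a WHERE clause,line 3
LOW,SQL042,QUALITY,Unused CTE,Readability,Remove the CTE,line 10
"""

def critical_rows(csv_text):
    """Return the rows whose severity column is CRITICAL."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["severity"] == "CRITICAL"]

rows = critical_rows(SAMPLE)
print(len(rows), rows[0]["rule_id"])  # 1 SQL001
```

The same pattern works on a real export: open the file from `./reports` instead of the inline sample.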
## Rule coverage (examples)

Security:

- Dynamic SQL / concatenation
- Excessive grants and wildcard principals
- Hardcoded secrets / API keys
- PII exposure patterns (emails, SSNs)

Performance:

- `SELECT *`
- Non‑SARGable predicates (functions on columns, leading wildcard LIKE)
- Deep OFFSET pagination
- Regex in WHERE; heavy JSON extraction

Cost:

- Unbounded scans on partitioned data
- Unfiltered aggregation
- Cross‑region joins (where detectable)

Correctness / Logic:

- `UPDATE`/`DELETE` without `WHERE`
- NULL comparison bugs (`= NULL` / `!= NULL`)
- Always true/false conditions

Reliability:

- Window functions without `PARTITION BY`
- Recursive CTE without bounds

Quality / Maintainability:

- Unused CTE
- Excess `DISTINCT`
- Identifier/style consistency (optional)

The catalog is large and expanding; see the code and docs for the current list.
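Many of these checks can be approximated with purely textual heuristics. A minimal sketch of the kind of pattern such a rule applies — this is an illustration in the spirit of the `UPDATE`/`DELETE` without `WHERE` rule, not SlowQL's actual detector code:

```python
import re

# Flags UPDATE/DELETE statements that reach their terminating semicolon
# without a WHERE clause (tempered dot: stop consuming if WHERE appears).
MISSING_WHERE = re.compile(
    r"^\s*(UPDATE|DELETE)\b(?:(?!\bWHERE\b).)*;",
    re.IGNORECASE | re.DOTALL | re.MULTILINE,
)

def missing_where(sql: str) -> list:
    """Return the UPDATE/DELETE keywords of statements lacking WHERE."""
    return [m.group(1).upper() for m in MISSING_WHERE.finditer(sql)]

print(missing_where("DELETE FROM users;"))              # ['DELETE']
print(missing_where("DELETE FROM users WHERE id = 1;")) # []
```

A production rule would also need to handle comments, string literals, and subqueries, which is why real analyzers combine such signatures with structural checks.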
## How it works

- Static analysis: no connection to your database.
- Deterministic: parses SQL text and applies rule signatures, heuristics, and structural checks.
- Multi‑query awareness: detects duplicate patterns and N+1 shapes across batches.
- Privacy‑first: your SQL never leaves your machine.
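Multi‑query awareness can be pictured as normalizing each statement and counting repeated shapes. A rough illustration of that idea (again, not SlowQL's actual implementation):

```python
import re
from collections import Counter

def shape(sql: str) -> str:
    """Normalize a statement: strip literals and numbers so queries
    differing only in parameters collapse to the same shape."""
    s = re.sub(r"'[^']*'", "?", sql)   # string literals -> ?
    s = re.sub(r"\b\d+\b", "?", s)     # numeric literals -> ?
    return re.sub(r"\s+", " ", s).strip().lower()

batch = [
    "SELECT * FROM orders WHERE user_id = 1",
    "SELECT * FROM orders WHERE user_id = 2",
    "SELECT * FROM orders WHERE user_id = 3",
]
shapes = Counter(shape(q) for q in batch)
# Three queries, one shape: the classic N+1 signature across a batch.
print(shapes.most_common(1)[0][1])  # 3
```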
## Integrations (CI / pre-commit / Docker)

GitHub Actions:

```yaml
name: SlowQL
on: [push, pull_request]
jobs:
  lint-sql:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install slowql readchar
      - run: slowql --non-interactive --input-file sql/ --export json
      - name: Fail on critical issues
        run: |
          python - <<'PY'
          import json, glob
          path = sorted(glob.glob('reports/slowql_results_*.json'))[-1]
          data = json.load(open(path, encoding='utf-8'))
          critical = data["statistics"]["by_severity"].get("CRITICAL", 0)
          if critical > 0:
              raise SystemExit(f"Found {critical} CRITICAL issues")
          PY
```

Pre‑commit (conceptual):

```yaml
repos:
  - repo: local
    hooks:
      - id: slowql
        name: SlowQL
        entry: slowql --non-interactive --export json
        language: system
        files: \.sql$
```

Docker:

```bash
docker run --rm -v "$PWD":/work makroumi/slowql slowql --input-file /work/queries.sql --export html
```

## Performance & privacy

- Static analysis; no DB round‑trips.
- Precompiled signatures; fast heuristics.
- Offline by default; zero telemetry.
- Nothing leaves your machine unless you export files.
## Roadmap

- Issue browser (arrow keys; expand details)
- Multi‑select export (space toggles, Enter confirms)
- VS Code extension
- Dialect‑specific packs & AST expansion
- Pluggable rule packs and org policies
- Browser playground (Pyodide)
## Contributing

We welcome PRs for new rules, docs, tests, and formatters.

Dev setup:

```bash
git clone https://github.com/makroumi/slowql
cd slowql
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -e ".[dev]"
pytest -q
ruff check .
mypy .
```

Before opening a PR:

- Include tests for new rules
- Update docs and examples
- Ensure `ruff` and `mypy` pass
## License

Apache 2.0 — see LICENSE
## FAQ

**Does SlowQL connect to my database?**
No. It analyzes SQL text only.

**Which dialects are supported?**
Rules are mostly dialect‑agnostic (PostgreSQL, MySQL, SQLite, SQL Server, Snowflake, BigQuery, Redshift). Dialect‑specific packs are planned.

**How many rules are there?**
A large and growing catalog across performance, security, cost, logic, reliability, and style.

**Can I write custom rules?**
Yes — the detector architecture is modular. A public API and examples are planned.
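Until the public API lands, a custom rule can be prototyped as a plain function over SQL text. Everything below — the `Finding` shape and the rule signature — is a hypothetical sketch, not SlowQL's real interface:

```python
import re
from dataclasses import dataclass

@dataclass
class Finding:
    # Hypothetical result shape, loosely mirroring the CSV export columns.
    severity: str
    rule_id: str
    message: str

def no_select_star(sql: str) -> list:
    """Example custom rule: flag SELECT * (illustrative, not SlowQL API)."""
    return [
        Finding("MEDIUM", "CUSTOM001", "Avoid SELECT *")
        for _ in re.finditer(r"\bSELECT\s+\*", sql, re.IGNORECASE)
    ]

print(len(no_select_star("SELECT * FROM t; SELECT id FROM t")))  # 1
```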