
Cyberpunk SQL analyzer with 50+ query detectors. Catches performance killers, security risks & bad patterns before production. Features stunning vaporwave CLI output, real-time analysis, and actionable fixes. Stop shipping slow queries.


SlowQL

Catch expensive SQL before it hits production.
Static, offline SQL analyzer that prevents performance regressions, security risks, correctness bugs, and cloud‑cost traps — with a polished terminal experience.


SlowQL Demo

Overview

SlowQL is a static SQL analyzer and linter for performance, security, cost, and correctness. It scans SQL text to catch issues early — no database connection required — and presents the results in a premium, modern terminal dashboard.
Designed for data engineering and product teams shipping SQL daily on PostgreSQL, MySQL, SQLite, SQL Server, Snowflake, BigQuery, Redshift (pattern coverage varies by rule).

Why teams adopt SlowQL

  • Reduce cloud costs by preventing full-table scans, deep OFFSET pagination, and heavy JSON work.
  • Prevent disasters by blocking UPDATE/DELETE without WHERE.
  • Shorten reviews with deterministic guidance and actionable fixes.
  • Keep environments safe by detecting dynamic SQL concatenation, plaintext secrets, and PII exposure patterns.
  • Produce credible, shareable HTML/CSV/JSON reports for leadership and CI/CD.
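
To make the first point concrete, here is a minimal illustration (not SlowQL's actual implementation) of the kind of static check such a tool performs — flagging an `UPDATE`/`DELETE` statement that has no `WHERE` clause:

```python
import re

# Hypothetical sketch, not SlowQL internals: flag UPDATE/DELETE
# statements that contain no WHERE clause anywhere after the keyword.
MISSING_WHERE = re.compile(
    r"^\s*(UPDATE|DELETE)\b(?:(?!\bWHERE\b).)*$",
    re.IGNORECASE | re.DOTALL,
)

def missing_where(statement: str) -> bool:
    """Return True if an UPDATE/DELETE statement lacks a WHERE clause."""
    return bool(MISSING_WHERE.match(statement.strip().rstrip(";")))

print(missing_where("DELETE FROM users"))               # True
print(missing_where("DELETE FROM users WHERE id = 1"))  # False
```

Real detectors work on parsed structure rather than a single regex, but the intent — catch the disaster before it runs — is the same.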

Key features

  • Broad rule catalog across PERFORMANCE, COST, SECURITY, CORRECTNESS, RELIABILITY, QUALITY.
  • Health score (0–100), severity distribution, and impact zones.
  • Premium terminal dashboard:
    • Health gauge
    • Severity × Dimension heat map
    • Detection coverage per dimension (available rules vs. triggered findings)
    • Issue frequency spectrum
    • Detailed issues table with real Impact & Fix
    • Recommended action protocols generated from actual fixes
  • Arrow-key menus (↑/↓ + Enter, q/Esc) for:
    • Input mode (Compose | Paste | File | Compare)
    • Quick actions (Export | Analyze more | Exit)
    • Export selection (JSON | HTML | CSV | All)
  • Exports: JSON (machine-readable), HTML (neon single page), CSV (flat rows)
  • Non-interactive CI mode for pipelines

Installation

Python 3.9+

Recommended:

pipx install slowql

Standard:

pip install slowql

Optional for arrow-key menus:

pip install readchar

Quick start

Analyze a file:

slowql --input-file queries.sql

Export immediately:

slowql --input-file queries.sql --export html csv

Compose or paste (arrow‑key menus appear when --mode=auto in a TTY):

slowql --mode auto

Compare two queries:

slowql --compare

Non‑interactive (CI):

slowql --non-interactive --input-file queries.sql --export json

CLI usage

usage: slowql [-h] [--input-file INPUT_FILE] [--mode {auto,paste,compose}] [--no-cache] [--compare]
              [--export [{html,csv,json} ...]] [--out OUT] [--verbose] [--no-intro] [--fast]
              [--duration DURATION] [--non-interactive]
              [file]

Input Options:
  file                             Input SQL file (optional positional)
  --input-file                     Read SQL from file
  --mode {auto,paste,compose}      Editor mode (auto chooses compose on TTY)

Analysis Options:
  --no-cache                       Disable query result caching
  --compare                        Enable query comparison mode

Output Options:
  --export [{html,csv,json} ...]   Auto-export formats after each analysis
  --out OUT                        Output directory for exports
  --verbose                        Enable verbose analyzer output

UI Options:
  --no-intro                       Skip intro animation
  --fast                           Fast mode: minimal animations
  --duration DURATION              Intro animation duration (seconds)
  --non-interactive                Non-interactive mode for CI/CD

Keyboard navigation

  • Menus: ↑/↓ to move, Enter to select, q/Esc to cancel
  • Quick Actions: Export Report | Analyze More | Exit
  • Export Options: JSON | HTML | CSV | All
  • Input Mode (auto): Compose | Paste | File | Compare | Cancel
    If readchar isn’t available or the terminal isn’t interactive, menus fall back to a numeric prompt.

Exports

  • JSON: full machine‑readable payload
  • HTML: shareable, dark neon single‑page report
  • CSV: severity,rule_id,dimension,message,impact,fix,location

Write exports automatically with --export:

slowql --input-file queries.sql --export html csv

Or choose formats via the arrow‑key Export menu. Reports are written to ./reports by default (customize with --out).
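
The JSON export is easy to post-process in scripts. A small sketch, assuming the `reports/slowql_results_*.json` file naming and the `statistics.by_severity` layout used in the CI example later in this README:

```python
import json
from pathlib import Path

# Sketch: load the most recent SlowQL JSON export and return its
# per-severity counts. Assumes the reports/slowql_results_*.json naming
# and the statistics.by_severity layout shown in the CI example.
def latest_severity_counts(report_dir: str = "reports") -> dict:
    path = sorted(Path(report_dir).glob("slowql_results_*.json"))[-1]
    data = json.loads(path.read_text(encoding="utf-8"))
    return data["statistics"]["by_severity"]
```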

Rule coverage (examples)

Security:

  • Dynamic SQL / concatenation
  • Excessive grants and wildcard principals
  • Hardcoded secrets / API keys
  • PII exposure patterns (emails, SSNs)

Performance:

  • SELECT *
  • Non‑SARGable predicates (functions on columns, leading wildcard LIKE)
  • Deep OFFSET pagination
  • Regex in WHERE; heavy JSON extraction
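
Non-SARGable predicates are easy to demonstrate outside SlowQL. This standalone SQLite example (illustrative only, not SlowQL output) shows how wrapping an indexed column in a function forces a scan of every row, while an equivalent range predicate lets the planner use the index:

```python
import sqlite3

# Demonstration of a non-SARGable predicate using SQLite's
# EXPLAIN QUERY PLAN; the same principle applies to other engines.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("CREATE INDEX idx_name ON users (name)")

def plan(sql: str) -> str:
    # Each EXPLAIN QUERY PLAN row carries its detail text in column 3.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Function applied to the column: the index cannot be used (SCAN).
print(plan("SELECT * FROM users WHERE substr(name, 1, 1) = 'a'"))
# Equivalent range predicate: the planner performs an index SEARCH.
print(plan("SELECT * FROM users WHERE name >= 'a' AND name < 'b'"))
```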

Cost:

  • Unbounded scans on partitioned data
  • Unfiltered aggregation
  • Cross‑region joins (where detectable)

Correctness / Logic:

  • UPDATE/DELETE without WHERE
  • NULL comparison bugs (= NULL / != NULL)
  • Always true/false conditions
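
The NULL comparison bug is worth a concrete demonstration: under SQL's three-valued logic, `x = NULL` evaluates to NULL (never true), so matching rows silently vanish. A standalone SQLite illustration:

```python
import sqlite3

# `= NULL` never matches under three-valued logic; `IS NULL` is correct.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(None,), (1,)])

print(conn.execute("SELECT count(*) FROM t WHERE x = NULL").fetchone()[0])   # 0
print(conn.execute("SELECT count(*) FROM t WHERE x IS NULL").fetchone()[0])  # 1
```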

Reliability:

  • Window functions without PARTITION BY
  • Recursive CTE without bounds
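
A recursive CTE without a termination condition can recurse until the engine's depth or memory limits are hit. A bounded version, shown here in SQLite, keeps it safe:

```python
import sqlite3

# A bounded recursive CTE: the `x < 5` predicate terminates the recursion.
# Without such a bound (or a LIMIT), the CTE would recurse indefinitely.
conn = sqlite3.connect(":memory:")
rows = conn.execute(
    "WITH RECURSIVE cnt(x) AS ("
    "  SELECT 1 UNION ALL SELECT x + 1 FROM cnt WHERE x < 5"
    ") SELECT x FROM cnt"
).fetchall()
print([r[0] for r in rows])  # [1, 2, 3, 4, 5]
```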

Quality / Maintainability:

  • Unused CTE
  • Excess DISTINCT
  • Identifier/style consistency (optional)

The catalog is large and expanding; see code and docs for the current list.

How it works

  • Static analysis: no connection to your database.
  • Deterministic: parses SQL text and applies rule signatures, heuristics, and structural checks.
  • Multi‑query awareness: detects duplicate patterns and N+1 shapes across batches.
  • Privacy‑first: your SQL never leaves your machine.
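
The multi-query point can be sketched in a few lines. One common approach (illustrative only, not SlowQL's internals) is to normalize literals away so that repeated query "shapes" — the classic N+1 pattern — become countable:

```python
import re
from collections import Counter

# Sketch: replace string and numeric literals with "?" so queries that
# differ only in their parameters collapse to the same shape.
def shape(sql: str) -> str:
    sql = re.sub(r"'[^']*'", "?", sql)   # string literals
    sql = re.sub(r"\b\d+\b", "?", sql)   # numeric literals
    return re.sub(r"\s+", " ", sql).strip().lower()

batch = [
    "SELECT * FROM orders WHERE user_id = 1",
    "SELECT * FROM orders WHERE user_id = 2",
    "SELECT * FROM orders WHERE user_id = 3",
]
shapes = Counter(shape(q) for q in batch)
print(shapes.most_common(1))
# A shape repeated N times across a batch suggests an N+1 access pattern.
```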

Integrations (CI / pre-commit / Docker)

GitHub Actions:

name: SlowQL
on: [push, pull_request]
jobs:
  lint-sql:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install slowql readchar
      - run: slowql --non-interactive --input-file sql/ --export json
      - name: Fail on critical issues
        run: |
          python - <<'PY'
          import json, glob
          path = sorted(glob.glob('reports/slowql_results_*.json'))[-1]
          data = json.load(open(path, encoding='utf-8'))
          critical = data["statistics"]["by_severity"].get("CRITICAL", 0)
          if critical > 0:
              raise SystemExit(f"Found {critical} CRITICAL issues")
          PY

Pre‑commit (conceptual):

repos:
  - repo: local
    hooks:
      - id: slowql
        name: SlowQL
        entry: slowql --non-interactive --export json
        language: system
        files: \.sql$

Docker:

docker run --rm -v "$PWD":/work makroumi/slowql slowql --input-file /work/queries.sql --export html

Performance & privacy

  • Static analysis; no DB round‑trips.
  • Precompiled signatures; fast heuristics.
  • Offline by default; zero telemetry.
  • Nothing leaves your machine unless you export files.

Roadmap

  • Issue browser (arrow keys; expand details)
  • Multi‑select export (space toggles, Enter confirms)
  • VS Code extension
  • Dialect‑specific packs & AST expansion
  • Pluggable rule packs and org policies
  • Browser playground (Pyodide)

Contributing

We welcome PRs for new rules, docs, tests, and formatters.

Dev setup:

git clone https://github.com/makroumi/slowql
cd slowql
python -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -e ".[dev]"
pytest -q
ruff check .
mypy .

Before opening a PR:

  • Include tests for new rules
  • Update docs and examples
  • Ensure ruff and mypy pass

License

Apache 2.0 — see LICENSE

FAQ

Does SlowQL connect to my database?
No. It analyzes SQL text only.

Which dialects are supported?
Rules are mostly dialect‑agnostic (PostgreSQL, MySQL, SQLite, SQL Server, Snowflake, BigQuery, Redshift). Dialect‑specific packs are planned.

How many rules are there?
A large and growing catalog across performance, security, cost, logic, reliability, and style.

Can I write custom rules?
Yes — the detector architecture is modular. Public API and examples are planned.
