This repository contains the Gyrinx Django application - a gang management tool for Necromunda. The code for this application is in the `gyrinx` directory.
📚 Full Documentation - Technical overview, architecture, and development guides.
Before getting started, you'll need:
- Python 3.12+ - use pyenv to manage versions
- macOS with Homebrew - the local dev scripts (`setup-local-postgres.sh`, `dev.sh`) are macOS-only. Linux contributors will need to set up PostgreSQL 16 manually and start it before running `dev.sh`
- Git - for version control
```bash
# Clone and enter the repository
git clone git@github.com:gyrinx-app/gyrinx.git && cd gyrinx

# Set up Python environment
python -m venv .venv && . .venv/bin/activate
pip install --editable .

# Configure the application
manage setupenv

# Set up frontend toolchain
nodeenv -p && npm install && npm run build

# One-time: install and initialise local PostgreSQL (macOS via Homebrew)
./scripts/setup-local-postgres.sh

# Start the development server (DB fork + migrate + runserver + CSS watch)
./scripts/dev.sh
```

Visit http://localhost:8000 to see the application.
There's a devcontainer configured in this repo which should get you up and running too, perhaps via a Codespace.
The Django `manage.py` file (in `scripts/`) is added to your shell by setuptools, so you can just use `manage` from anywhere:

```bash
manage shell
```

The Quick Start above gets you running fast. For more detailed steps:
- Clone the repository:

  ```bash
  git clone git@github.com:gyrinx-app/gyrinx.git
  cd gyrinx
  ```

- Make sure you're using the right Python version:

  ```bash
  python --version # should be >= 3.12
  ```

  If you use pyenv, we have a `.python-version` file. If you have pyenv active in your environment, this file will automatically activate this version for you.

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv && . .venv/bin/activate
  ```

- Install the project in editable mode so you can use the `manage` command:

  ```bash
  pip install --editable .
  ```

  setuptools will handle installing dependencies.

- You should then be able to run Django `manage` commands. This one will set up your `.env` file:

  ```bash
  manage setupenv
  ```

  With that run, you'll have a `.env` file with a random and unique `SECRET_KEY` and `DJANGO_SUPERUSER_PASSWORD`:

  ```bash
  cat .env
  ```

- Next, set up the frontend toolchain:

  Get `nodeenv` (installed by pip earlier) to install node and npm in the virtual env.

  ```bash
  nodeenv -p
  ```

  Check it has worked (you might need to `deactivate` then `. .venv/bin/activate`):

  ```bash
  which node # should be /path/to/repo/.venv/bin/node
  which npm # should be /path/to/repo/.venv/bin/npm
  ```

- Install the frontend dependencies:

  ```bash
  npm install
  ```

- Build the frontend:

  ```bash
  npm run build
  ```

- Install the pre-commit hooks:

  Before making any changes, make sure you've got pre-commit hooks installed. `pre-commit` is installed by pip.

  ```bash
  pre-commit install
  ```
The development workflow is a single command:

```bash
./scripts/dev.sh
```

This handles:

- Forking the per-worktree database from the `gyrinx_main` template (if missing)
- Running migrations
- Starting Django on a deterministic per-worktree port (8000 for `main`, 8100–9599 for child worktrees)
- Starting `npm run watch` for SCSS rebuilds
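The fork step leans on Postgres template databases, which make the copy a cheap file-level operation. A minimal sketch of what it amounts to, assuming the worktree database is named after the directory (the real naming logic lives in `dev.sh`):

```shell
# Sketch of the per-worktree DB fork. The template name comes from the docs
# above; the worktree DB naming scheme here is an assumption.
TEMPLATE="gyrinx_main"
WORKTREE_DB="gyrinx_$(basename "$(pwd)")"

# createdb --template makes a cheap file-level copy of the template database.
if command -v createdb >/dev/null 2>&1; then
    createdb --template="$TEMPLATE" "$WORKTREE_DB" || true # tolerate "already exists" on re-runs
fi
```

Because the copy happens at the cluster level, the template must have no active connections while a fork is in progress.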
Useful flags:
```bash
./scripts/dev.sh --no-watch # skip the CSS watcher
./scripts/dev.sh --reset-db # drop and re-fork the worktree DB
```

If you've never set up local PostgreSQL on this machine, run `./scripts/setup-local-postgres.sh` first.
Note
For details on the per-worktree database model and how `gyrinx_main` is used as a template, see `docs/useful-scripts.md`.
The Python toolchain installs `nodeenv`, which is then used to install node and npm, giving us a frontend toolchain.
To continuously rebuild the frontend (necessary for CSS updates from SASS):

```bash
npm run watch
```

Tests run against your local PostgreSQL database. `setup-local-postgres.sh` configures everything you need: `max_locks_per_transaction = 256` on the cluster (required for pytest-xdist parallel syncdb), and a hook in `.venv/bin/activate` that exports the per-worktree `DB_NAME` / `DB_CONFIG` / `DJANGO_PORT` so pytest and `manage` target the right database.
Important
Re-run `source .venv/bin/activate` after switching worktrees. The hook reads `git rev-parse --show-toplevel` at activation time, not on every command. Symptom of forgetting: pytest fails with `FATAL: role "postgres" does not exist` (`settings.py` fell back to defaults because `DB_CONFIG` was unset or pinned to the wrong worktree).
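The shape of that hook is roughly the following. This is a sketch, not the generated script; the naming scheme and variable derivation are assumptions:

```shell
# Sketch of the activation-time hook. The real hook is written into
# .venv/bin/activate by setup-local-postgres.sh; names here are assumptions.
worktree_db_name() {
    # Derive the per-worktree database name from the worktree directory.
    printf 'gyrinx_%s\n' "$(basename "$1")"
}

# Evaluated once, when the venv is activated - hence the need to re-source
# the activate script after switching worktrees.
WORKTREE_ROOT="$(git rev-parse --show-toplevel 2>/dev/null || pwd)"
export DB_NAME="$(worktree_db_name "$WORKTREE_ROOT")"
```

Because the exports are computed once at activation, a shell that was activated in one worktree keeps that worktree's values until you re-source.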
The wrapper script is a thin convenience over pytest:

```bash
./scripts/test.sh # parallel (via pyproject addopts: -n auto)
./scripts/test.sh -n 0 # serial
```

Or invoke pytest directly — `pyproject.toml` already sets `-n auto --nomigrations`, so the bare command runs the full suite in parallel and rebuilds the test DB from current model definitions on every run:

```bash
pytest # full suite, parallel
pytest gyrinx/core/tests/ # one directory
pytest -k campaign # by name
```

You can also use pytest-watcher for continuous testing:

```bash
ptw .
```

CI runs the same pytest invocation against a GitHub Actions service container Postgres — see `.github/workflows/test.yaml`.
To create a new empty migration file for a data migration:

```bash
manage makemigrations --empty content
```

You can debug the SQL that Gyrinx is running using the Django Debug Toolbar, which is installed.
You can also enable SQL logging by setting the `SQL_DEBUG` variable:

```bash
SQL_DEBUG=True
```
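For example, a small helper to switch it on in your `.env`. The variable name comes from above; the helper itself is just a convenience sketch:

```shell
# Append SQL_DEBUG=True to an env file unless it is already set (idempotent).
enable_sql_debug() {
    env_file="$1"
    if ! grep -q '^SQL_DEBUG=' "$env_file" 2>/dev/null; then
        echo 'SQL_DEBUG=True' >> "$env_file"
    fi
}

enable_sql_debug .env
```

The `grep` guard means re-running the helper never duplicates the line.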
Testing Gyrinx locally is very limited unless you have the content library data available, because the content library provides the data the Gyrinx application works with.
The content library is managed by the Gyrinx content team in production, and is what makes Gyrinx useful.
Gyrinx uses a custom data export/import process to bring content library data from production into your local environment for testing.
Note
This process is only available for trusted developers and admins.
- Export: Run the `gyrinx-dumpdata` Cloud Run job in production to export content to the `gyrinx-app-bootstrap-dump` bucket
- Import: Download `latest.json` from the bucket and use `manage loaddata_overwrite latest.json` to replace local content data
This process ensures we have access to the latest production content library. See `docs/operations/content-data-management.md` for details.
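Assuming you have `gcloud` access to the bucket, the import side can be sketched as follows. The bucket name and commands come from the steps above; the local path and guard are assumptions:

```shell
# Download the latest content dump and overwrite local content data.
DUMP_URI="gs://gyrinx-app-bootstrap-dump/latest.json"
DEST="/tmp/$(basename "$DUMP_URI")"

# Guarded so the sketch is safe to paste on a machine without gcloud.
if command -v gcloud >/dev/null 2>&1; then
    gcloud storage cp "$DUMP_URI" "$DEST" \
        && manage loaddata_overwrite "$DEST" \
        || echo "import failed - check gcloud auth and bucket access" >&2
fi
```

Note that `loaddata_overwrite` replaces local content, so run it only against a database you are happy to clobber.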