The Wagtail CMS for managing and publishing content for the Office for National Statistics (ONS)
For further developer documentation, see the docs.
To get a local copy up and running, follow the steps below.
Ensure you have the following installed:
- Python: the version specified in `.python-version`. We recommend using pyenv for managing Python versions.
- Poetry: used to manage package dependencies and virtual environments.
- Colima: for running the project in Docker containers.
- PostgreSQL: for the database. Provided as a container via `docker-compose.yml` when using the Docker setup.
- Node and `nvm` (Node Version Manager): for front-end tooling.
- JQ: for the build step that installs the design system templates.
- `texlive-latex-extra` and `texlive-fonts-recommended`: required by `matplotlib` to render LaTeX equations. See below for instructions on how to install them on macOS.
- Operating system: Ubuntu or macOS.
- Clone the repository

  ```bash
  git clone git@github.com:ONSdigital/dis-wagtail.git
  ```
- Install dependencies

  Poetry is used to manage dependencies in this project. For more information, read the Poetry documentation.

  To install all dependencies, including development dependencies, run:

  ```bash
  make install-dev
  ```

  To install only production dependencies, run:

  ```bash
  make install
  ```
Follow these steps to set up and run the project using Docker.
- Build and Start the Containers

  ```bash
  # pull the supporting containers
  make compose-pull

  # build the main application image
  make compose-build

  # start the containers
  make compose-up
  ```
- Migrations and Superuser Creation

  If this is your first time setting up the project, you’ll need to run migrations to set up the database schema, load the topics dev fixture, and create an administrative user.

  ```bash
  # ssh into the web container
  make docker-shell

  # Run database migrations
  make migrate

  # Load the topics dev fixture
  make load-topics

  # Create a superuser for accessing the admin interface
  make createsuperuser
  ```
- Compile translations

  In order to see pages in different languages, you'll need to compile the translations. This is done by running:

  ```bash
  make compilemessages
  ```

  This will create the necessary `.mo` files for the translations.
- Start Django Inside the Container

  Once the containers are running, you need to manually start Django from within the web container. This allows for running both the Django server and any additional background services (e.g. schedulers).

  ⚠️ WARNING: The `honcho` command will pick up your locally mounted `.env` file when running via `docker-compose`. Ensure that you comment out any variables in the `.env` file which might cause clashes in the container context, as they will take precedence when running `honcho start`.

  ```bash
  # Start both Django and the scheduler using Honcho
  honcho start

  # This is not needed if you used `honcho start`
  make runserver
  ```
You can then access the admin at http://0.0.0.0:8000/admin/ or http://localhost:8000/admin/.
You can also run the main application locally, with the supporting backend services such as Postgres and Redis running in Docker. This can be useful when you want to make changes that require the app to be restarted in order to be picked up.
The correct development configuration should be used by default when running from the shell. For IDEs, you may need to add `DJANGO_SETTINGS_MODULE=cms.settings.dev` to their runtime configuration (e.g. see the PyCharm configuration documentation).
In order to run it:
- Pull the images of the supporting services.

  ```bash
  make compose-dev-pull
  ```
- Start the supporting services in Docker.

  ```bash
  make compose-dev-up
  ```
- Run the below command to apply the necessary pre-run steps, which include:

  - loading the design system templates,
  - collecting the static files,
  - generating and applying database migrations,
  - loading the topics dev fixture,
  - creating a superuser with:
    - username: `admin`
    - password: `changeme` # pragma: allowlist secret
  - setting the port for the Wagtail site(s).

  ```bash
  make dev-init
  ```
- Run the Django server locally via your IDE, or with the following command:

  ```bash
  make runserver
  ```
By default, make targets will use the `cms.settings.dev` settings unless their commands explicitly use a different settings module (via the `--settings` parameter) or `DJANGO_SETTINGS_MODULE` is set in the environment. This default should work out of the box for local development.
To override settings in the environment, you can use a `.env` file. Note, however, that settings in this file may also be picked up in the Docker container, so you may need to remove or rename the file, or comment out specific variables, if you switch to running the app in the container.
Note
When running the application locally in a virtual environment via Poetry, the `.env` file will not be picked up automatically.
For this to work, you'll need to install the poetry-plugin-dotenv plugin.
Note that this will not work if you have installed Poetry via brew; it is recommended to install Poetry via either pipx or the official installation script (see the Poetry installation docs).
Some functionality in the application relies on external services that the default configuration points to; an example is the Dataset API. Access to these services may require additional setup, or a mocked version. Without that setup, the functionality will not work correctly.
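Where a mocked service is used, pointing the app at it is typically just a settings or environment override. The sketch below is purely hypothetical: the setting and environment variable names are made up for illustration and may not match this project's actual configuration.

```python
# Hypothetical dev-settings override, for illustration only.
# The real setting name for the Dataset API in this project may differ.
import os

DATASET_API_BASE_URL = os.environ.get(
    "DATASET_API_BASE_URL",  # made-up environment variable name
    "http://localhost:9000",  # a locally running mock of the service
)
```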
Get started with development by running the following commands.
Before proceeding, make sure you have the development dependencies installed using the make install-dev command.
A Makefile is provided to simplify common development tasks. To view all available commands, run:
```bash
make
```

While the end goal is to have all front-end elements in the Design System, the new design introduces a number of components that we need to build and contribute to the DS. In order to aid development and avoid being blocked by the DS, we will use modern front-end tooling for that.
Here are the common commands:
```bash
# Install front-end dependencies.
npm install

# Start the Webpack build in watch mode, without live-reload.
npm run start

# Start the Webpack server build on port 3000 only with live-reload.
npm run start:reload

# Do a one-off Webpack development build.
npm run build

# Do a one-off Webpack production build.
npm run build:prod
```

Python packages can be installed using poetry in the web container:
```bash
make docker-shell
poetry add wagtailmedia
```

To run the tests and check coverage, run:

```bash
make test
```

During tests, the `cms.settings.test` settings module is used. When running tests without using `make test`, ensure this settings module is used.
Our suite of functional, browser-driven tests uses Behave, Playwright and Django Live Server Test Cases to run BDD Cucumber feature tests against the app from a browser.
Install the Playwright dependencies (including its browser drivers) with:

```bash
make playwright-install
```

You can run the tests as an all-in-one command with:

```bash
make functional-tests
```

This will start the docker compose services, run the relevant tests, and then stop the services.
To run the docker compose dependencies (database and Redis) separately, e.g. if you want to run individual functional tests yourself for development, start the docker compose dependencies with:

```bash
make functional-tests-up
```

This will start the dependent services in the background, allowing you to then run the tests separately.
Then, once you are finished testing, stop the dependencies with:

```bash
make functional-tests-down
```

By default, the tests will run in headless mode with no visible browser window.
To disable headless mode and show the browser, set PLAYWRIGHT_HEADLESS=False in the environment from which you are
running the tests. In this circumstance, you will probably also find it helpful to enable "slow mo" mode, which slows
down the automated browser interactions to make it possible to follow what the tests are doing. You can configure it
using the PLAYWRIGHT_SLOW_MO environment variable, passing it a value of milliseconds by which to slow each
interaction, e.g. PLAYWRIGHT_SLOW_MO=1000 will cause each individual browser interaction from the tests to be delayed
by 1 second.
For example, you can run the tests with a visible browser and each interaction slowed by a second by running:

```bash
PLAYWRIGHT_HEADLESS=False PLAYWRIGHT_SLOW_MO=1000 make functional-tests
```

Refer to the detailed functional tests development docs for more information.
Various tools are used to lint and format the code in this project.
The project uses Ruff and pylint for linting and formatting of the Python code.
The tools are configured using the pyproject.toml file.
To lint the Python code, run:
```bash
make lint
```

The Django Migration Linter is used for linting migration files in the project.
To lint the Django migration files, run:

```bash
make lint-migrations
```

If you need to bypass migration linting on certain files, add the migration name (the file name minus the `.py`) to the list of ignored migrations in the `MIGRATION_LINTER_OPTIONS` setting in the dev settings file, with a comment explaining why the file is ignored.
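For illustration, the ignore list could look something like the sketch below. This is a hedged example rather than the project's actual configuration: the `ignore_name` key follows django-migration-linter's option naming and should be checked against the existing `MIGRATION_LINTER_OPTIONS` value in the dev settings, and the migration name shown is made up.

```python
# cms/settings/dev.py (illustrative sketch only; check against the real file)
# django-migration-linter reads its options from this Django setting.
MIGRATION_LINTER_OPTIONS = {
    "ignore_name": [
        # Hypothetical migration, listed without the .py extension.
        # Ignored because it only backfills data on an empty table.
        "0002_example_backfill",
    ],
}
```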
To auto-format the Python code, and correct fixable linting issues, run:
```bash
make format
```

```bash
# lint and format custom CSS/JS
npm run lint

# only CSS
npm run lint:css

# only JS
npm run lint:js

# check css, js, markdown and yaml formatting
npm run lint:format

# format
npm run format
```

Note that this project has configuration for pre-commit. To set it up locally:
```bash
# if you don't have it yet, install it globally
pip install pre-commit
```

pylint, which is run as part of pre-commit, relies on the poetry packages all being installed. If you are running this on your local machine, you need to install them if you have not done so previously. Poetry automatically creates a virtual environment when you do this, which the pylint command will make use of.

```bash
# if you haven't run this locally previously
poetry install
```

pylint also relies on the libpq library being installed as a global package on your local machine. The installation steps below are for Macs.

```bash
brew install libpq
```

After the above command, follow the Homebrew post-install instructions for PATH exports.

```bash
# in the project directory, initialize pre-commit
pre-commit install

# Optional: run all checks once; after this, the checks will run only on the changed files
pre-commit run --all-files
```

The detect-secrets pre-commit hook requires a baseline secrets file to be included. If you need to, you can update this file, e.g. when adding dummy secrets for unit tests:

```bash
poetry run detect-secrets scan > .secrets.baseline
```

MegaLinter is utilised to lint the non-Python files in the project. It offers a single interface to execute a suite of linters for multiple languages and formats, ensuring adherence to best practices and maintaining consistency across the repository without the need to install each linter individually.
MegaLinter examines various file types and tools, including GitHub Actions, Shell scripts, Dockerfile, etc. It is
configured using the .mega-linter.yml file.
To run MegaLinter, ensure you have Docker installed on your system.
Note: The initial run may take some time to download the Docker image. However, subsequent executions will be considerably faster due to Docker caching. 🚀
To start the linter and automatically rectify fixable issues, run:
```bash
make megalint
```

Mailpit is a lightweight, local SMTP server and web interface that captures all outgoing email from our application without actually delivering it. Rather than sending mail to real recipients, messages are stored in Mailpit’s inbox for easy inspection.
- SMTP endpoint: `localhost:1025`
- Web UI: http://localhost:8025
Use Mailpit to:
- Preview email content, headers and attachments
- Verify templates and formatting before going to production
- Test email-related workflows
No additional configuration is needed here – our development settings already point at Mailpit’s SMTP port,
and every send_mail call will appear instantly in the UI.
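As a quick illustration (not part of the project's code), any standard Django email call made with the development settings active should appear in the Mailpit inbox; the addresses below are made up:

```python
# Illustrative sketch: run from a Django shell (e.g. `python manage.py shell`)
# with the development settings active, then check http://localhost:8025.
from django.core.mail import send_mail

send_mail(
    subject="Mailpit smoke test",
    message="If you can read this in the Mailpit UI, outgoing email is wired up.",
    from_email="dev@example.com",  # hypothetical sender
    recipient_list=["reviewer@example.com"],  # hypothetical recipient
)
```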
Tip
If you want to disable Mailpit and simply log emails to your console, switch to Django’s console backend:
```python
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
```

In order to generate the equations in Wagtail using LaTeX strings via Matplotlib (Non-JS Equations), we will need to use MacPorts to install the following packages: `texlive-latex-extra` and `texlive-fonts-recommended`.
As a prerequisite you will have to have MacPorts installed.
```bash
sudo port install texlive-latex-extra texlive-fonts-recommended
```
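To sanity-check the LaTeX toolchain after installing the packages, a minimal sketch such as the one below can be used; it is not the project's rendering code, and it assumes matplotlib is available in your environment. It only succeeds once LaTeX is installed, because matplotlib shells out to LaTeX when `text.usetex` is enabled.

```python
# Minimal sketch to verify matplotlib can render a LaTeX equation.
# Not the project's rendering code; it only exercises the toolchain.
import matplotlib

matplotlib.use("Agg")  # render to a file, no display required
import matplotlib.pyplot as plt

plt.rcParams["text.usetex"] = True  # requires the texlive packages above

fig = plt.figure(figsize=(3, 1))
fig.text(0.5, 0.5, r"$e^{i\pi} + 1 = 0$", ha="center", va="center", fontsize=16)
fig.savefig("equation.png", dpi=200, bbox_inches="tight", transparent=True)
```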
Wagtail is built on Django, and changes to its models may require generating and running schema migrations. For full details, see the Django documentation on migrations.
Below are the commands you will most commonly use. Note that these have to be run inside the container.
```bash
# Check if you need to generate any new migrations after changes to the model
make makemigrations-check

# Generate migrations
make makemigrations

# Apply migrations. Needed if new migrations have been generated (either by you, or via upstream code)
make migrate
```
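As a purely illustrative sketch (this model does not exist in the codebase), the kind of change that requires `make makemigrations` followed by `make migrate` is adding or altering a model field, for example:

```python
# Hypothetical example model, not part of this project.
from django.db import models
from wagtail.models import Page


class ExampleArticlePage(Page):
    # Adding a new field like this means a new migration must be generated
    # (make makemigrations) and then applied (make migrate).
    summary = models.TextField(blank=True)
```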
Translations are managed using `.po` files, which are compiled into `.mo` files for use in the application.
The `.po` files are located in the `cms/locale` directory.
If you add new text to the application, you will need to update the `.po` files to include the new text. You can do this by running the following command:

```bash
make makemessages
```

This will scan the codebase for new text and update the `.po` files accordingly.
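Note that `makemessages` only collects strings that are marked for translation. A minimal sketch, with a made-up string rather than one from the codebase:

```python
# Hypothetical snippet: strings wrapped in gettext are picked up by
# `make makemessages` and added to the .po files for translators.
from django.utils.translation import gettext_lazy as _

EXAMPLE_LABEL = _("Latest publications")  # made-up string for illustration
```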
Once you have updated the `.po` files, you will need to compile them into `.mo` files for use in the application.
You can do this by running the following command:

```bash
make compilemessages
```

This will compile the `.po` files into `.mo` files, which are used by Django to display the translated text.
See CONTRIBUTING.md for details.
See LICENSE for details.