Recently I worked on a Django project where I built an API using DRF, but I was also responsible for the infrastructure setup. Setting up a baseline Django project was easy thanks to the Django cookiecutter, but the path to a CI/CD setup was not as straightforward. Over time, I've found a large ecosystem of tools that format, lint and test Django/Python projects very well, and with it, getting a good CI suite with GitHub Actions was easy. However, there was no blog post or guide covering all the various tools available. Therefore, I want to share all the actions we're using to hold our code to the highest standard possible with these tools.
The tools used in the CI steps will not be described in depth, as there is plenty of documentation for them on their respective homepages. At the end there is also a section on pre-commit hooks that match all the recommended CI steps.
All the CI steps target the Python programming language, but some of the actions are specific to the Django framework. If you are not using Django, just skip the Django-specific sections.
Some of the steps, specifically the Django-based ones, require a database and cache connection if the project uses them. With GitHub Actions, the setup looks similar to the following:
```yaml
# Define the env variables
env:
  DATABASE_URL: psql://postgres:postgres@localhost:5432/test
  REDIS_URL: redis://localhost:6379/

jobs:
  test:
    name: Test
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:14
        ports: ['5432:5432']
        env:
          POSTGRES_DB: test
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
      redis:
        image: redis:6
        ports: ['6379:6379']
```
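The DATABASE_URL and REDIS_URL values follow the URL convention used by libraries such as django-environ, which parse them into individual connection parameters. As a quick illustration with the standard library (using the same URL as in the workflow above):

```python
from urllib.parse import urlsplit

# The same connection URL as in the workflow's env section
url = urlsplit("psql://postgres:postgres@localhost:5432/test")

print(url.scheme)            # backend hint: "psql"
print(url.username)          # "postgres"
print(url.hostname)          # "localhost"
print(url.port)              # 5432
print(url.path.lstrip("/"))  # database name: "test"
```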
The steps presented in the next section assume that your pipeline has a virtual environment with all the project's dependencies installed. With GitHub Actions and Poetry (a dependency manager that replaces Pip), it would be set up as:
```yaml
steps:
  - uses: actions/checkout@v3  # Fetch Git repo
  - uses: actions/setup-python@v4  # Setup Python
    id: setup-python
    with:
      python-version: 3.10.8
  - name: Install Poetry
    uses: snok/install-poetry@v1  # Setup Poetry
    with:
      virtualenvs-create: true  # Add venv
      virtualenvs-in-project: true
      installer-parallel: true
  - name: Install dependencies  # Install all the dependencies
    run: |
      poetry install --no-interaction --no-root
  - name: Format (black)  # Activate the virtual env
    run: |
      source .venv/bin/activate
      black --check $(git ls-files -- '*.py' ':!:**/migrations/*.py')
```
The following dependencies are required (ignore the Django/DRF/Celery ones if you don't use them) and should be installed by Pip/Poetry in the pipeline:
They will be described briefly in each section below.
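As a sketch, a Poetry dev-dependency group covering the tools from the sections below might look like this (the version constraints are illustrative; pin your own):

```toml
[tool.poetry.group.dev.dependencies]
black = "*"
isort = "*"
mypy = "*"
django-stubs = "*"
djangorestframework-stubs = "*"
pylint = "*"
pylint-django = "*"
pylint-celery = "*"
pyupgrade = "*"
django-upgrade = "*"
djlint = "*"
pytest = "*"
pytest-django = "*"
pytest-xdist = "*"
pytest-sugar = "*"
pytest-cov = "*"
drf-spectacular = "*"
```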
Formatting Code with Black
Black is a code formatter for Python. You can add it to your pipeline with the following:
```yaml
- name: Format (black)
  run: |
    black --check $(git ls-files -- '*.py' ':!:**/migrations/*.py')
```
The --check flag makes Black report whether files would be reformatted and fail the step if changes are detected, without modifying anything.
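The `$(git ls-files -- '*.py' ':!:**/migrations/*.py')` part selects all tracked Python files while excluding migrations (the `:!:` prefix is Git's exclude pathspec). A rough stdlib sketch of that filter, with hypothetical file names; the real pathspec matching is done by Git itself:

```python
from fnmatch import fnmatch

# Hypothetical tracked files, as `git ls-files` might list them
tracked = [
    "config/settings/base.py",
    "users/models.py",
    "users/migrations/0001_initial.py",
    "README.md",
]

# Keep .py files that are not inside a migrations/ directory
selected = [
    f for f in tracked
    if f.endswith(".py") and not fnmatch(f, "*/migrations/*.py")
]
print(selected)  # only base.py and models.py remain
```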
Sorting Imports with Isort
Isort is an import sorter for Python. You can add it to your pipeline with the following:
```yaml
- name: Sort imports (isort)
  run: |
    isort --check-only $(git ls-files -- '*.py' ':!:**/migrations/*.py')
```
The --check-only flag ensures that files are only checked, not formatted, and fails the step if changes are detected.
Type Checking with Mypy
Mypy is a static type checker for Python. You can add it to your pipeline with the following:
```yaml
- name: Type Check (mypy)
  run: |
    mypy .
```
To add Django and DRF integration to Mypy, install django-stubs and djangorestframework-stubs with Pip/Poetry and add the following to pyproject.toml:
```toml
[tool.mypy]
plugins = ["mypy_django_plugin.main", "mypy_drf_plugin.main"]
```
Linting with Pylint
Pylint is a linter for Python. You can add it to your pipeline with the following:
```yaml
- name: Lint (pylint)
  run: |
    pylint $(git ls-files -- '*.py' ':!:**/migrations/*.py')
```
To add Django and Celery integration to Pylint, install pylint-django and pylint-celery, then add the following to your Pylint configuration (e.g. .pylintrc):
```ini
[MASTER]
load-plugins=pylint_django, pylint_celery
django-settings-module=config.settings.base
```
Ensuring that Python Syntax is up-to-date with Pyupgrade
Pyupgrade is a tool that upgrades Python code to newer syntax. You can add it to your pipeline with the following:
```yaml
- name: Python Upgrade (pyupgrade)
  run: |
    pyupgrade $(git ls-files -- '*.py' ':!:**/migrations/*.py') --py39-plus
```
The --py39-plus flag indicates that we want to use the syntax of Python 3.9 and upwards; adjust the flag according to your version.
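One typical rewrite pyupgrade performs (already with --py36-plus and up) is turning old-style `.format()` calls into f-strings. Both spellings below behave identically at runtime; only the second survives the rewrite:

```python
name = "world"

# Before pyupgrade: old-style string formatting
greeting_old = "hello {}".format(name)

# After pyupgrade: f-strings, available since Python 3.6
greeting_new = f"hello {name}"

assert greeting_old == greeting_new  # same result, newer syntax
```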
Ensuring that Django Syntax is up-to-date with Django-upgrade
Django-upgrade is the equivalent syntax upgrade tool for Django. You can add it to your pipeline with the following:
```yaml
- name: Django Upgrade (django-upgrade)
  run: |
    django-upgrade $(git ls-files -- '*.py' ':!:**/migrations/*.py') --target-version 3.2
```
The --target-version 3.2 flag indicates that we want to use the Django syntax of 3.2 and upwards; adjust it according to your version.
Linting Django Templates with DjLint
DjLint is a linter and formatter for Django templates. You can add it to your pipeline with the following:
```yaml
- name: Django Template/Html Lint (djlint)
  run: |
    djlint $(git ls-files -- '*.html')
```
DjLint is configured in pyproject.toml, where you'll need to set the profile to Django. I also recommend a couple of other settings: respecting the gitignore, formatting CSS and JS, and preserving blank lines:
```toml
[tool.djlint]
profile = "django"
preserve_blank_lines = true
use_gitignore = true
format_css = true
format_js = true
```
Running Django Migrations
Run the Django migrations to ensure that the migration graph's leaves and nodes are correct and that applying the migrations works. This requires a Postgres database in your CI workflow.
```yaml
- name: Run Migrations
  run: |
    ./manage.py migrate
```
Checking for Missing Django Migrations
Run the Django migration check to ensure that all model changes have corresponding migrations checked into Git.
```yaml
# Some Django checks require migrations to run for full functionality,
# therefore run migrations before checking for missing migrations
- name: Check for Missing Migrations
  run: |
    DJANGO_SETTINGS_MODULE=config.settings.production \
    python manage.py makemigrations --check --dry-run
```
Running Django Checks
The Django system check framework enforces a couple of best practices for database models, caches, security and more. A full list of checks can be found here, and you can also add custom checks as explained in the Django docs. To run these checks in your CI, add the following:
```yaml
- name: Django Check
  run: |
    DJANGO_SETTINGS_MODULE=config.settings.production \
    ./manage.py check --deploy --fail-level=WARNING
```
The --deploy flag activates additional deployment-related checks, as documented here. The --fail-level=WARNING flag ensures that the check fails on issues at the WARNING level and not only at the ERROR level.
Running Tests with Pytest
Pytest is a framework for writing Python tests. You can add it to your pipeline with the following:
```yaml
- name: Run Pytest
  run: |
    pytest --ignore .poetry  # ignore virtual envs
```
To add Django integration to Pytest, install pytest-django and add your DJANGO_SETTINGS_MODULE to the Pytest arguments. Below is an example of the configuration in pyproject.toml:
```toml
[tool.pytest.ini_options]
addopts = "--ds=config.settings.test"  # ds means Django settings
python_files = "tests.py test_*.py"
```
Multiple Cores with Pytest-xdist
Pytest-xdist is a Pytest plugin that runs your test suite on multiple CPU cores. Install it and add the -n=auto flag to autodetect how many cores your machine has and use all of them. You can add this to the default arguments for Pytest:
```toml
[tool.pytest.ini_options]
addopts = "--ds=config.settings.test -n=auto"
```
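The auto value roughly corresponds to the machine's CPU count, which you can inspect with the standard library:

```python
import os

# -n=auto lets pytest-xdist derive the worker count from the available CPUs
workers = os.cpu_count()
print(f"pytest-xdist would start about {workers} workers")
```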
Disabling Pytest-xdist Automatically
In some cases you probably don't want to run tests on multiple cores, for example when using breakpoints or when targeting a single test locally. To get a sane default behavior, you can use command-line pre-parsing to disable the multicore run in those cases. In the example below we check whether the pytest command's arguments contain the -k flag, which targets a test suite or a single test, or the -s flag, which is used for input when using breakpoints. If they do, we remove -n=auto, which disables Pytest-xdist.
```python
import sys

def pytest_cmdline_preparse(args):
    if "xdist" in sys.modules and ("-k" in args or "-s" in args):
        for i, arg in enumerate(args):
            # Remove the -n=auto option so the run stays single-core
            if arg == "-n=auto":
                del args[i]
                break
```
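Since the hook is a plain function mutating the args list, you can sanity-check it in isolation. In this self-contained sketch, the presence of pytest-xdist is faked in sys.modules purely for the demonstration:

```python
import sys

def pytest_cmdline_preparse(args):
    # Same hook as above: drop -n=auto when -k or -s is used
    if "xdist" in sys.modules and ("-k" in args or "-s" in args):
        for i, arg in enumerate(args):
            if arg == "-n=auto":
                del args[i]
                break

# Fake the presence of pytest-xdist for this demonstration only
sys.modules.setdefault("xdist", sys)

args = ["-n=auto", "-k", "test_login"]
pytest_cmdline_preparse(args)
print(args)  # -n=auto has been removed
```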
Better Visual Output using Pytest-sugar
Pytest-sugar extends Pytest and provides a much better UI, showing progress bars, errors and failures in real time. Just install it with Pip/Poetry and it will be picked up automatically.
Test Coverage with Pytest-cov
Pytest-cov is a Pytest plugin that produces coverage reports. Simply install it and update the default arguments for Pytest:
```toml
[tool.pytest.ini_options]
addopts = "--ds=config.settings.test -n=auto --cov"
```
You can add configuration to the pyproject.toml file to require a minimum test coverage threshold and to specify which files to omit:
```toml
[tool.coverage.report]
fail_under = 88

[tool.coverage.run]
source = ["."]
omit = [
    "./venv/*",
    "./.virtualenv/*",
    "./.venv/*",
    "*/migrations/*",
    "*/apps.py",
    "*/wsgi.py",
    "./manage.py",
]
```
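The fail_under option compares the total coverage percentage against the threshold; conceptually (with made-up line counts):

```python
# Illustration of what fail_under = 88 means
covered_lines, total_lines = 901, 1000
percent = 100 * covered_lines / total_lines  # 90.1

FAIL_UNDER = 88
ci_step_passes = percent >= FAIL_UNDER
print(ci_step_passes)  # True: 90.1% clears the 88% threshold
```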
OpenAPI Schema Validation with DRF-spectacular
Drf-spectacular is an OpenAPI schema generator for Django REST framework. If you are using it, ensure that you compile and validate the schema in your CI pipeline with the following:
```yaml
- name: Compile and Validate schema
  run: |
    source .venv/bin/activate
    DJANGO_SETTINGS_MODULE=config.settings.production \
    ./manage.py spectacular --file openapi-schema.yml --validate --api-version api-v1 --fail-on-warn
```
The --fail-on-warn flag ensures that the command also fails on messages at the warning level, not only on errors.
Matching Pre-commit Hooks
Pre-commit is a hook system that runs before developers create their commits. The hooks enforce code standards on newly added and changed code, and they can run parts of your CI suite locally on a developer's computer before a pull request is opened and a pipeline is triggered. The feedback cycle is much shorter, as everything happens locally: code can be formatted, linted and tested before it ever reaches the pipeline. Here's a collection of pre-commit hooks that match the CI steps described above:
```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.2.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
  - repo: https://github.com/asottile/pyupgrade
    rev: v2.32.1
    hooks:
      - id: pyupgrade
        args: [--py39-plus]
        language: system
  - repo: https://github.com/psf/black
    rev: 22.8.0
    hooks:
      - id: black
        language: system
  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      # Isort sets the language_version by default
      # and we can't unset it with yaml when using system as the language
      - id: isort
  - repo: https://github.com/pycqa/pylint
    rev: "v2.15.8"
    hooks:
      - id: pylint
        language: system
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: 'v0.971'
    hooks:
      - id: mypy
        language: system
  - repo: https://github.com/adamchainz/django-upgrade
    rev: "1.11.0"
    hooks:
      - id: django-upgrade
        args: [--target-version, "3.2"]
        language: system
  - repo: https://github.com/Riverside-Healthcare/djLint
    rev: v1.19.9
    hooks:
      - id: djlint-django
        args:
          - --reformat
        language: system
```
Note: most of these pre-commit hooks use "language: system", so they run the versions of the tools installed in our virtual environment rather than isolated environments managed by pre-commit.
Static checks are great, as they reduce the amount of error-prone code reaching production environments, and having them in place early in your integration pipelines will save you headaches. Pre-commit hooks ensure that developers have minimal friction with CI pipelines, since their code is auto-fixed locally on their computers. This blog post presented a wide array of static checks and test recommendations; use what suits you, and adjust and tweak the configurations to your preference. Feel free to share any other CI steps with me that could be added to this suite.