Core Continuous Integration (CI) Steps for Python and Django Applications


Recently I worked on a Django project where I built an API using DRF, and I was also responsible for the infrastructure setup. Setting up a baseline Django project was easy thanks to the Django cookiecutter, but the path for the CI/CD setup was not as straightforward. Over time, I've found a large ecosystem of tools that format, lint and test Django/Python projects very well. Getting a good CI suite with GitHub Actions was easy thanks to this ecosystem, but there was no blog post or guide covering all the various tools available. Therefore, I thought I'd share all the actions we're using to ensure the highest code standard possible with these tools.

The tools used in the CI steps will not be described in depth, as there are plenty of resources and documentation on their homepages. At the end, there is also a section on pre-commit hooks that match all the recommended CI steps.

Dependency List

All the CI steps target the Python programming language, but some of these actions are specific to the Django framework. If you are not using Django, just skip the sections that are Django specific.

Some of the steps, specifically the Django-based ones, require a database and cache connection if those are used in the project. With GitHub Actions, the setup would look similar to the following:

env: # Define the env variables
  DATABASE_URL: psql://postgres:postgres@localhost:5432/test
  REDIS_URL: redis://localhost:6379/

jobs:
  test:
    name: Test
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:14
        ports: ['5432:5432']
        env:
          POSTGRES_DB: test
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
      redis:
        image: redis:6
        ports: ['6379:6379']

The steps presented in the next section assume that your pipeline has a virtual environment with all the project dependencies installed. With GitHub Actions and Poetry (an alternative to Pip) it would be set up as:

  - uses: actions/checkout@v3 # Fetch Git Repo

  - uses: actions/setup-python@v4 # Setup Python
    id: setup-python
    with:
      python-version: 3.10.8

  - name: Install Poetry
    uses: snok/install-poetry@v1 # Setup Poetry
    with:
      virtualenvs-create: true # Add Venv
      virtualenvs-in-project: true
      installer-parallel: true

  - name: Install dependencies # Install all the dependencies
    run: |
      poetry install --no-interaction --no-root

  - name: Format (black) # Activate the virtual env before running a tool
    run: |
      source .venv/bin/activate
      black --check $(git ls-files -- '*.py' ':!:**/migrations/*.py')

The following dependencies are required (ignore Django/DRF/Celery ones if not used) and should be installed by Pip/Poetry in the pipeline:

  • Black
  • Isort
  • Mypy
    • django-stubs
    • djangorestframework-stubs
  • Pylint
    • pylint_django
    • pylint_celery
  • Pyupgrade
  • Django-upgrade
  • DjLint
  • Pytest
    • pytest-django
    • pytest-sugar
    • pytest-xdist
    • django-coverage-plugin
    • pytest-cov
  • Pre-commit

They will be described briefly in each section below.
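Assuming Poetry is used, the whole list can be installed as development dependencies in one go (the --group dev flag requires Poetry 1.2 or newer; older versions use --dev instead):

```shell
poetry add --group dev \
  black isort mypy django-stubs djangorestframework-stubs \
  pylint pylint-django pylint-celery \
  pyupgrade django-upgrade djlint \
  pytest pytest-django pytest-sugar pytest-xdist pytest-cov django-coverage-plugin \
  pre-commit
```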


Formatting Code with Black

Black is a code formatter for Python; you can add it to your pipeline with the following:

- name: Format (black)
  run: |
    black --check $(git ls-files -- '*.py' ':!:**/migrations/*.py')

The --check flag ensures that the file is checked and fails if changes are detected.
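To illustrate what Black enforces, here is a hypothetical before/after pair (the second function is renamed only so both versions can coexist in one file; Black itself never renames anything):

```python
# Before black: inconsistent spacing and quoting
def greet(name,greeting='Hello'):
    return '{}, {}!'.format( greeting,name )

# After black: normalized spacing and double quotes
def greet_formatted(name, greeting="Hello"):
    return "{}, {}!".format(greeting, name)
```

Running black on the first version produces the second; in CI, --check merely reports the diff instead of rewriting the file.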

Sorting Imports with Isort

Isort is an import sorter for Python; you can add it to your pipeline with the following:

- name: Sort imports (isort)
  run: |
    isort --check-only $(git ls-files -- '*.py' ':!:**/migrations/*.py')

The --check-only flag ensures that the file is checked only and not formatted, and fails if changes are detected.
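As a quick illustration of what isort enforces, imports are alphabetized and grouped into sections (standard library first, then third-party, then local):

```python
# Before isort (unsorted):
#   import sys
#   import json
#   import os
#
# After isort: alphabetized within the standard-library section
import json
import os
import sys
```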

Type Checking with Mypy

Mypy is a static type checker for Python; you can add it to your pipeline with the following:

- name: Type Check (mypy)
  run: |
    mypy .

To add the Django and DRF integration to Mypy, install django-stubs and djangorestframework-stubs with Pip/Poetry and add the following to pyproject.toml:

[tool.mypy]
plugins = ["mypy_django_plugin.main", "mypy_drf_plugin.main"]
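The django-stubs plugin also needs to know which Django settings module to analyze against; assuming the config.settings.test path used elsewhere in this post, the accompanying pyproject.toml section could look like:

```toml
[tool.django-stubs]
django_settings_module = "config.settings.test"
```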

Linting with Pylint

Pylint is a linter for Python; you can add it to your pipeline with the following:

- name: Lint (pylint)
  run: |
    pylint $(git ls-files -- '*.py' ':!:**/migrations/*.py')

To add Django and Celery integration to Pylint, install pylint-django and pylint-celery, then add the following to .pylintrc:

load-plugins=pylint_django, pylint_celery
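pylint-django also needs to locate your Django settings; assuming the config.settings.test module from the other examples in this post, the combined .pylintrc section could look like this (django-settings-module is an option provided by pylint-django):

```ini
[MASTER]
load-plugins=pylint_django, pylint_celery
django-settings-module=config.settings.test
```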

Ensuring that Python Syntax is up-to-date with Pyupgrade

Pyupgrade is a version syntax upgrade tool for Python; you can add it to your pipeline with the following:

- name: Python Upgrade (pyupgrade)
  run: |
    pyupgrade $(git ls-files -- '*.py' ':!:**/migrations/*.py') --py39-plus

The --py39-plus flag indicates that we want to use Python 3.9 syntax and upwards; adjust the flag according to your version.
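As an illustration of what this rewrites, with --py39-plus Pyupgrade replaces typing.List/typing.Dict annotations with the builtin generics from PEP 585 (the tally function is a hypothetical example, and the duplicate name is only there so both variants can coexist):

```python
# Before pyupgrade --py39-plus:
from typing import Dict, List

def tally_old(items: List[str]) -> Dict[str, int]:
    counts: Dict[str, int] = {}
    for item in items:
        counts[item] = counts.get(item, 0) + 1
    return counts

# After pyupgrade --py39-plus: builtin generics, no typing import needed
def tally(items: list[str]) -> dict[str, int]:
    counts: dict[str, int] = {}
    for item in items:
        counts[item] = counts.get(item, 0) + 1
    return counts
```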

Ensuring that Django Syntax is up-to-date with Django-upgrade

Django-upgrade is a version syntax upgrade tool for Django; you can add it to your pipeline with the following:

- name: Django Upgrade (django-upgrade)
  run: |
    django-upgrade --target-version 3.2 $(git ls-files -- '*.py' ':!:**/migrations/*.py')

The --target-version 3.2 flag indicates that we want to use Django 3.2 syntax and upwards; adjust the flag according to your version.

Linting Django Templates with DjLint

DjLint is a linter and formatter for Django templates; you can add it to your pipeline with the following:

- name: Django Template/Html Lint (djlint)
  run: |
    djlint $(git ls-files -- '*.html')

DjLint is configured in pyproject.toml, where you'll need to set the profile to django; I also recommend a couple of other settings (respecting .gitignore, formatting CSS and JS, and preserving blank lines):

[tool.djlint]
profile = "django"
preserve_blank_lines = true
use_gitignore = true
format_css = true
format_js = true


Running Migrations

Run Django migrations to ensure that the migration graph (leaf nodes and dependencies) is correct and that the migrations actually run. This requires a Postgres database in your CI workflow.

- name: Run Migrations
  run: |
    python manage.py migrate

Checking for Missing Django Migrations

Run Django migration checks to ensure that all new model changes are checked into Git.

# Some Django checks require migrations to run for full functionality
# Therefore run migrations first before checking for missing migrations
- name: Check for Missing Migrations
  run: |
    DJANGO_SETTINGS_MODULE=config.settings.production \
      python manage.py makemigrations --check --dry-run

Running Django Checks

The Django system check framework ensures a couple of best practices for database models, caches, security, amongst other things. A full list of checks can be found in the Django documentation, which also explains how to add your own custom checks. To run these checks in your CI, add the following:

- name: Django Check
  run: |
    DJANGO_SETTINGS_MODULE=config.settings.production \
      python manage.py check --deploy --fail-level=WARNING

The --deploy flag activates additional production-related checks, as documented in the Django deployment checklist. The --fail-level=WARNING flag ensures that the check fails on issues at the WARNING level, not only at the ERROR level.


Running Tests with Pytest

Pytest is a framework for writing Python tests; you can add it to your pipeline with the following:

- name: Run Pytest
  run: |
    pytest --ignore .poetry # ignore virtual envs

To add Django integration to Pytest, install pytest-django and add your DJANGO_SETTINGS_MODULE to the Pytest arguments. Below is an example of the configuration in pyproject.toml.

[tool.pytest.ini_options]
addopts = "--ds=config.settings.test" # <---- ds means django settings
python_files = "test_*.py"

Multiple Cores with Pytest-xdist

Pytest-xdist is a Pytest plugin that runs your test suite on multiple CPU cores. Install it and add the flag -n=auto to autodetect how many cores your machine has and use all of them. You can add this to the default arguments for Pytest:

addopts = "--ds=config.settings.test -n=auto"

Disabling Pytest-xdist Automatically

In some use cases you probably don't want to run tests on multiple cores, for example when using breakpoints or when targeting a single test locally. To get a sane default behavior, you can use command line pre-parsing to disable multicore runs in those cases. In the example below, we check whether the arguments to the pytest command include the -k flag (which targets a test suite or a single test) or the -s flag (which is needed for input when using breakpoints). If they do, we remove -n=auto, the flag that enables Pytest-xdist.

import sys

def pytest_cmdline_preparse(args):
    # Run serially when targeting specific tests (-k) or debugging (-s)
    if "xdist" in sys.modules and ("-k" in args or "-s" in args):
        while "-n=auto" in args:  # remove the -n option
            args.remove("-n=auto")
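Note that pytest_cmdline_preparse is deprecated in newer Pytest releases; a sketch of the same idea using the pytest_load_initial_conftests hook (assuming a modern Pytest) could look like:

```python
import sys

def pytest_load_initial_conftests(early_config, parser, args):
    # Fall back to a single core when targeting tests (-k) or debugging (-s)
    if "xdist" in sys.modules and ("-k" in args or "-s" in args):
        while "-n=auto" in args:
            args.remove("-n=auto")
```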

Better Visual Output using Pytest-sugar

Pytest-sugar extends Pytest and provides a much better UI showing progress bars, errors and failures in real time. Just install it with Pip/Poetry, and it will be automatically added.

Code Coverage

Pytest-cov is a Pytest plugin which produces coverage reports. Simply install Pytest-cov and update the default arguments for Pytest:

addopts = "--ds=config.settings.test -n=auto --cov"

You can add configuration to the pyproject.toml file to indicate that a test coverage threshold needs to be reached and which files to omit:

[tool.coverage.report]
fail_under = 88

[tool.coverage.run]
source = ["."]
plugins = ["django_coverage_plugin"] # enables the django-coverage-plugin listed above
omit = [] # add patterns for files to exclude from coverage

OpenAPI Schema Validation with DRF-spectacular

drf-spectacular is an OpenAPI schema generator for Django REST framework. If you are using it, ensure that you compile and validate the schema in your CI pipeline with the following:

- name: Compile and Validate schema
  run: |
    source .venv/bin/activate
    DJANGO_SETTINGS_MODULE=config.settings.production \
      python manage.py spectacular --file openapi-schema.yml --validate --api-version api-v1 --fail-on-warn

The --fail-on-warn flag ensures that the step also fails on the WARNING level.


Pre-commit Hooks

Pre-commit is a hook system that runs before developers create their commits. These hooks enforce code standards on newly added and changed code. They can also run parts of your CI suite locally on a developer's computer before a pull request is opened and a pipeline is triggered. The feedback cycle is much shorter, as everything happens locally: code can be formatted, linted and tested before it reaches the pipeline. Here's a collection of pre-commit hooks that match the CI steps described above:

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.2.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml

  - repo: https://github.com/asottile/pyupgrade
    rev: v2.32.1
    hooks:
      - id: pyupgrade
        args: [--py39-plus]
        language: system

  - repo: https://github.com/psf/black
    rev: 22.8.0
    hooks:
      - id: black
        language: system

  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      # Isort sets the language_version by default
      # and we can't unset it with yaml when using system as the language
      - id: isort

  - repo: https://github.com/PyCQA/pylint
    rev: "v2.15.8"
    hooks:
      - id: pylint
        language: system

  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: 'v0.971'
    hooks:
      - id: mypy
        language: system

  - repo: https://github.com/adamchainz/django-upgrade
    rev: "1.11.0"
    hooks:
      - id: django-upgrade
        args: [--target-version, "3.2"]
        language: system

  - repo: https://github.com/Riverside-Healthcare/djLint
    rev: v1.19.9
    hooks:
      - id: djlint-django
        args:
          - --reformat
        language: system
Note: most of these pre-commit hooks use language: system, so they run the versions of the tools installed in our virtual environment rather than pre-commit's own isolated environments.


Static checks are great, as they reduce the amount of error-prone code reaching production environments. Having them in place early in your integration pipelines will save you headaches. Pre-commit hooks ensure that developers have minimal friction with CI pipelines, as their code is auto-fixed locally on their computer. This blog post covered a wide array of static checks and test recommendations; use what suits you, and adjust and tweak the configurations to your preference. Feel free to share any other CI steps with me that could be added to the suite presented here.
