Replace Makefile with justfile #979

Closed · wants to merge 18 commits
2 changes: 1 addition & 1 deletion .github/workflows/build_and_publish.yaml
@@ -84,7 +84,7 @@ jobs:
- name: Generate correct value for VERSION file
run: echo "$VERSION" > cohortextractor/VERSION
- name: Build docker
run: make docker-build ENV=prod VERSION=$VERSION
run: just docker-build prod
- name: Basic docker test
run: docker run cohortextractor --help
- name: Log into GitHub Container Registry
29 changes: 15 additions & 14 deletions .github/workflows/test_runner.yaml
@@ -15,16 +15,15 @@ jobs:
- uses: actions/setup-python@v4
with:
python-version: "3.8"
- uses: opensafely-core/setup-action@v1
with:
install-just: true
- name: Install deps
run: |
# ensure requirements .txt files are timestamped later than .in
# files so we don't re-compile them
touch requirements.*.txt
make devenv
- name: Run all pre-commit checks
just devenv
- name: Check files
run: |
source ${{ github.workspace }}/venv/bin/activate
pre-commit run --all-files
just check

test:
name: Run test suite
@@ -35,26 +34,28 @@
- uses: actions/setup-python@v4
with:
python-version: "3.8"
- uses: opensafely-core/setup-action@v1
with:
install-just: true
- name: Install deps
run: |
# ensure requirements .txt files are timestamped later than .in
# files so we don't re-compile them
touch requirements.*.txt
make devenv
just devenv
- name: Start SQL Server and Trino instances for the tests
run: docker-compose up -d mssql trino
- name: Run tests
run: |
source ${{ github.workspace }}/venv/bin/activate
PYTHONPATH=. pytest -vvv tests/
PYTHONPATH=. just test -vvv tests/

docker-tests:
name: Test docker build
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
- uses: opensafely-core/setup-action@v1
with:
install-just: true
- name: Build docker
run: make docker-build ENV=dev
run: just docker-build dev
- name: Basic docker test
run: docker-compose run --rm -v $PWD:/workspace dev cohortextractor
- name: Run unit tests in docker
23 changes: 10 additions & 13 deletions .pre-commit-config.yaml
@@ -1,18 +1,15 @@
repos:
- repo: https://github.com/psf/black
rev: 23.3.0
hooks:
- id: black
default_language_version:
python: python3.8

- repo: https://github.com/PyCQA/flake8
rev: 6.0.0
hooks:
- id: flake8

- repo: https://github.com/PyCQA/isort
rev: 5.12.0
repos:
- repo: local
hooks:
- id: isort
- id: check
name: check
entry: just check
language: system
types: [python]
require_serial: true

- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
2 changes: 1 addition & 1 deletion DEVELOPERS.md
@@ -37,7 +37,7 @@ apt-get install -y gnupg && \
export PATH=$PATH:/opt/mssql-tools/bin

```
* Set up virtual env: `make devenv` - Note: if not on linux, you may need to install ctds manually
* Set up virtual env: `just devenv` - Note: if not on linux, you may need to install ctds manually
* `py.test tests/`

Note: if you change the database schema
59 changes: 0 additions & 59 deletions Makefile

This file was deleted.
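
The replacement justfile itself is not visible in this excerpt, so the sketch below is illustrative only: the recipe names (devenv, check, test, docker-build) come from the `just` invocations in the updated workflows and pre-commit config above, while the recipe bodies, requirements file names, and docker build arguments are assumptions rather than the PR's actual contents.

```
# Hypothetical sketch of the replacement justfile; recipe bodies are assumptions.

# create a local virtualenv and install development requirements
devenv:
    python3 -m venv venv
    venv/bin/pip install --upgrade pip
    venv/bin/pip install -r requirements.dev.txt

# run formatting and lint checks (the local pre-commit hook calls this)
check:
    venv/bin/black --check .
    venv/bin/isort --check-only .
    venv/bin/flake8

# run the test suite; extra arguments are passed straight through to pytest
test *ARGS:
    venv/bin/python -m pytest {{ARGS}}

# build the docker image for a given environment (dev or prod)
docker-build env:
    docker build --build-arg ENV={{env}} -t cohortextractor .
```

The workflows call these recipes directly (`just devenv`, `just check`, `just test -vvv tests/`, `just docker-build prod` or `just docker-build dev`), and the new pre-commit configuration delegates all Python checks to the single `just check` entry point.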

1 change: 1 addition & 0 deletions cohortextractor/__init__.py
@@ -11,6 +11,7 @@
from .measure import Measure
from .study_definition import StudyDefinition


init_logging()

with open(os.path.join(os.path.dirname(__file__), "VERSION")) as version_file:
2 changes: 1 addition & 1 deletion cohortextractor/codelistlib.py
@@ -10,7 +10,7 @@ class Codelist(list):

def codelist_from_csv(filename, system, column="code", category_column=None):
codes = []
with open(filename, "r") as f:
with open(filename) as f:
for row in csv.DictReader(f):
# We strip whitespace below. Longer term we expect this to be done
# automatically by OpenCodelists but for now we want to avoid the
1 change: 1 addition & 0 deletions cohortextractor/cohortextractor.py
@@ -39,6 +39,7 @@

from .log_utils import log_execution_time, log_stats


logger = structlog.get_logger()

notebook_tag = "opencorona-research"
5 changes: 3 additions & 2 deletions cohortextractor/emis_backend.py
@@ -14,6 +14,7 @@
from .pandas_utils import dataframe_from_rows, dataframe_to_file
from .trino_utils import trino_connection_from_url


logger = structlog.get_logger()
sql_logger = structlog.get_logger("cohortextractor.sql")

@@ -1105,7 +1106,7 @@ def _number_of_episodes_by_medication(
match = re.match(pattern, episode_defined_as)
if not match:
raise ValueError(
f"Argument `episode_defined_as` must match " f"pattern: {pattern}"
f"Argument `episode_defined_as` must match pattern: {pattern}"
)
washout_period = int(match.group(1))
else:
@@ -1169,7 +1170,7 @@ def _number_of_episodes_by_clinical_event(
match = re.match(pattern, episode_defined_as)
if not match:
raise ValueError(
f"Argument `episode_defined_as` must match " f"pattern: {pattern}"
f"Argument `episode_defined_as` must match pattern: {pattern}"
)
washout_period = int(match.group(1))
else:
1 change: 1 addition & 0 deletions cohortextractor/expressions.py
@@ -4,6 +4,7 @@
import sqlparse
from sqlparse import tokens as ttypes


IGNORE = object()

# As well as alphanumeric, underscore, and space characters, which are generally used
1 change: 1 addition & 0 deletions cohortextractor/measure.py
@@ -1,5 +1,6 @@
import numpy


SMALL_NUMBER_THRESHOLD = 5


1 change: 1 addition & 0 deletions cohortextractor/mssql_utils.py
@@ -7,6 +7,7 @@

from .log_utils import log_execution_time, timing_log_counter


# Some drivers warn about the use of features marked "optional" in the DB-ABI
# spec, using a standardised set of warnings. See:
# https://www.python.org/dev/peps/pep-0249/#optional-db-api-extensions
1 change: 1 addition & 0 deletions cohortextractor/pandas_utils.py
@@ -5,6 +5,7 @@
import pandas
import structlog


logger = structlog.get_logger()


1 change: 1 addition & 0 deletions cohortextractor/process_covariate_definitions.py
@@ -2,6 +2,7 @@
import datetime
import re


# ISARIC data has a lot of columns that are all varchar in the db.
ISARIC_COLUMN_MAPPINGS = {
"abdopain_ceoccur_v2": "str",
1 change: 1 addition & 0 deletions cohortextractor/study_definition.py
@@ -22,6 +22,7 @@
from .process_covariate_definitions import process_covariate_definitions
from .validate_dummy_data import validate_dummy_data


logger = structlog.get_logger()


7 changes: 4 additions & 3 deletions cohortextractor/tpp_backend.py
@@ -24,6 +24,7 @@
from .process_covariate_definitions import ISARIC_COLUMN_MAPPINGS
from .therapeutics_utils import ALLOWED_RISK_GROUPS


logger = structlog.get_logger()


@@ -1401,7 +1402,7 @@ def _number_of_episodes_by_medication(
match = re.match(pattern, episode_defined_as)
if not match:
raise ValueError(
f"Argument `episode_defined_as` must match " f"pattern: {pattern}"
f"Argument `episode_defined_as` must match pattern: {pattern}"
)
washout_period = int(match.group(1))
else:
@@ -1467,7 +1468,7 @@ def _number_of_episodes_by_clinical_event(
match = re.match(pattern, episode_defined_as)
if not match:
raise ValueError(
f"Argument `episode_defined_as` must match " f"pattern: {pattern}"
f"Argument `episode_defined_as` must match pattern: {pattern}"
)
washout_period = int(match.group(1))
else:
@@ -3211,7 +3212,7 @@ def _number_of_episodes_by_covid_therapeutics(
match = re.match(pattern, episode_defined_as)
if not match:
raise ValueError(
f"Argument `episode_defined_as` must match " f"pattern: {pattern}"
f"Argument `episode_defined_as` must match pattern: {pattern}"
)
washout_period = int(match.group(1))
else:
2 changes: 2 additions & 0 deletions cohortextractor/trino_utils.py
@@ -9,6 +9,7 @@
import requests
import structlog


# The Trino client ought to be close to a drop-in replacement for the Presto
# but it isn't: despite the tests passing we get 500s from Presto in production
# whenever we try to query it. As a quick fix we want to revert to using the
@@ -29,6 +30,7 @@
from retry import retry
from tabulate import tabulate


urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

sql_logger = structlog.get_logger("cohortextractor.sql")
1 change: 1 addition & 0 deletions cohortextractor/update_custom_medication_dictionary.py
@@ -4,6 +4,7 @@

from .mssql_utils import mssql_dbapi_connection_from_url


CUSTOM_MEDICATION_DICTIONARY_CSV = pathlib.Path(__file__).with_name(
"custom_medication_dictionary.csv"
)