This is the repository for a PHT-Medic station. It contains an Apache Airflow instance for processing train images and can be configured via environment variables. Further information can be found in the documentation or on our website.

The current stable version includes a docker-compose based Apache Airflow distribution with DAGs for executing trains, as well as the associated Postgres database. The DAGs use the Train Container Library to process trains.
`docker` and `docker-compose` need to be installed. The following ports are used by the station and need to be available on the host:
- 5432
- 8080
Copy the `.env.tmpl` file at the root of the repository to `.env` to configure your local environment.
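For example, from the repository root:

```shell
# Create your local environment file from the template
cp .env.tmpl .env
```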
Open the `.env` file and edit the following environment variables to match the local configuration. `STATION_ID` must be consistent with Vault and Harbor.
- `STATION_ID`: chosen identifier of the station (must match the central UI configuration)
- `STATION_PRIVATE_KEY_PATH`: path to the private key on the local filesystem that should be mounted as a volume
- `PRIVATE_KEY_PASSWORD`: optional password for the private key if it is encrypted
- `AIRFLOW_USER`: admin user to be created for the Airflow instance
- `AIRFLOW_PW`: password for the Airflow admin user
- `HARBOR_URL`: the URL of the central Harbor instance
- `HARBOR_USER`: username to authenticate against Harbor
- `HARBOR_PW`: password to authenticate against Harbor
- `STATION_DATA_DIR`: the absolute path of the directory where the station stores the input data for trains; this path is also used by the FHIR client to store query results before passing them to trains
- `FHIR_ADDRESS`: (optional) the address of the default FHIR server connected to the station (this can also be configured per train)
- `FHIR_USER`: (optional) username to authenticate against the FHIR server using Basic Auth
- `FHIR_PW`: (optional) password for FHIR server Basic Auth
- `FHIR_TOKEN`: (optional) token to authenticate against the FHIR server using a Bearer token
- `CLIENT_ID`: (optional) identifier of a client with permission to access the FHIR server
- `CLIENT_SECRET`: (optional) secret of the above client to authenticate against the provider
- `OIDC_PROVIDER_URL`: (optional) token URL of the OpenID Connect provider (e.g. Keycloak) that is configured for the FHIR server
- `FHIR_SERVER_TYPE`: (optional) the type of FHIR server (the PHT FHIR client supports IBM, Hapi, and Blaze FHIR servers)
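For reference, a filled-in `.env` could look like the sketch below. Every value is an invented placeholder, not a default; adjust each one to your station:

```
STATION_ID=station-1
STATION_PRIVATE_KEY_PATH=/opt/station/keys/private.pem
AIRFLOW_USER=admin
AIRFLOW_PW=change-me
HARBOR_URL=https://harbor.example.org
HARBOR_USER=station-1
HARBOR_PW=change-me
STATION_DATA_DIR=/opt/station/data
FHIR_ADDRESS=https://fhir.example.org/fhir
```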
- Install `docker` and `docker-compose` if not already installed.
- Make sure that the ports listed above are available.
- Create the Docker volume for Postgres: `docker volume create pg_station`
- Run `docker-compose build`.
- Run `docker-compose up -d`.
- Check that the logs do not contain any startup errors with `docker-compose logs -f`.
- Go to `http://localhost:8080` and check whether you can see the web interface of Apache Airflow.
- Log in to the Airflow web interface with the previously set user credentials.
Adapt the JSON in the following section with the appropriate station and train identifiers:
```json
{
    "repository": "<HARBOR-REGISTRY>/<STATION_NAMESPACE>/<TRAIN-IMAGE>",
    "tag": "latest",
    "env": {
        "FHIR_SERVER_URL": "<FHIR-ADDRESS>",
        "FHIR_USER": "<ID>",
        "FHIR_PW": "<PSW>"
    }
}
```
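This JSON is the run configuration (`conf`) passed when triggering the DAG. As an illustrative sketch using Airflow's stable REST API, it could be submitted as below. The DAG id `run_pht_train` is an assumption, check the DAG list in your Airflow UI for the actual name, and Basic Auth only works if the corresponding auth backend is enabled:

```shell
# Hypothetical trigger via Airflow's stable REST API (Airflow 2.x).
# "run_pht_train" is an assumed DAG id; replace it with the id shown in your UI.
curl -X POST "http://localhost:8080/api/v1/dags/run_pht_train/dagRuns" \
  -u "$AIRFLOW_USER:$AIRFLOW_PW" \
  -H "Content-Type: application/json" \
  -d '{"conf": {"repository": "<HARBOR-REGISTRY>/<STATION_NAMESPACE>/<TRAIN-IMAGE>", "tag": "latest", "env": {"FHIR_SERVER_URL": "<FHIR-ADDRESS>"}}}'
```

Alternatively, the web interface offers the same "Trigger DAG w/ config" functionality without any command line.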
If there are issues while building the Airflow container, you can use our prebuilt images to run the Airflow instance. Edit the `airflow` service in the `docker-compose.yml` file and replace the `build` command with our prebuilt image:
```yaml
# ------------- omitted ------------
services:
  airflow:
    # remove the build command:
    # build: './airflow'
    # and replace it with the image:
    image: ghcr.io/pht-medic/airflow:latest
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
# ------------- omitted ------------
```