
Commit

PR changes
ggsdc committed Jan 25, 2024
1 parent ebeef2d commit da07075
Showing 2 changed files with 46 additions and 19 deletions.
28 changes: 14 additions & 14 deletions cornflow-dags/DAG/rostering/README.rst
@@ -60,19 +60,19 @@ Parameters

- Requirements: table indicating which requirements should be enforced and which not and, for those that are, whether they should be applied as strict or soft constraints.

- rq02: "soft", "strict" or "deactivated for the weekly hours constraint
- rq03: "soft", "strict" or "deactivated for the maximum daily hours constraint
- rq05: "soft", "strict" or "deactivated for the maximum days worked per week constraint
- rq06: "soft", "strict" or "deactivated for the minimum daily hours constraint
- rq07: "soft", "strict" or "deactivated for the minimum rest hours between shifts constraint
- rq08: "soft", "strict" or "deactivated for the constraint about needing to have a manager in the store at all times
- rq09: "soft", "strict" or "deactivated for the skills constraint
- rq10: "soft", "strict" or "deactivated for the employees' holidays constraint
- rq11: "strict" or "deactivated for the store holidays constraint
- rq12: "strict" or "deactivated for the employee downtime constraint
- rq13: "soft", "strict" or "deactivated for the employee start hour preferences constraint
- rq14: "soft", "strict" or "deactivated for the employee max preference hours constraint
- rq15: "soft", "strict" or "deactivated for the employee schedule constraint
- rq16: "soft", "strict" or "deactivated for the fixed worktable constraint
- rq02: "soft", "strict" or "deactivated" for the weekly hours constraint
- rq03: "soft", "strict" or "deactivated" for the maximum daily hours constraint
- rq05: "soft", "strict" or "deactivated" for the maximum days worked per week constraint
- rq06: "soft", "strict" or "deactivated" for the minimum daily hours constraint
- rq07: "soft", "strict" or "deactivated" for the minimum rest hours between shifts constraint
- rq08: "soft", "strict" or "deactivated" for the constraint about needing to have a manager in the store at all times
- rq09: "soft", "strict" or "deactivated" for the skills constraint
- rq10: "soft", "strict" or "deactivated" for the employees' holidays constraint
- rq11: "strict" or "deactivated" for the store holidays constraint
- rq12: "strict" or "deactivated" for the employee downtime constraint
- rq13: "soft", "strict" or "deactivated" for the employee start hour preferences constraint
- rq14: "soft", "strict" or "deactivated" for the employee max preference hours constraint
- rq15: "soft", "strict" or "deactivated" for the employee schedule constraint
- rq16: "soft", "strict" or "deactivated" for the fixed worktable constraint

- Penalties: table indicating, for each soft constraint, the penalty level applied when it is not respected (an illustrative sketch of both tables follows below).
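
As a purely illustrative sketch, the two tables described above might be filled in as follows. The field names here are hypothetical and may not match the actual input schema of the DAG; only the requirement identifiers and the allowed values come from the list above.

.. code-block:: yaml

    # Hypothetical sketch: field names are illustrative, not the real schema.
    requirements:
      - requirement: rq02       # weekly hours constraint
        status: strict
      - requirement: rq08       # manager present in the store at all times
        status: soft
      - requirement: rq12       # employee downtime constraint
        status: deactivated
    penalties:
      - requirement: rq08       # penalty applied when this soft constraint is violated
        penalty: 10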
37 changes: 32 additions & 5 deletions docs/source/main/install.rst
Original file line number Diff line number Diff line change
@@ -43,7 +43,7 @@ Then we run the following commands::

    flask db upgrade -d cornflow/migrations
    flask access_init
    flask creare_admin_user -u admin -e admin@cornflow.org -p Adminpassword1!
    flask create_admin_user -u admin -e admin@cornflow.org -p Adminpassword1!
    flask create_service_user -u service_user -e service_user@cornflow.org -p Service_password1
    flask run

@@ -124,15 +124,43 @@ We first start by cloning the repository::

git clone https://github.com/baobabsoluciones/cornflow

Then we are going to need some files to have, that can be modified to alter how the deployment is going to work.
To continue the deployment we are going to need the Dockerfiles and docker-compose files that are in the repository. To be able to build from the source code we need to make some small modifications to some of these files.

The Dockerfile for cornflow can be found inside the ``cornflow/cornflow-server`` folder and can be used as is, as this is the Dockerfile used to build the original image on docker hub.

The Dockerfile for airflow can be found in the ``cornflow/cornflow-server/airflow_config`` folder and can be used as is, as this is the Dockerfile used to build the original image on Docker Hub. This image is built on top of another one that is built manually and that has all the needed libraries for the solvers. This is done to improve the build time of the image, as the solver libraries are quite heavy and tend to remain stable for long periods of time.

Then we need a docker-compose file. The ``docker-compose.yml`` or ``docker-compose-cornflow-celery.yml`` files that are in the root folder of the repository can be used, with some small tweaks, to build the images from the local repository instead of pulling the official images.

The changes needed to be done are just to comment lines 33-34 and 63-64 and uncomment lines 35-36 and 65-66. Then we can run the following commands to start up the containers::
To test out the simpler deployment we are going to use the ``docker-compose.yml`` file. We have to make the following changes. First, comment out the following lines:

.. code-block:: yaml

    image: baobabsoluciones/airflow:release-v1.0.8

and:

.. code-block:: yaml

    image: baobabsoluciones/cornflow:release-v1.0.8

In both cases the version of the image can be updated, but these lines are the ones that have to be commented out in order to build the images from source instead of downloading them from Docker Hub.

And uncomment the following lines:

.. code-block:: yaml

    build:
      context: ./cornflow-server/airflow_config

and:

.. code-block:: yaml

    build:
      context: ./cornflow-server

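Putting both changes together, the relevant part of the compose file should end up looking roughly like the sketch below. The service names and the ``services:`` key are assumptions made for illustration; only the ``image`` and ``build`` lines are taken from the snippets above.

.. code-block:: yaml

    # Illustrative sketch: service names are assumed, not copied from the real file;
    # only the image/build lines come from the snippets above.
    services:
      cornflow:
        # image: baobabsoluciones/cornflow:release-v1.0.8
        build:
          context: ./cornflow-server
      airflow:
        # image: baobabsoluciones/airflow:release-v1.0.8
        build:
          context: ./cornflow-server/airflow_config
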
Then we can run the following commands to start up the containers::

    cp -r cornflow/cornflow-dags/DAG/* cornflow/cornflow-server/airflow_config/dags/
    cp cornflow/cornflow-dags/requirements.txt cornflow/cornflow-server/airflow_config/
@@ -150,8 +178,7 @@ And with these command we should have a cornflow and airflow instance up and run

If we want to have the full ecosystem to test out the celery backend, then we have to run the ``docker-compose-cornflow-celery.yml`` file instead of the ``docker-compose.yml`` file. This will start up a redis instance and a celery worker that will be used by airflow to run the DAGs.

To use this docker-compose file we need to comment lines 33-34 and 67-68
and uncomment lines 35-36 and 69-70.
The lines that have to be modified in this file are the same ones as in the ``docker-compose.yml`` file.

To start it up then we can run::

