- Virtual Machine running Ubuntu (preferably 18.04)
- Postgres database (preferably on a separate server)
- Start by forking the main IBF-system repository
  - This requires setting up a Github account first, if you do not have one already
- Create an account: https://mailchimp.com/help/create-an-account/
- Create a maillist: https://mailchimp.com/help/create-audience/ and create users
  - It is best to start by adding only one test email address, to test the entire setup below.
  - Only after testing that the whole IBF-system works correctly, add the other users.
- Find the MC_LIST_ID to fill in in `.env` (below) through https://mailchimp.com/help/find-audience-id/
- Create the MC_API key to fill in in `.env` (below) through https://mailchimp.com/help/about-api-keys/#Find-or-Generate-Your-API-Key
- Create one or more segments. Assuming your setup is for just one country:
  - Give all members the same tag, e.g. `Zambia`
  - Create a segment defined by all members with the tag `Zambia`. Save it and also name it `Zambia`.
  - Go to https://us18.admin.mailchimp.com/lists/segments/
  - Right-click your segment and press 'copy link address'
  - Paste the address somewhere and copy the `segment_id` part
  - Fill this in as instructed in `.env` (below)
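The `us18` in the admin URL above is the Mailchimp datacenter, which is encoded at the end of the API key itself. A quick sketch (with a made-up placeholder key, not a real one) of how to derive it in the shell:

```shell
# Mailchimp API keys end in the datacenter they belong to, e.g. "-us18".
# The key below is a made-up placeholder, not a real key.
MC_API="abc123-us18"
dc="${MC_API##*-}"   # strip everything up to and including the last "-"
echo "$dc"           # prints "us18"
```

If your key ends in a different datacenter, use that one in the admin URL (`https://<dc>.admin.mailchimp.com/lists/segments/`) instead of `us18`.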
## User Management
- Create user group: `sudo groupadd ibf-users`
- Add `ibf-user` to the group: `sudo usermod -a -G ibf-users ibf-user`
- Verify group members: `grep ibf-users /etc/group`
- Create the `/home/ibf-user` home directory, if not already present: `mkdir /home/ibf-user`
- Change access of the shared directory `/home/ibf-user`:
  - `chgrp -Rf ibf-users /home/ibf-user`
  - `sudo chown -R ibf-user:ibf-users /home/ibf-user`
  - `sudo chmod -R 775 /home/ibf-user`
- Re-login and verify that you have access by running `touch /home/ibf-user`
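As a sanity check, the effect of the `775` mode used above can be demonstrated on a scratch directory (the `/tmp/ibf-demo` path is illustrative; on the server the real target is `/home/ibf-user`):

```shell
# Illustrative only: demonstrate the 775 mode used for /home/ibf-user.
# 775 = owner rwx, group rwx, others r-x, so every member of the
# ibf-users group can read and write shared files.
mkdir -p /tmp/ibf-demo
chmod 775 /tmp/ibf-demo
stat -c '%a' /tmp/ibf-demo   # prints "775"
```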
- Add the following lines to `/etc/sudoers` (edit it with `visudo` so syntax errors are caught before saving):

  ```
  # Allow members of group ibf-users to execute systemctl daemon-reload
  %ibf-users ALL=NOPASSWD: /bin/systemctl daemon-reload
  # Allow members of group ibf-users to execute service webhook restart
  %ibf-users ALL=NOPASSWD: /usr/sbin/service webhook restart
  ```
## Install Software
- NodeJS (see Source link)
  - Follow the instructions in the source link.
- Docker
  - Allow users to access docker commands: `sudo usermod -aG docker ibf-user`
  - Verification: `grep docker /etc/group`
  - Verification: `docker -v`
- Docker Compose (see Source link)
  - Follow the instructions in the source link.
  - `sudo chmod +x /usr/local/bin/docker-compose`
  - Verification: `docker-compose -v`
## Setup IBF-system
- Setup Git and clone your forked IBF-system repository to `/home/ibf-user/IBF-system`
- Setup Environment Variables
  - Create `/home/ibf-user/IBF-system/.env` through `cp /home/ibf-user/IBF-system/handover.example.env /home/ibf-user/IBF-system/.env`
  - Set the appropriate values in the `.env` file
    - Follow the instructions in the `.env` file on how to fill in all variables
    - Use the credentials of your Postgres database to fill in the `DB_`-variables
    - Use the Mailchimp credentials retrieved above to fill in the `MC_`-variables
    - Where unclear, ask the IBF-system development team for assistance
  - Load the `.env` vars by running `source /home/ibf-user/IBF-system/.env`
  - Test if the vars were loaded correctly: `echo $NODE_ENV`
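A small, self-contained sketch of what sourcing an env file does, using a throwaway file instead of the real `/home/ibf-user/IBF-system/.env` (the variable values here are illustrative):

```shell
# Throwaway example file; the real one is /home/ibf-user/IBF-system/.env
cat > /tmp/example.env <<'EOF'
export NODE_ENV=production
export DB_HOST=localhost
EOF

. /tmp/example.env        # same effect as `source /tmp/example.env`
echo "$NODE_ENV"          # prints "production"
```

Note: if the lines in your `.env` are plain `KEY=value` pairs without `export`, wrap the source in `set -a` / `set +a` so the variables are still exported to child processes.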
- Run the deploy script: `. tools/deploy.sh`
- Verify that all containers work correctly: `docker container ls` should show 3 running containers:
  - `ibf-api-service`
  - `ibf-geoserver`
  - `nginx`
  - (`ibf-dashboard` is only started up temporarily, and is closed again once the production build is done)
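The container check above can be scripted. The sketch below compares a newline-separated list of running container names against the three expected by this guide; on the server you would feed it the output of `docker container ls --format '{{.Names}}'`:

```shell
# check_containers: report which of the expected containers appear in a
# newline-separated list of running container names.
check_containers() {
  running="$1"
  for name in ibf-api-service ibf-geoserver nginx; do
    if printf '%s\n' "$running" | grep -qx "$name"; then
      echo "OK: $name"
    else
      echo "MISSING: $name"
    fi
  done
}

# Simulated listing shown here; on the server use:
#   check_containers "$(docker container ls --format '{{.Names}}')"
check_containers "$(printf 'ibf-api-service\nibf-geoserver\nnginx\n')"
```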
- Check the running dashboard at `https://<ip>/login`
- Check the running API service at `https://<ip>/docs`
- Check the running Geoserver at `https://<ip>/geoserver`
## Load initial data
- Download raster-files.zip
- Unzip the files (using `apt install unzip` and `unzip raster-files.zip`) into `services/API-service/geoserver-volume/raster-files/`, so that that folder now has the subfolders `input`, `mock-output` and `output`.
- Run the seed script through `docker-compose exec ibf-api-service npm run seed`
## Connect external pipeline
- There should be an external pipeline able to upload impact forecast data to this VM. Please check this together with the pipeline owner for the applicable disaster types.
- For example the Glofas floods pipeline can be found at: https://github.com/rodekruis/IBF_FLOOD_PIPELINE
- Follow the README there on how to set it up.
- Roughly:
  - Fork that repository as well, to your own Github account
  - The repository includes a `workflow.yml` file, which can be run through Github Actions
  - If not already the case, change this file to run on a daily schedule (8AM UTC), using:

    ```yaml
    on:
      schedule:
        - cron: "0 8 * * *"
    ```

  - Set up the necessary secrets in the Settings > Secrets section of the Github repository (see https://github.com/rodekruis/IBF_FLOOD_PIPELINE/settings/secrets/actions to see which ones)
  - Run the workflow manually once to fill the database with a first batch of data, which is needed for a working dashboard
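Note that the manual run in the last step requires the workflow to declare a `workflow_dispatch` trigger; this is what enables the 'Run workflow' button in the Actions tab. If the pipeline's `workflow.yml` does not already include it, its trigger section could look like this (an illustrative fragment, not the pipeline's actual file):

```yaml
# Illustrative trigger section for workflow.yml: daily schedule plus
# manual runs from the Github Actions UI.
on:
  workflow_dispatch:
  schedule:
    - cron: "0 8 * * *"
```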
## Test
- Open the dashboard at `https://<ip>/login`
- Log in with the admin account
  - dunant@redcross.nl
  - the ADMIN_PASSWORD set in `.env`
- Check that the dashboard loads as expected
  - Note that the dashboard is probably showing in non-trigger mode, as on most days of the year no trigger is predicted. If you also want to check the trigger mode, you need to upload mock trigger data in some way. Check with the IBF development team for assistance.
## Hosting

Relevant things to know for hosting. To be completed:
- Get into the server
- How to check the logs
- How to restart
- ...