A repository of Python tools that facilitate the assessment of natural hazards over various domains such as population, land use, and infrastructure.
Install the project and its dependencies into the virtual environment as below.
pipenv run pip install -e .
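Note that this assumes the pipenv virtual environment already exists. If it does not, one way to create it first (a minimal sketch, assuming pipenv is installed; adjust the Python version to your setup) is:

```bash
# Create the pipenv virtual environment; the Python version here is an example
pipenv --python 3.10
```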
If you want to install the optional dependencies for testing and Jupyter, execute the following command. The quotes keep the shell from expanding the brackets.
pipenv run pip install ".[dev,jupyter]"
To uninstall the project from the Python environment, execute the following command.
pipenv run pip uninstall geo-cb-surge
Then run the command below to show the help menu.
pipenv run rapida --help
To access Azure Blob Storage, each user must have the Storage Blob Data Contributor role.
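For reference, an administrator could grant this role with the Azure CLI. Every value below is a placeholder to be replaced with your real user and storage account:

```bash
# Placeholder values throughout: substitute the real user and storage account
az role assignment create \
  --assignee "user@example.org" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```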
- inside Docker container

Since the azure.identity package has an issue opening a browser inside a Docker container, use az login to authenticate before using the API.
az login # authenticate with az login
pipenv run rapida init
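If az login cannot open a browser inside the container either, the Azure CLI's device-code flow works in headless environments:

```bash
# Device-code flow prints a URL and a code so you can complete
# the login from a browser on another machine
az login --use-device-code
```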
- without Docker

The init command will open a browser to authenticate to Azure.
pipenv run rapida init
The admin command retrieves administrative boundary data for a given bounding box from either OpenStreetMap or OCHA.
- OSM
pipenv run rapida admin osm --help
- OCHA
pipenv run rapida admin ocha --help
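As a sketch only: the exact options are defined by the CLI itself, so always consult --help first. A hypothetical invocation with an assumed --bbox option might look like this:

```bash
# Hypothetical example: the --bbox flag name and value format are assumptions,
# not confirmed options; run `pipenv run rapida admin osm --help` for the real interface
pipenv run rapida admin osm --bbox "33.68,-5.26,39.34,0.26"
```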
Each of cbsurge's modules has its own test suite, which can be run independently.
make test
Before running the above command, please enter the Docker container first by using the devcontainer or make shell.
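For instance, a single module's tests could be run directly with pytest (the path below is purely illustrative; the actual test layout may differ):

```bash
# Illustrative only: the test path is an assumption, not the confirmed layout
pipenv run python -m pytest cbsurge/admin/tests -v
```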
- build docker image

make build

If you would like to build an image for production, execute the command below.
PRODUCTION=true make build
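Either way, you can confirm that an image was produced by listing your local images (the tag is whatever the Makefile assigns):

```bash
# List local Docker images to verify the build
docker images
```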
- launch docker container
make up
You can access JupyterHub at http://localhost:8100 in your browser.
- set users
Authenticated users can be set through the JUPYTER_USERS variable in the .env file.
cp .env.example .env
vi .env
JUPYTER_USERS can contain multiple users (username:password pairs, separated by spaces) for authentication:
JUPYTER_USERS=docker:docker user:user
The folder structure in the container will be as follows:

- /data - mounted to the fileshare. All files under this folder will be kept.
- /data/{username} - users can explore all users' files, but a user has no permission to edit other users' folders.
- /home/{username} - the user's home folder. This data will be lost when the server is restarted.
- stop docker container

make down

- enter shell in docker container

make shell
pipenv run rapida --help # run CLI in shell on docker container
You can log in to your UNDP account on the local machine and then mount the auth token information into the Docker container. The session class will then use your local authentication info for the tool.
Firstly, copy .env.example to create .env locally.
Set the following environment variables.
TENANT_ID=
CLIENT_ID=
CLIENT_ID (use the one from the Microsoft Azure CLI) can be found here. TENANT_ID is for UNDP; please ask an administrator for it.
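For illustration, a filled-in .env might look like the following. The TENANT_ID is a placeholder GUID you must replace; the CLIENT_ID shown is the well-known Microsoft Azure CLI application ID, consistent with the note above:

```bash
# TENANT_ID: placeholder, replace with the UNDP tenant ID from your administrator
TENANT_ID=00000000-0000-0000-0000-000000000000
# CLIENT_ID: the public Microsoft Azure CLI application ID
CLIENT_ID=04b07795-8ddb-461a-bbee-02f9e1bf7b46
```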
Create a new virtual environment on your local machine (e.g., with pipenv) and install the following dependencies.
pip install msal azure-core playwright azure-storage-blob click
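Since playwright drives a real browser, it also needs browser binaries downloaded once; this is playwright's standard setup command (installing only Chromium is a guess at which browser the auth script uses):

```bash
# One-time setup: download a browser binary for playwright
playwright install chromium
```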
Execute the Python file below on its own to authenticate on your local machine.
python cbsurge/az/auth.py
Run python cbsurge/az/auth.py --help to show usage.
Use -c {cache_dir} to change the folder path where token_cache.json is stored. By default, the script will create token_cache.json at ~/.cbsurge/token_cache.json.
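For example, to keep the cache under a custom directory (the path below is just an example):

```bash
# Store token_cache.json in a custom cache directory via the documented -c option
python cbsurge/az/auth.py -c ~/.config/cbsurge
```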
Open docker-compose.yaml and uncomment the following code to mount the JSON file from your local machine into the container. You may need to adjust the file path according to your environment settings.
volumes:
  - ~/.cbsurge/token_cache.json:/root/.cbsurge/token_cache.json
Use the command below to set up the rapida tool. If authentication successful appears in the log, the tool is using the credentials from your local machine directly.
rapida init