Merge branch 'master' into neelimagoogly-patch-4
neelimagoogly authored Jul 28, 2023
2 parents 1233a6a + 007eeea commit 005b099
Showing 12 changed files with 258 additions and 114 deletions.
10 changes: 6 additions & 4 deletions .github/workflows/backend_build.yml
@@ -51,15 +51,17 @@ jobs:

- name: Install gdal
run: |
-sudo apt-get -y install gdal-bin libgdal-dev python3-gdal && sudo apt-get -y autoremove && sudo apt-get clean
-pip install GDAL==$(gdal-config --version) --global-option=build_ext --global-option="-I/usr/include/gdal"
+sudo apt-get update && sudo apt-get -y install gdal-bin libgdal-dev python3-gdal && sudo apt-get -y autoremove && sudo apt-get clean
+pip install GDAL==$(gdal-config --version)
- name: Install ramp dependecies
run: |
cd ramp-code && cd colab && make install
- name: Install tensorflow
run: pip install tensorflow==2.9.2

- name: Install fair utilities
-run: pip install hot-fair-utilities==1.0.41
+run: pip install hot-fair-utilities

- name: Install Psycopg2
run: |
26 changes: 0 additions & 26 deletions Readme.md
@@ -25,29 +25,3 @@ To eliminate model biases, fAIr is built to work with the local communities and
See below a suggested product roadmap [subject to change] that provides a high-level overview of planned work.
![image](https://user-images.githubusercontent.com/98902727/218769416-b3c71d3b-7c20-4d40-ab1e-88442d06445d.png)

# General Workflow of fAIr

![fAIr1](https://github.com/hotosm/fAIr/assets/97789856/01c0e3b6-a00c-439d-a2ed-1c14b62e6364)

1. The project area defined by the project manager and imagery from OpenAerialMap are submitted to the Tasking Manager, and the data is sent to OpenStreetMap after manual mapping and validation.
2. A local dataset (created using the imagery and raw API data as inputs) is used to create and train a local model.
3. The model is then validated, published, used for mapping, and the results are pushed back into OpenStreetMap.
4. Finally, based on feedback, the published model is sent back for further training and improvement.
<hr>

# fAIr Architecture
![fAIr2](https://github.com/hotosm/fAIr/assets/97789856/63394f65-ce0d-4a3d-8683-7455f14fb366)

1. Third-party extensions are sent to the fAIr backend, which then generates data for the OSM raw data API, osmconflator, and geojson2osm XML.
2. Data from the fAIr backend is sent to fAIr utilities. The backend uses a separate library, fAIr-utilities, to handle:

   1. Data preparation for the models
   2. Model training
   3. The inference process
   4. Post-processing (converting the predicted features to geo data)
3. The public API is then consumed by the fAIr frontend.





2 changes: 1 addition & 1 deletion backend/Dockerfile
@@ -28,6 +28,6 @@ RUN pip install setuptools --upgrade
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

RUN mkdir /app

WORKDIR /app
COPY . /app
71 changes: 0 additions & 71 deletions backend/docker-compose.yml

This file was deleted.

18 changes: 10 additions & 8 deletions backend/docker_sample_env
@@ -1,17 +1,19 @@
DEBUG=True
SECRET_KEY=yl2w)c0boi_ma-1v5)935^2#&m*r!1s9z9^*9e5co^08_ixzo6
+DATABASE_URL=postgis://postgres:admin@pgsql:5432/ai
-EXPORT_TOOL_API_URL=http://44.203.33.53:8000/raw-data/current-snapshot/
-CORS_ALLOWED_ORIGINS=http://localhost:3000
-GDAL_LIBRARY_PATH=''
-MAXAR_CONNECT_ID=
+EXPORT_TOOL_API_URL=https://galaxy-api.hotosm.org/v1/raw-data/current-snapshot/
+CORS_ALLOWED_ORIGINS=http://127.0.0.1:3000
OSM_CLIENT_ID=
OSM_CLIENT_SECRET=
OSM_URL=https://www.openstreetmap.org
OSM_SCOPE=read_prefs
-OSM_LOGIN_REDIRECT_URI=http://127.0.0.1:8000/api/v1/auth/callback/
+OSM_LOGIN_REDIRECT_URI=http://127.0.0.1:3000/authenticate/
OSM_SECRET_KEY=
CELERY_BROKER_URL="redis://redis:6379/0"
CELERY_RESULT_BACKEND="redis://redis:6379/0"
RAMP_HOME="/home/kshitij/hotosm/fAIr-utilities"
TRAINING_WORKSPACE="/home/kshitij/hotosm/fAIr/backend/training"
TESTING_TOKEN=
RAMP_HOME="/RAMP_HOME"
TRAINING_WORKSPACE="/TRAINING_WORKSPACE"

TESTING_TOKEN=
GDAL_LIBRARY_PATH=''
MAXAR_CONNECT_ID=
2 changes: 1 addition & 1 deletion backend/requirements.txt
@@ -17,5 +17,5 @@ django_celery_results==2.4.0
flower==1.2.0
validators==0.20.0
gpxpy==1.5.0
-hot-fair-utilities==1.0.51
+hot-fair-utilities
geojson2osm==0.0.1
84 changes: 84 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,84 @@
version: "3.8"

services:
  postgres:
    restart: always
    image: postgis/postgis
    container_name: pgsql
    environment:
      - POSTGRES_DB=ai
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=admin
    ports:
      - "5434:5432"

  redis:
    image: redis
    container_name: redis
    ports:
      - "6379:6379"

  backend-api:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: api
    command: python manage.py runserver 0.0.0.0:8000
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: [gpu]
    ports:
      - 8000:8000
    volumes:
      - ./backend:/app
      - ${RAMP_HOME}:/RAMP_HOME
      - ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE
    depends_on:
      - redis
      - postgres

  backend-worker:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: worker
    command: celery -A aiproject worker --loglevel=INFO
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: [gpu]
    volumes:
      - ./backend:/app
      - ${RAMP_HOME}:/RAMP_HOME
      - ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE
    depends_on:
      - backend-api
      - redis
      - postgres

  worker-dashboard:
    image: mher/flower
    container_name: flower
    command: celery --broker=redis://redis:6379// flower --address=0.0.0.0 --port=5000
    ports:
      - 5500:5000
    depends_on:
      - backend-api
      - redis
      - backend-worker

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.frontend
    container_name: frontend
    command: npm start -- --host 0.0.0.0 --port 3000
    ports:
      - 3000:3000
    depends_on:
      - backend-api
126 changes: 126 additions & 0 deletions docs/Docker-installation.md
@@ -0,0 +1,126 @@
The Docker Compose setup bundles redis, the worker, the postgis database, the API, and the frontend all in one, making it easy to use for development. It is not recommended for production.

## [DEV] Installation With Docker

1. Clone Repo

```
git clone https://github.com/hotosm/fAIr.git
```

2. Get Docker Compose Installed

If Docker is not installed, install it from [here](https://docs.docker.com/engine/install/), then verify that the Compose plugin is available:
```
docker compose version
```
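As one convenient alternative to the distribution packages, Docker publishes an official convenience script (shown here as a quick sketch for dev machines; prefer the linked instructions above for production setups):

```
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
```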

3. Check your Graphics

fAIr works best with a graphics card and may not work with a CPU only, so using one is highly recommended; NVIDIA graphics cards are the ones tested.

Make sure you can see your graphics card details and that it can be accessed through Docker by installing the necessary drivers.

The following command shows your graphics card and driver details:
```
nvidia-smi
```
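To confirm that containers can also reach the GPU, you can run a quick sanity check (this assumes the NVIDIA Container Toolkit is installed on the host; the CUDA image tag below is only an example):

```
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```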

4. Clone the Base Model and Create RAMP_HOME

- Create a new folder called `ramp`, outside fAIr

```
mkdir ramp
```
- Download BaseModel Checkpoint from [here](https://drive.google.com/file/d/1wvJhkiOrSlHmmvJ0avkAdu9sslFf5_I0/view?usp=sharing)

```
pip install gdown
gdown --fuzzy https://drive.google.com/file/d/1wvJhkiOrSlHmmvJ0avkAdu9sslFf5_I0/view?usp=sharing
```
- Clone Ramp Code

```
git clone https://github.com/kshitijrajsharma/ramp-code-fAIr.git ramp-code
```
- Unzip the downloaded base model into ramp-code/ramp

```
unzip checkpoint.tf.zip -d ramp-code/ramp
```
- Export the `RAMP_HOME` env variable
Grab the file path of the `ramp` folder created earlier and export it as an env variable:
```
export RAMP_HOME=/home/YOUR_RAMP_LOCATION
```
e.g. `export RAMP_HOME=/home/kshitij/ramp`

- Export the `TRAINING_WORKSPACE` env variable
The training workspace is the folder where fAIr will store its training files, e.g.:
```
export TRAINING_WORKSPACE=/home/kshitij/hotosm/fAIr/trainings
```
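As an optional sanity check, confirm both variables are set in the shell you will later use to run Docker Compose:

```
echo "RAMP_HOME=$RAMP_HOME"
echo "TRAINING_WORKSPACE=$TRAINING_WORKSPACE"
```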

5. Register your Local setup to OSM

- Go to [OpenStreetMap](https://www.openstreetmap.org/) and log in or create an account
- Click on your profile and open `My Settings`
- Navigate to `OAuth 2 applications`
- Register a new application
- Check the `Read user preferences` permission, set the redirect URI to `http://127.0.0.1:3000/authenticate/`, and name it `fAIr Dev Local`
- Copy the `OSM_CLIENT_ID` and `OSM_CLIENT_SECRET` you are given

6. Create Env variables
- Create a `.env` file in the backend folder from the [docker_sample_env](../backend/docker_sample_env) template:
```
cp docker_sample_env .env
```
- Fill in `OSM_CLIENT_ID` and `OSM_CLIENT_SECRET` in the `.env` file, then generate a unique key and paste it into `OSM_SECRET_KEY` (it can be any random string for a dev setup)

Leave the rest of the items as they are unless you know what you are doing

- Create ```.env``` in /frontend
```
cp .env_sample .env
```
You can leave it as-is for a dev setup
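If you want a quick way to generate the random value for `OSM_SECRET_KEY` in the backend `.env`, one illustrative option (assuming Python 3 is available on the host; any random string works for a dev setup) is:

```
python -c "import secrets; print(secrets.token_urlsafe(50))"
```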

7. Build & Run containers

```
docker compose build
```

```
docker compose up
```
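If you prefer to keep the stack running in the background, you can start it detached and follow the logs separately:

```
docker compose up -d
docker compose logs -f
```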

8. Run Migrations

List the running containers, grab the API container's ID, and launch a bash shell to run the migrations (this is only needed the first time, to set up the database):

```
docker container ps
```

Grab the container ID and open bash inside it:

```
docker exec -it CONTAINER_ID bash
```

Once you have a bash prompt, run the following commands:

```
python manage.py makemigrations
python manage.py makemigrations login
python manage.py makemigrations core
python manage.py migrate
```
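Alternatively, because the compose file names the API service `backend-api`, you can open a shell in it without looking up the container ID (run this from the folder containing docker-compose.yml while the stack is up):

```
docker compose exec backend-api bash
```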

9. Play and Develop

Restart containers

```
docker compose restart
```

The frontend will be available on port 3000, the backend on port 8000, and Flower on port 5500.
2 changes: 1 addition & 1 deletion frontend/.env sample → frontend/.env_sample
@@ -1,4 +1,4 @@
REACT_APP_CONNECT_ID=
REACT_APP_TM_API=https://tasking-manager-tm4-production-api.hotosm.org/api/v2/projects/PROJECT_ID/tasks/
REACT_APP_ENV=Dev
-REACT_APP_API_BASE=http://localhost:8000/api/v1
+REACT_APP_API_BASE=http://127.0.0.1:8000/api/v1