Clone the repo:
git clone git@github.com:security-force-monitor/sfm-cms.git
cd sfm-cms
Next, create a local settings file. If you're on the Blackbox keyring for the repo, decrypt the development settings file:
blackbox_cat configs/settings_local_dev.py.gpg > sfm_pc/settings_local.py
If you're not on the keyring, you can copy the example settings file and set your own secret variables. Be sure to set the GOOGLE_MAPS_KEY in order to load maps!
cp sfm_pc/settings_local.example.py sfm_pc/settings_local.py
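For example, you can open the copied file and set the key by hand, or append it from your shell. The snippet below is only an illustration; the exact variable names in the example settings file may differ:
# Illustrative only: add your Google Maps key to the copied settings file
echo 'GOOGLE_MAPS_KEY = "your-google-maps-api-key"' >> sfm_pc/settings_local.py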
Finally, build the application:
docker-compose build
Once you've built the app, there are two ways that you can load data into your database.
The standard data loading process for the app can take many hours to complete. If you're working with a developer who already has a database set up, you can save time by having them make a dump of their database for you:
docker-compose run --rm app pg_dump -Fc -h postgres -U postgres -d sfm -O > sfm_empty.dump
Then, load this dump into your own database:
docker-compose run --rm app pg_restore -h postgres -U postgres -d sfm -O sfm_empty.dump
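To sanity-check the restore, you can list the tables in the sfm database. This is just a quick spot check and assumes the psql client is available inside the app container:
docker-compose run --rm app psql -h postgres -U postgres -d sfm -c '\dt'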
Once the dump has been restored, move on to the step Set up the search index below.
The standard way to load data into the app is to use its data-loading management commands. Start by updating the Countries models:
docker-compose run --rm app ./manage.py update_countries_plus
OSM data is provided by the SFM team in a special GeoJSON format. You can import location data from a local file by running the import_locations command directly:
# import fixtures/locations.geojson
docker-compose run --rm app ./manage.py import_locations
# import geojson file from arbitrary path
docker-compose run --rm app ./manage.py import_locations --location_file path/to/your/file
Alternatively, you can pass an optional --location_file_id argument to download_country_data to retrieve the specified file from Google Drive and import it before importing entity data. (Read on for instructions!)
The download_country_data command expects to find credentials for the Google Sheets and Google Drive APIs in sfm_pc/management/commands/credentials.json.
If you are on the keyring for this project, run the following Blackbox command to decrypt and create the expected credentials file before you run download_country_data. (N.b., the data import service account lives under the SFM - Data Import project in the Google API Console.)
blackbox_cat configs/credentials.json.gpg > sfm_pc/management/commands/credentials.json
If you are not on the keyring, enable the Google Sheets and Google Drive APIs in the Google API Console, then create a service account to access them. This blog post provides a helpful walkthrough! Be sure to save your credentials in sfm_pc/management/commands/credentials.json and give your service account access to the correct files before you run download_country_data.
Finally, import entity data from Google Drive:
# Download country data from Google Drive (document IDs are listed in fixtures/import_docket.csv)
docker-compose run --rm app python manage.py download_country_data --entity_doc_id ${SOME_ID} --sources_doc_id ${SOME_ID} --location_doc_id ${SOME_ID} --country_code ${SOME_CODE} --parent_directory ${DIRECTORY}
# Import entity data from specified files
docker-compose run --rm app python -u manage.py import_country_data --country_code ${SOME_CODE} --country_path ${COUNTRY_PATH} --sources_path ${SOURCES_PATH}
See Importing the data for more detailed information about these management commands.
Build and start the Docker image for the Solr server:
docker-compose up --build solr
Open up another shell and create the search index:
docker-compose run --rm app ./manage.py update_index
docker-compose run --rm app ./manage.py update_composition_index
Then, start the app:
docker-compose up
Open http://localhost:8000/ to browse your shiny new website! New to WWIC development? See Getting started with development for helpful information about how the app is organized, as well as tips and tricks.
We use Django translation together with Rosetta and django-complex-fields.
Template translations appear inside trans tags, like so:
{% trans "Countries" %}
Model field and form error translations appear inside _() hooks, like so:
from django.utils.translation import gettext as _
...
field_name = _("End date")
This nomenclature signals that the text can be translated into the user's specified language. But first, someone with language expertise must provide the appropriate translation. Happily, Django can extract all translatable strings into a message file:
docker-compose run --rm app ./manage.py makemessages -l es
docker-compose run --rm app ./manage.py makemessages -l fr
This command generates a .po file for each language. Rosetta facilitates editing and compiling these files. Go to /rosetta/ to view the snippets of code requiring translation, organized by language. Then, translate some text and click "Save and translate next block"; Rosetta compiles the translations into Django-friendly message files.
While we keep translation files under version control, our deploy scripts preserve translation files between deployments. That means that if you make changes to translated content, you need to integrate your change with the deployed translation file, then manually replace the file on the server.
First, retrieve the deployed translations from the server.
# Assuming you are in the sfm-cms directory
mv -v locale locale-bk # Optionally back up your local translations
scp -r ${USER}@${INSTANCE_DOMAIN}:/path/to/app/locale locale/
Then, update the message files to include your changes.
python manage.py makemessages
Review the generated changes, make any adjustments, and compile the messages.
python manage.py compilemessages
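If you prefer to run these steps through Docker like the rest of the local workflow, the equivalent commands look roughly like this (assuming the gettext tools are installed in the app image):
docker-compose run --rm app ./manage.py makemessages -l es -l fr
docker-compose run --rm app ./manage.py compilemessages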
Add the revised message files to version control for use in further development. Finally, upload the updated translations to the server, move them to the correct place, and restart the app for the translations to appear onsite.
# On your machine
scp -r locale/ ${USER}@${INSTANCE_DOMAIN}:/tmp
ssh ${USER}@${INSTANCE_DOMAIN}
# On the server
sudo mv -v /path/to/app/locale /path/to/app/locale-bk # Optionally back up deployed translations
sudo mv -v /tmp/locale /path/to/app/locale
sudo chown -R ${USER}:${GROUP} /path/to/app/locale
sudo supervisorctl restart sfm:sfm-cms
Run all tests from the root folder:
docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app
You can also run a collection of tests, like so:
docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app pytest tests/test_person.py
Or run a single test, like so:
docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app pytest tests/test_person.py::test_no_existing_sources
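pytest's standard selection flags also work here. For example, to stop on the first failure, or to filter tests by keyword (the keyword below is only an example):
docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app pytest -x
docker-compose -f docker-compose.yml -f tests/docker-compose.yml run --rm app pytest -k sources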
The data that powers WWIC is imported from three key Google Drive documents:
- A "sources" spreadsheet applicable to the entire import
- A "locations" GeoJSON file containing all implicated OSM data for entities in a particular country
- A "data" spreadsheet containing organization, person, and event entity data for a particular country
An import involves running a management command against a set of source, location, and data documents that represent either data for a new country, or updated data for an existing country.
The import has two pieces:
- The download_country_data management command, which lives in the sfm_pc directory of this project, downloads the data from Google Drive and saves it locally. As of February 2021, spreadsheets to be imported must follow this format: https://docs.google.com/spreadsheets/d/1-U_pVNDWFIlG8jLZLVujmV_FZ0H4IsbtfCeRiM8HeNQ/edit?usp=sharing
- The import_country_data management command uses the local files to import the data into the database.
The following sections explain our process for running these imports locally and on live instances of WWIC.
When performing an import of new data, the first step should always be to test the download in your development environment using the download_country_data management command:
docker-compose run --rm app python manage.py download_country_data \
--sources_doc_id <some id> \
--location_doc_id <some id> \
--entity_doc_id <some id> \
--country_code <some code> \
--parent_directory <target directory>
Once you have the data downloaded, you can import it by running the import_country_data management command:
docker-compose run --rm app python manage.py import_country_data \
--country_code <some code> \
--country_path <directory where the country files are stored> \
--sources_path <directory with sources file>
Alternatively, you can run the Make recipes. The Makefile contains recipes to import all of the data on the docket, import the first country in the CSV, or import a particular country by its country code.
# Import everything on the docket
docker-compose --env-file .env.import run --rm app make -e import_docket_import
# Import the first country on the docket
docker-compose --env-file .env.import run --rm app make -e next_import
# Import a particular country or countries
docker-compose --env-file .env.import run --rm app make -e ${COUNTRY_CODE}_cc_import [ ${COUNTRY_CODE}_cc_import ... ]
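As an illustration, a single-country invocation might look like the following, where mx stands in for whatever country code appears in the docket:
docker-compose --env-file .env.import run --rm app make -e mx_cc_import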
The download_country_data management command relies upon a Service Account having access to the spreadsheet. This service account is managed under the SFM - Data Import project in the Google API Console.
If you are on the keyring for this project, decrypt the credentials for the service account like so:
gpg -d configs/credentials.json.gpg > sfm_pc/management/commands/credentials.json
Then, open sfm_pc/management/commands/credentials.json and make note of the client_email value. You will need to share the files to be imported with this email address. To grant the service account access to the documents, navigate to the Share button on each document and grant the client_email of the service account "Read" access. (You can also select multiple files, e.g., in the same Drive, and update the sharing settings in one go.)
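If you'd rather grab the client_email from the command line, a one-liner like this works (it assumes python3 is on your path; jq would do the job just as well):
python3 -c "import json; print(json.load(open('sfm_pc/management/commands/credentials.json'))['client_email'])"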
As of February 2021, SFM has shared with us a CSV file of source, location, and data document IDs, as well as the country code they pertain to. This CSV lives at fixtures/import_docket.csv. If for some reason you need to retrieve document IDs yourself, you can find the ID by opening the document in your browser and copying the ID from the URL in your address bar:
https://docs.google.com/spreadsheets/d/<doc id (very long hash looking thing)>/edit
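If you have the document URL in your shell, a small (purely illustrative) one-liner can pull the ID out for you:
echo "$DOC_URL" | sed -E 's|.*/d/([^/?]+).*|\1|'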
If the importer raises warnings during the import, it will log them to logfiles following the pattern ${country_code}_${entity_type}_errors. Once the import is complete, take a look at your repo to check whether the importer generated any of these warning logfiles. If it did, send them to the SFM team so they can make the necessary adjustments to the data. We typically delete logfiles between import runs, since the importer will append to a logfile if one already exists.
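For example, you might check for and clean up the warning logfiles like so; the exact filenames depend on the country code and entity types in your import:
ls *_errors    # list any warning logfiles the importer produced
rm -v *_errors # remove them before the next run, once they've been sent to SFM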
Once your import has completed without errors, refresh the derived data views and confirm that nothing breaks:
docker-compose run --rm app ./manage.py rebuild_index --noinput
docker-compose run --rm app ./manage.py update_composition_index --recreate
If you need to work with the data_archive make recipe in your local development environment, cp .env.s3.example .env.s3 and add your AWS access tokens. These tokens must be for an IAM user attached to the correct policy for accessing the wwic-data-archive-staging S3 bucket. This bucket is configured to work in local development and on the staging server.
To create an archive locally and upload the zip archive to S3, run docker-compose --env-file .env.s3 run --rm app make data_archive. Once this is done, the "download" link at localhost:8000/en/download/ should work.
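To confirm that the archive actually landed in S3, you can list the bucket with the AWS CLI, assuming it is installed and configured with the same credentials:
aws s3 ls s3://wwic-data-archive-staging/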
Once you've tested an import of a new dataset locally, you're ready to run the import on a live instance.
Since all live instances should already have credentials in place, you can run a live import using the Make recipes:
tmux new -s fresh-import
sudo su - datamade
workon sfm
cd ~/sfm-cms
# Import everything on the docket
make import_docket_import
# Import a particular country or countries
make -e ${COUNTRY_CODE}_cc_import [ ${COUNTRY_CODE}_cc_import ... ]
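# Rebuild the search indexes once the import completes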
python manage.py rebuild_index --noinput
python manage.py update_composition_index --recreate
If you need to fully rerun the entire WWIC import for all countries without disrupting a live instance, the Makefile includes a recipe for doing just that.
On every deploy, our build scripts create a separate directory, sfm-importer. This directory is a copy of the app that uses a separate database named importer. We'll use this database for our imports so that the server can keep on serving data normally during that process (which can take several hours!).
To perform a new import, start by creating a tmux session so your work doesn't get interrupted. Then, activate the virtualenv for the project, cd into the appropriate directory, and fire the recipe to build a fresh database:
tmux new -s fresh-import
sudo su - datamade
source ~/.virtualenvs/sfm/bin/activate
cd ~/sfm-importer
make recreate_db
Finally, switch the sfm and importer databases:
# Renames the databases in a transaction -- the app doesn't need to stop
psql -U postgres < sfm_pc/management/commands/flush/rename.sql
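As a quick sanity check after the swap, you can list the databases and confirm that both sfm and importer still exist (this assumes the postgres user can connect locally, as in the rename step above):
psql -U postgres -l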
Presto! A fresh import, with no server downtime.