- At least 32GB of RAM
- At least 200GB of free disk space
- Python 3.6+
- pip for Python 3
- protobuf Python package (`pip3 install protobuf`)
- shapely 1.7.1 or later Python package (`pip3 install "shapely>=1.7.1"`)
- osmium 3.1.0 or later Python package (`pip3 install "osmium>=3.1.0"`)
It is highly recommended to use PyPy instead of CPython, as it can speed up processing severalfold. It is also highly recommended to keep the input and output files on solid-state storage, as the workloads are very IO-intensive.
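The hardware prerequisites above can be sanity-checked before starting any of the long-running steps. The following is a small stand-alone sketch (not part of the repository's scripts; the RAM check is POSIX-only):

```python
import os
import shutil

MIN_RAM_BYTES = 32 * 1024**3    # at least 32GB of RAM
MIN_DISK_BYTES = 200 * 1024**3  # at least 200GB of free disk space

def check_prerequisites(work_dir="."):
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    free = shutil.disk_usage(work_dir).free
    if free < MIN_DISK_BYTES:
        problems.append("only %.0f GB free in %s" % (free / 1024**3, work_dir))
    try:
        # POSIX only; on other platforms the RAM check is silently skipped.
        ram = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
        if ram < MIN_RAM_BYTES:
            problems.append("only %.0f GB of RAM detected" % (ram / 1024**3))
    except (AttributeError, ValueError, OSError):
        pass
    return problems
```

Run it from the working directory that will hold the input and output files.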
This is an optional step that is normally not needed: it is required only if a different structure of offline packages is desired.
A prerequisite for this step is a directory containing .poly files.
python3 scripts/build_poly_tilemasks.py POLY_FILE_DIRECTORY PACKAGES_TEMPLATE_DIRECTORY
PACKAGES_TEMPLATE_DIRECTORY is the output directory for the 'packages.json.template' file, which is needed as an input for the other stages. The whole process takes around 30 minutes, depending on the complexity of the .poly files.
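For reference, .poly files use the Osmosis polygon filter format: a name line, then one or more coordinate sections (a leading '!' marks a hole to subtract), each terminated by an END line, with a final END closing the file. A minimal parser sketch, shown for illustration only (this is not the code used by build_poly_tilemasks.py):

```python
def parse_poly(text):
    """Parse Osmosis .poly text into
    (name, [(section_name, is_hole, [(lon, lat), ...]), ...])."""
    lines = iter(text.splitlines())
    name = next(lines).strip()
    sections = []
    for line in lines:
        token = line.strip()
        if token == "END":        # terminator of the whole file
            break
        is_hole = token.startswith("!")  # '!' marks a hole section
        ring = []
        for coord_line in lines:
            coord = coord_line.strip()
            if coord == "END":    # terminator of this section
                break
            lon, lat = map(float, coord.split())
            ring.append((lon, lat))
        sections.append((token.lstrip("!"), is_hole, ring))
    return name, sections

# Example with a hypothetical single-section polygon:
sample = ("australia\n"
          "first_area\n"
          "  151.2 -33.8\n"
          "  151.3 -33.9\n"
          "  151.2 -34.0\n"
          "END\n"
          "END\n")
name, sections = parse_poly(sample)
```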
A prerequisite for this is a 'planet.mbtiles' file containing tiles for the whole planet (about 70GB).
python3 scripts/build_map_packages.py data/packages-carto.json.template PLANET_MBTILES_FILE PACKAGES_DIRECTORY
The individual package .mbtiles files are placed into PACKAGES_DIRECTORY. The script also generates a 'packages.json' file that contains URLs to the individual packages. Once the packages are uploaded, the URLs in this file need to be updated, after which 'packages.json' can then be uploaded as well. The step should take about 20 hours using an 8-core CPU.
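Updating the URLs can be scripted. The sketch below assumes a hypothetical 'packages.json' layout with a top-level "packages" list whose entries carry a "url" field; the real layout may differ, so adapt the field names accordingly:

```python
import json

def rewrite_urls(doc, base_url):
    """Point every package URL at base_url, keeping the original file name.

    Assumes (hypothetically) that doc looks like:
    {"packages": [{"id": ..., "url": ...}, ...]}
    """
    for pkg in doc.get("packages", []):
        filename = pkg["url"].rsplit("/", 1)[-1]
        pkg["url"] = base_url.rstrip("/") + "/" + filename
    return doc

# Example: repoint a locally built package at an upload location.
doc = {"packages": [{"id": "EE", "url": "file:///tmp/EE.mbtiles"}]}
out = rewrite_urls(doc, "https://example.com/packages/")
```

The same approach applies to the 'packages.json' files produced by the routing and geocoding stages below.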
This step is optional; it reduces the total size of the packages by about 5-10%, though some packages (Greenland, for example) can be up to 2x smaller after it.
python3 scripts/zdict_from_packages.py "PACKAGES_DIRECTORY/*.mbtiles" ZDICT_DIRECTORY
This step should take around 4 hours to complete. As a result, .zdict files are created and stored in ZDICT_DIRECTORY.
After the dictionary files are created, the packages can be rebuilt using the .zdict files:
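The idea behind shared dictionaries can be illustrated with the standard library's zlib codec, which also supports preset dictionaries (the actual codec behind the .zdict files may differ): tiles from the same package repeat the same layer and attribute names, so a shared dictionary lets each tile avoid storing that boilerplate itself.

```python
import zlib

# Hypothetical shared boilerplate extracted from many similar tiles:
DICTIONARY = b'{"layer":"roads","name":"","class":"motorway"}'

def compress(tile, zdict=None):
    """Compress a tile, optionally with a preset dictionary."""
    comp = (zlib.compressobj(level=9, zdict=zdict) if zdict is not None
            else zlib.compressobj(level=9))
    return comp.compress(tile) + comp.flush()

tile = b'{"layer":"roads","name":"A1","class":"motorway"}'
plain = compress(tile)
with_dict = compress(tile, DICTIONARY)

# Decompression must use the same dictionary.
roundtrip = zlib.decompressobj(zdict=DICTIONARY).decompress(with_dict)
```

The dictionary-compressed form is smaller because the boilerplate is resolved as back-references into the dictionary instead of being stored in every tile.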
python3 scripts/build_map_packages.py --zdict ZDICT_DIRECTORY data/packages-carto.json.template PLANET_MBTILES_FILE PACKAGES_DIRECTORY
The .mbtiles files generated after this step are no longer usable with other SDKs or tools, as they use a custom .mbtiles extension supported only by the CARTO Mobile SDK.
A prerequisite for this is Valhalla 3 installation.
First download the large 'planet.osm.pbf' file (about 50GB). Build the Valhalla routing tiles:
valhalla_build_tiles -c data/valhalla.json PATH_TO_OSM_PBF_FILE
This step should take around 24 hours.
python3 scripts/build_valhalla_packages.py data/packages-carto.json.template valhalla_tiles PACKAGES_DIRECTORY
The individual package .vtiles files are placed into PACKAGES_DIRECTORY. The script also generates a 'packages.json' file that contains URLs to the individual packages. Once the packages are uploaded, the URLs in this file need to be updated, after which 'packages.json' can then be uploaded as well. The step should take about 12 hours using an 8-core CPU.
First download the latest WhosOnFirst gazetteer database as a SQLite file from <https://dist.whosonfirst.org/sqlite/>.
Create an R-tree index for fast spatial queries on the downloaded database:
python3 scripts/build_wof_index.py WHOSONFIRST_FILE
This should take 5-10 minutes.
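For background, the general technique looks like the sketch below, which uses SQLite's built-in R*Tree module (available in standard Python builds). This is an illustration only, with made-up table and column names, not the actual schema created by build_wof_index.py:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# An rtree virtual table stores bounding boxes and answers window
# queries without scanning every row.
db.execute("CREATE VIRTUAL TABLE place_index USING rtree("
           "id, min_lon, max_lon, min_lat, max_lat)")
# Bounding boxes for two hypothetical gazetteer records.
db.execute("INSERT INTO place_index VALUES (1, 24.5, 24.9, 59.3, 59.5)")
db.execute("INSERT INTO place_index VALUES (2, 2.2, 2.5, 48.8, 48.9)")
# Window query: which records contain the point (lon 24.7, lat 59.4)?
rows = db.execute(
    "SELECT id FROM place_index "
    "WHERE min_lon <= ? AND max_lon >= ? AND min_lat <= ? AND max_lat >= ?",
    (24.7, 24.7, 59.4, 59.4)).fetchall()
```

Candidate bounding boxes found this way still need an exact point-in-polygon test (e.g. with shapely) against the full geometry.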
First download the large 'planet.osm.pbf' file (about 50GB). Build the per-package .pbf extracts:
python3 scripts/extract_package_pbfs.py data/packages-carto.json.template PLANET_OSM_PBF_FILE PBF_EXTRACT_DIRECTORY
This step should take around 24 hours using an 8-core CPU. The amount of memory is critical: at least 32GB is required.
This step extracts addresses, POIs, streets, and buildings from the .pbf extracts:
python3 scripts/build_osm_addresses.py data/packages-carto.json.template PBF_EXTRACT_DIRECTORY OSM_ADDRESS_DIRECTORY
This should take around 20 hours using an 8-core CPU.
python3 scripts/build_geocoding_packages.py data/packages-carto.json.template OSM_ADDRESS_DIRECTORY WHOSONFIRST_FILE PACKAGES_DIRECTORY
The individual package .nutigeodb files are placed into PACKAGES_DIRECTORY. The script also generates a 'packages.json' file that contains URLs to the individual packages. Once the packages are uploaded, the URLs in this file need to be updated, after which 'packages.json' can then be uploaded as well. The step should take about 12 hours using an 8-core CPU.