Download Sentinel-2 images according to criteria (product type, cloud cover, ...) for a given location using Kalisio Krawler https://github.com/kalisio/krawler
It scrapes public Sentinel image data from the ESA Open Access Hub public server https://scihub.copernicus.eu/userguide/WebHome.
The image information (id, name, url, quicklook) is stored in a MongoDB database.
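As a rough illustration (only the field names come from this job; the values below are made up), each image ends up as a MongoDB document along these lines:
// Hypothetical example of one stored image document (illustrative values)
{
  id: '0a1b2c3d-1111-2222-3333-444455556666',
  name: 'S2A_MSIL2A_20220601T105031_N0400_R051_T31TCJ_20220601T140155',
  url: "https://scihub.copernicus.eu/dhus/odata/v1/Products('0a1b2c3d-1111-2222-3333-444455556666')/$value",
  quicklook: "https://scihub.copernicus.eu/dhus/odata/v1/Products('0a1b2c3d-1111-2222-3333-444455556666')/Products('Quicklook')/$value"
}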
Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.
Download and install: https://git-scm.com
nvm is a version manager for Node.js, designed to be installed per-user and invoked per-shell.
Download and install: https://github.com/coreybutler/nvm-windows
Then install Node.js 16
nvm install 16.15.1
nvm use 16.15.1
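You can check that the expected version is active with
nvm list
node --version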
Yarn is a package manager that doubles down as project manager.
Download and install: https://classic.yarnpkg.com/lang/en/docs/install/#windows-stable
Add Yarn to the Windows PATH environment variable, for instance as sketched below.
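A hedged PowerShell sketch, assuming Yarn was installed to its default location (adjust the path to your actual install directory):
# Append the Yarn bin directory to the user PATH (default installer location assumed)
[Environment]::SetEnvironmentVariable("Path", $env:Path + ";C:\Program Files (x86)\Yarn\bin", "User")
# Open a new terminal, then check that Yarn is found
yarn --version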
From Docker Hub https://hub.docker.com/_/mongo
docker pull mongo
docker run --name some-mongo -p 27017:27017 -d mongo
Install MongoDB Compass to explore and manipulate your database: https://www.mongodb.com/try/download/compass
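Assuming the container publishes MongoDB's default port on localhost as in the docker run command above, Compass can connect with the connection string
mongodb://localhost:27017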
Krawler is a minimalist ETL that makes it easy to automate the extraction and processing of (geographic) data from heterogeneous sources.
In the working directory (e.g. c:\workspace), create a kalisio directory
cd c:\workspace\kalisio
git clone https://github.com/kalisio/krawler
cd krawler
yarn install
yarn link
In the kalisio directory (e.g. C:\workspace\kalisio)
git clone https://github.com/calysteau/krawler-job04
cd krawler-job04
yarn install
yarn link @kalisio/krawler
Set parameters in jobfile.js
// Mission and product type (Copernicus Open Access Hub catalogue values)
const platformname = 'Sentinel-2'
const producttype = 'S2MSI2A'
// Location of interest (WGS84 decimal degrees)
const latitude = '43.604652'
const longitude = '1.444209'
// Acceptable cloud cover range, in percent
const cloudcoverpercentagemin = '0'
const cloudcoverpercentagemax = '5'
// Register an Open Access Hub account on https://scihub.copernicus.eu/userguide/SelfRegistration and set your login/password credentials
const username = 'login'
const userpassword = 'password'
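For reference, these parameters roughly correspond to the following Open Access Hub OpenSearch query; this is only a sketch of the kind of request the job issues, not necessarily how jobfile.js builds it:
// Illustrative only: the kind of OpenSearch request these parameters translate to
const query = `platformname:${platformname} AND producttype:${producttype}` +
  ` AND cloudcoverpercentage:[${cloudcoverpercentagemin} TO ${cloudcoverpercentagemax}]` +
  ` AND footprint:"Intersects(${latitude}, ${longitude})"`
const url = `https://scihub.copernicus.eu/dhus/search?q=${encodeURIComponent(query)}&format=json`
// The request is authenticated with HTTP Basic auth using username / userpassword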
Run the job
krawler jobfile.js
If you need to activate Krawler DEBUG output
set DEBUG=krawler* (for CMD)
or
$env:DEBUG="krawler*" (for PowerShell)