Merge pull request #19 from ec-jrc/development
Github action, ADW, time shift fix
doc78 committed Jul 17, 2023
2 parents 814282b + 52f0840 commit 29084bc
Showing 12 changed files with 331 additions and 65 deletions.
13 changes: 13 additions & 0 deletions .github/workflows/ci_env.yml
@@ -27,6 +27,19 @@ jobs:
shell: bash -el {0}
run: |
pip install -r requirements.txt
pip install .
- name: Workaround for poppler lib issue
shell: bash -el {0}
run: |
conda list
pip install poppler-utils
conda install 'poppler=>22.10.0'
- name: Check installation
shell: bash -el {0}
run: |
conda list
gdal-config --version
python -c "from osgeo import gdal; print(gdal.__version__)"
- name: Test with pytest
shell: bash -el {0}
run: |
26 changes: 20 additions & 6 deletions README.md
@@ -410,7 +410,7 @@ defining the step numbers to skip when writing PCRaster maps. Same as old grib2p
grib2pcraster). It can be average or accumulation.</td>
</tr>
<tr>
<td>&nbsp;</td><td><b>halfweights</b></td><td>If set to True and type is "average", it will avaluate the average by using half weights for the first and the last step</td>
<td>&nbsp;</td><td><b>halfweights</b></td><td>If set to true and type is "average", it will evaluate the average by using half weights for the first and the last step</td>
</tr>
<tr>
<td>&nbsp;</td><td>forceZeroArray</td><td>Optional. In case of “accumulation”, and only
@@ -669,6 +669,20 @@ Attributes p, leafsize and eps for the kd tree algorithm are default in scipy library:
| eps | 0 |
| leafsize | 10 |
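For intuition, the invdist mode's weighting can be sketched in plain numpy. This is a brute-force illustration only: pyg2p itself uses scipy's kd-tree with the defaults above, and the `invdist_sketch` helper below is a hypothetical name, not part of the library.

```python
import numpy as np

def invdist_sketch(src_pts, src_vals, tgt_pts, nnear=4, p=2):
    """Illustrative inverse-distance weighting (brute force, no kd-tree)."""
    out = np.empty(len(tgt_pts))
    for i, t in enumerate(tgt_pts):
        d = np.linalg.norm(src_pts - t, axis=1)
        idx = np.argsort(d)[:nnear]       # nnear nearest source points
        if d[idx[0]] < 1e-10:             # target coincides with a source point
            out[i] = src_vals[idx[0]]
            continue
        w = 1.0 / d[idx] ** p             # inverse-distance weights, exponent p
        out[i] = np.dot(w, src_vals[idx]) / w.sum()
    return out
```

A constant input field is reproduced exactly, and a target that falls on a source point returns that point's value, which is the behaviour the real kd-tree-based implementation also guarantees.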
#### ADW
This is the Angular Distance Weighted (ADW) algorithm, using scipy.kd_tree with 4 neighbours.
If @adw_broadcasting is set to true, computations will run in full broadcasting mode, which requires more memory.
```json
{
  "Interpolation": {
    "@latMap": "/dataset/maps/europe5km/lat.map",
    "@lonMap": "/dataset/maps/europe5km/long.map",
    "@mode": "adw",
    "@adw_broadcasting": false
  }
}
```
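One common formulation of ADW combines inverse-distance weights with a directional correction that boosts neighbours lying in otherwise empty directions. The sketch below illustrates that idea; the function name and the exact formula are assumptions for illustration, not pyg2p's actual code.

```python
import numpy as np

def adw_weights(target, neighbours, p=2):
    """Sketch of a Shepard-style angular distance weighting for one target
    point. Returns normalised weights for each neighbour."""
    vec = neighbours - target                  # direction target -> neighbour
    d = np.linalg.norm(vec, axis=1)
    w = 1.0 / d ** p                           # inverse-distance part
    unit = vec / d[:, None]
    cos = unit @ unit.T                        # cosine of angles between neighbour directions
    k = len(neighbours)
    a = np.empty(k)
    for i in range(k):
        others = [j for j in range(k) if j != i]
        # neighbours angularly far from the others get a larger correction
        a[i] = np.sum(w[others] * (1 - cos[i, others])) / np.sum(w[others])
    wk = w * (1 + a)
    return wk / wk.sum()
```

With 4 equidistant neighbours in the 4 cardinal directions the weights come out equal, as symmetry requires.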
#### bilinear
This is the bilinear interpolation algorithm, applied on both regular and irregular grids. On irregular grids, it tries to find the best quadrilateral around each target point while preferring a stable, grid-like shape of the starting points. To do so, it evaluates the interpolation using points at similar latitudes, so projected GRIB files may show some irregular results.
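For the regular-grid case, the classic bilinear formula can be sketched as below. This is an illustration only: `bilinear_regular` is a hypothetical helper, and the irregular-grid handling described above is considerably more involved.

```python
import numpy as np

def bilinear_regular(x0, y0, dx, dy, grid, x, y):
    """Bilinear interpolation on a regular grid (illustrative sketch).
    grid[j, i] holds the value at (x0 + i*dx, y0 + j*dy)."""
    fx = (x - x0) / dx
    fy = (y - y0) / dy
    i, j = int(np.floor(fx)), int(np.floor(fy))  # cell containing the target
    tx, ty = fx - i, fy - j                      # fractional position in the cell
    return ((1 - tx) * (1 - ty) * grid[j, i]
            + tx * (1 - ty) * grid[j, i + 1]
            + (1 - tx) * ty * grid[j + 1, i]
            + tx * ty * grid[j + 1, i + 1])
```

Bilinear interpolation reproduces any field that is linear in x and y exactly, which is a handy sanity check.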
@@ -734,7 +748,7 @@ The JSON configuration in the execution file will look like:
{
"Aggregation": {
"@type": "average",
"@halfweights": False}
"@halfweights": false}
}
```
@@ -752,7 +766,7 @@ Temperatures are often extracted as averages on 24 hours or 6 hours. Here's a typical example:
"Aggregation": {
"@step": 24,
"@type": "average",
"@halfweights": False
"@halfweights": false
},
"OutMaps": {
"@cloneMap": "/dataset/maps/europe/dem.map",
@@ -798,7 +812,7 @@ This is needed because we performed 24 hours average over 6 hourly steps.
Then, to evaluate the average, the following steps are executed:
- when "halfweights" is False, the results of the function is the sum of all the values from "start_step-aggregation_step+1" to end_step, taking for each step the value corresponding to the next available value in the grib file. E.g:
- when "halfweights" is false, the result of the function is the sum of all the values from "start_step-aggregation_step+1" to end_step, taking for each step the value corresponding to the next available value in the grib file. E.g.:
INPUT: start_step=24, end_step=<not specified, will take the end of file>, aggregation_step=24
GRIB File: contains data starting from step 0 to 48 every 6 hours: 0,6,12,18,24,30,....
@@ -807,7 +821,7 @@ GRIB File: contains data starting from step 0 to 48 every 6 hours: 0,6,12,18,24,
Day 2: same as Day 1 starting from (24+24)-24+1=25...
- when "halfweights" is True, the results of the function is the sum of all the values from "start_step-aggregation_step" to end_step, taking for each step the value corresponding to the next available value in the grib file but using half of the weights for the first and the last step in each aggregation_step cicle. E.g:
- when "halfweights" is true, the result of the function is the sum of all the values from "start_step-aggregation_step" to end_step, taking for each step the value corresponding to the next available value in the grib file but using half of the weights for the first and the last step in each aggregation_step cycle. E.g.:
INPUT: start_step=24, end_step=<not specified, will take the end of file>, aggregation_step=24
GRIB File: contains data starting from step 0 to 72 every 6 hours: 0,6,12,18,24,30,36,....
@@ -816,7 +830,7 @@ GRIB File: contains data starting from step 0 to 72 every 6 hours: 0,6,12,18,24,
Day 2: same as Day 1 starting from (24+24)-24=24: the step 24 will have a weight of 3, while steps 30,36 and 42 will be counted 6 times, and finally the step 48 will have a weight of 3.
- if start_step is zero or is not specified, the aggregation will start from 1
- if start_step is zero or is not specified, the aggregation will start from 0
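The two weighting schemes described above can be sketched numerically for one aggregation cycle of 6-hourly data averaged over 24 hours (illustrative only; `average_weights` is a hypothetical helper, not part of pyg2p):

```python
def average_weights(agg_step, data_step, halfweights):
    """Per-step weights over one aggregation cycle, e.g. one day of
    6-hourly data with agg_step=24. Illustrative sketch."""
    if not halfweights:
        # each 6-hourly value stands in for data_step hours of the day
        return [data_step] * (agg_step // data_step)
    # halfweights: the cycle also includes the boundary step, and the
    # first and last steps count with half weight
    n = agg_step // data_step + 1
    w = [data_step] * n
    w[0] = w[-1] = data_step // 2
    return w
```

For agg_step=24 and data_step=6 this yields [6, 6, 6, 6] without halfweights and [3, 6, 6, 6, 3] with them, matching the Day 1/Day 2 walk-through above (weight 3 on steps 24 and 48, weight 6 on steps 30, 36 and 42); both sets of weights sum to 24.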
### Accumulation
For precipitation values, accumulation over 6 or 24 hours is often performed. Here's an example of configuration and execution output in DEBUG mode.
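Mirroring the average example above, a minimal accumulation block would look like this (a sketch assembled from the Aggregation keys documented in this README):

```json
{
  "Aggregation": {
    "@step": 24,
    "@type": "accumulation"
  }
}
```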
11 changes: 11 additions & 0 deletions configuration/global/intertables.json
@@ -21,6 +21,17 @@
680
]
},
"I-15.75_16.125_511_415_212065_rotated_ll_-1700000_2700000_5000_-5000_34.97_71.07_-1700000_2700000_5000_-5000_-24.35_51.35scipy_adw": {
"filename": "tbl_pf10tp_550800_scipy_adw.npy",
"method": "adw",
"source_shape": [
212065
],
"target_shape": [
810,
680
]
},
"I-15.75_16.125_511_415_212065_rotated_ll_-1700000_2700000_5000_-5000_34.97_71.07_-1700000_2700000_5000_-5000_-24.35_51.35scipy_nearest": {
"filename": "tbl_pf10slhf_550800_scipy_nearest.npy",
"method": "nearest",
2 changes: 1 addition & 1 deletion requirements.txt
@@ -4,7 +4,7 @@ numpy>=1.18.2
scipy>=1.4.1
# GDAL
numexpr>=2.7.1
# eccodes-python # replacing old GRIB API
importlib-metadata<5.0.0

# dask packages for parallel interpolation
dask[bag]
2 changes: 1 addition & 1 deletion src/pyg2p/VERSION
@@ -1 +1 @@
3.2.4
3.2.5
1 change: 1 addition & 0 deletions src/pyg2p/main/api.py
@@ -168,6 +168,7 @@ def _define_exec_params(self):
self._vars['outMaps.clone'] = self.api_conf['OutMaps']['cloneMap']
interpolation_conf = self.api_conf['OutMaps']['Interpolation']
self._vars['interpolation.mode'] = interpolation_conf.get('mode', self.default_values['interpolation.mode'])
self._vars['interpolation.use_broadcasting'] = interpolation_conf.get('use_broadcasting', False)
self._vars['interpolation.rotated_target'] = interpolation_conf.get('rotated_target', False)
if not self._vars['interpolation.dir'] and self.api_conf.get('intertableDir'):
self._vars['interpolation.dirs']['user'] = self.api_conf['intertableDir']
4 changes: 3 additions & 1 deletion src/pyg2p/main/context.py
@@ -11,7 +11,7 @@


class Context:
allowed_interp_methods = ('grib_nearest', 'grib_invdist', 'nearest', 'invdist', 'bilinear', 'triangulation', 'bilinear_delaunay')
allowed_interp_methods = ('grib_nearest', 'grib_invdist', 'nearest', 'invdist', 'adw', 'cdd', 'bilinear', 'triangulation', 'bilinear_delaunay')
default_values = {'interpolation.mode': 'grib_nearest', 'outMaps.unitTime': '24'}

def __getitem__(self, param):
@@ -360,6 +360,7 @@ def _define_exec_params(self):
self._vars['outMaps.clone'] = exec_conf['OutMaps']['@cloneMap']
interpolation_conf = exec_conf['OutMaps']['Interpolation']
self._vars['interpolation.mode'] = interpolation_conf.get('@mode', self.default_values['interpolation.mode'])
self._vars['interpolation.adw_broadcasting'] = interpolation_conf.get('@adw_broadcasting', False)
self._vars['interpolation.rotated_target'] = interpolation_conf.get('@rotated_target', False)
if not self._vars['interpolation.dir'] and interpolation_conf.get('@intertableDir'):
# get from JSON
@@ -412,6 +413,7 @@ def _define_exec_params(self):
if exec_conf.get('Aggregation'):
self._vars['aggregation.step'] = exec_conf['Aggregation'].get('@step')
self._vars['aggregation.type'] = exec_conf['Aggregation'].get('@type')
self._vars['aggregation.halfweights'] = bool(exec_conf['Aggregation'].get('@halfweights'))

self._vars['execution.doAggregation'] = bool(self._vars.get('aggregation.step')) \
and bool(self._vars.get('aggregation.type'))
39 changes: 31 additions & 8 deletions src/pyg2p/main/interpolation/__init__.py
@@ -9,7 +9,7 @@

from . import grib_interpolation_lib
from .latlong import LatLong
from .scipy_interpolation_lib import ScipyInterpolation, DEBUG_BILINEAR_INTERPOLATION, \
from .scipy_interpolation_lib import ScipyInterpolation, DEBUG_BILINEAR_INTERPOLATION, DEBUG_ADW_INTERPOLATION, \
DEBUG_MIN_LAT, DEBUG_MIN_LON, DEBUG_MAX_LAT, DEBUG_MAX_LON

from ...exceptions import ApplicationException, NO_INTERTABLE_CREATED
@@ -20,9 +20,9 @@
class Interpolator(Loggable):
_LOADED_INTERTABLES = {}
_prefix = 'I'
scipy_modes_nnear = {'nearest': 1, 'invdist': 4, 'bilinear': 4, 'triangulation': 3, 'bilinear_delaunay': 4}
scipy_modes_nnear = {'nearest': 1, 'invdist': 4, 'adw': 4, 'cdd': 4, 'bilinear': 4, 'triangulation': 3, 'bilinear_delaunay': 4}
suffixes = {'grib_nearest': 'grib_nearest', 'grib_invdist': 'grib_invdist',
'nearest': 'scipy_nearest', 'invdist': 'scipy_invdist',
'nearest': 'scipy_nearest', 'invdist': 'scipy_invdist', 'adw': 'scipy_adw', 'cdd': 'scipy_cdd',
'bilinear': 'scipy_bilinear', 'triangulation': 'scipy_triangulation', 'bilinear_delaunay': 'scipy_bilinear_delaunay'}
_format_intertable = 'tbl{prognum}_{source_file}_{target_size}_{suffix}.npy.gz'.format

@@ -31,6 +31,7 @@ def __init__(self, exec_ctx, mv_input):
self._mv_grib = mv_input
self.interpolate_with_grib = exec_ctx.is_with_grib_interpolation
self._mode = exec_ctx.get('interpolation.mode')
self._adw_broadcasting = exec_ctx.get('interpolation.adw_broadcasting')
self._source_filename = pyg2p.util.files.filename(exec_ctx.get('input.file'))
self._suffix = self.suffixes[self._mode]
self._intertable_dirs = exec_ctx.get('interpolation.dirs')
@@ -127,7 +128,7 @@ def _read_intertable(self, tbl_fullpath):
coeffs = intertable['coeffs']
return indexes[0], indexes[1], indexes[2], indexes[3], indexes[4], indexes[5], coeffs[0], coeffs[1], coeffs[2], coeffs[3]
else:
# self._mode in ('invdist', 'nearest', 'bilinear', 'triangulation', 'bilinear_delaunay'):
# self._mode in ('invdist', 'adw', 'cdd', 'nearest', 'bilinear', 'triangulation', 'bilinear_delaunay'):
# return indexes and weighted distances (only used with nnear > 1)
indexes = intertable['indexes']
coeffs = intertable['coeffs']
@@ -177,11 +178,33 @@ def interpolate_scipy(self, latgrib, longrib, v, grid_id, grid_details=None):
else:
intertable_id, intertable_name = self._intertable_filename(grid_id)

if DEBUG_ADW_INTERPOLATION:
# to debug create a limited controlled set of input values
#
# np.random.seed(0)
# latgrib = np.random.uniform(low=3, high=11, size=10)
# longrib = np.random.uniform(low=46, high=50, size=10)
# v = np.random.uniform(low=100, high=200, size=10)
# latgrib = np.array([ 7.39050803, 8.72151493, 7.82210701, 7.35906546, 6.38923839,
# 8.1671529, 6.50069769, 10.13418401, 10.70930208, 6.06753215])
# longrib = np.array([49.16690015, 48.11557968, 48.27217824, 49.70238655, 46.28414423,
# 46.3485172, 46.08087359, 49.33047938, 49.112627, 49.48004859])
# v = np.array([197.86183422, 179.91585642, 146.14793623, 178.05291763, 111.82744259,
# 163.99210213, 114.33532874, 194.4668917, 152.18483218, 141.466194 ])
# latgrib = np.array([ 7.39050803, 8.72151493, 7.82210701, 7.35906546])
# longrib = np.array([49.16690015, 48.11557968, 48.27217824, 49.70238655])
# v = np.array([100, 133, 166, 200 ])
latgrib = np.array([ 8, 8, 8, 8])
longrib = np.array([45, 48.5, 49, 50])
v = np.array([200, 100, 100, 100 ])
intertable_id, intertable_name = 'DEBUG_ADW','DEBUG_ADW.npy'

nnear = self.scipy_modes_nnear[self._mode]

if pyg2p.util.files.exists(intertable_name) or pyg2p.util.files.exists(intertable_name + '.gz'):
indexes, weights = self._read_intertable(intertable_name)
result = self._interpolate_scipy_append_mv(v, weights, indexes, nnear)
if (not DEBUG_ADW_INTERPOLATION) and \
(pyg2p.util.files.exists(intertable_name) or pyg2p.util.files.exists(intertable_name + '.gz')):
indexes, weights = self._read_intertable(intertable_name)
result = self._interpolate_scipy_append_mv(v, weights, indexes, nnear)

elif self.create_if_missing:
self.intertables_config.check_write()
Expand All @@ -192,7 +215,7 @@ def interpolate_scipy(self, latgrib, longrib, v, grid_id, grid_details=None):
self._log('\nInterpolating table not found\n Id: {}\nWill create file: {}'.format(intertable_id, intertable_name), 'WARN')
scipy_interpolation = ScipyInterpolation(longrib, latgrib, grid_details, v.ravel(), nnear, self.mv_out,
self._mv_grib, target_is_rotated=self._rotated_target_grid,
parallel=self.parallel, mode=self._mode)
parallel=self.parallel, mode=self._mode, use_broadcasting=self._adw_broadcasting)
_, weights, indexes = scipy_interpolation.interpolate(lonefas, latefas)
result = self._interpolate_scipy_append_mv(v, weights, indexes, nnear)
