feat: sources sinks converter (UNST-8528) (#721)
* test initializing the `ExtOldModel`

* remove unused imported modules

* remove unused imported modules

* reformat the test_ext_old_to_new.py and test the printed messages

* reformat the test_ext_old_to_new.py and test the printed messages

* add `InitialCondInterpolationMethod` class and test

* add `InitialCondFileType` class and test

* fix initial condition tests

* create `test_meteo_forcing_file_type` to test `MeteoForcingFileType`

* change tests to check for the parameter values not the parameter name

* test `MeteoInterpolationMethod` in the `test_meteo_interpolation_methods`

* reformat and test the Meteo class

* clean tests that were moved to the TestMeteo test class

* move `test_missing_required_fields` to the separate `TestMeteo` class

* fix the floating point value comparison

* rename the old `TestMeteo` to `TestExtModel`

* reformat the `TestExtModel`

* add the cases for time_series/boundary condition files as forcing files to the `TestMeteo`

* correct field names in the `InitialCondition` class

* test the `InitialConditions` class for all possible behaviours

* fix issue in assigning mutable object (list) as a default

* test `construct_filemodel_new_or_existing` function

* reformat the test_ext_old_to_new.py file to use fixtures instead of the tests.utils module

* import the used function directly at the top of the file; do not use "module.function"

* name test files by their respective modules not inner functions or classes

* reformat test file

* move check values to the conftest file

* test the `_read_ext_old_data` function

* test the `InitialConditionConverter` class

* reformat test

* move the initial condition fields tests from the test_ext.py to the test_inifield

* remove duplicated initial condition fields classes

* autoformat: isort & black

* move the InitialCondition tests to the tests_inifield.py

* fix use mutable as default value

* create and test `InitialConditionConverter`

* autoformat: isort & black

* use snake-case parameter names

* integrate the Initial_Condition converter to the `ext_old_to_new`

* autoformat: isort & black

* delete test results

* fix test error

* clean

* autoformat: isort & black

* remove ignored tests

* use default_factory=list instead of [] to prevent using mutable objects as a default value (see the sketch before the file diffs below)

* use fixtures instead of variables declared in the `tests.utils` module

* convert relative imports to absolute imports

* exclude python=3.12.5 because of a black warning about a memory safety issue

* create separate group for the docs dependencies

* create `ExternalForcingConverter` class to include all converter functionality

* create class method for the `ExternalForcingConverter` from the old external forcing `ExtOldModel`

* autoformat: isort & black

* convert the `ext_old_to_new` function into a method called `update` in the `ExternalForcingConverter` class

* first step to move the `construct_filemodel_new_or_existing` call out of the `update` method

* create setter and getter properties for each model, make an abstract version of the `update` method, and separate the `save` functionality

* reformat the converter tests

* separate tests for the update, save, and add default paths to the models in the constructor method

* fix the error of using mutables as a default value.

* create separate test class for the update method in the converter

* test files for the update meteo test

* rename the meteo_converter to converters and merge the boundary converter into it

* move the initial_condition_converter to the converters module

* move the initial_condition_converter to the converters module

* autoformat: isort & black

* add the `BoundaryConditionConverter` to the converter_factory.py

* create a `converters.BoundaryConditionConverter` class

* fix using mutables as default value

* fix using mutables as default value

* use the getattr to fetch attributes from the object

* create subfolders in the test directory

* get rid of relative paths

* add init file to the polyline test dir

* add tests for different polyline cases (basic case, and with label)

* add tests for different polyline cases (basic case, and with label)

* split the test_ext.py to a separate testing modules

* fix error in Boundary object instantiation in the `BoundaryConditionConverter`

* test the `ext.models.boundary` with an existing polyline

* test the `ExternalForcingConverter.update` with only boundary condition data

* replace float point comparison with np.isclose

* replace float point comparison with np.isclose

* remove unused imported function

* add a pull request template

* correct test folder name from extold to ext

* optimize the `ext.models.Boundary.forcing` property to have one return statement at the end of the property.

* test the `ExtOldParametersQuantity` class

* correct assert statement in the `test_ext_old_initial_condition_quantity` test

* move the TestModels.TestLateral from the test_ext.py to a separate test file test_laterals.py

* correct the test file description

* reformat the test_boundary.py file

* clean the test file from unused imported functions

* remove unused class `ext.models.InitialConditions`

* add reference to the initial condition documentation in the `InitialField` class

* add `ParametersConverter` class and test

* merge the `BaseConverter` class to the `converters` module

* test `MeteoConverter`

* merge `converter_factory` module to the `converters` module

* merge the `enum_converters` module to `utils`

* merge the `__contains__` to the `ConverterFactory` class

* abstract duplicate lines in the parameter and initial condition converters

* add try, except clause to avoid PermissionError in Windows

* update the `MeteoForcingFileType` class with updated types

* silence dimr and serializer tests

* autoformat: isort & black

* fix error in using mutables as default value

* remove the `initialsalinitytopuse` quantity from the `ExtOldInitialConditionQuantity` class.

* add missing initial condition quantities and add `__missing__` method to accept tracer quantities

* adjust the `tracer` quantity to `initialtracer`

* add a list of old initial condition quantities that come with a suffix and test them

* update docstring

* remove duplicate `InitialConditions` class (duplicate of `InitialField`)

* fix floating point comparison

* fix floating point comparison

* remove unused parameter `postfix` and fix sonar warnings

* remove the `locationtype` and convert the value of the extrapolation to yes/no from 1/0

* the `ExternalForcingConverter` class can be instantiated with an `ExtOldModel` or a path to an external forcing file, and `_read_old_file` is now used inside the constructor to read the external forcing file if a path is given to the converter

* rename boundary polylines files to `boundary-polyline-` prefix

* update converter command line

* add `ExtOldSourcesSinks` class for the source and sinks quantities

* test `ExtOldSourcesSinks` class for the source and sinks quantities

* return the correct return type hint

* fix using mutables as default value, and use absolute imports

* abstract the conversion of the interpolation related info in a separate function

* add docstring and test for the `convert_interpolation_data` method

* add reference to documentation for the tim model

* remove unused fields in the `TimParser`

* move the `convert_interpolation_data` from the converters.py to the utils.py

* move the `convert_initial_cond_param_dict` from the converters.py to the utils.py

* add docs for the `UnknownKeywordErrorManager`

* refactor the `UnknownKeywordErrorManager`

* use absolute imports

* add a class method `_exclude_from_validation` to enable bypassing the `raise_error_for_unknown_keywords`

* create a `SourceSink` class and tests

* refactor the `TestSourceSink`

* change the type of discharge in the `SourceSink` class to float or list[float]

* add `SourceSinkConverter` and tests

* test discharge constant value in the `SourceSink`

* add utility function to locate the temperature and salinity in the quantities list

* add missing test files for the `test_converters.py::TestSourceSinkConverter.test_default`

* the `find_temperature_salinity_in_quantities` now returns a dictionary with `temperaturedelta` and `salinitydelta`, not temperature and salinity

* add default test case for the SourceSinkConverter

* reformat the polyfile tests

* add test case for polyfile with 2*3 dimensions

* add test case for the polyfile with 2*5 dimensions

* add `SourceSinkConverter.get_z_sources_sinks` method and tests

* add `initialsedfrac` quantity name to the `SourcesSink` model

* test `SourceSinkConverter` with a 5 column polyline file

* test `SourceSinkConverter.parse_tim_model` with quantities_list and tim file mismatch

* test `SourceSinkConverter.parse_tim_model` with default case

* move the `SourceSinkConverter` tests to a separate test file

* add missing test files

* add all cases of the temperature/salinity in the external forcing file wrt the tim file

* add the no-temperature/no-salinity case to the SourceSink converter

* make the `discharge` attribute not optional

* replace equality with `is` to check for None

* move test_extold.py to a sub dir

* move TestExtForcing to test_ext_forcing.py

* reformat tests

* add `quantities` property to the `ExtOldModel`

* add `x` and `y` properties to the `PolyFile` class

* move the `get_z_sources_sinks` method from the `SourceSinkConverter` to the `PolyFile` class

* remove the `discharge_salinity_temperature_sorsin` from the old meteo quantities

* add `root_dir` as a property to the `SourceSinkConverter`

* add `source_sink` to the attributes of the `ExtModel` as a list of `SourceSink` models

* add `SourceSinkConverter` and default test case

* add `SourceSinkConverter` and default test case

* reformat the `test_main_converter.py::TestUpdate` and create separate `TestUpdateSourcesSinks`

* rename test

* add an example

* add documentation

* use absolute imports

* change the backup function to keep the whole name with the extension and add the .bak

* clean

* convert the `ext_old_to_new_from_mdu` function to a method `from_mdu` in the `ExternalForcingConverter` class

* add a `verbose` property to the `ExternalForcingConverter` class

* move logging to separate method

* refactor the `update` method

* add lines to parse the temperature and salinity in the mdu file

* update packages

* compare the temperature and salinity across the mdu, ext, and tim files

* more descriptive variable name

* add missing function parameter docstring

* correct docstrings and type hints

* add example and test for the `PolyFile.get_z_sources_sinks` method in case of a polyline without z values

* use `ForcingData` class as a type for the discharge, temperature, and salinity in the `SourceSink` class

* remove `interpolationmethod` and `operand` from the attributes of the `SourceSink` class

* the order is `salinity` then `temperature` in the tim file

* rename the `kwargs` to a descriptive name `mdu_quantities`

* test the combination of mdu and ext file

* abstract the lines for merging the ext_quantities and the mdu_quantities

* update docstring

* correct type hint

* Revert "use `ForcingData` class as a type for the discharge, temperature, and salinity in the `SourceSink` class"

This reverts commit 89b55a0

Signed-off-by: Mostafa Farrag <moah.farag@gmail.com>

---------

Signed-off-by: Mostafa Farrag <moah.farag@gmail.com>
Co-authored-by: MAfarrag <MAfarrag@users.noreply.github.com>
MAfarrag and MAfarrag authored Jan 15, 2025
1 parent d9387e0 commit 24ee203
Showing 32 changed files with 7,956 additions and 1,414 deletions.
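
Several of the commits above replace mutable default values with `default_factory=list`. A minimal, hypothetical sketch of that pattern (class and field names are made up, not taken from the diff):

```python
# Illustrative sketch of the default_factory pattern referenced in the commit
# message; ExampleSection and its field are hypothetical.
from typing import List

from pydantic.v1 import BaseModel, Field


class ExampleSection(BaseModel):
    # default_factory creates a fresh list per instance, avoiding the
    # shared-mutable-default pitfall flagged by linters such as Sonar.
    values: List[float] = Field(default_factory=list)


a = ExampleSection()
b = ExampleSection()
a.values.append(1.0)
print(b.values)  # [] - each instance has its own list
```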
72 changes: 70 additions & 2 deletions hydrolib/core/dflowfm/ext/models.py
@@ -1,5 +1,5 @@
from pathlib import Path
from typing import Dict, List, Literal, Optional, Union
from typing import Dict, List, Literal, Optional, Set, Union

from pydantic.v1 import Field, root_validator, validator
from strenum import StrEnum
@@ -24,6 +24,13 @@
from hydrolib.core.dflowfm.tim.models import TimModel
from hydrolib.core.utils import str_is_empty_or_none

SOURCE_SINKS_QUANTITIES_VALID_PREFIXES = (
"initialtracer",
"tracerbnd",
"sedfracbnd",
"initialsedfrac",
)


class Boundary(INIBasedModel):
"""
@@ -181,6 +188,65 @@ def validate_location_type(cls, v: str) -> str:
return v


class SourceSink(INIBasedModel):
"""
A `[SourceSink]` block for use inside an external forcings file,
i.e., an [ExtModel][hydrolib.core.dflowfm.ext.models.ExtModel].
All lowercased attributes match with the source-sink input as described in
[UM Sec.C.5.2.4](https://content.oss.deltares.nl/delft3dfm1d2d/D-Flow_FM_User_Manual_1D2D.pdf#subsection.C.5.2.4).
"""

_header: Literal["SourceSink"] = "SourceSink"
id: str = Field(alias="id")
name: str = Field("", alias="name")
locationfile: DiskOnlyFileModel = Field(
default_factory=lambda: DiskOnlyFileModel(None), alias="locationFile"
)

numcoordinates: Optional[int] = Field(alias="numCoordinates")
xcoordinates: Optional[List[float]] = Field(alias="xCoordinates")
ycoordinates: Optional[List[float]] = Field(alias="yCoordinates")

zsource: Optional[Union[float, List[float]]] = Field(alias="zSource")
zsink: Optional[Union[float, List[float]]] = Field(alias="zSink")
discharge: Union[float, List[float]] = Field(alias="discharge")
area: Optional[float] = Field(alias="Area")

salinitydelta: Optional[Union[List[float], float]] = Field(alias="SalinityDelta")
temperaturedelta: Optional[Union[List[float], float]] = Field(
alias="TemperatureDelta"
)

@classmethod
def _exclude_from_validation(cls, input_data: Optional[dict] = None) -> Set:
fields = cls.__fields__
unknown_keywords = [
key
for key in input_data.keys()
if key not in fields
and key.startswith(SOURCE_SINKS_QUANTITIES_VALID_PREFIXES)
]
return set(unknown_keywords)

class Config:
"""
Config class to tell Pydantic to accept fields not explicitly declared in the model.
"""

# Allow dynamic fields
extra = "allow"

def __init__(self, **data):
super().__init__(**data)
# Add dynamic attributes for fields starting with 'tracer'
for key, value in data.items():
if isinstance(key, str) and key.startswith(
SOURCE_SINKS_QUANTITIES_VALID_PREFIXES
):
setattr(self, key, value)


class MeteoForcingFileType(StrEnum):
"""
Enum class containing the valid values for the forcingFileType
@@ -318,7 +384,7 @@ def is_intermediate_link(self) -> bool:


class ExtGeneral(INIGeneral):
"""The external forcing file's `[General]` section with file meta data."""
"""The external forcing file's `[General]` section with file meta-data."""

_header: Literal["General"] = "General"
fileversion: str = Field("2.01", alias="fileVersion")
@@ -335,12 +401,14 @@ class ExtModel(INIModel):
general (ExtGeneral): `[General]` block with file metadata.
boundary (List[Boundary]): List of `[Boundary]` blocks for all boundary conditions.
lateral (List[Lateral]): List of `[Lateral]` blocks for all lateral discharges.
source_sink (List[SourceSink]): List of `[SourceSink]` blocks for all source/sink terms.
meteo (List[Meteo]): List of `[Meteo]` blocks for all meteorological forcings.
"""

general: ExtGeneral = ExtGeneral()
boundary: List[Boundary] = Field(default_factory=list)
lateral: List[Lateral] = Field(default_factory=list)
source_sink: List[SourceSink] = Field(default_factory=list)
meteo: List[Meteo] = Field(default_factory=list)
serializer_config: INISerializerConfig = INISerializerConfig(
section_indent=0, property_indent=0
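
For context, a hypothetical usage sketch of the new `SourceSink` block shown above (all values are illustrative; it assumes the model can be populated by field name, as elsewhere in hydrolib-core):

```python
# Hypothetical example of the new SourceSink block; all values are made up.
from hydrolib.core.dflowfm.ext.models import ExtModel, SourceSink

source = SourceSink(
    id="L1",
    numcoordinates=2,
    xcoordinates=[63.35, 45.20],
    ycoordinates=[12.95, 6.35],
    zsource=-3.0,
    zsink=-4.2,
    discharge=[1.0, 2.0],
    salinitydelta=[10.0, 11.0],
    temperaturedelta=[15.0, 16.0],
    # Accepted dynamically: the key starts with one of the
    # SOURCE_SINKS_QUANTITIES_VALID_PREFIXES ("tracerbnd").
    tracerbnddye=[0.1, 0.2],
)

ext_model = ExtModel(source_sink=[source])
print(ext_model.source_sink[0].tracerbnddye)
```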
13 changes: 11 additions & 2 deletions hydrolib/core/dflowfm/extold/models.py
@@ -210,8 +210,6 @@ class ExtOldMeteoQuantity(StrEnum):
"""Long wave radiation"""
SolarRadiation = "solarradiation"
"""Solar radiation"""
DischargeSalinityTemperatureSorSin = "discharge_salinity_temperature_sorsin"
"""Discharge, salinity temperature source-sinks"""
NudgeSalinityTemperature = "nudge_salinity_temperature"
"""Nudging salinity and temperature"""
AirPressure = "airpressure"
@@ -302,6 +300,12 @@ def _missing_(cls, value):
)


class ExtOldSourcesSinks(StrEnum):
"""Source and sink quantities"""

DischargeSalinityTemperatureSorSin = "discharge_salinity_temperature_sorsin"


class ExtOldQuantity(StrEnum):
"""Enum class containing the valid values for the boundary conditions category
of the external forcings.
@@ -789,3 +793,8 @@ def _get_serializer(
@classmethod
def _get_parser(cls) -> Callable[[Path], Dict]:
return Parser.parse

@property
def quantities(self) -> List[str]:
"""List all the quantities in the external forcings file."""
return [forcing.quantity for forcing in self.forcing]
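
A short, hypothetical sketch of the new `quantities` property and the relocated source/sink quantity (the file path is illustrative):

```python
# Hypothetical sketch: list the quantities of a legacy external forcings file.
from hydrolib.core.dflowfm.extold.models import ExtOldModel, ExtOldSourcesSinks

old_model = ExtOldModel("old_forcings.ext")  # illustrative path
print(old_model.quantities)  # one entry per forcing block

# The source/sink quantity now lives in its own enum:
print(ExtOldSourcesSinks.DischargeSalinityTemperatureSorSin.value)
# discharge_salinity_temperature_sorsin
```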
25 changes: 19 additions & 6 deletions hydrolib/core/dflowfm/ini/models.py
@@ -28,15 +28,22 @@
ModelSaveSettings,
ParsableFileModel,
)

from ..ini.io_models import CommentBlock, Document, Property, Section
from .parser import Parser
from .serializer import (
from hydrolib.core.dflowfm.ini.io_models import (
CommentBlock,
Document,
Property,
Section,
)
from hydrolib.core.dflowfm.ini.parser import Parser
from hydrolib.core.dflowfm.ini.serializer import (
DataBlockINIBasedSerializerConfig,
INISerializerConfig,
write_ini,
)
from .util import UnknownKeywordErrorManager, make_list_validator
from hydrolib.core.dflowfm.ini.util import (
UnknownKeywordErrorManager,
make_list_validator,
)

logger = logging.getLogger(__name__)

@@ -121,12 +128,13 @@ class Config:
@root_validator(pre=True)
def _validate_unknown_keywords(cls, values):
unknown_keyword_error_manager = cls._get_unknown_keyword_error_manager()
do_not_validate = cls._exclude_from_validation(values)
if unknown_keyword_error_manager:
unknown_keyword_error_manager.raise_error_for_unknown_keywords(
values,
cls._header,
cls.__fields__,
cls._exclude_fields(),
cls._exclude_fields() | do_not_validate,
)
return values

@@ -181,6 +189,11 @@ def validate(cls: Type["INIBasedModel"], value: Any) -> "INIBasedModel":

return super().validate(value)

@classmethod
def _exclude_from_validation(cls, input_data: Optional[dict] = None) -> Set:
"""Fields that should not be checked when validating existing fields as they will be dynamically added."""
return set()

@classmethod
def _exclude_fields(cls) -> Set:
return {"comments", "datablock", "_header"}
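
To illustrate how the new hook is meant to be used, a hypothetical subclass (names are made up) could whitelist dynamically named keys so they bypass `raise_error_for_unknown_keywords`, mirroring what `SourceSink` does for tracer quantities:

```python
# Hypothetical subclass sketch: keys starting with "custom" bypass the
# unknown-keyword check and are kept because extra = "allow".
from typing import Literal, Optional, Set

from hydrolib.core.dflowfm.ini.models import INIBasedModel


class MyDynamicSection(INIBasedModel):
    _header: Literal["MyDynamicSection"] = "MyDynamicSection"

    class Config:
        extra = "allow"

    @classmethod
    def _exclude_from_validation(cls, input_data: Optional[dict] = None) -> Set:
        data = input_data or {}
        return {key for key in data if key.startswith("custom")}
```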
58 changes: 42 additions & 16 deletions hydrolib/core/dflowfm/ini/util.py
@@ -189,7 +189,7 @@ def validate_forbidden_fields(
return values

for field in field_names:
if values.get(field) != None:
if values.get(field) is not None:
raise ValueError(
f"{field} is forbidden when {conditional_field_name} {operator_str(comparison_func)} {conditional_value}"
)
@@ -228,7 +228,7 @@ def validate_required_fields(
return values

for field in field_names:
if values.get(field) == None:
if values.get(field) is None:
raise ValueError(
f"{field} should be provided when {conditional_field_name} {operator_str(comparison_func)} {conditional_value}"
)
@@ -654,28 +654,54 @@ def raise_error_for_unknown_keywords(
"""
unknown_keywords = self._get_all_unknown_keywords(data, fields, excluded_fields)

if len(unknown_keywords) == 0:
return

raise ValueError(
f"Unknown keywords are detected in section: '{section_header}', '{unknown_keywords}'"
)
if len(unknown_keywords) > 0:
raise ValueError(
f"Unknown keywords are detected in section: '{section_header}', '{unknown_keywords}'"
)

def _get_all_unknown_keywords(
self, data: Dict[str, Any], fields: Dict[str, ModelField], excluded_fields: Set
) -> List[str]:
"""
Get all unknown keywords in the data.
Args:
data: Dict[str, Any]: Input data containing all properties which are checked on unknown keywords.
fields: Dict[str, ModelField]: Known fields of the Model.
excluded_fields: Set[str]: Fields which should be excluded from the check for unknown keywords.
Returns:
List[str]: List of unknown keywords.
"""
list_of_unknown_keywords = []
for name in data:
if self._is_unknown_keyword(name, fields, excluded_fields):
list_of_unknown_keywords.append(name)
for keyword in data:
if self._is_unknown_keyword(keyword, fields, excluded_fields):
list_of_unknown_keywords.append(keyword)

return list_of_unknown_keywords

@staticmethod
def _is_unknown_keyword(
self, name: str, fields: Dict[str, ModelField], excluded_fields: Set
keyword: str, fields: Dict[str, ModelField], excluded_fields: Set
):
for model_field in fields.values():
if name == model_field.name or name == model_field.alias:
return False
"""
Check whether the given field name matches any of the model field names or aliases; if not, check whether
the field is listed in the excluded_fields parameter.
Args:
keyword: str: Name of the field.
fields: Dict[str, ModelField]: Known fields of the Model.
excluded_fields: Set[str]: Fields which should be excluded from the check for unknown keywords.
Returns:
bool: True if the field is unknown (not a field name or alias, and not in the exclude list),
False otherwise.
"""
exists = any(
keyword == model_field.name or keyword == model_field.alias
for model_field in fields.values()
)
# the field is not in the known fields, check if it should be excluded
unknown = not exists and keyword not in excluded_fields

return name not in excluded_fields
return unknown
85 changes: 82 additions & 3 deletions hydrolib/core/dflowfm/polyfile/models.py
@@ -1,7 +1,7 @@
"""models.py defines all classes and functions related to representing pol/pli(z) files.
"""

from typing import Callable, List, Optional, Sequence
from typing import Callable, List, Optional, Sequence, Tuple

from pydantic.v1 import Field

@@ -80,7 +80,30 @@ class PolyObject(BaseModel):


class PolyFile(ParsableFileModel):
"""Poly-file (.pol/.pli/.pliz) representation."""
"""
Poly-file (.pol/.pli/.pliz) representation.
Notes:
- The `has_z_values` attribute is used to determine if the PolyFile contains z-values.
- The `has_z_values` is false by default and should be set to true if the PolyFile path ends with `.pliz`.
- The `*.pliz` file should have a 2*3 structure, where the third column contains the z-values; otherwise
the parser will give an error.
- If there is a label in the file, the parser will ignore the label and read the file as a normal polyline file.
```
tfl_01
2 2
0.00 1.00 #zee
0.00 2.00 #zee
```
- If the file is a `.pliz` file and the dimensions are 2*5, the first three columns will be considered as x, y, z values
and the last two columns will be considered as data values.
```
L1
2 5
63.35 12.95 -4.20 -5.35 0
45.20 6.35 -3.00 -2.90 0
```
"""

has_z_values: bool = False
objects: Sequence[PolyObject] = Field(default_factory=list)
@@ -106,7 +129,63 @@ def _get_serializer(cls) -> Callable:

@classmethod
def _get_parser(cls) -> Callable:
# TODO Prevent circular dependency in Parser
# Prevent circular dependency in Parser
from .parser import read_polyfile

return read_polyfile

@property
def x(self) -> List[float]:
"""X-coordinates of all points in the PolyFile."""
return [point.x for obj in self.objects for point in obj.points]

@property
def y(self) -> List[float]:
"""Y-coordinates of all points in the PolyFile."""
return [point.y for obj in self.objects for point in obj.points]

def get_z_sources_sinks(self) -> Tuple[List[float], List[float]]:
"""
Get the z values of the source and sink points from the polyline file.
Returns:
z_source, z_sink: Tuple[List[float], List[float]]:
If the polyline has data (more than 3 columns), then both z_source and z_sink will be a list of two values.
Otherwise, z_source and z_sink will each be a single-element list.
Note:
- Calling this method on a polyline file that does not have z-values will return lists containing None.
Examples:
in case the polyline has 3 columns:
>>> polyline = PolyFile("tests/data/input/source-sink/leftsor.pliz")
>>> z_source, z_sink = polyline.get_z_sources_sinks()
>>> print(z_source, z_sink)
[-3] [-4.2]
in case the polyline has more than 3 columns:
>>> polyline = PolyFile("tests/data/input/source-sink/leftsor-5-columns.pliz") #Doctest: +SKIP
>>> z_source, z_sink = polyline.get_z_sources_sinks()
>>> print(z_source, z_sink)
[-3, -2.9] [-4.2, -5.35]
in case the polyline does not have z-values:
>>> root_dir = "tests/data/input/dflowfm_individual_files/polylines"
>>> polyline = PolyFile(f"{root_dir}/boundary-polyline-no-z-no-label.pli")
>>> z_source, z_sink = polyline.get_z_sources_sinks()
>>> print(z_source, z_sink)
[None] [None]
"""
has_data = True if self.objects[0].points[0].data else False

z_source_sink = []
for elem in [0, -1]:
point = self.objects[0].points[elem]
if has_data:
z_source_sink.append([point.z, point.data[0]])
else:
z_source_sink.append([point.z])

z_sink: list[float | None] = z_source_sink[0]
z_source: list[float | None] = z_source_sink[1]
return z_source, z_sink
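
A short usage sketch combining the new helpers (the path reuses the doctest data above):

```python
# Sketch reusing the doctest data above: read a source/sink polyline and
# combine the new x/y properties with get_z_sources_sinks().
from hydrolib.core.dflowfm.polyfile.models import PolyFile

polyline = PolyFile("tests/data/input/source-sink/leftsor.pliz")
z_source, z_sink = polyline.get_z_sources_sinks()

print(polyline.x, polyline.y)  # flattened coordinates of all points
print(z_source, z_sink)        # [-3] [-4.2] per the doctest above
```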