diff --git a/docs/explanations/index.rst b/docs/explanations/index.rst index f41dd5e6b7..d243f51371 100644 --- a/docs/explanations/index.rst +++ b/docs/explanations/index.rst @@ -8,6 +8,7 @@ Explanations concepts components/index conventions + model_diagnostics/index modeling_extensions/index related_packages/index faq diff --git a/docs/explanations/model_diagnostics/diagnostics_workflow.png b/docs/explanations/model_diagnostics/diagnostics_workflow.png new file mode 100755 index 0000000000..8019d49abe Binary files /dev/null and b/docs/explanations/model_diagnostics/diagnostics_workflow.png differ diff --git a/docs/explanations/model_diagnostics/index.rst b/docs/explanations/model_diagnostics/index.rst new file mode 100644 index 0000000000..0b07cfe24f --- /dev/null +++ b/docs/explanations/model_diagnostics/index.rst @@ -0,0 +1,63 @@ +Model Diagnostics Workflow +========================== + +.. contents:: + :depth: 3 + :local: + +Introduction +------------ + +Writing well-posed equation-oriented models is a significant challenge, and even the most experienced developers often have to spend a lot of time diagnosing and resolving issues before a model is able to solve reliably. This documentation is intended to assist users with this process by outlining a general workflow for model development and debugging in order to more easily identify and resolve modeling issues. IDAES also provides a ``DiagnosticsToolbox`` to assist users with this workflow, and a detailed description of the API can be found :ref:`here`. + +General Workflow +---------------- + +The diagram below shows a high-level overview of the model development and diagnosis workflow recommended by the IDAES team. Boxes in blue represent model development tasks whilst green boxes indicate diagnosis and debugging steps. 
It is important to note that the workflow is inherently iterative; after any change to the model, you should start again from the beginning of the workflow to ensure that the change did not introduce any unexpected new issues. + +.. image:: diagnostics_workflow.png + +Choose a Model to Debug +""""""""""""""""""""""" + +As shown above, all model development begins with a model, and the IDAES team recommends starting with the simplest model you can. It is always easier to debug small changes, so users should apply this workflow from the very beginning of model development, starting with the simplest possible representation of their system (e.g., a single unit model or set of material balances). At each step of the process (i.e., each change or new constraint), you should check to ensure that your model is well-posed and that it solves robustly before making additional changes. In this way, it will be clear where to start looking for new issues as they arise, as they will be related in some way to the change you just made. + +Start with a Square Model +""""""""""""""""""""""""" + +Next, you should ensure your model has zero degrees of freedom (as best you can); whilst your ultimate goal may be to solve some optimization problem with degrees of freedom, you should always start from a square model first. Firstly, this is because many of the model diagnosis tools work best with square models. Secondly, all models are based on the foundation of a square model; an optimization problem is just a square model with some degrees of freedom added. If your underlying square model is not well-posed then any more advanced problem you solve based on it is fundamentally flawed (even if it solves). + +Check for Structural Issues +""""""""""""""""""""""""""" + +Once you have a square model, the next step is to check for any issues with the structure of the model; for example, structural singularities or unit consistency issues.
As these issues exist in the very structure of the model, they can be checked for before calling a solver, and resolving them makes it more likely that you will be able to successfully solve the model. Any issues identified at this stage should be resolved before trying to move forwards. + +Try to Solve Model +"""""""""""""""""" + +Once you have ensured there are no structural issues with your model, the next step is to try to solve your model (or initialize it if you haven't already) using the solver of your choice (you should also experiment with different solvers if needed). Hopefully you will get some solution back from the solver, even if it is not optimal and/or feasible (in cases where you get critical solver failures, see the section later in this documentation). + +Check for Numerical Issues +"""""""""""""""""""""""""" + +Once you have at least a partial solution to your model, the next step is to check for numerical issues, such as possible bounds violations and poor scaling. Due to their nature, numerical issues depend on having a solution to the model, and they can often be limited to certain model states (i.e., a model may behave well at one state only to fail badly at another). As such, numerical checks should be performed at a number of points across the full range of expected states to try to ensure that the model is well-posed across the full modeling range. Any issues identified here should be addressed before moving on, and remember that after any changes you should always start by checking for structural issues again. + +Apply Advanced Diagnostics Tools (if required) +"""""""""""""""""""""""""""""""""""""""""""""" + +If you are still having trouble solving your model after running the structural and numerical checks, then you will need to look to more advanced (and computationally expensive) tools to try to resolve your issues.
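The residual check described under "Check for Numerical Issues" can be sketched in plain Python. This is a toy illustration only: the constraint names, lambdas, and the ``large_residuals`` helper below are hypothetical, and the actual check is provided by the ``DiagnosticsToolbox`` (via ``display_constraints_with_large_residuals``) with the tolerance set in its configuration.

```python
# Toy sketch of a "large residuals" check: evaluate each equality
# constraint at the current variable values and flag any whose residual
# |lhs - rhs| exceeds a tolerance. All names here are hypothetical.

def large_residuals(constraints, values, tol=1e-5):
    """Return names of constraints whose residual exceeds tol."""
    flagged = []
    for name, (lhs, rhs) in constraints.items():
        if abs(lhs(values) - rhs(values)) > tol:
            flagged.append(name)
    return flagged

# Two toy constraints: a mass balance x + y == 10 and a spec y == 4
constraints = {
    "mass_balance": (lambda v: v["x"] + v["y"], lambda v: 10.0),
    "spec": (lambda v: v["y"], lambda v: 4.0),
}

# At this candidate solution the mass balance holds but the spec does not
print(large_residuals(constraints, {"x": 6.1, "y": 3.9}))  # → ['spec']
```

Note that a residual check like this is only meaningful once the solver has returned at least a partial solution, which is why it comes after the solve step in the workflow.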
Finally, once you are satisfied that your model is robust and well behaved, you can move on to solving more complex problems. + +Diagnostics Toolbox +------------------- + +Whilst the workflow outlined above gives a high-level overview of the model development and diagnostics process, there is a lot of detail buried in each of the steps. Rather than provide the user with a long series of steps and check-boxes to complete, the :ref:`IDAES Diagnostics Toolbox` instead provides a centralized interface for accessing diagnostics tools and guiding users through the diagnostics workflow. High-level methods are provided for each of the model diagnosis steps (green boxes) in the workflow above; each provides a summary of any issues identified and recommends additional methods and tools for examining each issue further. + +Modeling Log Book +----------------- + +Model development and diagnostics is inherently an iterative process, and often you will try one approach to resolve an issue only to find it made things worse, or that you can no longer reproduce an important result from a previous model. Due to this, users are strongly encouraged to maintain a "modeling log book" which records their model development activities. This log book should record each change made to the model and why, along with a Git hash (or equivalent version control marker) and a record of any important results that were generated with the current version of the model. This log book will prove invaluable when (not if) you need to revert to an older version of the model to undo some unsuccessful changes, or to work out why a previous result can no longer be reproduced.
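A log book entry can be as simple as a small structured record. The helper and field names below are purely illustrative (there is no IDAES API for this); any record-keeping scheme that captures the change, the rationale, a version-control marker, and key results will serve.

```python
# Hypothetical sketch of a modeling log book entry: one record per model
# change, capturing what changed, why, the commit it corresponds to, and
# any important results produced with that version of the model.
import json
from datetime import date

def log_entry(change, reason, git_hash, results=None):
    """Build a dictionary recording one model development step."""
    return {
        "date": date.today().isoformat(),
        "change": change,
        "reason": reason,
        "git_hash": git_hash,       # or an equivalent version-control marker
        "results": results or {},   # key results from this model version
    }

entry = log_entry(
    change="Added energy balance to flash unit",
    reason="Extend square model before adding purity spec",
    git_hash="<commit hash here>",  # placeholder, not a real commit
    results={"vapor_frac": 0.42},
)
print(json.dumps(entry, indent=2))
```

Storing such entries as JSON lines alongside the model source keeps the log book under the same version control as the model itself.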
+ +Handling Critical Solver Failures +--------------------------------- + +TBA diff --git a/docs/explanations/modeling_extensions/diagnostics/index.rst b/docs/explanations/modeling_extensions/diagnostics/index.rst index 4f40d9694a..b85ab07c88 100644 --- a/docs/explanations/modeling_extensions/diagnostics/index.rst +++ b/docs/explanations/modeling_extensions/diagnostics/index.rst @@ -2,8 +2,14 @@ Degeneracy Hunter ================================== +.. note:: + + v2.2: The Degeneracy Hunter tool is being deprecated in favor of the newer Diagnostics Toolbox. + + Over the next few releases the functionality of Degeneracy Hunter will be moved over to the new Diagnostics Toolbox which will also contain a number of other tools for diagnosing model issues. + Degeneracy Hunter is a collection of tools for diagnostics of mathematical programs. * `Core ideas (conference paper) `_ * `Example notebook `_ -* :ref:`Full documentation` \ No newline at end of file +* :ref:`Full documentation` diff --git a/docs/index.rst b/docs/index.rst index 26b6cbd16f..4c07f74d51 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -33,6 +33,7 @@ See also :doc:`IDAES concepts ` in this documentation. | :doc:`Concepts ` | :doc:`Components of IDAES ` | :doc:`Conventions ` + | :doc:`Model Diagnostics Workflow ` | :doc:`Modeling Extensions ` | :doc:`Related Packages ` | :doc:`FAQ ` diff --git a/docs/reference_guides/core/util/diagnostics/degeneracy_hunter.rst b/docs/reference_guides/core/util/diagnostics/degeneracy_hunter.rst new file mode 100644 index 0000000000..c5ecf06d13 --- /dev/null +++ b/docs/reference_guides/core/util/diagnostics/degeneracy_hunter.rst @@ -0,0 +1,12 @@ +Degeneracy Hunter +================= + +.. note:: + + v2.2: The Degeneracy Hunter tool is being deprecated in favor of the newer Diagnostics Toolbox. 
+ + Over the next few releases the functionality of Degeneracy Hunter will be moved over to the new Diagnostics Toolbox which will also contain a number of other tools for diagnosing model issues. + +.. autoclass:: idaes.core.util.model_diagnostics.DegeneracyHunter + :members: + diff --git a/docs/reference_guides/core/util/diagnostics/diagnostics_toolbox.rst b/docs/reference_guides/core/util/diagnostics/diagnostics_toolbox.rst new file mode 100644 index 0000000000..af22a493b3 --- /dev/null +++ b/docs/reference_guides/core/util/diagnostics/diagnostics_toolbox.rst @@ -0,0 +1,7 @@ +Diagnostics Toolbox +=================== + +The IDAES Diagnostics Toolbox is intended to provide a comprehensive collection of tools for identifying potential modeling issues, and guiding the user through the model diagnosis workflow. For more in-depth discussion of the model diagnosis workflow see the :ref:`Model Diagnostics Workflow` documentation. + +.. autoclass:: idaes.core.util.model_diagnostics.DiagnosticsToolbox + :members: diff --git a/docs/reference_guides/core/util/model_diagnostics.rst b/docs/reference_guides/core/util/model_diagnostics.rst index 379f1ee445..2ede432576 100644 --- a/docs/reference_guides/core/util/model_diagnostics.rst +++ b/docs/reference_guides/core/util/model_diagnostics.rst @@ -3,20 +3,16 @@ Model Diagnostic Functions The IDAES toolset contains a number of utility functions which can be useful for identifying modeling issues and debugging solver issues. -.. contents:: Contents - :depth: 2 +.. toctree:: + :maxdepth: 1 + + diagnostics/diagnostics_toolbox + diagnostics/degeneracy_hunter -Degeneracy Hunter -^^^^^^^^^^^^^^^^^ - -.. autoclass:: idaes.core.util.model_diagnostics.DegeneracyHunter - :members: - - -Available Methods -^^^^^^^^^^^^^^^^^ +Other Methods +^^^^^^^^^^^^^ .. 
automodule:: idaes.core.util.model_diagnostics - :exclude-members: DegeneracyHunter + :exclude-members: DegeneracyHunter, DiagnosticsToolbox :members: diff --git a/idaes/core/util/__init__.py b/idaes/core/util/__init__.py index 19a594f593..4a12b0b2a9 100644 --- a/idaes/core/util/__init__.py +++ b/idaes/core/util/__init__.py @@ -15,3 +15,4 @@ from .model_serializer import to_json, from_json, StoreSpec from .tags import svg_tag, ModelTag, ModelTagGroup +from .model_diagnostics import DiagnosticsToolbox diff --git a/idaes/core/util/model_diagnostics.py b/idaes/core/util/model_diagnostics.py index 60d4d84adf..92127f7c2e 100644 --- a/idaes/core/util/model_diagnostics.py +++ b/idaes/core/util/model_diagnostics.py @@ -17,29 +17,1035 @@ __author__ = "Alexander Dowling, Douglas Allan, Andrew Lee" - from operator import itemgetter +from sys import stdout +from math import log -import pyomo.environ as pyo -from pyomo.core.expr.visitor import identify_variables -from pyomo.contrib.pynumero.interfaces.pyomo_nlp import PyomoNLP -from pyomo.contrib.pynumero.asl import AmplInterface -from pyomo.core.base.block import _BlockData import numpy as np from scipy.linalg import svd from scipy.sparse.linalg import svds, norm from scipy.sparse import issparse, find +from pyomo.environ import ( + Binary, + Block, + check_optimal_termination, + ConcreteModel, + Constraint, + Objective, + Param, + Set, + SolverFactory, + value, + Var, +) +from pyomo.core.base.block import _BlockData +from pyomo.common.collections import ComponentSet +from pyomo.common.config import ConfigDict, ConfigValue, document_kwargs_from_configdict +from pyomo.util.check_units import identify_inconsistent_units +from pyomo.contrib.incidence_analysis import IncidenceGraphInterface +from pyomo.core.expr.visitor import identify_variables +from pyomo.contrib.pynumero.interfaces.pyomo_nlp import PyomoNLP +from pyomo.contrib.pynumero.asl import AmplInterface +from pyomo.common.deprecation import deprecation_warning + from 
idaes.core.util.model_statistics import ( + activated_blocks_set, + deactivated_blocks_set, + activated_equalities_set, + deactivated_equalities_set, + activated_inequalities_set, + deactivated_inequalities_set, + activated_objectives_set, + deactivated_objectives_set, + variables_in_activated_constraints_set, + variables_not_in_activated_constraints_set, + degrees_of_freedom, large_residuals_set, variables_near_bounds_set, ) -import idaes.core.util.scaling as iscale +from idaes.core.util.scaling import ( + get_jacobian, + extreme_jacobian_columns, + extreme_jacobian_rows, + extreme_jacobian_entries, + jacobian_cond, +) import idaes.logger as idaeslog _log = idaeslog.getLogger(__name__) +MAX_STR_LENGTH = 84 +TAB = " " * 4 + +# TODO: Add suggested steps to cautions - how? + +CONFIG = ConfigDict() +CONFIG.declare( + "variable_bounds_absolute_tolerance", + ConfigValue( + default=1e-4, + domain=float, + description="Absolute tolerance for considering a variable to be close " + "to its bounds.", + ), +) +CONFIG.declare( + "variable_bounds_relative_tolerance", + ConfigValue( + default=1e-4, + domain=float, + description="Relative tolerance for considering a variable to be close " + "to its bounds.", + ), +) +CONFIG.declare( + "variable_bounds_violation_tolerance", + ConfigValue( + default=0, + domain=float, + description="Absolute tolerance for considering a variable to violate its bounds.", + doc="Absolute tolerance for considering a variable to violate its bounds. 
" + "Some solvers relax bounds on variables thus allowing a small violation to be " + "considered acceptable.", + ), +) +CONFIG.declare( + "constraint_residual_tolerance", + ConfigValue( + default=1e-5, + domain=float, + description="Absolute tolerance to use when checking constraint residuals.", + ), +) +CONFIG.declare( + "variable_large_value_tolerance", + ConfigValue( + default=1e4, + domain=float, + description="Absolute tolerance for considering a value to be large.", + ), +) +CONFIG.declare( + "variable_small_value_tolerance", + ConfigValue( + default=1e-4, + domain=float, + description="Absolute tolerance for considering a value to be small.", + ), +) +CONFIG.declare( + "variable_zero_value_tolerance", + ConfigValue( + default=1e-8, + domain=float, + description="Absolute tolerance for considering a value to be near to zero.", + ), +) +CONFIG.declare( + "jacobian_large_value_caution", + ConfigValue( + default=1e4, + domain=float, + description="Tolerance for raising a caution for large Jacobian values.", + ), +) +CONFIG.declare( + "jacobian_large_value_warning", + ConfigValue( + default=1e8, + domain=float, + description="Tolerance for raising a warning for large Jacobian values.", + ), +) +CONFIG.declare( + "jacobian_small_value_caution", + ConfigValue( + default=1e-4, + domain=float, + description="Tolerance for raising a caution for small Jacobian values.", + ), +) +CONFIG.declare( + "jacobian_small_value_warning", + ConfigValue( + default=1e-8, + domain=float, + description="Tolerance for raising a warning for small Jacobian values.", + ), +) + + +@document_kwargs_from_configdict(CONFIG) +class DiagnosticsToolbox: + """ + The IDAES Model DiagnosticsToolbox. + + To get started: + + 1. Create an instance of your model (this does not need to be initialized yet). + 2. Fix variables until you have 0 degrees of freedom. Many of these tools presume + a square model, and a square model should always be the foundation of any more + advanced model. + 3. 
Create an instance of the DiagnosticsToolbox and provide the model to debug as + the model argument. + 4. Call the ``report_structural_issues()`` method. + + Model diagnostics is an iterative process and you will likely need to run these + tools multiple times to resolve all issues. After making a change to your model, + you should always start from the beginning again to ensure the change did not + introduce any new issues; i.e., always start from the report_structural_issues() + method. + + Note that structural checks do not require the model to be initialized, thus users + should start with these. Numerical checks require at least a partial solution to the + model and should only be run once all structural issues have been resolved. + + Report methods will print a summary containing three parts: + + 1. Warnings - these are critical issues that should be resolved before continuing. + For each warning, a method will be suggested in the Next Steps section to get + additional information. + 2. Cautions - these are things that could be correct but could also be the source of + solver issues. Not all cautions need to be addressed, but users should investigate + each one to ensure that the behavior is correct and that they will not be the source + of difficulties later. Methods exist to provide more information on all cautions, + but these will not appear in the Next Steps section. + 3. Next Steps - these are recommended methods to call from the DiagnosticsToolbox to + get further information on warnings. If no warnings are found, this will suggest + the next report method to call. + + Args: + + model: model to be diagnosed. The DiagnosticsToolbox does not support indexed Blocks. 
+ + """ + + def __init__(self, model: _BlockData, **kwargs): + # TODO: In future may want to generalise this to accept indexed blocks + # However, for now some of the tools do not support indexed blocks + if not isinstance(model, _BlockData): + raise TypeError( + "model argument must be an instance of a Pyomo BlockData object " + "(either a scalar Block or an element of an indexed Block)." + ) + + self._model = model + self.config = CONFIG(kwargs) + + @property + def model(self): + """ + Model currently being diagnosed. + """ + return self._model + + def display_external_variables(self, stream=stdout): + """ + Prints a list of variables that appear within activated Constraints in the + model but are not contained within the model themselves. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + ext_vars = [] + for v in variables_in_activated_constraints_set(self._model): + if not _var_in_block(v, self._model): + ext_vars.append(v.name) + + _write_report_section( + stream=stream, + lines_list=ext_vars, + title="The following external variable(s) appear in constraints within the model:", + header="=", + footer="=", + ) + + def display_unused_variables(self, stream=stdout): + """ + Prints a list of variables that do not appear in any activated Constraints. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=variables_not_in_activated_constraints_set(self._model), + title="The following variable(s) do not appear in any activated constraints within the model:", + header="=", + footer="=", + ) + + def display_variables_fixed_to_zero(self, stream=stdout): + """ + Prints a list of variables that are fixed to an absolute value of 0. 
+ + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=_vars_fixed_to_zero(self._model), + title="The following variable(s) are fixed to zero:", + header="=", + footer="=", + ) + + def display_variables_at_or_outside_bounds(self, stream=stdout): + """ + Prints a list of variables with values that fall at or outside the bounds + on the variable. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=[ + f"{v.name} ({'fixed' if v.fixed else 'free'}): value={value(v)} bounds={v.bounds}" + for v in _vars_violating_bounds( + self._model, + tolerance=self.config.variable_bounds_violation_tolerance, + ) + ], + title="The following variable(s) have values at or outside their bounds:", + header="=", + footer="=", + ) + + def display_variables_with_none_value(self, stream=stdout): + """ + Prints a list of variables with a value of None. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=_vars_with_none_value(self._model), + title="The following variable(s) have a value of None:", + header="=", + footer="=", + ) + + def display_variables_with_value_near_zero(self, stream=stdout): + """ + Prints a list of variables with a value close to zero. The tolerance + for determining what is close to zero can be set in the class configuration + options. 
+ + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=[ + f"{v.name}: value={value(v)}" + for v in _vars_near_zero( + self._model, self.config.variable_zero_value_tolerance + ) + ], + title="The following variable(s) have a value close to zero:", + header="=", + footer="=", + ) + + def display_variables_with_extreme_values(self, stream=stdout): + """ + Prints a list of variables with extreme values. + + Tolerances can be set in the class configuration options. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=[ + f"{i.name}: {value(i)}" + for i in _vars_with_extreme_values( + model=self._model, + large=self.config.variable_large_value_tolerance, + small=self.config.variable_small_value_tolerance, + zero=self.config.variable_zero_value_tolerance, + ) + ], + title="The following variable(s) have extreme values:", + header="=", + footer="=", + ) + + def display_variables_near_bounds(self, stream=stdout): + """ + Prints a list of variables with values close to their bounds. Tolerance can + be set in the class configuration options. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=[ + f"{v.name}: value={value(v)} bounds={v.bounds}" + for v in variables_near_bounds_set( + self._model, + abs_tol=self.config.variable_bounds_absolute_tolerance, + rel_tol=self.config.variable_bounds_relative_tolerance, + ) + ], + title="The following variable(s) have values close to their bounds:", + header="=", + footer="=", + ) + + def display_components_with_inconsistent_units(self, stream=stdout): + """ + Prints a list of all Constraints, Expressions and Objectives in the + model with inconsistent units of measurement. 
+ + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=identify_inconsistent_units(self._model), + title="The following component(s) have unit consistency issues:", + end_line="For more details on unit inconsistencies, import the " + "assert_units_consistent method\nfrom pyomo.util.check_units", + header="=", + footer="=", + ) + + def display_constraints_with_large_residuals(self, stream=stdout): + """ + Prints a list of Constraints with residuals greater than a specified tolerance. + Tolerance can be set in the class configuration options. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _write_report_section( + stream=stream, + lines_list=large_residuals_set( + self._model, tol=self.config.constraint_residual_tolerance + ), + title="The following constraint(s) have large residuals:", + header="=", + footer="=", + ) + + def get_dulmage_mendelsohn_partition(self): + """ + Performs a Dulmage-Mendelsohn partitioning on the model and returns + the over- and under-constrained sub-problems. 
+ + Returns: + list-of-lists variables in each independent block of the under-constrained set + list-of-lists constraints in each independent block of the under-constrained set + list-of-lists variables in each independent block of the over-constrained set + list-of-lists constraints in each independent block of the over-constrained set + + """ + igraph = IncidenceGraphInterface(self._model, include_inequality=False) + var_dm_partition, con_dm_partition = igraph.dulmage_mendelsohn() + + # Collect under- and over-constrained sub-system + uc_var = var_dm_partition.unmatched + var_dm_partition.underconstrained + uc_con = con_dm_partition.underconstrained + oc_var = var_dm_partition.overconstrained + oc_con = con_dm_partition.overconstrained + con_dm_partition.unmatched + + uc_vblocks, uc_cblocks = igraph.get_connected_components(uc_var, uc_con) + oc_vblocks, oc_cblocks = igraph.get_connected_components(oc_var, oc_con) + + return uc_vblocks, uc_cblocks, oc_vblocks, oc_cblocks + + def display_underconstrained_set(self, stream=stdout): + """ + Prints the variables and constraints in the under-constrained sub-problem + from a Dulmage-Mendelsohn partitioning. + + This can be used to identify the under-defined part of a model and thus + where additional information (fixed variables or constraints) are required. 
+ + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + uc_vblocks, uc_cblocks, _, _ = self.get_dulmage_mendelsohn_partition() + + stream.write("=" * MAX_STR_LENGTH + "\n") + stream.write("Dulmage-Mendelsohn Under-Constrained Set\n\n") + + for i, uc_vblock in enumerate(uc_vblocks): + stream.write(f"{TAB}Independent Block {i}:\n\n") + stream.write(f"{2*TAB}Variables:\n\n") + for v in uc_vblock: + stream.write(f"{3*TAB}{v.name}\n") + + stream.write(f"\n{2*TAB}Constraints:\n\n") + for c in uc_cblocks[i]: + stream.write(f"{3*TAB}{c.name}\n") + stream.write("\n") + + stream.write("=" * MAX_STR_LENGTH + "\n") + + def display_overconstrained_set(self, stream=stdout): + """ + Prints the variables and constraints in the over-constrained sub-problem + from a Dulmage-Mendelsohn partitioning. + + This can be used to identify the over-defined part of a model and thus + where constraints must be removed or variables unfixed. + + Args: + stream: an I/O object to write the list to (default = stdout) + + Returns: + None + + """ + _, _, oc_vblocks, oc_cblocks = self.get_dulmage_mendelsohn_partition() + + stream.write("=" * MAX_STR_LENGTH + "\n") + stream.write("Dulmage-Mendelsohn Over-Constrained Set\n\n") + + for i, oc_vblock in enumerate(oc_vblocks): + stream.write(f"{TAB}Independent Block {i}:\n\n") + stream.write(f"{2*TAB}Variables:\n\n") + for v in oc_vblock: + stream.write(f"{3*TAB}{v.name}\n") + + stream.write(f"\n{2*TAB}Constraints:\n\n") + for c in oc_cblocks[i]: + stream.write(f"{3*TAB}{c.name}\n") + stream.write("\n") + + stream.write("=" * MAX_STR_LENGTH + "\n") + + def display_variables_with_extreme_jacobians(self, stream=stdout): + """ + Prints the variables associated with columns in the Jacobian with extreme + L2 norms. This often indicates poorly scaled variables. + + Tolerances can be set via the DiagnosticsToolbox config. 
+ + Args: + stream: an I/O object to write the output to (default = stdout) + + Returns: + None + + """ + xjc = extreme_jacobian_columns( + m=self._model, + scaled=False, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + ) + xjc.sort(key=lambda i: abs(log(i[0])), reverse=True) + + _write_report_section( + stream=stream, + lines_list=[f"{i[1].name}: {i[0]:.3E}" for i in xjc], + title="The following variable(s) are associated with extreme Jacobian values:", + header="=", + footer="=", + ) + + def display_constraints_with_extreme_jacobians(self, stream=stdout): + """ + Prints the constraints associated with rows in the Jacobian with extreme + L2 norms. This often indicates poorly scaled constraints. + + Tolerances can be set via the DiagnosticsToolbox config. + + Args: + stream: an I/O object to write the output to (default = stdout) + + Returns: + None + + """ + xjr = extreme_jacobian_rows( + m=self._model, + scaled=False, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + ) + xjr.sort(key=lambda i: abs(log(i[0])), reverse=True) + + _write_report_section( + stream=stream, + lines_list=[f"{i[1].name}: {i[0]:.3E}" for i in xjr], + title="The following constraint(s) are associated with extreme Jacobian values:", + header="=", + footer="=", + ) + + def display_extreme_jacobian_entries(self, stream=stdout): + """ + Prints variables and constraints associated with entries in the Jacobian with extreme + values. This can be indicative of poor scaling, especially for isolated terms (e.g. + variables which appear only in one term of a single constraint). + + Tolerances can be set via the DiagnosticsToolbox config. 
+ + Args: + stream: an I/O object to write the output to (default = stdout) + + Returns: + None + + """ + xje = extreme_jacobian_entries( + m=self._model, + scaled=False, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + zero=0, + ) + xje.sort(key=lambda i: abs(log(i[0])), reverse=True) + + _write_report_section( + stream=stream, + lines_list=[f"{i[1].name}, {i[2].name}: {i[0]:.3E}" for i in xje], + title="The following constraint(s) and variable(s) are associated with extreme Jacobian\nvalues:", + header="=", + footer="=", + ) + + # TODO: Block triangularization analysis + # Number and size of blocks, polynomial degree of 1x1 blocks, simple pivot check of moderate sized sub-blocks? + + def _collect_structural_warnings(self): + """ + Runs checks for structural warnings and returns two lists. + + Returns: + warnings - list of warning messages from structural analysis + next_steps - list of suggested next steps to further investigate warnings + + """ + uc = identify_inconsistent_units(self._model) + uc_var, uc_con, oc_var, oc_con = self.get_dulmage_mendelsohn_partition() + + # Collect warnings + warnings = [] + next_steps = [] + dof = degrees_of_freedom(self._model) + if dof != 0: + dstring = "Degrees" + if abs(dof) == 1: + dstring = "Degree" + warnings.append(f"WARNING: {dof} {dstring} of Freedom") + if len(uc) > 0: + cstring = "Components" + if len(uc) == 1: + cstring = "Component" + warnings.append(f"WARNING: {len(uc)} {cstring} with inconsistent units") + next_steps.append( + self.display_components_with_inconsistent_units.__name__ + "()" + ) + if any(len(x) > 0 for x in [uc_var, uc_con, oc_var, oc_con]): + warnings.append( + f"WARNING: Structural singularity found\n" + f"{TAB*2}Under-Constrained Set: {len(sum(uc_var, []))} " + f"variables, {len(sum(uc_con, []))} constraints\n" + f"{TAB*2}Over-Constrained Set: {len(sum(oc_var, []))} " + f"variables, {len(sum(oc_con, []))} constraints" + ) + + if any(len(x) > 0 
for x in [uc_var, uc_con]): + next_steps.append(self.display_underconstrained_set.__name__ + "()") + if any(len(x) > 0 for x in [oc_var, oc_con]): + next_steps.append(self.display_overconstrained_set.__name__ + "()") + + return warnings, next_steps + + def _collect_structural_cautions(self): + """ + Runs checks for structural cautions and returns a list. + + Returns: + cautions - list of caution messages from structural analysis + + """ + # Collect cautions + cautions = [] + zero_vars = _vars_fixed_to_zero(self._model) + if len(zero_vars) > 0: + vstring = "variables" + if len(zero_vars) == 1: + vstring = "variable" + cautions.append(f"Caution: {len(zero_vars)} {vstring} fixed to 0") + unused_vars = variables_not_in_activated_constraints_set(self._model) + unused_vars_fixed = 0 + for v in unused_vars: + if v.fixed: + unused_vars_fixed += 1 + if len(unused_vars) > 0: + vstring = "variables" + if len(unused_vars) == 1: + vstring = "variable" + cautions.append( + f"Caution: {len(unused_vars)} " + f"unused {vstring} ({unused_vars_fixed} fixed)" + ) + + return cautions + + def _collect_numerical_warnings(self, jac=None, nlp=None): + """ + Runs checks for numerical warnings and returns two lists. 
+ + Returns: + warnings - list of warning messages from numerical analysis + next_steps - list of suggested next steps to further investigate warnings + + """ + if jac is None or nlp is None: + jac, nlp = get_jacobian(self._model, scaled=False) + + warnings = [] + next_steps = [] + + # Large residuals + large_residuals = large_residuals_set( + self._model, tol=self.config.constraint_residual_tolerance + ) + if len(large_residuals) > 0: + cstring = "Constraints" + if len(large_residuals) == 1: + cstring = "Constraint" + warnings.append( + f"WARNING: {len(large_residuals)} {cstring} with large residuals" + ) + next_steps.append( + self.display_constraints_with_large_residuals.__name__ + "()" + ) + + # Variables outside bounds + violated_bounds = _vars_violating_bounds( + self._model, tolerance=self.config.variable_bounds_violation_tolerance + ) + if len(violated_bounds) > 0: + cstring = "Variables" + if len(violated_bounds) == 1: + cstring = "Variable" + warnings.append( + f"WARNING: {len(violated_bounds)} {cstring} at or outside bounds" + ) + next_steps.append( + self.display_variables_at_or_outside_bounds.__name__ + "()" + ) + + # Extreme Jacobian rows and columns + jac_col = extreme_jacobian_columns( + jac=jac, + nlp=nlp, + large=self.config.jacobian_large_value_warning, + small=self.config.jacobian_small_value_warning, + ) + if len(jac_col) > 0: + cstring = "Variables" + if len(jac_col) == 1: + cstring = "Variable" + warnings.append( + f"WARNING: {len(jac_col)} {cstring} with extreme Jacobian values" + ) + next_steps.append( + self.display_variables_with_extreme_jacobians.__name__ + "()" + ) + + jac_row = extreme_jacobian_rows( + jac=jac, + nlp=nlp, + large=self.config.jacobian_large_value_warning, + small=self.config.jacobian_small_value_warning, + ) + if len(jac_row) > 0: + cstring = "Constraints" + if len(jac_row) == 1: + cstring = "Constraint" + warnings.append( + f"WARNING: {len(jac_row)} {cstring} with extreme Jacobian values" + ) + next_steps.append( + 
self.display_constraints_with_extreme_jacobians.__name__ + "()" + ) + + return warnings, next_steps + + def _collect_numerical_cautions(self, jac=None, nlp=None): + """ + Runs checks for numerical cautions and returns a list. + + Returns: + cautions - list of caution messages from numerical analysis + + """ + if jac is None or nlp is None: + jac, nlp = get_jacobian(self._model, scaled=False) + + cautions = [] + + # Variables near bounds + near_bounds = variables_near_bounds_set( + self._model, + abs_tol=self.config.variable_bounds_absolute_tolerance, + rel_tol=self.config.variable_bounds_relative_tolerance, + ) + if len(near_bounds) > 0: + cstring = "Variables" + if len(near_bounds) == 1: + cstring = "Variable" + cautions.append( + f"Caution: {len(near_bounds)} {cstring} with value close to their bounds" + ) + + # Variables near zero + near_zero = _vars_near_zero( + self._model, self.config.variable_zero_value_tolerance + ) + if len(near_zero) > 0: + cstring = "Variables" + if len(near_zero) == 1: + cstring = "Variable" + cautions.append( + f"Caution: {len(near_zero)} {cstring} with value close to zero" + ) + + # Variables with extreme values + xval = _vars_with_extreme_values( + model=self._model, + large=self.config.variable_large_value_tolerance, + small=self.config.variable_small_value_tolerance, + zero=self.config.variable_zero_value_tolerance, + ) + if len(xval) > 0: + cstring = "Variables" + if len(xval) == 1: + cstring = "Variable" + cautions.append(f"Caution: {len(xval)} {cstring} with extreme value") + + # Variables with value None + none_value = _vars_with_none_value(self._model) + if len(none_value) > 0: + cstring = "Variables" + if len(none_value) == 1: + cstring = "Variable" + cautions.append(f"Caution: {len(none_value)} {cstring} with None value") + + # Extreme Jacobian rows and columns + jac_col = extreme_jacobian_columns( + jac=jac, + nlp=nlp, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + ) + 
if len(jac_col) > 0: + cstring = "Variables" + if len(jac_col) == 1: + cstring = "Variable" + cautions.append( + f"Caution: {len(jac_col)} {cstring} with extreme Jacobian values" + ) + + jac_row = extreme_jacobian_rows( + jac=jac, + nlp=nlp, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + ) + if len(jac_row) > 0: + cstring = "Constraints" + if len(jac_row) == 1: + cstring = "Constraint" + cautions.append( + f"Caution: {len(jac_row)} {cstring} with extreme Jacobian values" + ) + + # Extreme Jacobian entries + extreme_jac = extreme_jacobian_entries( + jac=jac, + nlp=nlp, + large=self.config.jacobian_large_value_caution, + small=self.config.jacobian_small_value_caution, + zero=0, + ) + if len(extreme_jac) > 0: + cstring = "Entries" + if len(extreme_jac) == 1: + cstring = "Entry" + cautions.append(f"Caution: {len(extreme_jac)} extreme Jacobian {cstring}") + + return cautions + + def assert_no_structural_warnings(self): + """ + Checks for structural warnings in the model and raises an AssertionError + if any are found. + + Raises: + AssertionError if any warnings are identified by structural analysis. + + """ + warnings, _ = self._collect_structural_warnings() + if len(warnings) > 0: + raise AssertionError(f"Structural issues found ({len(warnings)}).") + + def assert_no_numerical_warnings(self): + """ + Checks for numerical warnings in the model and raises an AssertionError + if any are found. + + Raises: + AssertionError if any warnings are identified by numerical analysis. + + """ + warnings, _ = self._collect_numerical_warnings() + if len(warnings) > 0: + raise AssertionError(f"Numerical issues found ({len(warnings)}).") + + def report_structural_issues(self, stream=stdout): + """ + Generates a summary report of any structural issues identified in the model provided + and suggests next steps for debugging the model. 
+
+        This should be the first method called when debugging a model and after any change
+        is made to the model. These checks can be run before trying to initialize and solve
+        the model.
+
+        Args:
+            stream: I/O object to write report to (default = stdout)
+
+        Returns:
+            None
+
+        """
+        # Potential evaluation errors
+        # TODO: High Index?
+        stats = _collect_model_statistics(self._model)
+        warnings, next_steps = self._collect_structural_warnings()
+        cautions = self._collect_structural_cautions()
+
+        _write_report_section(
+            stream=stream, lines_list=stats, title="Model Statistics", header="="
+        )
+        _write_report_section(
+            stream=stream,
+            lines_list=warnings,
+            title=f"{len(warnings)} WARNINGS",
+            line_if_empty="No warnings found!",
+        )
+        _write_report_section(
+            stream=stream,
+            lines_list=cautions,
+            title=f"{len(cautions)} Cautions",
+            line_if_empty="No cautions found!",
+        )
+        _write_report_section(
+            stream=stream,
+            lines_list=next_steps,
+            title="Suggested next steps:",
+            line_if_empty="Try to initialize/solve your model and then call report_numerical_issues()",
+            footer="=",
+        )
+
+    def report_numerical_issues(self, stream=stdout):
+        """
+        Generates a summary report of any numerical issues identified in the model provided
+        and suggests next steps for debugging the model.
+
+        Numerical checks should only be performed once all structural issues have been resolved,
+        and require that at least a partial solution to the model is available.
+ + Args: + stream: I/O object to write report to (default = stdout) + + Returns: + None + + """ + jac, nlp = get_jacobian(self._model, scaled=False) + + warnings, next_steps = self._collect_numerical_warnings(jac=jac, nlp=nlp) + cautions = self._collect_numerical_cautions(jac=jac, nlp=nlp) + + stats = [] + stats.append( + f"Jacobian Condition Number: {jacobian_cond(jac=jac, scaled=False):.3E}" + ) + _write_report_section( + stream=stream, lines_list=stats, title="Model Statistics", header="=" + ) + _write_report_section( + stream=stream, + lines_list=warnings, + title=f"{len(warnings)} WARNINGS", + line_if_empty="No warnings found!", + ) + _write_report_section( + stream=stream, + lines_list=cautions, + title=f"{len(cautions)} Cautions", + line_if_empty="No cautions found!", + ) + _write_report_section( + stream=stream, + lines_list=next_steps, + title="Suggested next steps:", + line_if_empty=f"If you still have issues converging your model consider:\n" + f"{TAB*2}svd_analysis(TBA)\n{TAB*2}degeneracy_hunter (TBA)", + footer="=", + ) + + class DegeneracyHunter: """ Degeneracy Hunter is a collection of utility functions to assist in mathematical @@ -57,10 +1063,15 @@ def __init__(self, block_or_jac, solver=None): Passing a Jacobian to Degeneracy Hunter is current untested. """ + msg = ( + "DegeneracyHunter is being deprecated in favor of the new " + "DiagnosticsToolbox." 
+ ) + deprecation_warning(msg=msg, logger=_log, version="2.2.0", remove_in="3.0.0") block_like = False try: - block_like = issubclass(block_or_jac.ctype, pyo.Block) + block_like = issubclass(block_or_jac.ctype, Block) except AttributeError: pass @@ -74,9 +1085,7 @@ def __init__(self, block_or_jac, solver=None): self.nlp = PyomoNLP(self.block) # Get the scaled Jacobian of equality constraints - self.jac_eq = iscale.get_jacobian( - self.block, equality_constraints_only=True - )[0] + self.jac_eq = get_jacobian(self.block, equality_constraints_only=True)[0] # Create a list of equality constraint names self.eq_con_list = self.nlp.get_pyomo_equality_constraints() @@ -103,7 +1112,7 @@ def __init__(self, block_or_jac, solver=None): # Initialize solver if solver is None: # TODO: Test performance with open solvers such as cbc - self.solver = pyo.SolverFactory("gurobi") + self.solver = SolverFactory("gurobi") self.solver.options = {"NumericFocus": 3} else: @@ -273,16 +1282,16 @@ def _prepare_ids_milp(jac_eq, M=1e5): n_var = jac_eq.shape[1] # Create Pyomo model for irreducible degenerate set - m_dh = pyo.ConcreteModel() + m_dh = ConcreteModel() # Create index for constraints - m_dh.C = pyo.Set(initialize=range(n_eq)) + m_dh.C = Set(initialize=range(n_eq)) - m_dh.V = pyo.Set(initialize=range(n_var)) + m_dh.V = Set(initialize=range(n_var)) # Add variables - m_dh.nu = pyo.Var(m_dh.C, bounds=(-M, M), initialize=1.0) - m_dh.y = pyo.Var(m_dh.C, domain=pyo.Binary) + m_dh.nu = Var(m_dh.C, bounds=(-M, M), initialize=1.0) + m_dh.y = Var(m_dh.C, domain=Binary) # Constraint to enforce set is degenerate if issparse(jac_eq): @@ -299,19 +1308,19 @@ def eq_degenerate(m_dh, v): def eq_degenerate(m_dh, v): return sum(m_dh.J[c, v] * m_dh.nu[c] for c in m_dh.C) == 0 - m_dh.degenerate = pyo.Constraint(m_dh.V, rule=eq_degenerate) + m_dh.degenerate = Constraint(m_dh.V, rule=eq_degenerate) def eq_lower(m_dh, c): return -M * m_dh.y[c] <= m_dh.nu[c] - m_dh.lower = pyo.Constraint(m_dh.C, rule=eq_lower) 
+ m_dh.lower = Constraint(m_dh.C, rule=eq_lower) def eq_upper(m_dh, c): return m_dh.nu[c] <= M * m_dh.y[c] - m_dh.upper = pyo.Constraint(m_dh.C, rule=eq_upper) + m_dh.upper = Constraint(m_dh.C, rule=eq_upper) - m_dh.obj = pyo.Objective(expr=sum(m_dh.y[c] for c in m_dh.C)) + m_dh.obj = Objective(expr=sum(m_dh.y[c] for c in m_dh.C)) return m_dh @@ -335,23 +1344,23 @@ def _prepare_find_candidates_milp(jac_eq, M=1e5, m_small=1e-5): n_var = jac_eq.shape[1] # Create Pyomo model for irreducible degenerate set - m_dh = pyo.ConcreteModel() + m_dh = ConcreteModel() # Create index for constraints - m_dh.C = pyo.Set(initialize=range(n_eq)) + m_dh.C = Set(initialize=range(n_eq)) - m_dh.V = pyo.Set(initialize=range(n_var)) + m_dh.V = Set(initialize=range(n_var)) # Specify minimum size for nu to be considered non-zero m_dh.m_small = m_small # Add variables - m_dh.nu = pyo.Var(m_dh.C, bounds=(-M - m_small, M + m_small), initialize=1.0) - m_dh.y_pos = pyo.Var(m_dh.C, domain=pyo.Binary) - m_dh.y_neg = pyo.Var(m_dh.C, domain=pyo.Binary) - m_dh.abs_nu = pyo.Var(m_dh.C, bounds=(0, M + m_small)) + m_dh.nu = Var(m_dh.C, bounds=(-M - m_small, M + m_small), initialize=1.0) + m_dh.y_pos = Var(m_dh.C, domain=Binary) + m_dh.y_neg = Var(m_dh.C, domain=Binary) + m_dh.abs_nu = Var(m_dh.C, bounds=(0, M + m_small)) - m_dh.pos_xor_neg = pyo.Constraint(m_dh.C) + m_dh.pos_xor_neg = Constraint(m_dh.C) # Constraint to enforce set is degenerate if issparse(jac_eq): @@ -364,7 +1373,7 @@ def eq_degenerate(m_dh, v): return sum(m_dh.J[c, v] * m_dh.nu[c] for c in C_) == 0 else: # This variable does not appear in any constraint - return pyo.Constraint.Skip + return Constraint.Skip else: m_dh.J = jac_eq @@ -374,58 +1383,58 @@ def eq_degenerate(m_dh, v): return sum(m_dh.J[c, v] * m_dh.nu[c] for c in m_dh.C) == 0 else: # This variable does not appear in any constraint - return pyo.Constraint.Skip + return Constraint.Skip m_dh.pprint() - m_dh.degenerate = pyo.Constraint(m_dh.V, rule=eq_degenerate) + 
m_dh.degenerate = Constraint(m_dh.V, rule=eq_degenerate) # When y_pos = 1, nu >= m_small # When y_pos = 0, nu >= - m_small def eq_pos_lower(m_dh, c): return m_dh.nu[c] >= -m_small + 2 * m_small * m_dh.y_pos[c] - m_dh.pos_lower = pyo.Constraint(m_dh.C, rule=eq_pos_lower) + m_dh.pos_lower = Constraint(m_dh.C, rule=eq_pos_lower) # When y_pos = 1, nu <= M + m_small # When y_pos = 0, nu <= m_small def eq_pos_upper(m_dh, c): return m_dh.nu[c] <= M * m_dh.y_pos[c] + m_small - m_dh.pos_upper = pyo.Constraint(m_dh.C, rule=eq_pos_upper) + m_dh.pos_upper = Constraint(m_dh.C, rule=eq_pos_upper) # When y_neg = 1, nu <= -m_small # When y_neg = 0, nu <= m_small def eq_neg_upper(m_dh, c): return m_dh.nu[c] <= m_small - 2 * m_small * m_dh.y_neg[c] - m_dh.neg_upper = pyo.Constraint(m_dh.C, rule=eq_neg_upper) + m_dh.neg_upper = Constraint(m_dh.C, rule=eq_neg_upper) # When y_neg = 1, nu >= -M - m_small # When y_neg = 0, nu >= - m_small def eq_neg_lower(m_dh, c): return m_dh.nu[c] >= -M * m_dh.y_neg[c] - m_small - m_dh.neg_lower = pyo.Constraint(m_dh.C, rule=eq_neg_lower) + m_dh.neg_lower = Constraint(m_dh.C, rule=eq_neg_lower) # Absolute value def eq_abs_lower(m_dh, c): return -m_dh.abs_nu[c] <= m_dh.nu[c] - m_dh.abs_lower = pyo.Constraint(m_dh.C, rule=eq_abs_lower) + m_dh.abs_lower = Constraint(m_dh.C, rule=eq_abs_lower) def eq_abs_upper(m_dh, c): return m_dh.nu[c] <= m_dh.abs_nu[c] - m_dh.abs_upper = pyo.Constraint(m_dh.C, rule=eq_abs_upper) + m_dh.abs_upper = Constraint(m_dh.C, rule=eq_abs_upper) # At least one constraint must be in the degenerate set - m_dh.degenerate_set_nonempty = pyo.Constraint( + m_dh.degenerate_set_nonempty = Constraint( expr=sum(m_dh.y_pos[c] + m_dh.y_neg[c] for c in m_dh.C) >= 1 ) # Minimize the L1-norm of nu - m_dh.obj = pyo.Objective(expr=sum(m_dh.abs_nu[c] for c in m_dh.C)) + m_dh.obj = Objective(expr=sum(m_dh.abs_nu[c] for c in m_dh.C)) return m_dh @@ -455,7 +1464,7 @@ def _check_candidate_ids(ids_milp, solver, c, eq_con_list=None, tee=False): 
ids_milp.nu[c].unfix() - if pyo.check_optimal_termination(results): + if check_optimal_termination(results): # We found an irreducible degenerate set # Create empty dictionary @@ -493,7 +1502,7 @@ def _find_candidate_eqs(candidates_milp, solver, eq_con_list=None, tee=False): results = solver.solve(candidates_milp, tee=tee) - if pyo.check_optimal_termination(results): + if check_optimal_termination(results): # We found a degenerate set # Create empty dictionary @@ -603,10 +1612,10 @@ def underdetermined_variables_and_constraints(self, n_calc=1, tol=0.1, dense=Fal n_sv = len(self.s) if n_sv < n_calc: raise ValueError( - "User wanted constraints and variables associated " + f"User wanted constraints and variables associated " f"with the {n_calc}-th smallest singular value, " f"but only {n_sv} small singular values have been " - "calculated. Run svd_analysis again and specify " + f"calculated. Run svd_analysis again and specify " f"n_sv>={n_calc}." ) print("Column: Variable") @@ -774,7 +1783,7 @@ def set_bounds_from_valid_range(component, descend_into=True): set_bounds_from_valid_range(component[k]) elif isinstance(component, _BlockData): for i in component.component_data_objects( - ctype=[pyo.Var, pyo.Param], descend_into=descend_into + ctype=[Var, Param], descend_into=descend_into ): set_bounds_from_valid_range(i) elif not hasattr(component, "bounds"): @@ -815,14 +1824,14 @@ def list_components_with_values_outside_valid_range(component, descend_into=True ) elif isinstance(component, _BlockData): for i in component.component_data_objects( - ctype=[pyo.Var, pyo.Param], descend_into=descend_into + ctype=[Var, Param], descend_into=descend_into ): comp_list.extend(list_components_with_values_outside_valid_range(i)) else: valid_range = get_valid_range_of_component(component) if valid_range is not None: - cval = pyo.value(component) + cval = value(component) if cval is not None and (cval < valid_range[0] or cval > valid_range[1]): comp_list.append(component) @@ -847,10 
+1856,178 @@ def ipopt_solve_halt_on_error(model, options=None): if options is None: options = {} - solver = pyo.SolverFactory("ipopt") + solver = SolverFactory("ipopt") solver.options = options solver.options["halt_on_ampl_error"] = "yes" return solver.solve( model, tee=True, symbolic_solver_labels=True, export_defined_variables=False ) + + +# ------------------------------------------------------------------------------------------- +# Private module functions +def _var_in_block(var, block): + parent = var.parent_block() + while parent is not None: + if parent is block: + return True + parent = parent.parent_block() + return False + + +def _vars_fixed_to_zero(model): + # Set of variables fixed to 0 + zero_vars = ComponentSet() + for v in model.component_data_objects(Var, descend_into=True): + if v.fixed and value(v) == 0: + zero_vars.add(v) + return zero_vars + + +def _vars_near_zero(model, variable_zero_value_tolerance): + # Set of variables with values close to 0 + near_zero_vars = ComponentSet() + for v in model.component_data_objects(Var, descend_into=True): + if v.value is not None and abs(value(v)) <= variable_zero_value_tolerance: + near_zero_vars.add(v) + return near_zero_vars + + +def _vars_violating_bounds(model, tolerance): + violated_bounds = ComponentSet() + for v in model.component_data_objects(Var, descend_into=True): + if v.value is not None: + if v.lb is not None and v.value <= v.lb - tolerance: + violated_bounds.add(v) + elif v.ub is not None and v.value >= v.ub + tolerance: + violated_bounds.add(v) + + return violated_bounds + + +def _vars_with_none_value(model): + none_value = ComponentSet() + for v in model.component_data_objects(Var, descend_into=True): + if v.value is None: + none_value.add(v) + + return none_value + + +def _vars_with_extreme_values(model, large, small, zero): + extreme_vars = ComponentSet() + for v in model.component_data_objects(Var, descend_into=True): + if v.value is not None: + mag = abs(value(v)) + if mag > 
abs(large): + extreme_vars.add(v) + elif mag < abs(small) and mag > abs(zero): + extreme_vars.add(v) + + return extreme_vars + + +def _write_report_section( + stream, + lines_list, + title=None, + line_if_empty=None, + end_line=None, + header="-", + footer=None, +): + """ + Writes output in standard format for report and display methods. + + Args: + stream: stream to write to + lines_list: list containing lines to be written in body of report + title: title to be put at top of report + line_if_empty: line to be written if lines_list is empty + end_line: line to be written at end of report + header: character to use to write header separation line + footer: character to use to write footer separation line + + Returns: + None + + """ + stream.write(f"{header * MAX_STR_LENGTH}\n") + if title is not None: + stream.write(f"{title}\n\n") + if len(lines_list) > 0: + for i in lines_list: + stream.write(f"{TAB}{i}\n") + elif line_if_empty is not None: + stream.write(f"{TAB}{line_if_empty}\n") + stream.write("\n") + if end_line is not None: + stream.write(f"{end_line}\n") + if footer is not None: + stream.write(f"{footer * MAX_STR_LENGTH}\n") + + +def _collect_model_statistics(model): + vars_in_constraints = variables_in_activated_constraints_set(model) + fixed_vars_in_constraints = ComponentSet() + free_vars_in_constraints = ComponentSet() + free_vars_lb = ComponentSet() + free_vars_ub = ComponentSet() + free_vars_lbub = ComponentSet() + ext_fixed_vars_in_constraints = ComponentSet() + ext_free_vars_in_constraints = ComponentSet() + for v in vars_in_constraints: + if v.fixed: + fixed_vars_in_constraints.add(v) + if not _var_in_block(v, model): + ext_fixed_vars_in_constraints.add(v) + else: + free_vars_in_constraints.add(v) + if not _var_in_block(v, model): + ext_free_vars_in_constraints.add(v) + if v.lb is not None: + if v.ub is not None: + free_vars_lbub.add(v) + else: + free_vars_lb.add(v) + elif v.ub is not None: + free_vars_ub.add(v) + + # Generate report + # TODO: 
Binary and boolean vars + stats = [] + stats.append( + f"{TAB}Activated Blocks: {len(activated_blocks_set(model))} " + f"(Deactivated: {len(deactivated_blocks_set(model))})" + ) + stats.append( + f"{TAB}Free Variables in Activated Constraints: " + f"{len(free_vars_in_constraints)} " + f"(External: {len(ext_free_vars_in_constraints)})" + ) + stats.append(f"{TAB * 2}Free Variables with only lower bounds: {len(free_vars_lb)}") + stats.append(f"{TAB * 2}Free Variables with only upper bounds: {len(free_vars_ub)}") + stats.append( + f"{TAB * 2}Free Variables with upper and lower bounds: " + f"{len(free_vars_lbub)}" + ) + stats.append( + f"{TAB}Fixed Variables in Activated Constraints: " + f"{len(fixed_vars_in_constraints)} " + f"(External: {len(ext_fixed_vars_in_constraints)})" + ) + stats.append( + f"{TAB}Activated Equality Constraints: {len(activated_equalities_set(model))} " + f"(Deactivated: {len(deactivated_equalities_set(model))})" + ) + stats.append( + f"{TAB}Activated Inequality Constraints: {len(activated_inequalities_set(model))} " + f"(Deactivated: {len(deactivated_inequalities_set(model))})" + ) + stats.append( + f"{TAB}Activated Objectives: {len(activated_objectives_set(model))} " + f"(Deactivated: {len(deactivated_objectives_set(model))})" + ) + + return stats diff --git a/idaes/core/util/model_statistics.py b/idaes/core/util/model_statistics.py index c0ac0b1d5c..392e6a8fc2 100644 --- a/idaes/core/util/model_statistics.py +++ b/idaes/core/util/model_statistics.py @@ -24,7 +24,11 @@ from pyomo.dae import DerivativeVar from pyomo.core.expr import identify_variables from pyomo.common.collections import ComponentSet +from pyomo.common.deprecation import deprecation_warning +import idaes.logger as idaeslog + +_log = idaeslog.getLogger(__name__) # ------------------------------------------------------------------------- # Generator to handle cases where the input is an indexed Block @@ -682,7 +686,13 @@ def number_unfixed_variables(block): def 
variables_near_bounds_generator( - block, tol=1e-4, relative=True, skip_lb=False, skip_ub=False + block, + tol=None, + relative=None, + skip_lb=False, + skip_ub=False, + abs_tol=1e-4, + rel_tol=1e-4, ): """ Generator which returns all Var components in a model which have a value @@ -690,8 +700,8 @@ def variables_near_bounds_generator( Args: block : model to be studied - tol : (relative) tolerance for inclusion in generator (default = 1e-4) - relative : Boolean, use relative tolerance (default = True) + abs_tol : absolute tolerance for inclusion in generator (default = 1e-4) + rel_tol : relative tolerance for inclusion in generator (default = 1e-4) skip_lb: Boolean to skip lower bound (default = False) skip_ub: Boolean to skip upper bound (default = False) @@ -699,6 +709,23 @@ def variables_near_bounds_generator( A generator which returns all Var components block that are close to a bound """ + # Check for deprecated arguments + if relative is not None: + msg = ( + "variables_near_bounds_generator has deprecated the relative argument. " + "Please set abs_tol and rel_tol arguments instead." + ) + deprecation_warning(msg=msg, logger=_log, version="2.2.0", remove_in="3.0.0") + if tol is not None: + msg = ( + "variables_near_bounds_generator has deprecated the tol argument. " + "Please set abs_tol and rel_tol arguments instead." 
+ ) + deprecation_warning(msg=msg, logger=_log, version="2.2.0", remove_in="3.0.0") + # Set tolerances using the provided value + abs_tol = tol + rel_tol = tol + for v in _iter_indexed_block_data_objects( block, ctype=Var, active=True, descend_into=True ): @@ -706,39 +733,45 @@ def variables_near_bounds_generator( if v.value is None: continue - if relative: - # First, determine absolute tolerance to apply to bounds - if v.ub is not None and v.lb is not None: - # Both upper and lower bounds, apply tol to (upper - lower) - atol = value((v.ub - v.lb) * tol) - elif v.ub is not None: - # Only upper bound, apply tol to bound value - atol = abs(value(v.ub * tol)) - elif v.lb is not None: - # Only lower bound, apply tol to bound value - atol = abs(value(v.lb * tol)) - else: - continue + # First, magnitude of variable + if v.ub is not None and v.lb is not None: + # Both upper and lower bounds, apply tol to (upper - lower) + mag = value(v.ub - v.lb) + elif v.ub is not None: + # Only upper bound, apply tol to bound value + mag = abs(value(v.ub)) + elif v.lb is not None: + # Only lower bound, apply tol to bound value + mag = abs(value(v.lb)) else: - atol = tol + mag = 0 - if v.ub is not None and not skip_lb and value(v.ub - v.value) <= atol: + # Calculate largest tolerance from absolute and relative + tol = max(abs_tol, mag * rel_tol) + + if v.ub is not None and not skip_ub and value(v.ub - v.value) <= tol: yield v - elif v.lb is not None and not skip_ub and value(v.value - v.lb) <= atol: + elif v.lb is not None and not skip_lb and value(v.value - v.lb) <= tol: yield v def variables_near_bounds_set( - block, tol=1e-4, relative=True, skip_lb=False, skip_ub=False + block, + tol=None, + relative=None, + skip_lb=False, + skip_ub=False, + abs_tol=1e-4, + rel_tol=1e-4, ): """ Method to return a ComponentSet of all Var components in a model which have - a value within tol (relative) of a bound. + a value within tolerance of a bound. 
Args: block : model to be studied - tol : relative tolerance for inclusion in generator (default = 1e-4) - relative : Boolean, use relative tolerance (default = True) + abs_tol : absolute tolerance for inclusion in generator (default = 1e-4) + rel_tol : relative tolerance for inclusion in generator (default = 1e-4) skip_lb: Boolean to skip lower bound (default = False) skip_ub: Boolean to skip upper bound (default = False) @@ -747,23 +780,28 @@ def variables_near_bounds_set( bound """ return ComponentSet( - variables_near_bounds_generator(block, tol, relative, skip_lb, skip_ub) + variables_near_bounds_generator( + block, tol, relative, skip_lb, skip_ub, abs_tol, rel_tol + ) ) -def number_variables_near_bounds(block, tol=1e-4): +def number_variables_near_bounds(block, tol=None, abs_tol=1e-4, rel_tol=1e-4): """ Method to return the number of all Var components in a model which have a value within tol (relative) of a bound. Args: block : model to be studied - tol : relative tolerance for inclusion in generator (default = 1e-4) + abs_tol : absolute tolerance for inclusion in generator (default = 1e-4) + rel_tol : relative tolerance for inclusion in generator (default = 1e-4) Returns: Number of components block that are close to a bound """ - return len(variables_near_bounds_set(block, tol)) + return len( + variables_near_bounds_set(block, tol=tol, abs_tol=abs_tol, rel_tol=rel_tol) + ) # ------------------------------------------------------------------------- diff --git a/idaes/core/util/scaling.py b/idaes/core/util/scaling.py index 35454a7de5..d4bd8256aa 100644 --- a/idaes/core/util/scaling.py +++ b/idaes/core/util/scaling.py @@ -866,7 +866,7 @@ def jacobian_cond(m=None, scaled=True, order=None, pinv=False, jac=None): (float) Condition number """ if jac is None: - jac, nlp = get_jacobian(m, scaled) # pylint: disable=unused-variable + jac, _ = get_jacobian(m, scaled) jac = jac.tocsc() if jac.shape[0] != jac.shape[1] and not pinv: _log.warning("Nonsquare Jacobian using 
pseudo inverse") diff --git a/idaes/core/util/tests/test_model_diagnostics.py b/idaes/core/util/tests/test_model_diagnostics.py index 97ff1e3da2..7da1b77f2a 100644 --- a/idaes/core/util/tests/test_model_diagnostics.py +++ b/idaes/core/util/tests/test_model_diagnostics.py @@ -13,16 +13,30 @@ """ This module contains model diagnostic utility functions for use in IDAES (Pyomo) models. """ - +from io import StringIO +import numpy as np import pytest -# Need to update -import pyomo.environ as pyo +from pyomo.environ import ( + Block, + ConcreteModel, + Constraint, + Expression, + log, + Objective, + Set, + SolverFactory, + Suffix, + TransformationFactory, + units, + Var, +) +from pyomo.common.collections import ComponentSet from pyomo.contrib.pynumero.asl import AmplInterface -import numpy as np + import idaes.core.util.scaling as iscale import idaes.logger as idaeslog - +from idaes.core.solvers import get_solver from idaes.core import FlowsheetBlock from idaes.core.util.testing import PhysicalParameterTestBlock @@ -34,23 +48,1188 @@ # Need to update from idaes.core.util.model_diagnostics import ( + DiagnosticsToolbox, DegeneracyHunter, get_valid_range_of_component, set_bounds_from_valid_range, list_components_with_values_outside_valid_range, ipopt_solve_halt_on_error, + _var_in_block, + _vars_fixed_to_zero, + _vars_near_zero, + _vars_violating_bounds, + _vars_with_none_value, + _vars_with_extreme_values, + _write_report_section, + _collect_model_statistics, ) -__author__ = "Alex Dowling, Douglas Allan" +__author__ = "Alex Dowling, Douglas Allan, Andrew Lee" + + +@pytest.fixture +def model(): + m = ConcreteModel() + m.b = Block() + + m.v1 = Var() + m.v2 = Var() + m.v3 = Var() + m.v4 = Var() + + m.v1.fix(0) + m.v2.fix(3) + m.v3.set_value(0) + + return m + + +@pytest.mark.unit +def test_var_in_block(model): + assert _var_in_block(model.v1, model) + assert not _var_in_block(model.v1, model.b) + + +@pytest.mark.unit +def test_vars_fixed_to_zero(model): + zero_vars = 
_vars_fixed_to_zero(model)
+    assert isinstance(zero_vars, ComponentSet)
+    assert len(zero_vars) == 1
+    for i in zero_vars:
+        assert i is model.v1
+
+
+@pytest.mark.unit
+def test_vars_near_zero(model):
+    model.v3.set_value(1e-5)
+
+    near_zero_vars = _vars_near_zero(model, variable_zero_value_tolerance=1e-5)
+    assert isinstance(near_zero_vars, ComponentSet)
+    assert len(near_zero_vars) == 2
+    for i in near_zero_vars:
+        assert i.local_name in ["v1", "v3"]
+
+    near_zero_vars = _vars_near_zero(model, variable_zero_value_tolerance=1e-6)
+    assert isinstance(near_zero_vars, ComponentSet)
+    assert len(near_zero_vars) == 1
+    for i in near_zero_vars:
+        assert i is model.v1
+
+
+@pytest.mark.unit
+def test_vars_with_none_value(model):
+    none_value = _vars_with_none_value(model)
+
+    assert isinstance(none_value, ComponentSet)
+    assert len(none_value) == 1
+    for i in none_value:
+        assert i is model.v4
+
+
+@pytest.mark.unit
+def test_vars_with_bounds_issues(model):
+    model.v1.setlb(2)
+    model.v1.setub(6)
+    model.v2.setlb(0)
+    model.v2.setub(10)
+    model.v4.set_value(10)
+    model.v4.setlb(0)
+    model.v4.setub(1)
+
+    bounds_issue = _vars_violating_bounds(model, tolerance=0)
+    assert isinstance(bounds_issue, ComponentSet)
+    assert len(bounds_issue) == 2
+    for i in bounds_issue:
+        assert i.local_name in ["v1", "v4"]
+
+    m = ConcreteModel()
+    m.v = Var(initialize=-1e-8, bounds=(0, 1))
+
+    bounds_issue = _vars_violating_bounds(m, tolerance=1e-6)
+    assert isinstance(bounds_issue, ComponentSet)
+    assert len(bounds_issue) == 0
+
+
+@pytest.mark.unit
+def test_vars_with_extreme_values():
+    m = ConcreteModel()
+    m.v1 = Var(initialize=1e-12)  # below zero
+    m.v2 = Var(initialize=1e-8)  # small
+    m.v3 = Var(initialize=1e-4)
+    m.v4 = Var(initialize=1e0)
+    m.v5 = Var(initialize=1e4)
+    m.v6 = Var(initialize=1e12)  # large
+    m.v7 = Var(initialize=1e8)
+
+    xvars = _vars_with_extreme_values(m, large=1e9, small=1e-7, zero=1e-10)
+
+    assert len(xvars) == 2
+    for i in xvars:
+        assert i.name in ["v2", 
"v6"] + + +@pytest.mark.unit +def test_write_report_section_all(): + stream = StringIO() + + _write_report_section( + stream=stream, + lines_list=["a", "b", "c"], + title="foo", + line_if_empty="bar", + end_line="baz", + header="-", + footer="=", + ) + + expected = """------------------------------------------------------------------------------------ +foo + + a + b + c + +baz +==================================================================================== +""" + assert stream.getvalue() == expected + + +@pytest.mark.unit +def test_write_report_section_no_lines(): + stream = StringIO() + + _write_report_section( + stream=stream, + lines_list=[], + title="foo", + line_if_empty="bar", + end_line="baz", + header="-", + footer="=", + ) + + expected = """------------------------------------------------------------------------------------ +foo + + bar + +baz +==================================================================================== +""" + assert stream.getvalue() == expected + + +@pytest.mark.unit +def test_write_report_section_lines_only(): + stream = StringIO() + + _write_report_section( + stream=stream, + lines_list=["a", "b", "c"], + ) + + expected = """------------------------------------------------------------------------------------ + a + b + c + +""" + assert stream.getvalue() == expected + + +@pytest.mark.unit +class TestStatsWriter: + def test_blocks(self): + m = ConcreteModel() + m.b1 = Block() + m.b1.b2 = Block() + m.b3 = Block() + m.b3.b4 = Block() + m.b5 = Block() + m.b5.b6 = Block() + + m.b3.b4.deactivate() + m.b5.deactivate() + + stats = _collect_model_statistics(m) + + assert len(stats) == 9 + assert stats[0] == " Activated Blocks: 4 (Deactivated: 3)" + assert ( + stats[1] == " Free Variables in Activated Constraints: 0 (External: 0)" + ) + assert stats[2] == " Free Variables with only lower bounds: 0" + assert stats[3] == " Free Variables with only upper bounds: 0" + assert stats[4] == " Free Variables with upper and lower bounds: 0" + 
assert ( + stats[5] == " Fixed Variables in Activated Constraints: 0 (External: 0)" + ) + assert stats[6] == " Activated Equality Constraints: 0 (Deactivated: 0)" + assert stats[7] == " Activated Inequality Constraints: 0 (Deactivated: 0)" + assert stats[8] == " Activated Objectives: 0 (Deactivated: 0)" + + def test_constraints(self): + m = ConcreteModel() + m.b1 = Block() + m.b2 = Block() + + m.b1.v1 = Var() + m.b1.v2 = Var() + m.b1.v3 = Var() + m.b1.v4 = Var() + + m.b1.c1 = Constraint(expr=m.b1.v1 == m.b1.v2) + m.b1.c2 = Constraint(expr=m.b1.v1 >= m.b1.v2) + m.b1.c3 = Constraint(expr=m.b1.v1 == m.b1.v2) + m.b1.c4 = Constraint(expr=m.b1.v1 >= m.b1.v2) + + m.b2.c1 = Constraint(expr=m.b1.v1 == m.b1.v2) + m.b2.c2 = Constraint(expr=m.b1.v1 >= m.b1.v2) + + m.b2.deactivate() + m.b1.c3.deactivate() + m.b1.c4.deactivate() + + stats = _collect_model_statistics(m) + + assert len(stats) == 9 + assert stats[0] == " Activated Blocks: 2 (Deactivated: 1)" + assert ( + stats[1] == " Free Variables in Activated Constraints: 2 (External: 0)" + ) + assert stats[2] == " Free Variables with only lower bounds: 0" + assert stats[3] == " Free Variables with only upper bounds: 0" + assert stats[4] == " Free Variables with upper and lower bounds: 0" + assert ( + stats[5] == " Fixed Variables in Activated Constraints: 0 (External: 0)" + ) + assert stats[6] == " Activated Equality Constraints: 1 (Deactivated: 1)" + assert stats[7] == " Activated Inequality Constraints: 1 (Deactivated: 1)" + assert stats[8] == " Activated Objectives: 0 (Deactivated: 0)" + + def test_objectives(self): + m = ConcreteModel() + m.b1 = Block() + m.b2 = Block() + + m.b1.o1 = Objective(expr=1) + m.b1.o2 = Objective(expr=1) + + m.b2.o1 = Objective(expr=1) + + m.b2.deactivate() + m.b1.o2.deactivate() + + stats = _collect_model_statistics(m) + + assert len(stats) == 9 + assert stats[0] == " Activated Blocks: 2 (Deactivated: 1)" + assert ( + stats[1] == " Free Variables in Activated Constraints: 0 (External: 0)" + ) + 
assert stats[2] == " Free Variables with only lower bounds: 0" + assert stats[3] == " Free Variables with only upper bounds: 0" + assert stats[4] == " Free Variables with upper and lower bounds: 0" + assert ( + stats[5] == " Fixed Variables in Activated Constraints: 0 (External: 0)" + ) + assert stats[6] == " Activated Equality Constraints: 0 (Deactivated: 0)" + assert stats[7] == " Activated Inequality Constraints: 0 (Deactivated: 0)" + assert stats[8] == " Activated Objectives: 1 (Deactivated: 1)" + + def test_free_variables(self): + m = ConcreteModel() + m.b1 = Block() + + m.v1 = Var(bounds=(0, 1)) + + m.b1.v2 = Var(bounds=(0, None)) + m.b1.v3 = Var(bounds=(None, 0)) + m.b1.v4 = Var(bounds=(0, 1)) + m.b1.v5 = Var() + m.b1.v6 = Var(bounds=(0, 1)) + + m.b1.c1 = Constraint(expr=0 == m.v1 + m.b1.v2 + m.b1.v3 + m.b1.v4 + m.b1.v5) + + stats = _collect_model_statistics(m.b1) + + assert len(stats) == 9 + assert stats[0] == " Activated Blocks: 1 (Deactivated: 0)" + assert ( + stats[1] == " Free Variables in Activated Constraints: 5 (External: 1)" + ) + assert stats[2] == " Free Variables with only lower bounds: 1" + assert stats[3] == " Free Variables with only upper bounds: 1" + assert stats[4] == " Free Variables with upper and lower bounds: 2" + assert ( + stats[5] == " Fixed Variables in Activated Constraints: 0 (External: 0)" + ) + assert stats[6] == " Activated Equality Constraints: 1 (Deactivated: 0)" + assert stats[7] == " Activated Inequality Constraints: 0 (Deactivated: 0)" + assert stats[8] == " Activated Objectives: 0 (Deactivated: 0)" + + def test_fixed_variables(self): + m = ConcreteModel() + m.b1 = Block() + + m.v1 = Var(bounds=(0, 1)) + + m.b1.v2 = Var(bounds=(0, None)) + m.b1.v3 = Var(bounds=(None, 0)) + m.b1.v4 = Var(bounds=(0, 1)) + m.b1.v5 = Var() + m.b1.v6 = Var(bounds=(0, 1)) + + m.b1.c1 = Constraint(expr=0 == m.v1 + m.b1.v2 + m.b1.v3 + m.b1.v4 + m.b1.v5) + + m.v1.fix(0.5) + m.b1.v2.fix(-0.5) + m.b1.v5.fix(-0.5) + + stats = 
_collect_model_statistics(m.b) + + assert len(stats) == 9 + assert stats[0] == " Activated Blocks: 1 (Deactivated: 0)" + assert ( + stats[1] == " Free Variables in Activated Constraints: 2 (External: 0)" + ) + assert stats[2] == " Free Variables with only lower bounds: 0" + assert stats[3] == " Free Variables with only upper bounds: 1" + assert stats[4] == " Free Variables with upper and lower bounds: 1" + assert ( + stats[5] == " Fixed Variables in Activated Constraints: 3 (External: 1)" + ) + assert stats[6] == " Activated Equality Constraints: 1 (Deactivated: 0)" + assert stats[7] == " Activated Inequality Constraints: 0 (Deactivated: 0)" + assert stats[8] == " Activated Objectives: 0 (Deactivated: 0)" + + +@pytest.mark.solver +class TestDiagnosticsToolbox: + @pytest.mark.unit + def test_invalid_model_type(self): + with pytest.raises( + TypeError, + match="model argument must be an instance of a Pyomo BlockData object " + r"\(either a scalar Block or an element of an indexed Block\).", + ): + DiagnosticsToolbox(model="foo") + + # Check for indexed Blocks + m = ConcreteModel() + m.s = Set(initialize=[1, 2, 3]) + m.b = Block(m.s) + + with pytest.raises( + TypeError, + match="model argument must be an instance of a Pyomo BlockData object " + r"\(either a scalar Block or an element of an indexed Block\).", + ): + DiagnosticsToolbox(model=m.b) + + @pytest.fixture(scope="class") + def model(self): + m = ConcreteModel() + m.b = Block() + + m.v1 = Var(units=units.m) # External variable + m.b.v2 = Var(units=units.m) + m.b.v3 = Var(bounds=(0, 5)) + m.b.v4 = Var() + m.b.v5 = Var(bounds=(0, 1)) + m.b.v6 = Var() + m.b.v7 = Var( + units=units.m, bounds=(0, 1) + ) # Poorly scaled variable with lower bound + m.b.v8 = Var() # unused variable + + m.b.c1 = Constraint(expr=m.v1 + m.b.v2 == 10) # Unit consistency issue + m.b.c2 = Constraint(expr=m.b.v3 == m.b.v4 + m.b.v5) + m.b.c3 = Constraint(expr=2 * m.b.v3 == 3 * m.b.v4 + 4 * m.b.v5 + m.b.v6) + m.b.c4 = Constraint(expr=m.b.v7 == 
2e-8 * m.v1) # Poorly scaled constraint + + m.b.v2.fix(5) + m.b.v5.fix(2) + m.b.v6.fix(0) + + solver = get_solver() + solver.solve(m) + + return m + + @pytest.mark.component + def test_display_external_variables(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_external_variables(stream) + + expected = """==================================================================================== +The following external variable(s) appear in constraints within the model: + + v1 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_unused_variables(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_unused_variables(stream) + + expected = """==================================================================================== +The following variable(s) do not appear in any activated constraints within the model: + + b.v8 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_fixed_to_zero(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_fixed_to_zero(stream) + + expected = """==================================================================================== +The following variable(s) are fixed to zero: + + b.v6 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_at_or_outside_bounds(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_at_or_outside_bounds(stream) + + expected = """==================================================================================== +The following variable(s) have 
values at or outside their bounds: + + b.v3 (free): value=0.0 bounds=(0, 5) + b.v5 (fixed): value=2 bounds=(0, 1) + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_with_none_value(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_with_none_value(stream) + + expected = """==================================================================================== +The following variable(s) have a value of None: + + b.v8 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_with_value_near_zero(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_with_value_near_zero(stream) + + expected = """==================================================================================== +The following variable(s) have a value close to zero: + + b.v3: value=0.0 + b.v6: value=0 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_with_extreme_values(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_with_extreme_values(stream) + + expected = """==================================================================================== +The following variable(s) have extreme values: + + b.v7: 1.0000939326524314e-07 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_near_bounds(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_variables_near_bounds(stream) + + 
expected = """==================================================================================== +The following variable(s) have values close to their bounds: + + b.v3: value=0.0 bounds=(0, 5) + b.v5: value=2 bounds=(0, 1) + b.v7: value=1.0000939326524314e-07 bounds=(0, 1) + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_components_with_inconsistent_units(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_components_with_inconsistent_units(stream) + + expected = """==================================================================================== +The following component(s) have unit consistency issues: + + b.c1 + +For more details on unit inconsistencies, import the assert_units_consistent method +from pyomo.util.check_units +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_constraints_with_large_residuals(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.display_constraints_with_large_residuals(stream) + + expected = """==================================================================================== +The following constraint(s) have large residuals: + + b.c2 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_get_dulmage_mendelsohn_partition(self, model): + # Clone model so we can add some singularities + m = model.clone() + + # Create structural singularities + m.b.v2.unfix() + m.b.v4.fix(2) + + # Add a second set of structural singularities + m.b.b2 = Block() + m.b.b2.v1 = Var() + m.b.b2.v2 = Var() + m.b.b2.v3 = Var() + m.b.b2.v4 = Var() + + m.b.b2.c1 = Constraint(expr=m.b.b2.v1 == m.b.b2.v2) + m.b.b2.c2 = 
Constraint(expr=2 * m.b.b2.v1 == 3 * m.b.b2.v2) + m.b.b2.c3 = Constraint(expr=m.b.b2.v3 == m.b.b2.v4) + + m.b.b2.v2.fix(42) + + dt = DiagnosticsToolbox(model=m.b) + + ( + uc_vblocks, + uc_cblocks, + oc_vblocks, + oc_cblocks, + ) = dt.get_dulmage_mendelsohn_partition() + + assert len(uc_vblocks) == 2 + assert len(uc_vblocks[0]) == 3 + for i in uc_vblocks[0]: + assert i.name in ["v1", "b.v2", "b.v7"] + assert len(uc_vblocks[1]) == 2 + for i in uc_vblocks[1]: + assert i.name in ["b.b2.v3", "b.b2.v4"] + + assert len(uc_cblocks) == 2 + assert len(uc_cblocks[0]) == 2 + for i in uc_cblocks[0]: + assert i.name in ["b.c1", "b.c4"] + assert len(uc_cblocks[1]) == 1 + for i in uc_cblocks[1]: + assert i.name in ["b.b2.c3"] + + assert len(oc_vblocks) == 2 + assert len(oc_vblocks[0]) == 1 + for i in oc_vblocks[0]: + assert i.name in ["b.v3"] + assert len(oc_vblocks[1]) == 1 + for i in oc_vblocks[1]: + assert i.name in ["b.b2.v1"] + + assert len(oc_cblocks) == 2 + assert len(oc_cblocks[0]) == 2 + for i in oc_cblocks[0]: + assert i.name in ["b.c2", "b.c3"] + assert len(oc_cblocks[1]) == 2 + for i in oc_cblocks[1]: + assert i.name in ["b.b2.c1", "b.b2.c2"] + + @pytest.mark.component + def test_display_underconstrained_set(self, model): + # Clone model so we can add some singularities + m = model.clone() + + # Create structural singularities + m.b.v2.unfix() + m.b.v4.fix(2) + + # Add a second set of structural singularities + m.b.b2 = Block() + m.b.b2.v1 = Var() + m.b.b2.v2 = Var() + m.b.b2.v3 = Var() + m.b.b2.v4 = Var() + + m.b.b2.c1 = Constraint(expr=m.b.b2.v1 == m.b.b2.v2) + m.b.b2.c2 = Constraint(expr=2 * m.b.b2.v1 == 3 * m.b.b2.v2) + m.b.b2.c3 = Constraint(expr=m.b.b2.v3 == m.b.b2.v4) + + m.b.b2.v2.fix(42) + + dt = DiagnosticsToolbox(model=m.b) + + stream = StringIO() + dt.display_underconstrained_set(stream) + + expected = """==================================================================================== +Dulmage-Mendelsohn Under-Constrained Set + + Independent Block 0: + 
+ Variables: + + b.v2 + v1 + b.v7 + + Constraints: + + b.c1 + b.c4 + + Independent Block 1: + + Variables: + + b.b2.v4 + b.b2.v3 + + Constraints: + + b.b2.c3 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_overconstrained_set(self, model): + # Clone model so we can add some singularities + m = model.clone() + + # Create structural singularities + m.b.v2.unfix() + m.b.v4.fix(2) + + # Add a second set of structural singularities + m.b.b2 = Block() + m.b.b2.v1 = Var() + m.b.b2.v2 = Var() + m.b.b2.v3 = Var() + m.b.b2.v4 = Var() + + m.b.b2.c1 = Constraint(expr=m.b.b2.v1 == m.b.b2.v2) + m.b.b2.c2 = Constraint(expr=2 * m.b.b2.v1 == 3 * m.b.b2.v2) + m.b.b2.c3 = Constraint(expr=m.b.b2.v3 == m.b.b2.v4) + + m.b.b2.v2.fix(42) + + dt = DiagnosticsToolbox(model=m.b) + + stream = StringIO() + dt.display_overconstrained_set(stream) + + expected = """==================================================================================== +Dulmage-Mendelsohn Over-Constrained Set + + Independent Block 0: + + Variables: + + b.v3 + + Constraints: + + b.c2 + b.c3 + + Independent Block 1: + + Variables: + + b.b2.v1 + + Constraints: + + b.b2.c1 + b.b2.c2 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_variables_with_extreme_jacobians(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var() + model.v3 = Var() + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + stream = StringIO() + dt.display_variables_with_extreme_jacobians(stream) + + expected = 
"""==================================================================================== +The following variable(s) are associated with extreme Jacobian values: + + v2: 1.000E+10 + v1: 1.000E+08 + v3: 1.000E-06 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_constraints_with_extreme_jacobians(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var() + model.v3 = Var() + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + stream = StringIO() + dt.display_constraints_with_extreme_jacobians(stream) + + expected = """==================================================================================== +The following constraint(s) are associated with extreme Jacobian values: + + c3: 1.000E+10 + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_display_extreme_jacobian_entries(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var() + model.v3 = Var() + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + stream = StringIO() + dt.display_extreme_jacobian_entries(stream) + + expected = """==================================================================================== +The following constraint(s) and variable(s) are associated with extreme Jacobian +values: + + c3, v2: 1.000E+10 + c2, v3: 1.000E-08 + c3, v1: 1.000E+08 + c3, v3: 1.000E-06 + 
+==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_collect_structural_warnings_base_case(self, model): + dt = DiagnosticsToolbox(model=model.b) + + warnings, next_steps = dt._collect_structural_warnings() + + assert len(warnings) == 1 + assert warnings == ["WARNING: 1 Component with inconsistent units"] + assert len(next_steps) == 1 + assert next_steps == ["display_components_with_inconsistent_units()"] + + @pytest.mark.component + def test_collect_structural_warnings_underconstrained(self, model): + # Clone model so we can add some singularities + m = model.clone() + + # Create structural singularities + m.b.v2.unfix() + + dt = DiagnosticsToolbox(model=m.b) + + warnings, next_steps = dt._collect_structural_warnings() + + assert len(warnings) == 3 + assert "WARNING: 1 Component with inconsistent units" in warnings + assert "WARNING: 1 Degree of Freedom" in warnings + assert ( + """WARNING: Structural singularity found + Under-Constrained Set: 3 variables, 2 constraints + Over-Constrained Set: 0 variables, 0 constraints""" + in warnings + ) + + assert len(next_steps) == 2 + assert "display_components_with_inconsistent_units()" in next_steps + assert "display_underconstrained_set()" in next_steps + + @pytest.mark.component + def test_collect_structural_warnings_overconstrained(self, model): + # Clone model so we can add some singularities + m = model.clone() + + # Fix units + m.b.del_component(m.b.c1) + m.b.c1 = Constraint(expr=m.v1 + m.b.v2 == 10 * units.m) + + # Create structural singularities + m.b.v4.fix(2) + + dt = DiagnosticsToolbox(model=m.b) + + warnings, next_steps = dt._collect_structural_warnings() + + assert len(warnings) == 2 + assert "WARNING: -1 Degree of Freedom" in warnings + assert ( + """WARNING: Structural singularity found + Under-Constrained Set: 0 variables, 0 constraints + Over-Constrained Set: 1 variables, 2 constraints""" + 
in warnings + ) + + assert len(next_steps) == 1 + assert "display_overconstrained_set()" in next_steps + + @pytest.mark.component + def test_collect_structural_cautions(self, model): + dt = DiagnosticsToolbox(model=model.b) + + cautions = dt._collect_structural_cautions() + + assert len(cautions) == 2 + assert "Caution: 1 variable fixed to 0" in cautions + assert "Caution: 1 unused variable (0 fixed)" in cautions + + @pytest.mark.component + def test_collect_numerical_warnings(self, model): + dt = DiagnosticsToolbox(model=model.b) + + warnings, next_steps = dt._collect_numerical_warnings() + + assert len(warnings) == 2 + assert "WARNING: 1 Constraint with large residuals" in warnings + assert "WARNING: 2 Variables at or outside bounds" in warnings + + assert len(next_steps) == 2 + assert "display_constraints_with_large_residuals()" in next_steps + assert "display_variables_at_or_outside_bounds()" in next_steps + + @pytest.mark.component + def test_collect_numerical_warnings_corrected(self, model): + m = model.clone() + + # Fix numerical issues + m.b.v3.setlb(-5) + m.b.v5.setub(10) + + solver = get_solver() + solver.solve(m) + + dt = DiagnosticsToolbox(model=m.b) + + warnings, next_steps = dt._collect_numerical_warnings() + + assert len(warnings) == 0 + + assert len(next_steps) == 0 + + @pytest.mark.component + def test_collect_numerical_warnings_jacobian(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var(initialize=0) + model.v3 = Var(initialize=0) + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + warnings, next_steps = dt._collect_numerical_warnings() + print(warnings) + assert len(warnings) == 3 + assert "WARNING: 2 Variables with extreme Jacobian values" in warnings + assert "WARNING: 1 Constraint with extreme Jacobian values" in warnings + 
assert "WARNING: 1 Constraint with large residuals" in warnings + + assert len(next_steps) == 3 + assert "display_variables_with_extreme_jacobians()" in next_steps + assert "display_constraints_with_extreme_jacobians()" in next_steps + assert "display_constraints_with_large_residuals()" in next_steps + + @pytest.mark.component + def test_collect_numerical_cautions(self, model): + dt = DiagnosticsToolbox(model=model.b) + + cautions = dt._collect_numerical_cautions() + print(cautions) + assert len(cautions) == 5 + assert "Caution: 3 Variables with value close to their bounds" in cautions + assert "Caution: 2 Variables with value close to zero" in cautions + assert "Caution: 1 Variable with None value" in cautions + assert "Caution: 1 extreme Jacobian Entry" in cautions + assert "Caution: 1 Variable with extreme value" in cautions + + @pytest.mark.component + def test_collect_numerical_cautions_jacobian(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var(initialize=0) + model.v3 = Var(initialize=0) + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + cautions = dt._collect_numerical_cautions() + print(cautions) + assert len(cautions) == 4 + assert "Caution: 3 Variables with value close to zero" in cautions + assert "Caution: 3 Variables with extreme Jacobian values" in cautions + assert "Caution: 1 Constraint with extreme Jacobian values" in cautions + assert "Caution: 4 extreme Jacobian Entries" in cautions + + @pytest.mark.component + def test_assert_no_structural_warnings(self, model): + m = model.clone() + dt = DiagnosticsToolbox(model=m.b) + + with pytest.raises(AssertionError, match=r"Structural issues found \(1\)."): + dt.assert_no_structural_warnings() + + # Fix units issue + m.b.del_component(m.b.c1) + m.b.c1 = Constraint(expr=m.v1 + m.b.v2 == 10 
* units.m) + dt.assert_no_structural_warnings() + + @pytest.mark.component + def test_assert_no_numerical_warnings(self, model): + m = model.clone() + dt = DiagnosticsToolbox(model=m.b) + + with pytest.raises(AssertionError, match=r"Numerical issues found \(2\)."): + dt.assert_no_numerical_warnings() + + # Fix numerical issues + m.b.v3.setlb(-5) + m.b.v5.setub(10) + + solver = get_solver() + solver.solve(m) + + dt = DiagnosticsToolbox(model=m.b) + dt.assert_no_numerical_warnings() + + @pytest.mark.component + def test_report_structural_issues(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + dt.report_structural_issues(stream) + + expected = """==================================================================================== +Model Statistics + + Activated Blocks: 1 (Deactivated: 0) + Free Variables in Activated Constraints: 4 (External: 1) + Free Variables with only lower bounds: 0 + Free Variables with only upper bounds: 0 + Free Variables with upper and lower bounds: 2 + Fixed Variables in Activated Constraints: 3 (External: 0) + Activated Equality Constraints: 4 (Deactivated: 0) + Activated Inequality Constraints: 0 (Deactivated: 0) + Activated Objectives: 0 (Deactivated: 0) + +------------------------------------------------------------------------------------ +1 WARNINGS + + WARNING: 1 Component with inconsistent units + +------------------------------------------------------------------------------------ +2 Cautions + + Caution: 1 variable fixed to 0 + Caution: 1 unused variable (0 fixed) + +------------------------------------------------------------------------------------ +Suggested next steps: + + display_components_with_inconsistent_units() + +==================================================================================== +""" + + assert stream.getvalue() == expected + + @pytest.mark.component + def test_report_numerical_issues(self, model): + dt = DiagnosticsToolbox(model=model.b) + + stream = StringIO() + 
dt.report_numerical_issues(stream) + + expected = """==================================================================================== +Model Statistics + + Jacobian Condition Number: 1.700E+01 + +------------------------------------------------------------------------------------ +2 WARNINGS + + WARNING: 1 Constraint with large residuals + WARNING: 2 Variables at or outside bounds + +------------------------------------------------------------------------------------ +5 Cautions + + Caution: 3 Variables with value close to their bounds + Caution: 2 Variables with value close to zero + Caution: 1 Variable with extreme value + Caution: 1 Variable with None value + Caution: 1 extreme Jacobian Entry + +------------------------------------------------------------------------------------ +Suggested next steps: + + display_constraints_with_large_residuals() + display_variables_at_or_outside_bounds() + +==================================================================================== +""" + print(stream.getvalue()) + assert stream.getvalue() == expected + + @pytest.mark.component + def test_report_numerical_issues_jacobian(self): + model = ConcreteModel() + model.v1 = Var(initialize=1e-8) + model.v2 = Var(initialize=0) + model.v3 = Var(initialize=0) + + model.c1 = Constraint(expr=model.v1 == model.v2) + model.c2 = Constraint(expr=model.v1 == 1e-8 * model.v3) + model.c3 = Constraint(expr=1e8 * model.v1 + 1e10 * model.v2 == 1e-6 * model.v3) + + dt = DiagnosticsToolbox(model=model) + + stream = StringIO() + dt.report_numerical_issues(stream) + + expected = """==================================================================================== +Model Statistics + + Jacobian Condition Number: 1.407E+18 + +------------------------------------------------------------------------------------ +3 WARNINGS + + WARNING: 1 Constraint with large residuals + WARNING: 2 Variables with extreme Jacobian values + WARNING: 1 Constraint with extreme Jacobian values + 
+------------------------------------------------------------------------------------ +4 Cautions + + Caution: 3 Variables with value close to zero + Caution: 3 Variables with extreme Jacobian values + Caution: 1 Constraint with extreme Jacobian values + Caution: 4 extreme Jacobian Entries + +------------------------------------------------------------------------------------ +Suggested next steps: + + display_constraints_with_large_residuals() + display_variables_with_extreme_jacobians() + display_constraints_with_extreme_jacobians() + +==================================================================================== +""" + print(stream.getvalue()) + assert stream.getvalue() == expected @pytest.fixture() def dummy_problem(): - m = pyo.ConcreteModel() + m = ConcreteModel() - m.I = pyo.Set(initialize=[i for i in range(5)]) + m.I = Set(initialize=[i for i in range(5)]) - m.x = pyo.Var(m.I, initialize=1.0) + m.x = Var(m.I, initialize=1.0) diag = [100, 1, 10, 0.1, 5] out = [1, 1, 1, 1, 1] @@ -59,7 +1238,7 @@ def dummy_problem(): def dummy_eqn(b, i): return out[i] == diag[i] * m.x[i] - m.obj = pyo.Objective(expr=0) + m.obj = Objective(expr=0) return m @@ -82,6 +1261,22 @@ def v_exp(): ).T +@pytest.mark.unit +def test_deprecate_degeneracy_hunter(caplog): + m = ConcreteModel() + m.v = Var() + m.o = Objective(expr=m.v) + dh = DegeneracyHunter(m) + + msg = ( + "DEPRECATED: DegeneracyHunter is being deprecated in favor of the new " + "DiagnosticsToolbox. 
(deprecated in 2.2.0, will be removed in (or after) 3.0.0)"
+    )
+    assert msg.replace(" ", "") in caplog.records[0].message.replace("\n", "").replace(
+        " ", ""
+    )
+
+
 @pytest.mark.skipif(
     not AmplInterface.available(), reason="pynumero_ASL is not available"
 )
@@ -208,10 +1403,10 @@ def test_sv_value_error(dummy_problem):
     )


 @pytest.mark.unit
 def test_single_eq_error(capsys):
-    m = pyo.ConcreteModel()
-    m.x = pyo.Var(initialize=1)
-    m.con = pyo.Constraint(expr=(2 * m.x == 1))
-    m.obj = pyo.Objective(expr=0)
+    m = ConcreteModel()
+    m.x = Var(initialize=1)
+    m.con = Constraint(expr=(2 * m.x == 1))
+    m.obj = Objective(expr=0)

     dh = DegeneracyHunter(m)

     with pytest.raises(
@@ -232,17 +1427,17 @@ def test_single_eq_error(capsys):
 # This was from
 # @pytest.fixture()
 def problem1():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()

-    m.I = pyo.Set(initialize=[i for i in range(5)])
+    m.I = Set(initialize=[i for i in range(5)])

-    m.x = pyo.Var(m.I, bounds=(-10, 10), initialize=1.0)
+    m.x = Var(m.I, bounds=(-10, 10), initialize=1.0)

-    m.con1 = pyo.Constraint(expr=m.x[0] + m.x[1] - m.x[3] >= 10)
-    m.con2 = pyo.Constraint(expr=m.x[0] * m.x[3] + m.x[1] >= 0)
-    m.con3 = pyo.Constraint(expr=m.x[4] * m.x[3] + m.x[0] * m.x[3] - m.x[4] == 0)
+    m.con1 = Constraint(expr=m.x[0] + m.x[1] - m.x[3] >= 10)
+    m.con2 = Constraint(expr=m.x[0] * m.x[3] + m.x[1] >= 0)
+    m.con3 = Constraint(expr=m.x[4] * m.x[3] + m.x[0] * m.x[3] - m.x[4] == 0)

-    m.obj = pyo.Objective(expr=sum(m.x[i] ** 2 for i in m.I))
+    m.obj = Objective(expr=sum(m.x[i] ** 2 for i in m.I))

     return m
@@ -257,21 +1452,21 @@ def example2(with_degenerate_constraint=True):
         m2: Pyomo model

     """
-    m2 = pyo.ConcreteModel()
+    m2 = ConcreteModel()

-    m2.I = pyo.Set(initialize=[i for i in range(1, 4)])
+    m2.I = Set(initialize=[i for i in range(1, 4)])

-    m2.x = pyo.Var(m2.I, bounds=(0, 5), initialize=1.0)
+    m2.x = Var(m2.I, bounds=(0, 5), initialize=1.0)

-    m2.con1 = pyo.Constraint(expr=m2.x[1] + m2.x[2] >= 1)
-    m2.con2 = pyo.Constraint(expr=m2.x[1] + m2.x[2] + m2.x[3] == 1)
-    m2.con3 = pyo.Constraint(expr=m2.x[2] - 2 * m2.x[3] <= 1)
-    m2.con4 = pyo.Constraint(expr=m2.x[1] + m2.x[3] >= 1)
+    m2.con1 = Constraint(expr=m2.x[1] + m2.x[2] >= 1)
+    m2.con2 = Constraint(expr=m2.x[1] + m2.x[2] + m2.x[3] == 1)
+    m2.con3 = Constraint(expr=m2.x[2] - 2 * m2.x[3] <= 1)
+    m2.con4 = Constraint(expr=m2.x[1] + m2.x[3] >= 1)

     if with_degenerate_constraint:
-        m2.con5 = pyo.Constraint(expr=m2.x[1] + m2.x[2] + m2.x[3] == 1)
+        m2.con5 = Constraint(expr=m2.x[1] + m2.x[2] + m2.x[3] == 1)

-    m2.obj = pyo.Objective(expr=sum(m2.x[i] for i in m2.I))
+    m2.obj = Objective(expr=sum(m2.x[i] for i in m2.I))

     return m2
@@ -297,14 +1492,14 @@ def extract_constraint_names(cs):
 @pytest.mark.skipif(
     not AmplInterface.available(), reason="pynumero_ASL is not available"
 )
-@pytest.mark.skipif(not pyo.SolverFactory("ipopt").available(False), reason="no Ipopt")
+@pytest.mark.skipif(not SolverFactory("ipopt").available(False), reason="no Ipopt")
 @pytest.mark.unit
 def test_problem1():
     # Create test problem
     m = problem1()

     # Specify Ipopt as the solver
-    opt = pyo.SolverFactory("ipopt")
+    opt = SolverFactory("ipopt")

     # Specifying an iteration limit of 0 allows us to inspect the initial point
     opt.options["max_iter"] = 0
@@ -352,14 +1547,14 @@ def test_problem1():
 @pytest.mark.skipif(
     not AmplInterface.available(), reason="pynumero_ASL is not available"
 )
-@pytest.mark.skipif(not pyo.SolverFactory("ipopt").available(False), reason="no Ipopt")
+@pytest.mark.skipif(not SolverFactory("ipopt").available(False), reason="no Ipopt")
 @pytest.mark.unit
 def test_problem2_without_degenerate_constraint():
     # Create test problem instance
     m2 = example2(with_degenerate_constraint=False)

     # Specify Ipopt as the solver
-    opt = pyo.SolverFactory("ipopt")
+    opt = SolverFactory("ipopt")

     # Specifying an iteration limit of 0 allows us to inspect the initial point
     opt.options["max_iter"] = 0
@@ -401,14 +1596,14 @@ def test_problem2_without_degenerate_constraint():
 @pytest.mark.skipif(
     not AmplInterface.available(), reason="pynumero_ASL is not available"
 )
-@pytest.mark.skipif(not pyo.SolverFactory("ipopt").available(False), reason="no Ipopt")
+@pytest.mark.skipif(not SolverFactory("ipopt").available(False), reason="no Ipopt")
 @pytest.mark.unit
 def test_problem2_with_degenerate_constraint():
     # Create test problem instance
     m2 = example2(with_degenerate_constraint=True)

     # Specify Ipopt as the solver
-    opt = pyo.SolverFactory("ipopt")
+    opt = SolverFactory("ipopt")

     # Specifying an iteration limit of 0 allows us to inspect the initial point
     opt.options["max_iter"] = 0
@@ -460,7 +1655,7 @@ def test_problem2_with_degenerate_constraint():
 @pytest.mark.unit
 def test_get_valid_range_of_component():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -478,7 +1673,7 @@ def test_get_valid_range_of_component():
 @pytest.mark.unit
 def test_get_valid_range_of_component_no_metadata():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     with pytest.raises(
@@ -489,7 +1684,7 @@ def test_get_valid_range_of_component_no_metadata():
 @pytest.mark.unit
 def test_get_valid_range_of_component_no_metadata_entry(caplog):
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -506,7 +1701,7 @@ def test_get_valid_range_of_component_no_metadata_entry(caplog):
 @pytest.mark.unit
 def test_set_bounds_from_valid_range_scalar():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -528,7 +1723,7 @@ def test_set_bounds_from_valid_range_scalar():
 @pytest.mark.unit
 def test_set_bounds_from_valid_range_indexed():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -553,7 +1748,7 @@ def test_set_bounds_from_valid_range_indexed():
 @pytest.mark.unit
 def test_set_bounds_from_valid_range_block():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -572,10 +1767,10 @@ def test_set_bounds_from_valid_range_block():
 @pytest.mark.unit
 def test_set_bounds_from_valid_range_invalid_type():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

-    m.fs.foo = pyo.Expression(expr=1)
+    m.fs.foo = Expression(expr=1)

     with pytest.raises(
         TypeError,
@@ -586,7 +1781,7 @@ def test_set_bounds_from_valid_range_invalid_type():
 @pytest.mark.unit
 def test_list_components_with_values_outside_valid_range_scalar():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -624,7 +1819,7 @@ def test_list_components_with_values_outside_valid_range_scalar():
 @pytest.mark.unit
 def test_list_components_with_values_outside_valid_range_indexed():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -653,7 +1848,7 @@ def test_list_components_with_values_outside_valid_range_indexed():
 @pytest.mark.unit
 def test_list_components_with_values_outside_valid_range_block():
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()
     m.fs = FlowsheetBlock()

     m.fs.params = PhysicalParameterTestBlock()
@@ -683,11 +1878,11 @@ def test_list_components_with_values_outside_valid_range_block():
 @pytest.mark.component
 def test_ipopt_solve_halt_on_error(capsys):
-    m = pyo.ConcreteModel()
+    m = ConcreteModel()

-    m.v = pyo.Var(initialize=-5, bounds=(None, -1))
-    m.e = pyo.Expression(expr=pyo.log(m.v))
-    m.c = pyo.Constraint(expr=m.e == 1)
+    m.v = Var(initialize=-5, bounds=(None, -1))
+    m.e = Expression(expr=log(m.v))
+    m.c = Constraint(expr=m.e == 1)

     try:
         results = ipopt_solve_halt_on_error(m)
diff --git a/idaes/core/util/tests/test_model_statistics.py b/idaes/core/util/tests/test_model_statistics.py
index 02b85a6220..3f45d08809 100644
--- a/idaes/core/util/tests/test_model_statistics.py
+++ b/idaes/core/util/tests/test_model_statistics.py
@@ -312,76 +312,154 @@ def test_number_unfixed_variables(m):

 @pytest.mark.unit
-def test_variables_near_bounds_set(m):
-    tset = variables_near_bounds_set(m)
-    assert len(tset) == 6
-    for i in tset:
-        assert i in ComponentSet(
-            [
-                m.b2["a"].v1,
-                m.b2["b"].v1,
-                m.b2["a"].v2["a"],
-                m.b2["a"].v2["b"],
-                m.b2["b"].v2["a"],
-                m.b2["b"].v2["b"],
-            ]
-        )
-
-    m.b2["a"].v1.value = 1.001
-    tset = variables_near_bounds_set(m)
-    assert len(tset) == 5
-    for i in tset:
-        assert i in ComponentSet(
-            [
-                m.b2["b"].v1,
-                m.b2["a"].v2["a"],
-                m.b2["a"].v2["b"],
-                m.b2["b"].v2["a"],
-                m.b2["b"].v2["b"],
-            ]
-        )
-
-    tset = variables_near_bounds_set(m, tol=1e-3)
-    assert len(tset) == 6
-    for i in tset:
-        assert i in ComponentSet(
-            [
-                m.b2["a"].v1,
-                m.b2["b"].v1,
-                m.b2["a"].v2["a"],
-                m.b2["a"].v2["b"],
-                m.b2["b"].v2["a"],
-                m.b2["b"].v2["b"],
-            ]
-        )
-
-    m.b2["a"].v1.setlb(None)
-    tset = variables_near_bounds_set(m)
-    assert len(tset) == 5
-    for i in tset:
-        assert i in ComponentSet(
-            [
-                m.b2["b"].v1,
-                m.b2["a"].v2["a"],
-                m.b2["a"].v2["b"],
-                m.b2["b"].v2["a"],
-                m.b2["b"].v2["b"],
-            ]
-        )
-
-    m.b2["a"].v2["a"].setub(None)
-    tset = variables_near_bounds_set(m)
-    assert len(tset) == 4
-    for i in tset:
-        assert i in ComponentSet(
-            [m.b2["b"].v1, m.b2["a"].v2["b"], m.b2["b"].v2["a"], m.b2["b"].v2["b"]]
-        )
-
-    m.b2["a"].v2["b"].value = None
-    tset = variables_near_bounds_set(m)
-    assert len(tset) == 3
-    for i in tset:
-        assert i in ComponentSet([m.b2["b"].v1, m.b2["b"].v2["a"], m.b2["b"].v2["b"]])
+def test_variables_near_bounds_tol_deprecation(m, caplog):
+    variables_near_bounds_set(m, tol=1e-3)
+
+    msg = (
+        "DEPRECATED: variables_near_bounds_generator has deprecated the tol argument. "
+        "Please set abs_tol and rel_tol arguments instead. (deprecated in "
+        "2.2.0, will be removed in (or after) 3.0.0)"
+    )
+    assert msg.replace(" ", "") in caplog.records[0].message.replace("\n", "").replace(
+        " ", ""
+    )
+
+
+@pytest.mark.unit
+def test_variables_near_bounds_relative_deprecation(m, caplog):
+    variables_near_bounds_set(m, relative=False)
+
+    msg = (
+        "DEPRECATED: variables_near_bounds_generator has deprecated the relative argument. "
+        "Please set abs_tol and rel_tol arguments instead. (deprecated in "
+        "2.2.0, will be removed in (or after) 3.0.0)"
+    )
+    assert msg.replace(" ", "") in caplog.records[0].message.replace("\n", "").replace(
+        " ", ""
+    )
+
+
+@pytest.mark.unit
+def test_variables_near_bounds_set():
+    m = ConcreteModel()
+    m.v = Var(initialize=0.5, bounds=(0, 1))
+
+    # Small value, both bounds
+    # Away from bounds
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near lower bound, relative
+    m.v.set_value(1e-6)
+    vset = variables_near_bounds_set(m, abs_tol=1e-8, rel_tol=1e-4)
+    assert len(vset) == 1
+
+    # Near lower bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1e-4, rel_tol=1e-8)
+    assert len(vset) == 1
+
+    # Near upper bound, relative
+    m.v.set_value(1 - 1e-6)
+    vset = variables_near_bounds_set(m, abs_tol=1e-8, rel_tol=1e-4)
+    assert len(vset) == 1
+
+    # Near upper bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1e-4, rel_tol=1e-8)
+    assert len(vset) == 1
+
+    # Small value, lower bound
+    # Away from bounds
+    m.v.set_value(0.5)
+    m.v.setub(None)
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near lower bound, relative
+    m.v.set_value(1e-6)
+    vset = variables_near_bounds_set(m, abs_tol=1e-8, rel_tol=1e-4)
+    # Lower bound of 0 means relative tolerance is 0
+    assert len(vset) == 0
+
+    # Near lower bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1e-4, rel_tol=1e-8)
+    assert len(vset) == 1
+
+    # Small value, upper bound
+    # Away from bounds
+    m.v.set_value(0.5)
+    m.v.setub(1)
+    m.v.setlb(None)
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near upper bound, relative
+    m.v.set_value(1 - 1e-6)
+    vset = variables_near_bounds_set(m, abs_tol=1e-8, rel_tol=1e-4)
+    assert len(vset) == 1
+
+    # Near upper bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1e-4, rel_tol=1e-8)
+    assert len(vset) == 1
+
+    # Large value, both bounds
+    # Relative tolerance based on magnitude of 100
+    m.v.setlb(450)
+    m.v.setub(550)
+    m.v.set_value(500)
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near lower bound, relative
+    m.v.set_value(451)
+    vset = variables_near_bounds_set(m, rel_tol=1e-2)
+    assert len(vset) == 1
+
+    # Near lower bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1)
+    assert len(vset) == 1
+
+    # Near upper bound, relative
+    m.v.set_value(549)
+    vset = variables_near_bounds_set(m, rel_tol=1e-2)
+    assert len(vset) == 1
+
+    # Near upper bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1)
+    assert len(vset) == 1
+
+    # Large value, lower bound
+    # Relative tolerance based on magnitude of 450
+    m.v.setlb(450)
+    m.v.setub(None)
+    m.v.set_value(500)
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near lower bound, relative
+    m.v.set_value(451)
+    vset = variables_near_bounds_set(m, rel_tol=1e-2)
+    assert len(vset) == 1
+
+    # Near lower bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1)
+    assert len(vset) == 1
+
+    # Large value, upper bound
+    # Relative tolerance based on magnitude of 550
+    m.v.setlb(None)
+    m.v.setub(550)
+    m.v.set_value(500)
+    vset = variables_near_bounds_set(m)
+    assert len(vset) == 0
+
+    # Near upper bound, relative
+    m.v.set_value(549)
+    vset = variables_near_bounds_set(m, rel_tol=1e-2)
+    assert len(vset) == 1
+
+    # Near upper bound, absolute
+    vset = variables_near_bounds_set(m, abs_tol=1)
+    assert len(vset) == 1


 @pytest.mark.unit