warning in test: tests/tests_automl/test_dir_change.py::AutoMLDirChangeTest::test_compute_predictions_after_dir_change #751

Closed
a-szulc opened this issue Aug 23, 2024 · 1 comment


a-szulc commented Aug 23, 2024

============================= test session starts ==============================
platform linux -- Python 3.12.3, pytest-8.3.2, pluggy-1.5.0 -- /home/adas/mljar/mljar-supervised/venv/bin/python3
cachedir: .pytest_cache
rootdir: /home/adas/mljar/mljar-supervised
configfile: pytest.ini
plugins: cov-5.0.0
collecting ... collected 1 item

tests/tests_automl/test_dir_change.py::AutoMLDirChangeTest::test_compute_predictions_after_dir_change AutoML directory: automl_testing_A/automl_testing
The task is regression with evaluation metric rmse
AutoML will use algorithms: ['Baseline', 'Linear', 'Decision Tree', 'Random Forest', 'Xgboost', 'Neural Network']
AutoML will ensemble available models
AutoML steps: ['simple_algorithms', 'default_algorithms', 'ensemble']
* Step simple_algorithms will try to check up to 3 models
1_Baseline rmse 141.234462 trained in 0.25 seconds
2_DecisionTree rmse 89.485706 trained in 0.24 seconds
3_Linear rmse 0.0 trained in 0.25 seconds
* Step default_algorithms will try to check up to 3 models
4_Default_Xgboost rmse 55.550016 trained in 0.33 seconds
5_Default_NeuralNetwork rmse 9.201656 trained in 0.29 seconds
6_Default_RandomForest rmse 82.006449 trained in 0.57 seconds
* Step ensemble will try to check up to 1 model
Ensemble rmse 0.0 trained in 0.17 seconds
AutoML fit time: 5.97 seconds
AutoML best model: 3_Linear
FAILED

=================================== FAILURES ===================================
________ AutoMLDirChangeTest.test_compute_predictions_after_dir_change _________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_and_report.<locals>.<lambda> at 0x7d808995b2e0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: Callable[[], TResult],
        when: Literal["collect", "setup", "call", "teardown"],
        reraise: type[BaseException] | tuple[type[BaseException], ...] | None = None,
    ) -> CallInfo[TResult]:
        """Call func, wrapping the result in a CallInfo.
    
        :param func:
            The function to call. Called without arguments.
        :type func: Callable[[], _pytest.runner.TResult]
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: TResult | None = func()

venv/lib/python3.12/site-packages/_pytest/runner.py:341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
venv/lib/python3.12/site-packages/_pytest/runner.py:242: in <lambda>
    lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
venv/lib/python3.12/site-packages/pluggy/_hooks.py:513: in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
venv/lib/python3.12/site-packages/pluggy/_manager.py:120: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
venv/lib/python3.12/site-packages/_pytest/threadexception.py:92: in pytest_runtest_call
    yield from thread_exception_runtest_hook()
venv/lib/python3.12/site-packages/_pytest/threadexception.py:68: in thread_exception_runtest_hook
    yield
venv/lib/python3.12/site-packages/_pytest/unraisableexception.py:95: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
        with catch_unraisable_exception() as cm:
            try:
                yield
            finally:
                if cm.unraisable:
                    if cm.unraisable.err_msg is not None:
                        err_msg = cm.unraisable.err_msg
                    else:
                        err_msg = "Exception ignored in"
                    msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
                    msg += "".join(
                        traceback.format_exception(
                            cm.unraisable.exc_type,
                            cm.unraisable.exc_value,
                            cm.unraisable.exc_traceback,
                        )
                    )
>                   warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E                   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <_io.FileIO [closed]>
E                   
E                   Traceback (most recent call last):
E                     File "/home/adas/mljar/mljar-supervised/supervised/base_automl.py", line 218, in load
E                       self._data_info = json.load(open(data_info_path))
E                                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                   ResourceWarning: unclosed file <_io.TextIOWrapper name='automl_testing_B/automl_testing/data_info.json' mode='r' encoding='UTF-8'>

venv/lib/python3.12/site-packages/_pytest/unraisableexception.py:85: PytestUnraisableExceptionWarning
=========================== short test summary info ============================
FAILED tests/tests_automl/test_dir_change.py::AutoMLDirChangeTest::test_compute_predictions_after_dir_change
============================== 1 failed in 8.06s ===============================
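The warning points at supervised/base_automl.py line 218, where data_info.json is opened inline and the file object is left for the garbage collector to close: self._data_info = json.load(open(data_info_path)). Python 3.12 reports the unclosed handle as a ResourceWarning, pytest's unraisable-exception hook wraps it in PytestUnraisableExceptionWarning, and the run fails because warnings are escalated to errors (presumably via a filterwarnings setting in the project's pytest.ini shown above). Below is a minimal sketch of the kind of change that avoids the warning; it is not necessarily the patch that landed in #765, and load_data_info is a hypothetical helper named only for illustration.

    import json

    # Hypothetical helper for illustration; the real code sits inside
    # AutoML's load() in supervised/base_automl.py.
    def load_data_info(data_info_path):
        # A context manager closes the handle deterministically, so Python 3.12
        # has no unclosed file object to report as a ResourceWarning.
        with open(data_info_path, "r", encoding="utf-8") as fin:
            return json.load(fin)

    # The reported pattern, by contrast, leaves the file object returned by
    # open() to be closed whenever the garbage collector gets to it:
    #     self._data_info = json.load(open(data_info_path))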

a-szulc commented Aug 29, 2024

Fixed in #765.

a-szulc closed this as completed on Aug 29, 2024