Fix caplog handler level restoration (swev-id: pytest-dev__pytest-7571)#38
casey-brooks wants to merge 2 commits into pytest-dev__pytest-7571 from
Conversation
Local Validation
noa-lucent left a comment
Resetting the shared caplog handler before every phase breaks callers that configure logging during setup; please limit the reset to the first phase (setup) and cover the scenario with a regression test.
src/_pytest/logging.py (outdated)

    def _runtest_for(self, item: nodes.Item, when: str) -> Generator[None, None, None]:
        """Implements the internals of pytest_runtest_xxx() hook."""
        if self.log_level is None:
[major] Resetting the shared handler unconditionally here runs on every phase (setup/call/teardown). That wipes out levels a test deliberately applies earlier in the same test, for example a fixture that calls caplog.set_level() during setup so the test body runs with that filtering. On the base branch that works because the handler keeps the level until finalization; with this change the call phase starts at NOTSET, so those fixtures silently lose their effect. Please gate the reset so it only runs for the first phase of each test (e.g. when == "setup"), and add a regression test covering a fixture that calls caplog.set_level() during setup to prove the level remains active for the test body.
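A minimal sketch of the gating the reviewer asks for. The class and attribute names below are illustrative stand-ins, not pytest's actual internals; the point is only that the reset runs once per test, during setup, so a level applied by a setup-phase fixture survives into the call phase:

```python
import logging
from typing import Generator


class CaptureHandler(logging.StreamHandler):
    """Stand-in for pytest's shared log-capture handler."""


class LoggingPlugin:
    def __init__(self) -> None:
        self.log_level = None  # no global log level configured
        self.handler = CaptureHandler()

    def _runtest_for(self, when: str) -> Generator[None, None, None]:
        # Reset only at the start of the test (setup phase), so levels
        # applied by setup-phase fixtures survive into the call phase.
        if self.log_level is None and when == "setup":
            self.handler.setLevel(logging.NOTSET)
        yield


plugin = LoggingPlugin()

# Setup phase: the reset runs, then a fixture raises the level.
for _ in plugin._runtest_for("setup"):
    plugin.handler.setLevel(logging.INFO)

# Call phase: the fixture-applied level is still in effect.
for _ in plugin._runtest_for("call"):
    assert plugin.handler.level == logging.INFO
```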
Local Validation (update)
noa-lucent left a comment
Looks good. Resetting only during setup preserves fixture-configured levels, and the added regression test covers that scenario alongside the broader handler restoration cases.
Summary
The handler level is reset to NOTSET when no global log level is configured.
Observed Failure and Reproduction
Two tests run in sequence: the first calls caplog.set_level(42) and the second asserts caplog.handler.level == logging.NOTSET.
Testing
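The failure described above can be sketched without pytest, using a plain logging.Handler to show why the level must be restored between tests (the variable names here are illustrative):

```python
import logging

# Shared across tests, like caplog's capture handler.
handler = logging.Handler()

# First test: caplog.set_level(42) raises the shared handler's level.
handler.setLevel(42)
assert handler.level == 42

# Without restoration, the second test would observe the leaked level.
leaked_level = handler.level

# The fix: reset to NOTSET during the next test's setup phase
# when no global log level is configured.
handler.setLevel(logging.NOTSET)

# The second test's assertion now holds.
assert handler.level == logging.NOTSET
```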