20 changes: 20 additions & 0 deletions src/backtest/runner.py
@@ -514,6 +514,26 @@ def run_all(self, only_cached: bool = False) -> list[BestResult]:
```python
            else:
                search_space[name] = options

        n_params = len(search_space)
        dof_multiplier = self.cfg.param_dof_multiplier
        min_bars_floor = self.cfg.param_min_bars
        min_bars_for_optimization = max(min_bars_floor, dof_multiplier * n_params)
        if search_space and len(df) < min_bars_for_optimization:
            self.logger.info(
                "skipping optimization due to insufficient bars",
                collection=col.name,
                symbol=symbol,
                timeframe=timeframe,
                bars=len(df),
                min_bars=min_bars_for_optimization,
                n_params=n_params,
                dof_multiplier=dof_multiplier,
                min_bars_floor=min_bars_floor,
                strategy=strat.name,
                search_method=search_method,
            )
            search_space = {}
```
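The gate's threshold is a degrees-of-freedom heuristic: require at least `param_dof_multiplier` bars per tunable parameter, but never fewer than `param_min_bars`. A standalone sketch of the arithmetic (the function name is illustrative, not from the PR):

```python
def min_bars_for_optimization(n_params: int,
                              dof_multiplier: int = 100,
                              min_bars_floor: int = 2000) -> int:
    """Bars required before parameter optimization is allowed.

    Scales with the number of tunable parameters, floored at a
    configured minimum so small search spaces still need real data.
    """
    return max(min_bars_floor, dof_multiplier * n_params)
```

With the defaults, a 3-parameter grid still needs 2000 bars (the floor dominates), while a 30-parameter grid needs 3000.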
Search space parameters silently dropped when gate triggers (High Severity)

When the data-sufficiency gate triggers and clears `search_space = {}`, parameters that had multiple options are silently lost. They are never transferred into `fixed_params`, so the subsequent `evaluate({})` call constructs `full_params` without them. The strategy then runs with missing parameters, likely producing incorrect results or failures. The search-space parameters need default values (e.g., the first option from each list) added to `fixed_params` before clearing `search_space`.

P1: Keep tunable params when skipping optimization

When data is below the new threshold, search_space is cleared entirely, so run_all() later calls evaluate({}) and drops every parameter that originally had multiple candidate values. For strategies that require those parameters (e.g., external adapters/classes expecting constructor args from the grid), this turns a previously valid run into generate_signals failures or silently changes behavior on short datasets. Instead of erasing the space, the gate should collapse each tunable parameter to a deterministic single value (such as the first option) so required keys are still passed.
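Both comments point at the same remedy: instead of erasing the space, fold each tunable parameter down to one deterministic value so `evaluate({})` still receives every required key. A minimal sketch of that idea (the helper name and the `fixed_params` merge are assumptions, not the PR's code):

```python
def collapse_search_space(search_space: dict[str, list],
                          fixed_params: dict[str, object]) -> dict[str, list]:
    """Pin each tunable parameter to its first candidate value.

    Mutates fixed_params so required keys survive the skipped
    optimization, and returns an empty search space to signal that
    no parameter search should run.
    """
    for name, options in search_space.items():
        fixed_params.setdefault(name, options[0])
    return {}
```

Using `setdefault` keeps any value that was already fixed, and taking `options[0]` makes the fallback deterministic across runs.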



```python
        best_val = -np.inf
        best_params: dict[str, Any] | None = None
        best_stats: dict[str, Any] | None = None
```
4 changes: 4 additions & 0 deletions src/config.py
@@ -50,6 +50,8 @@ class Config:
```python
    engine: str = "pybroker"  # pybroker engine
    param_search: str = "grid"  # grid | optuna
    param_trials: int = 25
    param_dof_multiplier: int = 100
    param_min_bars: int = 2000
    max_workers: int = 1
    asset_workers: int = 1
    param_workers: int = 1
```
@@ -127,6 +129,8 @@ def load_config(path: str | Path) -> Config:
```python
        engine=str(raw.get("engine", "pybroker")).lower(),
        param_search=str(raw.get("param_search", raw.get("param_optimizer", "grid"))).lower(),
        param_trials=int(raw.get("param_trials", raw.get("opt_trials", 25))),
        param_dof_multiplier=int(raw.get("param_dof_multiplier", 100)),
        param_min_bars=int(raw.get("param_min_bars", 2000)),
        max_workers=int(raw.get("max_workers", raw.get("asset_workers", 1))),
        asset_workers=int(raw.get("asset_workers", raw.get("max_workers", 1))),
        param_workers=int(raw.get("param_workers", 1)),
```
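The loader honors legacy keys by nesting `raw.get` calls, and the two new options follow the same shape with plain defaults. A self-contained illustration of the fallback chain (the dict literal is a made-up config, not from the PR):

```python
# Hypothetical raw config that only sets the legacy trial-count key.
raw = {"opt_trials": 40}

# New key wins when present; otherwise the legacy key; otherwise the default.
param_trials = int(raw.get("param_trials", raw.get("opt_trials", 25)))

# New options have no legacy alias, so they fall straight to the default.
param_dof_multiplier = int(raw.get("param_dof_multiplier", 100))
param_min_bars = int(raw.get("param_min_bars", 2000))
```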