
Bug report: Calibrate fails when there are no latent variables in the model #374

Closed
djinnome opened this issue Sep 14, 2023 · 6 comments

@djinnome
Contributor

If I have an SIR model where both beta and gamma are included in deterministic_learnable_parameters, then calibrate fails with this error:

RuntimeError: AutoLowRankMultivariateNormal found no latent variables; Use an empty guide instead

However, if either beta or gamma (or both) is not included in deterministic_learnable_parameters, then calibrate works.
I will write a branch containing a set of unit tests that confirm this problem on an SIR model; the PR will be ready for review once the tests no longer fail.

@akruel01
Contributor

akruel01 commented Sep 14, 2023

Minimal working example from notebook/july_evaluation/Scenario3/Scenario_3a.ipynb:

from pyciemss.utils import get_tspan
from mira.sources.askenet import model_from_json_file
from pyciemss.PetriNetODE.interfaces import (
    load_and_sample_petri_model,
    load_and_calibrate_and_sample_petri_model,
)

best_fit_model = model_from_json_file('ES3_detection_log10V.json')

# Pin the initial conditions and all parameters to point values.
best_fit_model.initials['E'].value = 11
S, E, I = best_fit_model.initials['S'].value, best_fit_model.initials['E'].value, best_fit_model.initials['I'].value
best_fit_model.parameters['lambda'].value = 9.66e-8 * (S + E + I)
best_fit_model.parameters['gamma'].value = 0.08
best_fit_model.parameters['beta'].value = 44852600
best_fit_model.parameters['delta'].value = 0.125
best_fit_model.parameters['k'].value = 0.333
best_fit_model.parameters['alpha'].value = 249
# Dropping the distribution leaves 'alpha' as a fixed constant.
best_fit_model.parameters['alpha'].distribution = None

# 227 evenly spaced timepoints covering days 0 through 226.
timepoints = list(get_tspan(0, 226, 227).detach().numpy())

# Sampling from the prior works fine...
prior_samples = load_and_sample_petri_model(
        best_fit_model, 1, timepoints=timepoints,
        visual_options={"title": "3_base", "keep": ["V_sol"]}, time_unit="days")

# ...but calibrating this configuration raises:
# RuntimeError: AutoLowRankMultivariateNormal found no latent variables; Use an empty guide instead
calibrated_samples = load_and_calibrate_and_sample_petri_model(
    best_fit_model,
    './data/processed_dataset_train.csv',
    5,
    timepoints=timepoints,
    num_iterations=2,
    visual_options={"title": "3_base", "keep": ["V_sol"]},
    time_unit="days")

@SamWitty
Contributor

While I don't think this is a bug per se, we could certainly provide a much more informative error message when a user attempts to calibrate a model with no distributions on its parameters.

To clarify, here are the ways we interpret various configurations of distributions on model parameters and arguments to the calibrate methods.

  1. The parameter does not have a distribution -> The parameter will not be updated during calibration.
  2. The parameter has a distribution and is not included in the deterministic_learnable_parameters -> The calibration will return a distribution over that parameter.
  3. The parameter has a distribution and is included in the deterministic_learnable_parameters -> The calibration will return a point estimate of that parameter.

Therefore, it shouldn't be surprising that calibration throws an error when no parameters have a distribution associated with them. Again, we can make that error message more informative.
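
A minimal sketch of those three cases against the MWE above, assuming the deterministic_learnable_parameters keyword on load_and_calibrate_and_sample_petri_model (the calibrate-method argument discussed in this thread; exact signature may differ by version) and assuming beta and gamma carry distributions in the ES3 model:

# Case 1: 'alpha' has no distribution -> fixed, never updated by calibrate.
best_fit_model.parameters['alpha'].distribution = None

# Case 2: 'gamma' keeps its distribution and is NOT listed below
#         -> calibrate returns a distribution over 'gamma'.
# Case 3: 'beta' keeps its distribution and IS listed below
#         -> calibrate returns a point estimate of 'beta'.
calibrated_samples = load_and_calibrate_and_sample_petri_model(
    best_fit_model,
    './data/processed_dataset_train.csv',
    5,
    timepoints=timepoints,
    num_iterations=2,
    deterministic_learnable_parameters=['beta'],
)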

@djinnome
Contributor Author

Hi @SamWitty

Thanks for clarifying; what you say makes sense. I added a few more tests to reveal some possibly unexpected behaviors (numbering continues your list above):

  4. All parameters have a distribution and are included in deterministic_learnable_parameters -> The calibration fails with a RuntimeError.
  5. One parameter has a distribution; the other has no distribution but is included in deterministic_learnable_parameters -> The calibration runs without error.
  6. One parameter has a distribution and is included in deterministic_learnable_parameters; the other has no distribution -> The calibration fails with a RuntimeError.
  7. Neither parameter has a distribution, and both are included in deterministic_learnable_parameters -> The calibration fails with a RuntimeError.

Is this what you expect? And if so, how should we document this behavior?
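
For reference, a sketch of the kind of parametrized unit test these cases suggest, assuming a hypothetical sir_model pytest fixture whose beta and gamma parameters carry distributions, plus a small training CSV at a placeholder path; the actual tests live on the branch mentioned above:

import pytest
from pyciemss.PetriNetODE.interfaces import load_and_calibrate_and_sample_petri_model

@pytest.mark.parametrize(
    "with_distribution, deterministic, should_fail",
    [
        (["beta", "gamma"], ["beta", "gamma"], True),   # case 4
        (["beta"], ["gamma"], False),                   # case 5
        (["beta"], ["beta"], True),                     # case 6
        ([], ["beta", "gamma"], True),                  # case 7
    ],
)
def test_calibrate_latent_variable_rule(sir_model, with_distribution, deterministic, should_fail):
    # Strip the distribution from every parameter not meant to keep one.
    for name, parameter in sir_model.parameters.items():
        if name not in with_distribution:
            parameter.distribution = None

    def calibrate():
        return load_and_calibrate_and_sample_petri_model(
            sir_model,
            "tests/data/sir_train.csv",
            5,
            timepoints=[0.0, 1.0, 2.0],
            num_iterations=2,
            deterministic_learnable_parameters=deterministic,
        )

    if should_fail:
        with pytest.raises(RuntimeError):
            calibrate()
    else:
        calibrate()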

@djinnome
Contributor Author

djinnome commented Sep 19, 2023

The rule seems to be "There must be at least one parameter with a distribution that is not included in deterministic_learnable_parameters."
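
A hypothetical pre-flight check that encodes that rule, reusing the MWE's model (has_latent_parameter is an illustrative helper, not part of the pyciemss API):

def has_latent_parameter(model, deterministic_learnable_parameters):
    # A parameter is latent iff it has a distribution and is not forced
    # to a point estimate via deterministic_learnable_parameters.
    return any(
        parameter.distribution is not None
        and name not in deterministic_learnable_parameters
        for name, parameter in model.parameters.items()
    )

# Guard before calibrating, to fail fast with a readable message:
if not has_latent_parameter(best_fit_model, ['beta']):
    raise ValueError(
        "calibrate needs at least one parameter with a distribution "
        "that is not in deterministic_learnable_parameters"
    )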

@SamWitty
Contributor

@djinnome, could you revisit this to see if the behavior persists with the new refactor?

4 -> This is a bug if it still holds, but I believe I have tests that address this.
5 -> This is expected, but would probably be worth triggering a warning to the user.
6 -> This is a bug if it still holds. I don't think this is currently tested.
7 -> This is expected behavior. We can make the error message more informative.

@SamWitty SamWitty added the bug Something isn't working label Dec 19, 2023
@SamWitty SamWitty changed the title Calibrate fails when there are no latent variables in the model Bug report: Calibrate fails when there are no latent variables in the model Dec 19, 2023
@djinnome djinnome self-assigned this Dec 19, 2023
@SamWitty
Contributor

Closing this issue, as having all latent variables in the deterministic_learnable_parameters is covered in the tests here: https://github.com/ciemss/pyciemss/blob/main/tests/test_interfaces.py#L241
