Pertinent to the above, and to @seabbs' comment on using an upjitter: using the softplus transform instead of an upjitter (or a clamp to implement ReLU) stabilises the sampling. The same goes for using `LogExpFunctions.xexpy` to implement the effect of the AR process on observations in the stochastic model.
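For concreteness, a minimal sketch of the two tricks (the helper names `softplus` and `apply_log_ar_effect` are illustrative, not part of EpiAware):

```julia
using LogExpFunctions

# Smooth rectifier: softplus(x) = log(1 + exp(x)), computed stably by log1pexp.
# Unlike an upjitter or clamp (a hard ReLU), it is differentiable everywhere,
# which is friendlier to gradient-based samplers like NUTS.
softplus(x) = log1pexp(x)

# Effect of a log-scale AR value z on a baseline observation rate y0:
# y = y0 * exp(z), computed stably by xexpy (which also handles y0 == 0).
apply_log_ar_effect(y0, z) = xexpy(y0, z)

softplus(-20.0)                # ≈ 2.1e-9: strictly positive, no hard cutoff
apply_log_ar_effect(5.0, 0.1)  # 5 * exp(0.1) ≈ 5.53
```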
After these changes, an experiment that does not use initial values to start NUTS still converges onto the correct posterior distributions, so at least in this context this is a big improvement in the stability of the model. I know @damonbayer has wrangled with this in Bayesian inference of ODEs, so I wonder what your thoughts are on using special functions to stabilise sampling?
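A minimal version of that experiment, with a toy Turing model standing in for the real one (the model and starting values are placeholders; the keyword is `initial_params` in recent Turing releases, `init_params` in older ones):

```julia
using Turing

@model function demo(y)
    μ ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1); lower = 0)
    for i in eachindex(y)
        y[i] ~ Normal(μ, σ)
    end
end

m = demo(randn(50))
chain_default = sample(m, NUTS(), 1_000)  # NUTS picks its own starting point
chain_manual  = sample(m, NUTS(), 1_000; initial_params = [0.0, 1.0])  # order: μ, σ
```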
This finding really makes me wonder if we should have some kind of (preferably smooth) rectifier transformation as a default in a large chunk of the modelling. Maybe via `TransformLatentModel` etc.? (A sketch follows below.)
Originally posted by @SamuelBrand1 in #464 (comment)
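One possible shape for that default, as a minimal sketch: the struct name follows the suggestion above, but the fields and the `generate_latent(model, n)` interface are assumptions for illustration (the real latent models are Turing submodels, which this ignores):

```julia
using LogExpFunctions

# Hypothetical wrapper: apply an elementwise rectifier to any latent model's output.
struct TransformLatentModel{M, F}
    latent_model::M   # e.g. an AR process model
    transform::F      # e.g. log1pexp (softplus) for a smooth rectifier
end

# Generate from the wrapped model, then pass draws through the rectifier so
# downstream quantities stay positive without a hard clamp.
function generate_latent(tlm::TransformLatentModel, n)
    Z = generate_latent(tlm.latent_model, n)  # assumed method on the wrapped model
    return tlm.transform.(Z)
end

# Usage: rectified_ar = TransformLatentModel(ar_model, log1pexp)
```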
Linked issues:

- `Transform` struct using softplus etc for chatzilena et al replication. #471 (`Transform` struct as discussed)
- Check, and if possible apply, the `Transform` struct using softplus etc (#471 (comment)) to `EpiAwareBase.generate_latent_infs(epi_model::ExpGrowthRate, rt)` #469