Remove `Sampler` and `initialstep` #1037
base: breaking
Conversation
Note that, apart from being simpler code, `Distributions.Uniform` also doesn't allow the lower and upper bounds to be exactly equal (but we might like to keep that option open in DynamicPPL, e.g. if the user wants to initialise all values to the same value in linked space).
This should have been changed in #940, but slipped through as the file wasn't listed as one of the changed files.
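To make the degenerate-bounds point concrete, here is a small illustration (my own sketch, not code from this PR); `check_args=false` is the standard Distributions.jl escape hatch for skipping the validation:

```julia
using Distributions

Uniform(0.0, 1.0)    # fine: bounds are strictly ordered
# Uniform(1.0, 1.0)  # throws, since Uniform requires a < b

# The check can be skipped, but every draw then collapses to the same point,
# i.e. the "initialise all values to the same value" behaviour mentioned above.
d = Uniform(1.0, 1.0; check_args=false)
rand(d)              # always 1.0
```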
```julia
function AbstractMCMC.sample(
    rng::Random.AbstractRNG,
    model::Model,
    sampler::Sampler,
    N::Integer;
    chain_type=default_chain_type(sampler),
    resume_from=nothing,
    initial_state=loadstate(resume_from),
    kwargs...,
)
    return AbstractMCMC.mcmcsample(
        rng, model, sampler, N; chain_type, initial_state, kwargs...
    )
end
```
This method (or, to be precise, its default kwargs) will have to be shifted upwards to Turing. The same is true for the multiple-chain sampling method.
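For concreteness, a rough sketch of what "shifting upwards" could look like; this is an assumption about the eventual design, not code from the PR. Turing would own the method that supplies the default kwargs and simply forward to AbstractMCMC:

```julia
# Hypothetical Turing-side replacement for the method above. The sampler type,
# and where default_chain_type/loadstate would end up living, are assumptions.
function AbstractMCMC.sample(
    rng::Random.AbstractRNG,
    model::DynamicPPL.Model,
    sampler::Turing.Inference.InferenceAlgorithm,
    N::Integer;
    chain_type=default_chain_type(sampler),
    resume_from=nothing,
    initial_state=loadstate(resume_from),
    kwargs...,
)
    return AbstractMCMC.mcmcsample(
        rng, model, sampler, N; chain_type, initial_state, kwargs...
    )
end
```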
```julia
function AbstractMCMC.step(
    rng::Random.AbstractRNG,
    model::Model,
    spl::Sampler;
    initial_params::AbstractInitStrategy=init_strategy(spl),
    kwargs...,
)
```
This one requires some subtlety. It ensures that for all `Sampler`s the first step goes via `initialstep`, because the rest of the code is shared. We can do the same using an internal `Turing.Inference.initialstep` for now; in the long term that function will also be removed if we use `LogDensityFunction` in `sample`.
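To illustrate the dispatch trick being preserved, here is a self-contained toy version with hypothetical types, not DynamicPPL's actual internals. AbstractMCMC calls the state-less `step` method for the first iteration, so a single method on the abstract sampler type can funnel every sampler's first step through a shared `initialstep`:

```julia
using Random, AbstractMCMC

abstract type ToySampler <: AbstractMCMC.AbstractSampler end
struct ToyMH <: ToySampler end
struct ToyModel <: AbstractMCMC.AbstractModel end

# Shared first-step method: for *all* ToySamplers, the first step goes via
# initialstep, so the common set-up logic lives in one place.
function AbstractMCMC.step(rng::Random.AbstractRNG, model::ToyModel, spl::ToySampler; kwargs...)
    return initialstep(rng, model, spl; kwargs...)
end

# Each concrete sampler implements only initialstep and the stateful step.
initialstep(rng, model, spl::ToyMH; kwargs...) = (:initial_transition, :initial_state)
function AbstractMCMC.step(rng::Random.AbstractRNG, model::ToyModel, spl::ToyMH, state; kwargs...)
    return (:next_transition, state)
end
```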
Benchmark Report for Commit 35d0be2
Force-pushed from 8e3dcac to 0b87d0d, then from 0b87d0d to 992569f.
As title. The challenge is not here, of course; it's making sure that Turing still works correctly. I'm about 99% sure that this can be done fairly straightforwardly, because I've actually done all the work before; see TuringLang/Turing.jl#2588.
Because I branched off the InitContext branches, this will have to wait until those PRs are merged.