Commit 8f7d89a

committed
formatting bullets
1 parent 2b10215 commit 8f7d89a

File tree

1 file changed: +3 −0 lines changed


paper.md (+3)
@@ -52,9 +52,11 @@ bibliography: paper.bib
 # Summary
 
 A Julia language [@julia] package providing a practical and modular implementation of "Calibrate, Emulate, Sample" [@Cleary:2021], hereafter CES, an accelerated workflow for obtaining model parametric uncertainty, is presented. This task is also known as Bayesian inversion or uncertainty quantification. To apply CES one requires a computer model (written in any programming language) that depends on free parameters, a prior distribution encoding prior knowledge about the free parameters, and some data with which to constrain the prior. The pipeline has three stages, most easily explained in reverse:
+
 1. the final stage, and the goal of the workflow, is to draw samples (Sample) from the Bayesian posterior distribution, that is, the prior distribution conditioned on the observed data;
 2. to accelerate and regularize this process, we train statistical emulators to represent the user-provided parameter-to-data map (Emulate);
 3. the training points for these emulators are generated by the computer model and selected adaptively around regions of high posterior mass (Calibrate).
+
 We describe CES as an accelerated workflow because it can often use dramatically fewer evaluations of the computer model than applying sampling algorithms, such as Markov chain Monte Carlo (MCMC), directly.
 
 * Calibration tools: We recommend choosing adaptive training points with Ensemble Kalman methods such as EKI [@Iglesias:2013] and its variants [@Huang:2022]; CES provides explicit utilities from the codebase EnsembleKalmanProcesses.jl [@Dunbar:2022a].
@@ -164,6 +166,7 @@ A histogram of the samples from the CES algorithm is displayed in \autoref{fig:G
 
 # Research projects using the package
 Some research projects that use this codebase, or modifications of it, are:
+
 * [@Dunbar:2021]
 * [@Bieli:2022]
 * [@Hillier:2022]
