| def poisson_gamma_posterior_predictive(
- gamma: Gamma, n: NUMERIC = 1
-) -> NegativeBinomial:
- """Posterior predictive distribution for a poisson likelihood with a gamma prior
-
- Args:
- gamma: Gamma distribution
- n: Number of trials for each sample, defaults to 1.
- Can be used to scale the distributions to a different unit of time.
-
- Returns:
- NegativeBinomial distribution related to posterior predictive
-
- """
- n = n * gamma.alpha
- p = gamma.beta / (1 + gamma.beta)
-
- return NegativeBinomial(n=n, p=p)
+ | def poisson_gamma_posterior_predictive(
+ gamma: Gamma, n: NUMERIC = 1
+) -> NegativeBinomial:
+ """Posterior predictive distribution for a poisson likelihood with a gamma prior
+
+ Args:
+ gamma: Gamma distribution
+ n: Number of trials for each sample, defaults to 1.
+ Can be used to scale the distribution to a different unit of time.
+
+ Returns:
+ NegativeBinomial posterior predictive distribution
+
+ """
+ n = n * gamma.alpha
+ p = gamma.beta / (1 + gamma.beta)
+
+ return NegativeBinomial(n=n, p=p)
|
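The arithmetic in the hunk above can be checked by hand: under scipy's negative binomial convention (mean `n * (1 - p) / p`), mapping a Gamma posterior to `n = n * alpha` and `p = beta / (1 + beta)` gives a predictive mean of `n * alpha / beta`, the scaled Gamma mean. A minimal sketch — the standalone helper name `poisson_gamma_posterior_predictive_params` is an illustration, not part of the package:

```python
# Standalone sketch of the Poisson-likelihood / Gamma-prior posterior
# predictive mapping shown in the diff above. The helper name is
# hypothetical; it only mirrors the two lines of arithmetic.
def poisson_gamma_posterior_predictive_params(alpha, beta, n=1.0):
    """Map Gamma(alpha, beta) to NegativeBinomial(n, p) parameters."""
    return n * alpha, beta / (1.0 + beta)

# A Gamma(alpha=10, beta=5) posterior has mean rate alpha / beta = 2
n_nb, p_nb = poisson_gamma_posterior_predictive_params(alpha=10, beta=5)

# scipy parameterization: the NegativeBinomial mean is n * (1 - p) / p,
# which recovers the Gamma mean (scaled by n time units)
predictive_mean = n_nb * (1 - p_nb) / p_nb  # approximately 2.0
```

Passing `n=7` instead scales the predictive to a seven-unit window, matching the docstring's note about changing the unit of time.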
diff --git a/objects.inv b/objects.inv
index df58faf..beff35f 100644
Binary files a/objects.inv and b/objects.inv differ
diff --git a/search/search_index.json b/search/search_index.json
index 0d1db0a..a5901cf 100644
--- a/search/search_index.json
+++ b/search/search_index.json
@@ -1 +1 @@
-{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Conjugate Models","text":"Bayesian conjugate models in Python "},{"location":"#installation","title":"Installation","text":"pip install conjugate-models\n "},{"location":"#features","title":"Features","text":" - Connection to Scipy Distributions with
dist attribute - Built in Plotting with
plot_pdf and plot_pmf methods - Vectorized Operations for parameters and data
- Indexing Parameters for subsetting and slicing
- Generalized Numerical Inputs for inputs other than builtins and numpy arrays
- Unsupported Distributions for sampling from unsupported distributions
"},{"location":"#supported-models","title":"Supported Models","text":"Many likelihoods are supported including Bernoulli / Binomial Categorical / Multinomial Poisson Normal (including linear regression) - and many more
"},{"location":"#basic-usage","title":"Basic Usage","text":" - Define prior distribution from
distributions module - Pass data and prior into model from
models modules - Analytics with posterior and posterior predictive distributions
from conjugate.distributions import Beta, BetaBinomial\nfrom conjugate.models import binomial_beta, binomial_beta_posterior_predictive\n\n# Observed Data\nX = 4\nN = 10\n\n# Analytics\nprior = Beta(1, 1)\nprior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=prior)\n\nposterior: Beta = binomial_beta(n=N, x=X, beta_prior=prior)\nposterior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=posterior) \n\n# Figure\nimport matplotlib.pyplot as plt\n\nfig, axes = plt.subplots(ncols=2)\n\nax = axes[0]\nax = posterior.plot_pdf(ax=ax, label=\"posterior\")\nprior.plot_pdf(ax=ax, label=\"prior\")\nax.axvline(x=X/N, color=\"black\", ymax=0.05, label=\"MLE\")\nax.set_title(\"Success Rate\")\nax.legend()\n\nax = axes[1]\nposterior_predictive.plot_pmf(ax=ax, label=\"posterior predictive\")\nprior_predictive.plot_pmf(ax=ax, label=\"prior predictive\")\nax.axvline(x=X, color=\"black\", ymax=0.05, label=\"Sample\")\nax.set_title(\"Number of Successes\")\nax.legend()\nplt.show()\n "},{"location":"#too-simple","title":"Too Simple?","text":"Simple model, sure. Useful model, potentially. Constant probability of success, p , for n trials. 
rng = np.random.default_rng(42)\n\n# Observed Data\nn_times = 75\np = np.repeat(0.5, n_times)\nsamples = rng.binomial(n=1, p=p, size=n_times)\n\n# Model\nn = np.arange(n_times) + 1\nprior = Beta(alpha=1, beta=1)\nposterior = binomial_beta(n=n, x=samples.cumsum(), beta_prior=prior)\n\n# Figure\nplt.plot(n, p, color=\"black\", label=\"true p\", linestyle=\"--\")\nplt.scatter(n, samples, color=\"black\", label=\"observed samples\")\nplt.plot(n, posterior.dist.mean(), color=\"red\", label=\"posterior mean\")\n# fill between the 95% credible interval\nplt.fill_between(\n n, \n posterior.dist.ppf(0.025),\n posterior.dist.ppf(0.975),\n color=\"red\",\n alpha=0.2,\n label=\"95% credible interval\",\n)\npadding = 0.025\nplt.ylim(0 - padding, 1 + padding)\nplt.xlim(1, n_times)\nplt.legend(loc=\"best\")\nplt.xlabel(\"Number of trials\")\nplt.ylabel(\"Probability\")\nplt.show()\n Even with a moving probability, this simple-to-implement model can be useful. ...\n\ndef sigmoid(x):\n return 1 / (1 + np.exp(-x))\n\np_raw = rng.normal(loc=0, scale=0.2, size=n_times).cumsum()\np = sigmoid(p_raw)\n\n...\n "},{"location":"#resources","title":"Resources","text":""},{"location":"distributions/","title":"Distributions","text":"These are the supported distributions based on the conjugate models. Many have the dist attribute which is a scipy.stats distribution object. From there, you can use the methods from scipy.stats to get the pdf, cdf, etc. Distributions can be plotted using the plot_pmf or plot_pdf methods of the distribution. from conjugate.distributions import Beta \n\nbeta = Beta(1, 1)\nscipy_dist = beta.dist \n\nprint(scipy_dist.mean())\n# 0.5\nprint(scipy_dist.ppf([0.025, 0.975]))\n# [0.025 0.975]\n\nsamples = scipy_dist.rvs(100)\n\nbeta.plot_pdf(label=\"beta distribution\")\n Distributions like Poisson can be added with other Poissons or multiplied by numerical values in order to scale rate. 
For instance, daily_rate = 0.25\ndaily_pois = Poisson(lam=daily_rate)\n\ntwo_day_pois = daily_pois + daily_pois\nweekly_pois = 7 * daily_pois\n Below are the currently supported distributions "},{"location":"distributions/#conjugate.distributions.Beta","title":"Beta dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Beta distribution. Parameters: Name Type Description Default alpha NUMERIC shape parameter required beta NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass Beta(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Beta distribution.\n\n Args:\n alpha: shape parameter\n beta: shape parameter\n\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n def __post_init__(self) -> None:\n self.max_value = 1.0\n\n @classmethod\n def from_mean(cls, mean: float, alpha: float) -> \"Beta\":\n \"\"\"Alternative constructor from mean and alpha.\"\"\"\n beta = get_beta_param_from_mean_and_alpha(mean=mean, alpha=alpha)\n return cls(alpha=alpha, beta=beta)\n\n @classmethod\n def from_successes_and_failures(cls, successes: int, failures: int) -> \"Beta\":\n \"\"\"Alternative constructor based on hyperparameter interpretation.\"\"\"\n alpha = successes + 1\n beta = failures + 1\n return cls(alpha=alpha, beta=beta)\n\n @property\n def dist(self):\n return stats.beta(self.alpha, self.beta)\n "},{"location":"distributions/#conjugate.distributions.Beta.from_mean","title":"from_mean(mean, alpha) classmethod ","text":"Alternative constructor from mean and alpha. 
Source code in conjugate/distributions.py @classmethod\ndef from_mean(cls, mean: float, alpha: float) -> \"Beta\":\n \"\"\"Alternative constructor from mean and alpha.\"\"\"\n beta = get_beta_param_from_mean_and_alpha(mean=mean, alpha=alpha)\n return cls(alpha=alpha, beta=beta)\n "},{"location":"distributions/#conjugate.distributions.Beta.from_successes_and_failures","title":"from_successes_and_failures(successes, failures) classmethod ","text":"Alternative constructor based on hyperparameter interpretation. Source code in conjugate/distributions.py @classmethod\ndef from_successes_and_failures(cls, successes: int, failures: int) -> \"Beta\":\n \"\"\"Alternative constructor based on hyperparameter interpretation.\"\"\"\n alpha = successes + 1\n beta = failures + 1\n return cls(alpha=alpha, beta=beta)\n "},{"location":"distributions/#conjugate.distributions.BetaBinomial","title":"BetaBinomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Beta binomial distribution. Parameters: Name Type Description Default n NUMERIC number of trials required alpha NUMERIC shape parameter required beta NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass BetaBinomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Beta binomial distribution.\n\n Args:\n n: number of trials\n alpha: shape parameter\n beta: shape parameter\n\n \"\"\"\n\n n: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n\n def __post_init__(self):\n if isinstance(self.n, np.ndarray):\n self.max_value = self.n.max()\n else:\n self.max_value = self.n\n\n @property\n def dist(self):\n return stats.betabinom(self.n, self.alpha, self.beta)\n "},{"location":"distributions/#conjugate.distributions.BetaNegativeBinomial","title":"BetaNegativeBinomial dataclass ","text":" Bases: SliceMixin Beta negative binomial distribution. 
Parameters: Name Type Description Default n NUMERIC number of successes required alpha NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass BetaNegativeBinomial(SliceMixin):\n \"\"\"Beta negative binomial distribution.\n\n Args:\n n: number of successes\n alpha: shape parameter\n\n \"\"\"\n\n n: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n "},{"location":"distributions/#conjugate.distributions.Binomial","title":"Binomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Binomial distribution. Parameters: Name Type Description Default n NUMERIC number of trials required p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass Binomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Binomial distribution.\n\n Args:\n n: number of trials\n p: probability of success\n\n \"\"\"\n\n n: NUMERIC\n p: NUMERIC\n\n def __post_init__(self):\n if isinstance(self.n, np.ndarray):\n self.max_value = self.n.max()\n else:\n self.max_value = self.n\n\n @property\n def dist(self):\n return stats.binom(n=self.n, p=self.p)\n "},{"location":"distributions/#conjugate.distributions.Dirichlet","title":"Dirichlet dataclass ","text":" Bases: DirichletPlotDistMixin Dirichlet distribution. Parameters: Name Type Description Default alpha NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass Dirichlet(DirichletPlotDistMixin):\n \"\"\"Dirichlet distribution.\n\n Args:\n alpha: shape parameter\n\n \"\"\"\n\n alpha: NUMERIC\n\n def __post_init__(self) -> None:\n self.max_value = 1.0\n\n @property\n def dist(self):\n if self.alpha.ndim == 1:\n return stats.dirichlet(self.alpha)\n\n return VectorizedDist(self.alpha, dist=stats.dirichlet)\n "},{"location":"distributions/#conjugate.distributions.Exponential","title":"Exponential dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Exponential distribution. 
Parameters: Name Type Description Default lam NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Exponential(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Exponential distribution.\n\n Args:\n lam: rate parameter\n\n \"\"\"\n\n lam: NUMERIC\n\n @property\n def dist(self):\n return stats.expon(scale=self.lam)\n\n def __mul__(self, other):\n return Gamma(alpha=other, beta=1 / self.lam)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Gamma","title":"Gamma dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Gamma distribution. Gamma Distribution Scipy Documentation Parameters: Name Type Description Default alpha NUMERIC shape parameter required beta NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Gamma(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Gamma distribution.\n\n <a href=https://en.wikipedia.org/wiki/Gamma_distribution>Gamma Distribution</a>\n <a href=https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.gamma.html>Scipy Documentation</a>\n\n Args:\n alpha: shape parameter\n beta: rate parameter\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n @property\n def dist(self):\n return stats.gamma(a=self.alpha, scale=1 / self.beta)\n\n def __mul__(self, other):\n return Gamma(alpha=self.alpha * other, beta=self.beta)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Geometric","title":"Geometric dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Geometric distribution. 
Parameters: Name Type Description Default p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass Geometric(DiscretePlotMixin, SliceMixin):\n \"\"\"Geometric distribution.\n\n Args:\n p: probability of success\n\n \"\"\"\n\n p: NUMERIC\n\n @property\n def dist(self):\n return stats.geom(self.p)\n "},{"location":"distributions/#conjugate.distributions.InverseGamma","title":"InverseGamma dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin InverseGamma distribution. Parameters: Name Type Description Default alpha NUMERIC shape required beta NUMERIC scale required Source code in conjugate/distributions.py @dataclass\nclass InverseGamma(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"InverseGamma distribution.\n\n Args:\n alpha: shape\n beta: scale\n\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n @property\n def dist(self):\n return stats.invgamma(a=self.alpha, scale=self.beta)\n "},{"location":"distributions/#conjugate.distributions.NegativeBinomial","title":"NegativeBinomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Negative binomial distribution. Parameters: Name Type Description Default n NUMERIC number of successes required p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass NegativeBinomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Negative binomial distribution.\n\n Args:\n n: number of successes\n p: probability of success\n\n \"\"\"\n\n n: NUMERIC\n p: NUMERIC\n\n @property\n def dist(self):\n return stats.nbinom(n=self.n, p=self.p)\n\n def __mul__(self, other):\n return NegativeBinomial(n=self.n * other, p=self.p)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Normal","title":"Normal dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Normal distribution. 
Parameters: Name Type Description Default mu NUMERIC mean required sigma NUMERIC standard deviation required Source code in conjugate/distributions.py @dataclass\nclass Normal(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Normal distribution.\n\n Args:\n mu: mean\n sigma: standard deviation\n\n \"\"\"\n\n mu: NUMERIC\n sigma: NUMERIC\n\n @property\n def dist(self):\n return stats.norm(self.mu, self.sigma)\n\n def __mul__(self, other):\n sigma = ((self.sigma**2) * other) ** 0.5\n return Normal(mu=self.mu * other, sigma=sigma)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma","title":"NormalInverseGamma dataclass ","text":"Normal inverse gamma distribution. Parameters: Name Type Description Default mu NUMERIC mean required delta_inverse NUMERIC covariance matrix required alpha NUMERIC shape required beta NUMERIC scale required Source code in conjugate/distributions.py @dataclass\nclass NormalInverseGamma:\n \"\"\"Normal inverse gamma distribution.\n\n Args:\n mu: mean\n delta_inverse: covariance matrix\n alpha: shape\n beta: scale\n\n \"\"\"\n\n mu: NUMERIC\n delta_inverse: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n\n @classmethod\n def from_inverse_gamma(\n cls, mu: NUMERIC, delta_inverse: NUMERIC, inverse_gamma: InverseGamma\n ) -> \"NormalInverseGamma\":\n return cls(\n mu=mu,\n delta_inverse=delta_inverse,\n alpha=inverse_gamma.alpha,\n beta=inverse_gamma.beta,\n )\n\n @property\n def inverse_gamma(self) -> InverseGamma:\n return InverseGamma(alpha=self.alpha, beta=self.beta)\n\n def sample_variance(self, size: int, random_state=None) -> NUMERIC:\n \"\"\"Sample variance from the inverse gamma distribution.\n\n Args:\n size: number of samples\n random_state: random state\n\n Returns:\n samples from the inverse gamma distribution\n\n \"\"\"\n return self.inverse_gamma.dist.rvs(size=size, random_state=random_state)\n\n def sample_beta(\n self, size: int, return_variance: bool = False, random_state=None\n ) -> 
Union[NUMERIC, Tuple[NUMERIC, NUMERIC]]:\n \"\"\"Sample beta from the normal distribution.\n\n Args:\n size: number of samples\n return_variance: whether to return variance as well\n random_state: random state\n\n Returns:\n samples from the normal distribution and optionally variance\n\n \"\"\"\n variance = self.sample_variance(size=size, random_state=random_state)\n\n beta = np.stack(\n [\n stats.multivariate_normal(self.mu, v * self.delta_inverse).rvs(\n size=1, random_state=random_state\n )\n for v in variance\n ]\n )\n\n if return_variance:\n return beta, variance\n\n return beta\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma.sample_beta","title":"sample_beta(size, return_variance=False, random_state=None) ","text":"Sample beta from the normal distribution. Parameters: Name Type Description Default size int number of samples required return_variance bool whether to return variance as well False random_state random state None Returns: Type Description Union[NUMERIC, Tuple[NUMERIC, NUMERIC]] samples from the normal distribution and optionally variance Source code in conjugate/distributions.py def sample_beta(\n self, size: int, return_variance: bool = False, random_state=None\n) -> Union[NUMERIC, Tuple[NUMERIC, NUMERIC]]:\n \"\"\"Sample beta from the normal distribution.\n\n Args:\n size: number of samples\n return_variance: whether to return variance as well\n random_state: random state\n\n Returns:\n samples from the normal distribution and optionally variance\n\n \"\"\"\n variance = self.sample_variance(size=size, random_state=random_state)\n\n beta = np.stack(\n [\n stats.multivariate_normal(self.mu, v * self.delta_inverse).rvs(\n size=1, random_state=random_state\n )\n for v in variance\n ]\n )\n\n if return_variance:\n return beta, variance\n\n return beta\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma.sample_variance","title":"sample_variance(size, random_state=None) ","text":"Sample variance from 
the inverse gamma distribution. Parameters: Name Type Description Default size int number of samples required random_state random state None Returns: Type Description NUMERIC samples from the inverse gamma distribution Source code in conjugate/distributions.py def sample_variance(self, size: int, random_state=None) -> NUMERIC:\n \"\"\"Sample variance from the inverse gamma distribution.\n\n Args:\n size: number of samples\n random_state: random state\n\n Returns:\n samples from the inverse gamma distribution\n\n \"\"\"\n return self.inverse_gamma.dist.rvs(size=size, random_state=random_state)\n "},{"location":"distributions/#conjugate.distributions.Poisson","title":"Poisson dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Poisson distribution. Parameters: Name Type Description Default lam NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Poisson(DiscretePlotMixin, SliceMixin):\n \"\"\"Poisson distribution.\n\n Args:\n lam: rate parameter\n\n \"\"\"\n\n lam: NUMERIC\n\n @property\n def dist(self):\n return stats.poisson(self.lam)\n\n def __mul__(self, other) -> \"Poisson\":\n return Poisson(lam=self.lam * other)\n\n __rmul__ = __mul__\n\n def __add__(self, other) -> \"Poisson\":\n return Poisson(self.lam + other.lam)\n\n __radd__ = __add__\n "},{"location":"distributions/#conjugate.distributions.StudentT","title":"StudentT dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin StudentT distribution. 
Parameters: Name Type Description Default mu NUMERIC mean required sigma NUMERIC standard deviation required nu NUMERIC degrees of freedom required Source code in conjugate/distributions.py @dataclass\nclass StudentT(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"StudentT distribution.\n\n Args:\n mu: mean\n sigma: standard deviation\n nu: degrees of freedom\n\n \"\"\"\n\n mu: NUMERIC\n sigma: NUMERIC\n nu: NUMERIC\n\n @property\n def dist(self):\n return stats.t(self.nu, self.mu, self.sigma)\n "},{"location":"distributions/#conjugate.distributions.Uniform","title":"Uniform dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Uniform distribution. Parameters: Name Type Description Default low NUMERIC lower bound required high NUMERIC upper bound required Source code in conjugate/distributions.py @dataclass\nclass Uniform(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Uniform distribution.\n\n Args:\n low: lower bound\n high: upper bound\n\n \"\"\"\n\n low: NUMERIC\n high: NUMERIC\n\n def __post_init__(self):\n self.min_value = self.low\n self.max_value = self.high\n\n @property\n def dist(self):\n return stats.uniform(self.low, self.high)\n "},{"location":"mixins/","title":"Mixins","text":"Two sets of mixins to support the plotting and slicing of the distribution parameters "},{"location":"mixins/#conjugate.plot.ContinuousPlotDistMixin","title":"ContinuousPlotDistMixin ","text":" Bases: PlotDistMixin Functionality for plot_pdf method of continuous distributions. 
Source code in conjugate/plot.py class ContinuousPlotDistMixin(PlotDistMixin):\n \"\"\"Functionality for plot_pdf method of continuous distributions.\"\"\"\n\n def plot_pdf(self, ax: Optional[plt.Axes] = None, **kwargs) -> plt.Axes:\n \"\"\"Plot the pdf of the distribution\n\n Args:\n ax: matplotlib Axes, optional\n **kwargs: Additional kwargs to pass to matplotlib\n\n Returns:\n new or modified Axes\n\n \"\"\"\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, **kwargs)\n\n def _create_x_values(self) -> np.ndarray:\n return np.linspace(self.min_value, self.max_value, 100)\n\n def _setup_labels(self, ax) -> None:\n ax.set_xlabel(\"Domain\")\n ax.set_ylabel(\"Density $f(x)$\")\n\n def _create_plot_on_axis(self, x, ax, **kwargs) -> plt.Axes:\n yy = self.dist.pdf(x)\n if \"label\" in kwargs:\n label = kwargs.pop(\"label\")\n label = resolve_label(label, yy)\n else:\n label = None\n\n ax.plot(x, yy, label=label, **kwargs)\n self._setup_labels(ax=ax)\n ax.set_ylim(0, None)\n return ax\n "},{"location":"mixins/#conjugate.plot.ContinuousPlotDistMixin.plot_pdf","title":"plot_pdf(ax=None, **kwargs) ","text":"Plot the pdf of the distribution Parameters: Name Type Description Default ax Optional[Axes] matplotlib Axes, optional None **kwargs Additional kwargs to pass to matplotlib {} Returns: Type Description Axes new or modified Axes Source code in conjugate/plot.py def plot_pdf(self, ax: Optional[plt.Axes] = None, **kwargs) -> plt.Axes:\n \"\"\"Plot the pdf of the distribution\n\n Args:\n ax: matplotlib Axes, optional\n **kwargs: Additional kwargs to pass to matplotlib\n\n Returns:\n new or modified Axes\n\n \"\"\"\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, **kwargs)\n "},{"location":"mixins/#conjugate.plot.DirichletPlotDistMixin","title":"DirichletPlotDistMixin ","text":" Bases: ContinuousPlotDistMixin 
Plot the pdf using samples from the dirichlet distribution. Source code in conjugate/plot.py class DirichletPlotDistMixin(ContinuousPlotDistMixin):\n \"\"\"Plot the pdf using samples from the dirichlet distribution.\"\"\"\n\n def plot_pdf(\n self, ax: Optional[plt.Axes] = None, samples: int = 1_000, **kwargs\n ) -> plt.Axes:\n \"\"\"Plots the pdf\"\"\"\n distribution_samples = self.dist.rvs(size=samples)\n\n ax = self._settle_axis(ax=ax)\n xx = self._create_x_values()\n\n for x in distribution_samples.T:\n kde = gaussian_kde(x)\n\n yy = kde(xx)\n\n ax.plot(xx, yy, **kwargs)\n\n self._setup_labels(ax=ax)\n return ax\n "},{"location":"mixins/#conjugate.plot.DirichletPlotDistMixin.plot_pdf","title":"plot_pdf(ax=None, samples=1000, **kwargs) ","text":"Plots the pdf Source code in conjugate/plot.py def plot_pdf(\n self, ax: Optional[plt.Axes] = None, samples: int = 1_000, **kwargs\n) -> plt.Axes:\n \"\"\"Plots the pdf\"\"\"\n distribution_samples = self.dist.rvs(size=samples)\n\n ax = self._settle_axis(ax=ax)\n xx = self._create_x_values()\n\n for x in distribution_samples.T:\n kde = gaussian_kde(x)\n\n yy = kde(xx)\n\n ax.plot(xx, yy, **kwargs)\n\n self._setup_labels(ax=ax)\n return ax\n "},{"location":"mixins/#conjugate.plot.DiscretePlotMixin","title":"DiscretePlotMixin ","text":" Bases: PlotDistMixin Adding the plot_pmf method to class. 
Source code in conjugate/plot.py class DiscretePlotMixin(PlotDistMixin):\n \"\"\"Adding the plot_pmf method to class.\"\"\"\n\n def plot_pmf(\n self, ax: Optional[plt.Axes] = None, mark: str = \"o-\", **kwargs\n ) -> plt.Axes:\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, mark, **kwargs)\n\n def _create_x_values(self) -> np.ndarray:\n return np.arange(self.min_value, self.max_value + 1, 1)\n\n def _create_plot_on_axis(\n self, x, ax, mark, conditional: bool = False, **kwargs\n ) -> plt.Axes:\n yy = self.dist.pmf(x)\n if conditional:\n yy = yy / np.sum(yy)\n ylabel = f\"Conditional Probability $f(x|{self.min_value} \\\\leq x \\\\leq {self.max_value})$\"\n else:\n ylabel = \"Probability $f(x)$\"\n\n if \"label\" in kwargs:\n label = kwargs.pop(\"label\")\n label = resolve_label(label, yy)\n else:\n label = None\n\n ax.plot(x, yy, mark, label=label, **kwargs)\n\n if self.max_value - self.min_value < 15:\n ax.set_xticks(x.ravel())\n else:\n ax.set_xticks(x.ravel(), minor=True)\n ax.set_xticks(x[::5].ravel())\n\n ax.set_xlabel(\"Domain\")\n ax.set_ylabel(ylabel)\n ax.set_ylim(0, None)\n return ax\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin","title":"PlotDistMixin ","text":"Base mixin in order to support plotting. Requires the dist attribute of the scipy distribution. Source code in conjugate/plot.py class PlotDistMixin:\n \"\"\"Base mixin in order to support plotting. 
Requires the dist attribute of the scipy distribution.\"\"\"\n\n @property\n def dist(self) -> Distribution:\n raise NotImplementedError(\"Implement this property in the subclass.\")\n\n @property\n def max_value(self) -> float:\n if not hasattr(self, \"_max_value\"):\n raise ValueError(\"Set the max value before plotting.\")\n\n return self._max_value\n\n @max_value.setter\n def max_value(self, value: float) -> None:\n self._max_value = value\n\n def set_max_value(self, value: float) -> \"PlotDistMixin\":\n self.max_value = value\n\n return self\n\n @property\n def min_value(self) -> float:\n if not hasattr(self, \"_min_value\"):\n self._min_value = 0.0\n\n return self._min_value\n\n @min_value.setter\n def min_value(self, value: float) -> None:\n self._min_value = value\n\n def set_min_value(self, value: float) -> \"PlotDistMixin\":\n \"\"\"Set the minimum value for plotting.\"\"\"\n self.min_value = value\n\n return self\n\n def set_bounds(self, lower: float, upper: float) -> \"PlotDistMixin\":\n \"\"\"Set both the min and max values for plotting.\"\"\"\n return self.set_min_value(lower).set_max_value(upper)\n\n def _reshape_x_values(self, x: np.ndarray) -> np.ndarray:\n \"\"\"Make sure that the values are ready for plotting.\"\"\"\n for value in asdict(self).values():\n if not isinstance(value, float):\n return x[:, None]\n\n return x\n\n def _settle_axis(self, ax: Optional[plt.Axes] = None) -> plt.Axes:\n return ax if ax is not None else plt.gca()\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin.set_bounds","title":"set_bounds(lower, upper) ","text":"Set both the min and max values for plotting. Source code in conjugate/plot.py def set_bounds(self, lower: float, upper: float) -> \"PlotDistMixin\":\n \"\"\"Set both the min and max values for plotting.\"\"\"\n return self.set_min_value(lower).set_max_value(upper)\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin.set_min_value","title":"set_min_value(value) ","text":"Set the minimum value for plotting. 
Source code in conjugate/plot.py def set_min_value(self, value: float) -> \"PlotDistMixin\":\n \"\"\"Set the minimum value for plotting.\"\"\"\n self.min_value = value\n\n return self\n "},{"location":"mixins/#conjugate.plot.resolve_label","title":"resolve_label(label, yy) ","text":"https://stackoverflow.com/questions/73662931/matplotlib-plot-a-numpy-array-as-many-lines-with-a-single-label Source code in conjugate/plot.py def resolve_label(label: LABEL_INPUT, yy: np.ndarray):\n \"\"\"\n\n https://stackoverflow.com/questions/73662931/matplotlib-plot-a-numpy-array-as-many-lines-with-a-single-label\n \"\"\"\n if yy.ndim == 1:\n return label\n\n ncols = yy.shape[1]\n if ncols != 1:\n if isinstance(label, str):\n return [f\"{label} {i}\" for i in range(1, ncols + 1)]\n\n if callable(label):\n return [label(i) for i in range(ncols)]\n\n if isinstance(label, Iterable):\n return label\n\n raise ValueError(\"Label must be a string, iterable, or callable.\")\n\n return label\n "},{"location":"mixins/#conjugate.slice.SliceMixin","title":"SliceMixin ","text":"Mixin in order to slice the parameters Source code in conjugate/slice.py class SliceMixin:\n \"\"\"Mixin in order to slice the parameters\"\"\"\n\n def __getitem__(self, key):\n params = asdict(self)\n\n def slice(value, key):\n try:\n return value[key]\n except Exception:\n return value\n\n new_params = {k: slice(value=v, key=key) for k, v in params.items()}\n\n return self.__class__(**new_params)\n "},{"location":"models/","title":"Models","text":"For more on these models, check out the Conjugate Prior Wikipedia Table Below are the supported models "},{"location":"models/#conjugate.models.binomial_beta","title":"binomial_beta(n, x, beta_prior) ","text":"Posterior distribution for a binomial likelihood with a beta prior. 
Parameters: Name Type Description Default n NUMERIC total number of trials required x NUMERIC successes from those trials required beta_prior Beta Beta distribution prior required Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def binomial_beta(n: NUMERIC, x: NUMERIC, beta_prior: Beta) -> Beta:\n \"\"\"Posterior distribution for a binomial likelihood with a beta prior.\n\n Args:\n n: total number of trials\n x: successes from those trials\n beta_prior: Beta distribution prior\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n alpha_post, beta_post = get_binomial_beta_posterior_params(\n beta_prior.alpha, beta_prior.beta, n, x\n )\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.binomial_beta_posterior_predictive","title":"binomial_beta_posterior_predictive(n, beta) ","text":"Posterior predictive distribution for a binomial likelihood with a beta prior. Parameters: Name Type Description Default n NUMERIC number of trials required beta Beta Beta distribution required Returns: Type Description BetaBinomial BetaBinomial posterior predictive distribution Source code in conjugate/models.py def binomial_beta_posterior_predictive(n: NUMERIC, beta: Beta) -> BetaBinomial:\n \"\"\"Posterior predictive distribution for a binomial likelihood with a beta prior.\n\n Args:\n n: number of trials\n beta: Beta distribution\n\n Returns:\n BetaBinomial posterior predictive distribution\n\n \"\"\"\n return BetaBinomial(n=n, alpha=beta.alpha, beta=beta.beta)\n "},{"location":"models/#conjugate.models.categorical_dirichlet","title":"categorical_dirichlet(x, dirichlet_prior) ","text":"Posterior distribution of Categorical model with Dirichlet prior. 
Source code in conjugate/models.py def categorical_dirichlet(x: NUMERIC, dirichlet_prior: Dirichlet) -> Dirichlet:\n \"\"\"Posterior distribution of Categorical model with Dirichlet prior.\"\"\"\n alpha_post = get_dirichlet_posterior_params(dirichlet_prior.alpha, x)\n\n return Dirichlet(alpha=alpha_post)\n "},{"location":"models/#conjugate.models.exponetial_gamma","title":"exponetial_gamma(x_total, n, gamma_prior) ","text":"Posterior distribution for an exponential likelihood with a gamma prior Source code in conjugate/models.py def exponetial_gamma(x_total: NUMERIC, n: NUMERIC, gamma_prior: Gamma) -> Gamma:\n \"\"\"Posterior distribution for an exponential likelihood with a gamma prior\"\"\"\n alpha_post, beta_post = get_exponential_gamma_posterior_params(\n alpha=gamma_prior.alpha, beta=gamma_prior.beta, x_total=x_total, n=n\n )\n\n return Gamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.geometric_beta","title":"geometric_beta(x_total, n, beta_prior, one_start=True) ","text":"Posterior distribution for a geometric likelihood with a beta prior. Parameters: Name Type Description Default x_total sum of all trial outcomes required n total number of trials required beta_prior Beta Beta distribution prior required one_start bool whether outcomes start at 1, defaults to True. False is 0 start. True Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def geometric_beta(x_total, n, beta_prior: Beta, one_start: bool = True) -> Beta:\n \"\"\"Posterior distribution for a geometric likelihood with a beta prior.\n\n Args:\n x_total: sum of all trial outcomes\n n: total number of trials\n beta_prior: Beta distribution prior\n one_start: whether outcomes start at 1, defaults to True. 
False is 0 start.\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n alpha_post = beta_prior.alpha + n\n beta_post = beta_prior.beta + x_total\n\n if one_start:\n beta_post = beta_post - n\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.linear_regression","title":"linear_regression(X, y, normal_inverse_gamma_prior, inv=np.linalg.inv) ","text":"Posterior distribution for a linear regression model with a normal inverse gamma prior. Derivation taken from this blog here. Parameters: Name Type Description Default X NUMERIC design matrix required y NUMERIC response vector required normal_inverse_gamma_prior NormalInverseGamma NormalInverseGamma prior required inv function to invert matrix, defaults to np.linalg.inv inv Returns: Type Description NormalInverseGamma NormalInverseGamma posterior distribution Source code in conjugate/models.py def linear_regression(\n X: NUMERIC,\n y: NUMERIC,\n normal_inverse_gamma_prior: NormalInverseGamma,\n inv=np.linalg.inv,\n) -> NormalInverseGamma:\n \"\"\"Posterior distribution for a linear regression model with a normal inverse gamma prior.\n\n Derivation taken from this blog [here](https://gregorygundersen.com/blog/2020/02/04/bayesian-linear-regression/).\n\n Args:\n X: design matrix\n y: response vector\n normal_inverse_gamma_prior: NormalInverseGamma prior\n inv: function to invert matrix, defaults to np.linalg.inv\n\n Returns:\n NormalInverseGamma posterior distribution\n\n \"\"\"\n N = X.shape[0]\n\n delta = inv(normal_inverse_gamma_prior.delta_inverse)\n\n delta_post = (X.T @ X) + delta\n delta_post_inverse = inv(delta_post)\n\n mu_post = (\n # (B, B)\n delta_post_inverse\n # (B, 1)\n # (B, B) * (B, 1) + (B, N) * (N, 1)\n @ (delta @ normal_inverse_gamma_prior.mu + X.T @ y)\n )\n\n alpha_post = normal_inverse_gamma_prior.alpha + (0.5 * N)\n beta_post = normal_inverse_gamma_prior.beta + (\n 0.5\n * (\n (y.T @ y)\n # (1, B) * (B, B) * (B, 1)\n + (normal_inverse_gamma_prior.mu.T @ 
delta @ normal_inverse_gamma_prior.mu)\n # (1, B) * (B, B) * (B, 1)\n - (mu_post.T @ delta_post @ mu_post)\n )\n )\n\n return NormalInverseGamma(\n mu=mu_post, delta_inverse=delta_post_inverse, alpha=alpha_post, beta=beta_post\n )\n "},{"location":"models/#conjugate.models.linear_regression_posterior_predictive","title":"linear_regression_posterior_predictive(normal_inverse_gamma, X, eye=np.eye) ","text":"Posterior predictive distribution for a linear regression model with a normal inverse gamma prior. Source code in conjugate/models.py def linear_regression_posterior_predictive(\n normal_inverse_gamma: NormalInverseGamma, X: NUMERIC, eye=np.eye\n) -> MultivariateStudentT:\n \"\"\"Posterior predictive distribution for a linear regression model with a normal inverse gamma prior.\"\"\"\n mu = X @ normal_inverse_gamma.mu\n sigma = (normal_inverse_gamma.beta / normal_inverse_gamma.alpha) * (\n eye(X.shape[0]) + (X @ normal_inverse_gamma.delta_inverse @ X.T)\n )\n nu = 2 * normal_inverse_gamma.alpha\n\n return MultivariateStudentT(\n mu=mu,\n sigma=sigma,\n nu=nu,\n )\n "},{"location":"models/#conjugate.models.multinomial_dirichlet","title":"multinomial_dirichlet(x, dirichlet_prior) ","text":"Posterior distribution of Multinomial model with Dirichlet prior. 
Parameters: Name Type Description Default x NUMERIC counts required dirichlet_prior Dirichlet Dirichlet prior on the counts required Returns: Type Description Dirichlet Dirichlet posterior distribution Source code in conjugate/models.py def multinomial_dirichlet(x: NUMERIC, dirichlet_prior: Dirichlet) -> Dirichlet:\n \"\"\"Posterior distribution of Multinomial model with Dirichlet prior.\n\n Args:\n x: counts\n dirichlet_prior: Dirichlet prior on the counts\n\n Returns:\n Dirichlet posterior distribution\n\n \"\"\"\n alpha_post = get_dirichlet_posterior_params(dirichlet_prior.alpha, x)\n\n return Dirichlet(alpha=alpha_post)\n "},{"location":"models/#conjugate.models.negative_binomial_beta","title":"negative_binomial_beta(r, n, x, beta_prior) ","text":"Posterior distribution for a negative binomial likelihood with a beta prior. Args: Source code in conjugate/models.py def negative_binomial_beta(r, n, x, beta_prior: Beta) -> Beta:\n \"\"\"Posterior distribution for a negative binomial likelihood with a beta prior.\n\n Args:\n\n \"\"\"\n alpha_post = beta_prior.alpha + (r * n)\n beta_post = beta_prior.beta + x\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.negative_binomial_beta_posterior_predictive","title":"negative_binomial_beta_posterior_predictive(r, beta) ","text":"Posterior predictive distribution for a negative binomial likelihood with a beta prior Source code in conjugate/models.py def negative_binomial_beta_posterior_predictive(r, beta: Beta) -> BetaNegativeBinomial:\n \"\"\"Posterior predictive distribution for a negative binomial likelihood with a beta prior\"\"\"\n return BetaNegativeBinomial(r=r, alpha=beta.alpha, beta=beta.beta)\n "},{"location":"models/#conjugate.models.normal_known_mean","title":"normal_known_mean(x_total, x2_total, n, mu, inverse_gamma_prior) ","text":"Posterior distribution for a normal likelihood with a known mean and a variance prior. 
Parameters: Name Type Description Default x_total NUMERIC sum of all outcomes required x2_total NUMERIC sum of all outcomes squared required n NUMERIC total number of samples in x_total required mu NUMERIC known mean required inverse_gamma_prior InverseGamma InverseGamma prior for variance required Returns: Type Description InverseGamma InverseGamma posterior distribution for the variance Source code in conjugate/models.py def normal_known_mean(\n x_total: NUMERIC,\n x2_total: NUMERIC,\n n: NUMERIC,\n mu: NUMERIC,\n inverse_gamma_prior: InverseGamma,\n) -> InverseGamma:\n \"\"\"Posterior distribution for a normal likelihood with a known mean and a variance prior.\n\n Args:\n x_total: sum of all outcomes\n x2_total: sum of all outcomes squared\n n: total number of samples in x_total\n mu: known mean\n inverse_gamma_prior: InverseGamma prior for variance\n\n Returns:\n InverseGamma posterior distribution for the variance\n\n \"\"\"\n alpha_post = inverse_gamma_prior.alpha + (n / 2)\n beta_post = inverse_gamma_prior.beta + (\n 0.5 * (x2_total - (2 * mu * x_total) + (n * (mu**2)))\n )\n\n return InverseGamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.normal_known_mean_posterior_predictive","title":"normal_known_mean_posterior_predictive(mu, inverse_gamma) ","text":"Posterior predictive distribution for a normal likelihood with a known mean and a variance prior. 
Parameters: Name Type Description Default mu NUMERIC known mean required inverse_gamma InverseGamma InverseGamma prior required Returns: Type Description StudentT StudentT posterior predictive distribution Source code in conjugate/models.py def normal_known_mean_posterior_predictive(\n mu: NUMERIC, inverse_gamma: InverseGamma\n) -> StudentT:\n \"\"\"Posterior predictive distribution for a normal likelihood with a known mean and a variance prior.\n\n Args:\n mu: known mean\n inverse_gamma: InverseGamma prior\n\n Returns:\n StudentT posterior predictive distribution\n\n \"\"\"\n return StudentT(\n n=2 * inverse_gamma.alpha,\n mu=mu,\n sigma=(inverse_gamma.beta / inverse_gamma.alpha) ** 0.5,\n )\n "},{"location":"models/#conjugate.models.poisson_gamma","title":"poisson_gamma(x_total, n, gamma_prior) ","text":"Posterior distribution for a poisson likelihood with a gamma prior Source code in conjugate/models.py def poisson_gamma(x_total: NUMERIC, n: NUMERIC, gamma_prior: Gamma) -> Gamma:\n \"\"\"Posterior distribution for a poisson likelihood with a gamma prior\"\"\"\n alpha_post, beta_post = get_poisson_gamma_posterior_params(\n alpha=gamma_prior.alpha, beta=gamma_prior.beta, x_total=x_total, n=n\n )\n\n return Gamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.poisson_gamma_posterior_predictive","title":"poisson_gamma_posterior_predictive(gamma, n=1) ","text":"Posterior predictive distribution for a poisson likelihood with a gamma prior Parameters: Name Type Description Default gamma Gamma Gamma distribution required n NUMERIC Number of trials for each sample, defaults to 1. Can be used to scale the distributions to a different unit of time. 
1 Returns: Type Description NegativeBinomial NegativeBinomial distribution related to posterior predictive Source code in conjugate/models.py def poisson_gamma_posterior_predictive(\n gamma: Gamma, n: NUMERIC = 1\n) -> NegativeBinomial:\n \"\"\"Posterior predictive distribution for a poisson likelihood with a gamma prior\n\n Args:\n gamma: Gamma distribution\n n: Number of trials for each sample, defaults to 1.\n Can be used to scale the distributions to a different unit of time.\n\n Returns:\n NegativeBinomial distribution related to posterior predictive\n\n \"\"\"\n n = n * gamma.alpha\n p = gamma.beta / (1 + gamma.beta)\n\n return NegativeBinomial(n=n, p=p)\n "},{"location":"examples/bayesian-update/","title":"Bayesian Update","text":"Easy to use Bayesian inference incrementally by making the posterior the prior for the next update. import numpy as np\nimport matplotlib.pyplot as plt\n\nfrom conjugate.distributions import NormalInverseGamma\nfrom conjugate.models import linear_regression\n\ndef create_sampler(mu, sigma, rng): \n \"\"\"Generate a sampler from a normal distribution with mean `mu` and standard deviation `sigma`.\"\"\"\n def sample(n: int): \n return rng.normal(loc=mu, scale=sigma, size=n)\n\n return sample\n\n\nmu = 5.0\nsigma = 2.5\nrng = np.random.default_rng(0)\nsample = create_sampler(mu=mu, sigma=sigma, rng=rng)\n\n\nprior = NormalInverseGamma(\n mu=np.array([0]), \n delta_inverse=np.array([[1]]), \n alpha=1, beta=1, \n)\n\n\ncumsum = 0\nbatch_sizes = [5, 10, 25]\nax = plt.gca()\nfor batch_size in batch_sizes:\n y = sample(n=batch_size)\n X = np.ones_like(y)[:, None]\n\n posterior = linear_regression(X, y, prior)\n beta_samples, variance_samples = posterior.sample_beta(size=1000, return_variance=True, random_state=rng)\n\n cumsum += batch_size\n label = f\"n={cumsum}\"\n ax.scatter(variance_samples ** 0.5, beta_samples, alpha=0.25, label=label)\n\n prior = posterior \n\nax.scatter(sigma, mu, color=\"black\", label=\"true\")\nax.set(\n 
xlabel=\"$\\sigma$\", \n ylabel=\"$\\mu$\", \n xlim=(0, None), \n ylim=(0, None), \n title=\"Updated posterior samples of $\\mu$ and $\\sigma$\"\n)\nax.legend()\n\nplt.show()\n "},{"location":"examples/binomial/","title":"Binomial Model","text":"from conjugate.distributions import Beta, Binomial, BetaBinomial\nfrom conjugate.models import binomial_beta, binomial_beta_posterior_predictive\n\nimport matplotlib.pyplot as plt\n\nN = 10\ntrue_dist = Binomial(n=N, p=0.5)\n\n# Observed Data\nX = true_dist.dist.rvs(size=1, random_state=42)\n\n# Conjugate prior\nprior = Beta(alpha=1, beta=1)\nposterior: Beta = binomial_beta(n=N, x=X, beta_prior=prior)\n\n# Comparison\nprior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=prior)\nposterior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=posterior)\n\n# Figure \nfig, axes = plt.subplots(ncols=2, nrows=1, figsize=(8, 4))\n\nax: plt.Axes = axes[0]\nposterior.plot_pdf(ax=ax, label=\"posterior\")\nprior.plot_pdf(ax=ax, label=\"prior\")\nax.axvline(x=X/N, color=\"black\", ymax=0.05, label=\"MLE\")\nax.axvline(x=true_dist.p, color=\"black\", ymax=0.05, linestyle=\"--\", label=\"True\")\nax.set_title(\"Success Rate\")\nax.legend()\n\nax: plt.Axes = axes[1]\ntrue_dist.plot_pmf(ax=ax, label=\"true distribution\", color=\"C2\")\nposterior_predictive.plot_pmf(ax=ax, label=\"posterior predictive\")\nprior_predictive.plot_pmf(ax=ax, label=\"prior predictive\")\nax.axvline(x=X, color=\"black\", ymax=0.05, label=\"Sample\")\nax.set_title(\"Number of Successes\")\nax.legend()\n\nplt.show()\n "},{"location":"examples/generalized-inputs/","title":"Generalized Numerical Inputs","text":"Though the plotting is meant for numpy and python numbers, the conjugate models work with anything that works like numbers. 
For instance, Bayesian models in SQL using the SQL Builder, PyPika from pypika import Field \n\n# Columns from table in database\nN = Field(\"total\")\nX = Field(\"successes\")\n\n# Conjugate prior\nprior = Beta(alpha=1, beta=1)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\nprint(\"Posterior alpha:\", posterior.alpha)\nprint(\"Posterior beta:\", posterior.beta)\n# Posterior alpha: 1+\"successes\"\n# Posterior beta: 1+\"total\"-\"successes\"\n\n# Priors can be fields too\nalpha = Field(\"previous_successes\") - 1\nbeta = Field(\"previous_failures\") - 1\n\nprior = Beta(alpha=alpha, beta=beta)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\nprint(\"Posterior alpha:\", posterior.alpha)\nprint(\"Posterior beta:\", posterior.beta)\n# Posterior alpha: \"previous_successes\"-1+\"successes\"\n# Posterior beta: \"previous_failures\"-1+\"total\"-\"successes\"\n Using PyMC distributions for sampling with additional uncertainty import pymc as pm \n\nalpha = pm.Gamma.dist(alpha=1, beta=20)\nbeta = pm.Gamma.dist(alpha=1, beta=20)\n\n# Observed Data\nN = 10\nX = 4\n\n# Conjugate prior \nprior = Beta(alpha=alpha, beta=beta)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\n# Reconstruct the posterior distribution with PyMC\nprior_dist = pm.Beta.dist(alpha=prior.alpha, beta=prior.beta)\nposterior_dist = pm.Beta.dist(alpha=posterior.alpha, beta=posterior.beta)\n\nsamples = pm.draw([alpha, beta, prior_dist, posterior_dist], draws=1000)\n "},{"location":"examples/indexing/","title":"Indexing Parameters","text":"The distributions can be indexed for subsets. beta = np.arange(1, 10)\nprior = Beta(alpha=1, beta=beta)\n\nidx = [0, 5, -1]\nprior_subset = prior[idx]\nprior_subset.plot_pdf(label = lambda i: f\"prior {i}\")\nplt.legend()\nplt.show()\n "},{"location":"examples/linear-regression/","title":"Linear Regression","text":"We can fit linear regression that includes a predictive distribution for new data using a conjugate prior. 
This example only has one covariate, but the same approach can be used for multiple covariates. "},{"location":"examples/linear-regression/#simulate-data","title":"Simulate Data","text":"We are going to simulate data from a linear regression model. The true intercept is 3.5, the true slope is -2.0, and the true variance is 2.5. import numpy as np\nimport pandas as pd\n\nimport matplotlib.pyplot as plt\n\nfrom conjugate.distributions import NormalInverseGamma, MultivariateStudentT\nfrom conjugate.models import linear_regression, linear_regression_posterior_predictive\n\nintercept = 3.5\nslope = -2.0\nsigma = 2.5\n\nrng = np.random.default_rng(0)\n\nx_lim = 3\nn_points = 100\nx = np.linspace(-x_lim, x_lim, n_points)\ny = intercept + slope * x + rng.normal(scale=sigma, size=n_points)\n "},{"location":"examples/linear-regression/#define-prior-and-find-posterior","title":"Define Prior and Find Posterior","text":"There needs to be a prior for the intercept, slope, and the variance. prior = NormalInverseGamma(\n mu=np.array([0, 0]),\n delta_inverse=np.array([[1, 0], [0, 1]]),\n alpha=1,\n beta=1,\n)\n\ndef create_X(x: np.ndarray) -> np.ndarray:\n return np.stack([np.ones_like(x), x]).T\n\nX = create_X(x)\nposterior: NormalInverseGamma = linear_regression(\n X=X,\n y=y,\n normal_inverse_gamma_prior=prior,\n)\n "},{"location":"examples/linear-regression/#posterior-predictive-for-new-data","title":"Posterior Predictive for New Data","text":"The multivariate student-t distribution is used for the posterior predictive distribution. We have to draw samples from it since the scipy implementation does not have a ppf method. 
# New Data\nx_lim_new = 1.5 * x_lim\nx_new = np.linspace(-x_lim_new, x_lim_new, 20)\nX_new = create_X(x_new)\npp: MultivariateStudentT = linear_regression_posterior_predictive(normal_inverse_gamma=posterior, X=X_new)\n\nsamples = pp.dist.rvs(5_000).T\ndf_samples = pd.DataFrame(samples, index=x_new)\n "},{"location":"examples/linear-regression/#plot-results","title":"Plot Results","text":"We can see that the posterior predictive distribution begins to widen as we move away from the data. Overall, the posterior predictive distribution is a good fit for the data. The true line is within the 95% posterior predictive interval. def plot_abline(intercept: float, slope: float, ax: plt.Axes = None, **kwargs):\n \"\"\"Plot a line from slope and intercept\"\"\"\n if ax is None:\n ax = plt.gca()\n\n x_vals = np.array(ax.get_xlim())\n y_vals = intercept + slope * x_vals\n ax.plot(x_vals, y_vals, **kwargs)\n\n\ndef plot_lines(ax: plt.Axes, samples: np.ndarray, label: str, color: str, alpha: float):\n for i, betas in enumerate(samples):\n label = label if i == 0 else None\n plot_abline(betas[0], betas[1], ax=ax, color=color, alpha=alpha, label=label)\n\n\nfig, ax = plt.subplots()\nax.set_xlim(-x_lim, x_lim)\nax.set_ylim(y.min(), y.max())\n\nax.scatter(x, y, label=\"data\")\n\nplot_lines(\n ax=ax,\n samples=prior.sample_beta(size=100, random_state=rng),\n label=\"prior\",\n color=\"blue\",\n alpha=0.05,\n)\nplot_lines(\n ax=ax,\n samples=posterior.sample_beta(size=100, random_state=rng),\n label=\"posterior\",\n color=\"black\",\n alpha=0.2,\n)\n\nplot_abline(intercept, slope, ax=ax, label=\"true\", color=\"red\")\n\nax.set(xlabel=\"x\", ylabel=\"y\", title=\"Linear regression with conjugate prior\")\n\n# New Data\nax.plot(x_new, pp.mu, color=\"green\", label=\"posterior predictive mean\")\ndf_quantile = df_samples.T.quantile([0.025, 0.975]).T\nax.fill_between(\n x_new,\n df_quantile[0.025],\n df_quantile[0.975],\n alpha=0.2,\n color=\"green\",\n label=\"95% posterior predictive 
interval\",\n)\nax.legend()\nax.set(xlim=(-x_lim_new, x_lim_new))\nplt.show()\n "},{"location":"examples/plotting/","title":"Plotting Distributions","text":"All the distributions can be plotted using the plot_pdf and plot_pmf methods. The plot_pdf method is used for continuous distributions and the plot_pmf method is used for discrete distributions. There is limited support for some distributions like the Dirichlet or those without a dist scipy. from conjugate.distributions import Beta, Gamma, Normal\n\nimport matplotlib.pyplot as plt\n\nbeta = Beta(1, 1)\ngamma = Gamma(1, 1)\nnormal = Normal(0, 1)\n\nbound = 3\n\ndist = [beta, gamma, normal]\nlabels = [\"beta\", \"gamma\", \"normal\"]\nax = plt.gca()\nfor label, dist in zip(labels, dist):\n dist.set_bounds(-bound, bound).plot_pdf(label=label)\n\nax.legend()\nplt.show()\n The plotting is also supported for vectorized inputs. "},{"location":"examples/pymc-sampling/","title":"Unsupported Posterior Predictive Distributions with PyMC Sampling","text":"The geometric beta model posterior predictive doesn't have a common dist, but what doesn't mean the posterior predictive can be used. For instance, PyMC can be used to fill in this gap. import pymc as pm\n\nfrom conjugate.distribution import Beta\nfrom conjugate.models import geometric_beta\n\nprior = Beta(1, 1)\nposterior: Beta = geometric_beta(x=1, n=10, beta_prior=prior)\n\nposterior_dist = pm.Beta.dist(alpha=posterior.alpha, beta=posterior.beta)\ngeometric_posterior_predictive = pm.Geometric.dist(posterior_dist)\n\nposterior_predictive_samples = pm.draw(geometric_posterior_predictive, draws=100)\n "},{"location":"examples/scaling-distributions/","title":"Scaling Distributions","text":"Some of the distributions can be scaled by a constant factor or added together. For instance, operations with Poisson distribution represent the number of events in a given time interval. 
from conjugate.distributions import Poisson\n\nimport matplotlib.pyplot as plt\n\ndaily_rate = 0.25\ndaily_pois = Poisson(lam=daily_rate)\n\ntwo_day_pois = daily_pois + daily_pois\nweekly_pois = 7 * daily_pois\n\nmax_value = 7\nax = plt.gca()\ndists = [daily_pois, two_day_pois, weekly_pois]\nbase_labels = [\"daily\", \"two day\", \"weekly\"]\nfor dist, base_label in zip(dists, base_labels):\n label = f\"{base_label} rate={dist.lam}\"\n dist.set_max_value(max_value).plot_pmf(ax=ax, label=label)\n\nax.legend()\nplt.show()\n The normal distribution also supports scaling, making use of the fact that the variance of a scaled normal distribution scales with the square of the scaling factor. from conjugate.distributions import Normal\n\nimport matplotlib.pyplot as plt\n\nnorm = Normal(mu=0, sigma=1)\nnorm_times_2 = norm * 2\n\nbound = 6\nax = norm.set_bounds(-bound, bound).plot_pdf(label=f\"normal (std = {norm.sigma:.2f})\")\nnorm_times_2.set_bounds(-bound, bound).plot_pdf(ax=ax, label=f\"normal * 2 (std = {norm_times_2.sigma:.2f})\")\nax.legend()\nplt.show()\n "},{"location":"examples/scipy-connection/","title":"Connection to SciPy Distributions","text":"Many distributions have the dist attribute which is a scipy.stats distribution object. From there, the methods from scipy.stats to get the pdf, cdf, etc can be leveraged. from conjugate.distributions import Beta \n\nbeta = Beta(1, 1)\nscipy_dist = beta.dist \n\nprint(scipy_dist.mean())\n# 0.5\nprint(scipy_dist.ppf([0.025, 0.975]))\n# [0.025 0.975]\n\nsamples = scipy_dist.rvs(100)\n "},{"location":"examples/vectorized-inputs/","title":"Vectorized Inputs","text":"All data and priors will allow for vectorized inputs assuming the shapes work for broadcasting. 
The plotting also supports arrays of results. import numpy as np\n\nfrom conjugate.distributions import Beta\nfrom conjugate.models import binomial_beta\n\nimport matplotlib.pyplot as plt\n\n# Observed data (example values)\nN = 10\nx = 4\n\n# Analytics \nprior = Beta(alpha=1, beta=np.array([1, 5]))\nposterior = binomial_beta(n=N, x=x, beta_prior=prior)\n\n# Figure\nax = prior.plot_pdf(label=lambda i: f\"prior {i}\")\nposterior.plot_pdf(ax=ax, label=lambda i: f\"posterior {i}\")\nax.axvline(x=x / N, ymax=0.05, color=\"black\", linestyle=\"--\", label=\"MLE\")\nax.legend()\nplt.show()\n "}]}
\ No newline at end of file
+{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Conjugate Models","text":"Bayesian conjugate models in Python "},{"location":"#installation","title":"Installation","text":"pip install conjugate-models\n "},{"location":"#features","title":"Features","text":" - Connection to Scipy Distributions with
dist attribute - Built in Plotting with
plot_pdf and plot_pmf methods - Vectorized Operations for parameters and data
- Indexing Parameters for subsetting and slicing
- Generalized Numerical Inputs for inputs other than builtins and numpy arrays
- Unsupported Distributions for sampling from unsupported distributions
"},{"location":"#supported-models","title":"Supported Models","text":"Many likelihoods are supported including Bernoulli / Binomial Categorical / Multinomial Poisson Normal (including linear regression) - and many more
"},{"location":"#basic-usage","title":"Basic Usage","text":" - Define prior distribution from
distributions module - Pass data and prior into model from
models modules - Analytics with posterior and posterior predictive distributions
from conjugate.distributions import Beta, BetaBinomial\nfrom conjugate.models import binomial_beta, binomial_beta_posterior_predictive\n\n# Observed Data\nX = 4\nN = 10\n\n# Analytics\nprior = Beta(1, 1)\nprior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=prior)\n\nposterior: Beta = binomial_beta(n=N, x=X, beta_prior=prior)\nposterior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=posterior) \n\n# Figure\nimport matplotlib.pyplot as plt\n\nfig, axes = plt.subplots(ncols=2)\n\nax = axes[0]\nax = posterior.plot_pdf(ax=ax, label=\"posterior\")\nprior.plot_pdf(ax=ax, label=\"prior\")\nax.axvline(x=X/N, color=\"black\", ymax=0.05, label=\"MLE\")\nax.set_title(\"Success Rate\")\nax.legend()\n\nax = axes[1]\nposterior_predictive.plot_pmf(ax=ax, label=\"posterior predictive\")\nprior_predictive.plot_pmf(ax=ax, label=\"prior predictive\")\nax.axvline(x=X, color=\"black\", ymax=0.05, label=\"Sample\")\nax.set_title(\"Number of Successes\")\nax.legend()\nplt.show()\n "},{"location":"#too-simple","title":"Too Simple?","text":"Simple model, sure. Useful model, potentially. Constant probability of success, p , for n trials. 
rng = np.random.default_rng(42)\n\n# Observed Data\nn_times = 75\np = np.repeat(0.5, n_times)\nsamples = rng.binomial(n=1, p=p, size=n_times)\n\n# Model\nn = np.arange(n_times) + 1\nprior = Beta(alpha=1, beta=1)\nposterior = binomial_beta(n=n, x=samples.cumsum(), beta_prior=prior)\n\n# Figure\nplt.plot(n, p, color=\"black\", label=\"true p\", linestyle=\"--\")\nplt.scatter(n, samples, color=\"black\", label=\"observed samples\")\nplt.plot(n, posterior.dist.mean(), color=\"red\", label=\"posterior mean\")\n# fill between the 95% credible interval\nplt.fill_between(\n n, \n posterior.dist.ppf(0.025),\n posterior.dist.ppf(0.975),\n color=\"red\",\n alpha=0.2,\n label=\"95% credible interval\",\n)\npadding = 0.025\nplt.ylim(0 - padding, 1 + padding)\nplt.xlim(1, n_times)\nplt.legend(loc=\"best\")\nplt.xlabel(\"Number of trials\")\nplt.ylabel(\"Probability\")\nplt.show()\n Even with a moving probability, this simple-to-implement model can be useful. ...\n\ndef sigmoid(x):\n return 1 / (1 + np.exp(-x))\n\np_raw = rng.normal(loc=0, scale=0.2, size=n_times).cumsum()\np = sigmoid(p_raw)\n\n...\n "},{"location":"#resources","title":"Resources","text":""},{"location":"distributions/","title":"Distributions","text":"These are the supported distributions based on the conjugate models. Many have the dist attribute which is a scipy.stats distribution object. From there, you can use the methods from scipy.stats to get the pdf, cdf, etc. Distributions can be plotted using the plot_pmf or plot_pdf methods of the distribution. from conjugate.distributions import Beta \n\nbeta = Beta(1, 1)\nscipy_dist = beta.dist \n\nprint(scipy_dist.mean())\n# 0.5\nprint(scipy_dist.ppf([0.025, 0.975]))\n# [0.025 0.975]\n\nsamples = scipy_dist.rvs(100)\n\nbeta.plot_pdf(label=\"beta distribution\")\n Distributions like Poisson can be added with other Poissons or multiplied by numerical values in order to scale rate. 
For instance, daily_rate = 0.25\ndaily_pois = Poisson(lam=daily_rate)\n\ntwo_day_pois = daily_pois + daily_pois\nweekly_pois = 7 * daily_pois\n Below are the currently supported distributions "},{"location":"distributions/#conjugate.distributions.Beta","title":"Beta dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Beta distribution. Parameters: Name Type Description Default alpha NUMERIC shape parameter required beta NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass Beta(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Beta distribution.\n\n Args:\n alpha: shape parameter\n beta: shape parameter\n\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n def __post_init__(self) -> None:\n self.max_value = 1.0\n\n @classmethod\n def from_mean(cls, mean: float, alpha: float) -> \"Beta\":\n \"\"\"Alternative constructor from mean and alpha.\"\"\"\n beta = get_beta_param_from_mean_and_alpha(mean=mean, alpha=alpha)\n return cls(alpha=alpha, beta=beta)\n\n @classmethod\n def from_successes_and_failures(cls, successes: int, failures: int) -> \"Beta\":\n \"\"\"Alternative constructor based on hyperparameter interpretation.\"\"\"\n alpha = successes + 1\n beta = failures + 1\n return cls(alpha=alpha, beta=beta)\n\n @property\n def dist(self):\n return stats.beta(self.alpha, self.beta)\n "},{"location":"distributions/#conjugate.distributions.Beta.from_mean","title":"from_mean(mean, alpha) classmethod ","text":"Alternative constructor from mean and alpha. 
Source code in conjugate/distributions.py @classmethod\ndef from_mean(cls, mean: float, alpha: float) -> \"Beta\":\n \"\"\"Alternative constructor from mean and alpha.\"\"\"\n beta = get_beta_param_from_mean_and_alpha(mean=mean, alpha=alpha)\n return cls(alpha=alpha, beta=beta)\n "},{"location":"distributions/#conjugate.distributions.Beta.from_successes_and_failures","title":"from_successes_and_failures(successes, failures) classmethod ","text":"Alternative constructor based on hyperparameter interpretation. Source code in conjugate/distributions.py @classmethod\ndef from_successes_and_failures(cls, successes: int, failures: int) -> \"Beta\":\n \"\"\"Alternative constructor based on hyperparameter interpretation.\"\"\"\n alpha = successes + 1\n beta = failures + 1\n return cls(alpha=alpha, beta=beta)\n "},{"location":"distributions/#conjugate.distributions.BetaBinomial","title":"BetaBinomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Beta binomial distribution. Parameters: Name Type Description Default n NUMERIC number of trials required alpha NUMERIC shape parameter required beta NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass BetaBinomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Beta binomial distribution.\n\n Args:\n n: number of trials\n alpha: shape parameter\n beta: shape parameter\n\n \"\"\"\n\n n: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n\n def __post_init__(self):\n if isinstance(self.n, np.ndarray):\n self.max_value = self.n.max()\n else:\n self.max_value = self.n\n\n @property\n def dist(self):\n return stats.betabinom(self.n, self.alpha, self.beta)\n "},{"location":"distributions/#conjugate.distributions.BetaNegativeBinomial","title":"BetaNegativeBinomial dataclass ","text":" Bases: SliceMixin Beta negative binomial distribution. 
Parameters: Name Type Description Default n NUMERIC number of successes required alpha NUMERIC shape parameter required beta NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass BetaNegativeBinomial(SliceMixin):\n \"\"\"Beta negative binomial distribution.\n\n Args:\n n: number of successes\n alpha: shape parameter\n beta: shape parameter\n\n \"\"\"\n\n n: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n "},{"location":"distributions/#conjugate.distributions.Binomial","title":"Binomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Binomial distribution. Parameters: Name Type Description Default n NUMERIC number of trials required p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass Binomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Binomial distribution.\n\n Args:\n n: number of trials\n p: probability of success\n\n \"\"\"\n\n n: NUMERIC\n p: NUMERIC\n\n def __post_init__(self):\n if isinstance(self.n, np.ndarray):\n self.max_value = self.n.max()\n else:\n self.max_value = self.n\n\n @property\n def dist(self):\n return stats.binom(n=self.n, p=self.p)\n "},{"location":"distributions/#conjugate.distributions.Dirichlet","title":"Dirichlet dataclass ","text":" Bases: DirichletPlotDistMixin Dirichlet distribution. Parameters: Name Type Description Default alpha NUMERIC shape parameter required Source code in conjugate/distributions.py @dataclass\nclass Dirichlet(DirichletPlotDistMixin):\n \"\"\"Dirichlet distribution.\n\n Args:\n alpha: shape parameter\n\n \"\"\"\n\n alpha: NUMERIC\n\n def __post_init__(self) -> None:\n self.max_value = 1.0\n\n @property\n def dist(self):\n if self.alpha.ndim == 1:\n return stats.dirichlet(self.alpha)\n\n return VectorizedDist(self.alpha, dist=stats.dirichlet)\n "},{"location":"distributions/#conjugate.distributions.Exponential","title":"Exponential dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Exponential distribution. 
Parameters: Name Type Description Default lam NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Exponential(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Exponential distribution.\n\n Args:\n lam: rate parameter\n\n \"\"\"\n\n lam: NUMERIC\n\n @property\n def dist(self):\n return stats.expon(scale=self.lam)\n\n def __mul__(self, other):\n return Gamma(alpha=other, beta=1 / self.lam)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Gamma","title":"Gamma dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Gamma distribution. Gamma Distribution Scipy Documentation Parameters: Name Type Description Default alpha NUMERIC shape parameter required beta NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Gamma(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Gamma distribution.\n\n <a href=https://en.wikipedia.org/wiki/Gamma_distribution>Gamma Distribution</a>\n <a href=https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.gamma.html>Scipy Documentation</a>\n\n Args:\n alpha: shape parameter\n beta: rate parameter\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n @property\n def dist(self):\n return stats.gamma(a=self.alpha, scale=1 / self.beta)\n\n def __mul__(self, other):\n return Gamma(alpha=self.alpha * other, beta=self.beta)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Geometric","title":"Geometric dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Geometric distribution. 
Parameters: Name Type Description Default p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass Geometric(DiscretePlotMixin, SliceMixin):\n \"\"\"Geometric distribution.\n\n Args:\n p: probability of success\n\n \"\"\"\n\n p: NUMERIC\n\n @property\n def dist(self):\n return stats.geom(self.p)\n "},{"location":"distributions/#conjugate.distributions.InverseGamma","title":"InverseGamma dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin InverseGamma distribution. Parameters: Name Type Description Default alpha NUMERIC shape required beta NUMERIC scale required Source code in conjugate/distributions.py @dataclass\nclass InverseGamma(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"InverseGamma distribution.\n\n Args:\n alpha: shape\n beta: scale\n\n \"\"\"\n\n alpha: NUMERIC\n beta: NUMERIC\n\n @property\n def dist(self):\n return stats.invgamma(a=self.alpha, scale=self.beta)\n "},{"location":"distributions/#conjugate.distributions.NegativeBinomial","title":"NegativeBinomial dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Negative binomial distribution. Parameters: Name Type Description Default n NUMERIC number of successes required p NUMERIC probability of success required Source code in conjugate/distributions.py @dataclass\nclass NegativeBinomial(DiscretePlotMixin, SliceMixin):\n \"\"\"Negative binomial distribution.\n\n Args:\n n: number of successes\n p: probability of success\n\n \"\"\"\n\n n: NUMERIC\n p: NUMERIC\n\n @property\n def dist(self):\n return stats.nbinom(n=self.n, p=self.p)\n\n def __mul__(self, other):\n return NegativeBinomial(n=self.n * other, p=self.p)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.Normal","title":"Normal dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Normal distribution. 
Parameters: Name Type Description Default mu NUMERIC mean required sigma NUMERIC standard deviation required Source code in conjugate/distributions.py @dataclass\nclass Normal(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Normal distribution.\n\n Args:\n mu: mean\n sigma: standard deviation\n\n \"\"\"\n\n mu: NUMERIC\n sigma: NUMERIC\n\n @property\n def dist(self):\n return stats.norm(self.mu, self.sigma)\n\n def __mul__(self, other):\n sigma = ((self.sigma**2) * other) ** 0.5\n return Normal(mu=self.mu * other, sigma=sigma)\n\n __rmul__ = __mul__\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma","title":"NormalInverseGamma dataclass ","text":"Normal inverse gamma distribution. Parameters: Name Type Description Default mu NUMERIC mean required delta_inverse NUMERIC covariance matrix required alpha NUMERIC shape required beta NUMERIC scale required Source code in conjugate/distributions.py @dataclass\nclass NormalInverseGamma:\n \"\"\"Normal inverse gamma distribution.\n\n Args:\n mu: mean\n delta_inverse: covariance matrix\n alpha: shape\n beta: scale\n\n \"\"\"\n\n mu: NUMERIC\n delta_inverse: NUMERIC\n alpha: NUMERIC\n beta: NUMERIC\n\n @classmethod\n def from_inverse_gamma(\n cls, mu: NUMERIC, delta_inverse: NUMERIC, inverse_gamma: InverseGamma\n ) -> \"NormalInverseGamma\":\n return cls(\n mu=mu,\n delta_inverse=delta_inverse,\n alpha=inverse_gamma.alpha,\n beta=inverse_gamma.beta,\n )\n\n @property\n def inverse_gamma(self) -> InverseGamma:\n return InverseGamma(alpha=self.alpha, beta=self.beta)\n\n def sample_variance(self, size: int, random_state=None) -> NUMERIC:\n \"\"\"Sample variance from the inverse gamma distribution.\n\n Args:\n size: number of samples\n random_state: random state\n\n Returns:\n samples from the inverse gamma distribution\n\n \"\"\"\n return self.inverse_gamma.dist.rvs(size=size, random_state=random_state)\n\n def sample_beta(\n self, size: int, return_variance: bool = False, random_state=None\n ) -> 
Union[NUMERIC, Tuple[NUMERIC, NUMERIC]]:\n \"\"\"Sample beta from the normal distribution.\n\n Args:\n size: number of samples\n return_variance: whether to return variance as well\n random_state: random state\n\n Returns:\n samples from the normal distribution and optionally variance\n\n \"\"\"\n variance = self.sample_variance(size=size, random_state=random_state)\n\n beta = np.stack(\n [\n stats.multivariate_normal(self.mu, v * self.delta_inverse).rvs(\n size=1, random_state=random_state\n )\n for v in variance\n ]\n )\n\n if return_variance:\n return beta, variance\n\n return beta\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma.sample_beta","title":"sample_beta(size, return_variance=False, random_state=None) ","text":"Sample beta from the normal distribution. Parameters: Name Type Description Default size int number of samples required return_variance bool whether to return variance as well False random_state random state None Returns: Type Description Union[NUMERIC, Tuple[NUMERIC, NUMERIC]] samples from the normal distribution and optionally variance Source code in conjugate/distributions.py def sample_beta(\n self, size: int, return_variance: bool = False, random_state=None\n) -> Union[NUMERIC, Tuple[NUMERIC, NUMERIC]]:\n \"\"\"Sample beta from the normal distribution.\n\n Args:\n size: number of samples\n return_variance: whether to return variance as well\n random_state: random state\n\n Returns:\n samples from the normal distribution and optionally variance\n\n \"\"\"\n variance = self.sample_variance(size=size, random_state=random_state)\n\n beta = np.stack(\n [\n stats.multivariate_normal(self.mu, v * self.delta_inverse).rvs(\n size=1, random_state=random_state\n )\n for v in variance\n ]\n )\n\n if return_variance:\n return beta, variance\n\n return beta\n "},{"location":"distributions/#conjugate.distributions.NormalInverseGamma.sample_variance","title":"sample_variance(size, random_state=None) ","text":"Sample variance from 
the inverse gamma distribution. Parameters: Name Type Description Default size int number of samples required random_state random state None Returns: Type Description NUMERIC samples from the inverse gamma distribution Source code in conjugate/distributions.py def sample_variance(self, size: int, random_state=None) -> NUMERIC:\n \"\"\"Sample variance from the inverse gamma distribution.\n\n Args:\n size: number of samples\n random_state: random state\n\n Returns:\n samples from the inverse gamma distribution\n\n \"\"\"\n return self.inverse_gamma.dist.rvs(size=size, random_state=random_state)\n "},{"location":"distributions/#conjugate.distributions.Poisson","title":"Poisson dataclass ","text":" Bases: DiscretePlotMixin , SliceMixin Poisson distribution. Parameters: Name Type Description Default lam NUMERIC rate parameter required Source code in conjugate/distributions.py @dataclass\nclass Poisson(DiscretePlotMixin, SliceMixin):\n \"\"\"Poisson distribution.\n\n Args:\n lam: rate parameter\n\n \"\"\"\n\n lam: NUMERIC\n\n @property\n def dist(self):\n return stats.poisson(self.lam)\n\n def __mul__(self, other) -> \"Poisson\":\n return Poisson(lam=self.lam * other)\n\n __rmul__ = __mul__\n\n def __add__(self, other) -> \"Poisson\":\n return Poisson(self.lam + other.lam)\n\n __radd__ = __add__\n "},{"location":"distributions/#conjugate.distributions.StudentT","title":"StudentT dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin StudentT distribution. 
Parameters: Name Type Description Default mu NUMERIC mean required sigma NUMERIC standard deviation required nu NUMERIC degrees of freedom required Source code in conjugate/distributions.py @dataclass\nclass StudentT(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"StudentT distribution.\n\n Args:\n mu: mean\n sigma: standard deviation\n nu: degrees of freedom\n\n \"\"\"\n\n mu: NUMERIC\n sigma: NUMERIC\n nu: NUMERIC\n\n @property\n def dist(self):\n return stats.t(self.nu, self.mu, self.sigma)\n "},{"location":"distributions/#conjugate.distributions.Uniform","title":"Uniform dataclass ","text":" Bases: ContinuousPlotDistMixin , SliceMixin Uniform distribution. Parameters: Name Type Description Default low NUMERIC lower bound required high NUMERIC upper bound required Source code in conjugate/distributions.py @dataclass\nclass Uniform(ContinuousPlotDistMixin, SliceMixin):\n \"\"\"Uniform distribution.\n\n Args:\n low: lower bound\n high: upper bound\n\n \"\"\"\n\n low: NUMERIC\n high: NUMERIC\n\n def __post_init__(self):\n self.min_value = self.low\n self.max_value = self.high\n\n @property\n def dist(self):\n return stats.uniform(self.low, self.high)\n "},{"location":"mixins/","title":"Mixins","text":"Two sets of mixins to support the plotting and slicing of the distribution parameters "},{"location":"mixins/#conjugate.plot.ContinuousPlotDistMixin","title":"ContinuousPlotDistMixin ","text":" Bases: PlotDistMixin Functionality for plot_pdf method of continuous distributions. 
Source code in conjugate/plot.py class ContinuousPlotDistMixin(PlotDistMixin):\n \"\"\"Functionality for plot_pdf method of continuous distributions.\"\"\"\n\n def plot_pdf(self, ax: Optional[plt.Axes] = None, **kwargs) -> plt.Axes:\n \"\"\"Plot the pdf of distribution\n\n Args:\n ax: matplotlib Axes, optional\n **kwargs: Additional kwargs to pass to matplotlib\n\n Returns:\n new or modified Axes\n\n \"\"\"\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, **kwargs)\n\n def _create_x_values(self) -> np.ndarray:\n return np.linspace(self.min_value, self.max_value, 100)\n\n def _setup_labels(self, ax) -> None:\n ax.set_xlabel(\"Domain\")\n ax.set_ylabel(\"Density $f(x)$\")\n\n def _create_plot_on_axis(self, x, ax, **kwargs) -> plt.Axes:\n yy = self.dist.pdf(x)\n if \"label\" in kwargs:\n label = kwargs.pop(\"label\")\n label = resolve_label(label, yy)\n else:\n label = None\n\n ax.plot(x, yy, label=label, **kwargs)\n self._setup_labels(ax=ax)\n ax.set_ylim(0, None)\n return ax\n "},{"location":"mixins/#conjugate.plot.ContinuousPlotDistMixin.plot_pdf","title":"plot_pdf(ax=None, **kwargs) ","text":"Plot the pdf of distribution Parameters: Name Type Description Default ax Optional[Axes] matplotlib Axes, optional None **kwargs Additional kwargs to pass to matplotlib {} Returns: Type Description Axes new or modified Axes Source code in conjugate/plot.py def plot_pdf(self, ax: Optional[plt.Axes] = None, **kwargs) -> plt.Axes:\n \"\"\"Plot the pdf of distribution\n\n Args:\n ax: matplotlib Axes, optional\n **kwargs: Additional kwargs to pass to matplotlib\n\n Returns:\n new or modified Axes\n\n \"\"\"\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, **kwargs)\n "},{"location":"mixins/#conjugate.plot.DirichletPlotDistMixin","title":"DirichletPlotDistMixin ","text":" Bases: ContinuousPlotDistMixin 
Plot the pdf using samples from the dirichlet distribution. Source code in conjugate/plot.py class DirichletPlotDistMixin(ContinuousPlotDistMixin):\n \"\"\"Plot the pdf using samples from the dirichlet distribution.\"\"\"\n\n def plot_pdf(\n self, ax: Optional[plt.Axes] = None, samples: int = 1_000, **kwargs\n ) -> plt.Axes:\n \"\"\"Plots the pdf\"\"\"\n distribution_samples = self.dist.rvs(size=samples)\n\n ax = self._settle_axis(ax=ax)\n xx = self._create_x_values()\n\n for x in distribution_samples.T:\n kde = gaussian_kde(x)\n\n yy = kde(xx)\n\n ax.plot(xx, yy, **kwargs)\n\n self._setup_labels(ax=ax)\n return ax\n "},{"location":"mixins/#conjugate.plot.DirichletPlotDistMixin.plot_pdf","title":"plot_pdf(ax=None, samples=1000, **kwargs) ","text":"Plots the pdf Source code in conjugate/plot.py def plot_pdf(\n self, ax: Optional[plt.Axes] = None, samples: int = 1_000, **kwargs\n) -> plt.Axes:\n \"\"\"Plots the pdf\"\"\"\n distribution_samples = self.dist.rvs(size=samples)\n\n ax = self._settle_axis(ax=ax)\n xx = self._create_x_values()\n\n for x in distribution_samples.T:\n kde = gaussian_kde(x)\n\n yy = kde(xx)\n\n ax.plot(xx, yy, **kwargs)\n\n self._setup_labels(ax=ax)\n return ax\n "},{"location":"mixins/#conjugate.plot.DiscretePlotMixin","title":"DiscretePlotMixin ","text":" Bases: PlotDistMixin Adding the plot_pmf method to class. 
Source code in conjugate/plot.py class DiscretePlotMixin(PlotDistMixin):\n \"\"\"Adding the plot_pmf method to class.\"\"\"\n\n def plot_pmf(\n self, ax: Optional[plt.Axes] = None, mark: str = \"o-\", **kwargs\n ) -> plt.Axes:\n ax = self._settle_axis(ax=ax)\n\n x = self._create_x_values()\n x = self._reshape_x_values(x)\n\n return self._create_plot_on_axis(x, ax, mark, **kwargs)\n\n def _create_x_values(self) -> np.ndarray:\n return np.arange(self.min_value, self.max_value + 1, 1)\n\n def _create_plot_on_axis(\n self, x, ax, mark, conditional: bool = False, **kwargs\n ) -> plt.Axes:\n yy = self.dist.pmf(x)\n if conditional:\n yy = yy / np.sum(yy)\n ylabel = f\"Conditional Probability $f(x|{self.min_value} \\\\leq x \\\\leq {self.max_value})$\"\n else:\n ylabel = \"Probability $f(x)$\"\n\n if \"label\" in kwargs:\n label = kwargs.pop(\"label\")\n label = resolve_label(label, yy)\n else:\n label = None\n\n ax.plot(x, yy, mark, label=label, **kwargs)\n\n if self.max_value - self.min_value < 15:\n ax.set_xticks(x.ravel())\n else:\n ax.set_xticks(x.ravel(), minor=True)\n ax.set_xticks(x[::5].ravel())\n\n ax.set_xlabel(\"Domain\")\n ax.set_ylabel(ylabel)\n ax.set_ylim(0, None)\n return ax\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin","title":"PlotDistMixin ","text":"Base mixin in order to support plotting. Requires the dist attribute of the scipy distribution. Source code in conjugate/plot.py class PlotDistMixin:\n \"\"\"Base mixin in order to support plotting. 
Requires the dist attribute of the scipy distribution.\"\"\"\n\n @property\n def dist(self) -> Distribution:\n raise NotImplementedError(\"Implement this property in the subclass.\")\n\n @property\n def max_value(self) -> float:\n if not hasattr(self, \"_max_value\"):\n raise ValueError(\"Set the max value before plotting.\")\n\n return self._max_value\n\n @max_value.setter\n def max_value(self, value: float) -> None:\n self._max_value = value\n\n def set_max_value(self, value: float) -> \"PlotDistMixin\":\n self.max_value = value\n\n return self\n\n @property\n def min_value(self) -> float:\n if not hasattr(self, \"_min_value\"):\n self._min_value = 0.0\n\n return self._min_value\n\n @min_value.setter\n def min_value(self, value: float) -> None:\n self._min_value = value\n\n def set_min_value(self, value: float) -> \"PlotDistMixin\":\n \"\"\"Set the minimum value for plotting.\"\"\"\n self.min_value = value\n\n return self\n\n def set_bounds(self, lower: float, upper: float) -> \"PlotDistMixin\":\n \"\"\"Set both the min and max values for plotting.\"\"\"\n return self.set_min_value(lower).set_max_value(upper)\n\n def _reshape_x_values(self, x: np.ndarray) -> np.ndarray:\n \"\"\"Make sure that the values are ready for plotting.\"\"\"\n for value in asdict(self).values():\n if not isinstance(value, float):\n return x[:, None]\n\n return x\n\n def _settle_axis(self, ax: Optional[plt.Axes] = None) -> plt.Axes:\n return ax if ax is not None else plt.gca()\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin.set_bounds","title":"set_bounds(lower, upper) ","text":"Set both the min and max values for plotting. Source code in conjugate/plot.py def set_bounds(self, lower: float, upper: float) -> \"PlotDistMixin\":\n \"\"\"Set both the min and max values for plotting.\"\"\"\n return self.set_min_value(lower).set_max_value(upper)\n "},{"location":"mixins/#conjugate.plot.PlotDistMixin.set_min_value","title":"set_min_value(value) ","text":"Set the minimum value for plotting. 
Source code in conjugate/plot.py def set_min_value(self, value: float) -> \"PlotDistMixin\":\n \"\"\"Set the minimum value for plotting.\"\"\"\n self.min_value = value\n\n return self\n "},{"location":"mixins/#conjugate.plot.resolve_label","title":"resolve_label(label, yy) ","text":"https://stackoverflow.com/questions/73662931/matplotlib-plot-a-numpy-array-as-many-lines-with-a-single-label Source code in conjugate/plot.py def resolve_label(label: LABEL_INPUT, yy: np.ndarray):\n \"\"\"\n\n https://stackoverflow.com/questions/73662931/matplotlib-plot-a-numpy-array-as-many-lines-with-a-single-label\n \"\"\"\n if yy.ndim == 1:\n return label\n\n ncols = yy.shape[1]\n if ncols != 1:\n if isinstance(label, str):\n return [f\"{label} {i}\" for i in range(1, ncols + 1)]\n\n if callable(label):\n return [label(i) for i in range(ncols)]\n\n if isinstance(label, Iterable):\n return label\n\n raise ValueError(\"Label must be a string, iterable, or callable.\")\n\n return label\n "},{"location":"mixins/#conjugate.slice.SliceMixin","title":"SliceMixin ","text":"Mixin in order to slice the parameters Source code in conjugate/slice.py class SliceMixin:\n \"\"\"Mixin in order to slice the parameters\"\"\"\n\n def __getitem__(self, key):\n params = asdict(self)\n\n def slice(value, key):\n try:\n return value[key]\n except Exception:\n return value\n\n new_params = {k: slice(value=v, key=key) for k, v in params.items()}\n\n return self.__class__(**new_params)\n "},{"location":"models/","title":"Models","text":"For more on these models, check out the Conjugate Prior Wikipedia Table Below are the supported models "},{"location":"models/#conjugate.models.bernoulli_beta","title":"bernoulli_beta(x, beta_prior) ","text":"Posterior distribution for a bernoulli likelihood with a beta prior. 
Parameters: Name Type Description Default x NUMERIC successes from the trials required beta_prior Beta Beta distribution prior required Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def bernoulli_beta(x: NUMERIC, beta_prior: Beta) -> Beta:\n \"\"\"Posterior distribution for a bernoulli likelihood with a beta prior.\n\n Args:\n x: successes from the trials\n beta_prior: Beta distribution prior\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n return binomial_beta(n=1, x=x, beta_prior=beta_prior)\n "},{"location":"models/#conjugate.models.bernoulli_beta_posterior_predictive","title":"bernoulli_beta_posterior_predictive(beta) ","text":"Posterior predictive distribution for a bernoulli likelihood with a beta prior. Parameters: Name Type Description Default beta Beta Beta distribution required Returns: Type Description BetaBinomial BetaBinomial posterior predictive distribution Source code in conjugate/models.py def bernoulli_beta_posterior_predictive(beta: Beta) -> BetaBinomial:\n \"\"\"Posterior predictive distribution for a bernoulli likelihood with a beta prior.\n\n Args:\n beta: Beta distribution\n\n Returns:\n BetaBinomial posterior predictive distribution\n\n \"\"\"\n return binomial_beta_posterior_predictive(n=1, beta=beta)\n "},{"location":"models/#conjugate.models.binomial_beta","title":"binomial_beta(n, x, beta_prior) ","text":"Posterior distribution for a binomial likelihood with a beta prior. 
Parameters: Name Type Description Default n NUMERIC total number of trials required x NUMERIC successes from the trials required beta_prior Beta Beta distribution prior required Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def binomial_beta(n: NUMERIC, x: NUMERIC, beta_prior: Beta) -> Beta:\n \"\"\"Posterior distribution for a binomial likelihood with a beta prior.\n\n Args:\n n: total number of trials\n x: successes from the trials\n beta_prior: Beta distribution prior\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n alpha_post, beta_post = get_binomial_beta_posterior_params(\n beta_prior.alpha, beta_prior.beta, n, x\n )\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.binomial_beta_posterior_predictive","title":"binomial_beta_posterior_predictive(n, beta) ","text":"Posterior predictive distribution for a binomial likelihood with a beta prior. Parameters: Name Type Description Default n NUMERIC number of trials required beta Beta Beta distribution required Returns: Type Description BetaBinomial BetaBinomial posterior predictive distribution Source code in conjugate/models.py def binomial_beta_posterior_predictive(n: NUMERIC, beta: Beta) -> BetaBinomial:\n \"\"\"Posterior predictive distribution for a binomial likelihood with a beta prior.\n\n Args:\n n: number of trials\n beta: Beta distribution\n\n Returns:\n BetaBinomial posterior predictive distribution\n\n \"\"\"\n return BetaBinomial(n=n, alpha=beta.alpha, beta=beta.beta)\n "},{"location":"models/#conjugate.models.categorical_dirichlet","title":"categorical_dirichlet(x, dirichlet_prior) ","text":"Posterior distribution of Categorical model with Dirichlet prior. 
Parameters: Name Type Description Default x NUMERIC counts required dirichlet_prior Dirichlet Dirichlet prior on the counts required Returns: Type Description Dirichlet Dirichlet posterior distribution Source code in conjugate/models.py def categorical_dirichlet(x: NUMERIC, dirichlet_prior: Dirichlet) -> Dirichlet:\n \"\"\"Posterior distribution of Categorical model with Dirichlet prior.\n\n Args:\n x: counts\n dirichlet_prior: Dirichlet prior on the counts\n\n Returns:\n Dirichlet posterior distribution\n\n \"\"\"\n alpha_post = get_dirichlet_posterior_params(dirichlet_prior.alpha, x)\n\n return Dirichlet(alpha=alpha_post)\n "},{"location":"models/#conjugate.models.exponential_gamma","title":"exponential_gamma(x_total, n, gamma_prior) ","text":"Posterior distribution for an exponential likelihood with a gamma prior. Parameters: Name Type Description Default x_total NUMERIC sum of all outcomes required n NUMERIC total number of samples in x_total required gamma_prior Gamma Gamma prior required Returns: Type Description Gamma Gamma posterior distribution Source code in conjugate/models.py def exponential_gamma(x_total: NUMERIC, n: NUMERIC, gamma_prior: Gamma) -> Gamma:\n \"\"\"Posterior distribution for an exponential likelihood with a gamma prior.\n\n Args:\n x_total: sum of all outcomes\n n: total number of samples in x_total\n gamma_prior: Gamma prior\n\n Returns:\n Gamma posterior distribution\n\n \"\"\"\n alpha_post, beta_post = get_exponential_gamma_posterior_params(\n alpha=gamma_prior.alpha, beta=gamma_prior.beta, x_total=x_total, n=n\n )\n\n return Gamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.geometric_beta","title":"geometric_beta(x_total, n, beta_prior, one_start=True) ","text":"Posterior distribution for a geometric likelihood with a beta prior. 
Parameters: Name Type Description Default x_total sum of all trial outcomes required n total number of trials required beta_prior Beta Beta distribution prior required one_start bool whether outcomes start at 1, defaults to True. False means outcomes start at 0. True Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def geometric_beta(x_total, n, beta_prior: Beta, one_start: bool = True) -> Beta:\n \"\"\"Posterior distribution for a geometric likelihood with a beta prior.\n\n Args:\n x_total: sum of all trial outcomes\n n: total number of trials\n beta_prior: Beta distribution prior\n one_start: whether outcomes start at 1, defaults to True. False means outcomes start at 0.\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n alpha_post = beta_prior.alpha + n\n beta_post = beta_prior.beta + x_total\n\n if one_start:\n beta_post = beta_post - n\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.linear_regression","title":"linear_regression(X, y, normal_inverse_gamma_prior, inv=np.linalg.inv) ","text":"Posterior distribution for a linear regression model with a normal inverse gamma prior. Derivation taken from this blog here. 
Parameters: Name Type Description Default X NUMERIC design matrix required y NUMERIC response vector required normal_inverse_gamma_prior NormalInverseGamma NormalInverseGamma prior required inv function to invert matrix, defaults to np.linalg.inv inv Returns: Type Description NormalInverseGamma NormalInverseGamma posterior distribution Source code in conjugate/models.py def linear_regression(\n X: NUMERIC,\n y: NUMERIC,\n normal_inverse_gamma_prior: NormalInverseGamma,\n inv=np.linalg.inv,\n) -> NormalInverseGamma:\n \"\"\"Posterior distribution for a linear regression model with a normal inverse gamma prior.\n\n Derivation taken from this blog [here](https://gregorygundersen.com/blog/2020/02/04/bayesian-linear-regression/).\n\n Args:\n X: design matrix\n y: response vector\n normal_inverse_gamma_prior: NormalInverseGamma prior\n inv: function to invert matrix, defaults to np.linalg.inv\n\n Returns:\n NormalInverseGamma posterior distribution\n\n \"\"\"\n N = X.shape[0]\n\n delta = inv(normal_inverse_gamma_prior.delta_inverse)\n\n delta_post = (X.T @ X) + delta\n delta_post_inverse = inv(delta_post)\n\n mu_post = (\n # (B, B)\n delta_post_inverse\n # (B, 1)\n # (B, B) * (B, 1) + (B, N) * (N, 1)\n @ (delta @ normal_inverse_gamma_prior.mu + X.T @ y)\n )\n\n alpha_post = normal_inverse_gamma_prior.alpha + (0.5 * N)\n beta_post = normal_inverse_gamma_prior.beta + (\n 0.5\n * (\n (y.T @ y)\n # (1, B) * (B, B) * (B, 1)\n + (normal_inverse_gamma_prior.mu.T @ delta @ normal_inverse_gamma_prior.mu)\n # (1, B) * (B, B) * (B, 1)\n - (mu_post.T @ delta_post @ mu_post)\n )\n )\n\n return NormalInverseGamma(\n mu=mu_post, delta_inverse=delta_post_inverse, alpha=alpha_post, beta=beta_post\n )\n "},{"location":"models/#conjugate.models.linear_regression_posterior_predictive","title":"linear_regression_posterior_predictive(normal_inverse_gamma, X, eye=np.eye) ","text":"Posterior predictive distribution for a linear regression model with a normal inverse gamma prior. 
Parameters: Name Type Description Default normal_inverse_gamma NormalInverseGamma NormalInverseGamma posterior required X NUMERIC design matrix required eye function to get identity matrix, defaults to np.eye eye Returns: Type Description MultivariateStudentT MultivariateStudentT posterior predictive distribution Source code in conjugate/models.py def linear_regression_posterior_predictive(\n normal_inverse_gamma: NormalInverseGamma, X: NUMERIC, eye=np.eye\n) -> MultivariateStudentT:\n \"\"\"Posterior predictive distribution for a linear regression model with a normal inverse gamma prior.\n\n Args:\n normal_inverse_gamma: NormalInverseGamma posterior\n X: design matrix\n eye: function to get identity matrix, defaults to np.eye\n\n Returns:\n MultivariateStudentT posterior predictive distribution\n\n \"\"\"\n mu = X @ normal_inverse_gamma.mu\n sigma = (normal_inverse_gamma.beta / normal_inverse_gamma.alpha) * (\n eye(X.shape[0]) + (X @ normal_inverse_gamma.delta_inverse @ X.T)\n )\n nu = 2 * normal_inverse_gamma.alpha\n\n return MultivariateStudentT(\n mu=mu,\n sigma=sigma,\n nu=nu,\n )\n "},{"location":"models/#conjugate.models.multinomial_dirichlet","title":"multinomial_dirichlet(x, dirichlet_prior) ","text":"Posterior distribution of Multinomial model with Dirichlet prior. 
Parameters: Name Type Description Default x NUMERIC counts required dirichlet_prior Dirichlet Dirichlet prior on the counts required Returns: Type Description Dirichlet Dirichlet posterior distribution Source code in conjugate/models.py def multinomial_dirichlet(x: NUMERIC, dirichlet_prior: Dirichlet) -> Dirichlet:\n \"\"\"Posterior distribution of Multinomial model with Dirichlet prior.\n\n Args:\n x: counts\n dirichlet_prior: Dirichlet prior on the counts\n\n Returns:\n Dirichlet posterior distribution\n\n \"\"\"\n alpha_post = get_dirichlet_posterior_params(dirichlet_prior.alpha, x)\n\n return Dirichlet(alpha=alpha_post)\n "},{"location":"models/#conjugate.models.negative_binomial_beta","title":"negative_binomial_beta(r, n, x, beta_prior) ","text":"Posterior distribution for a negative binomial likelihood with a beta prior. Parameters: Name Type Description Default r number of failures required n number of trials required x number of successes required beta_prior Beta Beta distribution prior required Returns: Type Description Beta Beta distribution posterior Source code in conjugate/models.py def negative_binomial_beta(r, n, x, beta_prior: Beta) -> Beta:\n \"\"\"Posterior distribution for a negative binomial likelihood with a beta prior.\n\n Args:\n r: number of failures\n n: number of trials\n x: number of successes\n beta_prior: Beta distribution prior\n\n Returns:\n Beta distribution posterior\n\n \"\"\"\n alpha_post = beta_prior.alpha + (r * n)\n beta_post = beta_prior.beta + x\n\n return Beta(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.negative_binomial_beta_posterior_predictive","title":"negative_binomial_beta_posterior_predictive(r, beta) ","text":"Posterior predictive distribution for a negative binomial likelihood with a beta prior Source code in conjugate/models.py def negative_binomial_beta_posterior_predictive(r, beta: Beta) -> BetaNegativeBinomial:\n \"\"\"Posterior predictive distribution for a negative binomial 
likelihood with a beta prior\"\"\"\n return BetaNegativeBinomial(n=r, alpha=beta.alpha, beta=beta.beta)\n "},{"location":"models/#conjugate.models.normal_known_mean","title":"normal_known_mean(x_total, x2_total, n, mu, inverse_gamma_prior) ","text":"Posterior distribution for a normal likelihood with a known mean and a variance prior. Parameters: Name Type Description Default x_total NUMERIC sum of all outcomes required x2_total NUMERIC sum of all outcomes squared required n NUMERIC total number of samples in x_total required mu NUMERIC known mean required inverse_gamma_prior InverseGamma InverseGamma prior for variance required Returns: Type Description InverseGamma InverseGamma posterior distribution for the variance Source code in conjugate/models.py def normal_known_mean(\n x_total: NUMERIC,\n x2_total: NUMERIC,\n n: NUMERIC,\n mu: NUMERIC,\n inverse_gamma_prior: InverseGamma,\n) -> InverseGamma:\n \"\"\"Posterior distribution for a normal likelihood with a known mean and a variance prior.\n\n Args:\n x_total: sum of all outcomes\n x2_total: sum of all outcomes squared\n n: total number of samples in x_total\n mu: known mean\n inverse_gamma_prior: InverseGamma prior for variance\n\n Returns:\n InverseGamma posterior distribution for the variance\n\n \"\"\"\n alpha_post = inverse_gamma_prior.alpha + (n / 2)\n beta_post = inverse_gamma_prior.beta + (\n 0.5 * (x2_total - (2 * mu * x_total) + (n * (mu**2)))\n )\n\n return InverseGamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.normal_known_mean_posterior_predictive","title":"normal_known_mean_posterior_predictive(mu, inverse_gamma) ","text":"Posterior predictive distribution for a normal likelihood with a known mean and a variance prior. 
Parameters: Name Type Description Default mu NUMERIC known mean required inverse_gamma InverseGamma InverseGamma prior required Returns: Type Description StudentT StudentT posterior predictive distribution Source code in conjugate/models.py def normal_known_mean_posterior_predictive(\n mu: NUMERIC, inverse_gamma: InverseGamma\n) -> StudentT:\n \"\"\"Posterior predictive distribution for a normal likelihood with a known mean and a variance prior.\n\n Args:\n mu: known mean\n inverse_gamma: InverseGamma prior\n\n Returns:\n StudentT posterior predictive distribution\n\n \"\"\"\n return StudentT(\n n=2 * inverse_gamma.alpha,\n mu=mu,\n sigma=(inverse_gamma.beta / inverse_gamma.alpha) ** 0.5,\n )\n "},{"location":"models/#conjugate.models.poisson_gamma","title":"poisson_gamma(x_total, n, gamma_prior) ","text":"Posterior distribution for a poisson likelihood with a gamma prior. Parameters: Name Type Description Default x_total NUMERIC sum of all outcomes required n NUMERIC total number of samples in x_total required gamma_prior Gamma Gamma prior required Returns: Type Description Gamma Gamma posterior distribution Source code in conjugate/models.py def poisson_gamma(x_total: NUMERIC, n: NUMERIC, gamma_prior: Gamma) -> Gamma:\n \"\"\"Posterior distribution for a poisson likelihood with a gamma prior.\n\n Args:\n x_total: sum of all outcomes\n n: total number of samples in x_total\n gamma_prior: Gamma prior\n\n Returns:\n Gamma posterior distribution\n\n \"\"\"\n alpha_post, beta_post = get_poisson_gamma_posterior_params(\n alpha=gamma_prior.alpha, beta=gamma_prior.beta, x_total=x_total, n=n\n )\n\n return Gamma(alpha=alpha_post, beta=beta_post)\n "},{"location":"models/#conjugate.models.poisson_gamma_posterior_predictive","title":"poisson_gamma_posterior_predictive(gamma, n=1) ","text":"Posterior predictive distribution for a poisson likelihood with a gamma prior Parameters: Name Type Description Default gamma Gamma Gamma distribution required n NUMERIC Number of trials 
for each sample, defaults to 1. Can be used to scale the distributions to a different unit of time. 1 Returns: Type Description NegativeBinomial NegativeBinomial distribution related to posterior predictive Source code in conjugate/models.py def poisson_gamma_posterior_predictive(\n gamma: Gamma, n: NUMERIC = 1\n) -> NegativeBinomial:\n \"\"\"Posterior predictive distribution for a poisson likelihood with a gamma prior\n\n Args:\n gamma: Gamma distribution\n n: Number of trials for each sample, defaults to 1.\n Can be used to scale the distributions to a different unit of time.\n\n Returns:\n NegativeBinomial distribution related to posterior predictive\n\n \"\"\"\n n = n * gamma.alpha\n p = gamma.beta / (1 + gamma.beta)\n\n return NegativeBinomial(n=n, p=p)\n "},{"location":"examples/bayesian-update/","title":"Bayesian Update","text":"Easy to use Bayesian inference incrementally by making the posterior the prior for the next update. import numpy as np\nimport matplotlib.pyplot as plt\n\nfrom conjugate.distributions import NormalInverseGamma\nfrom conjugate.models import linear_regression\n\ndef create_sampler(mu, sigma, rng): \n \"\"\"Generate a sampler from a normal distribution with mean `mu` and standard deviation `sigma`.\"\"\"\n def sample(n: int): \n return rng.normal(loc=mu, scale=sigma, size=n)\n\n return sample\n\n\nmu = 5.0\nsigma = 2.5\nrng = np.random.default_rng(0)\nsample = create_sampler(mu=mu, sigma=sigma, rng=rng)\n\n\nprior = NormalInverseGamma(\n mu=np.array([0]), \n delta_inverse=np.array([[1]]), \n alpha=1, beta=1, \n)\n\n\ncumsum = 0\nbatch_sizes = [5, 10, 25]\nax = plt.gca()\nfor batch_size in batch_sizes:\n y = sample(n=batch_size)\n X = np.ones_like(y)[:, None]\n\n posterior = linear_regression(X, y, prior)\n beta_samples, variance_samples = posterior.sample_beta(size=1000, return_variance=True, random_state=rng)\n\n cumsum += batch_size\n label = f\"n={cumsum}\"\n ax.scatter(variance_samples ** 0.5, beta_samples, alpha=0.25, 
label=label)\n\n prior = posterior \n\nax.scatter(sigma, mu, color=\"black\", label=\"true\")\nax.set(\n xlabel=\"$\\sigma$\", \n ylabel=\"$\\mu$\", \n xlim=(0, None), \n ylim=(0, None), \n title=\"Updated posterior samples of $\\mu$ and $\\sigma$\"\n)\nax.legend()\n\nplt.show()\n "},{"location":"examples/binomial/","title":"Binomial Model","text":"from conjugate.distributions import Beta, Binomial, BetaBinomial\nfrom conjugate.models import binomial_beta, binomial_beta_posterior_predictive\n\nimport matplotlib.pyplot as plt\n\nN = 10\ntrue_dist = Binomial(n=N, p=0.5)\n\n# Observed Data\nX = true_dist.dist.rvs(size=1, random_state=42)\n\n# Conjugate prior\nprior = Beta(alpha=1, beta=1)\nposterior: Beta = binomial_beta(n=N, x=X, beta_prior=prior)\n\n# Comparison\nprior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=prior)\nposterior_predictive: BetaBinomial = binomial_beta_posterior_predictive(n=N, beta=posterior)\n\n# Figure \nfig, axes = plt.subplots(ncols=2, nrows=1, figsize=(8, 4))\n\nax: plt.Axes = axes[0]\nposterior.plot_pdf(ax=ax, label=\"posterior\")\nprior.plot_pdf(ax=ax, label=\"prior\")\nax.axvline(x=X/N, color=\"black\", ymax=0.05, label=\"MLE\")\nax.axvline(x=true_dist.p, color=\"black\", ymax=0.05, linestyle=\"--\", label=\"True\")\nax.set_title(\"Success Rate\")\nax.legend()\n\nax: plt.Axes = axes[1]\ntrue_dist.plot_pmf(ax=ax, label=\"true distribution\", color=\"C2\")\nposterior_predictive.plot_pmf(ax=ax, label=\"posterior predictive\")\nprior_predictive.plot_pmf(ax=ax, label=\"prior predictive\")\nax.axvline(x=X, color=\"black\", ymax=0.05, label=\"Sample\")\nax.set_title(\"Number of Successes\")\nax.legend()\n\nplt.show()\n "},{"location":"examples/generalized-inputs/","title":"Generalized Numerical Inputs","text":"Though the plotting is meant for numpy and python numbers, the conjugate models work with anything that works like numbers. 
For instance, Bayesian models in SQL using the SQL Builder, PyPika from pypika import Field \n\n# Columns from table in database\nN = Field(\"total\")\nX = Field(\"successes\")\n\n# Conjugate prior\nprior = Beta(alpha=1, beta=1)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\nprint(\"Posterior alpha:\", posterior.alpha)\nprint(\"Posterior beta:\", posterior.beta)\n# Posterior alpha: 1+\"successes\"\n# Posterior beta: 1+\"total\"-\"successes\"\n\n# Priors can be fields too\nalpha = Field(\"previous_successes\") - 1\nbeta = Field(\"previous_failures\") - 1\n\nprior = Beta(alpha=alpha, beta=beta)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\nprint(\"Posterior alpha:\", posterior.alpha)\nprint(\"Posterior beta:\", posterior.beta)\n# Posterior alpha: \"previous_successes\"-1+\"successes\"\n# Posterior beta: \"previous_failures\"-1+\"total\"-\"successes\"\n Using PyMC distributions for sampling with additional uncertainty import pymc as pm \n\nalpha = pm.Gamma.dist(alpha=1, beta=20)\nbeta = pm.Gamma.dist(alpha=1, beta=20)\n\n# Observed Data\nN = 10\nX = 4\n\n# Conjugate prior \nprior = Beta(alpha=alpha, beta=beta)\nposterior = binomial_beta(n=N, x=X, beta_prior=prior)\n\n# Reconstruct the posterior distribution with PyMC\nprior_dist = pm.Beta.dist(alpha=prior.alpha, beta=prior.beta)\nposterior_dist = pm.Beta.dist(alpha=posterior.alpha, beta=posterior.beta)\n\nsamples = pm.draw([alpha, beta, prior_dist, posterior_dist], draws=1000)\n "},{"location":"examples/indexing/","title":"Indexing Parameters","text":"The distributions can be indexed for subsets. beta = np.arange(1, 10)\nprior = Beta(alpha=1, beta=beta)\n\nidx = [0, 5, -1]\nprior_subset = prior[idx]\nprior_subset.plot_pdf(label = lambda i: f\"prior {i}\")\nplt.legend()\nplt.show()\n "},{"location":"examples/linear-regression/","title":"Linear Regression","text":"We can fit linear regression that includes a predictive distribution for new data using a conjugate prior. 
This example only has one covariate, but the same approach can be used for multiple covariates. "},{"location":"examples/linear-regression/#simulate-data","title":"Simulate Data","text":"We are going to simulate data from a linear regression model. The true intercept is 3.5, the true slope is -2.0, and the true variance is 2.5. import numpy as np\nimport pandas as pd\n\nimport matplotlib.pyplot as plt\n\nfrom conjugate.distributions import NormalInverseGamma, MultivariateStudentT\nfrom conjugate.models import linear_regression, linear_regression_posterior_predictive\n\nintercept = 3.5\nslope = -2.0\nsigma = 2.5\n\nrng = np.random.default_rng(0)\n\nx_lim = 3\nn_points = 100\nx = np.linspace(-x_lim, x_lim, n_points)\ny = intercept + slope * x + rng.normal(scale=sigma, size=n_points)\n "},{"location":"examples/linear-regression/#define-prior-and-find-posterior","title":"Define Prior and Find Posterior","text":"There needs to be a prior for the intercept, slope, and the variance. prior = NormalInverseGamma(\n mu=np.array([0, 0]),\n delta_inverse=np.array([[1, 0], [0, 1]]),\n alpha=1,\n beta=1,\n)\n\ndef create_X(x: np.ndarray) -> np.ndarray:\n return np.stack([np.ones_like(x), x]).T\n\nX = create_X(x)\nposterior: NormalInverseGamma = linear_regression(\n X=X,\n y=y,\n normal_inverse_gamma_prior=prior,\n)\n "},{"location":"examples/linear-regression/#posterior-predictive-for-new-data","title":"Posterior Predictive for New Data","text":"The multivariate student-t distribution is used for the posterior predictive distribution. We have to draw samples from it since the scipy implementation does not have a ppf method. 
# New Data\nx_lim_new = 1.5 * x_lim\nx_new = np.linspace(-x_lim_new, x_lim_new, 20)\nX_new = create_X(x_new)\npp: MultivariateStudentT = linear_regression_posterior_predictive(normal_inverse_gamma=posterior, X=X_new)\n\nsamples = pp.dist.rvs(5_000).T\ndf_samples = pd.DataFrame(samples, index=x_new)\n "},{"location":"examples/linear-regression/#plot-results","title":"Plot Results","text":"We can see that the posterior predictive distribution begins to widen as we move away from the data. Overall, the posterior predictive distribution is a good fit for the data. The true line is within the 95% posterior predictive interval. def plot_abline(intercept: float, slope: float, ax: plt.Axes = None, **kwargs):\n \"\"\"Plot a line from slope and intercept\"\"\"\n if ax is None:\n ax = plt.gca()\n\n x_vals = np.array(ax.get_xlim())\n y_vals = intercept + slope * x_vals\n ax.plot(x_vals, y_vals, **kwargs)\n\n\ndef plot_lines(ax: plt.Axes, samples: np.ndarray, label: str, color: str, alpha: float):\n for i, betas in enumerate(samples):\n label = label if i == 0 else None\n plot_abline(betas[0], betas[1], ax=ax, color=color, alpha=alpha, label=label)\n\n\nfig, ax = plt.subplots()\nax.set_xlim(-x_lim, x_lim)\nax.set_ylim(y.min(), y.max())\n\nax.scatter(x, y, label=\"data\")\n\nplot_lines(\n ax=ax,\n samples=prior.sample_beta(size=100, random_state=rng),\n label=\"prior\",\n color=\"blue\",\n alpha=0.05,\n)\nplot_lines(\n ax=ax,\n samples=posterior.sample_beta(size=100, random_state=rng),\n label=\"posterior\",\n color=\"black\",\n alpha=0.2,\n)\n\nplot_abline(intercept, slope, ax=ax, label=\"true\", color=\"red\")\n\nax.set(xlabel=\"x\", ylabel=\"y\", title=\"Linear regression with conjugate prior\")\n\n# New Data\nax.plot(x_new, pp.mu, color=\"green\", label=\"posterior predictive mean\")\ndf_quantile = df_samples.T.quantile([0.025, 0.975]).T\nax.fill_between(\n x_new,\n df_quantile[0.025],\n df_quantile[0.975],\n alpha=0.2,\n color=\"green\",\n label=\"95% posterior predictive 
interval\",\n)\nax.legend()\nax.set(xlim=(-x_lim_new, x_lim_new))\nplt.show()\n "},{"location":"examples/plotting/","title":"Plotting Distributions","text":"All the distributions can be plotted using the plot_pdf and plot_pmf methods. The plot_pdf method is used for continuous distributions and the plot_pmf method is used for discrete distributions. There is limited support for some distributions, like the Dirichlet or those without a scipy dist. from conjugate.distributions import Beta, Gamma, Normal\n\nimport matplotlib.pyplot as plt\n\nbeta = Beta(1, 1)\ngamma = Gamma(1, 1)\nnormal = Normal(0, 1)\n\nbound = 3\n\ndists = [beta, gamma, normal]\nlabels = [\"beta\", \"gamma\", \"normal\"]\nax = plt.gca()\nfor label, dist in zip(labels, dists):\n dist.set_bounds(-bound, bound).plot_pdf(label=label)\n\nax.legend()\nplt.show()\n The plotting is also supported for vectorized inputs. "},{"location":"examples/pymc-sampling/","title":"Unsupported Posterior Predictive Distributions with PyMC Sampling","text":"The geometric beta model posterior predictive doesn't have a common dist, but that doesn't mean the posterior predictive can't be used. For instance, PyMC can be used to fill in this gap. import pymc as pm\n\nfrom conjugate.distributions import Beta\nfrom conjugate.models import geometric_beta\n\nprior = Beta(1, 1)\nposterior: Beta = geometric_beta(x=1, n=10, beta_prior=prior)\n\nposterior_dist = pm.Beta.dist(alpha=posterior.alpha, beta=posterior.beta)\ngeometric_posterior_predictive = pm.Geometric.dist(posterior_dist)\n\nposterior_predictive_samples = pm.draw(geometric_posterior_predictive, draws=100)\n "},{"location":"examples/scaling-distributions/","title":"Scaling Distributions","text":"Some of the distributions can be scaled by a constant factor or added together. For instance, operations with the Poisson distribution represent the number of events in a given time interval. 
from conjugate.distributions import Poisson\n\nimport matplotlib.pyplot as plt\n\ndaily_rate = 0.25\ndaily_pois = Poisson(lam=daily_rate)\n\ntwo_day_pois = daily_pois + daily_pois\nweekly_pois = 7 * daily_pois\n\nmax_value = 7\nax = plt.gca()\ndists = [daily_pois, two_day_pois, weekly_pois]\nbase_labels = [\"daily\", \"two day\", \"weekly\"]\nfor dist, base_label in zip(dists, base_labels):\n label = f\"{base_label} rate={dist.lam}\"\n dist.set_max_value(max_value).plot_pmf(ax=ax, label=label)\n\nax.legend()\nplt.show()\n The normal distribution also supports scaling, making use of the fact that the variance of a scaled normal distribution is scaled by the square of the scaling factor. from conjugate.distributions import Normal\n\nimport matplotlib.pyplot as plt\n\nnorm = Normal(mu=0, sigma=1)\nnorm_times_2 = norm * 2\n\nbound = 6\nax = norm.set_bounds(-bound, bound).plot_pdf(label=f\"normal (std = {norm.sigma:.2f})\")\nnorm_times_2.set_bounds(-bound, bound).plot_pdf(ax=ax, label=f\"normal * 2 (std = {norm_times_2.sigma:.2f})\")\nax.legend()\nplt.show()\n "},{"location":"examples/scipy-connection/","title":"Connection to SciPy Distributions","text":"Many distributions have the dist attribute, which is a scipy.stats distribution object. From there, the scipy.stats methods to get the pdf, cdf, etc. can be leveraged. from conjugate.distributions import Beta \n\nbeta = Beta(1, 1)\nscipy_dist = beta.dist \n\nprint(scipy_dist.mean())\n# 0.5\nprint(scipy_dist.ppf([0.025, 0.975]))\n# [0.025 0.975]\n\nsamples = scipy_dist.rvs(100)\n "},{"location":"examples/vectorized-inputs/","title":"Vectorized Inputs","text":"All data and priors will allow for vectorized inputs, assuming the shapes work for broadcasting. 
The plotting also supports arrays of results. import numpy as np\n\nfrom conjugate.distributions import Beta\nfrom conjugate.models import binomial_beta\n\nimport matplotlib.pyplot as plt\n\n# Observed Data\nN = 10\nx = 4\n\n# Analytics \nprior = Beta(alpha=1, beta=np.array([1, 5]))\nposterior = binomial_beta(n=N, x=x, beta_prior=prior)\n\n# Figure\nax = prior.plot_pdf(label=lambda i: f\"prior {i}\")\nposterior.plot_pdf(ax=ax, label=lambda i: f\"posterior {i}\")\nax.axvline(x=x / N, ymax=0.05, color=\"black\", linestyle=\"--\", label=\"MLE\")\nax.legend()\nplt.show()\n "}]}
\ No newline at end of file
diff --git a/sitemap.xml.gz b/sitemap.xml.gz
index b157b80..1a0ab12 100644
Binary files a/sitemap.xml.gz and b/sitemap.xml.gz differ
|