# Big Entropy and the Generalized Linear Model {#GLM}
```{r }
#| include: false
library(rethinking)
library(brms)
library(tidyr)
library(dplyr)
library(ggplot2)
library(paletteer)
```
## Maximum Entropy
The example of the pebbles and buckets is also well explained by a NASA scientist in [Monkeys and Multiplicity](http://maximum-entropy-blog.blogspot.com/2013/11/monkeys-and-multiplicity.html).
The number of ways to place the pebbles is counted as follows. Let $N_i$ be the number of pebbles we put in bucket $i$, with the total number of pebbles $N = \sum_{i=1}^{5} N_i = 10$.
The number of ways to put the $N$ pebbles into the 5 buckets is then
$$
\begin{aligned}
&\text{(ways to choose } N_1 \text{ of the } N \text{ pebbles for bucket } 1) \\
{}\times{} &\text{(ways to choose } N_2 \text{ of the remaining } N - N_1 \text{ for bucket } 2) \\
{}\times{} &\text{(ways to choose } N_3 \text{ of the remaining } N - N_1 - N_2 \text{ for bucket } 3) \\
{}\times{} &\text{(ways to choose } N_4 \text{ of the remaining } N - N_1 - N_2 - N_3 \text{ for bucket } 4) \\
{}\times{} &\text{(ways to choose } N_5 = N - N_1 - N_2 - N_3 - N_4 \text{ for bucket } 5)
\end{aligned}
$$
which is
$$
\binom{10}{N_1} \binom{10 - N_1}{N_2} \binom{10 - N_1 - N_2}{N_3} \binom{10 - N_1 - N_2 - N_3}{N_4} \binom{10 - N_1 - N_2 - N_3 - N_4}{N_5}
= \binom{10}{N_1, N_2, N_3, N_4, N_5} = \frac{10!}{N_1! \, N_2! \, N_3! \, N_4! \, N_5!}
$$
so, for example, for distribution B we have
$$
\binom{10}{0, 1, 8, 1, 0} = \frac{10!}{0! \, 1! \, 8! \, 1! \, 0!} = 90
$$
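As a quick check on that arithmetic, the multiplicity of distribution B can be computed directly with base R's `factorial` and `prod`:

```{r}
# multiplicity of distribution B: 10! / (0! 1! 8! 1! 0!) = 90
factorial(10) / prod(factorial(c(0, 1, 8, 1, 0)))
```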
```{r}
# candidate distributions of 10 pebbles over 5 buckets
p <- list(
"A" = c(0, 0, 10, 0, 0),
"B" = c(0, 1, 8, 1, 0),
"C" = c(0, 2, 6, 2, 0),
"D" = c(1, 2, 4, 2, 1),
"E" = c(2, 2, 2, 2, 2)
)
```
```{r}
# normalize each distribution into a probability vector
p_norm <- lapply(p, function(q) q / sum(q))
```
and the entropy $H(p) = -\sum_i p_i \log p_i$ of each normalized distribution is
```{r}
# Shannon entropy, treating 0 * log(0) as 0
H <- sapply(p_norm, FUN = function(x) -sum(ifelse(x != 0, x * log(x), 0)))
H
```
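The multiplicity and the entropy are two views of the same idea: the log number of ways per pebble, $\log W / N$, ranks the distributions exactly as $H$ does, with the flattest distribution E on top. A small sketch, reusing the list `p` defined above (`W` is a name introduced here):

```{r}
# W = 10! / prod(N_i!), the multiplicity of each distribution;
# log(W) / 10 is the log number of ways per pebble, which tracks H
W <- sapply(p, function(q) factorial(sum(q)) / prod(factorial(q)))
log(W) / 10
```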
### Gaussian
### Binomial
## Generalized linear models
## Maximum entropy priors
## Summary
## Practice