
Commit

note 11 fix
ishani07 committed Apr 30, 2024
1 parent 2785d77 commit 2c3e4c5
Showing 11 changed files with 65 additions and 74 deletions.
10 changes: 4 additions & 6 deletions constant_model_loss_transformations/loss_transformations.qmd
@@ -735,12 +735,10 @@ $$\hat{z} = \theta_0 + \theta_1 x$$

It turns out that this linearized relationship can help us understand the underlying relationship between $x$ and $y$. If we rearrange the relationship above, we find:

$$
\log{(y)} = \theta_0 + \theta_1 x \\
y = e^{\theta_0 + \theta_1 x} \\
y = (e^{\theta_0})e^{\theta_1 x} \\
y_i = C e^{k x}
$$
$$\log{(y)} = \theta_0 + \theta_1 x$$
$$y = e^{\theta_0 + \theta_1 x}$$
$$y = (e^{\theta_0})e^{\theta_1 x}$$
$$y = C e^{k x}$$

For some constants $C = e^{\theta_0}$ and $k = \theta_1$.
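As a quick numerical check of this derivation (an illustrative sketch, not part of the changed file), one can fit a line to $\log(y)$ and exponentiate the intercept to recover $C$, with the slope giving $k$. The synthetic data and the use of `np.polyfit` below are assumptions made only for illustration.

```python
import numpy as np

# Assumed synthetic data following y = C * exp(k * x), for illustration only
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
C_true, k_true = 2.0, 0.7
y = C_true * np.exp(k_true * x) * np.exp(rng.normal(0, 0.05, size=x.size))

# Fit the linearized model: log(y) = theta_0 + theta_1 * x
# np.polyfit returns coefficients highest degree first: [slope, intercept]
theta_1, theta_0 = np.polyfit(x, np.log(y), deg=1)

# Undo the log transformation: C = e^{theta_0}, k = theta_1
C_hat, k_hat = np.exp(theta_0), theta_1
print(C_hat, k_hat)  # should be close to 2.0 and 0.7
```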

129 changes: 61 additions & 68 deletions docs/constant_model_loss_transformations/loss_transformations.html

Large diffs are not rendered by default.

9 binary files changed (contents not shown).

0 comments on commit 2c3e4c5
