\documentclass[SKL-MASTER.tex]{subfiles}
%================================================================================ %
\section{Working with Linear Models}
In this chapter, we will cover the following topics:
\begin{itemize}
\item Fitting a line through data
\item Evaluating the linear regression model
\item Using ridge regression to overcome linear regression's shortfalls
\item Optimizing the ridge regression parameter
\item Using sparsity to regularize models
\item Taking a more fundamental approach to regularization with LARS
\item Using linear methods for classification -- logistic regression
\item Directly applying Bayesian ridge regression
\item Using boosting to learn from errors
\end{itemize}
\subsection{Introduction}
Linear models are fundamental in statistics and machine learning. Many methods describe
the relationship in the data as a linear combination of variables. Quite often, considerable
effort goes into transforming the data so that it can be described by such a linear
combination.
In this chapter, we build up from the simplest idea, fitting a straight line through data, to
classification, and finally to Bayesian ridge regression.
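As a taste of where the chapter begins, fitting a straight line can be sketched in a few lines
of scikit-learn. This is a minimal illustration on synthetic data; the slope, intercept, and
noise level below are arbitrary choices, not values from the text:

```python
# Minimal sketch: fit a straight line through noisy synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(50, 1))                  # one explanatory variable
y = 3.0 * X.ravel() + 2.0 + rng.normal(0, 0.5, 50)    # true line y = 3x + 2, plus noise

model = LinearRegression()
model.fit(X, y)

# The fitted coefficients recover the underlying line (approximately).
print(model.coef_[0], model.intercept_)
```

The estimated slope and intercept land close to the true values of 3 and 2; the later
recipes in this chapter examine when this simple estimator breaks down and how ridge
regression, sparsity, and Bayesian approaches address those failures.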
%========================================================%
% % - Working with Linear Models
% % - 56
\end{document}