05-1 Supervised Learning
Regression
● The simplest case is the linear regressor: $\hat{y} = w_1 x + w_0$.
● Optimization task: find $w_0$ and $w_1$ such that the error
$E(w_0, w_1) = \sum_{i=1}^{N} (y_i - w_1 x_i - w_0)^2$
is minimized.
● Analytic solution:
$w_1 = \dfrac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad w_0 = \bar{y} - w_1 \bar{x},$
where $\bar{x} = \frac{1}{N}\sum_i x_i$ and $\bar{y} = \frac{1}{N}\sum_i y_i$.
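The analytic solution above can be sketched in a few lines of NumPy; the data values here are made up for illustration.

```python
import numpy as np

# Hypothetical 1-D data; the names x, y, w0, w1 follow the slide's notation.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

x_bar, y_bar = x.mean(), y.mean()

# Analytic least-squares solution for y ≈ w1 * x + w0
w1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
w0 = y_bar - w1 * x_bar
```

For this data the fit comes out close to the slope and intercept used to generate it.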
Linear Regression
● Linear model: $\hat{y} = \mathbf{w}^\top \mathbf{x} + w_0$, where the weights $\mathbf{w}$ and the bias $w_0$ are the model parameters.
● Broadcasting: stacking the $N$ inputs into a matrix $X$ gives all predictions at once, $\hat{\mathbf{y}} = X\mathbf{w} + w_0$ (the scalar $w_0$ is broadcast over the rows).
● A "trick": prepend a constant 1 to every input, $\tilde{\mathbf{x}} = (1, x_1, \dots, x_d)^\top$, so the bias is absorbed into the weight vector and $\hat{y} = \tilde{\mathbf{w}}^\top \tilde{\mathbf{x}}$.
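A small sketch of both the broadcasting and the bias "trick"; the matrix and parameter values are arbitrary assumptions for the demonstration.

```python
import numpy as np

# Hypothetical design matrix: N = 4 inputs with d = 2 features each.
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.0],
              [4.0, 1.5]])
w = np.array([0.5, -1.0])   # weights
w0 = 2.0                    # bias

# Direct model: broadcasting adds the scalar w0 to every prediction.
y_hat = X @ w + w0

# The "trick": prepend a column of ones so w0 becomes an ordinary weight.
X_tilde = np.hstack([np.ones((X.shape[0], 1)), X])
w_tilde = np.concatenate([[w0], w])
assert np.allclose(X_tilde @ w_tilde, y_hat)
```

Both formulations give identical predictions, which is why the augmented form is used in the derivations that follow.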
Linear Regression
● Assume the targets follow the linear model: $\mathbf{y} \approx X\mathbf{w}$.
● Minimize the squared error $E(\mathbf{w}) = \| X\mathbf{w} - \mathbf{y} \|^2$.
● Setting the gradient $\nabla_{\mathbf{w}} E = 2 X^\top (X\mathbf{w} - \mathbf{y})$ to zero gives the normal equations $X^\top X \mathbf{w} = X^\top \mathbf{y}$, so
$\mathbf{w}^* = (X^\top X)^{-1} X^\top \mathbf{y}.$
Moore-Penrose Pseudoinverse
$X^+ = (X^\top X)^{-1} X^\top$ (when $X^\top X$ is invertible), so the least-squares solution is $\mathbf{w}^* = X^+ \mathbf{y}$.
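A sketch comparing the normal-equation solution with NumPy's built-in pseudoinverse; the data, true weights, and noise level are assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # hypothetical N = 100, d = 3 design matrix
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

# Normal-equation solution: w = (X^T X)^(-1) X^T y
w_ne = np.linalg.inv(X.T @ X) @ X.T @ y

# Same solution via the Moore-Penrose pseudoinverse (more stable numerically)
w_pinv = np.linalg.pinv(X) @ y
```

In practice `np.linalg.pinv` (or `np.linalg.lstsq`) is preferred over explicitly inverting $X^\top X$, since it remains well-behaved when $X^\top X$ is ill-conditioned.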
Linear Regression: Toy data
Linear Basis Functions
Replace the raw input $\mathbf{x}$ by fixed features $\boldsymbol{\phi}(\mathbf{x})$; the model $\hat{y} = \mathbf{w}^\top \boldsymbol{\phi}(\mathbf{x})$ is nonlinear in $\mathbf{x}$ but still linear in $\mathbf{w}$.
Least Squares Solution to Regression
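As a sketch of the least-squares solution with a polynomial basis (the toy data, noise level, and degree below are assumptions for illustration):

```python
import numpy as np

# Hypothetical toy data: noisy samples of a smooth function.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)

k = 3  # polynomial degree
# Design matrix of basis functions phi_j(x) = x**j, j = 0..k
Phi = np.vander(x, k + 1, increasing=True)

# The fit is still an ordinary linear least-squares problem in w.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w
```

The basis functions are fixed before fitting; only the weights $\mathbf{w}$ are learned, so the same closed-form machinery applies.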
Polynomial Regressors
● $E_{\text{train}}$ is the error on the training data. It decreases as model complexity increases.
● $E_{\text{test}}$ is the error on the remaining 93 data points, the "test set". It has a minimum at $k = 3$.
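The train/test behaviour described above can be reproduced in a small experiment; the data-generating function, split, and seed here are assumptions, so the exact degree with the lowest test error may differ from the slide's.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 30)
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=x.size)

x_tr, y_tr = x[:15], y[:15]   # training set
x_te, y_te = x[15:], y[15:]   # held-out test set

def poly_errors(k):
    """Mean squared train/test error of a degree-k polynomial fit."""
    Phi_tr = np.vander(x_tr, k + 1, increasing=True)
    w, *_ = np.linalg.lstsq(Phi_tr, y_tr, rcond=None)
    Phi_te = np.vander(x_te, k + 1, increasing=True)
    e_tr = np.mean((Phi_tr @ w - y_tr) ** 2)
    e_te = np.mean((Phi_te @ w - y_te) ** 2)
    return e_tr, e_te

errs = {k: poly_errors(k) for k in range(10)}
```

Because the bases are nested, the training error can only decrease with $k$, while the test error eventually rises again as the model starts fitting noise.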
Wait a minute...
● $X^\top X$ is $d \times d$.
● Computing $X^\top X$ requires $N d^2$ multiplications and additions.
● Inverting it is $O(d^3)$ (Gauss-Jordan).
● Afterwards, forming $(X^\top X)^{-1} X^\top$ and applying it to $\mathbf{y}$ requires $N d^2 + N d$ multiplications.
Big Data!