Parametric Estimation for Simple Linear Regression

We assume the model

$$r = f(x) + \epsilon, \qquad \epsilon \sim N(\mu, \sigma^2)$$

and estimate $f$ with a line that minimizes squared error.

$$g(x^t \mid w_0, w_1) = w_1 x^t + w_0$$

is the line defined by the parameters $w_0, w_1$. To find the line that minimizes squared error, we compute the values of $w_0, w_1$ that minimize it.

We have the sample $X = \{x^t, r^t\}_{t=1}^N$.

We need to compute

$$\arg\min_{w_0, w_1} \; \sum_t \big(r^t - (w_1 x^t + w_0)\big)^2$$
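
As a quick concreteness check, this objective can be written directly as a function of the parameters. A minimal NumPy sketch, assuming the sample is held in arrays `x` and `r`:

```python
import numpy as np

def squared_error(w0, w1, x, r):
    """Sum of squared residuals of the line w1*x + w0 over the sample (x, r)."""
    residuals = r - (w1 * x + w0)
    return np.sum(residuals ** 2)
```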

To solve for $w_0, w_1$, take the partial derivatives of the squared error with respect to $w_0$ and $w_1$ and set them equal to 0. This gives two equations:

$$\sum_t r^t = N w_0 + w_1 \sum_t x^t$$

$$\sum_t r^t x^t = w_0 \sum_t x^t + w_1 \sum_t (x^t)^2$$
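
Spelling out the differentiation step that produces these equations, with $E$ denoting the squared error above:

$$E(w_0, w_1 \mid X) = \sum_t \big(r^t - (w_1 x^t + w_0)\big)^2$$

$$\frac{\partial E}{\partial w_0} = -2 \sum_t \big(r^t - w_1 x^t - w_0\big) = 0 \;\Rightarrow\; \sum_t r^t = N w_0 + w_1 \sum_t x^t$$

$$\frac{\partial E}{\partial w_1} = -2 \sum_t x^t \big(r^t - w_1 x^t - w_0\big) = 0 \;\Rightarrow\; \sum_t r^t x^t = w_0 \sum_t x^t + w_1 \sum_t (x^t)^2$$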

Writing these two equations in matrix form, we can solve for $w_0, w_1$ with the closed-form solution:

$$W = A^{-1} Y$$

where

$$W = \begin{bmatrix} w_0 \\ w_1 \end{bmatrix}, \quad A = \begin{bmatrix} N & \sum_t x^t \\ \sum_t x^t & \sum_t (x^t)^2 \end{bmatrix}, \quad Y = \begin{bmatrix} \sum_t r^t \\ \sum_t r^t x^t \end{bmatrix}$$
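
A minimal NumPy sketch of this closed form, assuming the sample is in arrays `x` and `r` (it uses `np.linalg.solve` on $AW = Y$ rather than explicitly forming $A^{-1}$, which gives the same $W$ and is numerically safer):

```python
import numpy as np

def fit_line(x, r):
    """Solve the normal equations A W = Y for W = [w0, w1]."""
    N = len(x)
    A = np.array([[N,         np.sum(x)],
                  [np.sum(x), np.sum(x ** 2)]])
    Y = np.array([np.sum(r), np.sum(r * x)])
    w0, w1 = np.linalg.solve(A, Y)
    return w0, w1

# Example: noisy points around r = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
r = 2 * x + 1 + rng.normal(0, 0.5, size=x.shape)
print(fit_line(x, r))  # w0, w1 should come out close to (1, 2)
```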
