# Parametric Estimation for Simple Linear Regression

We assume the output is generated by an underlying function of the input plus zero-mean Gaussian noise:

$$r = f(x) + \epsilon, \qquad \epsilon \sim \mathcal{N}(0, \sigma^2)$$

Our estimator

$$g(x^t \mid w_0, w_1) = w_1 x^t + w_0$$

is the line defined by the parameters $$w_0, w_1$$. We want the line that minimizes the squared error, so we need to compute the values of $$w_0, w_1$$ that minimize it.

We have the sample $$X = \{x^t, r^t\}_{t=1}^{N}$$.

We need to compute $$\underset{w_0, w_1}{\arg\min} \; \sum_t \left(r^t - (w_1 x^t + w_0)\right)^2$$

To solve for $$w_0, w_1$$, take the partial derivatives with respect to $$w_0$$ and $$w_1$$ and set them equal to 0. This yields two equations:

$$\sum_t r^t = N w_0 + w_1 \sum_t x^t$$

$$\sum_t r^t x^t = w_0 \sum_t x^t + w_1 \sum_t (x^t)^2$$

These two equations can be written in matrix form and solved in closed form:

$$W = A^{-1} Y$$

where $$W=\begin{bmatrix} w_0 \\ w_1 \end{bmatrix}, \quad A=\begin{bmatrix} N & \sum_t x^t \\ \sum_t x^t & \sum_t (x^t)^2 \end{bmatrix}, \quad Y=\begin{bmatrix} \sum_t r^t \\ \sum_t r^t x^t \end{bmatrix}$$
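As a sketch, the closed-form solution above can be computed directly with NumPy. The synthetic data and the true parameter values (`true_w0`, `true_w1`) below are illustrative assumptions, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sample X = {x^t, r^t}, t = 1..N: a known line plus Gaussian noise
N = 100
true_w0, true_w1 = 2.0, 0.5
x = rng.uniform(0, 10, size=N)
r = true_w1 * x + true_w0 + rng.normal(0, 0.3, size=N)

# Build A and Y from the sums appearing in the two normal equations
A = np.array([[N,        x.sum()],
              [x.sum(),  (x ** 2).sum()]])
Y = np.array([r.sum(), (r * x).sum()])

# W = A^{-1} Y; np.linalg.solve avoids forming the inverse explicitly
w0, w1 = np.linalg.solve(A, Y)
print(w0, w1)  # estimates close to true_w0, true_w1
```

Solving the linear system with `np.linalg.solve` is numerically preferable to computing `A^{-1}` and multiplying; for a single input feature the result agrees with `np.polyfit(x, r, 1)`.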
