Parametric Estimation for Simple Polynomial Regression

In cases where the data cannot be fit well by a linear function, we may want to use polynomial regression.

Say we want to use a degree 2 polynomial. The equation can be given by:

$$g(x \mid w_2, w_1, w_0) = w_2x^2 + w_1x + w_0$$

Our aim is to find values for $w_0, w_1, w_2$ that minimize the squared error $\sum_t \left(r^t - g(x^t)\right)^2$.

Note: Given a dataset $\{x^t, r^t\}_{t=1}^N$ where $x^t \in \mathbb{R}$, i.e. $x^t = [x_1^t]$ (one dimension), we can find the degree-2 polynomial $g(x \mid w_2, w_1, w_0) = w_2x^2 + w_1x + w_0$ that minimizes the squared error as follows. Construct a related dataset with inputs in $\mathbb{R}^2$ (two dimensions), where the second dimension is $x_2^t = (x_1^t)^2$. Then apply simple linear regression to this new dataset to obtain the $w_2, w_1, w_0$ that minimize the squared error. Finally, output $g(x \mid w_2, w_1, w_0) = w_2x^2 + w_1x + w_0$ with these values as the best degree-2 polynomial fit to the original dataset.
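The reduction above can be sketched in a few lines of NumPy: augment the 1-D inputs with a squared-feature column, then solve an ordinary linear least-squares problem. The dataset values below are illustrative assumptions (noiseless samples of a known quadratic), not from the original text.

```python
import numpy as np

# Hypothetical 1-D dataset {x^t, r^t}: noiseless samples of 2x^2 + 3x + 1,
# chosen for illustration so the fit recovers the coefficients exactly.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
r = 2 * x**2 + 3 * x + 1

# Construct the augmented design matrix with columns [x^2, x, 1],
# i.e. the second feature x_2^t = (x_1^t)^2 from the note above.
X = np.column_stack([x**2, x, np.ones_like(x)])

# Simple linear regression (least squares) on the augmented data
# yields the coefficients w_2, w_1, w_0 of the quadratic.
(w2, w1, w0), *_ = np.linalg.lstsq(X, r, rcond=None)

# The fitted degree-2 polynomial g(x | w_2, w_1, w_0).
g = lambda x: w2 * x**2 + w1 * x + w0
```

Because the problem is linear in $w_2, w_1, w_0$, any linear-regression solver works here; the squared features only change the inputs, not the algorithm.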

This can be extended to higher degree polynomials as well.
