Machine Learning - Stanford - Coursera

Multivariate Linear Regression

In univariate linear regression, we had only one input feature, x.

Here, we have n input features x_1, ..., x_n.

So, the hypothesis function changes to:

h_θ(x) = θ_0 + θ_1x_1 + θ_2x_2 + ... + θ_nx_n

If θ denotes the (n+1)-dimensional parameter vector and x denotes the (n+1)-dimensional input vector with x_0 = 1, the hypothesis can be written compactly as:

h_θ(x) = θ^T x
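The vectorized form above can be sketched in a few lines of NumPy. The parameter and feature values here are arbitrary, chosen only to illustrate the dot product; the key step is prepending x_0 = 1 so that the intercept θ_0 is absorbed into θ^T x:

```python
import numpy as np

# Illustrative values: n = 2 features, parameters chosen arbitrarily.
theta = np.array([1.0, 2.0, 3.0])  # θ_0, θ_1, θ_2 — the (n+1)-dimensional θ vector
x_raw = np.array([4.0, 5.0])       # x_1, x_2 — the n input features

# Prepend x_0 = 1 so θ_0 acts as the intercept term in the dot product.
x = np.concatenate(([1.0], x_raw))

# h_θ(x) = θ^T x
h = theta @ x
print(h)  # 1 + 2*4 + 3*5 = 24.0
```

The same expression generalizes to a whole training set: stacking the m examples as rows of an m×(n+1) matrix X (each row with a leading 1) gives all predictions at once as X @ theta.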
