Machine Learning - Stanford - Coursera
1.0.0

Cost Function

When we have multiple parameters $\theta_0, \ldots, \theta_n$, the cost function is given by:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

where $\theta$ is the $(n+1)$-dimensional parameter vector and $h_\theta(x) = \theta^T x$ is the hypothesis.
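The sum above can be computed in one vectorized step by stacking the training examples as rows of a design matrix $X$ (with a leading column of ones for $\theta_0$). A minimal NumPy sketch, assuming that layout (the helper name `compute_cost` is ours, not from the course):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Vectorized J(theta) = (1/2m) * sum((X @ theta - y)^2).

    X     : (m, n+1) design matrix, first column all ones
    y     : (m,) target vector
    theta : (n+1,) parameter vector
    """
    m = len(y)
    errors = X @ theta - y          # h_theta(x^(i)) - y^(i) for all i at once
    return (errors @ errors) / (2 * m)

# Tiny worked example: y = 1 + 2x is fit exactly by theta = [1, 2],
# so the cost at that theta is 0.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(X, y, np.array([1.0, 2.0])))  # → 0.0
```

Because `X @ theta` evaluates all $m$ hypotheses at once, no explicit loop over examples is needed; this is the same pattern gradient descent reuses on each iteration.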
