Cost function | Coursera Community
Solved

Cost function

  • 6 January 2019
  • 2 replies
  • 72 views

Why do we minimize the square of (prediction − actual) when finding the values of theta 0 and theta 1 in the hypothesis?


Best answer by mars 6 January 2019, 15:46

We square each difference before summing so that positive and negative errors cannot cancel each other out. If we summed the raw differences (prediction − actual), errors of opposite sign could cancel and the cost could come out as zero even for a poor fit. The cost function should be zero only when every prediction equals its label, and squaring guarantees this. (Squaring also keeps the cost function smooth, which makes gradient descent well behaved.)
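As a quick sketch of the cancellation problem, here is a tiny example with made-up predictions and labels where the errors are +2 and −2: the raw sum is zero, but the squared-error cost correctly reports a nonzero value.

```python
import numpy as np

# Hypothetical data: two labels and two predictions with errors +2 and -2.
y_actual = np.array([3.0, 7.0])
y_pred = np.array([5.0, 5.0])

errors = y_pred - y_actual            # [ 2., -2.]
plain_sum = errors.sum()              # 0.0 -- errors cancel, looks "perfect"
squared_cost = (errors ** 2).mean() / 2  # (1/2m) * sum of squared errors

print(plain_sum)     # 0.0
print(squared_cost)  # 2.0
```

The squared cost only reaches zero when every prediction equals its label, which is exactly the property the answer above describes.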

2 replies

