Machine Learning
Linear regression with one variable - Cost function
뚜둔뚜둔
2022. 5. 3. 17:29
Coursera lecture summary
Cost function
We can measure the accuracy of our hypothesis function by using a cost function. This takes an average difference (actually a fancier version of an average) of all the results of the hypothesis with inputs from the x's and the actual output y's.
To break it apart, it is $\frac{1}{2}\bar{x}$, where $\bar{x}$ is the mean of the squares of $h_\theta(x_i) - y_i$, or the difference between the predicted value and the actual value. Written out for $m$ training examples, the cost function is
$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x_i) - y_i\bigr)^2.$$
This function is otherwise called the "Squared error function", or "Mean squared error".
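As a quick illustration, here is a minimal NumPy sketch of this squared error cost for the single-variable hypothesis $h_\theta(x) = \theta_0 + \theta_1 x$. The function name `compute_cost` and the toy data are made up for the example, not taken from the lecture.

```python
import numpy as np

def compute_cost(x, y, theta0, theta1):
    """Squared error cost J(theta0, theta1) = (1 / (2m)) * sum((h(x_i) - y_i)^2)."""
    m = len(y)                         # number of training examples
    predictions = theta0 + theta1 * x  # hypothesis h_theta(x) for each input
    errors = predictions - y           # predicted value minus actual value
    return np.sum(errors ** 2) / (2 * m)

# Toy data lying exactly on the line y = 2x
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

print(compute_cost(x, y, 0.0, 2.0))  # 0.0   -> a perfect fit gives zero cost
print(compute_cost(x, y, 0.0, 1.5))  # ~0.58 -> worse parameters give a larger cost
```

The cost is never negative, and it is zero only when every prediction matches its actual output exactly.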
The mean is halved (1/2) as a convenience for the computation of gradient descent, as the derivative term of the squared function will cancel out the 1/2 term. The following image summarizes what the cost function does.
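To make that cancellation explicit, here is a short sketch of the standard partial derivative of $J$ (written out in LaTeX; the lecture summary above states the fact without the derivation):

```latex
\begin{aligned}
\frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1)
  &= \frac{\partial}{\partial \theta_j} \frac{1}{2m} \sum_{i=1}^{m} \bigl(h_\theta(x_i) - y_i\bigr)^2 \\
  &= \frac{1}{2m} \sum_{i=1}^{m} 2\,\bigl(h_\theta(x_i) - y_i\bigr) \cdot \frac{\partial h_\theta(x_i)}{\partial \theta_j} \\
  &= \frac{1}{m} \sum_{i=1}^{m} \bigl(h_\theta(x_i) - y_i\bigr) \cdot \frac{\partial h_\theta(x_i)}{\partial \theta_j}
\end{aligned}
```

The factor of 2 that comes from differentiating the square is absorbed exactly by the 1/2, so the gradient descent update rule comes out without any stray constant.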